Often, for problems taking an integer input x, formal CS defines the input to be something like 1^x (the character '1' repeated x times, as opposed to the binary encoding of x), so that the time complexity is expressed in terms of x itself. This class of problems seems amenable to sublinear time, since intuitively one only needs about log(x) steps to determine x.
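As a quick illustration (a minimal sketch, with hypothetical helper names `to_unary` and `from_unary`), here is what the unary convention buys you: the input length equals x, so "time in the input length" literally means "time in x". Note that even decoding already touches every cell:

```python
def to_unary(x: int) -> str:
    # Unary encoding: the value x becomes '1' repeated x times,
    # so the input length |1^x| is exactly x.
    return "1" * x

def from_unary(s: str) -> int:
    # Decoding just counts symbols -- which already requires
    # reading all x cells of the input.
    assert set(s) <= {"1"}
    return len(s)

print(to_unary(5))                         # '11111'
print(from_unary("111"))                   # 3
print(len(to_unary(5)), len(bin(5)) - 2)   # length 5 in unary vs. 3 bits in binary
```

Contrast this with the binary encoding, whose length is only about log2(x) + 1: an algorithm that is polynomial in x is polynomial in the unary input length but exponential in the binary input length.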
Actually, if you try to formally define sublinear-time algorithms in this manner, they all collapse to constant-time algorithms.
To see why, note that to determine x exactly, the TM must read all x cells of the unary input. If the algorithm reads only, say, log(x) many cells before producing its output, then all inputs longer than that are indistinguishable to it: they agree on every cell it visits. So beyond some fixed input length the machine behaves identically on all inputs, and its running time is bounded by a constant.
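The indistinguishability step can be made concrete with a toy model (a sketch, not a real TM simulator; the function `reads_k_cells` and the budget k are illustrative assumptions standing in for a sublinear bound like log x):

```python
def reads_k_cells(tape: str, k: int) -> str:
    # Toy model of a machine that inspects at most k cells of a
    # unary input tape. Whatever it outputs can depend only on
    # the prefix it actually saw.
    return tape[:k]

k = 4  # pretend this is the sublinear budget, e.g. ~log(x)
# Every unary input with more than k symbols presents the same view:
views = {reads_k_cells("1" * x, k) for x in range(k + 1, 100)}
print(views)  # a single view: {'1111'}
```

Since all sufficiently long inputs yield the same view, the machine's output (and running time) is eventually the same on all of them, which is exactly the collapse to constant time.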
Turing machines are still quite prevalent in complexity theory. Of course, robust classes like P and NP are invariant under reasonable models of computation, but classes like DTIME and NTIME are defined specifically for (single- or multi-tape) Turing machines [1].