I think it's just a perspective shift. The main idea is that you can't ever measure a real number, only an approximation to one, so if two values differ by less than the resolution of your measurement they are effectively the same. For example, consider the Taylor expansion f(x+dx) = f(x) + f'(x) dx + O(dx^2). The analysis version of the derivative says that in the limit dx -> 0 the O(dx^2) part vanishes, and so the limit of [f(x+dx)-f(x)]/dx is f'(x). The 'finitist' version would be something like: divide through by dx and the remainder is of order dx, so pick a value of dx small enough that this term falls below your 'resolution', and then f'(x) is indistinguishable from [f(x+dx)-f(x)]/dx, without any reference to the concept of a limit.
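The finitist reading above can be sketched numerically. This is a minimal illustration, not anyone's formal system: `finitist_derivative` is a hypothetical helper that just evaluates the difference quotient with a dx chosen small enough that the O(dx) remainder sits below a stated resolution (assuming the second derivative is of modest size).

```python
import math

def finitist_derivative(f, x, resolution):
    """Difference quotient [f(x+dx)-f(x)]/dx with dx chosen so the
    O(dx) remainder is below the given resolution (assumes |f''| ~ 1)."""
    dx = resolution  # any dx with (|f''(x)|/2)*dx < resolution would do
    return (f(x + dx) - f(x)) / dx

# The quotient is indistinguishable from cos(1) at this resolution,
# with no limit taken anywhere.
approx = finitist_derivative(math.sin, 1.0, 1e-6)
exact = math.cos(1.0)
print(abs(approx - exact) < 1e-5)
```

The point is that `dx` is a fixed, finite number throughout: once it is below the resolution, shrinking it further changes nothing you could measure.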