As Abraham Lincoln would know, limits are a very integral (no pun intended) part of calculus. Indeed, limits are the foundation of derivatives and integrals. But what is a limit? Given a function f(x), the limit of f(x) as x approaches c is equal to L if and only if for every positive value ε (epsilon) there is a positive number δ (delta) such that |f(x) - L| < ε whenever 0 < |x - c| < δ.

What does all of this mean? Consider, for example, the function f(x) = 2x + 1 and the limit as x approaches 0. Let ε = .001. Suppose we don't know what the limit is, so let's try L = 0. For L = 0 to be the limit, there must be some δ such that |f(x)| < .001 whenever 0 < |x| < δ. When x = -.50005, f(x) = -.0001, so |f(x)| < .001 at that point. But the definition demands the inequality for every x within δ of 0, and for x close to 0, f(x) is close to 1, so |f(x)| stays near 1 and can never drop below .001. No δ works, and L = 0 fails. The actual limit is L = 1: given any ε > 0, choosing δ = ε/2 guarantees |f(x) - 1| = |2x| = 2|x| < 2δ = ε whenever 0 < |x| < δ.
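To make the definition concrete, here is a minimal numerical sketch in Python (the function name check_candidate and the sampling scheme are illustrative assumptions, not anything standard). It samples points x with 0 < |x - c| < δ and checks whether |f(x) - L| < ε at each one. A failing sample disproves that particular choice of δ, while passing samples merely fail to disprove it: a finite check can suggest a limit, but only the algebra above proves it.

```python
def f(x):
    return 2 * x + 1

def check_candidate(L, c=0.0, eps=0.001, delta=0.0005, samples=1000):
    """Return True if |f(x) - L| < eps at every sampled x with 0 < |x - c| < delta."""
    for i in range(1, samples + 1):
        x = c + delta * i / (samples + 1)  # sample points approaching c from the right
        for pt in (x, 2 * c - x):          # mirror each sample to cover the left side
            if abs(f(pt) - L) >= eps:
                return False               # found an x within delta of c that violates the bound
    return True

print(check_candidate(L=0))  # False: for x near 0, f(x) is near 1, so |f(x) - 0| is about 1
print(check_candidate(L=1))  # True: with delta = eps/2, |f(x) - 1| = 2|x| < eps
```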