We can prove that a limit takes on a given value using the epsilon-delta definition, but doing so proves cumbersome, even for simple functions. What's worse, doing so requires that you know what the limit value should be before you start! Many times, that value is obvious if we have some familiarity with the graph of the function. However, at other times it may not be so clear. For example, you probably have not graphed the following function before. $$f(x)=x-\sqrt{x}+\frac{\sin^2(3x)}{2x^2}$$ Without any ability to find out what its graph looks like, could you still determine the value of the following limit? $$\lim_{x \rightarrow 0} \left[ x-\sqrt{x}+\frac{\sin^2(3x)}{2x^2} \right]$$ To address this concern, we can take a lesson from the manufacturing industry. In manufacturing, one takes simple materials and combines them to form something more complex.
Similarly, we can start with simple functions -- and if we understand how to find limits of these simple functions, as well as how the limits of combinations of functions relate to the limits and values of the original functions -- then we should be able to find limits of arbitrarily complex combinations of these simpler functions.
The function in the example above is a prime example -- it is "built" from simpler functions ($\sin x$, $3x$, $x^2$, $\sqrt{x}$, etc...) which are then combined through the operations of addition, subtraction, multiplication, division, and composition.
Of course, to find the limit of such a monster, we still need to know how to find limits of the simple functions involved, as well as how limits are affected by combining functions. Fortunately, we can use the epsilon-delta definition to establish both of these things -- results which are collectively referred to as the "Limit Laws".
To be specific...
Using the epsilon-delta definition of a limit, we can prove all of the results below -- each one of which involves the limit of a "simple function":
Limit of a Constant
$\displaystyle{\lim_{x \rightarrow c} \, a = a}$ (for any constant $a$)
Limit of the Identity Function
$\displaystyle{\lim_{x \rightarrow c} \, x = c}$
Limit of $n^{th}$ Root
$\displaystyle{\lim_{x \rightarrow c} \, \sqrt[n]{x} = \sqrt[n]{c}}$ (for any positive integer $n$, assuming $c>0$ when $n$ is even). Also, $\displaystyle{\lim_{x \rightarrow 0^+} \, \sqrt[n]{x} = 0}$ when $n$ is even.
Limit of Absolute Value
$\displaystyle{\lim_{x \rightarrow c} \, |x| = |c|}$
Limit of a Sine
$\displaystyle{\lim_{x \rightarrow c} \, \sin(x) = \sin(c)}$
Limit of a Cosine
$\displaystyle{\lim_{x \rightarrow c} \, \cos(x) = \cos(c)}$
Limit of a Natural Log
$\displaystyle{\lim_{x \rightarrow c} \, \ln(x) = \ln(c)}$ (for any $c \gt 0$ -- otherwise the limit fails to exist)
Limit of an Exponential
$\displaystyle{\lim_{x \rightarrow c} \, e^x = e^c}$
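For instance, these tell us immediately that $\displaystyle{\lim_{x \rightarrow \pi} \, \cos(x) = \cos(\pi) = -1}$, that $\displaystyle{\lim_{x \rightarrow 27} \, \sqrt[3]{x} = \sqrt[3]{27} = 3}$, and that $\displaystyle{\lim_{x \rightarrow 1} \, \ln(x) = \ln(1) = 0}$.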
As straightforward as the above appear, the epsilon-delta proofs of some of these results are not at all obvious (and sometimes, quite complicated). If you are curious about the details, you might try finding a calculus textbook and looking in the appendix.
We can also prove the following results regarding limits of "combinations of functions" using the epsilon-delta definition:
Assuming that $\lim_{x \rightarrow c} f(x)$ and $\lim_{x \rightarrow c} g(x)$ exist:
Limit of a Sum
$\displaystyle{\lim_{x \rightarrow c} \, [f(x) + g(x)] = \lim_{x \rightarrow c} f(x) + \lim_{x \rightarrow c} g(x)}$
Limit of a Difference
$\displaystyle{\lim_{x \rightarrow c} \, [f(x) - g(x)] = \lim_{x \rightarrow c} f(x) - \lim_{x \rightarrow c} g(x)}$
Limit of a Product
$\displaystyle{\lim_{x \rightarrow c} \, [f(x) \cdot g(x)] = \lim_{x \rightarrow c} f(x) \cdot \lim_{x \rightarrow c} g(x)}$
Limit of a Quotient
$\displaystyle{\lim_{x \rightarrow c} \, \frac{f(x)}{g(x)} = \frac{\displaystyle{\lim_{x \rightarrow c} \, f(x)}}{\displaystyle{\lim_{x \rightarrow c} \, g(x)}} \quad \textrm{, if } \lim_{x \rightarrow c} \, g(x) \neq 0}$
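For example, these rules (together with the limits of constants and the identity function) let us evaluate a limit one piece at a time. Note that the quotient rule applies below because the limit of the denominator is $2 \neq 0$: $$\lim_{x \rightarrow 3} \, \frac{x \cdot x + 2}{x - 1} = \frac{\displaystyle{\lim_{x \rightarrow 3} \, (x \cdot x) + \lim_{x \rightarrow 3} \, 2}}{\displaystyle{\lim_{x \rightarrow 3} \, x - \lim_{x \rightarrow 3} \, 1}} = \frac{3 \cdot 3 + 2}{3 - 1} = \frac{11}{2}$$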
Limit of a Continuous Composition
If $\displaystyle{\lim_{x \rightarrow c} \, g(x) = b}$ and $\displaystyle{\lim_{x \rightarrow b} \, f(x) = f(b)}$, then $\displaystyle{\lim_{x \rightarrow c} \, f(g(x)) = f(\lim_{x \rightarrow c} \, g(x))}$
If you are wondering about the name of this limit law -- we'll soon discuss continuity in depth -- but for now, know that if $\displaystyle{\lim_{x \rightarrow b} \, f(x) = f(b)}$, we say that $f$ is continuous at $b$.
Again, we'll have more to say about continuity later, but a function whose limit at every real value $c$ can be found by simply evaluating the function at $c$ is called continuous (as opposed to merely being continuous at some specific $b$).
With this in mind, the following is a very useful and immediate consequence of the last limit law introduced:
Limit of a Continuous Function
$\displaystyle{\textrm{If } \lim_{x \rightarrow c} \, f(x) = f(c) \textrm{ for all real values $c$, then $\lim_{x \rightarrow a} \, f(g(x)) = f(\lim_{x \rightarrow a} \, g(x))$, provided $\lim_{x \rightarrow a} \, g(x)$ exists}}$
Importantly, the above result gives us the capability to "pull out" a continuous function from inside a limit.
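For instance, since $\displaystyle{\lim_{x \rightarrow 25} \, \sqrt{x} = \sqrt{25}}$ (i.e., the square root function is continuous at $25$), the composition law lets us pull the limit inside the radical: $$\lim_{x \rightarrow 4} \, \sqrt{x^2 + 9} = \sqrt{\lim_{x \rightarrow 4} \, (x^2 + 9)} = \sqrt{25} = 5$$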
Other useful theorems can also be derived as direct consequences of the above limit laws. For example, combining the limit of a constant and the limit of a product, we find
Scalar Product Rule
$\displaystyle{\lim_{x \rightarrow c} \, [a f(x)] = a \lim_{x \rightarrow c} \, f(x)} \quad \textrm{(for any constant $a$)}$
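To see where this comes from, apply the limit of a product with a constant function as one of the factors: $$\lim_{x \rightarrow c} \, [a f(x)] = \left( \lim_{x \rightarrow c} \, a \right) \cdot \left( \lim_{x \rightarrow c} \, f(x) \right) = a \lim_{x \rightarrow c} \, f(x)$$ Applying the limit of a product repeatedly with $g = f$ similarly justifies the next result.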
Power Rule
$\lim_{x \rightarrow c} \, [f(x)]^n = \left[ \lim_{x \rightarrow c} f(x) \right]^n \quad (\textrm{for any positive integer } n)$
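For example, $$\lim_{x \rightarrow 2} \, (3x - 1)^4 = \left[ \lim_{x \rightarrow 2} \, (3x - 1) \right]^4 = 5^4 = 625$$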
A closely related result tells us how to find the limit of the $n^{th}$ root of an expression:
Root Rule
$\displaystyle{\lim_{x \rightarrow c} \sqrt[n]{f(x)} = \sqrt[n]{\lim_{x \rightarrow c} f(x)} \quad \textrm{(where } \lim_{x \rightarrow c} f(x) \ge 0 \textrm{ if } n \textrm{ is even)}}$
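Note that when $n$ is odd, no sign restriction is needed. For example, $$\lim_{x \rightarrow 1} \, \sqrt[3]{x - 9} = \sqrt[3]{\lim_{x \rightarrow 1} \, (x - 9)} = \sqrt[3]{-8} = -2$$ Putting several of the rules above together, we can evaluate the limit of any polynomial simply by substituting: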
Limit of a Polynomial
$\displaystyle{\lim_{x \rightarrow c} \, (a_n x^n + \cdots + a_2 x^2 + a_1 x + a_0) = a_n c^n + \cdots + a_2 c^2 + a_1 c + a_0}$
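For example, $$\lim_{x \rightarrow -1} \, (2x^3 - 4x + 5) = 2(-1)^3 - 4(-1) + 5 = 7$$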
A similar result holds for rational functions $q(x)$ (i.e., quotients of polynomial functions), by combining the limit of a polynomial with the limit of a quotient:
Limit of a Rational Function
$\displaystyle{\lim_{x \rightarrow c} \, q(x) = q(c) \quad \textrm{whenever $q(x)$ is a rational function whose denominator is not zero at $x=c$}}$
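For instance, since the denominator below is not zero at $x = 2$, $$\lim_{x \rightarrow 2} \, \frac{x^2 - 1}{x + 3} = \frac{2^2 - 1}{2 + 3} = \frac{3}{5}$$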
Recall that we can combine functions in ways other than addition, subtraction, multiplication, division, and composition in order to form new functions.
One very common way to do this is through piecewise-defined functions. When trying to find the limit of a piecewise-defined function -- especially when the limit is taken at a value on the border between "pieces" -- one must be careful to ensure the left and right limiting values (evaluated independently) agree. That is to say, we must remember that
$$\lim_{x \rightarrow c} \, f(x) = L \textrm{ if and only if } \lim_{x \rightarrow c^-} \, f(x) = \lim_{x \rightarrow c^+} \, f(x) = L$$
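As a quick (made-up) example, suppose $$f(x) = \begin{cases} x + 1 & \textrm{if } x < 2 \\ x^2 - 1 & \textrm{if } x \ge 2 \end{cases}$$ Approaching $x = 2$ from the left uses the first piece, while approaching from the right uses the second: $$\lim_{x \rightarrow 2^-} \, f(x) = \lim_{x \rightarrow 2^-} \, (x + 1) = 3 \quad \textrm{and} \quad \lim_{x \rightarrow 2^+} \, f(x) = \lim_{x \rightarrow 2^+} \, (x^2 - 1) = 3$$ Since these one-sided limits agree, $\displaystyle{\lim_{x \rightarrow 2} \, f(x) = 3}$.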
At other times, we may not be able to "build" the function related to the limit in question from simpler pieces, but we may be able to relate it to some other function, or "bound" it between two functions, whose limits are easier to find. To these ends, the following results prove very useful:
If $f(x)=g(x)$ everywhere except at $x=c$, then $\displaystyle{\lim_{x \rightarrow c} f(x) = \lim_{x \rightarrow c} g(x)}$
The Squeeze Theorem
If $f(x) \le g(x) \le h(x)$ when $x$ is sufficiently close to $c$ (except possibly at $x=c$), and
$$\lim_{x \rightarrow c} \, f(x) = \lim_{x \rightarrow c} \, h(x) = L$$
then
$$\lim_{x \rightarrow c} \, g(x) = L$$
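As a quick illustration, consider $\displaystyle{\lim_{x \rightarrow 0} \, x^2 \sin(1/x)}$. None of the earlier laws apply directly, since $\lim_{x \rightarrow 0} \, \sin(1/x)$ fails to exist. However, because $-1 \le \sin(1/x) \le 1$ for every $x \neq 0$, we have $$-x^2 \le x^2 \sin(1/x) \le x^2$$ and since $\displaystyle{\lim_{x \rightarrow 0} \, (-x^2) = \lim_{x \rightarrow 0} \, x^2 = 0}$, the Squeeze Theorem tells us that $\displaystyle{\lim_{x \rightarrow 0} \, x^2 \sin(1/x) = 0}$.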
We can add one more "simple function" limit to our list as a direct (but not necessarily obvious) result of the Squeeze Theorem mentioned above: $$\lim_{x \rightarrow 0} \, \frac{\sin x}{x} = 1$$ This may seem like a peculiar result, whose application would be fairly limited -- but amazingly, this will get used way more often than you might think! We'll prove it soon...
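In fact, this one limit (together with the laws above) is all we need to resolve the limit that opened this section. Since $$\frac{\sin^2(3x)}{2x^2} = \frac{9}{2} \left( \frac{\sin(3x)}{3x} \right)^2 \quad \textrm{for } x \neq 0$$ substituting $u = 3x$ (so that $u \rightarrow 0$ as $x \rightarrow 0$) gives $\displaystyle{\lim_{x \rightarrow 0} \, \frac{\sin^2(3x)}{2x^2} = \frac{9}{2} \cdot 1^2 = \frac{9}{2}}$. Then, approaching from the right (as the presence of $\sqrt{x}$ requires $x \ge 0$), we find $$\lim_{x \rightarrow 0^+} \left[ x - \sqrt{x} + \frac{\sin^2(3x)}{2x^2} \right] = 0 - 0 + \frac{9}{2} = \frac{9}{2}$$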