19  Limits, issues, extensions of the concept

This section uses the following add-on packages:

using CalculusWithJulia
using Plots
using SymPy

The limit of a function at \(c\) need not exist for one of many different reasons. Some of these reasons can be handled with extensions to the concept of the limit, others are just problematic in terms of limits. This section covers examples of each.

Let’s begin with a function that is just problematic. Consider

\[ f(x) = \sin(1/x) \]

As this is a composition of nice functions it will have a limit everywhere except possibly when \(x=0\), as then \(1/x\) may not have a limit. So rather than talk about where it is nice, let’s consider the question of whether a limit exists at \(c=0\).

A graph shows the issue:
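A command along these lines produces it (the plotted interval is a choice; here \([-1, 1]\)):

f(x) = sin(1/x)
plot(f, -1, 1)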

The graph oscillates between \(-1\) and \(1\) infinitely many times on this interval - so many times that no matter how close one zooms in, the graph on the screen will fail to capture them all. Graphically, there is no single value of \(L\) that the function gets close to, as it varies between all the values in \([-1,1]\) as \(x\) gets close to \(0\). A simple proof that there is no limit is to take any \(\epsilon\) less than \(1\); then for any \(\delta > 0\), there are infinitely many \(x\) values where \(f(x)=1\) and infinitely many where \(f(x) = -1\). That is, when \(\epsilon < 1\) there is no \(L\) with \(|f(x) - L| < \epsilon\) for all \(x\) near \(0\).

This function simply has too many values it gets close to. Another favorite example of such a function is the function that is \(0\) if \(x\) is rational and \(1\) if not. This function will have no limit anywhere, not just at \(0\), for essentially the same reason as above.

The issue isn’t oscillation though. Take, for example, the function \(f(x) = x \cdot \sin(1/x)\). This function again has a limit everywhere save possibly \(0\). But in this case, the limit at \(0\) exists and is \(0\). This is because the following holds:

\[ -|x| \leq x \sin(1/x) \leq |x|. \]

The following figure illustrates:

f(x) = x * sin(1/x)
plot(f, -1, 1)          # the damped oscillation
plot!(abs)              # upper envelope, |x|
plot!(x -> -abs(x))     # lower envelope, -|x|

The squeeze theorem of calculus is the formal reason \(f\) has a limit at \(0\), as both the upper function, \(|x|\), and the lower function, \(-|x|\), have a limit of \(0\) at \(0\).

19.1 Right and left limits

Another example where \(f(x)\) has no limit is the function \(f(x) = x /|x|, x \neq 0\). This function is \(-1\) for negative \(x\) and \(1\) for positive \(x\). Again, this function will have a limit everywhere except possibly at \(x=0\), where division by \(0\) is possible.

Its graph is:

f(x) = abs(x)/x
plot(f, -2, 2)

The sharp jump at \(0\) is misleading - again, the plotting algorithm just connects the points and doesn’t handle what is a fundamental discontinuity well - the function is not defined at \(0\) and jumps from \(-1\) to \(1\) there. Similarly to our example of \(\sin(1/x)\), near \(0\) the function gets close to both \(1\) and \(-1\), so it will have no limit. (Again, just take \(\epsilon\) smaller than \(1\).)

But unlike the previous example, this function would have a limit if the definition didn’t consider values of \(x\) on both sides of \(c\). The limit on the right side would be \(1\), the limit on the left side would be \(-1\). This distinction is useful, so there is an extension of the idea of a limit to one-sided limits.

Let’s loosen up the language in the definition of a limit to read:

The limit of \(f(x)\) as \(x\) approaches \(c\) is \(L\) if for every neighborhood, \(V\), of \(L\) there is a neighborhood, \(U\), of \(c\) for which \(f(x)\) is in \(V\) for every \(x\) in \(U\), except possibly \(x=c\).

The \(\epsilon-\delta\) definition has \(V = (L-\epsilon, L + \epsilon)\) and \(U=(c-\delta, c+\delta)\). This is a rewriting of \(L-\epsilon < f(x) < L + \epsilon\) as \(|f(x) - L| < \epsilon\).

Now for the definition:

A function \(f(x)\) has a limit on the right of \(c\), written \(\lim_{x \rightarrow c+}f(x) = L\), if for every \(\epsilon > 0\) there exists a \(\delta > 0\) such that whenever \(0 < x - c < \delta\) it holds that \(|f(x) - L| < \epsilon\). That is, \(U\) is \((c, c+\delta)\).

Similarly, a limit on the left is defined where \(U=(c-\delta, c)\).

The SymPy function limit has a keyword argument dir="+" or dir="-" to request that a one-sided limit be computed. The default is dir="+". Passing dir="+-" will compute both one-sided limits and throw an error if the two are not equal, consistent with the limit not existing.

@syms x
(x,)
f(x) = abs(x)/x
limit(f(x), x=>0, dir="+"), limit(f(x), x=>0, dir="-")
(1, -1)
Warning

That is, SymPy’s limit may return an answer even when the mathematical limit does not exist, as by default it carries out only a one-sided limit. Explicitly passing dir="+-", or checking that both limit(ex, x=>c) and limit(ex, x=>c, dir="-") agree, is needed to confirm a limit exists mathematically.
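For instance, with \(f(x) = |x|/x\) the two one-sided limits disagree, so passing dir="+-" should raise an error (a minimal check, wrapped in try/catch):

try
    limit(abs(x)/x, x => 0, dir="+-")
catch err
    err        # SymPy errors out, as the left and right limits differ
end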

The relation between the two concepts is that a function has a limit at \(c\) if and only if the left and right limits exist and are equal. For this \(f\), both one-sided limits exist, but they are not equal.

There are other such functions that jump. Another useful one is the floor function, which just rounds down to the nearest integer. A graph shows the basic shape:

plot(floor, -5,5)

Again, the (nearly) vertical lines are an artifact of the graphing algorithm and not actual points that solve \(y=f(x)\). The floor function has a limit everywhere except at the integers, where the left and right limits differ.

Consider the limit at \(c=0\). If \(0 < x < 1/2\), say, then \(f(x) = 0\), as we round down, so the right limit will be \(0\). However, if \(-1/2 < x < 0\), then \(f(x) = -1\), again as we round down, so the left limit will be \(-1\). With this example both the left and right limits exist, but at the integer values they are not equal, as they differ by \(1\).
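SymPy can confirm these one-sided limits (here floor is applied to the symbolic x):

limit(floor(x), x => 0, dir="+"), limit(floor(x), x => 0, dir="-")    # expect (0, -1)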

Some functions only have one-sided limits, as they are not defined in an interval around \(c\). There are many examples, but we will take \(f(x) = x^x\) and consider \(c=0\). This function is not well defined for \(x < 0\), so it is typical to take the domain to be \(x > 0\). Still, it has a right limit \(\lim_{x \rightarrow 0+} x^x = 1\). SymPy can verify:

limit(x^x, x=>0, dir="+")
\[ 1 \]

This agrees with the IEEE convention of assigning 0^0 to be 1.

However, not all such functions with indeterminate forms of \(0^0\) will have a limit of \(1\).
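For instance, \(x^{1/\log(x)}\) also has the form \(0^0\) at \(0\), but as \(x^{1/\log(x)} = e^{\log(x)/\log(x)} = e\) for \(x > 0\), its right limit is \(e\), not \(1\):

limit(x^(1/log(x)), x => 0, dir="+")    # e, not 1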

Example

Consider this funny graph:

Describe the limits at \(-1\), \(0\), and \(1\).

  • At \(-1\) we see a jump, there is no limit but instead a left limit of 1 and a right limit appearing to be \(1/2\).
  • At \(0\) we see a limit of \(1\).
  • Finally, at \(1\) again there is a jump, so no limit. Instead the left limit is about \(-1\) and the right limit \(1\).

19.2 Limits at infinity

The loose definition of a horizontal asymptote is “a line such that the distance between the curve and the line approaches \(0\) as they tend to infinity.” This sounds like it should be defined by a limit. The issue is that the limit would be at \(\pm\infty\) and not some finite \(c\). This requires the idea of a neighborhood of \(c\), \(0 < |x-c| < \delta\), to be reworked.

The basic idea for a limit at \(+\infty\) is that for any \(\epsilon\), there exists an \(M\) such that when \(x > M\) it must be that \(|f(x) - L| < \epsilon\). For a horizontal asymptote, the line would be \(y=L\). Similarly a limit at \(-\infty\) can be defined with \(x < M\) being the condition.

Let’s consider some cases.

The function \(f(x) = \sin(x)\) will not have a limit at \(+\infty\) for exactly the same reason that \(f(x) = \sin(1/x)\) does not have a limit at \(c=0\) - it just oscillates between \(-1\) and \(1\) so never eventually gets close to a single value.

SymPy gives an odd answer here indicating the range of values:

limit(sin(x), x => oo)
\[ \left\langle -1, 1\right\rangle \]

(We used SymPy’s oo for \(\infty\), not Julia’s Inf.)


However, a damped oscillation, such as \(f(x) = e^{-x} \sin(x)\), will have a limit:

limit(exp(-x)*sin(x), x => oo)
\[ 0 \]

Rational functions have the expected limits at infinity. In this example the numerator and denominator have equal degree (\(m = n\)), so we get a horizontal asymptote that is not \(y=0\):

limit((x^2 - 2x +2)/(4x^2 + 3x - 2), x=>oo)
\[ \frac{1}{4} \]

Though rational functions can have at most one horizontal asymptote, this isn’t true for all functions. Consider \(f(x) = x / \sqrt{x^2 + 4}\). It has different limits depending on whether \(x\) goes to \(\infty\) or \(-\infty\):

f(x) = x / sqrt(x^2 + 4)
limit(f(x), x=>oo), limit(f(x), x=>-oo)
(1, -1)

(A simpler example showing this behavior is just the function \(x/|x|\) considered earlier.)

Example: Limits at infinity and right limits at \(0\)

Given a function \(f\), the question of whether this limit exists:

\[ \lim_{x \rightarrow \infty} f(x) \]

can be reduced to the question of whether this limit exists:

\[ \lim_{x \rightarrow 0+} f(1/x) \]

So whether \(\lim_{x \rightarrow 0+} \sin(1/x)\) exists is equivalent to whether \(\lim_{x\rightarrow \infty} \sin(x)\) exists, which clearly does not due to the oscillatory nature of \(\sin(x)\).

Similarly, one can make this reduction

\[ \lim_{x \rightarrow c+} f(x) = \lim_{x \rightarrow 0+} f(c + x) = \lim_{x \rightarrow \infty} f(c + \frac{1}{x}). \]

That is, right limits can be analyzed as limits at \(\infty\) or right limits at \(0\), should that prove more convenient.
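For instance, both of the following compute \(e\), the two being related by the substitution \(x \rightarrow 1/x\):

limit((1 + 1/x)^x, x => oo), limit((1 + x)^(1/x), x => 0, dir="+")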

19.3 Limits of infinity

Vertical asymptotes, like horizontal asymptotes, are described loosely by the graph getting close to some line. However, the formal definition won’t be the same. For a vertical asymptote, the value of \(f(x)\) heads towards positive or negative infinity, not some finite \(L\). As such, a neighborhood like \((L-\epsilon, L+\epsilon)\) no longer makes sense; rather, we replace it with an expression like \((M, \infty)\) or \((-\infty, M)\). As in: the limit of \(f(x)\) as \(x\) approaches \(c\) is infinity if for every \(M > 0\) there exists a \(\delta>0\) such that if \(0 < |x-c| < \delta\) then \(f(x) > M\). A limit of \(-\infty\) is defined similarly, with the condition \(f(x) < -M\) for every \(M>0\).

Examples

Consider the function \(f(x) = 1/x^2\). This will have a limit at every point except possibly \(0\), where division by \(0\) is possible. In this case, there is a vertical asymptote, as seen in the following graph. The limit at \(0\) is \(\infty\), in the extended sense above. For \(M>0\), we can take any \(0 < \delta < 1/\sqrt{M}\). The following graph shows \(M=25\) where the function values are outside of the box, as \(f(x) > M\) for those \(x\) values with \(0 < |x-0| < 1/\sqrt{M}\).
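The figure can be generated along these lines (a sketch; with \(M=25\), \(\delta = 1/\sqrt{M} = 1/5\)):

f(x) = 1/x^2
M = 25
delta = 1/sqrt(M)
plot(f, -1, 1; ylims=(0, 2M), legend=false)
hline!([M])                 # the line y = M
vline!([-delta, delta])     # for 0 < |x| < delta, f(x) > M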


The function \(f(x)=1/x\) requires us to talk about left and right limits of infinity, with the natural generalization. We can see that the left limit at \(0\) is \(-\infty\) and the right limit \(\infty\):
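A graph, drawn in two pieces to avoid evaluation at \(0\), illustrates:

plot(x -> 1/x, 1/100, 1)
plot!(x -> 1/x, -1, -1/100)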

SymPy agrees:

f(x) = 1/x
limit(f(x), x=>0, dir="-"), limit(f(x), x=>0, dir="+")
(-oo, oo)

Consider the function \(g(x) = x^x(1 + \log(x)), x > 0\). Does this have a right limit at \(0\)?

A quick graph suggests the limit is \(-\infty\):

g(x) = x^x * (1 + log(x))
plot(g, 1/100, 1)

We can check with SymPy:

limit(g(x), x=>0, dir="+")
\[ -\infty \]

19.4 Limits of sequences

After all this, we still can’t formalize the basic question asked in the introduction to limits: what is the area contained in a parabola. For that we developed a sequence of sums: \(s_n = 1/2 \cdot((1/4)^0 + (1/4)^1 + (1/4)^2 + \cdots + (1/4)^n)\). This isn’t a function of \(x\), but rather depends only on non-negative integer values of \(n\). However, the same idea as a limit at infinity can be used to define a limit.

Let \(a_0,a_1, a_2, \dots, a_n, \dots\) be a sequence of values indexed by \(n\). We have \(\lim_{n \rightarrow \infty} a_n = L\) if for every \(\epsilon > 0\) there exists an \(M>0\) where if \(n > M\) then \(|a_n - L| < \epsilon\).

In common language, the sequence converges when the limit exists, and otherwise it diverges.

The above is essentially the same as a limit at infinity for a function, but in this case the function’s domain is only the non-negative integers.

SymPy is happy to compute limits of sequences. Defining this one involving a sum is best done with the summation function:

@syms i::integer n::(integer, positive)
s(n) = 1//2 * summation((1//4)^i, (i, 0, n))    # rationals make for an exact answer
limit(s(n), n=>oo)
\[ \frac{2}{3} \]
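A quick numeric check tells the same story; here sfloat is a floating-point version of s, introduced for illustration:

sfloat(n) = 1/2 * sum((1/4)^i for i in 0:n)
[sfloat(n) for n in (1, 5, 10, 20)]     # values approach 2/3 ≈ 0.6667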
Example

The limit

\[ \lim_{x \rightarrow 0} \frac{e^x - 1}{x} = 1, \]

is an important limit. Using the definition of \(e^x\) by an infinite sequence:

\[ e^x = \lim_{n \rightarrow \infty} (1 + \frac{x}{n})^n, \]

we can establish the limit using the squeeze theorem. First,

\[ A = \left|(1 + \frac{x}{n})^n - 1 - x\right| = \left|\sum_{k=0}^n {n \choose k}\left(\frac{x}{n}\right)^k - 1 - x\right| = \left|\sum_{k=2}^n {n \choose k}\left(\frac{x}{n}\right)^k\right|, \]

the first two terms cancelling. The expansion above comes from the binomial theorem. Now \({n \choose k} \leq n^k\), so we have

\[ A \leq \sum_{k=2}^n |x|^k = |x|^2 \frac{1 - |x|^{n-1}}{1 - |x|} \leq \frac{|x|^2}{1 - |x|}, \]

using the geometric sum formula, valid here as \(x \approx 0\) gives \(|x| < 1\):

@syms x n i
summation(x^i, (i,0,n))
\[ \begin{cases} n + 1 & \text{for}\: x = 1 \\\frac{1 - x^{n + 1}}{1 - x} & \text{otherwise} \end{cases} \]

As this holds for all \(n\), as \(n\) goes to \(\infty\) we have:

\[ |e^x - 1 - x| \leq \frac{|x|^2}{1 - |x|} \]

Dividing both sides by \(|x|\) and noting that, as \(x \rightarrow 0\), \(|x|/(1-|x|)\) goes to \(0\) by continuity, the squeeze theorem gives the limit:

\[ \lim_{x \rightarrow 0} \left(\frac{e^x -1}{x} - 1\right) = 0. \]
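SymPy confirms the limit directly:

limit((exp(x) - 1)/x, x => 0)    # 1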

That \({n \choose k} \leq n^k\) can be seen as follows: the left side counts the number of combinations of \(k\) choices from \(n\) distinct items, which is no more than the number of permutations of \(k\) choices (ordered choices without replacement), which in turn is no more than \(n^k\), the number of ordered choices with replacement.

19.4.1 Some limit theorems for sequences

The limit discussion first defined limits of scalar univariate functions at a point \(c\) and then added generalizations. The pedagogical approach can be reversed by starting the discussion with limits of sequences and then generalizing from there. This approach relies on a few theorems to be gathered along the way that are mentioned here for the curious reader:

  • Convergent sequences are bounded.
  • All bounded monotone sequences converge.
  • Every bounded sequence has a convergent subsequence. (Bolzano-Weierstrass)
  • The limit of \(f\) at \(c\) exists and equals \(L\) if and only if for every sequence \(x_n\) in the domain of \(f\) converging to \(c\) the sequence \(s_n = f(x_n)\) converges to \(L\).

19.5 Summary

The following table captures the various changes to the definition of the limit to accommodate some of the possible behaviors.

| Type | Notation | \(V\) | \(U\) |
|:--|:--|:--|:--|
| limit | \(\lim_{x\rightarrow c}f(x) = L\) | \((L-\epsilon, L+\epsilon)\) | \((c - \delta, c+\delta)\) |
| right limit | \(\lim_{x\rightarrow c+}f(x) = L\) | \((L-\epsilon, L+\epsilon)\) | \((c, c+\delta)\) |
| left limit | \(\lim_{x\rightarrow c-}f(x) = L\) | \((L-\epsilon, L+\epsilon)\) | \((c - \delta, c)\) |
| limit at \(\infty\) | \(\lim_{x\rightarrow \infty}f(x) = L\) | \((L-\epsilon, L+\epsilon)\) | \((M, \infty)\) |
| limit at \(-\infty\) | \(\lim_{x\rightarrow -\infty}f(x) = L\) | \((L-\epsilon, L+\epsilon)\) | \((-\infty, M)\) |
| limit of \(\infty\) | \(\lim_{x\rightarrow c}f(x) = \infty\) | \((M, \infty)\) | \((c - \delta, c+\delta)\) |
| limit of \(-\infty\) | \(\lim_{x\rightarrow c}f(x) = -\infty\) | \((-\infty, M)\) | \((c - \delta, c+\delta)\) |
| limit of a sequence | \(\lim_{n \rightarrow \infty} a_n = L\) | \((L-\epsilon, L+\epsilon)\) | \((M, \infty)\) |

Ross summarizes this by enumerating the 15 different related definitions for \(\lim_{x \rightarrow a} f(x) = L\) that arise from \(L\) being either finite, \(-\infty\), or \(+\infty\) and \(a\) being any of \(c\), \(c-\), \(c+\), \(-\infty\), or \(+\infty\).

19.6 Rates of growth

Consider two functions \(f\) and \(g\) to be comparable if there are positive integers \(m\) and \(n\) with both

\[ \lim_{x \rightarrow \infty} \frac{f(x)^m}{g(x)} = \infty \quad\text{and } \lim_{x \rightarrow \infty} \frac{g(x)^n}{f(x)} = \infty. \]

The first says \(g\) is eventually bounded by a power of \(f\), the second that \(f\) is eventually bounded by a power of \(g\).

Here we consider which families of functions are comparable.

First consider \(f(x) = x^3\) and \(g(x) = x^4\). We can take \(m=2\) and \(n=1\) to verify \(f\) and \(g\) are comparable:

fx, gx = x^3, x^4
limit(fx^2/gx, x=>oo), limit(gx^1 / fx, x=>oo)
(oo, oo)

Similarly for any pair of powers, so we could conclude \(f(x) = x^n\) and \(g(x) =x^m\) are comparable. (However, as is easily observed, for \(m\) and \(n\) both positive integers \(\lim_{x \rightarrow \infty} x^{m+n}/x^m = \infty\) and \(\lim_{x \rightarrow \infty} x^{m}/x^{m+n} = 0\), consistent with our discussion on rational functions that higher-degree polynomials dominate lower-degree polynomials.)

Now consider \(f(x) = x\) and \(g(x) = \log(x)\). These are not comparable, as no \(n\) will be large enough. We might say \(x\) dominates \(\log(x)\).

limit(log(x)^n / x, x => oo)
\[ 0 \]

As \(x\) could be replaced by any monomial \(x^k\), we can say “powers” grow faster than “logarithms”.

Now consider \(f(x)=x\) and \(g(x) = e^x\). These are not comparable, as no \(m\) will be large enough:

@syms m::(positive, integer)
limit(x^m / exp(x), x => oo)
\[ 0 \]

That is, \(e^x\) grows faster than any power of \(x\).

Now, if \(a, b > 1\) then \(f(x) = a^x\) and \(g(x) = b^x\) will be comparable. Take \(m\) so that \(a^m > b\) and \(n\) so that \(b^n > a\) as then, say,

\[ \frac{(a^x)^m}{b^x} = \frac{a^{xm}}{b^x} = \frac{(a^m)^x}{b^x} = (\frac{a^m}{b})^x, \]

which will go to \(\infty\) as \(x \rightarrow \infty\) as \(a^m/b > 1\).
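To illustrate with \(a=2\) and \(b=3\): \(m=2\) works, as \(2^2 > 3\), and \(n=1\) works, as \(3^1 > 2\):

a, b = 2, 3
limit((a^x)^2 / b^x, x => oo), limit((b^x)^1 / a^x, x => oo)    # (oo, oo)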

Finally, consider \(f(x) = \exp(x^2)\) and \(g(x) = \exp(x)^2\). Are these comparable? No, as no \(n\) is large enough:

@syms x n::(positive, integer)
fx, gx = exp(x^2), exp(x)^2
limit(gx^n / fx, x => oo)
\[ 0 \]

A negative test for comparability is the following: if

\[ \lim_{x \rightarrow \infty} \frac{\log(|f(x)|)}{\log(|g(x)|)} = 0, \]

then \(f\) and \(g\) are not comparable (and \(g\) grows faster than \(f\)). Applying this test with \(\exp(x)^2\) in the role of \(f\) and \(\exp(x^2)\) in the role of \(g\), we have

\[ \lim_{x \rightarrow \infty}\frac{\log(\exp(x)^2)}{\log(\exp(x^2))} = \lim_{x \rightarrow \infty}\frac{2\log(\exp(x))}{x^2} = \lim_{x \rightarrow \infty}\frac{2x}{x^2} = 0, \]

so \(\exp(x^2)\) grows faster than \(\exp(x)^2\).
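SymPy can carry out the computation in the test:

limit(log(exp(x)^2) / log(exp(x^2)), x => oo)    # 0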


Keeping in mind that logarithms grow slower than powers, which grow slower than exponentials (with base \(a > 1\)), can help understand growth at \(\infty\), much as a comparison of leading terms does for rational functions.

We can immediately put this to use to compute \(\lim_{x\rightarrow 0+} x^x\). We first express this problem using \(x^x = (\exp(\ln(x)))^x = e^{x\ln(x)}\). Rewriting \(u(x) = \exp(\ln(u(x)))\), which only uses the basic inverse relation between the two functions, can often be a useful step.

As \(f(x) = e^x\) is a suitably nice function (continuous), the limit of the composition can be computed through the limit of the inside function, \(x\ln(x)\). So it is enough to see what \(\lim_{x\rightarrow 0+} x\ln(x)\) is. We re-express this as a limit at \(\infty\):

\[ \lim_{x\rightarrow 0+} x\ln(x) = \lim_{x \rightarrow \infty} (1/x)\ln(1/x) = \lim_{x \rightarrow \infty} \frac{-\ln(x)}{x} = 0 \]

The last equality follows, as the function \(x\) dominates the function \(\ln(x)\). So by the limit rule involving compositions we have: \(\lim_{x\rightarrow 0+} x^x = e^0 = 1\).
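The inner limit can be checked with SymPy:

limit(x * log(x), x => 0, dir="+")    # 0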

19.7 Questions

Question

Select the graph for which the limit at \(a\) is infinite.

Question

Select the graph for which the limit at \(\infty\) appears to be defined.

Question

Consider the function \(f(x) = \sqrt{x}\).

Does this function have a limit at every \(c > 0\)?


Does this function have a limit at \(c=0\)?


Does this function have a right limit at \(c=0\)?


Does this function have a left limit at \(c=0\)?

Question

Find \(\lim_{x \rightarrow \infty} \sin(x)/x\).


Question

Find \(\lim_{x \rightarrow \infty} (1-\cos(x))/x^2\).


Question

Find \(\lim_{x \rightarrow \infty} \log(x)/x\).


Question

Find \(\lim_{x \rightarrow 2+} (x-3)/(x-2)\).


Find \(\lim_{x \rightarrow -3-} (x-3)/(x+3)\).

Question

Let \(f(x) = \exp(x + \exp(-x^2))\) and \(g(x) = \exp(-x^2)\). Compute:

\[ \lim_{x \rightarrow \infty} \frac{\ln(f(x))}{\ln(g(x))}. \]


Question

Consider the following expression:

ex = 1/(exp(-x + exp(-x))) - exp(x)
\[ - e^{x} + e^{x - e^{- x}} \]

We want to find the limit, \(L\), as \(x \rightarrow \infty\), which we assume exists below.

We first rewrite ex using w as exp(-x):

@syms w
ex1 = ex(exp(-x) => w)
\[ - \frac{1}{w} + \frac{e^{- w}}{w} \]

As \(x \rightarrow \infty\), \(w \rightarrow 0+\), so the limit at \(0+\) of ex1 is of interest.

Use this fact, to find \(L\)

limit(ex1 - (w/2 - 1), w=>0)
\[ 0 \]

\(L\) is:


(This somewhat awkward approach generalizes: the limit of an expression as \(w \rightarrow 0\) is replaced by the limit of a polynomial in w that is easy to identify.)

Question

As mentioned, for limits that depend on specific values of parameters SymPy may have issues. As an example, SymPy has an issue with this limit, whose answer depends on the value of \(k\)

\[ \lim_{x \rightarrow 0+} \frac{\sin(\sin(x^2))}{x^k}. \]

Note that, regardless of \(k\), you find:

@syms x::real k::integer
limit(sin(sin(x^2))/x^k, x=>0)
LoadError: PyError ($(Expr(:escape, :(ccall(#= /Users/jverzani/.julia/packages/PyCall/1gn3u/src/pyfncall.jl:43 =# @pysym(:PyObject_Call), PyPtr, (PyPtr, PyPtr, PyPtr), o, pyargsptr, kw))))) <class 'NotImplementedError'>
NotImplementedError('Not sure of sign of 2 - k')
  File "/Users/jverzani/.julia/conda/3/x86_64/lib/python3.10/site-packages/sympy/series/limits.py", line 64, in limit
    return Limit(e, z, z0, dir).doit(deep=False)
  File "/Users/jverzani/.julia/conda/3/x86_64/lib/python3.10/site-packages/sympy/series/limits.py", line 363, in doit
    raise NotImplementedError("Not sure of sign of %s" % ex)

For which value(s) of \(k\) in \(1,2,3\) is this actually the correct answer? (Do the above \(3\) times using a specific value of k, not a symbolic one.)

Question: No limit

Some functions do not have a limit. Make a graph of \(\sin(1/x)\) from \(0.0001\) to \(1\) and look at the output. Why does a limit not exist?

Question \(0^0\) is not always \(1\)

Is the form \(0^0\) really indeterminate? As mentioned, 0^0 evaluates to 1.

Consider this limit:

\[ \lim_{x \rightarrow 0+} x^{k\cdot x} = L. \]

Consider different values of \(k\) to see if this limit depends on \(k\) or not. What is \(L\)?


Now, consider this limit:

\[ \lim_{x \rightarrow 0+} x^{1/\log_k(x)} = L. \]

In Julia, \(\log_k(x)\) is found with log(k, x). The default, log(x), takes \(k=e\), so it gives the natural log. So, for a given k, we would define h with

k = 10              # say. Replace with actual value
h(x) = x^(1/log(k, x))
h (generic function with 1 method)

Consider different values of \(k\) to see if the limit depends on \(k\) or not. What is \(L\)?

Question

Limits of infinity at infinity. We could define this concept quite easily by mashing together the two definitions. Suppose we did. Which of these ratios would have a limit of infinity at infinity?

\[ x^4/x^3,\quad x^{100+1}/x^{100}, \quad x/\log(x), \quad 3^x / 2^x, \quad e^x/x^{100} \]

Question

A slant asymptote is a line \(y = mx + b\) that the graph of \(f(x)\) gets close to as \(x\) gets large. We can’t express this directly as a limit, as “\(L\)” is not a number. How might we?

Question

Suppose a sequence of points \(x_n\) converges to \(a\) in the limiting sense. For a function \(f(x)\), the sequence of points \(f(x_n)\) may or may not converge. One alternative definition of a limit due to Heine is that \(\lim_{x \rightarrow a}f(x) = L\) if and only if all sequences \(x_n \rightarrow a\) have \(f(x_n) \rightarrow L\).

Consider the function \(f(x) = \sin(1/x)\), \(a=0\), and the two sequences implicitly defined by \(1/x_n = \pi/2 + n \cdot (2\pi)\) and \(1/y_n = 3\pi/2 + n \cdot(2\pi)\), \(n = 0, 1, 2, \dots\).

What is \(\lim_{n \rightarrow \infty} f(x_n)\)?


What is \(\lim_{n \rightarrow \infty} f(y_n)\)?


What does this show about the limit of \(f\) at \(0\)?