1. Suppose that \(f(x)\) is a polynomial of degree \(r\). Then \(f^{(n)}(x)\) is identically zero when \(n > r\), and the theorem leads to the algebraical identity \[f(a + h) = f(a) + hf'(a) + \frac{h^{2}}{2!} f''(a) + \dots + \frac{h^{r}}{r!} f^{(r)}(a).\]
2. By applying the theorem to \(f(x) = 1/x\), and supposing \(x\) and \(x + h\) positive, obtain the result \[\frac{1}{x + h} = \frac{1}{x} - \frac{h}{x^{2}} + \frac{h^{2}}{x^{3}} - \dots + \frac{(-1)^{n-1} h^{n-1}}{x^{n}} + \frac{(-1)^{n} h^{n}}{(x + \theta_{n} h)^{n+1}}.\]
[Since \[\frac{1}{x + h} = \frac{1}{x} - \frac{h}{x^{2}} + \frac{h^{2}}{x^{3}} - \dots + \frac{(-1)^{n-1} h^{n-1}}{x^{n}} + \frac{(-1)^{n} h^{n}}{x^{n}(x + h)},\] we can verify the result by showing that \(x^{n}(x + h)\) can be put in the form \((x + \theta_{n}h)^{n+1}\), or that \(x^{n+1} < x^{n}(x + h) < (x + h)^{n+1}\), as is evidently the case.]
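Both the identity and the inequality may be checked numerically; the following Python sketch does so (the particular values of \(x\), \(h\), \(n\) are arbitrary choices, not part of the exercise):

```python
# Numerical sketch for Ex. 2; the values x, h, n are arbitrary choices.
x, h, n = 2.0, 0.5, 4

# Partial sum 1/x - h/x^2 + ... + (-1)^(n-1) h^(n-1)/x^n
s = sum((-1) ** k * h ** k / x ** (k + 1) for k in range(n))
# Remainder written with denominator x^n (x + h)
r = (-1) ** n * h ** n / (x ** n * (x + h))

# The partial sum plus remainder reproduces 1/(x + h) exactly.
assert abs(s + r - 1 / (x + h)) < 1e-12

# x^n (x + h) lies strictly between x^(n+1) and (x + h)^(n+1),
# so it equals (x + theta_n h)^(n+1) for some 0 < theta_n < 1.
assert x ** (n + 1) < x ** n * (x + h) < (x + h) ** (n + 1)
```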
3. Obtain the formula \[\begin{gathered} \sin(x + h) = \sin x + h\cos x - \frac{h^{2}}{2!}\sin x - \frac{h^{3}}{3!}\cos x + \dots\\ + (-1)^{n-1}\frac{h^{2n-1}}{(2n - 1)!}\cos x + (-1)^{n}\frac{h^{2n}}{(2n)!}\sin(x + \theta_{2n} h),\end{gathered}\] the corresponding formula for \(\cos(x + h)\), and similar formulae involving powers of \(h\) extending up to \(h^{2n+1}\).
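The size of the remainder may be checked numerically; in this Python sketch (with arbitrary values of \(x\), \(h\), \(n\)) the truncated series differs from \(\sin(x + h)\) by at most \(h^{2n}/(2n)!\), as the form of the remainder requires:

```python
import math

# Numerical sketch for Ex. 3; x, h, n are arbitrary choices.
x, h, n = 1.0, 0.3, 3

# Terms h^k/k! * sin^{(k)}(x), using sin^{(k)}(x) = sin(x + k*pi/2).
s = sum(h ** k / math.factorial(k) * math.sin(x + k * math.pi / 2)
        for k in range(2 * n))

# The error is the remainder (-1)^n h^(2n)/(2n)! sin(x + theta h),
# hence at most h^(2n)/(2n)! in absolute value.
assert abs(math.sin(x + h) - s) <= h ** (2 * n) / math.factorial(2 * n)
```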
4. Show that if \(m\) is a positive integer, and \(n\) a positive integer not greater than \(m\), then \[(x + h)^{m} = x^{m} + \binom{m}{1}x^{m-1} h + \dots + \binom{m}{n – 1}x^{m-n+1} h^{n-1} + \binom{m}{n}(x + \theta_{n} h)^{m-n} h^{n}.\] Show also that, if the interval \({[x, x + h]}\) does not include \(x = 0\), the formula holds for all real values of \(m\) and all positive integral values of \(n\); and that, even if \(x < 0 < x + h\) or \(x + h < 0 < x\), the formula still holds if \(m – n\) is positive.
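For a non-integral exponent the remainder term can be exhibited numerically. This Python sketch (the values \(m = 1/2\), \(n = 2\), \(x = 4\), \(h = 1\) are arbitrary) solves the formula for \(\theta_{n}\) and confirms that it lies between \(0\) and \(1\):

```python
# Numerical sketch for Ex. 4; m, n, x, h are arbitrary choices with
# m non-integral and the interval [x, x + h] away from 0.
m, n = 0.5, 2
x, h = 4.0, 1.0

c1 = m                    # binomial coefficient (m choose 1)
c2 = m * (m - 1) / 2      # binomial coefficient (m choose 2)

# Gap left by the first n terms, to be accounted for by the remainder
# (m choose n) (x + theta_n h)^(m - n) h^n.
gap = (x + h) ** m - (x ** m + c1 * x ** (m - 1) * h)

# Solve (x + theta_n h)^(m - n) = gap / (c2 h^n) for theta_n.
theta_n = ((gap / (c2 * h ** n)) ** (1 / (m - n)) - x) / h
assert 0 < theta_n < 1
```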
5. The formula \(f(x + h) = f(x) + hf'(x + \theta_{1}h)\) is not true if \(f(x) = 1/x\) and \(x < 0 < x + h\). [For \(f(x + h) – f(x) > 0\) and \(hf'(x + \theta_{1} h) = -h/(x + \theta_{1} h)^{2} < 0\); it is evident that the conditions for the truth of the Mean Value Theorem are not satisfied.]
6. If \(x = -a\), \(h = 2a\), \(f(x) = x^{1/3}\), then the equation \[f(x + h) = f(x) + hf'(x + \theta_{1} h)\] is satisfied by \(\theta_{1} = \frac{1}{2} \pm\frac{1}{18}\sqrt{3}\). [This example shows that the result of the theorem may hold even if the conditions under which it was proved are not satisfied.]
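The stated values of \(\theta_{1}\) can be confirmed numerically; a Python sketch with \(a = 1\) (any positive \(a\) would serve), taking real cube roots for negative arguments:

```python
import math

# Numerical sketch for Ex. 6 with a = 1, so x = -1, h = 2.
def f(t):
    # Real cube root, defined for negative t as well.
    return math.copysign(abs(t) ** (1 / 3), t)

def fprime(t):
    # f'(t) = (1/3) t^(-2/3); note t^(-2/3) = |t|^(-2/3) is positive.
    return (1 / 3) * abs(t) ** (-2 / 3)

x, h = -1.0, 2.0
lhs = f(x + h) - f(x)  # = 2

# Both roots theta_1 = 1/2 +- sqrt(3)/18 satisfy the mean value equation.
for theta in (0.5 + math.sqrt(3) / 18, 0.5 - math.sqrt(3) / 18):
    assert abs(lhs - h * fprime(x + theta * h)) < 1e-12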
7. Newton’s method of approximation to the roots of equations. Let \(\xi\) be an approximation to a root of an algebraical equation \(f(x) = 0\), the actual root being \(\xi + h\). Then \[0 = f(\xi + h) = f(\xi) + hf'(\xi) + \tfrac{1}{2} h^{2}f''(\xi + \theta_{2}h),\] so that \[h = -\frac{f(\xi)}{f'(\xi)} - \tfrac{1}{2} h^{2} \frac{f''(\xi + \theta_{2}h)}{f'(\xi)}.\]
It follows that in general a better approximation than \(x = \xi\) is \[x = \xi – \frac{f(\xi)}{f'(\xi)}.\] If the root is a simple root, so that \(f'(\xi + h) \neq 0\), we can, when \(h\) is small enough, find a positive constant \(K\) such that \(|f'(x)| > K\) for all the values of \(x\) which we are considering, and then, if \(h\) is regarded as of the first order of smallness, \(f(\xi)\) is of the first order of smallness, and the error in taking \(\xi – \{f(\xi)/f'(\xi)\}\) as the root is of the second order.
8. Apply this process to the equation \(x^{2} = 2\), taking \(\xi = 3/2\) as the first approximation. [We find \(h = -1/12\), \(\xi + h = 17/12 = 1.417\dots\), which is quite a good approximation, in spite of the roughness of the first. If now we repeat the process, taking \(\xi = 17/12\), we obtain \(\xi + h = 577/408 = 1.414\ 215\dots\), which is correct to \(5\) places of decimals.]
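The arithmetic of this example can be reproduced exactly with rational arithmetic in Python:

```python
from fractions import Fraction

# Ex. 8 in exact arithmetic: Newton's step for f(x) = x^2 - 2,
# namely xi -> xi - f(xi)/f'(xi).
def newton_step(xi):
    return xi - (xi * xi - 2) / (2 * xi)

xi = newton_step(Fraction(3, 2))
assert xi == Fraction(17, 12)

xi = newton_step(xi)
assert xi == Fraction(577, 408)

# 577/408 agrees with sqrt(2) to five places of decimals.
assert abs(float(xi) - 2 ** 0.5) < 5e-6
```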
9. By considering in this way the equation \(x^{2} – 1 – y = 0\), where \(y\) is small, show that \(\sqrt{1 + y} = 1 + \frac{1}{2} y – \{\frac{1}{4}y^{2}/(2 + y)\}\) approximately, the error being of the fourth order.
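The order of the error can be checked numerically (a Python sketch; the sample values of \(y\) are arbitrary):

```python
import math

# Numerical sketch for Ex. 9: the error of the approximation
# 1 + y/2 - (y^2/4)/(2 + y) to sqrt(1 + y) is of the fourth order
# (in fact about y^4/128 for small y).
for y in (0.1, 0.05, 0.025):
    approx = 1 + y / 2 - (y ** 2 / 4) / (2 + y)
    err = abs(math.sqrt(1 + y) - approx)
    assert err < y ** 4
```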
10. Show that the error in taking the root to be \(\xi - (f/f') - \frac{1}{2}(f^{2}f''/f'^{3})\), where \(\xi\) is the argument of every function, is in general of the third order.
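Taking \(f(x) = x^{2} - 2\) and \(\xi = 3/2\) as in Ex. 8, the orders of the two approximations can be compared numerically (a Python sketch):

```python
import math

# Numerical sketch for Ex. 10 with f(x) = x^2 - 2, xi = 3/2.
xi = 1.5
f, fp, fpp = xi ** 2 - 2, 2 * xi, 2.0
root = math.sqrt(2)
h = abs(root - xi)

second = xi - f / fp                                # error of second order
third = xi - f / fp - 0.5 * f ** 2 * fpp / fp ** 3  # error of third order

assert abs(second - root) < h ** 2
assert abs(third - root) < h ** 3
```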
11. The equation \(\sin x = \alpha x\), where \(\alpha\) is small, has a root nearly equal to \(\pi\). Show that \((1 - \alpha)\pi\) is a better approximation, and \((1 - \alpha + \alpha^{2})\pi\) a better still. [The method of Exs. 7–10 does not depend on \(f(x) = 0\) being an algebraical equation, so long as \(f'\) and \(f''\) are continuous.]
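A Python sketch (with the arbitrary value \(\alpha = 1/10\)) showing that the residual \(\sin x - \alpha x\) shrinks at each stage:

```python
import math

# Numerical sketch for Ex. 11 with alpha = 0.1: the residual of
# sin x = alpha x at the successive approximations
# pi, (1 - alpha) pi, (1 - alpha + alpha^2) pi decreases each time.
alpha = 0.1

def residual(x):
    return abs(math.sin(x) - alpha * x)

r = [residual(c * math.pi) for c in (1, 1 - alpha, 1 - alpha + alpha ** 2)]
assert r[0] > r[1] > r[2]
```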
12. Show that the limit when \(h \to 0\) of the number \(\theta_{n}\) which occurs in the general Mean Value Theorem is \(1/(n + 1)\), provided that \(f^{(n+1)}(x)\) is continuous.
[For \(f(x + h)\) is equal to each of \[f(x) + \dots + \frac{h^{n}}{n!} f^{(n)}(x + \theta_{n}h),\quad f(x) + \dots + \frac{h^{n}}{n!} f^{(n)}(x) + \frac{h^{n+1}}{(n + 1)!} f^{(n+1)}(x + \theta_{n+1}h),\] where \(\theta_{n+1}\) as well as \(\theta_{n}\) lies between \(0\) and \(1\). Hence \[f^{(n)}(x + \theta_{n}h) = f^{(n)}(x) + \frac{hf^{(n+1)}(x + \theta_{n+1}h)}{n + 1}.\] But if we apply the original Mean Value Theorem to the function \(f^{(n)}(x)\), taking \(\theta_{n}h\) in place of \(h\), we find \[f^{(n)}(x + \theta_{n}h) = f^{(n)}(x) + \theta_{n}hf^{(n+1)}(x + \theta\theta_{n}h),\] where \(\theta\) also lies between \(0\) and \(1\). Hence \[\theta_{n} f^{(n+1)}(x + \theta\theta_{n} h) = \frac{f^{(n+1)}(x + \theta_{n+1} h)}{n + 1},\] from which the result follows, since \(f^{(n+1)}(x + \theta\theta_{n} h)\) and \(f^{(n+1)}(x + \theta_{n+1} h)\) tend to the same limit \(f^{(n+1)}(x)\) as \(h \to 0\).]
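The limit can be exhibited numerically for \(n = 1\): with \(f(x) = e^{x}\) and \(x = 0\), the equation \(e^{h} = 1 + he^{\theta_{1}h}\) can be solved for \(\theta_{1}\) explicitly (a Python sketch; the sample values of \(h\) are arbitrary):

```python
import math

# Numerical sketch for Ex. 12 with n = 1, f = exp, x = 0:
# e^h = 1 + h e^(theta_1 h) gives theta_1 = log((e^h - 1)/h) / h,
# which tends to 1/(n + 1) = 1/2 as h -> 0.
def theta1(h):
    return math.log(math.expm1(h) / h) / h

for h in (0.1, 0.01, 0.001):
    assert abs(theta1(h) - 0.5) < h  # in fact theta_1 = 1/2 + h/24 + ...
```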
13. Prove that \(\{f(x + 2h) – 2f(x + h) + f(x)\}/h^{2} \to f”(x)\) as \(h \to 0\), provided that \(f”(x)\) is continuous. [Use equation (2) of § 147.]
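A Python sketch of this limit with \(f = \sin\), \(x = 1\) (arbitrary choices):

```python
import math

# Numerical sketch for Ex. 13 with f = sin, x = 1:
# the second difference quotient tends to f''(1) = -sin 1.
f, x = math.sin, 1.0

def dq2(h):
    return (f(x + 2 * h) - 2 * f(x + h) + f(x)) / h ** 2

for h in (1e-2, 1e-3):
    # The error is about h * |f'''(x)| = h cos 1 for small h.
    assert abs(dq2(h) - (-math.sin(x))) < 2 * h
```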
14. Show that, if \(f^{(n)}(x)\) is continuous for \(x = 0\), then \[f(x) = a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n},\] where \(a_{r} = f^{(r)}(0)/r!\) and \(\epsilon_{x} \to 0\) as \(x \to 0\).
15. Show that if \[a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n} = b_{0} + b_{1}x + b_{2}x^{2} + \dots + (b_{n} + \eta_{x}) x^{n},\] where \(\epsilon_{x}\) and \(\eta_{x}\) tend to zero as \(x \to 0\), then \(a_{0} = b_{0}\), \(a_{1} = b_{1}\), …, \(a_{n} = b_{n}\). [Making \(x \to 0\) we see that \(a_{0} = b_{0}\). Now divide by \(x\) and afterwards make \(x \to 0\). We thus obtain \(a_{1} = b_{1}\); and this process may be repeated as often as is necessary. It follows that if \(f(x) = a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n}\), and the first \(n\) derivatives of \(f(x)\) are continuous, then \(a_{r} = f^{(r)}(0)/r!\).]