Is there more than one way to express a derivative as a limit of a quotient?

$\begingroup$

Let $r(t)$ be a real-valued function of $t$. Let $v(t)$ be the derivative of $r(t)$. Then $$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t)}{\Delta t}$$ so $$v(t) = \frac{dr(t)}{dt} \approx \frac{r(t + \Delta t) - r(t)}{\Delta t} \text{ for small }\Delta t$$

My question is, is there another way to approximate $v(t) = \dfrac{dr(t)}{dt}$?

For example, I am reading the book Understanding Molecular Simulation by Frenkel and Smit (Second Edition). On page 71 (some pages are available on Google Books here), the authors write $$v(t) = \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t} + \mathcal{O}(\Delta t^2)$$ or, in other words, $$v(t) \approx \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t}$$

Basically, then, it seems that there are two ways to express $v(t) = \dfrac{dr(t)}{dt}$:

$$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t)}{\Delta t} \textbf{ (1)}$$

$$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t} \textbf{ (2)}$$

Are equations (1) and (2) equivalent? Equation (1) is the definition of the derivative that I remember from high-school calculus; I do not remember (2). Is (2) an alternative definition of the derivative? Or, what is the relationship between (1) and (2)?

$\endgroup$

5 Answers

$\begingroup$

If the limit (1) exists, the limit (2) exists. To see this, write $$ \frac{r(t+h)-r(t-h)}{2h}=\frac12\left(\frac{r(t+h)-r(t)}{h}+\frac{r(t+(-h))-r(t)}{(-h)}\right). $$ The converse implication is false: otherwise every even function would be differentiable at $t=0$ with derivative zero, but $t\mapsto|t|$ is a counterexample.

Hence (2) is not a definition of the derivative. The definition of the derivative is (1). But as soon as (1) holds, the derivative coincides with the limit $$ \lim\limits_{h\to0}\frac{r(t+ah)-r(t-bh)}{(a+b)h}, $$ for every $(a,b)$ such that $a+b\ne0$, and in particular with the limit in (2).
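The one-sided implication above is easy to see numerically. Here is a short Python sketch (the helper names `forward_quotient` and `symmetric_quotient` are mine, purely for illustration) using the counterexample $f(t)=|t|$ at $t=0$:

```python
def forward_quotient(f, t, h):
    """One-sided difference quotient, as in definition (1)."""
    return (f(t + h) - f(t)) / h

def symmetric_quotient(f, t, h):
    """Symmetric difference quotient, as in definition (2)."""
    return (f(t + h) - f(t - h)) / (2 * h)

f = abs  # f(t) = |t|, not differentiable at t = 0
for h in (0.1, 0.01, 0.001):
    print(h,
          forward_quotient(f, 0.0, h),    # +1 from the right
          forward_quotient(f, 0.0, -h),   # -1 from the left
          symmetric_quotient(f, 0.0, h))  # exactly 0 for every h
# the one-sided quotients disagree (+1 vs -1), so limit (1) does not exist,
# yet the symmetric quotient is identically 0, so limit (2) exists
```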

$\endgroup$

$\begingroup$

There are many modified versions of the difference quotient in the definition of the derivative that have higher convergence rates and are therefore often preferable in numerical computation.

Finite difference coefficients can be used for this. For example, we have

$$f'(x_0)=\frac{f(x_0+h)-f(x_0)}{h}+O(h)$$

but

$$f'(x_0) = \frac{-\frac{11}{6}f(x_0) + 3f(x_0+h) - \frac{3}{2}f(x_0+2h) + \frac{1}{3}f(x_0+3h)}{h} + O\left(h^3\right)$$

Addendum: Here is a giant list of high-accuracy derivatives using these coefficients.
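As a sanity check on the stencil above, here is a minimal Python sketch (the function names are mine, chosen for illustration) comparing the first-order and the third-order one-sided formulas on $f=\exp$ at $x_0=0$, where $f'(0)=1$:

```python
import math

def forward_d1_order1(f, x0, h):
    """Ordinary forward difference: error O(h)."""
    return (f(x0 + h) - f(x0)) / h

def forward_d1_order3(f, x0, h):
    """Four-point one-sided stencil from the coefficient table: error O(h^3)."""
    return (-11/6 * f(x0) + 3 * f(x0 + h)
            - 3/2 * f(x0 + 2*h) + 1/3 * f(x0 + 3*h)) / h

for h in (0.1, 0.01):
    print(h,
          abs(forward_d1_order1(math.exp, 0.0, h) - 1.0),
          abs(forward_d1_order3(math.exp, 0.0, h) - 1.0))
# shrinking h by 10x cuts the first error ~10x, the second ~1000x
```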

$\endgroup$

$\begingroup$

As a complement to the other fine answers (did's, for example), let's suppose that $f$ admits a Taylor expansion at $t$. Then:

$$f(t+h)=f(t)+hf'(t)+\frac {h^2}2 f''(t)+\frac {h^3}6 f'''(t) +O\left(h^4\right)$$

so that: $$\tag{1}\frac{f(t+h)-f(t)}h=f'(t)+\frac {h}2 f''(t)+\frac {h^2}6 f'''(t) +O\left(h^3\right)$$ while: $$\tag{2}\frac{f(t+h)-f(t-h)}{2h}=f'(t)+\frac {h^2}6 f'''(t) +O\left(h^4\right)$$

That is why the second method is more precise: the $f''$ term disappeared (in fact, all the even-order terms do!), so $f'(t)$ is evaluated with an $O(h^2)$ error instead of $O(h)$. Because of this precision, Feynman proposed the second method for evaluating derivatives in his famous Physics Lectures (Vol. I, 9-6).
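The two error rates in (1) and (2) are easy to observe. A minimal Python sketch (variable names are mine) on $f=\sin$ at $t=1$, where $f'(1)=\cos 1$:

```python
import math

f, df, t = math.sin, math.cos(1.0), 1.0
for h in (1e-2, 1e-3, 1e-4):
    e1 = abs((f(t + h) - f(t)) / h - df)           # O(h): dominated by (h/2) f''(t)
    e2 = abs((f(t + h) - f(t - h)) / (2 * h) - df)  # O(h^2): the f'' term cancels
    print(f"{h:g}  {e1:.2e}  {e2:.2e}")
# each 10x reduction in h cuts e1 by ~10x but e2 by ~100x
```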

But beware: when $f$ has no derivative at $t$ (take $f(t)=\frac 1{t^2}$ at $t=0$, for example), the symmetric quotient can still have a limit ($0$ here, since the numerator vanishes identically), and that is clearly different from the first case.

The second method is also much used when you need a 'discretized' version of a differential equation (one respecting time symmetry, energy conservation, and so on). Ed Fredkin, for example, proposed the following equation in his article "Feynman, Barton and the reversible Schrödinger difference equation": $$\frac{C_{x,t+1}-C_{x,t-1}}2=ik\left(C_{x-1,t}-2C_{x,t}+C_{x+1,t}\right)$$

$\endgroup$

$\begingroup$

Consider the linear terms in the Taylor series of $r(t+h)$ and $r(t-h)$, where $h = \Delta t$ and $r'(t)=v(t)$, $$ r(t+h) \approx r(t) + r'(t)\,h $$ and $$ r(t-h) \approx r(t) - r'(t)\,h $$ Subtracting the second equation from the first, $$ 2h\, r'(t) = r(t+h) - r(t-h) \Rightarrow v(t) \approx \frac{r(t+h) - r(t-h)}{2h}\,, $$

which implies

$$ v(t) = \lim_{h \to 0} \frac{r(t+h) - r(t-h)}{2h} $$

$\endgroup$

$\begingroup$

Here is yet another equivalent definition: $f'(x) = \lim_{r \to 1} \frac{f(rx) - f(x)}{(r-1)x}$, if $x \ne 0$. The differentiation rule $\frac{d}{dx}x^n = n x^{n-1}$ is especially easy to derive with this definition.
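For instance, taking $f(x)=x^n$ in this definition, the power rule falls out of a finite geometric sum:

$$\frac{f(rx)-f(x)}{(r-1)x}=\frac{(rx)^n-x^n}{(r-1)x}=x^{n-1}\,\frac{r^n-1}{r-1}=x^{n-1}\left(r^{n-1}+r^{n-2}+\cdots+1\right)\xrightarrow[r\to1]{}\; n\,x^{n-1}$$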

$\endgroup$
