Definition of Derivative

The derivative is defined as $f’(a)=\lim\limits_{x \to a} \frac{f(x)-f(a)}{x-a}=\lim\limits_{h \to 0} \frac{f(a+h)-f(a)}{h}$.

Intuitively, the derivative measures the instantaneous slope of the graph at a point. So we can define the derivative at $x=a$ as the gradient of the line through the two points $(a,f(a))$ and $(b,f(b))$ as $b$ gets closer and closer to $a$, which is what the formula above describes.

If the limit $\lim\limits_{x \to a} \frac{f(x)-f(a)}{x-a}$ exists, then $f’(a)$ is well-defined and we say that $f$ is differentiable at $a$. A consequence of this definition is that if a function is only defined on $[l,r]$, it is not differentiable at $l$ or $r$, since the two-sided limit does not exist there.

Using the above definition, we can find the derivatives of functions. Doing so is known as Differentiation from First Principles.

Here are some examples:

$\frac{d}{dx} x=\lim\limits_{a \to 0} \frac{(x+a)-x}{a}=\lim\limits_{a \to 0} \frac{a}{a}=1$

$\frac{d}{dx} x^n=\lim\limits_{a \to 0} \frac{(x+a)^n-x^n}{a}=\lim\limits_{a \to 0} \frac{\binom n 1 x^{n-1}a + O(a^2)}{a}=\binom n 1 x^{n-1}=n x^{n-1}$, where $n$ is a positive integer

\[\begin{align} \frac{d}{dx} \sin(x)&=\lim\limits_{a \to 0} \frac{\sin(x+a)-\sin(x)}{a}=\lim\limits_{a \to 0} \frac{\sin((x+\frac a2)+\frac a 2)-\sin((x+\frac a2)-\frac a 2)}{a}\\\\ &=\lim\limits_{a \to 0} \frac{2 \cos(x+ \frac a2) \sin(\frac a2)}{a} =\lim\limits_{a \to 0} \cos(x+ \frac a2) \cdot \lim\limits_{a \to 0} \frac{\sin(\frac a2)}{\frac a2}=\lim\limits_{a \to 0} \cos(x+ \frac a2)=\cos(x) \end{align}\]

Estimating the Derivative from a Table of Values

When we need to estimate the derivative at a point from a few data points, it is recommended to use the central difference $f’(a) \approx \frac{f(a+h)-f(a-h)}{2h}$, unless $a$ is an endpoint, in which case we have no choice but to use the one-sided estimate $f’(a) \approx \frac{f(a+h)-f(a)}{h}$ (or $f’(a) \approx \frac{f(a)-f(a-h)}{h}$).
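
For example, suppose a table gives the (hypothetical) values $f(0.9)=1.62$, $f(1.0)=2.00$ and $f(1.1)=2.42$. Then with $h=0.1$, $f’(1) \approx \frac{f(1.1)-f(0.9)}{2(0.1)}=\frac{2.42-1.62}{0.2}=4$, while at the endpoint $x=0.9$ we are forced to use the one-sided estimate $f’(0.9) \approx \frac{f(1.0)-f(0.9)}{0.1}=3.8$.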

Differentiability Implies Continuity

If $f$ is differentiable at $a$, then $f$ is continuous at $a$.

Proof: Since $f’(a)$ is well-defined, $\lim\limits_{x \to a} \frac{f(x)-f(a)}{x-a}$ exists.

We wish to show that $\lim\limits_{x \to a} f(x)=f(a)$.

\[\begin{align} \lim\limits_{x \to a} f(x) &= \lim\limits_{x \to a} (f(x)-f(a)+f(a)), \text{since } f(a) \text{ is well-defined} \\\\ &= \lim\limits_{x \to a} \Big(\frac{f(x)-f(a)}{x-a} \cdot (x-a) \Big) + \lim\limits_{x \to a} f(a) \\\\ &= f'(a) \cdot 0 +f(a)\\\\ &=f(a)\end{align}\]

Common Derivatives

$\frac{d}{dx} c=0$, where $c$ is a constant
$\frac{d}{dx} x^n=nx^{n-1}$, $n \ne 0$
$\frac{d}{dx} \ln{x} =\frac 1x$
$\frac{d}{dx} e^x =e^x$
$\frac{d}{dx} \sin(x) =\cos (x)$
$\frac{d}{dx} \cos(x) = -\sin (x)$
$\frac{d}{dx} \arcsin(x) = \frac{1}{\sqrt{1-x^2}}$
$\frac{d}{dx} \arctan(x) = \frac{1}{1+x^2}$

Rules

$\frac{dy}{dx}=\frac{dy}{du} \cdot \frac{du}{dx}$ (Chain Rule)

$\frac{d}{dx}(uv)=u\frac{dv}{dx}+v\frac{du}{dx}$ (Product Rule)
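
For example, to differentiate $x^2 \sin(x)$ we can apply the Product Rule with $u=x^2$, $v=\sin(x)$, and to differentiate $\sin(x^2)$ we can apply the Chain Rule with $y=\sin(u)$, $u=x^2$:

$\frac{d}{dx}\big(x^2 \sin(x)\big)=x^2\cos(x)+2x\sin(x), \qquad \frac{d}{dx}\sin(x^2)=\cos(x^2) \cdot 2x$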

Derivative of Inverse Function

$(f^{-1})’(a)=\frac{1}{f’(f^{-1}(a))}, f’(f^{-1}(a)) \ne 0$

The derivative of $f^{-1}$ at the point $(a,f^{-1}(a))$ is related to the derivative of $f$ at the point $(f^{-1}(a),a)$ by a reciprocal relationship.
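
For example, taking $f(x)=e^x$ so that $f^{-1}(x)=\ln(x)$, the formula gives $(f^{-1})’(a)=\frac{1}{f’(f^{-1}(a))}=\frac{1}{e^{\ln(a)}}=\frac 1a$ for $a>0$, which agrees with $\frac{d}{dx} \ln{x}=\frac 1x$ from the Common Derivatives section above.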

Rolle’s Theorem

If $f$ is a function that is continuous on $[a,b]$, differentiable on the interval $(a,b)$, and satisfies $f(a)=f(b)$, then there exists $c \in (a,b)$ such that $f’(c)=0$.

Note that the differentiability condition is important here. Consider the function $f(x)=\lvert x \rvert$ on the interval $[-1,1]$. It is continuous on $[-1,1]$ and $f(-1)=f(1)$, but it is not differentiable at $0$, and indeed there is no point where its derivative is $0$.
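
As an example where the theorem does apply, take $f(x)=x^2-x$ on $[0,1]$: it is continuous on $[0,1]$, differentiable on $(0,1)$ and $f(0)=f(1)=0$, and indeed $f’(x)=2x-1=0$ at $c=\frac 12 \in (0,1)$.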

Mean Value Theorem

This follows from Rolle’s Theorem.

If $f$ is a function that is both continuous on $[a,b]$ and differentiable on the interval $(a,b)$, then there exists $c \in (a,b)$ such that $f’(c)=\frac{f(b)-f(a)}{b-a}$.

The intuitive proof using Rolle’s Theorem is to subtract a linear function from $f$ so that the resulting function takes equal values at $a$ and $b$, and then apply Rolle’s Theorem, as shown below.
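
Concretely, define $g(x)=f(x)-\frac{f(b)-f(a)}{b-a}(x-a)$. Then $g$ is continuous on $[a,b]$, differentiable on $(a,b)$ and $g(a)=g(b)=f(a)$, so Rolle’s Theorem gives some $c \in (a,b)$ with $g’(c)=0$, which rearranges to $f’(c)=\frac{f(b)-f(a)}{b-a}$.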

Increasing and Decreasing Functions

$f$ is non-decreasing on $I$ if $f(a) \leq f(b)$ for all $a,b \in I$ with $a<b$.
$f$ is increasing on $I$ if $f(a) < f(b)$ for all $a,b \in I$ with $a<b$.
$f$ is non-increasing on $I$ if $f(a) \geq f(b)$ for all $a,b \in I$ with $a<b$.
$f$ is decreasing on $I$ if $f(a) > f(b)$ for all $a,b \in I$ with $a<b$.
$f$ is constant on $I$ if $f(a) = f(b)$ for all $a,b \in I$ with $a<b$.

Interestingly, a function whose domain is a single point is vacuously both increasing and decreasing under these definitions.

Corollaries of Mean Value Theorem

If $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$:
(a) If $f’(x) \geq 0$ for all $x \in (a,b)$, then $f$ is non-decreasing on $[a,b]$.
(b) If $f’(x) >0$ for all $x \in (a,b)$, then $f$ is increasing on $[a,b]$.
(c) If $f’(x) \leq 0$ for all $x \in (a,b)$, then $f$ is non-increasing on $[a,b]$.
(d) If $f’(x) < 0$ for all $x \in (a,b)$, then $f$ is decreasing on $[a,b]$.
(e) If $f’(x) = 0$ for all $x \in (a,b)$, then $f$ is constant on $[a,b]$.

The converses of (a), (c) and (e) are true, but the converses of (b) and (d) are false.

Counterexample to the converse of (b)

$x^3$ is increasing on $[-1,1]$, but $\frac{d}{dx} x^3 \vert_{x=0}=0$, so it is not true that $f’(x)>0$ for all $x \in (-1,1)$.

Proof of (a)

Suppose, for contradiction, that $f$ is not non-decreasing on $[a,b]$; that is, there exist $c,d \in [a,b]$ such that $c<d$ and $f(c)>f(d)$.

By the Mean Value Theorem, there exists $x \in (c,d)$ such that $f’(x)=\frac{f(d)-f(c)}{d-c}<0$. However, by assumption $f’(x) \geq 0$ for all $x \in (a,b)$.

This is a contradiction. Therefore, (a) is true.

Proof of converse of (a) (not rigorous)

Suppose $f$ is non-decreasing on $[a,b]$ but there exists $x \in (a,b)$ such that $f’(x)<0$. This implies that $\lim\limits_{h \to x^+} \frac{f(h)-f(x)}{h-x}<0$.

Since $h-x>0$, this means that for a small interval $h \in (x,x+\epsilon]$, $f(h)-f(x)<0$, i.e. $f(x)>f(h)$. However, since $f$ is non-decreasing on $[a,b]$, we have $f(x) \leq f(h)$ whenever $x<h$.

This is a contradiction. Therefore, the converse of (a) is true.

Critical Points

Let $c$ be an interior point of the domain (not an endpoint, if the domain is closed). If $f’(c)=0$ or $f’(c)$ does not exist, then $(c,f(c))$ is a critical point.
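
For example, $f(x)=x^2$ has a critical point at $(0,0)$ since $f’(0)=0$, while $f(x)=\lvert x \rvert$ has a critical point at $(0,0)$ since $f’(0)$ does not exist.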

First Derivative Test

If $f$ is differentiable on a small open interval around $c$, except possibly at $c$ itself:
(i) If $f’$ changes sign from negative to positive at $c$, local minimum
(ii) If $f’$ changes sign from positive to negative at $c$, local maximum
(iii) Else, if $f’ \ne 0$, it is an inflection point
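
For example, $f(x)=x^2$ has $f’(x)=2x$, which changes sign from negative to positive at $x=0$, so $f$ has a local minimum at $0$.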

Second Derivative Test

Suppose $f’(c)=0$ and $f^{\prime\prime}(c)$ exists
(i) If $f^{\prime\prime}(c)>0$, local minimum
(ii) If $f^{\prime\prime}(c)<0$, local maximum
(iii) Else, inconclusive
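
For example, $f(x)=x^2$ has $f’(0)=0$ and $f^{\prime\prime}(0)=2>0$, so the test gives a local minimum at $0$. For the inconclusive case, both $x^3$ and $x^4$ have $f’(0)=0$ and $f^{\prime\prime}(0)=0$, yet $x^3$ has no local extremum at $0$ while $x^4$ has a local minimum there.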

Concavity

Suppose $f$ is differentiable on an open interval $I$.
(i) If $f’$ is increasing on $I$, concave upwards
(ii) If $f’$ is decreasing on $I$, concave downwards

Suppose $f$ is twice differentiable on an open interval $I$.
(i) If $f^{\prime\prime}>0$ on $I$, concave upwards
(ii) If $f^{\prime\prime}<0$ on $I$, concave downwards
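
For example, $e^x$ satisfies $\frac{d^2}{dx^2} e^x=e^x>0$ everywhere, so it is concave upwards on $\mathbb{R}$, while $\ln(x)$ satisfies $\frac{d^2}{dx^2} \ln(x)=-\frac{1}{x^2}<0$, so it is concave downwards on $(0,\infty)$.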

However, the converse is not true. $x^4$ is concave upwards on $(-1,1)$, but $\frac{d^2}{dx^2} x^4 \vert_{x=0}=0$.

Inflection Point

$(c,f(c))$ is an inflection point if $f$ is continuous at $c$ and the concavity of $f$ changes around $c$.

Note that $f^{\prime\prime}(c)=0$ does not imply an inflection point. $x^4$ has $\frac{d^2}{dx^2} x^4 \vert_{x=0}=0$, but it is concave upwards on all of $(-1,1)$, so the concavity does not change at $0$ and $(0,0)$ is not an inflection point.
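
On the other hand, $x^3$ does have an inflection point at $(0,0)$: $\frac{d^2}{dx^2} x^3=6x$ changes sign from negative to positive at $x=0$, so the concavity changes from downwards to upwards there.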

L’Hopital’s Rule

$\lim\limits_{x \to a} \frac{f(x)}{g(x)}=\lim\limits_{x \to a} \frac{f’(x)}{g’(x)}$ if the original limit is of the form $\frac{0}{0}$ or $\pm \frac{\infty}{\infty}$, provided the limit on the right exists (or is $\pm\infty$).

Other indeterminate forms such as $0 \cdot \infty$, $\infty - \infty$, $0^0$, $\infty^0$, $1^\infty$ can be turned into $\frac{0}{0}$ or $\pm \frac{\infty}{\infty}$ using algebraic manipulation in order to apply L’Hopital’s Rule.
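
For example, $\lim\limits_{x \to 0^+} x \ln(x)$ is of the form $0 \cdot \infty$. Rewriting it as $\lim\limits_{x \to 0^+} \frac{\ln(x)}{\frac 1x}$, which is of the form $\pm \frac{\infty}{\infty}$, L’Hopital’s Rule gives $\lim\limits_{x \to 0^+} \frac{\frac 1x}{-\frac{1}{x^2}}=\lim\limits_{x \to 0^+} (-x)=0$.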

Growth of Functions

Order from slowest to fastest growth (as $x \to \infty$): constant, logarithmic, polynomial, exponential, factorial.
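
For example, L’Hopital’s Rule gives $\lim\limits_{x \to \infty} \frac{\ln(x)}{x}=\lim\limits_{x \to \infty} \frac{\frac 1x}{1}=0$, so logarithmic growth is slower than polynomial growth, and $\lim\limits_{x \to \infty} \frac{x^n}{e^x}=0$ for any fixed positive integer $n$ (apply the rule $n$ times), so polynomial growth is slower than exponential growth.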

