The Taylor series for f(x+a) is given by

f(x+a) = f(a) + f'(a)x + f''(a)x^2/2! + ...
Now consider two continuous and differentiable functions g(x) and h(x):
g(x) = f(x)   for x <= a
     = g1(x)  for x in (a, inf)

h(x) = f(x)   for x <= a
     = h1(x)  for x in (a, inf)
where h1 and g1 are not the same.
Now my question is:
The Taylor series expansion seems to be the same for both g(x+a) and h(x+a), as they depend only on the derivative of f(x) at the point a. Why should they be the same?
I think I am making some fundamental mistake. Can anyone help me out?
Also, what is the difference between continuous and differentiable?
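For concreteness, here is a rough sketch of the kind of pair I have in mind (Python; the choices of f, g1, h1, and a are just hypothetical examples):

import math

a = 0.0  # the splitting point (a hypothetical choice)

def f(x):
    return math.sin(x)

def g(x):
    # agrees with f on (-inf, a], then follows a different rule (stand-in for g1)
    return f(x) if x <= a else f(x) + x**3

def h(x):
    # also agrees with f on (-inf, a], but continues differently (stand-in for h1)
    return f(x) if x <= a else f(x) + x**4

print(g(-1.0), h(-1.0))  # identical: both equal f(-1)
print(g(1.0), h(1.0))    # different beyond a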
>The Taylor series for f(x+a) is given by
>
>f(x+a) = f(a) + f'(a)x + f''(a)x^2/2! + ...
>
>
>Now consider two continuous and differentiable functions g(x)
>and h(x):
>
>g(x) = f(x)   for x <= a
>     = g1(x)  for x in (a, inf)
>
>
>h(x) = f(x)   for x <= a
>     = h1(x)  for x in (a, inf)
>
>where h1 and g1 are not the same.
>
>Now my question is:
>
>The Taylor series expansion seems to be the same for both
>g(x+a) and h(x+a), as they depend only on the derivative of
>f(x) at the point a. Why should they be the same?
First, the Taylor series depends on infinitely many derivatives, not just one. Second, there are two issues (and this is assuming f is infinitely differentiable at a):
1. Does a Taylor series converge?
2. If it does, where?
For one, a Taylor series may have radius of convergence 0. Just think of a function whose derivatives grow sufficiently fast.
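A radius of convergence of exactly 0 is hard to show numerically, but here is a rough sketch of the convergence issue in general, using 1/(1+x^2) (my own choice of example): its Maclaurin series has radius of convergence 1, so the partial sums blow up at x = 2 even though the function itself is smooth everywhere.

# Partial sums of the Maclaurin series of 1/(1+x^2) = 1 - x^2 + x^4 - ...
# evaluated at x = 2, outside the radius of convergence (which is 1)
x = 2.0
s = 0.0
for n in range(8):
    s += (-1) ** n * x ** (2 * n)
    print(n, s)

print("actual value:", 1.0 / (1.0 + x ** 2))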
Two, even when a Taylor series converges, it may not converge to the originating function. There is a classic example:
f(x) = exp(-1/x^2) for x ≠ 0, and f(0) = 0.
As you can verify, all the derivatives of f at 0 are 0, so its Maclaurin series is identically 0, while the function itself is obviously not.
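If you want to see the mechanism numerically, here is a small sketch (my own illustration, not part of the argument above): exp(-1/x^2) vanishes faster than any power of x as x -> 0, which is what forces every derivative at 0 to be 0.

import math

def f(x):
    # the classic smooth-but-not-analytic function
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# f shrinks faster than any power of x near 0; these ratios go to 0,
# which is what makes every derivative of f at 0 equal to 0
for x in (0.5, 0.2, 0.1, 0.05):
    print(x, f(x), f(x) / x**5)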
A function that is the limit of its Taylor series near the expansion point is called analytic, which is a more demanding property than infinite differentiability.
>Also, what is the difference between continuous and differentiable?
Every differentiable function is continuous, but not conversely: the function f(x) = |x| is continuous everywhere, but is not differentiable at 0.
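A quick numerical way to see it: the one-sided difference quotients of |x| at 0 approach +1 from the right and -1 from the left, so there is no single slope at 0 even though the function is continuous there. A rough sketch:

def f(x):
    return abs(x)

# one-sided difference quotients of |x| at 0:
# from the right they are +1, from the left they are -1,
# so no single derivative exists at 0 even though f is continuous
for h in (0.1, 0.01, 0.001):
    right = (f(0.0 + h) - f(0.0)) / h
    left = (f(0.0 - h) - f(0.0)) / (-h)
    print(h, right, left)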