
CTK Exchange

Subject: "is taylor series always accurate?"     Previous Topic | Next Topic
Printer-friendly copy     Email this topic to a friend    
Conferences The CTK Exchange College math Topic #581
Reading Topic #581
calculus
guest
Jul-20-06, 03:17 PM (EST)
 
"is taylor series always accurate?"
 
   Taylor series for f(x+a) is given by

f(x+a) = f(a) + f'(a)x + f''(a)x^2/2! + f'''(a)x^3/3! + ...
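
A quick numerical check of this expansion (a minimal Python sketch; taking f = exp and a = 1 is purely an illustrative assumption, convenient because every derivative of exp at a equals exp(a)):

    # Partial sum of the Taylor series of f(x+a), specialized to f = exp, a = 1.
    import math

    def taylor_exp(x, a, terms=10):
        # For f = exp, f^(n)(a) = exp(a) for every n.
        return sum(math.exp(a) * x**n / math.factorial(n) for n in range(terms))

    print(taylor_exp(0.5, 1.0))  # ~4.4816890, matching the true value below
    print(math.exp(1.5))         # 4.4816890703...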


Now consider two continuous and differentiable functions g(x) and h(x):

g(x) = f(x)  on [0, a]
     = g1(x) on (a, inf)

h(x) = f(x)  on [0, a]
     = h1(x) on (a, inf)

where h1 and g1 are not the same.

Now my question is:

The Taylor series expansion seems to be the same for both g(x+a) and h(x+a), as they depend only on the derivative of f(x) at the point a. Why should they be the same?

I think I am making some fundamental mistake. Can anyone help me out?

Also, what is the difference between continuous and differentiable?


alexbadmin
Charter Member
1863 posts
Jul-20-06, 10:28 PM (EST)
1. "RE: is taylor series always accurate?"
In response to message #0
 
   >Taylor series for f(x+a) is given by
>
>f(x+a) = f(a) + f'(a)x + f''(a)x^2/2! + f'''(a)x^3/3! + ...
>
>Now consider two continuous and differentiable functions g(x)
>and h(x):
>
>g(x) = f(x)  on [0, a]
>     = g1(x) on (a, inf)
>
>h(x) = f(x)  on [0, a]
>     = h1(x) on (a, inf)
>
>where h1 and g1 are not the same.
>
>Now my question is:
>
>The Taylor series expansion seems to be the same for both
>g(x+a) and h(x+a), as they depend only on the derivative of
>f(x) at the point a. Why should they be the same?

First, the Taylor series depends on infinitely many derivatives, not just one. Second, there are two issues (and this is assuming f is infinitely differentiable at a):

1. Does a Taylor series converge?
2. If it does, where?

For one, a Taylor series may have radius of convergence 0. Just think of a function whose derivatives grow sufficiently fast.
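
To make "grow sufficiently fast" concrete (a minimal Python sketch; the coefficients c_n = n! are an assumed example, giving the series sum of n!*x^n, whose radius of convergence is 0 by the root test):

    # Root test: the radius of convergence is 1 / limsup |c_n|^(1/n).
    # With c_n = n!, |c_n|^(1/n) grows without bound, so the radius is 0.
    import math

    for n in (5, 10, 20, 40):
        c_n = math.factorial(n)
        print(n, c_n ** (1.0 / n))  # ~2.6, 4.5, 8.3, 15.8, ... -> infinity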

Two, even when a Taylor series converges it may not converge to the originating function. There is a classic example:

f(x) = exp(-1/x^2) for x ≠ 0, and f(0) = 0.

As you can verify, all the derivatives of f at 0 are 0, so its Maclaurin series is identically 0, while the function itself is obviously not.
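
A numerical illustration of this flat function (a minimal Python sketch; the first difference quotient at 0 stands in for the full derivative calculation):

    # f(x) = exp(-1/x^2) for x != 0, f(0) = 0: nonzero near 0, yet its
    # difference quotients at 0 vanish, consistent with f^(n)(0) = 0.
    import math

    def f(x):
        return math.exp(-1.0 / x**2) if x != 0 else 0.0

    for h in (0.5, 0.2, 0.1, 0.05):
        print(h, f(h), (f(h) - f(0)) / h)  # f(h) > 0, but the quotient -> 0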

A function that is the limit of its Taylor series is called analytic, which is a more demanding property than infinite differentiability.

>Also, what is the difference between continuous and differentiable?

The function f(x) = |x| is continuous, but is not differentiable at 0.
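
To see that failure concretely (a minimal Python sketch comparing the two one-sided difference quotients of |x| at 0):

    # The right quotient tends to +1, the left to -1, so the two-sided
    # limit defining the derivative at 0 does not exist.
    for h in (0.1, 0.01, 0.001):
        right = (abs(0 + h) - abs(0)) / h
        left = (abs(0 - h) - abs(0)) / (-h)
        print(h, right, left)  # always 1.0 and -1.0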




Copyright © 1996-2018 Alexander Bogomolny
