Forum URL: http://www.cut-the-knot.org/cgi-bin/dcforum/forumctk.cgi
Forum Name: College math
Topic ID: 581
Message ID: 0
#0, Is the Taylor series always accurate?
Posted by calculus on Jul-20-06 at 03:17 PM
The Taylor series for f(x+a) is given by

f(x+a) = f(a) + f'(a)x + f''(a)x^2/2! + f'''(a)x^3/3! + ...


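For a quick sanity check of the formula, here is a minimal sketch in Python (the choice of sin and the point a = 0.5 are my own illustrative picks):

import math

def taylor_sin(x, a, n_terms):
    # Taylor expansion of sin about a, evaluated at a + x:
    #   sin(a+x) = sum over k of sin^(k)(a) * x^k / k!
    # The k-th derivative of sin cycles through sin, cos, -sin, -cos.
    derivs = [math.sin(a), math.cos(a), -math.sin(a), -math.cos(a)]
    return sum(derivs[k % 4] * x**k / math.factorial(k)
               for k in range(n_terms))

a, x = 0.5, 0.3
for n in (2, 4, 8):
    print(n, taylor_sin(x, a, n), math.sin(a + x))
# As n grows, the partial sums approach sin(a + x).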
Now consider two continuous and differentiable functions g(x) and h(x):

g(x) = f(x)    for x in (-inf, a]
     = g1(x)   for x in (a, inf)

h(x) = f(x)    for x in (-inf, a]
     = h1(x)   for x in (a, inf)

where g1 and h1 are not the same.
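To make the setup concrete, here is one hedged sketch in Python (the particular f(x) = x^2, the continuations g1 and h1, and a = 1 are my own illustrative choices, not part of the question):

a = 1.0

def f(x):
    return x**2

def g(x):
    # g1(x) = 2x - 1 matches f's value and slope at a
    return f(x) if x <= a else 2*x - 1

def h(x):
    # h1(x) = 3x^2 - 4x + 2 also matches f's value and slope at a
    return f(x) if x <= a else 3*x**2 - 4*x + 2

# Left of a the two functions are identical...
print(g(0.5), h(0.5))   # 0.25 0.25
# ...but to the right of a they differ:
print(g(2.0), h(2.0))   # 3.0 6.0

Here g1 and h1 are chosen to match f's value and first derivative at a, so that g and h stay continuous and differentiable there, as the setup requires.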

Now my question is:

The Taylor series expansions of g(x+a) and h(x+a) seem to be the same, since both depend only on the derivatives of f(x) at the point a. But g and h are different functions beyond a, so why should their expansions be the same?

I think I am making some fundamental mistake. Can anyone help me out?

Also, what is the difference between continuous and differentiable?