Consider a continuous function f on an interval [a,b]. By a classic approximation theorem due to Weierstrass, f can be uniformly approximated on [a,b] to any degree of accuracy by polynomials. On the other hand, given N+1 distinct points x_i in [a,b], it is well known that there exists a unique polynomial P_N of degree at most N that interpolates the function f at these points, i.e., that satisfies P_N(x_i) = f(x_i) for i = 0, ..., N. It therefore seems promising to use these interpolation polynomials to approximate f on [a,b]; in particular, one might expect that as the number of interpolation points x_i in [a,b] increases, the corresponding interpolation polynomials approximate the function f better and better. While this can indeed be the case, it was first observed by Carl Runge that, depending on the function f and the location of the interpolation points in [a,b], the opposite may occur: increasing the number of interpolation points can make the approximation worse. This somewhat surprising result, known as "Runge's phenomenon", shows that interpolation polynomials cannot in general be used to approximate continuous functions.
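The phenomenon is easy to observe numerically with Runge's original example, f(x) = 1/(1 + 25x^2) on [-1,1]. The sketch below (an illustrative setup, not prescribed by the project: the node counts and the naive Lagrange evaluation are free choices) interpolates f at equally spaced nodes and measures the maximum error on a fine grid; the error grows as more nodes are used.

```python
import numpy as np

def runge(x):
    # Runge's example function on [-1, 1]
    return 1.0 / (1.0 + 25.0 * x**2)

def lagrange_eval(nodes, values, x):
    # Naive Lagrange interpolation: evaluate the unique polynomial of
    # degree <= len(nodes)-1 through (nodes[j], values[j]) at the points x.
    result = np.zeros_like(x)
    for j in range(len(nodes)):
        lj = np.ones_like(x)
        for k in range(len(nodes)):
            if k != j:
                lj *= (x - nodes[k]) / (nodes[j] - nodes[k])
        result += values[j] * lj
    return result

xs = np.linspace(-1.0, 1.0, 1001)   # fine grid to estimate the sup-norm error
errors = {}
for n in (5, 10, 20):               # polynomial degrees (n+1 equispaced nodes)
    nodes = np.linspace(-1.0, 1.0, n + 1)
    p = lagrange_eval(nodes, runge(nodes), xs)
    errors[n] = np.max(np.abs(p - runge(xs)))
    print(f"N = {n:2d}: max error = {errors[n]:.3e}")
```

The large error concentrates near the endpoints of the interval, which is the typical signature of Runge's phenomenon.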
The goal of this project is to understand and explore Runge's phenomenon, and to illustrate it with numerical calculations. Moreover, different approaches to circumvent the phenomenon, such as interpolating at the so-called Chebyshev nodes, can also be investigated.
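As a starting point for the second part, one can compare equispaced nodes with Chebyshev nodes for a fixed degree. The sketch below (same illustrative setup as Runge's example, f(x) = 1/(1 + 25x^2) on [-1,1]; the choice of degree and of Chebyshev points of the first kind is an assumption, not prescribed by the project) shows that the Chebyshev interpolant is far more accurate.

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

def lagrange_eval(nodes, values, x):
    # Naive Lagrange interpolation at the points x
    result = np.zeros_like(x)
    for j in range(len(nodes)):
        lj = np.ones_like(x)
        for k in range(len(nodes)):
            if k != j:
                lj *= (x - nodes[k]) / (nodes[j] - nodes[k])
        result += values[j] * lj
    return result

def chebyshev_nodes(n):
    # Chebyshev points of the first kind on [-1, 1]:
    # x_k = cos((2k+1) pi / (2(n+1))), k = 0, ..., n
    k = np.arange(n + 1)
    return np.cos((2 * k + 1) * np.pi / (2 * (n + 1)))

xs = np.linspace(-1.0, 1.0, 1001)
n = 20  # polynomial degree, i.e. n+1 interpolation nodes
equi = np.linspace(-1.0, 1.0, n + 1)
cheb = chebyshev_nodes(n)
err_equi = np.max(np.abs(lagrange_eval(equi, runge(equi), xs) - runge(xs)))
err_cheb = np.max(np.abs(lagrange_eval(cheb, runge(cheb), xs) - runge(xs)))
print(f"N = {n}: equispaced max error = {err_equi:.2e}, "
      f"Chebyshev max error = {err_cheb:.2e}")
```

The Chebyshev nodes cluster near the endpoints of the interval, which is exactly where the equispaced interpolant oscillates most violently.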
Supervisors: Thierry Meyrath
Difficulty level: EML 2