Approximation
Author: dr inż. Robert Szmurło
1 / 25
Approximation
● Interpolation, presented in the previous lectures, will not work well for noisy data.
● For this problem we shall relax the requirements:
  – instead of forcing the approximating function to pass exactly through the given set of input data,
  – we will find a function of a given form which comes as close as possible to the given set (we find the best fit to the data). This problem is called approximation.
2 / 25
Approximation mathematical definition
● Let the points be given as (x_i, y_i), i = 1, …, N.
● The equation defining the curve can be expressed as a weighted sum of arbitrary base functions multiplied by scalar coefficients:
  f(x) = a0·φ0(x) + a1·φ1(x) + … + an·φn(x)
● In the approximation problem we usually try to select a set of base functions (e.g. monomials, trigonometric functions, exponentials, etc.) which allows the best fit of the given data.
● Then we find the set of coefficients a0, a1, …, an which best fits the data.
3 / 25
Typical base functions sets
● Monomials (polynomial approximation)
● Trigonometric functions (for periodic functions)
● Chebyshev polynomials
● Legendre polynomials
[Figure: plot of the Chebyshev polynomials]
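The lecture code is MATLAB; as a quick illustration of these base-function families, NumPy's polynomial module can evaluate the same basis polynomials (a Python sketch, not part of the lecture):

```python
import numpy as np
from numpy.polynomial import chebyshev, legendre

x = np.array([-1.0, 0.0, 0.5, 1.0])

# Chebyshev basis polynomial of degree 2: T2(x) = 2x^2 - 1
T2 = chebyshev.Chebyshev.basis(2)
print(T2(x))

# Legendre basis polynomial of degree 2: P2(x) = (3x^2 - 1)/2
P2 = legendre.Legendre.basis(2)
print(P2(x))
```

Both families are orthogonal on ⟨−1, 1⟩, which is why the later slides rescale the data interval before fitting.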
4 / 25
Legendre polynomials
● Recursive formula: (n+1)·P_{n+1}(x) = (2n+1)·x·P_n(x) − n·P_{n−1}(x), with P_0(x) = 1, P_1(x) = x
● Explicit formula: P_n(x) = Σ_{k=0..n} C(n,k)·C(n+k,k)·((x−1)/2)^k, where C(n,k) is the binomial coefficient
Matlab code

function L4_legendre
    x = -1:0.01:1;
    hold on
    for n = 0:5
        plot(x, legendre(n,x));   % note: this local function shadows MATLAB's built-in legendre
    end
    hold off
end

function y = legendre(n,x)
    % explicit formula: P_n(x) = sum_k C(n,k)*C(n+k,k)*((x-1)/2)^k
    y = zeros(size(x));
    for k = 0:n
        y = y + binomial(n,k)*binomial(n+k,k)*((x-1)/2).^k;
    end
end

function b = binomial(n,k)
    b = factorial(n) / (factorial(n-k) * factorial(k));
end
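The explicit formula used in the MATLAB code above can be cross-checked numerically; a Python/NumPy sketch (the function name `legendre_explicit` is introduced here for illustration) comparing it against NumPy's Legendre basis:

```python
import numpy as np
from math import comb
from numpy.polynomial.legendre import Legendre

def legendre_explicit(n, x):
    """P_n(x) = sum_k C(n,k) * C(n+k,k) * ((x-1)/2)^k  (explicit formula)."""
    x = np.asarray(x, dtype=float)
    y = np.zeros_like(x)
    for k in range(n + 1):
        y += comb(n, k) * comb(n + k, k) * ((x - 1) / 2) ** k
    return y

x = np.linspace(-1, 1, 21)
for n in range(6):
    assert np.allclose(legendre_explicit(n, x), Legendre.basis(n)(x))
print("explicit formula matches NumPy's Legendre basis for n = 0..5")
```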
5 / 25
Finding coefficients - Least squares definition
● Let us define an error of the fit; then we shall try to find the coefficients of the approximating function which minimize it.
● To make the minimization easy we define the sum of the squares of the errors:
  S(a0, …, an) = Σ_{i=1..N} ( y_i − f(x_i) )²
● where N is the number of input data points and n is the number of base functions.
● Now we want to minimize S with respect to the coefficients a0, a1, …, an. To do it we calculate the partial derivative of S with respect to each coefficient and set it equal to zero (finding the extremum of a function by setting its derivative to zero).
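The derivative step above can be written out explicitly; in LaTeX notation (with f(x) = Σ_j a_j·φ_j(x)):

```latex
S(a_0,\dots,a_n) = \sum_{i=1}^{N} \Big( y_i - \sum_{j=0}^{n} a_j\,\varphi_j(x_i) \Big)^{2},
\qquad
\frac{\partial S}{\partial a_k}
  = -2 \sum_{i=1}^{N} \varphi_k(x_i) \Big( y_i - \sum_{j=0}^{n} a_j\,\varphi_j(x_i) \Big) = 0,
\quad k = 0,\dots,n
```

Rearranging gives one linear equation per coefficient — the normal equations used on the following slides:

```latex
\sum_{j=0}^{n} a_j \sum_{i=1}^{N} \varphi_j(x_i)\,\varphi_k(x_i)
  = \sum_{i=1}^{N} y_i\,\varphi_k(x_i), \qquad k = 0,\dots,n
```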
6 / 25
Polynomial approximation
● If we take a set of monomials as base functions, we get
  f(x) = a0 + a1·x + a2·x² + … + an·x^n
● The sum of squared errors is then
  S = Σ_{j=1..N} ( y_j − (a0 + a1·x_j + … + an·x_j^n) )²
● and setting the derivatives ∂S/∂a_r to zero gives, for r = 0, …, n:
  Algebraic form:  Σ_{c=0..n} a_c · Σ_{j=1..N} x_j^(r+c) = Σ_{j=1..N} y_j·x_j^r
  Matrix form:  A·a = b, where A(r,c) = Σ_j x_j^(r+c) and b(r) = Σ_j y_j·x_j^r
7 / 25
Polynomial approximation example
● Fit a second-order polynomial to the data
  x = [−1 2 3 6],  y = [1 2 3 5]
● Solution: the normal equations A·a = b are
  [  4   10    50 ] [a0]   [  11 ]
  [ 10   50   250 ] [a1] = [  42 ]
  [ 50  250  1394 ] [a2]   [ 216 ]
● Answer: a ≈ [1.3000, 0.3717, 0.0417], i.e. f(x) ≈ 1.3000 + 0.3717·x + 0.0417·x²
8 / 25
Polynomial approximation
MATLAB code

close all
x = [-1 2 3 6];
y = [1 2 3 5];
n = 3;                % number of coefficients (2nd-order polynomial)
A = [];
b = [];
for r = 1:n
    for c = 1:n
        s = 0;
        for j = 1:length(x)
            s = s + x(j)^(r+c-2);
        end
        A(r,c) = s;
    end
    s = 0;
    for j = 1:length(x)
        s = s + y(j)*x(j)^(r-1);
    end
    b(r) = s;
end
a = A\b';             % b was built as a row vector, so it must be transposed
xd = linspace(min(x), max(x), 100);
yd = [];
for j = 1:length(xd)
    s = 0;
    for i = 1:n
        s = s + a(i) * xd(j)^(i-1);
    end
    yd(j) = s;
end
plot(x,y,'o', xd, yd);
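The result can be cross-checked outside MATLAB; a Python/NumPy sketch that builds the same normal equations and compares them with NumPy's built-in polynomial fit:

```python
import numpy as np

x = np.array([-1.0, 2.0, 3.0, 6.0])
y = np.array([1.0, 2.0, 3.0, 5.0])
n = 3  # number of coefficients (second-order polynomial)

# Normal equations, exactly as in the MATLAB code above (0-based powers here).
A = np.array([[np.sum(x**(r + c)) for c in range(n)] for r in range(n)])
b = np.array([np.sum(y * x**r) for r in range(n)])
a = np.linalg.solve(A, b)            # coefficients a0, a1, a2 (lowest power first)

# np.polyfit solves the same least-squares problem (highest power first).
assert np.allclose(a, np.polyfit(x, y, 2)[::-1])
print(a)
```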
9 / 25
Polynomial approximation
MATLAB code – slightly improved

close all
x = [-1 2 3 6];
y = [1 2 3 5];
n = 3;
A = [];
b = [];
for r = 1:n
    for c = 1:n
        A(r,c) = sum(x.^(r+c-2));
    end
    b(r) = sum(y.*x.^(r-1));
end
a = A\b';             % transpose: b is a row vector
xd = linspace(min(x), max(x), 100);
yd = zeros(size(xd));
for i = 1:n
    yd = yd + a(i) * xd.^(i-1);
end
plot(x,y,'o', xd, yd);
10 / 25
We can use approximation to calculate derivatives and integrals of data with ERRORS
● The primary approach for determining derivatives of imprecise data is to use least-squares regression (approximation) to fit a smooth, differentiable function to the data.
● A common choice is the polynomial least-squares approximation.
● For example, for n = 2 the polynomial is
  f(x) = a0 + a1·x + a2·x²
● and the derivative is
  f'(x) = a1 + 2·a2·x
11 / 25
Derivative of least-squares approximation for a 1st-order polynomial
● The system of equations for 1st-order polynomial linear regression is
  [  N     Σx_i  ] [a0]   [ Σy_i    ]
  [ Σx_i   Σx_i² ] [a1] = [ Σx_iy_i ]
● Solving for a0 and a1 we get:
  a1 = ( N·Σx_iy_i − Σx_i·Σy_i ) / ( N·Σx_i² − (Σx_i)² ),  a0 = ( Σy_i − a1·Σx_i ) / N
● The average derivative over the given data set is then f'(x) = a1.
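The closed-form slope can be verified numerically; a Python sketch (the x/y data below are made up for illustration, roughly y = 2x + 1 with noise):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

N = len(x)
# Closed-form least-squares slope and intercept for f(x) = a0 + a1*x.
a1 = (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / (N * np.sum(x**2) - np.sum(x)**2)
a0 = (np.sum(y) - a1 * np.sum(x)) / N

# Must agree with the generic order-1 polynomial fit.
c1, c0 = np.polyfit(x, y, 1)
assert np.isclose(a1, c1) and np.isclose(a0, c0)
print(a1)   # the average derivative dy/dx over the data set
```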
12 / 25
Derivatives for data with ERRORS
Example

function L04_derivative_approximation
    x = [1 1.5 2 2.5 3 3.5 4 4.5 5]
    y_exact = x.^3
    dy = rand(1,9)*4          % artificially introduce an error
    y = y_exact + dy
    plot(x, y_exact, x, y);
    x_test = [1.6 2.6 3 3.4 4.4]
    a = approx(x,y,1);
    dy_approx_1 = a(2)
    a = approx(x,y,2);
    dy_approx_2 = 2*a(3) .* x_test + a(2)
    a = approx(x,y,3);
    dy_approx_3 = 3*a(4) .* x_test.^2 + 2*a(3).*x_test + a(2)
    dy_exact = 3.*x_test.^2
end

function a = approx(xd, yd, order)
    N = order+1;
    n = length(xd);
    A = zeros(N, N);
    b = zeros(N, 1);
    for i = 1:N
        for j = 1:N
            for k = 1:n
                A(i,j) = A(i,j) + xd(k)^(i+j-2);
            end
        end
        for k = 1:n
            b(i) = b(i) + xd(k)^(i-1) * yd(k);
        end
    end
    a = A\b;
end

Program output:

x =
    1.0000 1.5000 2.0000 2.5000 3.0000 3.5000 4.0000 4.5000 5.0000
y_exact =
    1.0000 3.3750 8.0000 15.6250 27.00 42.8750 64.00 91.1250 125.00
dy =
    2.49 2.3482 0.8310 1.2050 1.8837 0.9220 3.3772 0.7791 0.9037
y =
    3.49 5.7232 8.8310 16.8300 28.8837 43.797 67.3772 91.9041 125.903
x_test =
    1.6000 2.6000 3.0000 3.4000 4.4000
dy_approx_1 =
    29.7419
dy_approx_2 =
    4.4278 22.5093 29.7419 36.9745 55.0560
dy_approx_3 =
    6.5335 20.7342 27.6218 35.1994 57.1617
dy_exact =
    7.6800 20.2800 27.0000 34.6800 58.0800
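On noise-free data the fitted derivative must be exact, which gives a simple sanity check of the idea; a Python sketch on the same grid (np.polyfit stands in for the approx helper above):

```python
import numpy as np

x = np.arange(1.0, 5.5, 0.5)                 # same grid as the example
y = x**3                                     # exact data, no artificial error
x_test = np.array([1.6, 2.6, 3.0, 3.4, 4.4])

# Fit a 3rd-order polynomial; np.polyfit returns coefficients highest power first.
c = np.polyfit(x, y, 3)
dy_approx = 3*c[0]*x_test**2 + 2*c[1]*x_test + c[2]

# With exact cubic data the fitted derivative equals 3x^2.
assert np.allclose(dy_approx, 3*x_test**2)
print(dy_approx)
```

With noisy data (as in the slide), the lower-order fits smooth the noise but bias the derivative, which is exactly the trade-off visible in dy_approx_1 … dy_approx_3 above.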
13 / 25
General base approximation
● If we take a set of arbitrary base functions φ0, φ1, …, φn, we get
  f(x) = a0·φ0(x) + a1·φ1(x) + … + an·φn(x)
● The sum of squared errors is then
  S = Σ_{j=1..N} ( y_j − Σ_{i=0..n} a_i·φ_i(x_j) )²
● and setting the derivatives to zero gives, in algebraic form, for r = 0, …, n:
  Σ_{c=0..n} a_c · Σ_{j=1..N} φ_r(x_j)·φ_c(x_j) = Σ_{j=1..N} y_j·φ_r(x_j)
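The general-basis normal equations can be implemented compactly with a matrix of basis values; a Python sketch (the helper name `fit_basis` is introduced here for illustration):

```python
import numpy as np

def fit_basis(x, y, funcs):
    """Least-squares coefficients a for f(x) = sum_i a_i * funcs[i](x),
    via the normal equations A a = b with
    A[r,c] = sum_j funcs[r](x_j)*funcs[c](x_j), b[r] = sum_j y_j*funcs[r](x_j)."""
    G = np.column_stack([f(x) for f in funcs])   # N x n matrix of basis values
    A = G.T @ G
    b = G.T @ y
    return np.linalg.solve(A, b)

x = np.array([-1.0, 2.0, 3.0, 6.0])
y = np.array([1.0, 2.0, 3.0, 5.0])

# With a monomial basis this reduces to the polynomial approximation above.
monomials = [lambda t: np.ones_like(t), lambda t: t, lambda t: t**2]
a = fit_basis(x, y, monomials)
assert np.allclose(a, np.polyfit(x, y, 2)[::-1])
print(a)
```

Swapping the list of functions for Legendre polynomials gives exactly the computation of the next slides.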
14 / 25
Legendre polynomials approximation - example
● Let us take the first 3 base functions of the Legendre polynomial sequence.
● We must scale the original interval to ⟨−1, 1⟩.
● Solution
15 / 25
Legendre polynomials approximation – example cont.
● The final form of the polynomial is (with s the scaled coordinate):
  f(s) = 0.5691 + 1.6823·s − 0.9199·(3/2·s² − 1/2)

function L4_legendre_example_calc_naive
    close all;
    xp = [-3 2 5 6]
    scale = 2/(max(xp) - min(xp))
    offset = -1 - scale * min(xp)
    xps = scale * xp + offset
    yp = [-2 1 2 1]
    xd = linspace(min(xp)-0.5, max(xp)+0.5, 100);
    xds = scale * xd + offset;
    yd = 0.5691 + 1.6823*xds - 0.9199*(3/2*xds.^2 - 1/2);
    plot(xp, yp, 'o', xd, yd)
end
16 / 25
Legendre polynomials approximation – MATLAB code
● Let us define by hand the six first Legendre polynomials p0, p1, …, p5 as MATLAB functions.
● Then we plot the first three polynomials in the range −1 to 1.

function L4_legendre_approx_calculation_1
    close all
    x = -1:0.01:1;
    hold on
    plot(x, p0(x));
    plot(x, p1(x));
    plot(x, p2(x));
    hold off
end
function y = p0(x)
    y = ones(size(x));   % return a vector matching x, so that plot(x, p0(x)) works
end
function y = p1(x)
    y = x;
end
function y = p2(x)
    y = 3/2*x.^2 - 1/2;
end
function y = p3(x)
    y = 1/2*(5*x.^3 - 3*x);
end
function y = p4(x)
    y = 1/8*(35*x.^4 - 30*x.^2 + 3);
end
function y = p5(x)
    y = 1/8*(63*x.^5 - 70*x.^3 + 15*x);   % P5(x); the degree-6 expression here was P6
end
17 / 25
Legendre polynomials approximation – MATLAB code

● Caution! New MATLAB syntax: {@p0, @p1}
● This is a cell array: similar to an array of numbers, but this time it holds not numbers but references (handles) to functions.
● It is then convenient to refer to them using a number as an index.
● Inside the helper functions the handles from the array act as normal functions.

function L4_legendre_approx_calculation_2
    close all
    xp = [-3 2 5 6]
    scale = 2/(max(xp) - min(xp))
    offset = -1 - scale * min(xp)
    xps = scale * xp + offset
    yp = [-2 1 2 1]
    N = length(xps);
    A = zeros(3);
    b = zeros(3,1);
    p = {@p0, @p1, @p2, @p3, @p4, @p5}
    for r = 1:3
        for c = 1:3
            A(r,c) = sumuj(xps, p{r}, p{c});
        end
        b(r) = sumujb(xps, yp, p{r});
    end
    a = A\b
    xd = linspace(min(xp), max(xp), 50);
    yd = legendre(a, p, xd*scale + offset);
    figure
    plot(xp, yp, 'o', xd, yd)
end
function y = legendre(a, p, xd)
    y = zeros(size(xd));
    for i = 1:length(a)
        y = y + a(i) * p{i}(xd);
    end
end
function s = sumuj(xp, pa, pb)     % sums pa(x_j)*pb(x_j) over the data points
    s = 0;
    for j = 1:length(xp)
        s = s + pa(xp(j)) * pb(xp(j));
    end
end
function s = sumujb(xp, yp, pa)    % sums y_j*pa(x_j) over the data points
    s = 0;
    for j = 1:length(xp)
        s = s + yp(j) * pa(xp(j));
    end
end
18 / 25
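NumPy ships the same rescale-to-⟨−1,1⟩ Legendre fit; a Python sketch on the slide's data, cross-checked against an ordinary quadratic fit (both span the same space of 2nd-order polynomials, so the fitted curves must coincide):

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

xp = np.array([-3.0, 2.0, 5.0, 6.0])
yp = np.array([-2.0, 1.0, 2.0, 1.0])

# Legendre.fit maps [min(xp), max(xp)] onto [-1, 1] internally,
# just like the scale/offset in the MATLAB code above.
fit = Legendre.fit(xp, yp, deg=2)

quad = np.poly1d(np.polyfit(xp, yp, 2))
xs = np.linspace(-3, 6, 10)
assert np.allclose(fit(xs), quad(xs))
print(fit.coef)   # Legendre-basis coefficients in the scaled variable
```

The printed coefficients reproduce the values 0.5691, 1.6823, −0.9199 used on the earlier slide.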
Approximation – nonlinear relations that can be reduced to
linear form
● We can often transform a nonlinear relation to linear form by transforming the coordinate system.
● Consider, for example,
  y = a·e^(−b·x)
● To transform this equation to linear form we take the natural logarithm of both sides:
  ln y = ln a − b·x
● If we substitute ŷ = ln y, c0 = ln a, c1 = −b,
● we get
  ŷ = c0 + c1·x
19 / 25
Reducing nonlinear relations - example
● Find a curve of the form y = a·e^(−b·x) using the given data.
● Solution: by defining ŷ = ln y, c0 = ln a, c1 = −b,
  we get a simple 'linear regression' (polynomial approximation for n = 1),
  and finally a = e^(c0), b = −c1.
22 / 25
Reducing nonlinear relations – MATLAB Code
xi = [0 2 4 6];
yi = [5 3.7 2.7 2];
yb = log(yi)           % linearize: ln(y) = ln(a) - b*x
n = 2;
A = [];
b = [];
for r = 1:n
    for c = 1:n
        A(r,c) = sum(xi.^(r+c-2));
    end
    b(r) = sum(xi.^(r-1) .* yb);
end
c = A\b'
a = exp(c(1))
b = -c(2);             % note: b is reused here as the exponent coefficient
xd = linspace(min(xi)-0.15, max(xi)+0.15, 100);
yd = [];
for i = 1:length(xd)
    yd(i) = a * exp(-b*xd(i));
end
plot(xi,yi,'o', xd, yd);
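The same log-linearization in Python, as a cross-check of the MATLAB code above (np.polyfit performs the order-1 least-squares fit):

```python
import numpy as np

xi = np.array([0.0, 2.0, 4.0, 6.0])
yi = np.array([5.0, 3.7, 2.7, 2.0])

# Linearize: ln(y) = ln(a) - b*x, then fit a straight line to (x, ln y).
c1, c0 = np.polyfit(xi, np.log(yi), 1)
a = np.exp(c0)
b = -c1
print(a, b)            # roughly a ≈ 5.006, b ≈ 0.153

yd = a * np.exp(-b * xi)   # fitted curve evaluated at the data points
```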
23 / 25
Approximation – nonlinear relations that can be reduced to
linear form
24 / 25
Thank you
25 / 25