Estimating the maxima and minima of a noisy curve turns out to be surprisingly hard and is, to a large extent, still an open question. In this blog post we explore some strategies to tackle the problem. Another interpretation arises in the context of optimization: consider an optimization problem where the (smooth) objective function is only evaluated at a few, possibly random and noisy, points, or can only be evaluated sparsely due to computational constraints. The algorithm described here then gives a fast estimate of the optimal values.
1. Estimation based on Splines
We assume that a smooth, at least twice differentiable curve f is observed at independent, randomly distributed points X_1, \dots, X_n, contaminated with additive noise. Our model is then given by

(1)   \[ Y_k = f(X_k) + \varepsilon_k, \qquad k = 1, \dots, n, \]

where

\[ \mathbf{E}\left[\varepsilon_{k} \mid X\right] = 0 \]

and the errors have finite conditional variance,

(2)   \[ \mathrm{Var}\left(\varepsilon_{k} \mid X\right) = \sigma^{2} < \infty. \]
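To make this concrete, the following sketch simulates data from model (1); the test curve, the sample size and the noise level are arbitrary choices made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def f(x):
    # smooth test curve with several local extrema (an arbitrary choice)
    return np.sin(3 * np.pi * x) + 0.5 * x

n = 200
X = np.sort(rng.uniform(0.0, 1.0, size=n))   # random design points
eps = rng.normal(0.0, 0.3, size=n)           # noise with E[eps | X] = 0
Y = f(X) + eps                               # observations according to (1)
```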
1.1 Natural cubic splines
Consider the following minimization problem: among all functions g with two continuous derivatives, find the one that minimizes

(3)   \[ \sum_{k=1}^{n} \left( Y_k - g(X_k) \right)^{2} + \lambda \int \left( g''(t) \right)^{2} \, dt, \]

where \lambda \ge 0 is a fixed smoothing parameter. It can be shown [1] that (3) has a unique minimizer, which is a particular natural cubic spline. More generally, a natural cubic spline with knots \zeta_1 < \dots < \zeta_K is a piecewise polynomial of order 4 (degree 3) with the additional constraint that the function is linear beyond the boundary knots; it can be represented with basis functions N_1, \dots, N_K. When the knots are placed at the distinct observation points, the natural cubic spline solves (3), and in that case an optimal solution of (3) can be written as

\[ f(t) = \sum_{j=1}^{K} N_{j}(t)\, \theta_{j}. \]
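In practice one does not have to solve (3) by hand. Recent SciPy versions, for instance, ship scipy.interpolate.make_smoothing_spline, which returns a cubic smoothing spline minimizing a criterion of the form (3) for a given penalty lam, or chooses lam by generalized cross-validation when it is omitted. A minimal sketch, assuming data simulated as above:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

rng = np.random.default_rng(42)
X = np.sort(rng.uniform(0.0, 1.0, 200))
Y = np.sin(3 * np.pi * X) + 0.5 * X + rng.normal(0.0, 0.3, 200)

# lam corresponds to the smoothing parameter lambda in (3);
# leaving it out lets SciPy pick it by generalized cross-validation
spl = make_smoothing_spline(X, Y)

t = np.linspace(X.min(), X.max(), 500)
f_hat = spl(t)   # the fitted smooth curve, evaluable anywhere on the design range
```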
Let \mathbf{N} be the n \times K matrix with entries \{\mathbf{N}\}_{ij} = N_j(X_i) and \mathbf{\Omega} the penalty matrix with entries \{\mathbf{\Omega}\}_{ij} = \int N_i''(t) N_j''(t)\, dt; the solution to the minimization problem using this basis is thus given by

(4)   \[ \hat{\theta} = \left( \mathbf{N}^{\mathsf{T}} \mathbf{N} + \lambda \mathbf{\Omega} \right)^{-1} \mathbf{N}^{\mathsf{T}} Y. \]
A popular basis choice is derived from the truncated power series and given by

(5)   \[ N_1(t) = 1, \qquad N_2(t) = t, \qquad N_{k+2}(t) = d_k(t) - d_{K-1}(t), \]

where

(6)   \[ d_k(t) = \frac{(t - \zeta_k)_+^{3} - (t - \zeta_K)_+^{3}}{\zeta_K - \zeta_k}. \]
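The basis (5)-(6) and the estimator (4) translate almost directly into code. The sketch below builds the basis matrix at the observations, approximates the penalty matrix \Omega by evaluating the second derivatives of the basis on a fine grid (a numerical shortcut used only for illustration; exact formulas exist), and then solves the linear system in (4).

```python
import numpy as np

def natural_spline_basis(t, knots):
    """Basis (5)-(6): N_1(t) = 1, N_2(t) = t, N_{k+2}(t) = d_k(t) - d_{K-1}(t)."""
    t = np.asarray(t, dtype=float)
    K = len(knots)

    def d(k):  # d_k(t) from (6), 0-based index
        return (np.clip(t - knots[k], 0.0, None) ** 3
                - np.clip(t - knots[K - 1], 0.0, None) ** 3) / (knots[K - 1] - knots[k])

    cols = [np.ones_like(t), t]
    d_last = d(K - 2)                        # d_{K-1} in the 1-based notation of (5)
    for k in range(K - 2):
        cols.append(d(k) - d_last)
    return np.column_stack(cols)             # shape (len(t), K)

def natural_spline_basis_dd(t, knots):
    """Second derivatives of the same basis, needed for the penalty matrix."""
    t = np.asarray(t, dtype=float)
    K = len(knots)

    def dd(k):
        return 6.0 * (np.clip(t - knots[k], 0.0, None)
                      - np.clip(t - knots[K - 1], 0.0, None)) / (knots[K - 1] - knots[k])

    cols = [np.zeros_like(t), np.zeros_like(t)]
    dd_last = dd(K - 2)
    for k in range(K - 2):
        cols.append(dd(k) - dd_last)
    return np.column_stack(cols)

def fit_natural_spline(X, Y, knots, lam):
    """Penalized least squares estimate (4): (N'N + lam * Omega)^{-1} N'Y."""
    N = natural_spline_basis(X, knots)
    grid = np.linspace(knots[0], knots[-1], 2000)    # grid approximation of the integral
    B = natural_spline_basis_dd(grid, knots)
    Omega = (B.T @ B) * (grid[1] - grid[0])
    return np.linalg.solve(N.T @ N + lam * Omega, N.T @ Y)

# knots at the distinct observed design points, as in the text:
# theta_hat = fit_natural_spline(X, Y, np.unique(X), lam=1e-4)
```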
We will use this basis as illustration for our method from here on. An important insight is that on a fixed interval limited by the knots, t \in [\zeta_k, \zeta_{k+1}], the spline reduces to a single cubic polynomial,

(7)   \[ f(t) = \beta_{0} + \beta_{1} t + \beta_{2} t^{2} + \beta_{3} t^{3}, \]

where the coefficients \beta_0, \dots, \beta_3 depend on the interval, with derivative

(8)   \[ f'(t) = \beta_{1} + 2 \beta_{2} t + 3 \beta_{3} t^{2}. \]
For \beta_3 \neq 0, the possible real extrema a_k of f on this interval are thus given by

(9)   \[ a_k = \frac{-\beta_2 \pm \sqrt{\beta_2^{2} - 3 \beta_1 \beta_3}}{3 \beta_3}, \]

provided the discriminant is non-negative; such a root is then an extremum of f if

\[ a_k \in [\zeta_k, \zeta_{k+1}]. \]
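Finding the candidate extrema on a knot interval is then nothing more than the quadratic formula (9) followed by the interval check. A minimal sketch, with the coefficient names taken from (7):

```python
import numpy as np

def cubic_extrema(beta1, beta2, beta3, left, right):
    """Candidate extrema of f(t) = beta0 + beta1*t + beta2*t^2 + beta3*t^3
    on the knot interval [left, right], via the roots of f'(t) in (8)-(9)."""
    if np.isclose(beta3, 0.0):
        if np.isclose(beta2, 0.0):
            return np.array([])                  # f is (at most) linear: no extremum
        roots = np.array([-beta1 / (2.0 * beta2)])
    else:
        disc = beta2 ** 2 - 3.0 * beta1 * beta3
        if disc < 0.0:
            return np.array([])                  # no real stationary point
        roots = (-beta2 + np.array([-1.0, 1.0]) * np.sqrt(disc)) / (3.0 * beta3)
    # (9) only yields an extremum of f if the root lies inside the interval
    return roots[(roots >= left) & (roots <= right)]

# example: f(t) = t^3 - t on [-1, 1] has extrema at t = +/- 1/sqrt(3)
print(cubic_extrema(beta1=-1.0, beta2=0.0, beta3=1.0, left=-1.0, right=1.0))
```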
1.2 Estimation and Asymptotics
To estimate the extrema we replace the spline coefficients by their estimated counterparts \hat{\theta}, as specified in (4), and derive \hat{\beta}_1, \hat{\beta}_2, \hat{\beta}_3 and \hat{a}_k according to (8) and (9). First verify that, for fixed t,

(10)

and

(11)

therefore

(12)

(13)

(14)

where the local cubic coefficients are linear functions of \hat{\theta}.
Using a Taylor expansion and some calculus we can then derive

(15)

The first-order Taylor expansion of the variance, i.e. the delta method applied to the mapping (\hat{\beta}_1, \hat{\beta}_2, \hat{\beta}_3) \mapsto \hat{a}_k, is given by

(16)

where

(17)

(18)

(19)

(20)
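A purely numerical version of the same first-order argument is often convenient in practice: propagate an estimated covariance matrix of the local cubic coefficients through the mapping (9) using a finite-difference gradient. The covariance input cov_beta below is assumed to be available from the spline fit (4); everything else is a generic delta-method sketch, not the exact expressions (16)-(20).

```python
import numpy as np

def extremum_location(beta):
    """One root of (9) as a function of beta = (beta1, beta2, beta3);
    the second root is handled analogously."""
    b1, b2, b3 = beta
    return (-b2 + np.sqrt(b2 ** 2 - 3.0 * b1 * b3)) / (3.0 * b3)

def delta_method_variance(beta_hat, cov_beta, h=1e-6):
    """First-order Taylor (delta-method) approximation of Var(a_hat)."""
    grad = np.zeros(3)
    for i in range(3):
        step = np.zeros(3)
        step[i] = h
        grad[i] = (extremum_location(beta_hat + step)
                   - extremum_location(beta_hat - step)) / (2.0 * h)
    return grad @ cov_beta @ grad

# toy example with a hypothetical coefficient covariance matrix
beta_hat = np.array([-1.0, 0.0, 1.0])
cov_beta = 1e-4 * np.eye(3)
print(delta_method_variance(beta_hat, cov_beta))
```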
2. White noise representation
The white noise version of (1), as described in [3], is

(21)   \[ \mathrm{d}Y(t) = f(t)\,\mathrm{d}t + \sigma n^{-1/2}\,\mathrm{d}W(t), \qquad t \in [0, 1], \]

where the noise term W is a standard Wiener process, so that the increments \mathrm{d}W(t) have mean zero and variance \mathrm{d}t.
In the white noise case, the equivalent of the basis (5) is defined with respect to a knot density; then

(22)

The continuous analogue to (8) is then given by

(23)

(24)

(25)

Accordingly, \hat{a} is an extremum of \hat{f} if and only if

(26)

so an extremum is observed whenever (26) admits a real solution inside the corresponding knot interval.
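For intuition, the white noise model can be simulated on a fine grid by replacing dW(t) with independent Gaussian increments of variance dt. The snippet below is a generic discretization of the representation in (21), reusing the arbitrary test curve from the earlier simulation; it is not code from [3].

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200                        # plays the role of the sample size in (21)
m = 5000                       # resolution of the time grid
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]
sigma = 0.3

f = np.sin(3 * np.pi * t) + 0.5 * t                 # same test curve as before
dW = rng.normal(0.0, np.sqrt(dt), size=m)           # Brownian increments
dY = f * dt + sigma * n ** -0.5 * dW                # discretized version of (21)
Y = np.cumsum(dY)                                   # observed path Y(t)
```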
References
[1] Peter Craven and Grace Wahba. Smoothing noisy data with spline functions. Numerische Mathematik, 31:377-403, 1978.
[2] Christian H. Reinsch. Smoothing by spline functions. Numerische Mathematik, 10:177-183, 1967.
[3] Peter Hall and J. D. Opsomer. Theory for penalised spline regression. Biometrika, 92(1):105-118, 2005. http://www.jstor.org/stable/20441169