Volume 28, pp. 1-15, 2007-2008.

Separable least squares, variable projection, and the Gauss-Newton algorithm

M. R. Osborne

Abstract

A regression problem is separable if the model can be represented as a linear combination of functions which have a nonlinear parametric dependence. The Gauss-Newton algorithm is a method for minimizing the residual sum of squares in such problems. It is known to be effective both when residuals are small and when measurement errors are additive and the data set is large. Here a sketch is given of the large data set result, which shows that the iteration asymptotes to a second order convergence rate as the data set size becomes unbounded. Variable projection is a technique introduced by Golub and Pereyra for reducing the separable estimation problem to one of minimizing a sum of squares in the nonlinear parameters only. The application of Gauss-Newton to minimize this sum of squares (the RGN algorithm) is known to be effective in small residual problems. The main result presented is that the RGN algorithm shares the good convergence rate behaviour of the Gauss-Newton algorithm on large data sets, even though the errors are no longer additive. A modification of the RGN algorithm due to Kaufman, which aims to reduce its computational cost, is shown to produce iterates that are almost identical to those of the Gauss-Newton algorithm applied to the original problem. Aspects of the question of which algorithm is preferable are discussed briefly, and an example is used to illustrate the importance of the large data set behaviour.
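As a purely illustrative aside, the sketch below shows the kind of separable problem the abstract describes, solved by variable projection: the linear coefficients are eliminated for each trial value of the nonlinear parameters, and the reduced sum of squares is then minimized over the nonlinear parameters alone. The model (two decaying exponentials), the data, and the use of SciPy's Levenberg-Marquardt solver in place of the RGN iteration are assumptions made for this example, not details taken from the paper.

    # Minimal variable projection sketch (hypothetical example, not from the paper).
    import numpy as np
    from scipy.optimize import least_squares

    def phi(t, beta):
        """Design matrix whose columns depend nonlinearly on beta."""
        return np.column_stack([np.exp(-beta[0] * t), np.exp(-beta[1] * t)])

    def reduced_residual(beta, t, y):
        """Residual of the reduced (variable projection) problem: the linear
        coefficients alpha are eliminated by linear least squares for each beta."""
        A = phi(t, beta)
        alpha, *_ = np.linalg.lstsq(A, y, rcond=None)
        return y - A @ alpha

    # Synthetic data with additive noise (illustrative values only).
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 4.0, 200)
    y = 2.0 * np.exp(-t) + 0.5 * np.exp(-3.0 * t) + 0.01 * rng.standard_normal(t.size)

    # Minimize the reduced sum of squares over the nonlinear parameters only.
    fit = least_squares(reduced_residual, x0=[0.5, 2.0], args=(t, y), method="lm")
    beta_hat = fit.x
    alpha_hat, *_ = np.linalg.lstsq(phi(t, beta_hat), y, rcond=None)
    print("beta:", beta_hat, "alpha:", alpha_hat)

The same data could instead be fitted by applying Gauss-Newton to the full problem in both alpha and beta; the paper compares the convergence behaviour of these two approaches on large data sets.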

Key words

nonlinear least squares, scoring, Newton's method, expected Hessian, Kaufman's modification, rate of convergence, random errors, law of large numbers, consistency, large data sets, maximum likelihood

AMS subject classifications
