7 Optimization
Each of the methods discussed in the previous chapters required, in numerous examples, the numerical optimization of a nonlinear function. For example, the maximum likelihood estimator, the most efficient of these methods, requires the likelihood function (2.7) to be maximized (equivalently, its negative logarithm to be minimized). In general there is no closed-form solution to this problem: unlike linear regression, a general nonlinear regression model admits no explicit formula for the parameter estimates under any of the methods discussed. This creates a vital need for numerical optimization techniques. The main difficulties that arise in optimization are the convergence of the algorithms and their sensitivity to initial values. In robust statistics, the effect of outliers on rounding errors is a further issue: outliers have a large effect on numerical computation, including the evaluation of the objective function, its gradient, and its Hessian matrix.
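As a small illustration (not taken from the book), the following Python sketch uses SciPy's `least_squares` to fit a hypothetical exponential model y = a·exp(bx) by numerical minimization; the model, the synthetic data, and both starting values are assumptions chosen for this example. A reasonable starting value recovers the parameters, while a poorly chosen one can converge slowly or stop at an inferior point, which is the sensitivity to initial values described above.

```python
import numpy as np
from scipy.optimize import least_squares

# Noiseless synthetic data from the hypothetical model y = a * exp(b * x)
# with true parameters a = 2.0, b = 0.8 (assumptions for this illustration).
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.8 * x)

def residuals(theta, x, y):
    """Residuals of the nonlinear model; least_squares minimizes their squared sum."""
    a, b = theta
    return a * np.exp(b * x) - y

# A reasonable starting value: the iteration converges to the true parameters.
fit_good = least_squares(residuals, x0=[1.0, 1.0], args=(x, y))
print("good start:", fit_good.x)   # close to (2.0, 0.8)

# A poor starting value: the iteration may converge slowly or stall at an
# inferior point; inspecting .cost and .status is worthwhile in practice.
fit_poor = least_squares(residuals, x0=[10.0, -5.0], args=(x, y))
print("poor start:", fit_poor.x, "cost:", fit_poor.cost)
```

The comparison of the two fits makes the textbook point concrete: the objective surface of a nonlinear model can be nearly flat far from the solution, so the iterative method's behavior depends on where it starts.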
This chapter discusses the optimization methods applied to the robust nonlinear regression estimators used in the book, as well as situations that involve outliers and their rounding-error effects. We have observed that outliers not only affect the iteration procedure itself, but also the initial values, and this influences the overall convergence of the iteration. Accordingly, we need iterative optimization methods that are robust against outlier rounding-error effects, are less sensitive to initial values, and that have been developed for robust ...
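To suggest why a robust objective changes the picture, here is a hedged Python sketch (again an illustration, not the book's own algorithm) contrasting ordinary nonlinear least squares with a Huber-loss fit on data containing one gross outlier; the data, the outlier magnitude, and the `f_scale` tuning value are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical exponential model y = a * exp(b * x) with true parameters
# (2.0, 0.8); the data and the single gross outlier are assumptions.
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.8 * x)
y[10] += 50.0                      # one gross outlier

def residuals(theta, x, y):
    a, b = theta
    return a * np.exp(b * x) - y

true_theta = np.array([2.0, 0.8])

# Ordinary nonlinear least squares: the outlier enters the objective,
# gradient, and Hessian at full weight and drags the estimate away.
fit_ls = least_squares(residuals, x0=[1.0, 1.0], args=(x, y))

# Huber loss bounds the influence of the large residual during iteration.
fit_huber = least_squares(residuals, x0=[1.0, 1.0], args=(x, y),
                          loss="huber", f_scale=1.0)

print("LS estimate:   ", fit_ls.x)
print("Huber estimate:", fit_huber.x)
```

With the bounded loss, the outlier's contribution to each iteration is capped, so the Huber estimate lands much nearer the true parameters than the least-squares estimate does; this is the kind of robustness in the iterative method that the chapter develops.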