Authors: Bissantz, Nicolai; Dümbgen, Lutz; Munk, Axel; Stratmann, Bernd
Date issued: 2009-01-13
URI: http://hdl.handle.net/2003/25991
DOI: http://dx.doi.org/10.17877/DE290R-14171

Abstract: The computation of robust regression estimates often relies on the minimization of a convex functional on a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for iteratively computing the minimizers of a large class of convex functionals. Our approach is based on a quadratic approximation of the functional to be minimized and includes the iteratively reweighted least squares algorithm as a special case. We prove convergence on convex function spaces for general coercive and convex functionals F, and we derive geometric convergence in certain unconstrained settings. The algorithm is applied to TV-penalized quantile regression and compared with a step-size corrected Newton-Raphson algorithm. We find that the iteratively reweighted least squares algorithm typically performs significantly better in the first iterations, whereas the Newton-type method outpaces it only after many iterations. Finally, in the setting of bivariate regression with unimodality constraints, we illustrate how this algorithm allows one to utilize highly efficient algorithms for special quadratic programs in more complex settings.

Language: en
Keywords: Convex approximation; Fermat’s problem; L1 regression; Monotone regression; Nonparametric regression; Pool adjacent violators algorithm; Quadratic approximation; Quantile regression; Regression analysis; Reweighted least squares; Shape constraints; Total variation semi-norm
Subject classification (DDC): 004
Title: Convergence analysis of generalized iteratively reweighted least squares algorithms on convex function spaces
Type: Text
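
As a rough illustration of the iteration described in the abstract, the following Python sketch replaces the quantile check loss by a quadratic approximation at the current residuals and solves a weighted least-squares problem in each step. It covers only unpenalized quantile regression; the function name, the stopping rule, and the constant eps are illustrative assumptions, and the TV penalty and shape constraints studied in the paper are omitted.

import numpy as np

def irls_quantile(X, y, tau=0.5, eps=1e-8, max_iter=200, tol=1e-10):
    """Illustrative IRLS sketch for quantile regression (not taken from the paper).

    Each iteration approximates the check loss rho_tau(r) = r * (tau - 1{r < 0})
    by a quadratic in the residuals and solves the resulting weighted
    least-squares problem.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # ordinary LS start
    for _ in range(max_iter):
        r = y - X @ beta
        # IRLS weights: quadratic approximation of the check loss at the
        # current residuals; eps guards against division by zero
        w = np.abs(tau - (r < 0)) / np.maximum(np.abs(r), eps)
        # weighted least-squares update: solve (X' W X) beta = X' W y
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# usage: median regression (tau = 0.5) on synthetic heavy-tailed data
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=100)
print(irls_quantile(X, y, tau=0.5))

Each step only requires a weighted least-squares solve, which is what makes it possible, as the abstract notes for the shape-constrained setting, to reuse fast solvers for special quadratic programs inside the iteration.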