A note on estimating a monotone regression by combining kernel and density estimates
Date
2005-10-12T06:59:17Z
Abstract
In a recent paper, Dette, Neumeyer and Pilz (2005) proposed a new nonparametric estimate of a monotone regression function. The method is based on a non-decreasing rearrangement of an arbitrary unconstrained nonparametric estimator. Under the assumption of a twice continuously differentiable regression function, the estimate is first-order asymptotically equivalent to the unconstrained estimate and to other types of monotone estimates. In this note we provide a more refined asymptotic analysis of the monotone regression estimate. We show that, in the case of a non-decreasing regression function, the new method produces an estimate with nearly the same Lp-norm as the given function for any p ≥ 1. Moreover, in the case where the regression function is increasing but only once continuously differentiable, we prove asymptotic normality of an appropriately standardized version of the estimate, where the asymptotic variance is of order n^{−2/3−ε}, the bias is of order n^{−1/3+ε}, and ε > 0 is arbitrarily small. The rate of convergence of the new estimate is therefore arbitrarily close to that of the monotone least squares estimate, but the asymptotic distribution of the new estimate is substantially simpler.
Keywords
greatest convex minorant, monotone estimation, Nadaraya-Watson estimate, order restricted inference