Eldorado Community:
http://hdl.handle.net/2003/9
2018-06-25T09:34:25Z
http://hdl.handle.net/2003/36880
Title: On axiomizing and extending the quasi-arithmetic mean
Authors: Hansen, Maurice
Abstract: Quasi-arithmetic means contain many other mean value concepts
such as the arithmetic, the geometric or the harmonic mean as
special cases. Treating quasi-arithmetic means as sequences of mappings
from I^n into I (for some real interval I), this paper shows that under
mild additional conditions this mapping is uniquely determined by its
values on I^2. This extends a well-known result by Huntington [4] where
this claim is proven only for special cases.

2018-05-29T08:46:48Z
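As an aside, the special cases named in the abstract above are easy to verify numerically: a quasi-arithmetic mean is g^{-1}((1/n) Σ g(x_i)) for a strictly monotone generator g, and choosing g(x) = x, log x, or 1/x recovers the arithmetic, geometric, and harmonic means. The following sketch is illustrative only and not taken from the paper:

```python
import math

def quasi_arithmetic_mean(xs, g, g_inv):
    """Quasi-arithmetic mean M(x) = g_inv( (1/n) * sum g(x_i) )."""
    return g_inv(sum(g(x) for x in xs) / len(xs))

xs = [1.0, 4.0, 16.0]

# g(x) = x      -> arithmetic mean (7.0 for this sample)
arith = quasi_arithmetic_mean(xs, lambda x: x, lambda y: y)
# g(x) = log x  -> geometric mean ((1*4*16)^(1/3) = 4.0)
geom = quasi_arithmetic_mean(xs, math.log, math.exp)
# g(x) = 1/x    -> harmonic mean (3 / (1 + 1/4 + 1/16))
harm = quasi_arithmetic_mean(xs, lambda x: 1.0 / x, lambda y: 1.0 / y)
```

The paper's uniqueness result concerns when the whole sequence of such mappings on I^n is pinned down by its values on I^2; the sketch only illustrates the definition.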
http://hdl.handle.net/2003/36879
Title: Simar and Wilson two-stage efficiency analysis for Stata
Authors: Badunenko, Oleg; Tauchmann, Harald
Abstract: When analyzing what determines the efficiency of production, regressing
efficiency scores estimated by DEA on explanatory variables has much intuitive
appeal. Simar and Wilson (2007) show that this naïve two-stage estimation
procedure suffers from severe flaws that render its results, and in particular
statistical inference based on them, questionable. At the same time they propose
a statistically grounded, bootstrap-based two-stage estimator that eliminates the
above-mentioned weaknesses of its naïve predecessors and comes in two variants.
This article introduces the new Stata command simarwilson that implements
either variant of the suggested estimator in Stata. The command allows for various
options, and extends the original procedure in some respects. For instance, it
allows for analyzing both output- and input-oriented efficiency. To demonstrate
the capabilities of the new command simarwilson we use data from the Penn
World Tables and the Global Competitiveness Report by the World Economic
Forum to perform a cross-country empirical study about the importance of quality
of governance of a country for its efficiency of output production.

2018-05-25T13:51:06Z
http://hdl.handle.net/2003/36842
Title: Robust discrimination between long-range dependence and a change in mean
Authors: Gerstenberger, Carina
Abstract: In this paper we introduce an outlier-robust Wilcoxon change-point testing procedure
for distinguishing between short-range dependent time series with a change in mean at unknown
time and stationary long-range dependent time series. We establish the asymptotic
distribution of the test statistic under the null hypothesis for L1 near epoch dependent
processes and show its consistency under the alternative. The Wilcoxon-type testing procedure,
like the CUSUM-type testing procedure of Berkes, Horváth, Kokoszka and
Shao (2006), requires estimating the location of a possible change-point and then using
pre- and post-break subsamples to discriminate between short- and long-range dependence.
A simulation study examines the empirical size and power of the Wilcoxon-type testing
procedure in standard cases and under contamination by outliers. It shows that in standard
cases the Wilcoxon-type testing procedure performs as well as the CUSUM-type testing
procedure but outperforms it in the presence of outliers.

2018-04-23T12:22:48Z
http://hdl.handle.net/2003/36820
Title: Deviations from triangular arbitrage parity in foreign exchange and bitcoin markets
Authors: Reynolds, Julia; Sögner, Leopold; Wagner, Martin; Wied, Dominik
Abstract: This paper applies new econometric tools to monitor and detect so-called "financial market dislocations",
defined as periods in which substantial deviations from arbitrage parities take place. In particular,
we focus on deviations from the triangular arbitrage parity for exchange rate triplets. Due to
increasing media attention towards mispricing in the market for cryptocurrencies, we include the cryptocurrency Bitcoin in addition to fiat currencies. We do not find evidence for substantial deviations
from the triangular arbitrage parity when only traditional fiat currencies are concerned. However, we
document significant deviations from triangular arbitrage parities in the newer markets for Bitcoin.

2018-03-27T14:57:15Z
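The triangular arbitrage parity studied above states that, absent mispricing, a cross rate equals the product of the two legs, e.g. EUR/JPY = EUR/USD × USD/JPY. A minimal sketch of the log deviation from this parity, with made-up quote values purely for illustration:

```python
import math

def triangular_deviation(eur_usd, usd_jpy, eur_jpy):
    """Log deviation from the parity EUR/JPY = EUR/USD * USD/JPY."""
    return math.log(eur_jpy) - math.log(eur_usd) - math.log(usd_jpy)

# Consistent quotes -> zero deviation (hypothetical numbers)
d0 = triangular_deviation(1.10, 150.0, 1.10 * 150.0)
# Mispriced cross rate -> nonzero deviation
d1 = triangular_deviation(1.10, 150.0, 166.0)
```

In the paper such deviations are monitored over time with econometric detection procedures; here a nonzero value merely flags an inconsistent triplet.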
http://hdl.handle.net/2003/36819
Title: Efficient designs for the estimation of mixed and self carryover effects
Authors: Kunert, Joachim; Mielke, Johanna
Abstract: Biosimilars are copies of biological medicines that are developed by a competitor
after the patent for the originator drug has expired. Extensive clinical trials are
required to show therapeutic equivalence between the biosimilar and its reference
product before a biosimilar can be sold on the market. However, even after more
than 10 years of experience with biosimilars in Europe, there is still some uncertainty
about whether patients who are already taking the reference product can switch between
the biosimilar and its reference product. One convenient way to assess the impact
of switches is the analysis of mixed and self carryover effects: if the products are
switchable, there should not be any difference in the carryover effects. This paper
determines a series of simple designs which are highly efficient for the comparison
of the mixed and self carryover effects of two treatments. The proof of efficiency
is not straightforward because the information matrix of the efficient designs is not
completely symmetric.

2018-03-27T14:54:28Z
http://hdl.handle.net/2003/36818
Title: The nonparametric location-scale mixture cure model
Authors: Chown, Justin; Heuchenne, Cédric; Van Keilegom, Ingrid
Abstract: We propose completely nonparametric methodology to investigate location-scale modelling of two-component mixture cure models, where the responses of interest are only indirectly observable due to the presence of censoring and the presence of so-called long-term survivors that are always censored. We use covariate-localized nonparametric estimators, which depend on a bandwidth sequence, to propose an estimator of the error distribution function that has not been considered before in the literature. When this bandwidth belongs to a certain range of undersmoothing bandwidths, the asymptotic distribution of the proposed estimator of the error distribution function does not depend on this bandwidth, and this estimator is shown to be root-n consistent. This suggests that a computationally costly bandwidth selection procedure is unnecessary to obtain an effective estimator of the error distribution, and that a simpler rule-of-thumb approach can be used instead.
A simulation study investigates the finite sample properties of our approach, and the methodology is illustrated using data obtained to study the behavior of distant metastasis in lymph-node-negative breast cancer patients.

2018-03-27T14:52:08Z
http://hdl.handle.net/2003/36800
Title: Equity and the Willingness to Pay for Green Electricity: Evidence from Germany
Authors: Andor, Mark; Frondel, Manuel; Sommer, Stephan
Abstract: The production of electricity on the basis of renewable energy technologies is a
classic example of an impure public good. It is often financed in a discriminatory manner by industrial
and household consumers, as in Germany, where the energy-intensive
sector benefits from far-reaching exemptions, while all other electricity consumers
are forced to bear a higher burden. Based on randomized information treatments
in a stated-choice experiment among about 11,000 German households, we explore
whether this coercive payment rule affects households’ willingness-to-pay (WTP) for
green electricity. Our central result is that reducing inequity by abolishing the exemption
for the energy-intensive industry raises households’ WTP, a finding that may
have high external validity.

2018-03-14T11:52:35Z
http://hdl.handle.net/2003/36799
Title: A study on the least square estimator of multiple isotonic regression function
Authors: Bagchi, Pramita; Dhar, Subhra Sankar
Abstract: Consider the problem of pointwise estimation of f in a multiple isotonic regression model Z = f(X1, ..., Xd) + ε, where Z is the response variable, f is an unknown non-parametric regression function,
which is isotonic with respect to each component, and ε is the error term. In this article, we investigate
the behaviour of the least square estimator of f and establish its asymptotic properties. We generalize the
greatest convex minorant characterization of the isotonic regression estimator to the multivariate case and use
it to establish the asymptotic distribution of a properly normalized version of the estimator. Moreover, based
on this estimator we test whether the multiple isotonic regression function at a fixed point is larger
(or smaller) than a specified value, and the consistency of the test is established. The practicability of the
estimator and the test is shown on simulated and real data as well.

2018-03-14T11:50:47Z
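For readers unfamiliar with isotonic least squares: in one dimension the estimator can be computed with the classical pool-adjacent-violators algorithm (PAVA), to which the multivariate greatest-convex-minorant characterization reduces when d = 1. A self-contained sketch (our own illustration, not the paper's multivariate procedure):

```python
def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    # Each block stores [sum of values, count]; its fitted value is the mean.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Merge backwards while the monotonicity constraint is violated.
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit
```

For example, pava([1.0, 3.0, 2.0]) pools the violating pair (3, 2) into their average 2.5, yielding the nondecreasing fit [1.0, 2.5, 2.5].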
http://hdl.handle.net/2003/36786
Title: On detecting changes in the jumps of arbitrary size of a time-continuous stochastic process
Authors: Hoffmann, Michael
Abstract: This paper introduces test and estimation procedures for abrupt and gradual changes in
the entire jump behaviour of a discretely observed Ito semimartingale. In contrast to existing
work we analyse jumps of arbitrary size which are not restricted to a minimum height. Our
methods are based on weak convergence of a truncated sequential empirical distribution
function of the jump characteristic of the underlying Ito semimartingale. Critical values
for the new tests are obtained by a multiplier bootstrap approach and we investigate the
performance of the tests also under local alternatives. An extensive simulation study shows
the finite-sample properties of the new procedures.

2018-03-02T12:37:55Z
http://hdl.handle.net/2003/36785
Title: Universally optimal crossover designs for the estimation of mixed-carryover effects with an application to biosimilar development
Authors: Mielke, Johanna; Kunert, Joachim
Abstract: Biosimilars are medical products that are developed as copies of already
established, large molecule drugs (biologics). For gaining approval, sponsors have to
confirm that the proposed biosimilar has the same efficacy and safety as the originator
product. In most cases, this comparability exercise also requires that large clinical
trials are conducted in patients. However, even with the evidence gained during the
clinical studies, there is still some uncertainty about whether patients who were already treated
with the originator can be switched to the biosimilar or whether even multiple switches between
the biosimilar and the originator are acceptable. A simple way to address the
question of switchability is the estimation of so-called mixed and self-carryover effects,
which are carryover effects that not only depend on the treatment in the current
period, but also on the treatment in the previous period. In this paper, we determine
universally optimal designs for the estimation of mixed-carryover effects in a linear
model with treatment, period, subject and self-carryover as nuisance parameters.

2018-03-02T12:35:46Z
http://hdl.handle.net/2003/36782
Title: A likelihood ratio approach to sequential change point detection
Authors: Dette, Holger; Gösmann, Josua
Abstract: In this paper we propose a new approach for sequential monitoring of a parameter
of a d-dimensional time series. We consider a closed-end method, which is motivated
by the likelihood ratio test principle, and compare the new method with two alternative
procedures. We also incorporate self-normalization such that estimation of the long-run
variance is not necessary. We prove that for a large class of testing problems the
new detection scheme has asymptotic level α and is consistent. The asymptotic theory
is illustrated for the important cases of monitoring a change in the mean, variance and
correlation. By means of a simulation study it is demonstrated that the new test performs
better than the currently available procedures for these problems.

2018-02-28T12:58:43Z
http://hdl.handle.net/2003/36346
Title: Change point analysis in non-stationary processes - a mass excess approach
Authors: Dette, Holger; Wu, Weichi
Abstract: This paper considers the problem of testing whether a sequence of means (μt)t=1,...,n of a non-stationary time series (Xt)t=1,...,n is stable in the sense that the difference between the initial mean μ1 and the mean μt at any other time t is smaller than a given level, that is |μ1 − μt| ≤ c for all t = 1,...,n. A test for hypotheses of this type is developed using a bias-corrected monotone rearranged local linear estimator, and asymptotic normality of the corresponding test statistic is established. As the asymptotic variance depends on the location and order of the critical roots of the equation |μ1 − μt| = c, a new bootstrap procedure is proposed to obtain critical values and its consistency is established. As a consequence we are able to quantitatively describe relevant deviations of a non-stationary sequence from its initial value. The results are illustrated by means of a simulation study and by analyzing data examples.

2018-02-01T11:52:48Z
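The stability hypothesis above, |μ1 − μt| ≤ c for all t, can be mimicked with a toy plug-in statistic: estimate the mean curve by a rolling average and take the maximal deviation from the initial level. This is only a heuristic sketch with our own helper name (the paper instead uses a bias-corrected monotone rearranged local linear estimator with bootstrap critical values):

```python
def max_deviation_from_start(x, window=5):
    """Rolling-mean estimate of the mean curve and the maximal absolute
    deviation max_t |mu_hat(1) - mu_hat(t)| from the initial level."""
    means = [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]
    mu1 = means[0]
    return max(abs(mu1 - m) for m in means)

# A constant sequence stays within any level c > 0,
# while a level shift of size 3 is picked up by the statistic.
stable = max_deviation_from_start([1.0] * 20)
shifted = max_deviation_from_start([0.0] * 10 + [3.0] * 10)
```

One would reject "stable at level c" when the statistic exceeds c by more than sampling noise allows; calibrating that threshold is exactly what the paper's bootstrap is for.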
http://hdl.handle.net/2003/36310
Title: Does financial compensation increase the acceptance of power lines? Evidence from Germany
Authors: Simora, Michael; Frondel, Manuel; Vance, Colin
Abstract: Although public support for renewable energy promotion in Germany is
strong, the required power line construction has incited a groundswell of opposition
from residents concerned about the impacts on their neighborhoods. This paper
evaluates a large randomized one-shot binary-choice experiment to examine the
effect of different compensation schemes on the acceptance of new power line construction.
Results reveal that community compensations have no bearing on the acceptance
level, whereas personal compensations have a negative effect. Two possible
channels through which financial compensation reduces the willingness-to-accept are
(1) crowding out of intrinsic motivation to support the construction project and (2) a
signaling effect that alerts residents to potential negative impacts of the power lines.
Both explanations call into question the efficacy of financial payments to decrease local
opposition.

2017-12-21T10:03:16Z
http://hdl.handle.net/2003/36232
Title: Predatory short sales and bailouts
Authors: Kranz, Sebastian; Löffler, Gunter; Posch, Peter N.
Abstract: This paper extends the literature on predatory short selling and bailouts
through a joint analysis of the two. We consider a model with informed short
sales, as well as uninformed predatory short sales, which can trigger the inefficient
liquidation of a firm. We obtain several novel results: A government commitment
to bail out insolvent firms with positive probability can increase welfare because
it selectively deters predatory short selling without hampering desirable informed
short sales. Contrasting a common view, bailouts can be optimal ex ante but
undesirable ex post. Furthermore, bailouts in our model are a better policy tool
than short selling restrictions. Welfare gains from the bailout policy are unevenly
distributed: shareholders gain while taxpayers lose. Bailout taxes allow ex-ante
Pareto improvements.

2017-12-04T13:50:38Z
http://hdl.handle.net/2003/36181
Title: Bayesian optimal designs for dose-response curves with common parameters
Authors: Schorning, Kirsten; Konstantinou, Maria
Abstract: The issue of determining not only an adequate dose but also a dosing frequency
of a drug arises frequently in Phase II clinical trials. This results in the comparison
of models which have some parameters in common. Planning such studies based on
Bayesian optimal designs offers robustness to our conclusions since these designs,
unlike locally optimal designs, are efficient even if the parameters are misspecified.
In this paper we develop approximate design theory for Bayesian D-optimality for
nonlinear regression models with common parameters and investigate the cases of
common location or common location and scale parameters separately. Analytical
characterisations of saturated Bayesian D-optimal designs are derived for frequently
used dose-response models and the advantages of our results are illustrated via a
numerical investigation.

2017-11-14T13:23:20Z
http://hdl.handle.net/2003/36180
Title: Der Wert von Versorgungssicherheit mit Strom: Evidenz für deutsche Haushalte
Authors: Frondel, Manuel; Sommer, Stephan
Abstract: Based on a survey of more than 5,000 heads of household, this article examines
how much they are willing to pay for security of electricity supply. As an alternative
to the willingness to pay (WTP), respondents are also asked about their willingness to
forgo a certain degree of supply security in exchange for a compensation payment
(willingness to accept, WTA). In line with numerous empirical studies, we find mean
WTA values that lie well above the mean WTP values for avoiding an unannounced
four-hour power outage. We attribute this discrepancy to the fact that the stated
compensation demands for forgoing supply security tend to exceed the true value
attached to security of electricity supply, whereas the stated willingness to pay
for it tends to be understated.

2017-11-14T13:22:11Z
http://hdl.handle.net/2003/36169
Title: A test for separability in covariance operators of random surfaces
Authors: Bagchi, Pramita; Dette, Holger
Abstract: The assumption of separability is a simplifying and very popular assumption in
the analysis of spatio-temporal or hypersurface data structures. It is often made in
situations where the covariance structure cannot be easily estimated, for example
because of a small sample size or because of computational storage problems. In
this paper we propose a new and very simple test to validate this assumption. Our
approach is based on a measure of separability which is zero in the case of separability
and positive otherwise. The measure can be estimated without calculating
the full non-separable covariance operator. We prove asymptotic normality of the
corresponding statistic with a limiting variance, which can easily be estimated from
the available data. As a consequence, quantiles of the standard normal distribution
can be used to obtain critical values, and the new test of separability is very easy to
implement. In particular, our approach requires neither projections onto subspaces
generated by the eigenfunctions of the covariance operator, nor resampling
procedures to obtain critical values, nor distributional assumptions as recently used
by Aston et al. (2017) and Constantinou et al. (2017) to construct tests for separability.
We investigate the finite sample performance by means of a simulation study
and also provide a comparison with the currently available methodology. Finally,
the new procedure is illustrated analyzing wind speed and temperature data.

2017-11-08T09:31:50Z
http://hdl.handle.net/2003/36168
Title: Optimal designs for regression with spherical data
Authors: Dette, Holger; Konstantinou, Maria; Schorning, Kirsten; Gösmann, Josua
Abstract: In this paper optimal designs for regression problems with spherical predictors of
arbitrary dimension are considered. Our work is motivated by applications in material
sciences, where crystallographic textures such as the misorientation distribution
or the grain boundary distribution (depending on a four-dimensional spherical predictor)
are represented by series of hyperspherical harmonics, which are estimated
from experimental or simulated data.
For this type of estimation problem we explicitly determine optimal designs with
respect to Kiefer's Φp-criteria and a class of orthogonally invariant information criteria
recently introduced in the literature. In particular, we show that the uniform
distribution on the m-dimensional sphere is optimal and construct discrete and implementable
designs with the same information matrices as the continuous optimal
designs. Finally, we illustrate the advantages of the new designs for series estimation
by hyperspherical harmonics, which are symmetric with respect to the first and
second crystallographic point group.

2017-11-08T09:29:23Z
http://hdl.handle.net/2003/36129
Title: Functional data analysis in the Banach space of continuous functions
Authors: Dette, Holger; Kokot, Kevin; Aue, Alexander
Abstract: Functional data analysis is typically conducted within the L2-Hilbert space framework. There is by now a fully developed statistical toolbox allowing for the principled application of the functional data machinery to real-world problems, often based on dimension reduction techniques such as functional principal component analysis. At the same time, there have recently been a number of publications that sidestep dimension reduction steps and focus on a fully functional L2-methodology. This paper goes one step further and develops data analysis methodology for functional time series in the space of all continuous functions. The work is motivated by the fact that objects with rather different shapes may still have a small L2-distance and are therefore identified as similar when using an L2-metric. However, in applications it is often desirable to
use metrics reflecting the visualization of the curves in the statistical analysis. The methodological contributions are focused on developing two-sample and change-point tests as well as confidence bands, as these procedures appear to be conducive to the proposed setting. Particular interest is put on relevant differences; that is, on not trying to test for exact equality, but rather for pre-specified deviations under the null hypothesis.
The procedures are justified through large-sample theory. To ensure practicability, nonstandard bootstrap procedures are developed and investigated, addressing particular features that arise in the problem of testing relevant hypotheses. The finite sample properties are explored through a simulation study and an application to annual temperature profiles.

2017-10-19T14:17:45Z
http://hdl.handle.net/2003/36099
Title: Optimal designs for enzyme inhibition kinetic models
Authors: Schorning, Kirsten; Dette, Holger; Kettelhake, Katrin; Möller, Tilman
Abstract: In this paper we present a new method for determining optimal designs for enzyme
inhibition kinetic models, which are used to model the influence of the concentrations of a
substrate and an inhibitor on the velocity of a reaction. The approach uses a nonlinear
transformation of the vector of predictors such that the model in the new coordinates is
given by an incomplete response surface model. Although no explicit solutions of
the optimal design problem for incomplete response surface models exist so far, the
corresponding design problem in the new coordinates is substantially more transparent, such
that explicit or numerical solutions can be determined more easily. The designs for the
original problem can finally be found by an inverse transformation of the optimal designs
determined for the response surface model. We illustrate the method by determining explicit
solutions for the D-optimal design problem and for the optimal design problem of estimating the
individual coefficients in a non-competitive enzyme inhibition kinetic model.

2017-09-15T13:15:48Z
http://hdl.handle.net/2003/36098
Title: Combining cumulative sum change-point detection tests for assessing the stationarity of univariate time series
Authors: Bücher, Axel; Fermanian, Jean-David; Kojadinovic, Ivan
Abstract: We derive tests of stationarity for continuous univariate time series by combining changepoint
tests sensitive to changes in the contemporary distribution with tests sensitive to
changes in the serial dependence. Rank-based cumulative sum tests based on the empirical
distribution function and on the empirical autocopula at a given lag are considered first.
The combination of their dependent p-values relies on a joint dependent multiplier bootstrap
of the two underlying statistics. Conditions under which the proposed combined testing
procedure is asymptotically valid under stationarity are provided. After discussing the
choice of the maximum lag to investigate, extensions based on tests solely focusing on second-order
characteristics are proposed. The finite-sample behaviors of all the derived statistical
procedures are investigated in large-scale Monte Carlo experiments and illustrations on two
real data sets are provided. Extensions to multivariate time series are briefly discussed as
well.

2017-09-15T12:55:28Z
http://hdl.handle.net/2003/36083
Title: A nonparametric test for stationarity in functional time series
Authors: van Delft, Anne; Bagchi, Pramita; Characiejus, Vaidotas; Dette, Holger
Abstract: We propose a new measure for stationarity of a functional time series, which is based on an explicit representation of the L2-distance between the spectral density operator of a non-stationary process and its best (L2-)approximation by a spectral density operator corresponding to a stationary process. This distance can easily be estimated by sums of Hilbert-Schmidt inner products of periodogram operators (evaluated at different frequencies), and asymptotic normality of an appropriately standardised version of the estimator can be established for the corresponding estimate under the null hypothesis and alternative. As a
result we obtain confidence intervals for the discrepancy of the underlying process from a functional stationary process and a simple asymptotic frequency domain level α test (using the quantiles of the normal distribution) for the hypothesis of stationarity of functional time series. Moreover, the new methodology also allows testing precise hypotheses of the form “the functional time series is approximately stationary”, which means that the new measure of stationarity is smaller than a given threshold. Thus, in contrast to methods proposed in the literature, our approach also allows testing for “relevant” deviations from stationarity. We demonstrate in a small simulation study that the new method has very good finite sample properties and compare it with the currently available alternative procedures. Moreover, we apply our test to annual temperature curves.

2017-09-06T13:48:08Z
http://hdl.handle.net/2003/36037
Title: Behavioral economics and energy conservation - a systematic review of nonprice interventions and their causal effects
Authors: Andor, Mark; Fels, Katja
Abstract: Research from economics and psychology suggests that behavioral
interventions can be a powerful climate policy instrument. This paper
provides a systematic review of the existing empirical evidence on non-price
interventions targeting energy conservation behavior of private households.
Specifically, we analyze the four nudge-like interventions referred to as social
comparison, pre-commitment, goal setting and labeling in 38 international
studies comprising 91 treatments. This paper differs from previous systematic
reviews by solely focusing on studies that permit the identification of causal
effects. We find that all four interventions have the potential to significantly
reduce energy consumption of private households, yet effect sizes vary
immensely. We conclude by emphasizing the importance of impact
evaluations before rolling out behavioral policy interventions at scale.

2017-07-28T12:09:18Z
http://hdl.handle.net/2003/35989
Title: A note on conditional versus joint unconditional weak convergence in bootstrap consistency results
Authors: Bücher, Axel; Kojadinovic, Ivan
Abstract: The consistency of a bootstrap or resampling scheme is classically validated by weak convergence of conditional laws. However, when working with stochastic processes in the space of bounded functions and their weak convergence in the Hoffmann-Jørgensen sense, an obstacle occurs: due to possible non-measurability, neither laws nor conditional laws are well-defined. Starting from an equivalent formulation of weak convergence based on the bounded Lipschitz metric, a classical workaround is to formulate bootstrap consistency in terms of the latter distance between what might be called a conditional law of the (non-measurable) bootstrap process and the law of the limiting process. The main contribution of this note is to provide an equivalent formulation of bootstrap consistency in the space of bounded functions which is more intuitive and easier to work with. Essentially, the equivalent formulation consists of (unconditional) weak convergence of the original process jointly with an arbitrarily large number of bootstrap replicates. As a by-product, we provide two equivalent formulations of bootstrap consistency for R^d-valued statistics: the first in terms of (unconditional) weak convergence of the statistic jointly with its bootstrap replicates, the second in terms of convergence in probability of the empirical distribution function of the bootstrap replicates. Finally, the asymptotic validity of bootstrap-based confidence intervals and tests is briefly revisited, with particular emphasis on the, in practice unavoidable, Monte Carlo approximation of conditional quantiles.

2017-06-10T12:42:56Z
http://hdl.handle.net/2003/35988
Title: Inference for heavy tailed stationary time series based on sliding blocks
Authors: Bücher, Axel; Segers, Johan
Abstract: The block maxima method in extreme value theory consists of fitting an extreme value distribution to a sample of block maxima extracted from a time series. Traditionally, the maxima are taken over disjoint blocks of observations. Alternatively, the blocks can be chosen to slide through the observation period, yielding a
larger number of overlapping blocks. Inference based on sliding blocks is found to be more efficient than inference based on disjoint blocks. The asymptotic variance of the maximum likelihood estimator of the Fréchet shape parameter is reduced by more than 18%. Interestingly, the amount of the efficiency gain is the same whatever the serial dependence of the underlying time series: as for disjoint blocks, the asymptotic
distribution depends on the serial dependence only through the sequence of scaling constants. The findings are illustrated by simulation experiments and are applied to the estimation of high return levels of the daily log-returns of the Standard & Poor's 500 stock market index.
2017-06-10T12:40:34Z
Die Gerechtigkeitslücke in der Verteilung der Kosten der Energiewende auf die privaten Haushalte
http://hdl.handle.net/2003/35978
Title: Die Gerechtigkeitslücke in der Verteilung der Kosten der Energiewende auf die privaten Haushalte
Authors: Frondel, Manuel; Kutzschbauch, Ole; Sommer, Stephan; Traub, Stefan
Abstract: The Energiewende (Germany's energy transition) places a growing burden on consumers.
Relative to their income, these burdens weigh more heavily on low-income households
than on high-income households. The results of our empirical survey of more than
11,000 households show, however, that respondents generally favour a distribution of
the costs of the Energiewende that places comparatively more of the burden on
high-income households than on low-income households. The justice gap we identify
on this basis, between the desired and the actual cost burden on households, is
likely to widen further as the costs of the Energiewende grow. In principle,
however, this gap could easily be closed, as the empirical estimates of households'
willingness to pay for the promotion of renewables presented in this article, based
on discrete-choice models, suggest. Higher-income households could thus be required
to contribute more to financing the Energiewende than they currently do, since
according to our estimates households in the top income third show statistically
significantly higher approval of future increases in the EEG surcharge than
households in the bottom income third.
2017-06-01T10:13:43Z
The speed of transition revisited
http://hdl.handle.net/2003/35942
Title: The speed of transition revisited
Authors: Naevdal, Eric; Wagner, Martin
Abstract: The speed of transition literature appears to have overlooked the fact that due to the
dynamic nature of the economy the post-transition economic performance influences optimal
behavior already during transition. We illustrate the implications of this neglect
using the well-known model of Aghion and Blanchard (1994, Section 6.4). The correct
solution differs in several respects from the "approximate" solution presented by Aghion
and Blanchard. First, unemployment is increasing up to a certain endogenous point in
time, when, second, the remaining state sector is closed down. This point in time can be
defined as the end of transition. The correct solution is based on transforming the problem
into a type of dynamic optimization problem often encountered in resource economics: a
scrap value problem with free terminal time.
2017-05-02T12:17:22Z
Consequentiality and the Willingness-To-Pay for Renewables: Evidence from Germany
http://hdl.handle.net/2003/35937
Title: Consequentiality and the Willingness-To-Pay for Renewables: Evidence from Germany
Authors: Andor, Mark A.; Frondel, Manuel; Horvath, Marco
Abstract: Based on hypothetical responses originating from a large-scale survey among
about 7,000 German households, this study investigates the discrepancy in willingness-to-
pay (WTP) estimates for green electricity across discrete-choice and open-ended valuation
formats, thereby accounting for perceived consequentiality: respondents self-select
into two groups distinguished by their belief in the consequentiality of their
answers for policy making. Recognizing that consequentiality status and WTP might
be jointly influenced by unobservable factors, we employ a switching regression model
that accounts for the potential endogeneity of respondents’ belief in consequences and,
hence, biases from sample selectivity. In contrast to the received literature, we find
that WTP bids tend to be higher among respondents who received questions
in the open-ended format rather than single binary choice questions. This difference
shrinks, however, when focusing on individuals who perceive the survey as politically
consequential.
2017-04-25T07:44:39Z
Relevant change points in high dimensional time series
http://hdl.handle.net/2003/35934
Title: Relevant change points in high dimensional time series
Authors: Dette, Holger; Gösmann, Josua
2017-04-19T13:32:46Z
Sequential detection of parameter changes in dynamic conditional correlation models
http://hdl.handle.net/2003/35914
Title: Sequential detection of parameter changes in dynamic conditional correlation models
Authors: Pape, Katharina; Galeano, Pedro; Wied, Dominik
Abstract: A multivariate monitoring procedure is presented to detect changes in the parameter vector of
the dynamic conditional correlation model proposed by Robert Engle in 2002. The benefit of
the proposed procedure is that it can be used to detect changes in both the conditional and
unconditional variance as well as in the correlation structure of the model. The detector is based
on quasi log likelihood scores. More precisely, standardized derivatives of quasi log likelihood
contributions of points in the monitoring period are evaluated at parameter estimates calculated
from a historical period. The null hypothesis of a constant parameter vector is rejected if these
standardized terms differ too much from those that were expected under the assumption of a
constant parameter vector. Under appropriate assumptions on moments and the structure of
the parameter space, limit results are derived both under the null hypothesis and under alternatives. In a
simulation study, size and power properties of the procedure are examined in various scenarios.
2017-04-06T11:27:08Z
Fourier analysis of serial dependence measures
http://hdl.handle.net/2003/35853
Title: Fourier analysis of serial dependence measures
Authors: Van Hecke, Ria; Volgushev, Stanislav; Dette, Holger
Abstract: Classical spectral analysis is based on the discrete Fourier transform of the auto-covariances.
In this paper we investigate the asymptotic properties of new frequency domain methods where the auto-covariances in the spectral density are replaced by alternative dependence measures which can be estimated by U-statistics. An interesting example is given by
Kendall's τ, for which the limiting variance exhibits a surprising behavior.
2017-03-15T11:40:32Z
Cointegration in singular ARMA models
http://hdl.handle.net/2003/35778
Title: Cointegration in singular ARMA models
Authors: Deistler, Manfred; Wagner, Martin
Abstract: We consider the cointegration properties of singular ARMA processes integrated of order one.
Such processes are necessarily cointegrated as opposed to the regular case. We show that in the
left coprime case the cointegrating space only depends upon the autoregressive polynomial at
one.
2017-02-03T11:14:06Z
Risk estimators for choosing regularization parameters in ill-posed problems - properties and limitations
http://hdl.handle.net/2003/35772
Title: Risk estimators for choosing regularization parameters in ill-posed problems - properties and limitations
Authors: Lucka, Felix; Proksch, Katharina; Brune, Christoph; Bissantz, Nicolai; Burger, Martin; Dette, Holger; Wübbeling, Frank
Abstract: This paper discusses the properties of certain risk estimators recently proposed to
choose regularization parameters in ill-posed problems. A simple approach is Stein's unbiased
risk estimator (SURE), which estimates the risk in the data space, while a recent
modification (GSURE) estimates the risk in the space of the unknown variable. It seems
intuitive that the latter is more appropriate for ill-posed problems, since the properties
in the data space do not tell much about the quality of the reconstruction. We provide
theoretical studies of both estimators for linear Tikhonov regularization in a finite
dimensional setting and estimate the quality of the risk estimators, which also leads to
asymptotic convergence results as the dimension of the problem tends to infinity. Unlike
previous papers, which studied image processing problems with a very low degree of
ill-posedness, we are interested in the behavior of the risk estimators for increasing ill-posedness.
Interestingly, our theoretical results indicate that the quality of the GSURE
risk can deteriorate asymptotically for ill-posed problems, which is confirmed by a detailed
numerical study. The latter shows that in many cases the GSURE estimator leads
to extremely small regularization parameters, which obviously cannot stabilize the reconstruction.
Similar but less severe issues with respect to robustness also appear for the
SURE estimator, which in comparison to the rather conservative discrepancy principle
leads to the conclusion that regularization parameter choice based on unbiased risk estimation
is not a reliable procedure for ill-posed problems. A similar numerical study for
sparsity regularization demonstrates that the same issue appears in nonlinear variational
regularization approaches.
2017-02-01T10:36:22Z
Climate change, population ageing and public spending: Evidence on individual preferences
http://hdl.handle.net/2003/35771
Title: Climate change, population ageing and public spending: Evidence on individual preferences
Authors: Andor, Mark; Schmidt, Christoph M.; Sommer, Stephan
Abstract: Economic theory, as well as empirical research, suggests that elderly people
prefer public spending on policies yielding short-term benefits. This might be bad
news for policies aimed at combating climate change: while the unavoidable costs of
these policies arise today, the expected benefits occur in the distant future. Drawing
on data from over 12,000 households and using the ordered logit and the generalized
ordered logit model, we analyze whether attitudes towards climate change and climate
policies, as well as public spending preferences, differ with respect to age. Our
estimates show that elderly people are less concerned about climate change, but more
concerned about other global challenges. Furthermore, they are less likely to support
climate-friendly policies, such as the subsidization of renewables, and allocate less
public resources to environmental policies. Thus, our results suggest that the ongoing
demographic change in industrialized countries may undermine climate policies.
2017-01-31T14:58:48Z
Robust estimation of change-point location
http://hdl.handle.net/2003/35748
Title: Robust estimation of change-point location
Authors: Gerstenberger, Carina
Abstract: We introduce a robust estimator of the location parameter for the change-point in the
mean based on the Wilcoxon statistic and establish its consistency for L1 near epoch
dependent processes. It is shown that the consistency rate depends on the magnitude
of change. A simulation study is performed to evaluate finite sample properties of the
Wilcoxon-type estimator in standard cases, as well as under heavy-tailed distributions and
disturbances by outliers, and to compare it with a CUSUM-type estimator. It shows that
the Wilcoxon-type estimator is equivalent to the CUSUM-type estimator in standard cases,
but outperforms the CUSUM-type estimator in the presence of heavy tails or outliers in the
data.
2017-01-11T09:33:07Z
On MSE-optimal crossover designs
http://hdl.handle.net/2003/35743
Title: On MSE-optimal crossover designs
Authors: Neumann, Christoph; Kunert, Joachim
Abstract: In crossover designs, each subject receives a series of treatments
one after the other. Most papers on optimal crossover designs consider an
estimate which is corrected for carryover effects. We look at the estimate
for direct effects of treatment, which is not corrected for carryover effects.
If there are carryover effects, this estimate will be biased. We try to find a
design that minimizes the mean square error, that is the sum of the squared
bias and the variance. It turns out that the designs which are optimal for
the corrected estimate are highly efficient for the uncorrected estimate.
2017-01-06T12:30:55Z
Ordinal pattern dependence between hydrological time series
http://hdl.handle.net/2003/35732
Title: Ordinal pattern dependence between hydrological time series
Authors: Fischer, Svenja; Schumann, Andreas; Schnurr, Alexander
Abstract: Ordinal patterns provide a method to measure correlation between time series. In
contrast to classical correlation measures like the Pearson correlation coefficient they
are able to measure not only linear correlation but also non-linear correlation even
in the presence of non-stationarity. Hence, they are a noteworthy alternative to the
classical approaches when considering discharge series. Discharge series naturally
show a high variation as well as single extraordinary extreme events and, caused by
anthropogenic and climatic impacts, non-stationary behaviour. Here, the method
of ordinal patterns is used to compare pairwise discharge series derived from macro- and
mesoscale catchments in Germany. Differences of coincident groups were detected
for winter and summer annual maxima. Hydrological series that are mainly
driven by annual climatic conditions (yearly discharges and low water discharges)
showed different and in some cases surprising interdependencies between macroscale
catchments. Anthropogenic impacts such as the construction of a reservoir or different
flood conditions caused by urbanization could be detected.
2016-12-22T12:29:13Z
A simple test for white noise in functional time series
http://hdl.handle.net/2003/35731
Title: A simple test for white noise in functional time series
Authors: Bagchi, Pramita; Characiejus, Vaidotas; Dette, Holger
Abstract: We propose a new procedure for white noise testing of a functional time series.
Our approach is based on an explicit representation of the L2-distance between the
spectral density operator and its best (L2-)approximation by a spectral density operator
corresponding to a white noise process. The estimation of this distance can be
easily accomplished by sums of periodogram kernels, and it is shown that an appropriately
standardized version of the estimator is asymptotically normally distributed
under the null hypothesis (of functional white noise) and under the alternative. As a
consequence we obtain a very simple test (using the quantiles of the normal distribution)
for the hypothesis of a white noise functional process. In particular, the test
requires neither the estimation of a long run variance (including a fourth order
cumulant) nor resampling procedures to calculate critical values. Moreover, in contrast
to all other methods proposed in the literature, our approach also allows testing
for "relevant" deviations from white noise and constructing confidence intervals for
a measure of the discrepancy of the underlying process from a functional
white noise process.
2016-12-22T12:27:11Z
A multivariate approach for onset detection using supervised classification
http://hdl.handle.net/2003/35699
Title: A multivariate approach for onset detection using supervised classification
Authors: Bauer, Nadja; Friedrichs, Klaus; Weihs, Claus
Abstract: In this paper we introduce a new onset detection approach which incorporates a
supervised classification model for estimating the tone onset probability in signal
frames. In contrast to the most classical strategies where only one detection
function can be applied for signal feature extraction, the classification model
can be fitted on a large feature set. This is meaningful since, depending on the
music characteristics, some detection functions can be more advantageous than
others.
Although the idea of considering many detection functions is not new
in the literature, these functions have so far been treated in a univariate way, e.g.,
by building weighted sums. This is probably due to the difficulty of directly
transferring classification ideas to the onset detection task. The goodness
measure for onset detection is based on the comparison of two time
vectors, whereas in classification such a measure is derived from the frame-wise
matches of predicted and true labels.
In this work we first construct, based on several recent publications, a
comprehensive univariate onset detection algorithm which depends on many freely
settable parameters. Then the new multivariate approach, also depending on
many free parameters, is introduced. The parameters of both onset detection
strategies are optimized for the online and offline cases by utilizing an appropriate
validation technique. The main finding is that the multivariate strategy significantly
outperforms the univariate one with regard to the F-measure. Furthermore,
the multivariate approach seems to be especially beneficial in the online case since
it requires only half of the future signal information compared to the best
setting of the univariate onset detection.
2016-12-14T14:10:55Z
Time efficient optimization of instance based problems with application to tone onset detection
http://hdl.handle.net/2003/35698
Title: Time efficient optimization of instance based problems with application to tone onset detection
Authors: Bauer, Nadja; Friedrichs, Klaus; Weihs, Claus
Abstract: A time efficient optimization technique for instance based problems is proposed,
where for each parameter setting the target function has to be evaluated on a
large set of problem instances. Computational time is reduced by beginning with
a performance estimation based on the evaluation of a representative subset of
instances. Subsequently, only promising settings are evaluated on the whole
data set.
As an application, a comprehensive music onset detection algorithm is introduced
in which several numerical and categorical algorithm parameters are optimized
simultaneously. Here, problem instances are music pieces from a database.
Sequential model based optimization is an appropriate technique to solve this
optimization problem. The proposed optimization strategy is compared to the
usual model based approach with respect to the goodness measure for tone onset
detection. The performance of the proposed method appears to be competitive
with the usual one while saving more than 84% of instance evaluation time
on average. Another aspect is a comparison of two strategies for handling
categorical parameters in Kriging-based optimization.
2016-12-14T14:08:04Z
A Bayesian heterogeneous coefficients spatial autoregressive panel data model of retail fuel duopoly pricing
http://hdl.handle.net/2003/35678
Title: A Bayesian heterogeneous coefficients spatial autoregressive panel data model of retail fuel duopoly pricing
Authors: LeSage, James P.; Vance, Colin; Chih, Yao-Yu
Abstract: We apply a heterogeneous coefficient spatial autoregressive panel model to explore
competition/cooperation by duopoly pairs of German fueling stations in setting prices
for diesel and E5 fuel. We rely on a Markov Chain Monte Carlo (MCMC) estimation
methodology applied with non-informative priors, which produces estimates equivalent
to those from (quasi-) maximum likelihood. We explore station-level pricing behavior
using pairs of proximately situated fueling stations with no nearby neighbors. Our sample
data represents average daily diesel and E5 fuel prices, and refinery cost information
covering more than 487 days.
The heterogeneous coefficients spatial autoregressive panel data model uses the large
sample of daily time periods to produce spatial autoregressive model estimates for each
fueling station. These estimates provide information regarding the price reaction function
of each station to its duopoly rival station. This is in contrast to conventional
estimates of price reaction functions that average over the entire cross-sectional sample
of stations. We show how these estimates can be used to infer competition versus
cooperation in price setting by individual stations.
2016-11-30T12:01:00Z
Joint modeling of annual maximum precipitation across different duration levels
http://hdl.handle.net/2003/35667
Title: Joint modeling of annual maximum precipitation across different duration levels
Authors: Gräler, Benedikt; Fischer, Svenja; Schumann, Andreas
Abstract: Summarizing a series of rainfall events for different duration levels by their annual maxima provides
valuable information. These statistics are, e.g., the design basis of urban drainage systems. Investigating
an entire set of duration levels, the dependence among them has to be taken into account. We propose
an approach where a set of generalized extreme value distributions and a D-vine copula are flexibly
parameterized by the set of duration levels of interest. A priori, it is not necessary to fix the duration
levels nor the number of duration levels. This joint model produces increasing values for both longer
duration levels and larger return periods. In a sample application, we show that this model is flexible
enough to capture variations across the duration levels while reproducing the correlation structure of
the data. A joint probabilistic model allows the study of a new set of design questions where conditional
probabilities or joint return periods are of interest. This is for instance the case when nested
sub-basins are studied. An urban area within a larger catchment will be sensitive to annual maxima of
shorter durations due to high intensities, while the enclosing catchment is prone to annual maxima of
long durations due to huge volumes. A risk analysis of the entire catchment requires a joint study of
both and an approach where the duration levels' dependence is taken into account.
2016-11-28T12:55:24Z
Tests for scale changes based on pairwise differences
http://hdl.handle.net/2003/35630
Title: Tests for scale changes based on pairwise differences
Authors: Gerstenberger, Carina; Vogel, Daniel; Wendler, Martin
Abstract: In many applications it is important to know whether the amount of fluctuation in a
series of observations changes over time. In this article, we investigate different tests for
detecting change in the scale of mean-stationary time series. The classical approach, based
on the CUSUM test applied to the squared centered observations, is very vulnerable to outliers and
impractical for heavy-tailed data, which leads us to contemplate test statistics based on
alternative, less outlier-sensitive scale estimators.
It turns out that the tests based on Gini's mean difference (the average of all pairwise
distances) or generalized Qn estimators (sample quantiles of all pairwise distances) are very
suitable candidates. They improve upon the classical test not only under heavy tails or in
the presence of outliers, but also under normality. An explanation for this counterintuitive
result is that the corresponding long-run variance estimates are less affected by a scale
change than in the case of the sample-variance-based test.
We use recent results on the process convergence of U-statistics and U-quantiles for
dependent sequences to derive the limiting distribution of the test statistics and propose
estimators for the long-run variance. We perform a simulation study to investigate the
finite sample behavior of the tests and their power. Furthermore, we demonstrate the
applicability of the new change-point detection methods in two real-life data examples
from hydrology and finance.
2016-11-24T15:46:45Z
On Wigner-Ville spectra and the unicity of time-varying quantile-based spectral densities
http://hdl.handle.net/2003/35629
Title: On Wigner-Ville spectra and the unicity of time-varying quantile-based spectral densities
Authors: Birr, Stefan; Dette, Holger; Hallin, Marc; Kley, Tobias; Volgushev, Stanislav
Abstract: The unicity of the time-varying quantile-based spectrum
proposed in Birr et al. (2016) is established via an asymptotic representation
result involving Wigner-Ville spectra.
2016-11-24T15:35:05Z
Heterogeneity of regional growth in the EU: A recursive partitioning approach
http://hdl.handle.net/2003/35627
Title: Heterogeneity of regional growth in the EU: A recursive partitioning approach
Authors: Wagner, Martin; Zeileis, Achim
Abstract: We use model-based recursive partitioning as a technique to assess heterogeneity
of growth and convergence processes based on an economic growth regression for
255 European Union NUTS2 regions from 1995 to 2005. The starting point of the
analysis is a human-capital-augmented Solow-type growth equation similar in spirit
to Mankiw, Romer, and Weil (1992). Initial GDP and the share of highly educated
in the working age population are found to be important for explaining economic
growth, whereas the investment share in physical capital is only significant for coastal
regions in the PIIGS countries. Recursive partitioning leads to a regression tree with
four terminal nodes, with partitioning according to (i) capital regions, (ii) non-capital
regions inside or outside the so-called PIIGS countries, and (iii) within the
PIIGS countries, between coastal and non-coastal regions.
2016-11-24T15:32:24Z
Multiscale inference for multivariate deconvolution
http://hdl.handle.net/2003/35626
Title: Multiscale inference for multivariate deconvolution
Authors: Eckle, Konstantin; Bissantz, Nicolai; Dette, Holger
Abstract: In this paper we provide new methodology for inference of the geometric features of
a multivariate density in deconvolution. Our approach is based on multiscale tests to
detect significant directional derivatives of the unknown density at arbitrary points in
arbitrary directions. The multiscale method is used to identify regions of monotonicity
and to construct a general procedure for the detection of modes of the multivariate density.
Moreover, as an important application a significance test for the presence of a local
maximum at a pre-specified point is proposed. The performance of the new methods is investigated
from a theoretical point of view and the finite sample properties are illustrated
by means of a small simulation study.
2016-11-24T15:30:35Z
Predictive, finite-sample model choice for time series under stationarity and non-stationarity
http://hdl.handle.net/2003/35394
Title: Predictive, finite-sample model choice for time series under stationarity and non-stationarity
Authors: Kley, Tobias; Preuß, Philip; Fryzlewicz, Piotr
Abstract: In statistical research there usually exists a choice between structurally simpler or
more complex models. We argue that, even if a more complex, locally stationary time
series model were true, a simple, stationary time series model may be advantageous
to work with under parameter uncertainty. We present a new model choice
methodology, where one of two competing approaches is chosen based on its empirical
finite-sample performance with respect to prediction. A rigorous, theoretical analysis
of the procedure is provided. As an important side result we prove, for possibly diverging
model order, that the localised Yule-Walker estimator is strongly, uniformly
consistent under local stationarity. An R package, forecastSNSTS, is provided and
used to apply the methodology to financial and meteorological data in empirical examples.
We further provide an extensive simulation study and discuss when it is
preferable to base forecasts on the more volatile time-varying estimates and when it
is advantageous to forecast as if the data were from a stationary process, even though
they might not be.
2016-11-23T13:01:56Z
“Linear” fully modified OLS estimation of cointegrating polynomial regressions
http://hdl.handle.net/2003/35393
Title: “Linear” fully modified OLS estimation of cointegrating polynomial regressions
Authors: Stypka, Oliver; Grabarczyk, Peter; Kawka, Rafael; Wagner, Martin
Abstract: A large part of the empirical environmental Kuznets curve literature uses cointegrating regressions
involving a unit root process and its powers as regressors. In this literature the unit root
process and its powers are, incorrectly, all treated as integrated processes and modified least
squares estimation methods for linear cointegrating regressions are routinely employed. We
show that this approach to estimation surprisingly leads, for the Fully Modified OLS estimator,
to the same limiting distribution as obtained for the version of the Fully Modified OLS estimator
adapted to the cointegrating polynomial regression setting of Wagner and Hong (2016).
2016-11-23T12:59:02Z
A note on functional equivalence between intertemporal and multisectoral investment adjustment costs
http://hdl.handle.net/2003/35392
Title: A note on functional equivalence between intertemporal and multisectoral investment adjustment costs
Authors: Ivashchenko, Sergey; Mutschler, Willi
Abstract: Kim (2003, JEDC) shows functional equivalence between intertemporal and multisectoral
investment adjustment costs in a linearized RBC model. From an identification point
of view, the two parameters are not separately distinguishable; they enter as a sum into the
linearized solution. We demonstrate that estimating the quadratic approximation of the
model provides a means to extract more information on the structural parameters from the
data and thus to estimate both parameters, which are unidentifiable under the log-linearized
model.
2016-11-23T12:56:25Z
The environmental Kuznets curve for carbon dioxide emissions: A seemingly unrelated cointegrating polynomial regressions approach
http://hdl.handle.net/2003/35391
Title: The environmental Kuznets curve for carbon dioxide emissions: A seemingly unrelated cointegrating polynomial regressions approach
Authors: Wagner, Martin; Grabarczyk, Peter
Abstract: We present estimation and inference techniques for systems of seemingly unrelated cointegrating
polynomial regressions. In particular, we present two fully modified-type estimators and
Wald-type hypothesis tests based upon them. We develop tests for poolability of subsets of
coefficients over subsets of equations. For the case that these restrictions are not rejected, we
provide the correspondingly pooled estimators. This group-wise pooling turns out to be very
useful in our application where we analyze the environmental Kuznets curve for CO2 emissions
for seven early industrialized countries. Group-wise pooled estimation leads to almost the same
results as unrestricted estimation whilst reducing the number of estimated parameters by about
one third. Fully pooled, panel-data type estimation performs poorly in comparison.
2016-11-23T12:53:27Z
Integrated modified OLS estimation for cointegrating polynomial regressions - with an application to the environmental Kuznets curve for CO2 emissions
http://hdl.handle.net/2003/35390
Title: Integrated modified OLS estimation for cointegrating polynomial regressions - with an application to the environmental Kuznets curve for CO2 emissions
Authors: Frondel, Manuel; Grabarczyk, Peter; Wagner, Martin
Abstract: This paper considers the integrated modified OLS (IM-OLS) estimator for cointegrating
polynomial regressions recently developed in Vogelsang and Wagner (2014a; 2014b).
Cointegrating polynomial regressions include deterministic variables, integrated processes
and integer powers of integrated processes as explanatory variables. The stochastic
regressors are allowed to be endogenous and the stationary errors are allowed to
be serially correlated. The IM-OLS estimator allows for asymptotically standard inference
in this framework when using consistent estimators of the long run variance.
Additionally, we provide fixed-b asymptotic theory for the case of full design to
capture the impact of kernel and bandwidth choice on the sampling distributions of
estimators and test statistics. We investigate the properties of the IM-OLS estimator
and hypothesis tests based upon it by means of a simulation study to compare its
performance with fully modified OLS (FM-OLS) and dynamic OLS (D-OLS). Finally,
we apply the method to estimate the environmental Kuznets curve for CO2 emissions
over the period 1870-2009.
2016-11-23T12:50:25Z
Change point detection in autoregressive models with no moment assumptions
http://hdl.handle.net/2003/35389
Title: Change point detection in autoregressive models with no moment assumptions
Authors: Akashi, Fumiya; Dette, Holger; Liu, Yan
Abstract: In this paper we consider the problem of detecting a change in the parameters
of an autoregressive process, where the moments of the innovation process
do not necessarily exist. An empirical likelihood ratio test for the existence
of a change point is proposed and its asymptotic properties are studied. In
contrast to other work on change point tests using empirical likelihood, we do
not assume knowledge of the location of the change point. In particular, we
prove that the maximizer of the empirical likelihood is a consistent estimator
for the parameters of the autoregressive model in the case of no change point
and derive the limiting distribution of the corresponding test statistic under
the null hypothesis. We also establish consistency of the new test. A nice
feature of the method is that the resulting test is asymptotically
distribution free and does not require an estimate of the long run
variance. The asymptotic properties of the test are investigated by means of
a small simulation study, which demonstrates good finite sample properties of
the proposed method.2016-11-23T12:47:34ZA computational study of auditory models in music recognition tasks for normal-hearing and hearing-impaired listeners
http://hdl.handle.net/2003/35382
Title: A computational study of auditory models in music recognition tasks for normal-hearing and hearing-impaired listeners
Authors: Friedrichs, Klaus; Bauer, Nadja; Martin, Rainer; Weihs, Claus
Abstract: The utility of auditory models for solving three music recognition
tasks (onset detection, pitch estimation and instrument recognition)
is analyzed. Appropriate features are introduced which enable the
use of supervised classification. The auditory model-based approaches are tested in a comprehensive study and compared to state-of-the-art methods, which usually do not employ an auditory model. For this study, music data is selected according to an experimental design, which enables statements about performance differences with respect to specific music characteristics. The results confirm that the performance of music classification using the auditory model is at least comparable to the traditional methods. Furthermore, the auditory model is modified to exemplify the decrease of recognition rates in the presence of hearing deficits. The resulting system is a basis for estimating the intelligibility of music which in the future might be used for the automatic assessment of hearing instruments.2016-11-23T12:24:15ZA cointegrating polynomial regression analysis of the material Kuznets curve hypothesis
http://hdl.handle.net/2003/35381
Title: A cointegrating polynomial regression analysis of the material Kuznets curve hypothesis
Authors: Frondel, Manuel; Grabarczyk, Peter; Sommer, Stephan; Wagner, Martin
Abstract: Employing consumption data for aluminum, lead and zinc for eight OECD countries
spanning from 1900 to 2006, this paper tests the hypothesis underlying the notion
of the Material Kuznets Curve (MKC), which postulates an inverted U-shaped
relationship between a country’s level of economic development and its intensity
of metal use. Applying the tests and estimation techniques for nonlinear cointegration
developed by Saikkonen and Choi (2004),Wagner (2013) as well as Wagner
and Hong (2016), we find that the MKC hypothesis is less strongly supported by
the data than when employing the standard methods that have been used in the
empirical Environmental Kuznets Curve (EKC) literature so far. The evidence for a
cointegrating MKC is mixed, at best.2016-11-23T12:15:06ZAn asymptotic test on the stationarity of the variance
http://hdl.handle.net/2003/35380
Title: An asymptotic test on the stationarity of the variance
Authors: Dehling, Herold; Fried, Roland; Wornowizki, Max
Abstract: We reconsider a statistic introduced in Wornowizki et al. (2016) that allows testing
the stationarity of the variance for a sequence of independent random variables. Instead
of determining rejection regions via the permutation principle as proposed before, we
provide asymptotic critical values leading to huge savings in computation time. To prove
the required limit theorems, the test statistic is viewed as a U-statistic constructed from
blockwise variance estimates. Since the distribution of the test statistic depends on the
sample size, a suitable new law of large numbers as well as a central limit theorem are
developed. These asymptotic results are illustrated on artificial data. The permutation and
asymptotic version of the test are compared to alternative procedures in extensive Monte
Carlo experiments. The simulation results suggest that the methods offer similar results
and high power when compared to their competitors, particularly in the case of multiple
structural breaks. They also estimate the structural break positions adequately.2016-11-23T12:09:07ZTrimmed likelihood estimators for stochastic differential equations with an application to crack growth analysis from photos
http://hdl.handle.net/2003/35358
Title: Trimmed likelihood estimators for stochastic differential equations with an application to crack growth analysis from photos
Authors: Müller, Christine H.; Meinke, Stefan H.
Abstract: We introduce trimmed likelihood estimators for processes given by a
stochastic differential equation for which a transition density is known or can
be approximated and present an algorithm to calculate them. To measure the
fit of the observations to a given stochastic process, two performance measures
based on the trimmed likelihood estimator are proposed. The approach is applied
to crack growth data which are obtained from a series of photos by backtracking
large cracks which were detected in the last photo. Such crack growth
data are contaminated by several outliers caused by errors in the automatic
image analysis. We show that trimming 20% of the data of a growth curve
leads to good results when 100 obtained crack growth curves are fitted with
the Ornstein-Uhlenbeck and Cox-Ingersoll-Ross processes, while
the fit of the Geometric Brownian Motion is significantly worse. The method
is sensitive in the sense that crack curves obtained under different stress conditions
provide significantly different parameter estimates.2016-11-09T14:58:35ZA new method for adaptive spectral complexity reduction of music signals
http://hdl.handle.net/2003/35319
Title: A new method for adaptive spectral complexity reduction of music signals
Authors: Krymova, Ekaterina; Nagathil, Anil; Belomestny, Denis; Martin, Rainer
Abstract: In this discussion paper we present a novel unsupervised segmentation
procedure for music signals which relies on an explained variance criterion in the eigenspace of the constant-Q spectral domain. The procedure
is used in the context of a spectral complexity reduction method which
mitigates effects of cochlear hearing loss. It is compared to a segmentation based on equidistant boundaries. The results demonstrate that the
proposed segmentation procedure gives an improvement in terms of signal-to-artefacts
ratio in comparison to a segmentation based on equidistant
boundaries.2016-11-08T09:39:50ZChange point estimation based on the Wilcoxon test in the presence of long-range dependence
http://hdl.handle.net/2003/35318
Title: Change point estimation based on the Wilcoxon test in the presence of long-range dependence
Authors: Betken, Annika
Abstract: We consider an estimator, based on the two-sample Wilcoxon statistic, for the location of a
shift in the mean of long-range dependent sequences. Consistency and the rate of convergence for the
estimated change point are established. In particular, the 1/n convergence rate (with n denoting the number
of observations), which is typical under the assumption of independent observations, is also achieved for
long memory sequences in case of a constant shift height. It is proved that after a suitable normalization
the estimator converges in distribution to a functional of a fractional Brownian motion, if the change point
height decreases to 0 at a certain rate. The estimator is tested on two well-known data sets, and its finite
sample behavior is investigated in a Monte Carlo simulation study.2016-11-08T09:38:12ZTesting for change in stochastic volatility with long range dependence
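The Wilcoxon-based change point estimator described in Betken's abstract above can be sketched minimally as the argmax of the two-sample Wilcoxon statistic over candidate split points (a textbook version for illustration; the paper's long-range dependent setting changes the asymptotics, not this basic form):

```python
import numpy as np

def wilcoxon_change_point(x):
    """Estimate a mean-shift location as the argmax over k of
    |W_k|, where W_k = sum_{i<=k} sum_{j>k} (1{x_i < x_j} - 1/2),
    with ties counted as 1/2. Returns the number of pre-break observations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # pairwise comparison matrix, centred at zero
    ind = (x[:, None] < x[None, :]).astype(float) \
        + 0.5 * (x[:, None] == x[None, :]) - 0.5
    stats = [abs(ind[: k + 1, k + 1:].sum()) for k in range(n - 1)]
    return int(np.argmax(stats)) + 1

x = np.concatenate([np.zeros(5), np.full(5, 10.0)])
print(wilcoxon_change_point(x))  # → 5
```

Because the statistic only uses the order of the observations, the estimator is robust to heavy tails, which is one reason rank-based statistics are attractive under long memory.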
http://hdl.handle.net/2003/35317
Title: Testing for change in stochastic volatility with long range dependence
Authors: Betken, Annika; Kulik, Rafal
Abstract: In this paper we consider a change point problem for long memory stochastic
volatility models. We show that the limiting behavior for the CUSUM test statistics
may not be affected by long memory, unlike the Wilcoxon test statistic which is
influenced by long range dependence. We compare our results to subordinated long
memory Gaussian processes. Theoretical properties are accompanied by simulation
studies.2016-11-08T09:22:08ZA computational study of auditory models in music recognition tasks for normal-hearing and hearing-impaired listeners
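The CUSUM statistic referred to in the abstract above has a familiar mean-change form; here is a minimal sketch of that classical version (for independent data, as a point of reference only — the paper's long memory stochastic volatility setting alters the normalization and limit):

```python
import numpy as np

def cusum_statistic(x):
    """Classical CUSUM statistic for a change in mean:
    max_k |S_k - (k/n) S_n| / (sqrt(n) * sd(x))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.cumsum(x)
    # deviation of partial sums from the no-change benchmark
    dev = np.abs(s - (np.arange(1, n + 1) / n) * s[-1])
    return dev.max() / (np.sqrt(n) * x.std())

rng = np.random.default_rng(1)
no_break = rng.standard_normal(500)
with_break = np.concatenate([no_break[:250], no_break[250:] + 2.0])
print(cusum_statistic(no_break) < cusum_statistic(with_break))  # → True
```

Under the null of no change and independent data, the statistic converges to the supremum of a Brownian bridge; large values indicate a break.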
http://hdl.handle.net/2003/35316
Title: A computational study of auditory models in music recognition tasks for normalhearing and hearing-impaired listeners
Authors: Friedrichs, Klaus; Bauer, Nadja; Martin, Rainer; Weihs, Claus
Abstract: The utility of auditory models for solving three music recognition tasks (onset detection, pitch estimation and
instrument recognition) is analyzed. Appropriate features are introduced which enable the use of supervised
classification. The auditory model-based approaches are tested in a comprehensive study and compared to
state-of-the-art methods, which usually do not employ an auditory model. For this study, music data is selected
according to an experimental design, which enables statements about performance differences with respect to
specific music characteristics. The results confirm that the performance of music classification using the
auditory model is at least comparable to the traditional methods. Furthermore, the auditory model is modified
to exemplify the decrease of recognition rates in the presence of hearing deficits. The resulting system is a
basis for estimating the intelligibility of music which in the future might be used for the automatic assessment
of hearing instruments.2016-11-08T09:18:54ZEfficient global optimization: Motivation, variations and applications
http://hdl.handle.net/2003/35315
Title: Efficient global optimization: Motivation, variations and applications
Authors: Weihs, Claus; Herbrandt, Swetlana; Bauer, Nadja; Friedrichs, Klaus; Horn, Daniel
Abstract: A popular optimization method of a black box objective function is
Efficient Global Optimization (EGO), also known as Sequential Model Based
Optimization, SMBO, with kriging and expected improvement. EGO is a sequential
design of experiments aiming at gaining as much information as possible
from as few experiments as feasible by a skillful choice of the factor
settings in a sequential way. In this paper we will introduce the standard procedure
and some of its variants. In particular, we will propose some new variants
like regression as a modeling alternative to kriging and two simple methods for
the handling of categorical variables, and we will discuss focus search for the
optimization of the infill criterion. Finally, we will give relevant examples for
the application of the method. Moreover, in our group, we implemented all the
described methods in the publicly available R package mlrMBO.2016-11-08T09:15:59ZOn the method of probability weighted moments in regional frequency analysis
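The expected improvement infill criterion at the heart of EGO has a well-known closed form given the kriging mean mu(x) and standard deviation sigma(x) at a candidate point; a minimal sketch of that standard formula (for minimization — an illustration, not code from mlrMBO):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu)/sigma,
    where mu and sigma are the surrogate model's prediction at x and
    f_min is the best objective value observed so far."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # normal density
    return (f_min - mu) * Phi + sigma * phi

# At mu = f_min the criterion reduces to sigma * phi(0) ≈ 0.3989 * sigma.
print(round(expected_improvement(mu=0.0, sigma=1.0, f_min=0.0), 4))  # → 0.3989
```

The criterion balances exploitation (low predicted mean) against exploration (high predictive uncertainty), which is exactly the trade-off the infill-optimization step (focus search) has to resolve.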
http://hdl.handle.net/2003/35311
Title: On the method of probability weighted moments in regional frequency analysis
Authors: Lilienthal, Jona; Kinsvater, Paul; Fried, Roland
Abstract: In regional flood frequency analysis it is of interest to estimate high quantiles of a local river
flow distribution by gathering information from similar stations in the neighborhood. E.g., the
popular Index Flood (IF) approach is based on an assumption termed regional homogeneity,
which states that the quantile curves of those stations only differ by a site-specific factor, the
so-called index flood, and it is assumed that the station's distribution is known up to some
finite-dimensional parameter. In this context the method of probability weighted moments (or
equivalently L-moments) is most popular for parameter estimation. While the observations
often can be regarded as independent in time, a challenge arises from the fact that river
flows from nearby stations are strongly dependent in space. To the best of our knowledge, none of the
approaches from the literature based on the IF-model and on L-moments is able to take spatial
dependence adequately into account. Our goal is to fill this gap. We present asymptotic theory
that does not ignore inter-site dependence, which, for instance, allows one to evaluate estimation
uncertainty. As an application of this theory, a test procedure to check for regional homogeneity
under index-flood assumptions is given and reviewed in a simulation study.2016-11-03T13:42:40ZMultiscale inference for multivariate deconvolution
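The probability weighted moments mentioned above have simple unbiased sample versions, from which the first L-moments follow by fixed linear combinations; a sketch using the textbook single-site definitions (not the authors' regional, dependence-adjusted estimator):

```python
from math import comb

def sample_l_moments(data):
    """First three sample L-moments from unbiased probability weighted
    moments b_r = n^{-1} sum_j [C(j, r) / C(n-1, r)] x_(j)
    (0-indexed order statistics x_(0) <= ... <= x_(n-1))."""
    x = sorted(data)
    n = len(x)
    b = [sum(comb(j, r) * x[j] for j in range(n)) / (n * comb(n - 1, r))
         for r in range(3)]
    l1 = b[0]                        # mean
    l2 = 2 * b[1] - b[0]             # L-scale
    l3 = 6 * b[2] - 6 * b[1] + b[0]  # unscaled L-skewness numerator
    return l1, l2, l3

l1, l2, l3 = sample_l_moments([1.0, 2.0, 3.0, 4.0])
print(l1, round(l2, 4))  # → 2.5 0.8333
```

For this symmetric toy sample l3 is zero, matching the interpretation of the third L-moment as a skewness measure; regional frequency analysis typically works with the ratios l2/l1 and l3/l2.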
http://hdl.handle.net/2003/35310
Title: Multiscale inference for multivariate deconvolution
Authors: Eckle, Konstantin; Bissantz, Nicolai; Dette, Holger
Abstract: We propose multiscale tests for deconvolution in order to detect geometric features of
an unknown multivariate density. Our approach uses simultaneous tests on all scales for
the monotonicity of the density at arbitrary points in arbitrary directions. We consider
the situation of polynomial decay of the Fourier transform of the error density in the deconvolution
model (moderately ill-posed). We develop multiscale methods for identifying
regions of monotonicity and a general procedure to detect the modes of a multivariate
density. The theoretical results are illustrated by means of a simulation study.2016-11-03T13:40:17ZConsumer inattention, heuristic thinking and the role of energy labels
http://hdl.handle.net/2003/35309
Title: Consumer inattention, heuristic thinking and the role of energy labels
Authors: Andor, Mark; Gerster, Andreas; Sommer, Stephan
Abstract: Energy labels have been introduced in many countries to increase consumers’
attention to energy use in purchase decisions of durables. In a discrete-choice experiment
among about 5,000 households, we implement randomized information
treatments to explore the effects of various kinds of energy labels on purchasing decisions.
Our results show that adding annual operating cost information to the EU
energy label promotes the choice of energy-efficient durables. In addition, we find
that a majority of participants value efficiency classes beyond the economic value
of the underlying energy use differences. Our results further indicate that displaying
operating cost affects choices through two distinct channels: it increases the
attention to operating cost and reduces the valuation of efficiency class differences.2016-11-03T13:36:21ZLocally adaptive confidence bands
http://hdl.handle.net/2003/35308
Title: Locally adaptive confidence bands
Authors: Patschkowski, Tim; Rohde, Angelika
Abstract: We develop honest and locally adaptive confidence bands for probability
densities. They provide substantially improved confidence statements in
case of inhomogeneous smoothness, and are easily implemented and visualized.
The article contributes conceptual work on locally adaptive inference
as a straightforward modification of the global setting imposes severe obstacles
for statistical purposes. Among others, we introduce a statistical notion
of local Hölder regularity and prove a correspondingly strong version of local
adaptivity. We substantially relax the straightforward localization of
the self-similarity condition in order not to rule out prototypical densities.
The set of densities permanently excluded from the consideration is shown
to be pathological in a mathematically rigorous sense. On a technical level,
the crucial component for the verification of honesty is the identification
of an asymptotically least favorable stationary case by means of Slepian's
comparison inequality.2016-11-03T13:34:14ZOptimal discrimination designs for semi-parametric models
http://hdl.handle.net/2003/35306
Title: Optimal discrimination designs for semi-parametric models
Authors: Dette, Holger; Guchenko, Roman; Melas, Viatcheslav; Wong, Weng Kee
Abstract: Much of the work in the literature on optimal discrimination designs assumes that the
models of interest are fully specified, apart from unknown parameters in some models.
Recent work allows errors in the models to be non-normally distributed but still requires
the specification of the mean structures. This research is motivated by the interesting
work of Otsu (2008) to discriminate among semi-parametric models by generalizing
the KL-optimality criterion proposed by Lopez-Fidalgo et al. (2007) and Tommasi and
Lopez-Fidalgo (2010). In our work we provide further important insights in this interesting
optimality criterion. In particular, we propose a practical strategy for finding
optimal discrimination designs among semi-parametric models that can also be verified
using an equivalence theorem. In addition, we study properties of such optimal designs
and identify important cases where the proposed semi-parametric optimal discrimination
designs coincide with the celebrated T-optimal designs.2016-10-28T12:33:36ZThe impact of disclosure obligations on executive compensation - A policy evaluation using quantile treatment estimators
http://hdl.handle.net/2003/35305
Title: The impact of disclosure obligations on executive compensation - A policy evaluation using quantile treatment estimators
Authors: Dyballa, Katharina; Kraft, Kornelius
Abstract: This empirical study analyses the effects of the introduction of strongly increased
disclosure requirements in Germany on the level of executive compensation. One
innovative aspect is the comparison of companies which voluntarily followed a
recommendation of the German Governance Code before the relevant law was
implemented and published detailed information on executive compensation with
other firms which did not. Conditional and unconditional quantile difference-in-differences
models are estimated. The companies which refused to publish data
before it became mandatory show a reduction in compensation levels for the upper
quantiles. Hence, the mandatory requirement to publish detailed information
reduced the higher levels of executive compensation, but did not affect executive
compensation at lower or medium levels.2016-10-28T12:30:25ZBest linear unbiased estimators in continuous time regression models
http://hdl.handle.net/2003/35304
Title: Best linear unbiased estimators in continuous time regression models
Authors: Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly
Abstract: In this paper the problem of best linear unbiased estimation is
investigated for continuous-time regression models. We prove several
general statements concerning the explicit form of the best linear unbiased
estimator (BLUE), in particular when the error process is a
smooth process with one or several derivatives of the response process
available for construction of the estimators. We derive the explicit
form of the BLUE for many specific models including the cases
of continuous autoregressive errors of order two and integrated error
processes (such as integrated Brownian motion). The results are
illustrated by several examples.2016-10-28T12:27:26ZRegularization parameter selection in indirect regression by residual based bootstrap
http://hdl.handle.net/2003/35302
Title: Regularization parameter selection in indirect regression by residual based bootstrap
Authors: Bissantz, Nicolai; Chown, Justin; Dette, Holger
Abstract: Residual-based analysis is generally considered a cornerstone of statistical methodology.
For a special case of indirect regression, we investigate the residual-based empirical distribution
function and provide a uniform expansion of this estimator, which is also shown to
be asymptotically most precise. This investigation naturally leads to a completely data-driven
technique for selecting a regularization parameter used in our indirect regression function estimator.
The resulting methodology is based on a smooth bootstrap of the model residuals. A
simulation study demonstrates the effectiveness of our approach.2016-10-28T09:00:10ZThe effect of intraday periodicity on realized volatility measures
http://hdl.handle.net/2003/35301
Title: The effect of intraday periodicity on realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: U-shaped intraday periodicity (IP) is a typical stylized fact characterizing intraday returns
on risky assets. In this study we focus on the realized volatility and bipower variation
estimators for daily integrated volatility (IV) which are based on intraday returns following
a discrete-time model with IP. We demonstrate that neglecting the impact of IP on
realized estimators may lead to invalid statistical inference concerning IV for the commonly
available number of intraday returns; moreover, the size of daily jump tests may be
distorted. Given the functional form of IP, we derive corrections for the realized measures
of IV. We show in a Monte Carlo and an empirical study that the proposed corrections
improve commonly used point and interval estimators of the IV and tests for jumps.2016-10-28T08:57:43ZSwitching on electricity demand response: Evidence for German households
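The two realized measures discussed in the abstract above have simple sample forms; ignoring the intraday-periodicity corrections that the paper derives, the uncorrected estimators can be sketched as:

```python
import math
import numpy as np

def realized_volatility(r):
    """Realized volatility: sum of squared intraday returns."""
    r = np.asarray(r, dtype=float)
    return float(np.sum(r ** 2))

def bipower_variation(r):
    """Bipower variation: (pi/2) * sum |r_i| |r_{i-1}|,
    which is robust to jumps, unlike realized volatility."""
    a = np.abs(np.asarray(r, dtype=float))
    return float(math.pi / 2.0 * np.sum(a[1:] * a[:-1]))

r = [0.01, -0.02, 0.01]
print(round(realized_volatility(r), 6))  # → 0.0006
```

Comparing the two on the same return series is the basis of standard jump tests: a large jump inflates realized volatility but barely moves bipower variation, and it is this gap whose size the intraday periodicity can distort.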
http://hdl.handle.net/2003/35300
Title: Switching on electricity demand response: Evidence for German households
Authors: Frondel, Manuel; Kussel, Gerhard
Abstract: Empirical evidence on the response of German households to electricity price
changes is sparse. Using panel data originating from Germany’s Residential Energy
Consumption Survey (GRECS), we fill this void by employing an instrumental variable
approach to cope with the endogeneity of the consumers’ tariff choice. By additionally
exploiting our information on the households’ knowledge about power prices, we also
employ an Endogenous Switching Regression Model to estimate price elasticities for
two groups of households, finding that only those households that are informed about
prices are sensitive to price changes, whereas the electricity demand of uninformed
households is entirely price-inelastic.2016-10-28T08:55:10Z'Change in space'-point estimation, Part I: Lower bound for rates of consistency
http://hdl.handle.net/2003/35299
Title: 'Change in space'-point estimation, Part I: Lower bound for rates of consistency
Authors: Brauer, Marcel; Rohde, Angelika
Abstract: Given n discrete observations of a homogeneous diffusion process with a
piecewise constant diffusion coefficient containing one point of discontinuity
p_0, we study the semiparametric problem of estimating its 'change in space'-point
p_0 in the high-frequency setting. We establish a lower bound for the
minimax rate of convergence n^{-3/4}, which is slower than the n^{-1} rate in
traditional change-point problems.2016-10-28T08:52:16ZNew backtests for unconditional coverage of the expected shortfall
http://hdl.handle.net/2003/35286
Title: New backtests for unconditional coverage of the expected shortfall
Authors: Löser, Robert; Wied, Dominik; Ziggel, Daniel
Abstract: We present a new backtest for the unconditional coverage property of the expected shortfall (ES). The test statistic is available
for finite out-of-sample size which leads to better size and power properties compared to existing tests.
Moreover, it can be easily extended to a multivariate test.2016-10-14T10:24:48ZEfficient estimation of the error distribution function in heteroskedastic nonparametric regression with missing data
http://hdl.handle.net/2003/35285
Title: Efficient estimation of the error distribution function in heteroskedastic nonparametric regression with missing data
Authors: Chown, Justin
Abstract: We propose a residual-based empirical distribution function to estimate the distribution function
of the errors of a heteroskedastic nonparametric regression with responses missing at random based on
completely observed data, and we show this estimator is asymptotically most precise.2016-10-14T10:22:43ZDetecting heteroskedasticity in nonparametric regression using weighted empirical processes
http://hdl.handle.net/2003/35284
Title: Detecting heteroskedasticity in nonparametric regression using weighted empirical processes
Authors: Chown, Justin; Müller, Ursula U.
Abstract: Heteroskedastic errors can lead to inaccurate statistical conclusions if they are
not properly handled. We introduce a test for heteroskedasticity for the nonparametric regression
model with multiple covariates. It is based on a suitable residual-based empirical
distribution function. The residuals are constructed using local polynomial smoothing. Our
test statistic involves a "detection function" that can verify heteroskedasticity by exploiting
just the independence-dependence structure between the detection function and model
errors, i.e. we do not require a specific model of the variance function. The procedure is
asymptotically distribution free: inferences made from it do not depend on unknown parameters.
It is consistent at the parametric (root-n) rate of convergence. Our results are
extended to the case of missing responses and illustrated with simulations.2016-10-14T10:19:04ZNonparametric inference of gradual changes in the jump behaviour of time-continuous processes
http://hdl.handle.net/2003/35233
Title: Nonparametric inference of gradual changes in the jump behaviour of time-continuous processes
Authors: Hoffmann, Michael; Vetter, Mathias; Dette, Holger
Abstract: In applications, changes of the properties of a stochastic feature often occur
gradually rather than abruptly: after remaining constant for some time, they slowly start to
change. Efficient analysis of change points should address the specific features of such a
smooth change. In this paper we discuss statistical inference for localizing and detecting
gradual changes in the jump characteristic of a discretely observed Ito semimartingale. We
propose a new measure of time variation for the jump behaviour of the process. The statistical
uncertainty of a corresponding estimate is analyzed deriving new results on the weak
convergence of a sequential empirical tail integral process and a corresponding multiplier
bootstrap procedure.2016-10-10T10:46:10ZHigher-order statistics for DSGE models
http://hdl.handle.net/2003/35215
Title: Higher-order statistics for DSGE models
Authors: Mutschler, Willi
Abstract: Closed-form expressions for unconditional moments, cumulants and polyspectra of order
higher than two are derived for non-Gaussian or nonlinear (pruned) solutions to DSGE
models. Apart from the existence of moments and the white noise property, no distributional
assumptions are needed. The accuracy and utility of the formulas for computing
skewness and kurtosis are demonstrated by three prominent models: Smets and Wouters
(AER, 586-606, 97, 2007) (first-order approximation), An and Schorfheide (Econom.
Rev., 113-172, 26, 2007) (second-order approximation) and the neoclassical growth model
(third-order approximation). Both the Gaussian as well as Student's t-distribution are
considered as the underlying stochastic processes. Lastly, the efficiency gain of including
higher-order statistics is demonstrated by the estimation of an RBC model within a
Generalized Method of Moments framework.2016-09-19T10:14:48ZWeak convergence of a pseudo maximum likelihood estimator for the extremal index
http://hdl.handle.net/2003/35214
Title: Weak convergence of a pseudo maximum likelihood estimator for the extremal index
Authors: Berghaus, Betina; Bücher, Axel
Abstract: The extremes of a stationary time series typically occur in clusters. A
primary measure for this phenomenon is the extremal index, representing the reciprocal
of the expected cluster size. Both a disjoint and a sliding blocks estimator for the
extremal index are analyzed in detail. In contrast to many competitors, the estimators
only depend on the choice of one parameter sequence. We derive an asymptotic
expansion, prove asymptotic normality and show consistency of an estimator for the
asymptotic variance. Explicit calculations in certain models and a finite-sample Monte
Carlo simulation study reveal that the sliding blocks estimator outperforms other
blocks estimators and is competitive with runs- and inter-exceedance estimators
in various models. The methods are applied to a variety of financial time series.2016-09-19T10:06:11ZLow-frequency estimation of continuous-time moving average Lévy processes
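As context for the blocks estimators analyzed in the abstract above, the classical disjoint-blocks estimator of the extremal index (a simpler relative of the pseudo maximum likelihood estimator studied in the paper, shown here purely for illustration) compares the number of blocks with an exceedance to the total number of exceedances:

```python
import numpy as np

def blocks_extremal_index(x, u, block_len):
    """Classical disjoint-blocks estimator of the extremal index:
    (number of blocks whose maximum exceeds u) / (total exceedances of u)."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // block_len) * block_len  # drop an incomplete final block
    blocks = x[:n].reshape(-1, block_len)
    n_exceed = np.sum(x[:n] > u)
    if n_exceed == 0:
        return float("nan")
    k = np.sum(blocks.max(axis=1) > u)
    return float(k / n_exceed)

# Exceedances arriving in pairs suggest a mean cluster size of 2,
# i.e. an extremal index near 1/2.
x = np.zeros(100)
x[10:12] = x[40:42] = x[70:72] = 5.0
print(blocks_extremal_index(x, u=1.0, block_len=10))  # → 0.5
```

The block length is the single tuning parameter, mirroring the paper's point that its estimators depend on only one parameter sequence.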
http://hdl.handle.net/2003/35197
Title: Low-frequency estimation of continuous-time moving average Lévy processes
Authors: Belomestny, Denis; Panov, Vladimir; Woerner, Jeannette H. C.
Abstract: In this paper we study the problem of statistical inference for a continuous-time
moving average Lévy process of the form
Z_t = ∫_ℝ κ(t − s) dL_s,  t ∈ ℝ,
with a deterministic kernel κ and a Lévy process L. In particular, the estimation
of the Lévy measure ν of L from low-frequency observations of the process
Z is considered. We construct a consistent estimator, derive its convergence
rates and illustrate its performance by a numerical example. On the technical
level, the main challenge is to establish a kind of exponential mixing for
continuous-time moving average Lévy processes.2016-09-02T09:00:44ZSimulation free prediction intervals for a state dependent failure process using accelerated lifetime experiments
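To make the moving average model Z_t = ∫ κ(t − s) dL_s concrete, the integral can be approximated on a grid by a Riemann sum against the increments of L; a hypothetical sketch with a compound Poisson L and a Gaussian kernel (these specific choices are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def simulate_ma_levy(kernel, t_grid, s_grid, jump_rate, rng):
    """Approximate Z_t = int kernel(t - s) dL_s by a Riemann sum on s_grid,
    with L a compound Poisson process with standard normal jumps."""
    ds = s_grid[1] - s_grid[0]
    # increment of L on [s, s + ds): a Poisson number of N(0, 1) jumps
    n_jumps = rng.poisson(jump_rate * ds, size=len(s_grid))
    dL = np.array([rng.standard_normal(k).sum() for k in n_jumps])
    return np.array([np.sum(kernel(t - s_grid) * dL) for t in t_grid])

rng = np.random.default_rng(0)
s = np.linspace(-10.0, 10.0, 2001)   # truncated integration grid for s
t = np.linspace(0.0, 5.0, 51)
z = simulate_ma_levy(lambda u: np.exp(-u ** 2), t, s, jump_rate=1.0, rng=rng)
print(z.shape)  # → (51,)
```

Because every Z_t reuses the same driving increments dL, nearby values of the process are strongly dependent, which is exactly why establishing mixing for such processes is the technical challenge the abstract highlights.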
http://hdl.handle.net/2003/35193
Title: Simulation free prediction intervals for a state dependent failure process using accelerated lifetime experiments
Authors: Müller, Christine H.; Szugat, Sebastian; Maurer, Reinhard
Abstract: We consider the problem of constructing prediction intervals for the
time point at which a given number of components of a system exposed to
degradation fails. The failure process with respect to the failure times of
the components is modeled by a state dependent point process which is an
alternative to the nonhomogeneous Poisson process often used in failure
analysis. Several failure processes observed at different, usually higher,
stress conditions are incorporated by a link function. Two new simulation-free
prediction intervals are proposed. One is constructed with the
method and the implicit function theorem applied to the hypoexponential
distribution and does not need the construction of confidence sets for the
unknown parameters. The other is based on data depth using a recent
result for constructing outlier robust confidence sets for general regression.
The two new methods are compared with two methods based on classical
confidence sets for generalized linear models. The comparison is done by
leave-one-out analysis of data coming from failure processes observed at
prestressed concrete beams exposed to different cyclic loading where the
time points of breaking tension wires were reported.2016-08-31T13:11:44ZCycling on the extensive and intensive margin: The role of paths and prices
http://hdl.handle.net/2003/35179
Title: Cycling on the extensive and intensive margin: The role of paths and prices
Authors: Frondel, Manuel; Vance, Colin; Wagner, Martin
Abstract: Drawing on a panel of German survey data spanning 1997-2013, this paper
identifies the correlates of non-recreational bicycling, focusing specifically on the roles
of bicycle paths and fuel prices. Our approach conceptualizes ridership as a two stage
decision process comprising the discrete choice of whether to use the bike (i.e. the intensive
margin) and the continuous choice of how far to ride (i.e. the extensive margin).
To the extent that these two choices are related and, moreover, potentially influenced by
factors unobservable to the researcher, we explore alternative estimators using two-stage
censored regression techniques to assess whether the results are subject to biases from
sample selectivity. A key finding is that while higher fuel costs are associated with an
increased probability of undertaking non-recreational bike trips, this effect is predicated
on residence in an urbanized region. We also find evidence for a positive association with
the extent of bike paths, both in increasing the probability of non-recreational bike travel
as well as the distance traveled.
2016-08-16T07:24:56Z
Fourier methods for analysing piecewise constant volatilities
http://hdl.handle.net/2003/35178
Title: Fourier methods for analysing piecewise constant volatilities
Authors: Wornowizki, Max; Fried, Roland; Meintanis, Simos G.
Abstract: We develop procedures for testing the hypothesis that a parameter of
a distribution is constant throughout a sequence of independent random
variables. Our proposals are illustrated considering the variance and the
kurtosis. Under the null hypothesis of constant variance, the modulus
of a Fourier type transformation of the volatility process is identically
equal to one. The approach proposed utilizes this property considering
a canonical estimator for this modulus under the assumption of independent
and piecewise identically distributed observations with zero mean.
Using blockwise estimators we introduce several test statistics resulting
from different weight functions which are all given by simple explicit formulae.
The methods are compared to other tests for constant volatility
in extensive Monte Carlo experiments. Our proposals offer comparatively
good power particularly in the case of multiple structural breaks and allow
adequate estimation of the positions of the structural breaks. An application
to process control data is given, and it is shown how the methods
can be adapted to test for constancy of other quantities like the kurtosis.
2016-08-16T07:22:49Z
Nonparametric estimation and testing on discontinuity of positive supported densities: A kernel truncation approach
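The blockwise idea in the Wornowizki, Fried and Meintanis abstract above can be illustrated with a far cruder diagnostic than the authors' Fourier-type statistics: compare blockwise variance estimates to the overall variance. Everything below — block count, sample sizes, the statistic itself — is illustrative only, not the paper's method:

```python
import numpy as np

def block_variance_statistic(x, n_blocks=10):
    """Maximum deviation of blockwise variances from the overall variance;
    large values hint at non-constant volatility."""
    x = np.asarray(x, dtype=float)
    blocks = np.array_split(x, n_blocks)
    overall = np.var(x)
    return max(abs(np.var(b) / overall - 1.0) for b in blocks)

rng = np.random.default_rng(1)
calm = rng.normal(0.0, 1.0, 2000)                    # constant volatility
burst = np.concatenate([rng.normal(0.0, 1.0, 1000),  # volatility triples halfway
                        rng.normal(0.0, 3.0, 1000)])
```

Under constant variance the block ratios hover near one; a structural break pushes them far from one in the affected blocks.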
http://hdl.handle.net/2003/35168
Title: Nonparametric estimation and testing on discontinuity of positive supported densities: A kernel truncation approach
Authors: Funke, Benedikt; Hirukawa, Masayuki
Abstract: Discontinuity in density functions is of economic importance and interest.
For instance, in studies on regression discontinuity designs, discontinuity in
the density of a running variable suggests violation of the no-manipulation
assumption. In this paper we develop estimation and testing procedures on
discontinuity in densities with positive support. Our approach is built on splitting
the gamma kernel (Chen, 2000) into two parts at a given (dis)continuity
point and constructing two truncated kernels. The jump-size magnitude of the
density at the point can be estimated nonparametrically by two kernels and a
multiplicative bias correction method. The estimator is easy to implement, and
its convergence properties are delivered by various approximation techniques on
incomplete gamma functions. Based on the jump-size estimator, two versions
of test statistics for the null of continuity at a given point are also proposed.
Moreover, estimation theory of the entire density in the presence of a discontinuity
point is explored. Monte Carlo simulations confirm nice finite-sample
properties of the jump-size estimator and the test statistics.
2016-08-03T13:06:16Z
Nonparametric IV regression with an Archimedean dependence structure
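For context on the Funke and Hirukawa abstract above: the gamma kernel of Chen (2000), which the authors split into two truncated parts, replaces a symmetric kernel by a gamma density whose shape parameter depends on the evaluation point, avoiding boundary bias on [0, ∞). A plain-Python sketch of the untruncated base estimator; the bandwidth and the data-generating choice are illustrative:

```python
import math
import random

def gamma_pdf(t, shape, scale):
    """Density of the gamma distribution with given shape and scale."""
    return t ** (shape - 1.0) * math.exp(-t / scale) / (math.gamma(shape) * scale ** shape)

def gamma_kernel_density(x, data, b=0.1):
    """Chen's (2000) gamma kernel density estimate at x >= 0: the average of
    gamma(shape = x/b + 1, scale = b) densities evaluated at the data."""
    shape = x / b + 1.0
    return sum(gamma_pdf(t, shape, b) for t in data) / len(data)

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(5000)]  # true density: exp(-t)
estimate = gamma_kernel_density(1.0, sample)             # true value: exp(-1)
```

Because the kernel's support is itself [0, ∞), no probability mass leaks below zero — the property the truncated variants in the paper then exploit around a discontinuity point.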
http://hdl.handle.net/2003/35166
Title: Nonparametric IV regression with an Archimedean dependence structure
Authors: van Kampen, Maarten
Abstract: This paper provides a characterization of the completeness of a family of distributions
in terms of the copula between the random variables. We give sufficient conditions
for a family of Archimedean copulas to be (boundedly) complete. Some
copulas are typically excluded in nonparametric IV regression since they have
non-square integrable densities. We provide conditions under which we can identify
the nonparametric IV regression model if the dependence structure between
the regressors and instrumental variables can be described by an Archimedean
copula.
2016-08-01T10:28:58Z
Efficient sampling in materials simulation - exploring the parameter space of grain boundaries
http://hdl.handle.net/2003/35163
Title: Efficient sampling in materials simulation - exploring the parameter space of grain boundaries
Authors: Dette, Holger; Goesmann, Josua; Greiff, Christian; Janisch, Rebecca
Abstract: In the framework of materials design there is the demand for extensive databases of specific materials
properties. In this work we suggest an improved strategy for creating future databases, especially for
extrinsic properties that depend on several material parameters. As an example we choose the energy of
grain boundaries as a function of their geometric degrees of freedom. The construction of existing databases
of grain boundary energies in face-centred and body-centred cubic metals relied on the a-priori knowledge of
the location of important cusps and maxima in the five-dimensional energy landscape, and on an
as-densely-as-possible sampling strategy. We introduce two methods to improve the current state of the art. The
location and number of the energy minima along which the hierarchical sampling takes place is predicted
from existing data points without any a-priori knowledge, using a predictor function. Furthermore we
show that it is more efficient to use a sequential sampling in a "design of experiment" scheme, rather than
sampling all observations homogeneously in one batch. This sequential design exhibits a smaller error than
the simultaneous one, and thus can provide the same accuracy with fewer data points. The new strategy
should be particularly beneficial in the exploration of grain boundary energies in new alloys and/or non-cubic
structures.
2016-07-29T10:59:20Z
A focused information criterion for quantile regression: Evidence for the rebound effect
http://hdl.handle.net/2003/35160
Title: A focused information criterion for quantile regression: Evidence for the rebound effect
Authors: Behl, Peter; Dette, Holger; Frondel, Manuel; Vance, Colin
Abstract: In contrast to conventional model selection criteria, the Focused Information
Criterion (FIC) allows for the purpose-specific choice of model specifications.
This accommodates the idea that one kind of model might be highly
appropriate for inferences on a particular focus parameter, but not for another.
Using the FIC concept developed by Behl, Claeskens and Dette (2014)
for quantile regression analysis, and the estimation of the rebound effect in individual
mobility behavior as an example, this paper provides an empirical
application of the FIC in the selection of quantile regression models.
2016-07-27T14:11:20Z
Model robust designs for survival trials
http://hdl.handle.net/2003/35156
Title: Model robust designs for survival trials
Authors: Konstantinou, Maria; Biedermann, Stefanie; Kimber, Alan
Abstract: The exponential-based proportional hazards model is often assumed in
time-to-event experiments but may only approximately hold. We consider deviations
in different neighbourhoods of this model that include other widely used parametric
proportional hazards models and we further assume that the data are subject
to censoring. Minimax designs are then found explicitly based on criteria corresponding
to classical c- and D-optimality. We provide analytical characterisations
of optimal designs which, unlike optimal designs for related problems in the literature,
have finite support and thus avoid the issues of implementing a density-based
design in practice. Finally, our designs are compared with the balanced design that
is traditionally used in practice, and recommendations for practitioners are given.
2016-07-25T11:23:48Z
Residual-based inference on moment hypotheses, with an application to testing for constant correlation
http://hdl.handle.net/2003/35155
Title: Residual-based inference on moment hypotheses, with an application to testing for constant correlation
Authors: Demetrescu, Matei; Wied, Dominik
Abstract: Often, inference on moment properties of unobserved processes is conducted on the basis of estimated
counterparts obtained in a preliminary step. In some situations, the use of residuals instead of the
true quantities affects inference even in the limit, while in others there is no asymptotic residual effect.
For the case of statistics based on partial sums of nonlinear functions of the residuals, we give here a
characterization of the conditions under which the residual effect does not vanish as the sample size goes
to infinity (generic regularity conditions provided). An overview of methods to account for the residual
effect is also provided. The analysis extends to models with change points in parameters at estimated
time, in spite of the discontinuous manner in which the break time enters the model of interest. To
illustrate the usefulness of the results, we propose a test for constant correlations allowing for breaks
at unknown time in the marginal means and variances. We find, in Monte Carlo simulations and in an
application to US and German stock returns, that not accounting for changes in the marginal moments
has severe consequences.
2016-07-22T13:37:55Z
Assessing the similarity of dose response and target doses in two non-overlapping subgroups
http://hdl.handle.net/2003/35138
Title: Assessing the similarity of dose response and target doses in two non-overlapping subgroups
Authors: Bretz, Frank; Möllenhoff, Kathrin; Dette, Holger; Liu, Wei; Trampisch, Matthias
Abstract: We consider two problems that are attracting increasing attention in clinical dose
finding studies. First, we assess the similarity of two non-linear regression models
for two non-overlapping subgroups of patients over a restricted covariate space. To
this end, we derive a confidence interval for the maximum difference between the two
given models. If this confidence interval excludes the equivalence margins, similarity
of dose response can be claimed. Second, we address the problem of demonstrating
the similarity of two target doses for two non-overlapping subgroups, using again a
confidence interval based approach. We illustrate the proposed methods with a real
case study and investigate their operating characteristics (coverage probabilities, Type
I error rates, power) via simulation.
2016-07-13T13:02:57Z
Conditional heavy-tail behavior with applications to precipitation and river flow extremes
http://hdl.handle.net/2003/35131
Title: Conditional heavy-tail behavior with applications to precipitation and river flow extremes
Authors: Kinsvater, Paul; Fried, Roland
Abstract: This article deals with the right-tail behavior of a response distribution F_Y conditional on a regressor vector X = x, restricted to the heavy-tailed case of Pareto-type conditional distributions F_Y(y|x) = P(Y ≤ y | X = x), with heaviness of the right tail characterized by the conditional extreme value index γ(x) > 0. We particularly focus on testing the hypothesis H_{0,tail}: γ(x) = γ_0 of constant tail behavior for some
γ_0 > 0 and all possible x.
When considering x as a time index, the term trend analysis is commonly used. In the recent past several such trend analyses of extreme value data have been published, mostly focusing on time-varying modeling of location and scale parameters of the response distribution. In many such environmental studies a simple test against trend based on Kendall's tau statistic is applied. This test is powerful when the center of the conditional distribution F_Y(y|x) changes monotonically in x, for instance, in a simple location model μ(x) = μ_0 + x * μ_1, but the test is rather insensitive against monotonic tail behavior, say, γ(x) = η_0 + x * η_1. This has to be considered, since for many environmental applications the main interest is in the tail rather than the center of a distribution. Our work is motivated by this problem and it is our goal to demonstrate the opportunities and the limits of detecting and estimating non-constant conditional heavy-tail behavior with regard to applications from hydrology. We present and compare four different procedures by simulations and illustrate our findings on real data from hydrology: Weekly maxima of hourly precipitation from France and monthly maximal river
flows from Germany.
2016-07-04T09:48:33Z
Modeling of Gibbs energies of pure elements down to 0K using segmented regression
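A standard building block for the kind of tail-trend analysis Kinsvater and Fried describe above is the Hill estimator of the extreme value index γ; computing it over a moving time window gives a first, informal look at whether γ(x) is constant. A sketch in which the window length and the number k of upper order statistics are arbitrary choices, not the authors' procedures:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the extreme value index gamma > 0,
    based on the k largest observations."""
    xs = np.sort(np.asarray(x, dtype=float))
    return float(np.mean(np.log(xs[-k:] / xs[-k - 1])))

def rolling_hill(x, window=500, k=50):
    """Hill estimates on consecutive time windows: a crude look at
    whether the tail index changes over time."""
    return [hill_estimator(x[i:i + window], k)
            for i in range(0, len(x) - window + 1, window)]

rng = np.random.default_rng(2)
pareto = rng.uniform(size=5000) ** -0.5  # Pareto-type tail with gamma = 0.5
estimates = rolling_hill(pareto)
```

A formal test of constant γ(x), as in the paper, of course needs more than eyeballing such a sequence, but plotting it is a useful first diagnostic.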
http://hdl.handle.net/2003/35130
Title: Modeling of Gibbs energies of pure elements down to 0K using segmented regression
Authors: Roslyakova, Irina; Sundmann, Bo; Dette, Holger; Zhang, Lijun; Steinbach, Ingo
Abstract: A novel thermodynamic modeling strategy for stable solid alloy phases is
proposed based on a segmented regression approach. The model considers several
physical effects (e.g. electronic, vibrational) and is valid from 0K up to
the melting temperature. The proposed approach has been applied to several
pure elements. Results show good agreement with experimental data at low
and high temperatures. Since this is not the first attempt to propose a "universal"
physics-based model down to 0K for the pure elements as an alternative to
the current SGTE description, we also compare the results to existing models.
Analysis of the obtained results shows that the newly proposed model delivers
a more accurate description down to 0K for all studied pure elements according
to several statistical tests.
2016-07-04T08:51:14Z
Risk perception of climate change: Empirical evidence for Germany
http://hdl.handle.net/2003/35125
Title: Risk perception of climate change: Empirical evidence for Germany
Authors: Frondel, Manuel; Simora, Michael; Sommer, Stephan
Abstract: The perception of risks resulting from climate change is a key factor in motivating
individual adaptation and prevention behavior, as well as for the support of climate
policy measures. Using a generalized ordered logit approach and drawing on a
unique data set originating from two surveys conducted in 2012 and 2014, each among
more than 6,000 German households, we analyze the determinants of individual risk
perception associated with three kinds of natural hazards: heat waves, storms, and
floods. Our focus is on the role of objective risk measures and experience with these
natural hazards, whose frequency is likely to be affected by climate change. In line
with the received literature, the results suggest that personal experience with adverse
events and, even more importantly, personal damage therefrom are strong drivers of
individual risk perception.
2016-06-29T12:09:35Z
Estimation methods for the LRD parameter under a change in the mean
http://hdl.handle.net/2003/35124
Title: Estimation methods for the LRD parameter under a change in the mean
Authors: Rooch, Aeneas; Zelo, Ieva; Fried, Roland
Abstract: When analyzing time series which are supposed to exhibit long-range dependence (LRD), a basic
issue is the estimation of the LRD parameter, for example the Hurst parameter H ∈ (1/2, 1). Conventional
estimators of H easily lead to spurious detection of long memory if the time series includes a shift in the
mean. This defect has fatal consequences in change-point problems: Tests for a level shift rely on H, which
needs to be estimated beforehand, but this estimation is distorted by the level shift.
We investigate two blocks approaches to adapt estimators of H to the case that the time series includes
a jump and compare them with other natural techniques as well as with estimators based on the trimming
idea via simulations. These techniques improve the estimation of H if there is indeed a change in the mean.
In the absence of such a change, the methods hardly affect the usual estimation. As an adaptation, we recommend
an overlapping blocks approach: If one uses a consistent estimator, the adaptation will preserve this property
and it performs well in simulations.
2016-06-28T14:50:45Z
Germany’s Energiewende: A tale of increasing costs and decreasing willingness-to-pay
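A toy version of the problem Rooch, Zelo and Fried discuss above: a simple aggregated-variance estimator of H reads a level shift as spurious long memory, while blockwise demeaning largely removes the distortion. Block sizes, the size and placement of the shift (conveniently aligned to a block boundary here), and the estimator itself are all illustrative; the paper's overlapping-blocks method is more refined:

```python
import numpy as np

def aggvar_hurst(x, block_sizes=(10, 20, 40, 80)):
    """Aggregated-variance estimate of the Hurst parameter H, using
    Var(means over blocks of size m) ~ m^(2H - 2)."""
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in block_sizes:
        n = len(x) // m
        means = x[:n * m].reshape(n, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(np.var(means)))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

def blockwise_adapted_hurst(x, block=200):
    """Demean the series block by block before estimating H, so that a
    single level shift can distort at most one block."""
    x = np.asarray(x, dtype=float)
    n = len(x) // block
    y = x[:n * block].reshape(n, block)
    return aggvar_hurst((y - y.mean(axis=1, keepdims=True)).ravel())

rng = np.random.default_rng(3)
clean = rng.normal(size=4000)                      # white noise: H = 0.5
shifted = clean + 3.0 * (np.arange(4000) >= 2000)  # level shift in the mean
```

On the shifted series the naive estimate is pushed toward 1, whereas the blockwise-demeaned estimate stays in the vicinity of 0.5 (the demeaning introduces a small downward bias of its own, which is part of the trade-off the paper studies).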
http://hdl.handle.net/2003/35123
Title: Germany’s Energiewende: A tale of increasing costs and decreasing willingness-to-pay
Authors: Andor, Mark A.; Frondel, Manuel; Vance, Colin
Abstract: This paper presents evidence that the accumulating costs of Germany’s ambitious
plan to transform its system of energy provision – the so-called Energiewende –
are butting up against consumers’ decreased willingness-to-pay (WTP) for it. Following
a descriptive presentation that traces the German promotion of renewable energy
technologies since 2000, we draw on two stated-preference surveys conducted in 2013
and 2015 that elicit the households’ WTP for green electricity. To deal with the bias
that typifies hypothetical responses, a switching regression model is estimated that
distinguishes respondents according to whether they express definite certainty in their
reported WTP. Our results reveal a strong contrast between the households’ general
acceptance of supporting renewable energy technologies and their own WTP for green
electricity.
2016-06-28T14:48:24Z
Adaptive grid semidefinite programming for finding optimal designs
http://hdl.handle.net/2003/35122
Title: Adaptive grid semidefinite programming for finding optimal designs
Authors: Duarte, Belmiro P.M.; Wong, Weng Kee; Dette, Holger
Abstract: We find optimal designs for linear models using a novel algorithm that iteratively combines a Semidefinite
Programming (SDP) approach with adaptive grid (AG) techniques. The search space is first discretized
and SDP is applied to find the optimal design based on the initial grid. The points in the next grid set are
points that maximize the dispersion function of the SDP-generated optimal design using Nonlinear Programming
(NLP). The procedure is repeated until a user-specified stopping rule is reached. The proposed
algorithm is broadly applicable and we demonstrate its flexibility using (i) models with one or more variables,
and (ii) differentiable design criteria, such as A- and D-optimality, and a non-differentiable criterion like
E-optimality, including the mathematically more challenging case when the minimum eigenvalue of the
information matrix of the optimal design has geometric multiplicity larger than 1. Our algorithm is computationally
efficient because it is based on mathematical programming tools and so optimality is assured at
each stage; it also exploits the convexity of the problems whenever possible. Using several linear models,
we show the proposed algorithm can efficiently find both old and new optimal designs.
2016-06-28T14:33:05Z
Beyond inequality: A novel measure of skewness and its properties
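As a point of comparison for the Duarte, Wong and Dette abstract above: on a fixed grid — no adaptive refinement and no SDP — a D-optimal approximate design can already be found with the classical multiplicative algorithm. A sketch for quadratic regression on [-1, 1], where the D-optimal design is known to put mass 1/3 on each of -1, 0 and 1; the grid, model and iteration count are choices made for this example, not the paper's algorithm:

```python
import numpy as np

def d_optimal_weights(X, iters=3000):
    """Multiplicative algorithm for D-optimality on a fixed grid:
    w_i <- w_i * d_i / p, where d_i = f(x_i)' M(w)^{-1} f(x_i)."""
    n, p = X.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        M = X.T @ (w[:, None] * X)                       # information matrix
        d = np.einsum('ij,jk,ik->i', X, np.linalg.inv(M), X)
        w = w * d / p
        w = w / w.sum()                                  # guard against drift
    return w

grid = np.linspace(-1.0, 1.0, 21)
X = np.column_stack([np.ones_like(grid), grid, grid ** 2])  # f(x) = (1, x, x^2)
w = d_optimal_weights(X)
```

The general equivalence theorem provides the stopping check: at the optimum, the normalized variance d_i never exceeds the number of parameters p. The SDP-plus-adaptive-grid approach of the paper improves on this by moving the grid points themselves.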
http://hdl.handle.net/2003/35087
Title: Beyond inequality: A novel measure of skewness and its properties
Authors: Krämer, Walter; Dette, Holger
Abstract: We show that a recent extension of the Gini coefficient that makes
the latter more sensitive to asymmetric income distributions can be
viewed as an abstract measure of skewness. We develop some of its
properties and apply it to the US income distribution in 1974 and
2010.
2016-06-13T09:22:31Z
BaPreStoPro: an R package for Bayesian prediction of stochastic processes
http://hdl.handle.net/2003/35066
Title: BaPreStoPro: an R package for Bayesian prediction of stochastic processes
Authors: Hermann, Simone
Abstract: In many applications, stochastic processes are used for modeling. Bayesian
analysis is a strong tool for inference as well as for prediction. We here present
an R package for a large class of models, all based on the definition of a jump
diffusion with a non-homogeneous Poisson process. Special cases, such as the Poisson
process itself, a general diffusion process or a hierarchical (mixed) diffusion model,
are considered. The package is a general toolbox, because it is based on the
stochastic differential equation, approximated with the Euler scheme. Functions
for simulation, estimation and prediction are provided for each considered model.
2016-06-07T11:49:58Z
Bayesian prediction for stochastic processes
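The model class underlying the BaPreStoPro abstract above — a jump diffusion driven by a non-homogeneous Poisson process, approximated with the Euler scheme — can be sketched in a few lines. The Python below illustrates only the Euler scheme itself, not the package's interface; all coefficient functions in the example call are invented:

```python
import numpy as np

def euler_jump_diffusion(y0, T, n, drift, sigma, jump, intensity, rng):
    """Euler scheme for dY = drift(Y) dt + sigma(Y) dW + jump(Y) dN,
    where N is a non-homogeneous Poisson process with rate intensity(t)."""
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    y = np.empty(n + 1)
    y[0] = y0
    for i in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))           # Brownian increment
        dN = rng.poisson(intensity(t[i]) * dt)      # jump count on [t, t+dt)
        y[i + 1] = y[i] + drift(y[i]) * dt + sigma(y[i]) * dW + jump(y[i]) * dN
    return t, y

rng = np.random.default_rng(4)
t, path = euler_jump_diffusion(1.0, 1.0, 1000,
                               drift=lambda y: -0.5 * y,
                               sigma=lambda y: 0.0,          # noise switched off
                               jump=lambda y: 0.1,
                               intensity=lambda s: 2.0 * s,  # increasing jump rate
                               rng=rng)
```

Switching off the diffusion and jump terms reduces the scheme to explicit Euler for an ODE, which is a convenient correctness check.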
http://hdl.handle.net/2003/35019
Title: Bayesian prediction for stochastic processes
Authors: Hermann, Simone
Abstract: In many fields of statistical analysis, one is not only interested in estimation
of model parameters, but in a prediction for future observations. For stochastic
processes, on the one hand, one can be interested in the prediction for the further
development of the current, i.e. observed, series. On the other hand, prediction
for a new series can be of interest. This work presents two Bayesian prediction
procedures based on the transition density of the Euler approximation, which include
estimation uncertainty as well as the model variance. In the first algorithm,
the pointwise predictive distribution is calculated; in the second, trajectories are
drawn. Both methods are compared and analyzed with respect to their
advantages and drawbacks and set in contrast to two commonly used prediction
approaches.
2016-06-03T11:24:37Z
Asymmetry and performance metrics for equity returns
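The first idea in the Hermann abstract above — a pointwise predictive distribution that mixes the Euler transition density over posterior parameter draws, so that estimation uncertainty and model variance both enter — can be illustrated as follows. The model, the stand-in "posterior" samples and all constants are invented for this sketch:

```python
import numpy as np

def pointwise_predictive(y_last, dt, theta_draws, sigma, rng, n_per_draw=20):
    """One-step-ahead predictive draws for dY = theta * Y dt + sigma dW:
    for each posterior draw of theta, sample from the Euler transition
    density N(y + theta*y*dt, sigma^2 * dt)."""
    draws = []
    for theta in theta_draws:
        mean = y_last + theta * y_last * dt
        draws.append(rng.normal(mean, sigma * np.sqrt(dt), size=n_per_draw))
    return np.concatenate(draws)

rng = np.random.default_rng(5)
theta_post = rng.normal(0.5, 0.1, size=500)  # stand-in for posterior samples
pred = pointwise_predictive(1.0, 0.01, theta_post, sigma=0.2, rng=rng)
lower, upper = np.quantile(pred, [0.025, 0.975])
```

The resulting interval is wider than one built from a single point estimate of theta, which is precisely the effect of carrying the estimation uncertainty through to the prediction.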
http://hdl.handle.net/2003/35016
Title: Asymmetry and performance metrics for equity returns
Authors: Bowden, Roger J.; Posch, Peter N.; Ullmann, Daniel
Abstract: An assumption of symmetric asset returns, together with globally risk averse utility
functions, is unappealing for fund managers and other activist investors, whose preferences
switch between risk aversion on the downside and risk seeking on the upside. A performance
return criterion is originated that is more consistent with the implicit Friedman-Savage utility
ordering. Adapted from recent developments in the income distribution literature, the proposed
metric weights the lower versus upper conditional expected returns, while a dual spread or
dispersion metric also exists. The resulting performance metric is easy to compute. A point of
departure is the conventional Sharpe performance ratio, with the empirical comparisons extending
to a range of existing performance criteria. In contrast, the proposed W-metric results in
different and more embracing performance rankings.
2016-06-03T08:35:29Z
Dual disadvantage and dispersion dynamics for income distributions
http://hdl.handle.net/2003/35015
Title: Dual disadvantage and dispersion dynamics for income distributions
Authors: Bowden, Roger J.; Posch, Peter N.; Ullmann, Daniel
Abstract: Income distribution has been a longstanding focus of social and economic interest,
but never more so than in recent times. New metrics for disadvantage and spread enable a
more precise differentiation of directional asymmetry and dispersion, drawing on an internal
contextual perspective. The dual metrics for asymmetry and spread can be plotted over time
into a phase plane, enabling comparative social welfare perspectives over time and between
countries. The methods are utilised to study the dramatic changes that took place in Europe
prior to and after the GFC. Major differences are revealed. In terms of asymmetry and spread,
some countries have been fallers (lower in both) while other countries are risers.
2016-06-03T08:33:06Z