Eldorado Collection: Nichtlineare dynamische Modelle in Wirtschaft und Technik
http://hdl.handle.net/2003/26250
Last updated: 2021-09-27T19:03:14Z
http://hdl.handle.net/2003/40472
Title: Some practical aspects of sequential change point detection
Authors: Sivanesan, Sivanja; Dette, Holger; Ziggel, Daniel
Abstract: In this report we investigate the finite sample properties of a new online monitoring
scheme which was recently introduced by Gösmann et al. (2020) by means of a simulation
study and a real data example. We also develop an algorithm which can be used in
active risk management.
We start with an introduction to the basic notation and an explanation of the monitoring
procedure, and continue with an extensive simulation study that provides recommendations
for the choice of several tuning parameters. Finally, we present an illustration analyzing
the Standard & Poor’s 500, MSCI World and MSCI Emerging Markets indices.
Date: 2021-01-01
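The Gösmann et al. (2020) procedure is more elaborate than this, but the core logic of sequential monitoring — updating a detector statistic with each new observation and stopping at the first boundary crossing — can be sketched in a few lines. The CUSUM-type detector, the fixed threshold and the toy data below are illustrative assumptions, not the paper's actual statistic or boundary function:

```python
def monitor_mean_change(initial, stream, threshold=10.0):
    """Sequential CUSUM-type monitor: flag a change as soon as the cumulative
    deviation of incoming observations from the initial-sample mean exceeds
    a fixed threshold. Returns the detection time or None (no rejection)."""
    mu0 = sum(initial) / len(initial)   # baseline estimated from the initial sample
    cusum = 0.0
    for t, x in enumerate(stream, start=1):
        cusum += x - mu0                # accumulate deviations from the baseline
        if abs(cusum) > threshold:
            return t                    # stop at the first boundary crossing
    return None

# A stream whose mean jumps from 0 to 2 at t = 11: the accumulated drift
# 2 * (t - 10) first exceeds the threshold of 10 at t = 16.
stream = [0.0] * 10 + [2.0] * 20
print(monitor_mean_change([0.0] * 50, stream))  # → 16
```

In the paper's setting the fixed threshold is replaced by a critical curve calibrated to control the false alarm rate over an open-ended monitoring period — one of the tuning choices the simulation study addresses.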
http://hdl.handle.net/2003/40471
Title: Dose response signal detection by parametric and least squares bootstrap
Authors: Bastian, Patrick; Dette, Holger; Kokot, Kevin; Bornkamp, Björn; Bretz, Frank
Date: 2021-01-01
http://hdl.handle.net/2003/40470
Title: Wasserverbrauch privater Haushalte in Deutschland: Eine empirische Mikroanalyse
Authors: Frondel, Manuel; Niehues, Delia A.; Sommer, Stephan
Abstract: Germany is a comparatively water-rich country. Nevertheless, climatic
change may make it necessary to use the resource water sparingly in the future,
above all in times of drought. Against this background, this paper estimates
the price elasticity of residential water consumption, distinguishing between
households with a rough knowledge of water prices and households without such
knowledge. Based on about 1,100 observations of households living in single-family
homes, and using the sum of the per-cubic-meter prices for water and wastewater,
we find a moderate price elasticity of -0.102 that is statistically significantly
different from zero. Households that know their water prices tend to exhibit a
higher elasticity, whereas households without price knowledge show no statistically
significant response in their water consumption. Prices can therefore be used only
to a limited extent as an instrument for steering water consumption.
Date: 2021-01-01
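The economic magnitude of the estimated elasticity of -0.102 is easy to check with a constant-elasticity reading (a back-of-the-envelope sketch; the paper's regression analysis is more involved):

```python
def demand_response(price_elasticity, pct_price_change):
    """First-order approximation: percent change in quantity demanded
    implied by a percent change in price under a constant elasticity."""
    return price_elasticity * pct_price_change

# With the estimated elasticity of -0.102, a 10% increase in the combined
# water and wastewater price reduces consumption by only about 1%.
print(demand_response(-0.102, 10.0))
```

This smallness is exactly why the abstract concludes that prices can steer water consumption only to a limited extent.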
http://hdl.handle.net/2003/40354
Title: Carbon pricing in Germany’s road transport and housing sector: Options for reimbursing carbon revenues
Authors: Frondel, Manuel; Schubert, Stefanie
Abstract: In 2021, Germany launched a national emissions trading system (ETS) in its
road transport and housing sectors that increases the cost burden of consumers of fossil
fuels, the major source of carbon dioxide (CO2) emissions. A promising approach to securing
public acceptance of such carbon pricing would be to reallocate the resulting
“carbon” revenues entirely to consumers. This article examines three alternatives that
were discussed in the political arena prior to the introduction of the national carbon price:
a) a per-capita reallocation to private households, b) the reduction of electricity prices
by, e.g., decreasing the electricity tax, as well as c) targeted financial aid for vulnerable
consumers, such as increasing housing benefits. To estimate both the revenues originating
from carbon pricing and the resulting emission savings, we employ a partial equilibrium
approach that is based on price elasticity estimates on individual fossil fuel consumption
from the empirical literature. Most effective in alleviating the burden of poor
households would be increasing housing benefits. As this measure would not require
large monetary resources, we argue that the remaining revenues should preferably be
employed to reduce Germany’s electricity tax, which is becoming increasingly obsolete
given the steadily growing amount of electricity generated by renewable energy
technologies.
Date: 2021-01-01
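The partial-equilibrium logic described in the abstract can be sketched numerically: the carbon price translates into a fuel price increase via the fuel's emission factor, a price elasticity gives the demand (and hence emission) response, and revenue is the carbon price times the remaining emissions. All numbers below — baseline emissions, fuel price, emission factor, elasticity — are illustrative assumptions, not the paper's estimates:

```python
def carbon_pricing_effects(baseline_emissions_mt, carbon_price_eur_t, fuel_price_eur_l,
                           emission_factor_kg_l, price_elasticity):
    """Stylized partial-equilibrium calculation of emission savings (Mt CO2)
    and carbon revenues (bn EUR) from pricing the carbon content of a fuel."""
    # Fuel price increase per litre implied by the carbon price (EUR/t CO2).
    price_increase = carbon_price_eur_t * emission_factor_kg_l / 1000.0
    pct_price_change = 100.0 * price_increase / fuel_price_eur_l
    # Constant-elasticity demand (and hence emission) response.
    pct_demand_change = price_elasticity * pct_price_change
    remaining = baseline_emissions_mt * (1.0 + pct_demand_change / 100.0)
    savings = baseline_emissions_mt - remaining
    revenue_bn = carbon_price_eur_t * remaining * 1e6 / 1e9  # Mt -> t, EUR -> bn EUR
    return savings, revenue_bn

# Illustrative inputs: 100 Mt CO2 from motor fuel, a 25 EUR/t carbon price,
# a 1.40 EUR/l fuel price, 2.37 kg CO2 per litre, and elasticity -0.4.
savings, revenue = carbon_pricing_effects(100.0, 25.0, 1.40, 2.37, -0.4)
print(round(savings, 2), round(revenue, 2))  # → 1.69 2.46
```

The qualitative pattern matches the abstract's message: modest carbon prices generate sizable revenues but comparatively small emission savings, which is why the reimbursement channel matters.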
http://hdl.handle.net/2003/40286.2
Title: Disaggregate consumption feedback and energy conservation
Authors: Gerster, Andreas; Andor, Mark A.; Goette, Lorenz
Abstract: Novel information technologies have the potential to improve decision making. In
the context of smart metering, we investigate the impact of providing households with
appliance-level electricity feedback. In a randomized controlled trial, we find that the provision
of appliance-level feedback creates a conservation effect of an additional 5% relative
to a group receiving standard (aggregate) feedback. Consumers with poor knowledge of
appliance wattage respond most strongly to appliance-level feedback, consistent with the
mechanism in our model. We estimate that a smart-meter rollout will yield much larger
gains in consumer surplus if appliance-level feedback can be provided.
Date: 2021-01-01
http://hdl.handle.net/2003/40285
Title: Bayesian analysis of reduced rank regression models using post-processing
Authors: Aßmann, Christian; Boysen-Hogrefe, Jens; Pape, Markus
Abstract: Bayesian estimation of reduced rank regression models requires careful consideration of the
well-known identification problem. We demonstrate that this identification problem can be handled
efficiently by using prior distributions that restrict a part of the parameter space to the
Stiefel manifold and post-processing the obtained Gibbs sampler output according to an appropriately
specified loss function. This extends the possibilities for Bayesian inference in reduced
rank regression models. Besides inference, we also discuss model selection in terms of posterior
predictive assessment. We choose this approach because computing the marginal data likelihood
under the identifying restrictions implies a prohibitive computational burden. We illustrate the
proposed approach with a simulation study and an empirical application.
Date: 2021-01-01
http://hdl.handle.net/2003/40228
Title: Bargaining power and the labor share – a structural break approach
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: In this paper we investigate the relevance of bargaining institutions in the decline of
the labor share. Several explanations for the decline exist, which stress the roles
of technology, globalization and markups. Surprisingly neglected so far, however, is
the influence of bargaining institutions, in particular with a focus on changes in the
outside option. We provide evidence on this issue, using the Hartz IV labor market
reform in Germany as an exogenous shock to the wage bargaining of employees,
and investigate its impact on the labor share. We begin by developing a theoretical
model in which we outline the effect of a decrease in the outside option within a
wage bargaining framework. Thereafter, the approach is twofold. Combining the
EU KLEMS and Penn World Table databases, we first endogenously identify the
Hartz IV reform as a significant structural break in the German labor share. Second,
we estimate the effect of the Hartz IV legislation on the aggregated labor share
using a synthetic control approach in which we construct a counterfactual
“doppelganger” for Germany. Finally, we use rich firm-level panel data compiled by Bureau van
Dijk to support our results on the aggregated labor share. We find that the reform
decreases the labor share by 1.6 - 2.7 percentage points depending on method and
aggregation level. The synthetic control approach furthermore provides evidence
that this effect is persistent over time since the reform.
Date: 2021-01-01
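The synthetic control step can be illustrated with a toy version: choose nonnegative donor weights summing to one so that the weighted donor combination tracks the treated unit's pre-reform outcome. With only two hypothetical donor series, a grid search over the single weight suffices (a sketch of the idea only, not the paper's estimator or data):

```python
def synthetic_control_weight(treated_pre, donor1_pre, donor2_pre, grid=1001):
    """Two-donor synthetic control: find w in [0, 1] minimizing the
    pre-treatment squared error of w*donor1 + (1 - w)*donor2."""
    best_w, best_sse = 0.0, float("inf")
    for i in range(grid):
        w = i / (grid - 1)
        sse = sum((y - (w * d1 + (1.0 - w) * d2)) ** 2
                  for y, d1, d2 in zip(treated_pre, donor1_pre, donor2_pre))
        if sse < best_sse:
            best_w, best_sse = w, sse
    return best_w

# The treated pre-period series is exactly 30% of donor 1 plus 70% of
# donor 2, so the grid search recovers w = 0.3.
d1 = [1.0, 2.0, 3.0, 4.0]
d2 = [2.0, 1.0, 4.0, 3.0]
treated = [0.3 * a + 0.7 * b for a, b in zip(d1, d2)]
print(synthetic_control_weight(treated, d1, d2))  # → 0.3
```

With many donor units, as in the paper, the weights solve a constrained least-squares problem over the simplex rather than a one-dimensional grid search; the post-reform gap between the treated series and its weighted counterfactual is then read as the treatment effect.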
http://hdl.handle.net/2003/40192
Title: The effects of reforming a federal employment agency on labor demand
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: In this paper we report the results of an empirical study on the employment growth
effects of a policy intervention, explicitly aimed at increasing placement efficiency
of the Federal Employment Agency in Germany. We use the Hartz III reform in
the year 2004 as an exogenous intervention that improves the matching process and
compare establishments that use the services of the Federal Employment Agency
with establishments that do not use the placement services. Using detailed German
establishment level data, our difference-in-differences estimates reveal an increase
in employment growth among those firms that use the agency for their recruitment
activities compared to non-user firms. After the Hartz III reform was in place, establishments
using the agency grew roughly two percentage points faster in terms
of employment relative to non-users, and these establishments also achieved an increase
in the proportion of hires. We provide several robustness tests, using for example
inverse-probability weighting to additionally account for differences in observable
characteristics. Our paper highlights the importance of the placement service on
the labor demand side, in particular on the so far overlooked establishment level.
Date: 2019-01-01
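In the canonical two-period case, the difference-in-differences comparison described above reduces to a difference of group-mean changes (a minimal sketch with made-up numbers, not the paper's richer panel specification):

```python
def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """Two-period difference-in-differences: the outcome change among
    agency users net of the change among non-users."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean employment growth rates (%) before and after the
# reform for user (treated) and non-user (control) establishments.
print(did_estimate(1.0, 4.0, 1.5, 2.5))  # → 2.0
```

Netting out the control-group change removes common shocks, so only the differential growth among agency users — the reform effect, under the parallel-trends assumption — remains.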
http://hdl.handle.net/2003/40191
Title: Locally stationary multiplicative volatility modelling
Authors: Walsh, Christopher; Vogt, Michael
Abstract: In this paper, we study a semiparametric multiplicative volatility model, which
splits up into a nonparametric part and a parametric GARCH component. The
nonparametric part is modelled as a product of a deterministic time trend component
and of further components that depend on stochastic regressors. We
propose a two-step procedure to estimate the model. To estimate the nonparametric
components, we transform the model in order to apply the backfitting
procedure used in Vogt and Walsh (2019). The GARCH parameters are estimated
in a second step via quasi maximum likelihood. We show consistency
and asymptotic normality of our estimators. Our results are obtained using
mixing properties and local stationarity. We illustrate our method using financial
data. Finally, a small simulation study illustrates a substantial bias in the
GARCH parameter estimates when omitting the stochastic regressors.
Date: 2021-01-01
http://hdl.handle.net/2003/40102
Title: Digitalisierung und Nachhaltigkeit im Haushalts-, Gebäude- und Verkehrssektor: Ein kurzer Überblick
Authors: Frondel, Manuel
Abstract: Digitalization is credited with a large potential for reducing energy
consumption and the associated environmental effects. The empirical evidence compiled
in this article, however, suggests that the effects are often small. The energy savings
from smart-home and smart-metering technologies, for instance, are rather moderate and
lie in the low single-digit percentage range. The environmental effects associated with
these energy savings are correspondingly small. With respect to carbon dioxide emissions,
no reduction effects at all are to be expected in sectors that are covered by the EU
emissions trading system, owing to the waterbed effect. This article argues that, in
combination with the establishment of road-toll systems, the largest effects can be
expected in the transport sector, which is not yet covered by the EU emissions trading
system.
Date: 2021-01-01
http://hdl.handle.net/2003/40101
Title: Nonparametric and high-dimensional functional graphical models
Authors: Solea, Eftychia; Dette, Holger
Abstract: We consider the problem of constructing nonparametric undirected graphical models for
high-dimensional functional data. Most existing statistical methods in this context assume either a
Gaussian distribution on the vertices or linear conditional means. In this article we provide a more
flexible model which relaxes the linearity assumption by replacing it with an arbitrary additive form. The use
of functional principal components offers an estimation strategy that uses a group lasso penalty to
estimate the relevant edges of the graph. We establish statistical guarantees for the resulting estimators,
which can be used to prove consistency if the dimension and the number of functional principal
components diverge to infinity with the sample size. We also investigate the empirical performance of
our method through simulation studies and a real data application.
Date: 2021-01-01
http://hdl.handle.net/2003/40083
Title: Weighted bootstrap consistency for matching estimators: The role of bias-correction
Authors: Walsh, Christopher; Jentsch, Carsten; Hossain, Shaikh Tanvir
Abstract: We show that the purpose of consistent bias-correction for matching estimators of treatment effects is two-fold. Firstly, it is known to improve point estimation by removing asymptotically non-negligible bias terms. Secondly, even when an inconsistent bias-correction is negligible for
point estimates, it will distort inference, leading e.g. to invalid confidence intervals. In simulations, we show that the choice of the bias-correction estimator, which practitioners still have to make, can severely affect the weighted bootstrap’s performance when estimating the asymptotic variance in finite samples. In particular, simple rules such as estimating the bias based on linear regressions in the treatment arms can lead to very poor weighted bootstrap based variance estimates.
Date: 2021-01-01
http://hdl.handle.net/2003/40066
Title: Climate policy in times of the corona pandemic: Empirical evidence from Germany
Authors: Frondel, Manuel; Kussel, Gerhard; Larysch, Tobias; Osberghaus, Daniel
Abstract: Given the dramatic changes triggered by the Corona pandemic, the question arises whether it
has displaced people’s concerns about climate change and whether Corona-related financial losses among
affected households can influence their assessment of climate change. Based on a survey among more
than 6,000 German household heads conducted in the period spanning from May 18 to June 14, 2020, this
paper provides empirical evidence on the impact of the pandemic on perceptions of climate change and
climate policy, as well as the extent to which respondents are affected in terms of health and finances.
Although a majority of almost 77% of the respondents are concerned about their own health and that of
their families, according to our descriptive results, climate change appears to remain an important issue:
only six percent of the respondents feel that climate change has become less important since the beginning
of 2020, while about 70% of the respondents see no change in the importance of the issue. Yet, employing
discrete-choice models, our estimation results indicate that households that suffered from Corona-related
financial losses consider climate change to be less important than households that remained unaffected
in this respect. In accord with Engler et al. (2020), we thus conclude that lowering individual financial
losses is not only relevant from a social perspective, but it is also critical for the acceptance of
climate policy measures.
Date: 2021-01-01
http://hdl.handle.net/2003/40065
Title: Robust inference under time-varying volatility: A real-time evaluation of professional forecasters
Authors: Demetrescu, Matei; Hanck, Christoph; Kruse, Robinson
Abstract: In many forecast evaluation applications, standard tests (e.g., Diebold and Mariano, 1995) as
well as tests allowing for time-variation in relative forecast ability (e.g., Giacomini and Rossi,
2010) build on heteroskedasticity-and-autocorrelation consistent (HAC) covariance estimators.
Yet, the finite-sample performance of these asymptotics is often poor. "Fixed-b" asymptotics
(Kiefer and Vogelsang, 2005), used to account for long-run variance estimation, improve
finite-sample performance under homoskedasticity, but lose asymptotic pivotality under time-varying
volatility. Moreover, loss of pivotality due to time-varying volatility is found in the standard
HAC framework in certain cases as well. We propose a wild bootstrap implementation to restore
asymptotically pivotal inference for the above and new CUSUM- and Cramér-von Mises based
tests in a fairly general setup, allowing for estimation uncertainty from either a rolling window
or a recursive approach when fixed-b asymptotics are adopted to achieve good finite-sample
performance. We then investigate the (time-varying) performance of professional forecasters
relative to naive no-change and model-based predictions in real-time. We exploit the Survey of
Professional Forecasters (SPF) database and analyze nowcasts and forecasts at different horizons
for output and inflation. We find that not accounting for time-varying volatility seriously
affects outcomes of tests for equal forecast ability: wild bootstrap inference typically yields convincing
evidence for advantages of the SPF, while tests using non-robust critical values provide
remarkably less. Moreover, we find significant evidence for time-variation of relative forecast
ability, the advantages of the SPF weakening considerably after the "Great Moderation".
Date: 2021-01-01
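The wild bootstrap idea for forecast comparisons can be sketched for a simple Diebold-Mariano-type statistic: the demeaned loss differential is multiplied by random signs, which preserves the heteroskedasticity pattern of the series, and the statistic is recomputed on each draw. This is a bare-bones sketch without the HAC corrections and fixed-b refinements of the paper; the data are made up:

```python
import random

def dm_statistic(d):
    """t-type statistic for the mean of a loss differential series
    (plain i.i.d. standard error -- a deliberate simplification)."""
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / (var / n) ** 0.5

def wild_bootstrap_pvalue(d, reps=999, seed=7):
    """Wild bootstrap: flip signs of the demeaned differentials with
    Rademacher draws and compare recomputed statistics to the original."""
    rng = random.Random(seed)
    mean = sum(d) / len(d)
    stat = abs(dm_statistic(d))
    exceed = sum(
        abs(dm_statistic([(x - mean) * rng.choice((-1.0, 1.0)) for x in d])) >= stat
        for _ in range(reps)
    )
    return (exceed + 1) / (reps + 1)

# A loss differential that is clearly positive on average: the bootstrap
# p-value is small, rejecting equal forecast accuracy.
d = [0.5, 1.5, 0.8, 1.2, 1.0, 0.9, 1.1, 0.7, 1.3, 1.0]
print(wild_bootstrap_pvalue(d))  # → 0.001
```

Because the sign flips are applied observation by observation, the bootstrap distribution inherits any time-varying volatility in the differentials, which is the robustness property the abstract emphasizes.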
http://hdl.handle.net/2003/40027
Title: Nearest neighbor matching: Does the M-out-of-N bootstrap work when the naive bootstrap fails?
Authors: Walsh, Christopher; Jentsch, Carsten; Hossain, Shaikh Tanvir
Abstract: In a seminal paper, Abadie and Imbens (2008) showed that the limiting variance of the
classical nearest neighbor matching estimator cannot be consistently estimated by a naive
Efron-type bootstrap. Specifically, they show that the conditional variance of the Efron-type
bootstrap estimator does not converge to the correct limit in expectation. In essence this is due to drawing
with replacement such that original observations appear more than once in the bootstrap sample
with positive probability even when the sample size becomes large. In the same paper, it is
conjectured that the limiting variance should be consistently estimable by an M-out-of-N bootstrap.
Here, we prove that the conditional variance of an M-out-of-N-type bootstrap estimator does
indeed converge to the correct limit in expectation in the setting considered in Abadie and Imbens
(2008). The key to the proof lies in the fact that asymptotically the M-out-of-N-type bootstrap
sample does not contain any observations more than once with probability one. The finite sample
performance of the M-out-of-N-type bootstrap is investigated in a simulation study of the DGP
considered by Abadie and Imbens (2008).
Date: 2021-01-01
http://hdl.handle.net/2003/40015
Title: Reproducing kernel Hilbert spaces, polynomials and the classical moment problems
Authors: Dette, Holger; Zhigljavsky, Anatoly
Abstract: We show that polynomials do not belong to the reproducing kernel Hilbert space
of infinitely differentiable translation-invariant kernels whose spectral measures have
moments corresponding to a determinate moment problem. Our proof is based
on relating this question to the problem of best linear estimation in continuous
time one-parameter regression models with a stationary error process defined by
the kernel. In particular, we show that the existence of a sequence of estimators
with variances converging to 0 implies that the regression function cannot be an
element of the reproducing kernel Hilbert space. This question is then related
to the determinacy of the Hamburger moment problem for the spectral measure
corresponding to the kernel.
In the literature it was observed that a non-vanishing constant function does not
belong to the reproducing kernel Hilbert space associated with the Gaussian kernel
(see Corollary 4.44 in Steinwart and Christmann, 2008). Our results provide a unifying
view of this phenomenon and show that the mentioned result extends
to arbitrary polynomials and a broad class of translation-invariant kernels.
Date: 2021-01-01
http://hdl.handle.net/2003/40014
Title: Model order selection for cascade autoregressive (CAR) models
Authors: Köhler, Steffen
Abstract: In recent years, Cascade Autoregression (CAR) models have enjoyed increasing popularity in applied econometrics.
This is due to the fact that they are able to approximate both short- and long-memory processes and are easy
to implement. However, their model order, namely the timing of the steps, relies on ad-hoc decisions rather
than being data-driven. In this paper, techniques for model order selection of CAR models in finite samples
are presented. The approaches are evaluated in an extensive simulation study, as well as in an empirical
application. The results suggest that model order selection may provide gains in both in- and out-of-sample
performance.
Date: 2021-01-01
http://hdl.handle.net/2003/39974
Title: Optimal designs for comparing regression curves - dependence within and between groups
Authors: Schorning, Kirsten; Dette, Holger
Abstract: We consider the problem of designing experiments for the comparison of two regression
curves describing the relation between a predictor and a response in two groups,
where the data between and within the group may be dependent. In order to derive
efficient designs we use results from stochastic analysis to identify the best linear unbiased
estimator (BLUE) in a corresponding continuous time model. It is demonstrated that
in general simultaneous estimation using the data from both groups yields more precise
results than estimation of the parameters separately in the two groups. Using the BLUE
from simultaneous estimation, we then construct an efficient linear estimator for finite
sample size by minimizing the mean squared error between the optimal solution in the
continuous time model and its discrete approximation with respect to the weights (of the
linear estimator). Finally, the optimal design points are determined by minimizing the
maximal width of a simultaneous confidence band for the difference of the two regression
functions. The advantages of the new approach are illustrated by means of a simulation
study, where it is shown that the use of the optimal designs yields substantially narrower
confidence bands than the application of uniform designs.
Date: 2021-01-01
http://hdl.handle.net/2003/39973
Title: Soziale Normen und der Emissionsausgleich bei Flügen: Evidenz für deutsche Haushalte
Authors: Eßer, Jana; Frondel, Manuel; Sommer, Stephan
Abstract: The willingness to make voluntary payments to offset CO2
emissions, for instance from flights, has increased considerably in recent years.
One way to raise this willingness to compensate further is to activate a social
norm by pointing out that offsetting emissions is socially desirable. Against this
background, this paper examines the willingness to offset the CO2 emissions caused
by air travel through the purchase of offset certificates, using a discrete choice
experiment that was embedded in a survey from the year 2019. A social norm was
presented at random, as was one of three compensation amounts of 5, 10, or 15 euros.
We find that 57.0% of the participants choose to offset the emissions of an upcoming
flight. There are only small, statistically insignificant differences between the
group that was confronted with a social norm and the control group. The compensation
amount also appears to have no statistically significant influence on the willingness
to compensate, possibly because the differences between the compensation amounts are
small.
Date: 2021-01-01
http://hdl.handle.net/2003/39968
Title: Reducing vehicle cold start emissions through carbon pricing: Evidence from Germany
Authors: Frondel, Manuel; Marggraf, Clemens; Sommer, Stephan; Vance, Colin
Abstract: A large proportion of local pollutants originating from the road transport sector
is generated during the so-called cold-start phase of driving, that is, the first
few minutes of driving after a car has stood inactive for several hours. Drawing on
data from the German Mobility Panel (MOP), this paper analyzes the factors that
affect the frequency of cold starts, approximated here by the number of car tours
that a household takes over the course of a week. Based on fixed-effects panel
estimations, we find a negative and statistically significant effect of fuel prices on
the number of tours and, hence, cold starts. Using our estimates to explore the
spatial implications arising from fuel price increases stipulated under Germany’s
Climate Programme 2030, we find substantial impacts on the number of avoided
tours even for modest fuel price increases of 20 cents per liter, particularly in urban
areas. This outcome lends support to using carbon pricing as a means to improve
both global climate and local air quality, pointing to a co-benefit of climate policy.
Date: 2020-01-01
http://hdl.handle.net/2003/39964
Title: Accurate and (almost) tuning parameter free inference in cointegrating regressions
Authors: Reichold, Karsten; Jentsch, Carsten
Abstract: Tuning parameter choices complicate statistical inference in cointegrating
regressions and affect finite sample distributions of test statistics. As commonly
used asymptotic theory fails to capture these effects, tests often suffer
from severe size distortions. We propose a novel self-normalized test statistic
for general linear hypotheses, which avoids the choice of tuning parameters.
Its limiting null distribution is nonstandard, but simulating asymptotically
valid critical values is straightforward. To further improve the performance
of the test in small to medium samples, we employ the vector autoregressive
sieve bootstrap to construct critical values. To show its consistency, we
establish a bootstrap invariance principle result under conditions that go
beyond the assumptions commonly imposed in the literature. Simulation
results demonstrate that our new test outperforms competing approaches,
as it has good power properties and is considerably less prone to size distortions.
Date: 2020-01-01
http://hdl.handle.net/2003/39831
Title: Monetary policy and the stock market - A partly recursive SVAR estimator
Authors: Keweloh, Sascha Alexander; Seepe, Andre
Abstract: This study analyzes the interdependence of monetary policy and the stock market in a structural
VAR model. We argue that commonly used short- and long-run restrictions on the interaction
of both variables might not hold and propose an estimator not requiring any of these restrictions
on the interaction of monetary policy and the stock market. The proposed estimator combines
a data-driven and a restriction-based identification approach. In particular, the estimator allows
the researcher to order and identify some shocks recursively, while other shocks can remain unrestricted
and are identified based on independence and non-Gaussianity. We find that a positive
stock market shock contemporaneously increases the nominal interest rate, while a contractionary
monetary policy shock leads to lower stock returns on impact. Furthermore, we present evidence
that monetary policy is non-neutral with respect to long-run real stock prices.
Date: 2020-01-01
http://hdl.handle.net/2003/39817
Title: “The mother of all political problems?” On asylum seekers and elections
Authors: Tomberg, Lukas; Smith Stegen, Karen; Vance, Colin
Abstract: As immigration to Europe has increased, so has support for extremist parties. While many studies
have examined the effect of immigration on election outcomes, few have probed the effect of asylum
seekers – those fleeing strife and persecution – on voting, nor has there been much research on the
mediating role of local economic conditions. Drawing on county level panel data from Germany, our
study fills both gaps. We find that economic circumstances, as measured by the unemployment rate
and the level of disposable income, condition voters’ responses to the presence of asylum seekers, but
the effects for parties on the far right and left diverge markedly. Under economic prosperity, immigration
increases support on both sides of the political spectrum. As economic conditions worsen,
however, the effect of asylum seekers on the vote share for the far right remains stable, but weakens
for the left, eventually becoming negative. This divergence – which has not yet been reported in the
literature – suggests that an influx of asylum seekers, particularly when coupled with an economic
downturn, could tilt a political system rightwards. From a policy perspective, these results suggest
that heterogeneity arising from local economic conditions has important implications for the regional
allocation of asylum seekers.
Date: 2020-01-01
http://hdl.handle.net/2003/39809
Title: Determining the efficiency of residential electricity consumption
Authors: Andor, Mark A.; Bernstein, David H.; Sommer, Stephan
Abstract: Increasing energy efficiency is a key global policy goal for climate protection.
An important step towards an optimal reduction of energy consumption is the identification
of energy saving potentials in different sectors and the best strategies for increasing
efficiency. This paper analyzes these potentials in the household sector by estimating the
degree of inefficiency in the use of electricity and its determinants. Using stochastic frontier
analysis and disaggregated household data, we estimate an input requirement function
and inefficiency on a sample of 2,000 German households. Our results suggest that the
mean inefficiency amounts to around 20%, indicating a notable potential for energy savings.
Moreover, we find that the household size and income are among the main determinants
of individual inefficiency. This information can be used to increase the cost-efficiency of
programs aimed at enhancing energy efficiency.
Date: 2020-01-01
http://hdl.handle.net/2003/39808
Title: Weak convergence of sample covariance matrices and testing for seasonal unit roots
Authors: Kawka, Rafael
Abstract: The paper has two main contributions. First, weak convergence results are derived for
sample moments of processes that contain a unit root at an arbitrary frequency, where,
in contrast to the previous literature, the proofs are mainly based on algebraic manipulations
and well-known weak convergence results for martingale difference sequences. These
convergence results are used to derive the limiting distribution of the ordinary least squares
estimator for unit root autoregressions. As a second contribution, a Phillips-Perron type
test for a unit root at an arbitrary frequency is introduced and its limiting distributions are
derived. This test is further extended to a joint test for multiple unit roots and seasonal
integration. The limiting distributions of these test statistics are asymptotically equivalent
to various statistics presented earlier in the seasonal unit root literature.2020-01-01T00:00:00ZIntegrated modified OLS and fixed-b inference for seasonally cointegrated processes
http://hdl.handle.net/2003/39807
Title: Integrated modified OLS and fixed-b inference for seasonally cointegrated processes
Authors: Kawka, Rafael
Abstract: Many economic time series exhibit persistent seasonal patterns. One approach to model
this phenomenon is given by models including seasonal unit roots and, if several time series
are considered jointly, seasonal cointegration. For quarterly time series, e.g., unit roots may
be present at frequencies π/2 and π, in addition to the “standard unit root” at frequency
zero. Gregoir (2010) has extended the fully modified OLS estimator of Phillips and Hansen
(1990) from the cointegrating regression to the seasonally cointegrating regression case. In
this paper, we have a similar agenda, in that we undertake the corresponding extension for
the IM-OLS estimator of Vogelsang and Wagner (2014). The benefit of the seasonal
IM-OLS estimator, or SIM-OLS estimator, is that it forms the basis not only for asymptotic
standard inference but also allows for fixed-b inference. The paper furthermore proposes a
test for seasonal cointegration at all unit root frequencies. Note here that the cointegrating
spaces in general differ across frequencies and have to be estimated separately for each
frequency. The theoretical analysis is complemented by a simulation study.2020-01-01T00:00:00ZPivotal tests for relevant differences in the second order dynamics of functional time series
http://hdl.handle.net/2003/39791
Title: Pivotal tests for relevant differences in the second order dynamics of functional time series
Authors: van Delft, Anne; Dette, Holger
Abstract: Motivated by the need to statistically quantify differences between modern (complex) datasets which commonly result as high-resolution measurements of stochastic processes varying over a continuum, we propose novel testing procedures to detect relevant differences between the second order dynamics of two functional time series. In order to take the between-function dynamics into account that characterize this type of functional data, a frequency domain approach is taken. Test statistics are developed to compare differences in the spectral density operators and in the primary modes of variation as encoded in the associated eigenelements. Under mild moment conditions, we show convergence of the underlying statistics to Brownian motions and obtain pivotal test statistics via a self-normalization approach. The latter is essential because the nuisance parameters can be unwieldly and their robust estimation infeasible, especially if the two functional time series are dependent. Besides from these novel features, the properties of the tests are robust to any choice of frequency band enabling also to compare energy contents at a single frequency. The finite sample performance of the tests are verified through a simulation study and are illustrated with an application to fMRI data.2020-01-01T00:00:00ZPhotovoltaics and the solar rebound: Evidence for Germany
http://hdl.handle.net/2003/39758
Title: Photovoltaics and the solar rebound: Evidence for Germany
Authors: Frondel, Manuel; Kaestner, Kathrin; Sommer, Stephan; Vance, Colin
Abstract: Recent research suggests that households would increase their electricity consumption
in the aftermath of installing photovoltaic (PV) panels, a behavioral
change commonly referred to as the solar rebound. Drawing on panel data originating
from the German Residential Energy Consumption Survey (GRECS), we
employ panel estimation methods and the dynamic system estimator developed
by Blundell and Bond (1998) to investigate the solar rebound effect, thereby accounting
for simultaneity and endogeneity issues relating to PV installation and
the electricity price. Our empirical results suggest that the adoption of PV panels by
households hardly reduces the amount of electricity taken from the grid. As we derive
theoretically, this outcome implies that the rebound reaches a maximum that is
bounded by about 30% for German households. Yet, we are skeptical of whether
there is such a large solar rebound effect, given the strong economic incentives to
feed solar electricity into the public grid in the past.2020-01-01T00:00:00ZTesting for nonlinear cointegration under heteroskedasticity
http://hdl.handle.net/2003/39533
Title: Testing for nonlinear cointegration under heteroskedasticity
Authors: Hanck, Christoph; Massing, Till
Abstract: This article discusses tests for nonlinear cointegration in the presence of variance
breaks in the errors. We build on approaches of Cavaliere and Taylor (2006, Journal of
Time Series Analysis) for heteroskedastic cointegration tests and of Choi and Saikkonen (2010,
Econometric Theory) for nonlinear cointegration tests. We propose a bootstrap test and prove
its consistency.
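The bootstrap logic behind such tests can be illustrated in miniature. The following sketch is not the authors' procedure but a generic wild (Rademacher-multiplier) bootstrap p-value for an arbitrary residual-based statistic; the KPSS-type statistic in the usage example and all names are illustrative assumptions.

```python
import numpy as np

def wild_bootstrap_pvalue(residuals, statistic, n_boot=999, seed=0):
    """Approximate the null distribution of `statistic` by flipping the sign
    of each residual independently (wild bootstrap) and return the share of
    bootstrap statistics at least as large as the observed one."""
    rng = np.random.default_rng(seed)
    observed = statistic(residuals)
    count = 0
    for _ in range(n_boot):
        # Rademacher multipliers preserve heteroskedasticity patterns in the
        # residuals, which is why the wild bootstrap suits variance breaks.
        signs = rng.choice([-1.0, 1.0], size=len(residuals))
        if statistic(residuals * signs) >= observed:
            count += 1
    return (1 + count) / (1 + n_boot)

# Toy usage with a KPSS-type statistic (scaled sum of squared partial sums).
stat = lambda e: np.sum(np.cumsum(e) ** 2) / (len(e) ** 2 * e.var())
resid = np.random.default_rng(1).normal(size=200)
p = wild_bootstrap_pvalue(resid, stat)
```

Replacing `stat` by the statistic of an actual cointegration test, computed from estimated cointegration residuals, gives the bootstrap test in outline.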
A Monte Carlo study shows the approach to have appealing finite sample properties and
to work better than an approach using subresiduals. We provide an empirical application to
the environmental Kuznets curve (EKC), finding that the cointegration tests do not reject the
EKC hypothesis in most cases.2020-01-01T00:00:00ZA portmanteau-type test for detecting serial correlation in locally stationary functional time series
http://hdl.handle.net/2003/39304
Title: A portmanteau-type test for detecting serial correlation in locally stationary functional time series
Authors: Bücher, Axel; Dette, Holger; Heinrichs, Florian
Abstract: The Portmanteau test provides the vanilla method for detecting serial
correlations in classical univariate time series analysis. The method is extended to
the case of observations from a locally stationary functional time series. Asymptotic
critical values are obtained by a suitable block multiplier bootstrap procedure. The
test is shown to asymptotically hold its level and to be consistent against general
alternatives.2020-01-01T00:00:00ZA note on optimal designs for estimating the slope of a polynomial regression
http://hdl.handle.net/2003/39301
Title: A note on optimal designs for estimating the slope of a polynomial regression
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In this note we consider the optimal design problem for estimating the slope of
a polynomial regression with no intercept at a given point, say z. In contrast to
previous work, which considers symmetric design spaces, we investigate the model on
the interval [0, a] and characterize those values of z where an explicit solution of the
optimal design is possible.2020-01-01T00:00:00ZCorrecting intraday periodicity bias in realized volatility measures
http://hdl.handle.net/2003/39275
Title: Correcting intraday periodicity bias in realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: Diurnal fluctuations in volatility are a well-documented stylized fact of intraday price data. We
investigate how this intraday periodicity (IP) affects both finite-sample and asymptotic
properties of several popular realized estimators of daily integrated volatility which are based on
functionals of M intraday returns. We demonstrate that most of the estimators considered in our
study exhibit a finite-sample bias due to IP, which, however, becomes negligible if the number of
intraday returns diverges to infinity. We suggest appropriate correction factors for this bias based
on estimates of the IP. The adequacy of the new corrections is evaluated by means of a Monte
Carlo simulation study and an empirical example.2020-01-01T00:00:00ZData-based priors for vector error correction models
http://hdl.handle.net/2003/39251
Title: Data-based priors for vector error correction models
Authors: Prüser, Jan
Abstract: We propose two data-based priors for vector error correction models. Both
priors lead to highly automatic approaches which require only minimal user
input. An empirical investigation reveals that Bayesian vector error correction
(BVEC) models equipped with our proposed priors turn out to scale well to
higher dimensions and to forecast well. In addition, we find that exploiting
information in the level variables has the potential for improving long-term
forecasts. Thus, working with VARs in first differences may ignore valuable
information. A simulation study reveals that it is beneficial, in terms of estimation
accuracy, to use BVEC in the presence of cointegration. But if there is
no cointegration, the proposed priors provide a sufficient amount of shrinkage
so that the BVEC model has a similar estimation accuracy compared to the
Bayesian vector autoregressive (BVAR) estimated in first differences.2020-01-01T00:00:00ZDifference-in-differences estimation under non-parallel trends
http://hdl.handle.net/2003/39250
Title: Difference-in-differences estimation under non-parallel trends
Authors: Dette, Holger; Schumann, Martin
Abstract: Classic difference-in-differences estimation relies on the validity of the "parallel
trends assumption" (PTA), which ensures that the evolution of the variable of interest in the
control group can be used to determine its counterfactual development in the treatment group
in the absence of treatment. The plausibility of the PTA is usually assessed by a test of the
null hypothesis that the difference between the means of both groups is constant over time
before the treatment. However, this procedure is problematic as failure to reject the null
hypothesis does not imply the absence of differences in time trends between both groups due
to low power to detect economically relevant differences. We provide three tests of equivalence
leading to a "common range" (CR) condition that replaces the PTA and which naturally reflects
differences between treatment and control. We combine the CR with standard confidence
intervals to capture both design and sampling uncertainty in the data and show that the
combined confidence intervals yield more reliable inference when the PTA is violated.2020-01-01T00:00:00ZCorrecting intraday periodicity bias in realized volatility measures
http://hdl.handle.net/2003/39209
Title: Correcting intraday periodicity bias in realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: Diurnal fluctuations in volatility are a well-documented stylized fact of intraday price data. We
investigate how this intraday periodicity (IP) affects both finite-sample and asymptotic
properties of several popular realized estimators of daily integrated volatility which are based on
functionals of M intraday returns. We demonstrate that most of the estimators considered in our
study exhibit a finite-sample bias due to IP, which, however, becomes negligible if the number of
intraday returns diverges to infinity. We suggest appropriate correction factors for this bias based
on estimates of the IP. The adequacy of the new corrections is evaluated by means of a Monte
Carlo simulation study and an empirical example.2020-01-01T00:00:00ZNew model-based bioequivalence statistical approaches for pharmacokinetic studies with sparse sampling
http://hdl.handle.net/2003/39208
Title: New model-based bioequivalence statistical approaches for pharmacokinetic studies with sparse sampling
Authors: Loingeville, Florence; Bertrand, Julie; Nguyen, Thu Thuy; Sharan, Satish; Feng, Kairui; Sun, Wanjie; Han, Jing; Grosser, Stella; Zhao, Liang; Fang, Lanyan; Möllenhoff, Kathrin; Dette, Holger; Mentré, France
Abstract: In traditional pharmacokinetic (PK) bioequivalence analysis, two one-sided tests (TOST) are conducted on the area under the concentration-time curve and the maximal concentration derived using a non-compartmental approach. When rich sampling is unfeasible, a model-based (MB) approach using nonlinear mixed effect models (NLMEM) is possible. However, MB-TOST using asymptotic standard errors (SE) presents increased type I error when asymptotic conditions do not hold. Methods: In this work, we propose three alternative calculations of the SE based on i) an adaptation to NLMEM of the correction proposed by Gallant, ii) the a posteriori distribution of the treatment coefficient using the Hamiltonian Monte Carlo algorithm, and iii) parametric random effects and residual errors bootstrap. We evaluate these approaches by simulations, for two-arm parallel and two-period two-sequence cross-over designs with rich (n=10) and sparse (n=3) sampling under the null and the alternative hypotheses, with MB-TOST. Results: All new approaches correct for the inflation of the MB-TOST type I error in PK studies with sparse designs. The approach based on the a posteriori distribution appears to be the best compromise between controlled type I errors and computing times. Conclusion: MB-TOST using non-asymptotic SE controls the type I error rate better than when using asymptotic SE estimates for bioequivalence in PK studies with sparse sampling.2020-01-01T00:00:00ZA global-local prior for time-varying parameter VARs and monetary policy
http://hdl.handle.net/2003/39207
Title: A global-local prior for time-varying parameter VARs and monetary policy
Authors: Prüser, Jan
Abstract: Time-varying parameter VARs have become the workhorse models in empirical
macroeconomics. These models are usually equipped with tightly
parametrized prior distributions which favor a small and gradual change in
parameters. Do such prior distributions suppress some degree of time variation
in the VAR coefficients? We address this question by proposing a
flexible global-local prior. It turns out that the conventional prior may suppress economically
relevant patterns of time variation. Using the global-local prior,
we observe that parameter change can be abrupt rather than smooth. We
find that, during the chairmanship of Paul Volcker, the Fed has been fighting
inflation pressures by raising the interest rate in response to a negative supply
shock. However, during the chairmanship of Alan Greenspan, this policy
came to an end. In contrast, using the conventional prior, we do not detect
this pattern.2020-01-01T00:00:00ZDetecting relevant differences in the covariance operators of functional time series - a sup-norm approach
http://hdl.handle.net/2003/39181
Title: Detecting relevant differences in the covariance operators of functional time series - a sup-norm approach
Authors: Dette, Holger; Kokot, Kevin
Abstract: In this paper we propose statistical inference tools for the covariance operators of functional
time series in the two sample and change point problem. In contrast to most of
the literature, the focus of our approach is not testing the null hypothesis of exact equality
of the covariance operators. Instead, we propose to formulate the null hypotheses in the
form that "the distance between the operators is small", where we measure deviations by
the sup-norm. We provide powerful bootstrap tests for this type of hypothesis, investigate
their asymptotic properties and study their finite sample properties by means of a
simulation study.2020-01-01T00:00:00ZDekarbonisierung bis zum Jahr 2050? Klimapolitische Maßnahmen und Energieprognosen für Deutschland, Österreich und die Schweiz
http://hdl.handle.net/2003/39180
Title: Dekarbonisierung bis zum Jahr 2050? Klimapolitische Maßnahmen und Energieprognosen für Deutschland, Österreich und die Schweiz
Authors: Frondel, Manuel; Thomas, Tobias
Abstract: In view of growing climate policy challenges, many European countries are aiming for
decarbonization by the year 2050, that is, a phase-out of the use of fossil fuels. Against this
background, this article presents forecasts of energy demand and the energy mix for Germany,
Austria and Switzerland for the year 2030, together with an outlook to the year 2050. A
comparison of these countries' energy policies to date reveals major differences: while Germany
has so far relied mainly on massive subsidies for alternative electricity generation technologies,
Austria's approach has rather been to reduce energy consumption and greenhouse gas emissions
through regulatory measures, in particular mandates and bans, but also through subsidies.
Switzerland, in contrast, has relied on the market-based instrument of a CO2 levy since 2008.
The energy demand forecasts for the three countries presented here indicate that Germany and
Austria in particular are unlikely to reach the long-term goal of far-reaching decarbonization
by continuing their current policies, whereas Switzerland has already seen a noticeable decline
in primary energy consumption. Against this background, the CO2 pricing of emissions in the
transport and heating sectors recently adopted in Germany takes on particular importance.
Austria also intends to introduce CO2 pricing in these sectors. It remains to be seen, however,
how consistently the market-based instrument of CO2 pricing will actually be
pursued.2020-01-01T00:00:00ZSequential change point detection in high dimensional time series
http://hdl.handle.net/2003/39167
Title: Sequential change point detection in high dimensional time series
Authors: Gösmann, Josua; Stoehr, Christina; Dette, Holger
Abstract: Change point detection in high dimensional data has found considerable interest
in recent years. Most of the literature designs methodology for a retrospective
analysis, where the whole sample is already available when the statistical inference begins.
This paper takes a different point of view and develops monitoring schemes for the
online scenario, where high dimensional data arrives steadily and the goal is to detect
changes as fast as possible while at the same time controlling the probability of a type I
error, i.e. of a false alarm. We develop sequential procedures capable of detecting changes in the mean
vector of a successively observed high dimensional time series with spatial and temporal
dependence. The statistical properties of the methods are analyzed in the case where
both the sample size and the dimension converge to infinity. In this scenario it is shown that
the new monitoring schemes have asymptotic level alpha under the null hypothesis of no
change and are consistent under the alternative of a change in at least one component
of the high dimensional mean vector. Moreover, we also prove that the new detection
scheme identifies all components affected by a change. The finite sample properties of the
new methodology are illustrated by means of a simulation study and in the analysis of a
data example.
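For readers unfamiliar with online monitoring, the one-dimensional prototype of such a scheme can be sketched as follows. This is an ordinary CUSUM-type detector, not the procedure developed in the paper, and the threshold constant and boundary shape are illustrative assumptions rather than calibrated critical values.

```python
import numpy as np

def cusum_monitor(training, stream, threshold):
    """Monitor a stream for a change in mean: estimate mean and scale from an
    initial training sample, accumulate deviations of incoming observations
    from that mean, and raise an alarm once the cumulative deviation crosses
    a growing boundary. Returns the 1-based detection time, or None if no
    alarm is raised."""
    m = len(training)
    mu_hat = np.mean(training)
    sigma_hat = np.std(training, ddof=1)
    cumdev = 0.0
    for t, x in enumerate(stream, start=1):
        cumdev += x - mu_hat
        if abs(cumdev) > threshold * sigma_hat * np.sqrt(m + t):
            return t  # alarm: evidence of a mean change
    return None

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=200)
# In-control phase of length 100, then a mean shift of size one.
stream = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(1.0, 1.0, 200)])
alarm = cusum_monitor(train, stream, threshold=3.0)
```

In the high dimensional setting of the paper, one such component-wise statistic would be computed per coordinate and the components aggregated by taking the maximum.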
Our approach is based on a new type of monitoring scheme for one-dimensional data
which often turns out to be more powerful than the commonly used CUSUM and
Page-CUSUM methods, and the component-wise statistics are aggregated by the maximum
statistic. From a mathematical point of view we use Gaussian approximations for high
dimensional time series to prove our main results and derive extreme value convergence for
the maximum of the maximal increment of dependent Brownian motions. In particular
we show that the range of a Brownian motion on a given interval is in the domain of
attraction of the Gumbel distribution.2020-01-01T00:00:00ZA distribution free test for changes in the trend function of locally stationary processes
http://hdl.handle.net/2003/39154
Title: A distribution free test for changes in the trend function of locally stationary processes
Authors: Heinrichs, Florian; Dette, Holger
Abstract: In the common time series model X_{i,n} = μ(i/n) + ε_{i,n} with non-stationary errors we consider the problem of detecting a significant deviation of the mean function μ from a benchmark g(μ) (such as the initial value μ(0) or the average trend ∫₀¹ μ(t) dt). The problem is motivated by a more realistic modelling of change point analysis, where one is interested in identifying relevant deviations in a smoothly varying sequence of means (μ(i/n))_{i=1,...,n} and cannot assume that the sequence is piecewise constant. A test for this type of hypothesis is developed using an appropriate estimator for the integrated squared deviation of the mean function and the threshold. By a new concept of self-normalization adapted to non-stationary processes, an asymptotically
pivotal test for the hypothesis of a relevant deviation is constructed. The results are illustrated by means of a simulation study and a data example.2020-01-01T00:00:00ZK-sign depth: From asymptotics to efficient implementation
http://hdl.handle.net/2003/39100
Title: K-sign depth: From asymptotics to efficient implementation
Authors: Malcherczyk, Dennis; Leckey, Kevin; Müller, Christine H.
Abstract: The K-sign depth (K-depth) of a model parameter θ in a data set is the relative number of K-tuples among its residual vector that have alternating signs. The K-depth test based on K-depth, recently proposed by Leckey et al. (2019), is equivalent to the classical residual-based sign test for K = 2, but is much more powerful for K ≥ 3. This test has two major drawbacks. First, the computation of the K-depth is fairly time-consuming, and second, the test requires knowledge about the quantiles of the test statistic, which previously had to be obtained by simulation for each sample size individually. We tackle both of these drawbacks by presenting a limit theorem for the distribution of the test statistic and deriving an (asymptotically equivalent) form of the K-depth which can be computed efficiently. For K = 3, such a limit theorem was already derived in Kustosz et al. (2016a) by mimicking the proof for U-statistics. We provide here a much shorter proof based on Donsker’s theorem and extend it to any K ≥ 3. As part of the proof, we derive an asymptotically equivalent form of the K-depth which can be computed in linear time. This alternative and the original implementation of the K-depth are compared with respect to their runtimes and absolute difference.2020-01-01T00:00:00ZPowerful generalized sign tests based on sign depth
http://hdl.handle.net/2003/39099
Title: Powerful generalized sign tests based on sign depth
Authors: Leckey, Kevin; Malcherczyk, Dennis; Müller, Christine H.
Abstract: The classical sign test usually has very low power against certain alternatives. We present
a generalization which is similarly easy to comprehend but much more powerful. It is based on
K-sign depth, shortly denoted by K-depth. These so-called K-depth tests are motivated by
simplicial regression depth, but are not restricted to regression problems. They can be applied
as soon as the true model leads to independent residuals with median equal to zero. Moreover,
general hypotheses on the unknown parameter vector can be tested. Since they depend only
on the signs of the residuals, these test statistics are outlier robust. While the 2-depth test, i.e.
the K-depth test for K = 2, is equivalent to the classical sign test, K-depth tests with K ≥ 3
turn out to be more powerful in many applications. As we will briefly discuss, these tests are
also related to runs tests. A drawback of the K-depth test is its fairly high computational effort
when implemented naively. However, we show how this inherent computational complexity can
be reduced. In order to see why K-depth tests with K ≥ 3 are more powerful than the classical
sign test, we discuss the asymptotic behaviour of their test statistic for residual vectors with only
few sign changes, which is in particular the case for some non-fits that the classical sign test cannot
reject. In contrast, we also consider residual vectors with alternating signs, representing models
that fit the data very well. Finally, we demonstrate the good power of the K-depth tests for
quadratic regression.2020-01-01T00:00:00ZMarket premia for renewables in Germany: The effect on electricity prices
http://hdl.handle.net/2003/39098
Title: Market premia for renewables in Germany: The effect on electricity prices
Authors: Frondel, Manuel; Kaeding, Matthias; Sommer, Stephan
Abstract: Due to the growing share of “green” electricity generated by renewable energy
technologies, the frequency of negative price spikes has substantially increased in
Germany. To reduce such events, in 2012, a market premium scheme (MPS) was introduced
as an alternative to feed-in tariffs for the promotion of green electricity. Drawing
on hourly day-ahead spot prices for the time period spanning 2009 to 2016 and
employing a nonparametric modeling strategy called Bayesian Additive Regression
Trees, this paper empirically evaluates the efficacy of Germany’s MPS. Via counterfactual
analyses, we demonstrate that the introduction of the MPS decreased the number
of hours with negative prices by some 70%.2020-01-01T00:00:00ZEfficient tests for bio-equivalence in functional data
http://hdl.handle.net/2003/39097
Title: Efficient tests for bio-equivalence in functional data
Authors: Dette, Holger; Kokot, Kevin
Abstract: We study the problem of testing the equivalence of functional parameters (such as the
mean or variance function) in the two sample functional data problem. In contrast to
previous work, which reduces the functional problem to a multiple testing problem for the
equivalence of scalar data by comparing the functions at each point, our approach is based
on an estimate of a distance measuring the maximum deviation between the two functional
parameters. Equivalence is claimed if the estimate for the maximum deviation does not
exceed a given threshold. A bootstrap procedure is proposed to obtain quantiles for the
distribution of the test statistic and consistency of the corresponding test is proved in the
large sample scenario. As the methods proposed here avoid the use of the intersection-union
principle, they are less conservative and more powerful than the currently available
methodology.2020-01-01T00:00:00ZQuantifying deviations from separability in space-time functional processes
http://hdl.handle.net/2003/39075
Title: Quantifying deviations from separability in space-time functional processes
Authors: Dette, Holger; Dierickx, Gauthier; Kutta, Tim
Abstract: The estimation of covariance operators of spatio-temporal data is in many applications only computationally feasible under simplifying assumptions, such as separability of the covariance into strictly temporal and spatial factors. Powerful tests for this assumption have been proposed in the literature. However, as real-world systems such as climate data are notoriously inseparable, validating this assumption by statistical tests seems inherently questionable. In this paper we present an alternative approach: by virtue of separability measures, we quantify how strongly the data’s covariance operator diverges from a separable approximation. Confidence intervals localize these measures with statistical guarantees. This method provides users with a flexible tool to weigh the computational gains of a separable model against the associated increase in bias. As separable approximations we consider the established methods of partial traces and partial products, and develop weak convergence principles for the corresponding estimators. Moreover, we also prove such results for estimators of optimal separable approximations, which are arguably of most interest in applications. In particular, we present for the first time statistical inference for this object, which has previously been confined to estimation. Besides confidence intervals, our results encompass tests for approximate separability. All methods proposed in this paper are free of nuisance parameters and require neither computationally expensive resampling procedures nor the estimation of nuisance parameters. A simulation study underlines the advantages of our approach, and its applicability is demonstrated by the investigation of German annual temperature data.2020-01-01T00:00:00ZDesign admissibility and de la Garza phenomenon in multi-factor experiments
http://hdl.handle.net/2003/39070
Title: Design admissibility and de la Garza phenomenon in multi-factor experiments
Authors: Dette, Holger; Liu, Xin; Yue, Rong-Xian
Abstract: The determination of an optimal design for a given regression problem is an intricate
optimization problem, especially for models with multivariate predictors. Design
admissibility and invariance are main tools to reduce the complexity of the optimization
problem and have been successfully applied for models with univariate predictors.
In particular, several authors have developed sufficient conditions for the existence of
saturated designs in univariate models, where the number of support points of the optimal
design equals the number of parameters. These results generalize the celebrated de
la Garza phenomenon (de la Garza, 1954), which states that for a polynomial regression
model of degree p − 1 any optimal design can be based on at most p points.
This paper provides – for the first time – extensions of these results for models
with a multivariate predictor. In particular we study a geometric characterization
of the support points of an optimal design to provide sufficient conditions for the
occurrence of the de la Garza phenomenon in models with multivariate predictors and
characterize properties of admissible designs in terms of admissibility of designs in
conditional univariate regression models.2020-01-01T00:00:00ZCO2-Bepreisung in den Sektoren Verkehr und Wärme: Optionen für eine sozial ausgewogene Ausgestaltung
http://hdl.handle.net/2003/39066
Title: CO2-Bepreisung in den Sektoren Verkehr und Wärme: Optionen für eine sozial ausgewogene Ausgestaltung
Authors: Frondel, Manuel
Abstract: The introduction of national CO2 pricing from the year 2021 onward is a done deal:
in the transport and heating sectors, a national emissions trading system is to be established
in which CO2 prices are fixed for the years 2021 to 2025 and rise successively, starting at
25 euros per tonne. This entails higher cost burdens for consumers. To nevertheless gain broad
acceptance for CO2 pricing, a promising approach would be to return the resulting revenues in
full to consumers. Against this background, this article discusses three alternatives for
redistributing the additional government revenues: a) a flat per-capita rebate for private
households, b) a reduction of electricity costs through (i) tax financing of the industry
exemptions from the EEG surcharge and (ii) a reduction of the electricity tax, and c) targeted
subsidies for particularly affected consumers, for instance in the form of an increase in
housing benefits. With regard to relieving needy households, the third alternative would be the
most accurately targeted. The remaining funds could be used to reduce the electricity tax,
which is becoming increasingly obsolete from an ecological point of view. Although there are
good reasons both for a per-capita rebate and for a reduction of the electricity tax, an
electricity tax reduction has several advantages over a per-capita lump sum, in particular
with regard to sector coupling and the transaction costs of redistribution, which would be
negligible in the case of an electricity tax
reduction.2020-01-01T00:00:00ZTests based on sign depth for multiple regression
http://hdl.handle.net/2003/39065
Title: Tests based on sign depth for multiple regression
Authors: Horn, Melanie; Müller, Christine H.
Abstract: The extension of simplicial depth to robust regression, the so-called simplicial regression depth,
provides an outlier robust test for the parameter vector of regression models. Since simplicial regression
depth often reduces to counting the subsets with alternating signs of the residuals, this led recently to
the notion of sign depth and sign depth test. Thereby sign depth tests generalize the classical sign tests.
Since sign depth depends on the order of the residuals, one generally assumes that the D-dimensional
regressors (explanatory variables) can be ordered with respect to an inherent order. While the one-dimensional
real space possesses such a natural order, one cannot order these regressors that easily for
D > 1 because there exists no canonical order of the data in most cases.
For this scenario, we present orderings according to the Shortest Hamiltonian Path and an approximation
of it. We compare them with more naive approaches such as taking the order in the data set or ordering
on the basis of a single component of the regressor. The comparison is based on the computational runtime,
the stability of the order under transformations of the data, and the power of the resulting sign depth
tests for testing the parameter vector of different multiple regression models. Moreover, we compare the
power of our new tests with the power of the classical sign test and the F-test. The sign depth
tests based on our distance-based approaches show power similar to that of the F-test for normally distributed
residuals, with the additional benefit of being much more robust against outliers.2020-01-01T00:00:00ZAn asymptotic test for constancy of the variance under short-range dependence
http://hdl.handle.net/2003/39057
Title: An asymptotic test for constancy of the variance under short-range dependence
Authors: Schmidt, Sara; Wornowizki, Max; Fried, Roland; Dehling, Herold
Abstract: We present a novel approach to test for heteroscedasticity of
a non-stationary time series that is based on Gini's mean difference of
logarithmic local sample variances. In order to analyse the large sample behaviour
of our test statistic, we establish new limit theorems for U-statistics
of dependent triangular arrays. We derive the asymptotic distribution of the
test statistic under the null hypothesis of a constant variance and show that
the test is consistent against a large class of alternatives, including multiple
structural breaks in the variance. Our test is applicable even in the case
of non-stationary processes, assuming a locally stationary mean function.
The performance of the test and its comparatively low computation time
are illustrated in an extensive simulation study. As an application, we analyse
data from civil engineering, monitoring crack widths in concrete bridge
surfaces.2020-01-01T00:00:00ZStatistical inference for high dimensional panel functional time series
http://hdl.handle.net/2003/39020
Title: Statistical inference for high dimensional panel functional time series
Authors: Zhou, Zhou; Dette, Holger
Abstract: In this paper we develop statistical inference tools for high dimensional functional
time series. We introduce a new concept of physically dependent processes in
the space of square integrable functions, which adopts the idea of basis decomposition
of functional data in these spaces, and derive Gaussian and multiplier bootstrap
approximations for sums of high dimensional functional time series. These results
have numerous important statistical consequences. For example, we consider the development
of joint simultaneous confidence bands for the mean functions and the
construction of tests for the hypotheses that the mean functions in the spatial dimension
are parallel. The results are illustrated by means of a small simulation study
and in the analysis of Canadian temperature data.2020-01-01T00:00:00ZAre deviations in a gradually varying mean relevant? A testing approach based on sup-norm estimators
http://hdl.handle.net/2003/38720
Title: Are deviations in a gradually varying mean relevant? A testing approach based on sup-norm estimators
Authors: Bücher, Axel; Dette, Holger; Heinrichs, Florian
Abstract: Classical change point analysis aims at (1) detecting abrupt changes
in the mean of a possibly non-stationary time series and at (2) identifying regions
where the mean exhibits a piecewise constant behavior. In many applications however,
it is more reasonable to assume that the mean changes gradually in a smooth
way. Those gradual changes may either be non-relevant (i.e., small), or relevant
for a specific problem at hand, and the present paper presents statistical methodology
to detect the latter. More precisely, we consider the common nonparametric
regression model X_i = μ(i/n) + ε_i with possibly non-stationary errors and propose
a test for the null hypothesis that the maximum absolute deviation of the
regression function μ from a functional g(μ) (such as the value μ(0) or the integral
∫₀¹ μ(t)dt) is smaller than a given threshold on a given interval [x0, x1] ⊆ [0, 1]. A
test for this type of hypotheses is developed using an appropriate estimator, say
d̂∞,n, for the maximum deviation d∞ = sup t∈[x0,x1] |μ(t) − g(μ)|. We derive the
limiting distribution of an appropriately standardized version of d̂∞,n, where the
standardization depends on the Lebesgue measure of the set of extremal points of
the function μ(·) − g(μ). A refined procedure based on an estimate of this set is
developed and its consistency is proved. The results are illustrated by means of a
simulation study and a data example.2020-01-01T00:00:00ZExplicit results on conditional distributions of generalized exponential mixtures
http://hdl.handle.net/2003/38570
Title: Explicit results on conditional distributions of generalized exponential mixtures
Authors: Klüppelberg, Claudia; Seifert, Miriam Isabel
Abstract: For independent exponentially distributed random variables Xi, i ∈ N, with distinct rates λi we consider sums ∑i∈A Xi for A ⊆ N, which follow generalized exponential mixture (GEM) distributions. We provide novel
explicit results on the conditional distribution of the total sum ∑i∈N Xi given that a subset sum
∑j∈A Xj exceeds a certain threshold value t > 0, and vice versa. Moreover, we investigate the characteristic tail behavior of these conditional distributions for t → ∞. Finally, we illustrate how our probabilistic results can be applied in practice by providing examples
from both reliability theory and risk management.2020-01-01T00:00:00ZPrediction in locally stationary time series
http://hdl.handle.net/2003/38530
Title: Prediction in locally stationary time series
Authors: Dette, Holger; Wu, Weichi
Abstract: We develop an estimator for the high-dimensional covariance matrix of a locally
stationary process with a smoothly varying trend and use this statistic to derive consistent
predictors in non-stationary time series. In contrast to the currently available
methods for this problem the predictor developed here does not rely on fitting an
autoregressive model and does not require a vanishing trend. The finite sample properties
of the new methodology are illustrated by means of a simulation study and a
data example.2020-01-01T00:00:00ZDetecting structural breaks in eigensystems of functional time series
http://hdl.handle.net/2003/38386
Title: Detecting structural breaks in eigensystems of functional time series
Authors: Dette, Holger; Kutta, Tim
Abstract: Detecting structural changes in functional data is a prominent topic in statistical
literature. However, not all trends in the data are important in applications, but only
those of large enough influence. In this paper we address the problem of identifying
relevant changes in the eigenfunctions and eigenvalues of covariance kernels of L^2[0, 1]-
valued time series. By self-normalization techniques we derive pivotal, asymptotically
consistent tests for relevant changes in these characteristics of the second order structure
and investigate their finite sample properties in a simulation study. The applicability of
our approach is demonstrated analyzing German annual temperature data.2019-01-01T00:00:00ZEquivalence tests for binary efficacy-toxicity responses
http://hdl.handle.net/2003/38379
Title: Equivalence tests for binary efficacy-toxicity responses
Authors: Möllenhoff, Kathrin; Dette, Holger; Bretz, Frank
Abstract: Clinical trials often aim to compare a new drug with a reference treatment in terms of efficacy and/or toxicity depending on covariates such as, for example, the dose level of the drug. Equivalence of these treatments can be claimed if the difference in average outcome is below a certain threshold over the covariate range. In this paper we assume that the efficacy and toxicity of the treatments are measured as binary outcome variables and we address two problems. First, we develop a new test procedure for the assessment of equivalence of two treatments over the entire covariate range for a single binary endpoint. Our approach is based on a parametric bootstrap, which generates data under the constraint that the distance between the curves is equal to the pre-specified equivalence threshold. Second, we address equivalence for bivariate binary (correlated) outcomes by extending the previous approach for a univariate response. For this purpose we use a 2-dimensional Gumbel model for binary efficacy-toxicity responses. We investigate the operating characteristics of the proposed approaches by means of a simulation study and present a case study as an illustration.2019-01-01T00:00:00ZConvergence of spectral density estimators in the locally stationary framework
http://hdl.handle.net/2003/38260
Title: Convergence of spectral density estimators in the locally stationary framework
Authors: Kawka, Rafael
Abstract: Locally stationary processes are characterised by spectral densities that are functions
of rescaled time. We study the asymptotic properties of spectral density
estimators in the locally stationary framework. In particular, we show that for a
locally stationary process with time-varying spectral density function f(u, λ) standard
spectral density estimators consistently estimate the time-averaged spectral
density ∫₀¹ f(u, λ) du. This result is complemented by some illustrative examples
and applications including HAC inference in the multiple linear regression model
and a simple visual tool for the detection of unconditional heteroskedasticity.2019-01-01T00:00:00ZSteuer versus Emissionshandel: Optionen für die Ausgestaltung einer CO2-Bepreisung
http://hdl.handle.net/2003/38259
Title: Steuer versus Emissionshandel: Optionen für die Ausgestaltung einer CO2-Bepreisung
Authors: Frondel, Manuel
Abstract: In the view of economists, greenhouse gas emissions in
Europe can be avoided most cost-efficiently by extending the EU emissions trading
system, so far restricted to the energy sector and industry, to all sectors not yet
integrated into it. However, majorities in the European Union must be found for such an
extension of emissions trading. As long as this extension does not win the approval of all
member states, the introduction of national CO2 pricing in these sectors
could be considered, and it could in principle be implemented in two
ways: via emissions trading, either established separately as a national
trading system or through an opt-in of Germany's not yet integrated sectors
into the existing EU emissions trading system, or via the introduction of a
national CO2 tax. The weighing of the advantages and disadvantages of the
two options undertaken in this paper, CO2 tax versus emissions trading, shows that a CO2
tax has serious drawbacks, above all a lack of accuracy in
meeting given emission targets.2019-01-01T00:00:00ZCognitive reflection and the valuation of energy efficiency
http://hdl.handle.net/2003/38258
Title: Cognitive reflection and the valuation of energy efficiency
Authors: Andor, Mark A.; Frondel, Manuel; Gerster, Andreas; Sommer, Stephan
Abstract: Based on a stated-choice experiment among about 3,600 German household
heads on the purchase of electricity-using durables, this paper explores the impact
of cognitive reflection on consumers’ valuation of energy efficiency, as well as its
interaction with consumers’ response to the EU energy label. Using a standard
cognitive reflection test, our results indicate that consumers with low cognitive
reflection scores value energy efficiency less than those with high scores. Furthermore,
we find that consumers with a low level of cognitive reflection respond more
strongly to grade-like energy efficiency classes than to detailed information on
annual energy use.2019-01-01T00:00:00ZTwo-sample tests for relevant differences in the eigenfunctions of covariance operators
http://hdl.handle.net/2003/38256
Title: Two-sample tests for relevant differences in the eigenfunctions of covariance operators
Authors: Aue, Alexander; Dette, Holger; Rice, Gregory
Abstract: This paper deals with two-sample tests for functional time series data, which have become widely
available in conjunction with the advent of modern complex observation systems. Here, particular interest
is in evaluating whether two sets of functional time series observations share the shape of their primary
modes of variation as encoded by the eigenfunctions of the respective covariance operators. To this end,
a novel testing approach is introduced that connects with, and extends, existing literature in two main
ways. First, tests are set up in the relevant testing framework, where interest is not in testing an exact
null hypothesis but rather in detecting deviations deemed sufficiently relevant, with relevance determined
by the practitioner and perhaps guided by domain experts. Second, the proposed test statistics rely on
a self-normalization principle that helps to avoid the notoriously difficult task of estimating the long-run
covariance structure of the underlying functional time series. The main theoretical result of this paper is
the derivation of the large-sample behavior of the proposed test statistics. Empirical evidence, indicating
that the proposed procedures work well in finite samples and compare favorably with competing methods,
is provided through a simulation study, and an application to annual temperature data.2019-01-01T00:00:00ZA generalized method of moments estimator for structural vector autoregressions based on higher moments
http://hdl.handle.net/2003/38224
Title: A generalized method of moments estimator for structural vector autoregressions based on higher moments
Authors: Keweloh, Alexander Sascha
Abstract: I propose a generalized method of moments estimator for structural vector
autoregressions with independent and non-Gaussian shocks. The shocks are
identified by exploiting information contained in higher moments of the
data. Extending the standard identification approach, which relies on the
covariance, to the coskewness and cokurtosis allows one to identify and
estimate the simultaneous interaction without any further restrictions. I
analyze the finite sample properties of the estimator and apply it to
illustrate the simultaneous interaction between economic activity, oil and
stock prices.2019-09-11T00:00:00ZEfficient model-based bioequivalence testing
http://hdl.handle.net/2003/38213
Title: Efficient model-based bioequivalence testing
Authors: Möllenhoff, Kathrin; Loingeville, Florence; Bertrand, Julie; Nguyen, Thu Thuy; Sharan, Satish; Sun, Guoying; Grosser, Stella; Zhao, Liang; Fang, Lanyan; Mentré, France; Dette, Holger
Abstract: The classical approach to analyze pharmacokinetic (PK) data in bioequivalence studies
aiming to compare two different formulations is to perform noncompartmental analysis
(NCA) followed by two one-sided tests (TOST). In this regard the PK parameters AUC
and Cmax are obtained for both treatment groups and their geometric mean ratios are
considered. According to current guidelines by the U.S. Food and Drug Administration
and the European Medicines Agency the formulations are deemed to be similar if the
90% confidence interval for these ratios falls between 0.8 and 1.25. As NCA is not a
reliable approach in case of sparse designs, a model-based alternative has already been
proposed for the estimation of AUC and Cmax using non-linear mixed effects models.
Here we propose a test other than the TOST, called the BOT, and evaluate it through a
simulation study both for NCA and model-based approaches. For products with high
variability in PK parameters, this method appears to have type I errors closer to the
conventionally accepted significance level of 0.05, suggesting its potential use in situations
where conventional bioequivalence analysis is not applicable.2019-01-01T00:00:00ZA note on Herglotz’s theorem for time series on function spaces
http://hdl.handle.net/2003/38207
Title: A note on Herglotz’s theorem for time series on function spaces
Authors: van Delft, Anne; Eichler, Michael
Abstract: In this article, we prove Herglotz’s theorem for Hilbert-valued time series. This requires the notion of an operator-valued measure, which we shall make precise for our setting. Herglotz’s theorem for functional time series allows us to generalize existing results that are central to frequency domain analysis on the function space. In particular, we use this result to prove the existence of a functional Cramér representation of a large class of processes, including those with jumps in the spectral distribution and long-memory processes. We furthermore obtain an optimal finite-dimensional reduction of the time series under weaker assumptions than available in the literature. The results of this paper therefore enable Fourier analysis for processes whose spectral density operator does not necessarily exist.2019-01-01T00:00:00ZTesting for stationarity of functional time series in the frequency domain
http://hdl.handle.net/2003/38206
Title: Testing for stationarity of functional time series in the frequency domain
Authors: Aue, Alexander; van Delft, Anne
Abstract: Interest in functional time series has spiked in the recent past with papers covering both methodology and applications being published at a much increased pace. This article contributes to the research in this area by proposing a new stationarity test for functional time series based on frequency domain methods. The proposed test statistic is based on joint dimension reduction via functional principal components analysis across the spectral density operators at all Fourier frequencies, explicitly allowing for frequency-dependent levels of truncation to adapt to the dynamics of the underlying functional time series. The properties of the test are derived both under the null hypothesis of stationary functional time series and under the smooth alternative of locally stationary functional time series. The methodology is theoretically justified through asymptotic results. Evidence from simulation studies and an application to annual temperature curves suggests that the test works well in finite samples.2019-01-01T00:00:00ZA note on quadratic forms of stationary functional time series under mild conditions
http://hdl.handle.net/2003/38205
Title: A note on quadratic forms of stationary functional time series under mild conditions
Authors: van Delft, Anne
Abstract: We study the distributional properties of a quadratic form of a stationary functional time series under mild moment conditions. As an important application, we obtain consistency rates of estimators of spectral density operators and prove joint weak convergence to a vector of complex Gaussian random operators. Weak convergence is established based on an approximation of the form via transforms of Hilbert-valued martingale difference sequences. As a side-result, the distributional properties of the long-run covariance operator are established.2019-01-01T00:00:00ZSampling distributions of optimal portfolio weights and characteristics in low and large dimensions
http://hdl.handle.net/2003/38204
Title: Sampling distributions of optimal portfolio weights and characteristics in low and large dimensions
Authors: Bodnar, Taras; Dette, Holger; Parolya, Nestor; Thorsén, Erik
Abstract: Optimal portfolio selection problems are determined by the (unknown) parameters of
the data generating process. If an investor wants to realise the position suggested by an
optimal portfolio, he/she needs to estimate the unknown parameters and to take the
parameter uncertainty into account in the decision process. Most often, the parameters of interest
are the population mean vector and the population covariance matrix of the asset return
distribution. In this paper we characterise the exact sampling distribution of the
estimated optimal portfolio weights and their characteristics by deriving their sampling
distribution, which is presented in terms of a stochastic representation. This approach possesses
several advantages: (i) it determines the sampling distribution of the estimated
optimal portfolio weights by expressions which can be used to draw samples from this
distribution efficiently; (ii) the application of the derived stochastic representation provides
an easy way to obtain an asymptotic approximation of the sampling distribution.
The latter property is used to show that the high-dimensional asymptotic distribution
of optimal portfolio weights is multivariate normal and to determine its parameters.
Moreover, a consistent estimator of optimal portfolio weights and their characteristics
is derived under the high-dimensional settings. Via an extensive simulation study, we
investigate the finite-sample performance of the derived asymptotic approximation and
study its robustness to violations of the model assumptions used in the derivation of
the theoretical results.2019-01-01T00:00:00ZIdentifying shifts between two regression curves
http://hdl.handle.net/2003/38196
Title: Identifying shifts between two regression curves
Authors: Dette, Holger; Sankar Dhar, Subhra; Wu, Weichi
Abstract: This article studies whether two convex (concave) regression functions
modelling the relation between a response and a covariate in two samples differ by a shift
in the horizontal and/or vertical axis. We consider a nonparametric situation assuming
only smoothness of the regression functions. A graphical tool based on the derivatives
of the regression functions and their inverses is proposed to answer this question and
studied in several examples. We also formalize this question in a corresponding hypothesis
and develop a statistical test. The asymptotic properties of the corresponding
test statistic are investigated under the null hypothesis and local alternatives. In contrast
to most of the literature on comparing shape invariant models, which requires
independent data, the procedure is applicable to dependent and non-stationary data.
We also illustrate the finite sample properties of the new test by means of a small
simulation study and a real data example.2019-01-01T00:00:00ZPrediction in regression models with continuous observations
http://hdl.handle.net/2003/38195
Title: Prediction in regression models with continuous observations
Authors: Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly
Abstract: We consider the problem of predicting values of a random process or field satisfying a linear model y(x) = θ⊤f(x) + ε(x), where the errors ε(x) are correlated. This is a common problem in kriging, where the case of discrete observations is standard. By focussing on the case of continuous observations, we derive expressions for the best linear unbiased predictors and their mean squared error. Our results are also applicable in the case where the derivatives of the process y are available, and either a response or one of its derivatives needs to be predicted. The theoretical results are illustrated by several examples, in particular for the popular Matérn 3/2 kernel.2019-01-01T00:00:00ZVolatility forecasting accuracy for Bitcoin
http://hdl.handle.net/2003/38165
Title: Volatility forecasting accuracy for Bitcoin
Authors: Köchling, Gerrit; Schmidtke, Philipp; Posch, Peter N.
Abstract: We analyse the quality of Bitcoin volatility forecasting of GARCH-type
models applying the commonly used volatility proxy based on squared daily
returns as well as a jump-robust proxy based on intra-day returns and vary
the degrees of asymmetry in robust loss functions. We construct model
confidence sets (MCS) which contain superior models with a high probability
and find them to be systematically smaller for asymmetric loss functions
and the jump-robust proxy. Our findings suggest a cautious use of GARCH
models in forecasting Bitcoin's volatility.2019-01-01T00:00:00ZOptimal designs for estimating individual coefficients in polynomial regression with no intercept
http://hdl.handle.net/2003/38137
Title: Optimal designs for estimating individual coefficients in polynomial regression with no intercept
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In a seminal paper Studden (1968) characterized c-optimal designs in regression
models, where the regression functions form a Chebyshev system. He used these
results to determine the optimal design for estimating the individual coefficients in a
polynomial regression model on the interval [-1, 1] explicitly. In this note we identify
the optimal design for estimating the individual coefficients in a polynomial regression
model with no intercept (here the regression functions do not form a Chebyshev
system).2019-01-01T00:00:00ZFinancial risk measures for a network of individual agents holding portfolios of light-tailed objects
http://hdl.handle.net/2003/38088
Title: Financial risk measures for a network of individual agents holding portfolios of light-tailed objects
Authors: Klüppelberg, Claudia; Seifert, Miriam Isabel
Abstract: We investigate a financial network of agents holding portfolios of independent
light-tailed risky objects whose losses are asymptotically exponentially
distributed with distinct tail parameters. We show that the
asymptotic distributions of portfolio losses belong to the class of functional
exponential mixtures which we introduce in this paper. We also
provide statements for Value-at-Risk and Expected Shortfall risk measures
as well as for their conditional counterparts. Compared to heavy-tailed
settings we establish important qualitative differences in the asymptotic
behavior of portfolio risks under a light-tail assumption, which have
to be accounted for in practical risk management.2019-01-01T00:00:00ZA new approach for open-end sequential change point monitoring
http://hdl.handle.net/2003/38081
Title: A new approach for open-end sequential change point monitoring
Authors: Gösmann, Josua; Kley, Tobias; Dette, Holger
Abstract: We propose a new sequential monitoring scheme for changes in the parameters of
a multivariate time series. In contrast to procedures proposed in the literature which
compare an estimator from the training sample with an estimator calculated from the
remaining data, we suggest dividing the sample at each time point after the training
sample. Estimators from the samples before and after all separation points are then
continuously compared by calculating the maximum of the norms of their differences. For open-end
scenarios our approach yields an asymptotic level α procedure, which is consistent
under the alternative of a change in the parameter.2019-01-01T00:00:00ZWirtschaftliche Aktivität und Emissionen: Die Umweltkuznetskurve
http://hdl.handle.net/2003/38076
Title: Wirtschaftliche Aktivität und Emissionen: Die Umweltkuznetskurve
Authors: Wagner, Martin; Knorre, Fabian
Abstract: Since the beginning of the industrial revolution, the mean global temperature has risen by about
one degree Celsius. There is no doubt that this increase is also substantially driven
by human activities - by emissions of carbon dioxide
and other greenhouse gases. What do the relationships between economic
activity and emissions look like? Do emissions necessarily rise with growing
economic activity? In this chapter we want to shed light on some fundamental
problems that arise in the statistical - strictly speaking, econometric - analysis of these
relationships. These problems are symptomatic of relationships in economics
and one reason why econometrics has established itself as a
discipline in its own right.2019-01-01T00:00:00ZLimit theorems for locally stationary processes
http://hdl.handle.net/2003/38046
Title: Limit theorems for locally stationary processes
Authors: Kawka, Rafael
Abstract: We present limit theorems for locally stationary processes that have a one-sided
time-varying moving average representation. In particular, we prove a central limit
theorem (CLT), a weak and a strong law of large numbers (WLLN, SLLN) and a
law of the iterated logarithm (LIL) under mild assumptions that are closely related
to those originally imposed by Dahlhaus and Polonik (2006).2019-01-01T00:00:00ZSome explicit solutions of c-optimal design problems for polynomial regression
http://hdl.handle.net/2003/38039
Title: Some explicit solutions of c-optimal design problems for polynomial regression
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In this paper we consider the optimal design problem for extrapolation and estimation
of the slope at a given point, say z, in a polynomial regression with no intercept.
We provide explicit solutions of these problems in many cases and characterize those
values of z, where this is not possible.2019-01-01T00:00:00ZOn scale estimation under shifts in the mean
http://hdl.handle.net/2003/38014
Title: On scale estimation under shifts in the mean
Authors: Axt, Ieva; Fried, Roland
Abstract: In many situations it is crucial to estimate the variance properly. Ordinary variance estimators
perform poorly in the presence of shifts in the mean. We investigate an approach
based on non-overlapping blocks, which yields good results in this change-point scenario.
We show the strong consistency and the asymptotic normality of such blocks-estimators
of the variance under rather general conditions. For estimation of the standard deviation
a blocks-estimator based on average standard deviations turns out to be preferable to
the square root of the average variances. We provide recommendations on the appropriate
choice of the block size and compare this blocks-approach with difference-based
estimators. If level shifts occur rather frequently even better results can be obtained by
adaptive trimming of the blocks under the assumption of normality.2019-01-01T00:00:00ZOptimal designs for model averaging in non-nested models
http://hdl.handle.net/2003/37979
Title: Optimal designs for model averaging in non-nested models
Authors: Alhorn, Kira; Dette, Holger; Schorning, Kirsten
Abstract: In this paper we construct optimal designs for frequentist model averaging estimation.
We derive the asymptotic distribution of the model averaging estimate with fixed weights
in the case where the competing models are non-nested and none of these models is correctly
specified. A Bayesian optimal design minimizes an expectation of the asymptotic
mean squared error of the model averaging estimate calculated with respect to a suitable
prior distribution. We demonstrate that Bayesian optimal designs can improve the
accuracy of model averaging substantially. Moreover, the derived designs also improve
the accuracy of estimation in a model chosen by model selection, as well as that of model averaging
estimates with random weights.2019-01-01T00:00:00ZWTA-WTP disparity: The role of perceived realism of the valuation setting
http://hdl.handle.net/2003/37944
Title: WTA-WTP disparity: The role of perceived realism of the valuation setting
Authors: Frondel, Manuel; Sommer, Stephan; Tomberg, Lukas
Abstract: Based on a survey among more than 5,000 German households and a single-binary
choice experiment in which we randomly split the respondents into two groups, this
paper elicits both households’ willingness to pay (WTP) for power supply security
and their willingness to accept (WTA) compensations for a reduced security level.
In accord with numerous empirical studies, we find that the mean WTA value substantially
exceeds the mean WTP bid, in our empirical example by a factor of 3.56.
Yet, the WTA-WTP ratio decreases to 2.35 among respondents who believe that the
hypothetical valuation setting is likely to become true. Conversely, the WTA-WTP
ratio increases to 3.81 among respondents who deem the setting unlikely. Given this
discrepancy, we conclude that to diminish the WTA-WTP disparity resulting from
stated-preference surveys at least to some extent, inquiring about respondents’ perception
of the realism of the valuation setting is an essential element of any survey
design.2019-01-01T00:00:00ZEmployee representation and innovation – disentangling the effect of legal and voluntary representation institutions in Germany
http://hdl.handle.net/2003/37916
Title: Employee representation and innovation – disentangling the effect of legal and voluntary representation institutions in Germany
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: This paper studies the effect of employee representation bodies provided by management on product and process innovations. In contrast to statutory forms of co-determination such as works councils, participative practices initiated by management are not equipped with any legally granted rights at all. Such alternative forms of employee representation are far less frequently and thoroughly analyzed than works councils. We compare the effects of these co-determination institutions established voluntarily with those initiated on a legal basis on different kinds of innovation measures. We differentiate between process and product (incremental and radical) innovations. To tackle endogeneity, the estimations are based on recursive bivariate and multivariate probit models. Results show that employee representation provided voluntarily by management supports incremental as well as radical product and process innovations. The effect is much more pronounced when endogeneity is taken into account. Works councils, however, only exhibit a positive effect on incremental innovations. Moreover, the results point to a substitutive relationship between both types of employee representation.2019-01-01T00:00:00ZEquivalence of regression curves sharing common parameters
http://hdl.handle.net/2003/37915
Title: Equivalence of regression curves sharing common parameters
Authors: Möllenhoff, Kathrin; Bretz, Frank; Dette, Holger
Abstract: In clinical trials the comparison of two different populations is a frequently addressed
problem. Non-linear (parametric) regression models are commonly used to
describe the relationship between covariates such as the dose and a response variable in
the two groups. In some situations it is reasonable to assume some model parameters
to be the same, for instance the placebo effect or the maximum treatment effect. In
this paper we develop a (parametric) bootstrap test to establish the similarity of two
regression curves sharing some common parameters. We show by theoretical arguments
and by means of a simulation study that the new test controls its level and
achieves a reasonable power. Moreover, it is demonstrated that under the assumption
of common parameters a considerably more powerful test can be constructed compared
to the test which does not use this assumption. Finally, we illustrate potential
applications of the new methodology by a clinical trial example.2019-01-01T00:00:00ZThe empirical process of residuals from an inverse regression
http://hdl.handle.net/2003/37904
Title: The empirical process of residuals from an inverse regression
Authors: Kutta, Tim; Bissantz, Nicolai; Chown, Justin; Dette, Holger
Abstract: In this paper we investigate an indirect regression model characterized by the
Radon transformation. This model is useful for recovery of medical images obtained by computed tomography scans. The indirect regression function is estimated using a series estimator
motivated by a spectral cut-off technique. Further, we investigate the empirical process of
residuals from this regression, and show that it satisfies a functional central limit theorem.2019-01-01T00:00:00ZGeneralized sign tests based on sign depth
http://hdl.handle.net/2003/37839
Title: Generalized sign tests based on sign depth
Authors: Leckey, Kevin; Malcherczyk, Dennis; Müller, Christine H.
Abstract: We introduce generalized sign tests based on K-sign depth, shortly denoted
by K-depth. These so-called K-depth tests are motivated by simplicial regression
depth. Since they depend only on the signs of the residuals, these test statistics
are easy to comprehend and outlier robust. We show that the K-depth test with
K = 2 is equivalent to the classical sign test, so that K-depth tests with K > 2
are generalizations of the classical sign test. Being equivalent to the classical
sign test, the case K = 2 shares its drawbacks, whereas the generalized sign tests
with K > 2 are much more powerful. We
show this by deriving their behavior at observations with few sign changes. Thereby
we also prove an upper bound for the K-depth which is attained by observations
with alternating signs of residuals. Furthermore, we prove the consistency of the K-
depth. Finally, we demonstrate the good power of the K-depth tests for relevance
testing, quadratic regression, and tests for explosive AR(2) and nonlinear AR(1)
regression.2018-01-01T00:00:00ZOptimal designs for series estimation in nonparametric regression with correlated data
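As a toy illustration of the K-sign depth (a naive O(n^K) enumeration of our own, not the authors' code), the depth of a residual vector is the fraction of increasing index tuples whose residual signs alternate. For K = 2 this counts opposite-sign pairs, so it depends on the data only through the number of positive residuals, which is exactly the sign-test statistic:

```python
from itertools import combinations
import numpy as np

def k_sign_depth(residuals, K):
    # Fraction of index tuples i1 < ... < iK whose residual signs
    # alternate, i.e. (+,-,+,...) or (-,+,-,...); naive enumeration,
    # only suitable for small samples.
    s = np.sign(residuals)
    tuples = list(combinations(range(len(s)), K))
    alternating = sum(
        all(s[t[j]] * s[t[j + 1]] < 0 for j in range(K - 1)) for t in tuples
    )
    return alternating / len(tuples)
```

For K = 2 the depth of any sample with two positive and two negative residuals is 4/6 regardless of their order, reflecting the equivalence with the classical sign test; for K > 2 the ordering of the signs matters.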
http://hdl.handle.net/2003/37836
Title: Optimal designs for series estimation in nonparametric regression with correlated data
Authors: Dette, Holger; Schorning, Kirsten; Konstantinou, Maria
Abstract: In this paper we investigate the problem of designing experiments for series estimators in nonparametric regression models with correlated observations. We use projection based estimators to derive an explicit solution of the best linear oracle estimator in the continuous time model for all Markovian-type error processes. These solutions are then used to construct estimators, which can be calculated from the available data along with their corresponding optimal design points. Our results are illustrated by means of a simulation study, which demonstrates that the new series estimator has a better performance than the commonly used techniques based on the optimal linear unbiased estimators. Moreover, we show that the performance of the estimators proposed in this paper can be further improved by choosing the design points appropriately.2018-01-01T00:00:00ZGoodness-of-fit testing the error distribution in multivariate indirect regression
http://hdl.handle.net/2003/37835
Title: Goodness-of-fit testing the error distribution in multivariate indirect regression
Authors: Chown, Justin; Bissantz, Nicolai; Dette, Holger
Abstract: We propose a goodness-of-fit test for the distribution of errors from a multivariate
indirect regression model. The test statistic is based on the Khmaladze transformation of the
empirical process of standardized residuals. This goodness-of-fit test is consistent at the root-n
rate of convergence, and the test can maintain power against local alternatives converging to
the null at a root-n rate.2018-01-01T00:00:00ZA similarity measure for second order properties of non-stationary functional time series with applications to clustering and testing
http://hdl.handle.net/2003/37828
Title: A similarity measure for second order properties of non-stationary functional time series with applications to clustering and testing
Authors: van Delft, Anne; Dette, Holger
Abstract: Due to the surge of data storage techniques, the need for the development of appropriate techniques to identify patterns and to extract knowledge from the resulting enormous data sets, which can be viewed as collections of dependent functional data, is of increasing interest in many scientific areas. We develop a similarity measure for spectral density operators of a collection of functional time series, which is based on the aggregation of Hilbert-Schmidt differences of the individual time-varying spectral density operators. Under fairly general conditions, the asymptotic properties of the corresponding estimator are derived and asymptotic normality is established. The introduced statistic lends itself naturally to quantifying (dis)similarity between functional time series, which we subsequently exploit in order to build a spectral clustering algorithm. Our algorithm is the first of its kind in the analysis of non-stationary (functional) time series and enables one to discover particular patterns by grouping together ‘similar’ series into clusters, thereby reducing the complexity of the analysis considerably. The algorithm is simple to implement and computationally feasible. As a further application we provide a simple test for the hypothesis that the second order properties of two non-stationary functional time series coincide.2018-01-01T00:00:00ZAliasing effects for random fields over spheres of arbitrary dimension
http://hdl.handle.net/2003/37827
Title: Aliasing effects for random fields over spheres of arbitrary dimension
Authors: Durastanti, Claudio; Patschkowski, Tim
Abstract: In this paper, aliasing effects are investigated for random fields defined on the d-dimensional
sphere Sd and reconstructed from discrete samples. First, we introduce the concept of an aliasing function
on Sd. The aliasing function allows one to identify explicitly the aliases of a given harmonic coefficient in
the Fourier decomposition. Then, we exploit this tool to establish the aliases of the harmonic coefficients approximated by means of the quadrature procedure named spherical uniform sampling. Subsequently, we
study the consequences of the aliasing errors in the approximation of the angular power spectrum of an isotropic random field, the harmonic decomposition of its covariance function. Finally, we show that band-limited random fields are alias-free, under the assumption of a sufficiently large number of nodes in the quadrature rule.2018-01-01T00:00:00ZIncreased market transparency in Germany’s gasoline market: The death of rockets and feathers?
http://hdl.handle.net/2003/37826
Title: Increased market transparency in Germany’s gasoline market: The death of rockets and feathers?
Authors: Frondel, Manuel; Horvath, Marco; Vance, Colin; Kihm, Alexander
Abstract: Drawing on a consumer search model and a unique panel data set of daily
fuel prices covering over 5,000 fuel stations in Germany, this paper documents a
change in the price setting behavior of retail gas stations following the introduction of
a legally mandated on-line price portal. Prior to the introduction of the portal in 2013,
positive asymmetry is found on the basis of error correction models, with prices following
the “rockets and feathers” pattern documented in many commodity markets,
particularly in retail markets for fuels. In the aftermath of the portal’s introduction, by
contrast, negative asymmetry is observed: fuel price decreases in response to refinery
price decreases are stronger than fuel price increases due to refinery price increases.
This reversal in price pass-through, which is found among both branded and unbranded
stations, suggests welfare gains for consumers from increased market transparency.2018-01-01T00:00:00ZStatistical analysis of the lifetime of diamond impregnated tools for core drilling of concrete
http://hdl.handle.net/2003/37814
Title: Statistical analysis of the lifetime of diamond impregnated tools for core drilling of concrete
Authors: Malevich, Nadja; Müller, Christine H.; Kansteiner, Michael; Biermann, Dirk; Ferreira, Manuel; Tillmann, Wolfgang
Abstract: The lifetime of diamond impregnated tools for core drilling of concrete
is studied via the lifetimes of the single diamonds on the tool. Thereby, the number
of visible and active diamonds on the tool surface is determined by microscopical
inspections of the tool at given points in time. This leads to interval-censored lifetime
data if only the diamonds visible at the beginning are considered. If also the
lifetimes of diamonds appearing during the drilling process are included then the
lifetimes are doubly interval-censored. A statistical method is presented to analyse
the interval-censored data as well as the doubly interval-censored data. The method
is applied to three series of experiments which differ in the size of the diamonds
and the type of concrete. It turns out that the lifetimes of small diamonds used for
drilling into conventional concrete are much shorter than the lifetimes when using
large diamonds or high strength concrete.2018-01-01T00:00:00ZDetection of anomalous sequences in crack data of a bridge monitoring
http://hdl.handle.net/2003/37813
Title: Detection of anomalous sequences in crack data of a bridge monitoring
Authors: Abbas, Sermad; Fried, Roland; Heinrich, Jens; Horn, Melanie; Jakubzik, Mirko; Kohlenbach, Johanna; Maurer, Reinhard; Michels, Anne; Müller, Christine H.
Abstract: For estimating the remaining lifetime of old prestressed concrete bridges,
a monitoring of crack widths can be used. However, the time series of crack widths
show a strong variation mainly caused by temperature and traffic. Additionally, sequences
with extreme volatility appear where the cause is unknown. They are called
anomalous sequences in the following. We present and compare four methods which
aim to detect these anomalous sequences in the time series. Volatilities caused by
traffic should not be detected.2018-01-01T00:00:00ZMultiscale change point detection for dependent data
http://hdl.handle.net/2003/37806
Title: Multiscale change point detection for dependent data
Authors: Dette, Holger; Schüler, Theresa; Vetter, Mathias
Abstract: In this paper we study the theoretical properties of the simultaneous multiscale change
point estimator (SMUCE) proposed by Frick et al. (2014) in regression models with dependent
error processes. Empirical studies show that in this case the change point estimate
is inconsistent, but it is not known if alternatives suggested in the literature for correlated
data are consistent. We propose a modification of SMUCE scaling the basic statistic by
the long run variance of the error process, which is estimated by a difference-type variance
estimator calculated from local means from different blocks. For this modification we prove
model consistency for physically dependent error processes and illustrate the finite sample
performance by means of a simulation study.2018-01-01T00:00:00ZPanel cointegrating polynomial regressions: Group-mean fully modified OLS estimation and inference
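A minimal sketch of a difference-type long-run variance estimator built from local block means (a simplified variant of our own; the paper's exact construction may differ) rescales averaged squared differences of adjacent block means. Differencing removes a locally common mean level, and robustness against large level shifts would additionally require, e.g., trimming the largest squared differences:

```python
import numpy as np

def diff_block_lrv(x, block_size):
    # Long-run variance estimate from differences of adjacent
    # non-overlapping block means: for weakly dependent data,
    # Var(block mean) is approximately lrv / block_size, and a
    # difference of adjacent block means has twice that variance.
    n_blocks = len(x) // block_size
    blocks = np.array_split(x[:n_blocks * block_size], n_blocks)
    means = np.array([b.mean() for b in blocks])
    d = np.diff(means)
    return block_size * np.mean(d ** 2) / 2.0

rng = np.random.default_rng(42)
x = rng.standard_normal(100_000)  # i.i.d. N(0, 1): long-run variance 1
```

For the i.i.d. sample above the estimate is close to 1, and for a constant series it is exactly 0, since all block means coincide.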
http://hdl.handle.net/2003/37669
Title: Panel cointegrating polynomial regressions: Group-mean fully modified OLS estimation and inference
Authors: Wagner, Martin; Reichold, Karsten
Abstract: This paper considers group-mean fully modified OLS estimation for a panel of cointegrating
polynomial regressions, i.e., regressions that include an integrated process and its powers as
explanatory variables. The stationary errors are allowed to be serially correlated, the regressor
to be endogenous and, as usual in the nonstationary panel literature, we include individual
specific fixed effects. We consider a fixed cross-section dimension, asymptotics in the time
dimension only and show that the estimator allows for standard asymptotic inference in this
setting. In both the simulations as well as an illustrative application estimating environmental
Kuznets curves for carbon dioxide emissions we compare our group-mean estimator with the
pooled fully modified OLS estimator of de Jong and Wagner (2018).2018-01-01T00:00:00ZConsistency for the negative binomial regression with fixed covariate
http://hdl.handle.net/2003/37352
Title: Consistency for the negative binomial regression with fixed covariate
Authors: Weißbach, Rafael; Radloff, Lucas
Abstract: We model an overdispersed count as a dependent measurement, by means of
the Negative Binomial distribution. We consider quantitative regressors that
are fixed by design. The expectation of the dependent variable is assumed to
be a known function of a linear combination involving regressors and their coefficients. In the NB1-parametrization of the negative binomial distribution,
the variance is a linear function of the expectation, inflated by the dispersion
parameter, so that the model is not a generalized linear model. We apply a general result of
Bradley and Gart (1962) to derive weak consistency and asymptotic normality of the maximum likelihood estimator for all parameters. To this end, we
show (i) how to bound the logarithmic density by a function that is linear
in the outcome of the dependent variable, independently of the parameter.
Furthermore (ii) the positive definiteness of the matrix related to the Fisher
information is shown with the Cauchy-Schwarz inequality.2018-01-01T00:00:00ZUsing the extremal index for value-at-risk backtesting
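For illustration, the NB1 variance structure Var(Y) = (1 + alpha) * E[Y] can be checked by simulation, mapping (mu, alpha) to numpy's (n, p) parametrization of the negative binomial (number of failures before the n-th success). The mapping is standard, but the helper name is ours:

```python
import numpy as np

def nb1_to_numpy(mu, alpha):
    # NB1: Var(Y) = (1 + alpha) * mu, i.e. variance linear in the mean.
    # numpy's negative_binomial(n, p) counts failures before the n-th
    # success, with mean n*(1-p)/p and variance n*(1-p)/p**2.
    n = mu / alpha
    p = 1.0 / (1.0 + alpha)
    return n, p

rng = np.random.default_rng(0)
mu, alpha = 4.0, 1.5
n, p = nb1_to_numpy(mu, alpha)
y = rng.negative_binomial(n, p, size=200_000)
# Empirical mean should be close to mu = 4 and the empirical
# variance close to (1 + alpha) * mu = 10.
```

Plugging the mapping back in confirms it: mean = n(1-p)/p = (mu/alpha) * alpha = mu, and variance = mean/p = mu(1 + alpha).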
http://hdl.handle.net/2003/37201
Title: Using the extremal index for value-at-risk backtesting
Authors: Bücher, Axel; Posch, Peter N.; Schmidtke, Philipp
Abstract: We introduce a set of new Value-at-Risk independence backtests by establishing a
connection between the independence property of Value-at-Risk forecasts and the
extremal index, a general measure of extremal clustering of stationary sequences.
We introduce a sequence of relative excess returns whose extremal index has to
be estimated. We compare our backtest to both popular and recent competitors
using Monte-Carlo simulations and find considerable power in many scenarios.
In an applied section we perform realistic out-of-sample forecasts with common
forecasting models and discuss advantages and pitfalls of our approach.2018-01-01T00:00:00ZSwitching to green electricity: Spillover effects on household consumption
http://hdl.handle.net/2003/37200
Title: Switching to green electricity: Spillover effects on household consumption
Authors: Sommer, Stephan
Abstract: One way to reduce emissions from the consumption of electricity is switching to
green electricity suppliers. This paper identifies the determinants of adopting green electricity
and the effect on electricity consumption, using panel data on more than 9,000
households. To control for potential self-selection into green electricity tariffs, an endogenous
dummy treatment effects model is estimated. The results suggest that wealthier
and better-educated households are more likely to adopt green electricity. Moreover, we
find that switching to green electricity decreases electricity consumption and households
supplied by green electricity are less price-responsive. Consequently, enforcing higher
prices for conventional electricity might prove effective in reducing both greenhouse gas
emissions and electricity consumption at the household level.2018-01-01T00:00:00ZPanel cointegrating polynomial regression analysis and the environmental Kuznets curve
http://hdl.handle.net/2003/37148
Title: Panel cointegrating polynomial regression analysis and the environmental Kuznets curve
Authors: de Jong, Robert M.; Wagner, Martin
Abstract: This paper develops a modified and a fully modified OLS estimator for a panel of cointegrating
polynomial regressions, i.e. regressions that include an integrated process and its powers
as explanatory variables. The stationary errors are allowed to be serially correlated and the
regressors are allowed to be endogenous and we allow for individual and time fixed effects. Inspired
by Phillips and Moon (1999) we consider a cross-sectional i.i.d. random linear process
framework. The modified OLS estimator utilizes the large cross-sectional dimension, which allows one
to consistently estimate and subtract an additive bias term without the need to also transform
the dependent variable as required in fully modified OLS estimation. Both developed estimators
have zero mean Gaussian limiting distributions and thus allow for standard asymptotic inference.
Our illustrative application indicates that the developed methods are a potentially useful
addition to the toolkit of, not least, the environmental Kuznets curve literature.2018-01-01T00:00:00ZCombining uncertainty with uncertainty to get certainty? Efficiency analysis for regulation purposes
http://hdl.handle.net/2003/37146
Title: Combining uncertainty with uncertainty to get certainty? Efficiency analysis for regulation purposes
Authors: Andor, Mark; Parmeter, Christopher; Sommer, Stephan
Abstract: Data envelopment analysis (DEA) and stochastic frontier analysis (SFA),
as well as combinations thereof, are widely applied in incentive regulation
practice, where the assessment of efficiency plays a major role in regulation
design and benchmarking. Using a Monte Carlo simulation experiment,
this paper compares the performance of six alternative methods commonly
applied by regulators. Our results demonstrate that combination approaches,
such as taking the maximum or the mean over DEA and SFA efficiency
scores, have certain practical merits and might offer a useful alternative
to strict reliance on a singular method. In particular, the results highlight
that taking the maximum not only minimizes the risk of underestimation,
but can also improve the precision of efficiency estimation. Based on our results,
we give recommendations for the estimation of individual efficiencies
for regulation purposes and beyond.2018-01-01T00:00:00ZTesting relevant hypotheses in functional time series via self-normalization
http://hdl.handle.net/2003/37138
Title: Testing relevant hypotheses in functional time series via self-normalization
Authors: Dette, Holger; Kokot, Kevin; Volgushev, Stanislav
Abstract: In this paper we develop methodology for testing relevant hypotheses in a tuning-free
way. Our main focus is on functional time series, but extensions to other settings are also
discussed. Instead of testing for exact equality, for example for the equality of two mean
functions from two independent time series, we propose to test a relevant deviation under
the null hypothesis. In the two sample problem this means that an L2-distance between
the two mean functions is smaller than a pre-specified threshold. For such hypotheses
self-normalization, which was introduced by Shao (2010) and Shao and Zhang (2010) and
is commonly used to avoid the estimation of nuisance parameters, is not directly applicable.
We develop new self-normalized procedures for testing relevant hypotheses in the one
sample, two sample and change point problem and investigate their asymptotic properties.
Finite sample properties of the proposed tests are illustrated by means of a simulation study
and a data example.2018-01-01T00:00:00ZOptimal designs for inspection times of interval-censored data
http://hdl.handle.net/2003/37137
Title: Optimal designs for inspection times of interval-censored data
Authors: Malevich, Nadja; Müller, Christine H.
Abstract: We treat optimal equidistant and optimal non-equidistant inspection
times for interval-censored data with exponential distribution. We provide
in particular a recursive formula for calculating the optimal non-equidistant
inspection times which is similar to a formula for optimal spacing of quantiles
for asymptotically best linear estimates based on order statistics. This formula
provides an upper bound for the standardized Fisher information which
is reached for the optimal non-equidistant inspection times if the number of
inspections is converging to infinity. The same upper bound is also shown for
the optimal equidistant inspection times. Since optimal equidistant inspection
times are easier to calculate and easier to handle in practice, we study the
efficiency of optimal equidistant inspection times with respect to optimal non-equidistant
inspection times. Moreover, since the optimal inspection times are
only locally optimal, we provide also some results concerning maximin efficient
designs.2018-01-01T00:00:00ZOn second order conditions in the multivariate block maxima and peak over threshold method
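The Fisher information underlying such design comparisons can be evaluated numerically. The sketch below (our own illustration, not the authors' formulas) computes the per-item information about lambda of an Exp(lambda) lifetime observed only through which of the cells (0, d], (d, 2d], ..., (kd, inf) it falls into; as the spacing d shrinks with kd large, it approaches the uncensored information 1/lambda^2:

```python
import numpy as np

def fisher_info_equidistant(lam, delta, k):
    # Interval-censored Exp(lam) lifetime: inspections at delta,
    # 2*delta, ..., k*delta, with everything beyond k*delta lumped
    # into one tail cell. Fisher information = sum_j (dp_j/dlam)^2 / p_j.
    t = np.concatenate(([0.0], np.arange(1, k + 1) * delta, [np.inf]))
    surv = np.exp(-lam * t)                 # survival function at boundaries
    p = surv[:-1] - surv[1:]                # cell probabilities
    tf = np.where(np.isinf(t), 0.0, t)      # t * exp(-lam*t) -> 0 at infinity
    g = tf * np.exp(-lam * tf)
    dp = g[1:] - g[:-1]                     # derivative of p_j w.r.t. lam
    return np.sum(dp ** 2 / p)
```

Refining the inspection grid (more inspections at the same spacing) can only add information, since the finer censoring pattern is a refinement of the coarser one.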
http://hdl.handle.net/2003/37120
Title: On second order conditions in the multivariate block maxima and peak over threshold method
Authors: Bücher, Axel; Volgushev, Stanislav; Zou, Nan
Abstract: Second order conditions provide a natural framework for establishing asymptotic
results about estimators for tail related quantities. Such conditions are typically
tailored to the estimation principle at hand, and may be vastly different for estimators
based on the block maxima (BM) method or the peak-over-threshold (POT)
approach. In this paper we provide details on the relationship between typical second
order conditions for BM and POT methods in the multivariate case. We show that the
two conditions typically imply each other, but with a possibly different second order
parameter. The latter implies that, depending on the data generating process, one
of the two methods can attain faster convergence rates than the other. The class of
multivariate Archimax copulas is examined in detail; we find that this class contains
models for which the second order parameter is smaller for the BM method and vice
versa. The theory is illustrated by a small simulation study.2018-01-01T00:00:00ZThe Phillips unit root tests for polynomials of integrated processes
http://hdl.handle.net/2003/37119
Title: The Phillips unit root tests for polynomials of integrated processes
Authors: Stypka, Oliver; Wagner, Martin
Abstract: We show that the Phillips (1987) unit root tests have nuisance parameter free limiting
distributions when applied to polynomials of integrated processes driven by linear process errors.
This substantially generalizes a similar result of Wagner (2012) allowing only for serially
uncorrelated errors. The result is based on novel kernel weighted sum limit results involving powers
of integrated processes. These results also allow us to consider additional modifications of the
Phillips (1987) tests applicable to polynomials of integrated processes.2018-01-01T00:00:00Z