Eldorado Community: http://hdl.handle.net/2003/9 (2024-05-27)

Title: Online monitoring of dynamic networks using flexible multivariate control charts
URI: http://hdl.handle.net/2003/40730
Authors: Flossdorf, Jonathan; Fried, Roland; Jentsch, Carsten
Abstract: The identification of differences in dynamic networks between various time points is an important task and
involves statistical procedures like two-sample tests or changepoint detection. Due to the rather complex nature of temporal graphs, the analysis is challenging, which is why the complexity is typically reduced to a metric or some sort of model. This is not only likely to result in a loss of relevant information; common approaches also rely on restrictive assumptions and are therefore heavily limited in their usability. We propose an online monitoring approach that is usable for flexible network structures and able to handle various types of changes. It is based on a sound choice of a set of network characteristics, selected with regard to their mathematical properties, which is crucial in order to cover the relevant information. Subsequently, these metrics are jointly monitored in a suitable multivariate control chart scheme, which performs better than a univariate analysis and supports both parametric and non-parametric usage. The user also benefits from a convenient interpretation of the structural reasons for the detected changes, which is a crucial advantage in the rather complex field of dynamic networks. Our findings are supported by an extensive simulation study.
Date: 2022-01-01

Title: The influence of different diamond spacings in diamond impregnated tools on the wear behavior and material removal
URI: http://hdl.handle.net/2003/40729
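The joint monitoring of several network metrics described in the abstract above can be illustrated with a minimal Hotelling-T²-style control chart over two metrics. Everything here is an illustrative assumption (the two Gaussian in-control metrics, the chi-square control limit), not the authors' actual scheme:

```python
import random

def mean_cov2(data):
    """Sample mean and 2x2 covariance of a list of 2-d points."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    return (mx, my), (sxx, sxy, syy)

def t2(point, mean, cov):
    """Hotelling T^2 statistic for one new 2-d observation."""
    sxx, sxy, syy = cov
    det = sxx * syy - sxy * sxy
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    # quadratic form with the explicit inverse of the 2x2 covariance
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

random.seed(1)
# Phase I: in-control sample of two hypothetical network metrics
phase1 = [(random.gauss(10, 1), random.gauss(0.3, 0.05)) for _ in range(200)]
mean, cov = mean_cov2(phase1)

LIMIT = 9.21  # roughly the chi^2_2 0.99 quantile, a common asymptotic limit
in_control = t2((10.1, 0.31), mean, cov)   # stays below the limit
shifted = t2((13.0, 0.10), mean, cov)      # joint shift triggers a signal
```

A univariate chart on either metric alone can miss shifts that only show up in the metrics' joint behaviour, which is the motivation for the multivariate statistic.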
Authors: Dreier, Julia; Ferreira, Manuel; Malcherczyk, Dennis; Biermann, Dirk; Müller, Christine H.; Tillmann, Wolfgang
Abstract: The influence of the spacing of four diamonds on the breakout time and material
removal is investigated for a diamond impregnated tool for machining concrete
workpieces. A statistical analysis using the Cox model shows a positive effect of
larger spacings on the lifetime of the diamonds, whereas no effect on the material
removal can be found. Moreover, a relationship between the position of the diamond
and its lifetime is observed.
Date: 2022-01-01

Title: Model checks and simultaneous prediction bands for load sharing models in prestressed concrete beams
URI: http://hdl.handle.net/2003/40684
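The Cox-model lifetime analysis in the diamond-spacing abstract above can be sketched under strong simplifying assumptions (a single covariate, no censoring, Breslow partial likelihood, made-up data); this is an illustration, not the study's analysis:

```python
import math
import random

def cox_loglik(beta, times, x):
    """Breslow partial log-likelihood for a single covariate, no censoring."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll, risk = 0.0, 0.0
    # walk from the largest event time down, accumulating the risk-set sum
    for i in reversed(order):
        risk += math.exp(beta * x[i])
        ll += beta * x[i] - math.log(risk)
    return ll

def fit_beta(times, x, lo=-5.0, hi=5.0, iters=60):
    """Golden-section search for the partial-likelihood maximiser."""
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if cox_loglik(c, times, x) < cox_loglik(d, times, x):
            a = c
        else:
            b = d
    return (a + b) / 2

random.seed(2)
# hypothetical data: larger spacing -> lower hazard -> longer breakout time
spacing = [random.choice([0.0, 1.0, 2.0]) for _ in range(300)]
times = [random.expovariate(math.exp(-0.7 * s)) for s in spacing]
beta_hat = fit_beta(times, spacing)  # true simulated coefficient is -0.7
```

A negative fitted coefficient corresponds to the abstract's finding: larger spacings reduce the hazard and so lengthen diamond lifetimes.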
Authors: Leckey, Kevin; Heinrich, Jens; Müller, Christine H.; Maurer, Reinhard
Abstract: This article presents a new method to test whether a parametric model is capable
of describing data properly. It also introduces a simple procedure to generate simultaneous
prediction bands based on independent copies of a process. The performance
of these prediction bands, e.g. in a leave-one-out cross-validation, will also be used as
another indication of whether data is modeled properly. Both methods are applied
to data from fatigue experiments on prestressed concrete beam girders. These experiments
highlight a couple of different influences on the fatigue of such girders, namely
the so-called cable factor and the deflection force. Both effects are incorporated into
different load sharing models for component failures, which are then compared and used
to predict these failure times.
Date: 2022-01-01

Title: Pro-environmental behavior as a means of self-signaling: Theory and evidence
URI: http://hdl.handle.net/2003/40683
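The idea of simultaneous prediction bands built from independent copies of a process, as in the abstract above, can be sketched as follows: the band half-width is the empirical (1 − α) quantile of each copy's sup-deviation from the pointwise mean. The data and tuning values are illustrative assumptions, not the authors' procedure:

```python
import random

def sup_band(curves, alpha=0.1):
    """Simultaneous band: pointwise mean +/- the empirical (1-alpha)
    quantile of the sup-deviation of whole curves from that mean."""
    m, T = len(curves), len(curves[0])
    mean = [sum(c[t] for c in curves) / m for t in range(T)]
    sups = sorted(max(abs(c[t] - mean[t]) for t in range(T)) for c in curves)
    q = sups[min(m - 1, int((1 - alpha) * m))]  # conservative empirical quantile
    return mean, q

random.seed(3)
# hypothetical independent copies of a noisy degradation process
curves = [[0.1 * t + random.gauss(0, 1) for t in range(50)] for _ in range(80)]
mean, q = sup_band(curves, alpha=0.1)

# a curve lies entirely inside the band iff its sup-deviation is <= q,
# so by construction at least ~90% of the copies are fully covered
covered = sum(all(abs(c[t] - mean[t]) <= q for t in range(len(mean)))
              for c in curves)
```

Using the sup-deviation of whole curves (rather than pointwise quantiles) is what makes the band simultaneous: it controls the probability that an entire new copy leaves the band.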
Authors: Flörchinger, Daniela; Frondel, Manuel; Sommer, Stephan; Andor, Mark A.
Abstract: Recent research indicates that pro-environmental behavior may be driven by concerns about one’s moral identity. Using identification with the environmentalist movement Fridays for Future, this paper develops and empirically tests a straightforward model of self-signaling. We assume that pro-environmental behavior, here taking the train rather than the plane for a journey, serves as a means of self-signaling. On the basis of a large-scale survey experiment with revealed preferences, we find evidence that respondents who receive an identity prime in the form of a reminder of their previously stated attitude towards Fridays for Future are more likely to behave in line with the movement’s moral principles in that they take the train. Our explanation of this outcome is that individuals attempt to avoid cognitive dissonance by choosing the more environmentally benign alternative. Our results suggest that pro-environmental behavior may be enhanced by appealing to an individual’s self-image, so that costly interventions that are designed to convince subjects of new moral principles may be unnecessary.
Date: 2022-01-01

Title: Fairness and the support of redistributive environmental policies
URI: http://hdl.handle.net/2003/40652
Authors: Andor, Mark A.; Lange, Andreas; Sommer, Stephan
Abstract: Exemptions from costly policy measures are frequently applied to alleviate financial
burdens to specific market participants. Using a stated-choice experiment
with around 6,000 German household heads, we test how exemptions for low-income
households and energy-intensive companies influence the political acceptability
of additional cost for the promotion of renewable energies. We find that the
support for the policy is substantially higher when low-income households are
exempt rather than the industry. Introducing exemptions for low-income households
on top of existing exemptions for the industry increases the acceptability
of the policy. We show that the support for exemptions as one example of distributional
policy design is associated with individual behavioral measures like
inequality aversion and fairness perceptions.
Date: 2021-01-01

Title: Körperschallanalyse der Ermüdung von Spannbetonbauteilen
URI: http://hdl.handle.net/2003/40608
Authors: Dreier, Julia; Hafer, Marlies; Heinrich, Jens; Leckey, Kevin; Malcherczyk, Dennis; Maurer, Reinhard; Müller, Christine H.
Abstract: Fatigue failure of prestressed concrete components with tendons made of multi-wire strand bundles typically occurs wire by wire. Under laboratory conditions, the breaking of such individual prestressing wires can be captured acoustically very precisely. Here we investigate whether the breaking of the prestressing wires is already acoustically measurable shortly beforehand. This was done via a structure-borne sound analysis with 512 frequencies in the frequency range 0.003 MHz – 1.562 MHz. We describe one possible analysis, identify problems with it, and make suggestions for improved future analyses. A particular drawback was that measurements were only possible at fixed intervals, twice per hour, and then only for a matter of seconds. As a result, no relationship to the wire breaks could be established. However, a relationship to a stiffness parameter did emerge.
Date: 2021-01-01

Title: Fiscal policy, international spillovers, and endogenous productivity
URI: http://hdl.handle.net/2003/40607
Authors: Klein, Mathias; Linnemann, Ludger
Abstract: The paper presents empirical evidence on the international effects of US fiscal
policy from structural vector autoregressions identified through external instruments in a
panel setting for the G7 countries. An exogenous increase in US government spending is
estimated to produce sizeable positive responses of output and consumption in the rest of
the G7 countries, both about half as large as their domestic US counterparts, while strongly
depreciating the US terms of trade and lowering short-run real interest rates. Moreover,
fiscal shocks are estimated to have a strongly positive impact on hourly labor productivity
in the private sector. We solve a two-country New Keynesian model in closed form and
show that a low cost elasticity of varying technology utilization can simultaneously explain
the positive productivity, consumption and international spillover effects as well as the real
depreciation resulting from expansionary US government spending shocks.
Date: 2021-01-01

Title: Akzeptanz der CO2-Bepreisung in Deutschland: Die hohe Bedeutung der Rückverteilung der Einnahmen
URI: http://hdl.handle.net/2003/40606
Authors: Frondel, Manuel; Helmers, Viola; Mattauch, Linus; Pahle, Michael; Sommer, Stephan; Schmidt, Christoph M.; Edenhofer, Ottmar
Abstract: In 2021, Germany introduced so-called CO2 pricing of fossil motor and heating fuels in order to reduce their consumption for climate protection purposes. This price surcharge on fossil energy sources will be raised successively in the coming years. This article examines the acceptance of CO2 pricing in 2019, before the CO2 price was introduced. A survey of more than 6,000 households shows that a slim absolute majority of 53.7% of respondents is willing in principle to accept higher costs for the sake of climate protection. However, approval of CO2 pricing declines markedly with falling income: among respondents in the lowest income group, the approval rate is just under 40%. As expected, approval also declines with the level of the CO2 price; a CO2 price of 50 euros was rejected by a majority of 50.6% of respondents. To win majority acceptance among citizens as CO2 prices rise to 55 euros by 2025, we argue here for a broad-based compensation mechanism that reduces distorting and socially inequitable taxes and levies on the electricity price, which would particularly benefit low- and average-income earners. Otherwise, CO2 prices rising over time could develop considerable social explosiveness.
Date: 2021-01-01

Title: Efficiency gains in structural vector autoregressions by selecting informative higher-order moment conditions
URI: http://hdl.handle.net/2003/40578
Authors: Keweloh, Sascha Alexander; Hetzenecker, Stephan
Abstract: This study combines block-recursive restrictions with non-Gaussian and mean independent shocks
to derive identifying and overidentifying higher-order moment conditions for structural vector
autoregressions. We show that overidentifying higher-order moments can contain additional
information and increase the efficiency of the estimation. In particular, we prove that in the
non-Gaussian recursive SVAR, higher-order moment conditions are relevant and, therefore, the
frequently applied estimator based on the Cholesky decomposition is inefficient. Even though
incorporating information in valid higher-order moments is asymptotically efficient, including
many redundant and potentially even invalid moment conditions renders standard SVAR GMM
estimators unreliable in finite samples. We apply a LASSO-type GMM estimator to select the
relevant and valid higher-order moment conditions, increasing finite sample precision. A Monte
Carlo experiment and an application to quarterly U.S. data illustrate the improved performance
of the proposed estimator.
Date: 2021-01-01

Title: Approximation and error analysis of forward-backward SDEs driven by general Lévy processes using shot noise series representations
URI: http://hdl.handle.net/2003/40577
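The Cholesky-based estimator that the abstract above shows to be inefficient identifies a recursive SVAR by factoring the reduced-form residual covariance as Σ = BB′ with B lower triangular. A minimal sketch with made-up numbers:

```python
def chol(S):
    """Lower-triangular Cholesky factor of a symmetric
    positive-definite matrix given as nested lists."""
    n = len(S)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = ((S[i][i] - s) ** 0.5 if i == j
                       else (S[i][j] - s) / L[j][j])
    return L

# hypothetical reduced-form residual covariance of a 3-variable VAR
Sigma = [[4.0, 1.2, 0.6],
         [1.2, 2.5, 0.8],
         [0.6, 0.8, 1.9]]

B = chol(Sigma)  # recursive impact matrix: Sigma = B B'
```

The zero pattern of B encodes the recursive ordering: shock j has no contemporaneous effect on variables ordered before it. Under non-Gaussian shocks, as the abstract argues, this second-moment-only estimator discards the information in higher moments.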
Authors: Massing, Till
Abstract: We consider the simulation of a system of decoupled forward-backward stochastic differential
equations (FBSDEs) driven by a pure jump Lévy process L and an independent Brownian motion
B. We allow the Lévy process L to have an infinite jump activity. Therefore, it is necessary for the
simulation to employ a finite approximation of its Lévy measure. We use the generalized shot noise
series representation method by Rosinski (2001) to approximate the driving Lévy process L. We
compute the Lp error, p > 2, between the true and the approximated FBSDEs which arises from
the finite truncation of the shot noise series (given sufficient conditions for existence and uniqueness
of the FBSDE). We also derive the Lp error between the true solution and the discretization of the
approximated FBSDE using an appropriate backward Euler scheme.
Date: 2021-01-01

Title: K-depth tests for testing simultaneously independence and other model assumptions in time series
URI: http://hdl.handle.net/2003/40549
Authors: Dohme, Hendrik; Malcherczyk, Dennis; Leckey, Kevin; Müller, Christine
Abstract: We consider the recently developed K-depth tests for testing simultaneously independence and other model assumptions for univariate time series with a potentially related d-dimensional process of explanatory variables. Since these tests are based only on the signs of residuals, they are easy to comprehend. They can be used in a full version and in a simplified version. While former investigations already showed
that the full version is appropriate for testing model assumptions, we concentrate here on either testing the independence assumption on its own or simultaneously testing independence and model assumptions with both types of tests. In an extensive simulation study, we compare these tests with several known independence tests such as the runs test, the Durbin-Watson test, and the von Neumann rank ratio test. Finally, we demonstrate how the K-depth tests can be used for improved modelling of crack width time series depending on temperature measurements in bridge monitoring.
Date: 2021-01-01

Title: Block-recursive non-Gaussian structural vector autoregressions
URI: http://hdl.handle.net/2003/40548
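Of the classical independence tests used for comparison above, the runs test is the simplest to sketch: it counts runs of equal residual signs and standardizes the run count. A minimal version, assuming ±1 signs with no zeros (an illustration, not the K-depth test itself):

```python
import math

def runs_test_z(signs):
    """Wald-Wolfowitz runs test on a +/-1 sign sequence.
    Large |z| indicates departure from independence
    (z > 0: too many runs, i.e. alternation; z < 0: clustering)."""
    n1 = sum(1 for s in signs if s > 0)
    n2 = len(signs) - n1
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)
    n = n1 + n2
    mu = 1 + 2 * n1 * n2 / n                              # E[runs]
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n * n * (n - 1))
    return (runs - mu) / math.sqrt(var)

z_alt = runs_test_z([+1, -1] * 20)             # perfectly alternating signs
z_block = runs_test_z([+1] * 20 + [-1] * 20)   # one long run of each sign
```

Both extreme patterns yield |z| far beyond usual critical values, while an i.i.d. sign sequence gives z roughly standard normal.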
Authors: Keweloh, Sascha Alexander; Hetzenecker, Stephan; Seepe, Andre
Abstract: This study combines block-recursive restrictions with higher-order moment conditions to identify
and estimate non-Gaussian structural vector autoregressions. The estimator makes it possible to impose
a block-recursive structure on the SVAR, and for a given block-recursive structure we derive
a conservative set of assumptions on the dependence and Gaussianity of the shocks to ensure
identification. We use a Monte Carlo simulation to illustrate the advantages of the proposed block-recursive
estimator compared to unrestricted, purely data-driven estimators in small samples. The
block-recursive estimator is used to analyze the interdependence of monetary policy and the stock
market. We find that a positive stock market shock contemporaneously increases the nominal
interest rate, while contractionary monetary policy shocks lead to lower stock returns on impact.
Date: 2021-01-01

Title: A feasible approach to incorporate information in higher moments in structural vector autoregressions
URI: http://hdl.handle.net/2003/40547
Authors: Keweloh, Sascha Alexander
Abstract: Generalized method of moments and continuous updating estimators based on second- to fourth-order
moment conditions can be used to solve the identification problem and estimate non-Gaussian structural vector autoregressions. However, estimating the asymptotically optimal
weighting matrix and the asymptotic variance of the estimators is challenging in small samples.
I show that this can lead to a severe bias, large variance, and inaccurate inference in
small samples. I propose to use the assumption of independent structural shocks not only to derive
moment conditions but also to derive alternative estimators for the asymptotically optimal
weighting matrix and the asymptotic variance of the estimator. I demonstrate that these estimators
greatly improve the performance of the generalized method of moments and continuous
updating estimators in terms of bias, variance, and inference.
Date: 2021-01-01

Title: Providing Information by Resource-Constrained Data Analysis
URI: http://hdl.handle.net/2003/40546
Authors: Morik, Katharina; Rhode, Wolfgang
Abstract: The Collaborative Research Center SFB 876 (Providing Information by Resource-Constrained Data Analysis) brings together the research fields of data analysis (Data Mining, Knowledge Discovery in Data Bases, Machine Learning, Statistics) and embedded systems and enhances their methods such that information from distributed, dynamic masses of data becomes available anytime and anywhere. The research center approaches these problems with new algorithms respecting the resource constraints in the different scenarios. This Technical Report presents the work of the members of the integrated graduate school.
Date: 2020-12-31

Title: Das Klimaschutz-Sofortprogramm von Bündnis90/Die Grünen: Mögliche Auswirkungen auf Emissionen und Gesellschaft
URI: http://hdl.handle.net/2003/40527
Authors: Frondel, Manuel
Abstract: This article assesses the effects of the immediate climate action programme (Klimaschutz-Sofortprogramm) of the party Bündnis90/Die Grünen with regard to its distributional effects on society and its potential for reducing emissions. Owing to ambiguities in the design of numerous measures, it is in principle impossible to quantify the total emission reductions associated with them. Instead, this article focuses on those measures among the programme's great variety that are formulated clearly enough to permit an assessment, at least in qualitative terms. Among other things, Baerbock and Habeck (2021: 2) announce their intention to expand renewable electricity generation capacity more quickly and to bring the coal phase-out forward to 2030. These national measures cause unnecessarily high costs; it would be cheaper to leave this to the market, i.e. to the rising prices of emission certificates. Commendable, by contrast, is the promise of Baerbock and Habeck (2021: 7) to initiate a transatlantic climate partnership between the EU and the USA, since international cooperation is indispensable for an effective global climate policy. For effective reductions of global emissions, however, such a bilateral alliance is too little. An alliance for the effective and efficient reduction of greenhouse gas emissions should be considerably more comprehensive: it should be initiated at least at the level of the G20 countries and include an agreement on establishing a uniform CO2 price in these countries.
Experts consider only a climate agreement on a uniform CO2 price that is as comprehensive as possible to be capable of effectively reducing global greenhouse gas emissions.
Date: 2021-01-01

Title: Statistical inference for function-on-function linear regression
URI: http://hdl.handle.net/2003/40526
Authors: Dette, Holger; Tang, Jiajun
Abstract: We propose a reproducing kernel Hilbert space approach to estimate the slope
in a function-on-function linear regression via penalised least squares, regularized by the
thin-plate spline smoothness penalty. In contrast to most of the work on functional linear
regression, our main focus is on statistical inference with respect to the sup-norm. This
point of view is motivated by the fact that slope (surfaces) with rather different shapes may
still be identified as similar when the difference is measured by an L2-type norm. However,
in applications it is often desirable to use metrics reflecting the visualization of the objects
in the statistical analysis.
We prove the weak convergence of the slope surface estimator as a process in the space of
all continuous functions. This allows us to construct simultaneous confidence regions
for the slope surface and simultaneous prediction bands. As a further consequence, we derive
new tests for the hypothesis that the maximum deviation between the “true” slope surface
and a given surface is less than or equal to a given threshold. In other words: we are not trying
to test for exact equality (because in many applications this hypothesis is hard to justify),
but rather for pre-specified deviations under the null hypothesis. To ensure practicability,
non-standard bootstrap procedures are developed addressing particular features that arise
in these testing problems.
As a by-product, we also derive several new results and statistical inference tools for the
function-on-function linear regression model, such as minimax optimal convergence rates and
likelihood-ratio tests. We also demonstrate that the new methods have good finite sample
properties by means of a simulation study and illustrate their practicability by analyzing a
data example.
Date: 2021-01-01

Title: Confidence surfaces for the mean of locally stationary functional time series
URI: http://hdl.handle.net/2003/40525
Authors: Dette, Holger; Wu, Weichi
Abstract: The problem of constructing a simultaneous confidence band for the mean function of
a locally stationary functional time series {Xi,n(t)}i=1,...,n is challenging, as these bands cannot
be built on classical limit theory. On the one hand, for a fixed argument t of the functions
Xi,n, the maximum absolute deviation between an estimate and the time-dependent
regression function exhibits (after appropriate standardization) an extreme value behaviour
with a Gumbel distribution in the limit. On the other hand, for stationary functional data,
simultaneous confidence bands can be built on classical central limit theorems for Banach space
valued random variables and the limit distribution of the maximum absolute deviation is
given by the sup-norm of a Gaussian process. As both limit theorems have different rates of
convergence, they are not compatible, and a weak convergence result, which could be used
for the construction of a confidence surface in the locally stationary case, does not exist.
In this paper we propose new bootstrap methodology to construct a simultaneous confidence
band for the mean function of a locally stationary functional time series, which is
motivated by a Gaussian approximation for the maximum absolute deviation. We prove the
validity of our approach by asymptotic theory, demonstrate good finite sample properties by
means of a simulation study and illustrate its applicability by analyzing a data example.
Date: 2021-01-01

Title: Some practical aspects of sequential change point detection
URI: http://hdl.handle.net/2003/40472
Authors: Sivanesan, Sivanja; Dette, Holger; Ziggel, Daniel
Abstract: In this report we investigate the finite sample properties of a new online monitoring
scheme which was recently introduced by Gösmann et al. (2020) by means of a simulation
study and a real data example. We also develop an algorithm which can be used in
active risk management.
We start with an introduction to the basic notation and an explanation of the monitoring
procedure, and continue with an extensive simulation study to provide recommendations
for the choice of several tuning parameters. Finally, we present an illustration analyzing
the Standard & Poor’s 500, MSCI World and MSCI Emerging Markets indices.
Date: 2021-01-01

Title: Dose response signal detection by parametric and least squares bootstrap
URI: http://hdl.handle.net/2003/40471
Authors: Bastian, Patrick; Dette, Holger; Kokot, Kevin; Bornkamp, Björn; Bretz, Frank
Date: 2021-01-01

Title: Wasserverbrauch privater Haushalte in Deutschland: Eine empirische Mikroanalyse
URI: http://hdl.handle.net/2003/40470
Authors: Frondel, Manuel; Niehues, Delia A.; Sommer, Stephan
Abstract: Germany is a rather water-rich country. Nevertheless, climatic changes may make it necessary to use the resource water carefully in the future, above all in times of drought. Against this background, this article estimates the price elasticity of the water consumption of private households, differentiating between households that have a rough knowledge of water prices and households without such price knowledge. Based on roughly 1,100 observations of households living in single-family homes, and using the sum of the per-cubic-metre prices for water and wastewater, we find a moderate price elasticity of -0.102 that is nevertheless statistically significantly different from zero. Households that know the water prices tend to exhibit a higher elasticity, whereas households without price knowledge show no statistically significant reaction in their water consumption. Prices can accordingly be used only to a limited extent as an instrument for steering water consumption.
Date: 2021-01-01

Title: Carbon pricing in Germany’s road transport and housing sector: Options for reimbursing carbon revenues
URI: http://hdl.handle.net/2003/40354
Authors: Frondel, Manuel; Schubert, Stefanie
Abstract: In 2021, Germany launched a national emissions trading system (ETS) in its
road transport and housing sectors that increases the cost burden of consumers of fossil
fuels, the major source of carbon dioxide (CO2) emissions. A promising approach to secure
public acceptance for such carbon pricing would be to reallocate the resulting
“carbon” revenues entirely to consumers. This article discusses three options that
were debated in the political arena prior to the introduction of the national carbon pricing:
a) a per-capita reallocation to private households, b) the reduction of electricity prices
by, e.g., decreasing the electricity tax, as well as c) targeted financial aid for vulnerable
consumers, such as increasing housing benefits. To estimate both the revenues originating
from carbon pricing and the resulting emission savings, we employ a partial equilibrium
approach that is based on price elasticity estimates on individual fossil fuel consumption
from the empirical literature. Most effective with respect to alleviating the burden
of poor households would be increasing housing benefits. While this measure would
not require large monetary resources, we argue that the remaining revenues should
preferably be employed to reduce Germany’s electricity tax, which becomes more and more
obsolete given the steadily increasing amount of electricity generated by renewable energy
technologies.
Date: 2021-01-01

Title: Disaggregate consumption feedback and energy conservation
URI: http://hdl.handle.net/2003/40286.2
Authors: Gerster, Andreas; Andor, Mark A.; Goette, Lorenz
Abstract: Novel information technologies have the potential to improve decision making. In
the context of smart metering, we investigate the impact of providing households with
appliance-level electricity feedback. In a randomized controlled trial, we find that the provision
of appliance-level feedback creates a conservation effect of an additional 5% relative
to a group receiving standard (aggregate) feedback. Consumers with poor knowledge of
appliance wattage respond most strongly to appliance-level feedback, consistent with the
mechanism in our model. We estimate that a smart-meter rollout will yield much larger
gains in consumer surplus if appliance-level feedback can be provided.
Date: 2021-01-01

Title: Bayesian analysis of reduced rank regression models using post-processing
URI: http://hdl.handle.net/2003/40285
Authors: Aßmann, Christian; Boysen-Hogrefe, Jens; Pape, Markus
Abstract: Bayesian estimation of reduced rank regression models requires careful consideration of the
well-known identification problem. We demonstrate that this identification problem can be handled
efficiently by using prior distributions that restrict a part of the parameter space to the
Stiefel manifold and post-processing the obtained Gibbs sampler output according to an appropriately
specified loss function. This extends the possibilities for Bayesian inference in reduced
rank regression models. Besides inference, we also discuss model selection in terms of posterior
predictive assessment. We choose this approach because computing the marginal data likelihood
under the identifying restrictions implies a prohibitive computational burden. We illustrate the
proposed approach with a simulation study and an empirical application.
Date: 2021-01-01

Title: Bargaining power and the labor share – a structural break approach
URI: http://hdl.handle.net/2003/40228
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: In this paper we investigate the role of bargaining institutions in the decline of the
labor share. Several existing explanations for the decline consider the relevance
of technology, globalization and markups. Surprisingly neglected so far, however, is
the influence of bargaining institutions, in particular with a focus on changes in the
outside option. We provide evidence of this issue, using the Hartz IV labor market
reform in Germany as an exogenous shock in the wage bargaining of employees,
and investigate its impact on the labor share. We begin by developing a theoretical
model in which we outline the effect of a decrease in the outside option within a
wage bargaining framework. Thereafter, the approach is twofold. Combining the
EU KLEMS and Penn World Table databases, we first endogenously identify the
Hartz IV reform as a significant structural break in the German labor share. Second,
we estimate the effect of the Hartz IV legislation on the aggregated labor share
using a synthetic control approach in which we construct a counterfactual Germany
doppelganger. Finally, we use rich firm-level panel data compiled by Bureau van
Dijk to support our results on the aggregated labor share. We find that the reform
decreases the labor share by 1.6 - 2.7 percentage points depending on method and
aggregation level. The synthetic control approach furthermore provides evidence
that this effect is persistent over time since the reform.

Title: The effects of reforming a federal employment agency on labor demand
Handle: http://hdl.handle.net/2003/40192
Date: 2019-01-01
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: In this paper we report the results of an empirical study on the employment growth
effects of a policy intervention, explicitly aimed at increasing placement efficiency
of the Federal Employment Agency in Germany. We use the Hartz III reform in
the year 2004 as an exogenous intervention that improves the matching process and
compare establishments that use the services of the Federal Employment Agency
with establishments that do not use the placement services. Using detailed German
establishment level data, our difference-in-differences estimates reveal an increase
in employment growth among those firms that use the agency for their recruitment
activities compared to non-user firms. After the Hartz III reform was in place, establishments
using the agency grew roughly two percentage points faster in terms
of employment relative to non-users and those establishments achieve an increase
in the proportion of hires. We provide several robustness tests using for example
inverse-probability weighting to additionally account for differences in observable
characteristics. Our paper highlights the importance of the placement service on
the labor demand side, in particular on the so far overlooked establishment level.

Title: Locally stationary multiplicative volatility modelling
Handle: http://hdl.handle.net/2003/40191
Date: 2021-01-01
Authors: Walsh, Christopher; Vogt, Michael
Abstract: In this paper, we study a semiparametric multiplicative volatility model, which
splits up into a nonparametric part and a parametric GARCH component. The
nonparametric part is modelled as a product of a deterministic time trend
component and of further components that depend on stochastic regressors. We
propose a two-step procedure to estimate the model. To estimate the
nonparametric components, we transform the model in order to apply the
backfitting procedure used in Vogt and Walsh (2019). The GARCH parameters are
estimated in a second step via quasi maximum likelihood. We show consistency
and asymptotic normality of our estimators. Our results are obtained using
mixing properties and local stationarity. We illustrate our method using
financial data. Finally, a small simulation study illustrates a substantial
bias in the GARCH parameter estimates when omitting the stochastic regressors.

Title: Digitalisierung und Nachhaltigkeit im Haushalts-, Gebäude- und Verkehrssektor: Ein kurzer Überblick
Handle: http://hdl.handle.net/2003/40102
Date: 2021-01-01
Authors: Frondel, Manuel
Abstract: Digitalization is credited with a large potential for reducing energy
consumption and the associated environmental effects. The empirical evidence
compiled in this article suggests, however, that the actual effects are often
small. The energy savings from smart-home and smart-metering technologies tend
to be moderate, in the low single-digit percentage range. Accordingly, the
environmental effects associated with these energy savings are small as well.
With respect to carbon dioxide emissions, no reduction at all is to be expected
in sectors covered by the EU emissions trading system, owing to the waterbed
effect. This article argues that, in combination with the introduction of road
toll systems, the largest effects are likely to arise in the transport sector,
which is not yet covered by the EU emissions trading system.

Title: Nonparametric and high-dimensional functional graphical models
Handle: http://hdl.handle.net/2003/40101
Date: 2021-01-01
Authors: Solea, Eftychia; Dette, Holger
Abstract: We consider the problem of constructing nonparametric undirected graphical models for high-dimensional
functional data. Most existing statistical methods in this context assume either a Gaussian
distribution on the vertices or linear conditional means. In this article we provide a more
flexible model which relaxes the linearity assumption by replacing it by an arbitrary additive form. The use
of functional principal components offers an estimation strategy that uses a group lasso penalty to
estimate the relevant edges of the graph. We establish statistical guarantees for the resulting estimators,
which can be used to prove consistency if the dimension and the number of functional principal
components diverge to infinity with the sample size. We also investigate the empirical performance of
our method through simulation studies and a real data application.

Title: Weighted bootstrap consistency for matching estimators: The role of bias-correction
Handle: http://hdl.handle.net/2003/40083
Date: 2021-01-01
Authors: Walsh, Christopher; Jentsch, Carsten; Hossain, Shaikh Tanvir
Abstract: We show that the purpose of consistent bias-correction for matching estimators of treatment
effects is two-fold. Firstly, it is known to improve point estimation by removing asymptotically
non-negligible bias terms. Secondly, although an inconsistent bias-correction leaves the consistency of
point estimates unaffected, it will also distort inference, leading e.g. to invalid confidence intervals.
In simulations, we show that the choice of the bias-correction estimator, which practitioners still have
to make, can severely affect the weighted bootstrap’s performance when estimating the asymptotic variance
in finite samples. In particular, simple rules such as estimating the bias based on linear regressions in
the treatment arms can lead to very poor weighted bootstrap based variance estimates.

Title: Climate policy in times of the corona pandemic: Empirical evidence from Germany
Handle: http://hdl.handle.net/2003/40066
Date: 2021-01-01
Authors: Frondel, Manuel; Kussel, Gerhard; Larysch, Tobias; Osberghaus, Daniel
Abstract: Given the dramatic changes triggered by the Corona pandemic, the question arises whether it
has displaced people’s concerns about climate change and whether Corona-related financial losses among
affected households can influence their assessment of climate change. Based on a survey among more
than 6,000 German household heads conducted in the period spanning from May 18 to June 14, 2020, this
paper provides empirical evidence on the impact of the pandemic on perceptions of climate change and
climate policy, as well as the extent to which respondents are affected in terms of health and finances.
Although a large majority of almost 77% of the respondents are concerned about their own health and that of
their families, according to our descriptive results, climate change appears to remain an important issue:
only six percent of the respondents feel that climate change has become less important since the beginning
of 2020, while about 70% of the respondents see no change in the importance of the issue. Yet, employing
discrete-choice models, our estimation results indicate that households that suffered from Corona-related
financial losses consider climate change to be less important than households that remained unaffected
in this respect. In accord with Engler et al. (2020), we thus conclude that lowering individual financial
losses is not only relevant from a social perspective, but it is also critical for the acceptance of
climate policy measures.

Title: Robust inference under time-varying volatility: A real-time evaluation of professional forecasters
Handle: http://hdl.handle.net/2003/40065
Date: 2021-01-01
Authors: Demetrescu, Matei; Hanck, Christoph; Kruse, Robinson
Abstract: In many forecast evaluation applications, standard tests (e.g., Diebold and Mariano, 1995) as
well as tests allowing for time-variation in relative forecast ability (e.g., Giacomini and Rossi,
2010) build on heteroskedasticity-and-autocorrelation consistent (HAC) covariance estimators.
Yet, the finite-sample performance of these asymptotics is often poor. "Fixed-b" asymptotics
(Kiefer and Vogelsang, 2005), used to account for long-run variance estimation, improve finite-sample
performance under homoskedasticity, but lose asymptotic pivotality under time-varying
volatility. Moreover, loss of pivotality due to time-varying volatility is found in the standard
HAC framework in certain cases as well. We propose a wild bootstrap implementation to restore
asymptotically pivotal inference for the above and new CUSUM- and Cramér-von Mises based
tests in a fairly general setup, allowing for estimation uncertainty from either a rolling window
or a recursive approach when fixed-b asymptotics are adopted to achieve good finite-sample
performance. We then investigate the (time-varying) performance of professional forecasters
relative to naive no-change and model-based predictions in real-time. We exploit the Survey of
Professional Forecasters (SPF) database and analyze nowcasts and forecasts at different horizons
for output and inflation. We find that not accounting for time-varying volatility seriously
affects outcomes of tests for equal forecast ability: wild bootstrap inference typically yields convincing
evidence for advantages of the SPF, while tests using non-robust critical values provide
remarkably less. Moreover, we find significant evidence for time-variation of relative forecast
ability, the advantages of the SPF weakening considerably after the "Great Moderation".

Title: Nearest neighbor matching: Does the M-out-of-N bootstrap work when the naive bootstrap fails?
Handle: http://hdl.handle.net/2003/40027
Date: 2021-01-01
Authors: Walsh, Christopher; Jentsch, Carsten; Hossain, Shaikh Tanvir
Abstract: In a seminal paper, Abadie and Imbens (2008) showed that the limiting variance of the
classical nearest neighbor matching estimator cannot be consistently estimated by a naive
Efron-type bootstrap. Specifically, they show that the conditional variance of the Efron-type
bootstrap estimator does not converge to the correct limit in expectation. In essence, this is
due to drawing with replacement such that original observations appear more than once in the
bootstrap sample with positive probability even when the sample size becomes large. In the same
paper, it is conjectured that the limiting variance should be consistently estimable by an
M-out-of-N bootstrap. Here, we prove that the conditional variance of an M-out-of-N-type
bootstrap estimator does indeed converge to the correct limit in expectation in the setting
considered in Abadie and Imbens (2008). The key to the proof lies in the fact that
asymptotically the M-out-of-N-type bootstrap sample does not contain any observations more than
once with probability one. The finite sample performance of the M-out-of-N-type bootstrap is
investigated in a simulation study of the DGP considered by Abadie and Imbens (2008).

Title: Reproducing kernel Hilbert spaces, polynomials and the classical moment problems
Handle: http://hdl.handle.net/2003/40015
Date: 2021-01-01
Authors: Dette, Holger; Zhigljavsky, Anatoly
Abstract: We show that polynomials do not belong to the reproducing kernel Hilbert space
of infinitely differentiable translation-invariant kernels whose spectral measures have
moments corresponding to a determinate moment problem. Our proof is based
on relating this question to the problem of best linear estimation in continuous
time one-parameter regression models with a stationary error process defined by
the kernel. In particular, we show that the existence of a sequence of estimators
with variances converging to 0 implies that the regression function cannot be an
element of the reproducing kernel Hilbert space. This question is then related
to the determinacy of the Hamburger moment problem for the spectral measure
corresponding to the kernel.
In the literature it was observed that a non-vanishing constant function does not
belong to the reproducing kernel Hilbert space associated with the Gaussian kernel
(see Corollary 4.44 in Steinwart and Christmann, 2008). Our results provide a unifying
view of this phenomenon and show that the mentioned result can be extended
for arbitrary polynomials and a broad class of translation-invariant kernels.

Title: Model order selection for cascade autoregressive (CAR) models
Handle: http://hdl.handle.net/2003/40014
Date: 2021-01-01
Authors: Köhler, Steffen
Abstract: In recent years, Cascade Autoregression (CAR) models enjoy increasing popularity in applied econometrics.
This is due to the fact that they are able to approximate both short- and long-memory processes and are easy
to implement. However, their model order, namely the timing of the steps, relies on ad-hoc decisions rather
than being data-driven. In this paper, techniques for model order selection of CAR models in finite samples
are presented. The approaches are evaluated in an extensive simulation study, as well as in an empirical
application. The results suggest that model order selection may provide gains in both in- and out-of-sample
performance.

Title: Optimal designs for comparing regression curves - dependence within and between groups
Handle: http://hdl.handle.net/2003/39974
Date: 2021-01-01
Authors: Schorning, Kirsten; Dette, Holger
Abstract: We consider the problem of designing experiments for the comparison of two regression
curves describing the relation between a predictor and a response in two groups,
where the data between and within the groups may be dependent. In order to derive
efficient designs we use results from stochastic analysis to identify the best linear unbiased
estimator (BLUE) in a corresponding continuous time model. It is demonstrated that
in general simultaneous estimation using the data from both groups yields more precise
results than estimation of the parameters separately in the two groups. Using the BLUE
from simultaneous estimation, we then construct an efficient linear estimator for finite
sample size by minimizing the mean squared error between the optimal solution in the
continuous time model and its discrete approximation with respect to the weights (of the
linear estimator). Finally, the optimal design points are determined by minimizing the
maximal width of a simultaneous confidence band for the difference of the two regression
functions. The advantages of the new approach are illustrated by means of a simulation
study, where it is shown that the use of the optimal designs yields substantially narrower
confidence bands than the application of uniform designs.

Title: Soziale Normen und der Emissionsausgleich bei Flügen: Evidenz für deutsche Haushalte
Handle: http://hdl.handle.net/2003/39973
Date: 2021-01-01
Authors: Eßer, Jana; Frondel, Manuel; Sommer, Stephan
Abstract: The willingness to make voluntary payments to offset CO2 emissions, for
instance for flights, has increased considerably in recent years. One way to raise
this willingness to offset further is to activate a social norm by pointing out
that offsetting emissions is socially desirable. Against this background, this
article examines the willingness to offset the CO2 emissions caused by air travel
through the purchase of offset certificates, using a discrete choice experiment
embedded in a survey conducted in 2019. A social norm was presented at random, as
was one of three offset amounts of 5, 10, or 15 euros. The results show that 57.0%
of the participants choose to offset the emissions of an upcoming flight. There
are only small, statistically insignificant differences between the group
confronted with a social norm and the control group. The offset amount also
appears to have no statistically significant influence on the willingness to
offset, possibly because the differences between the offset amounts are small.

Title: Reducing vehicle cold start emissions through carbon pricing: Evidence from Germany
Handle: http://hdl.handle.net/2003/39968
Date: 2020-01-01
Authors: Frondel, Manuel; Marggraf, Clemens; Sommer, Stephan; Vance, Colin
Abstract: A large proportion of local pollutants originating from the road transport sector
is generated during the so-called cold-start phase of driving, that is, the first
few minutes of driving after a car has stood inactive for several hours. Drawing on
data from the German Mobility Panel (MOP), this paper analyzes the factors that
affect the frequency of cold starts, approximated here by the number of car tours
that a household takes over the course of a week. Based on fixed-effects panel
estimations, we find a negative and statistically significant effect of fuel prices on
the number of tours and, hence, cold starts. Using our estimates to explore the
spatial implications arising from fuel price increases stipulated under Germany’s
Climate Programme 2030, we find substantial impacts on the number of avoided
tours even for modest fuel price increases of 20 cents per liter, particularly in urban
areas. This outcome lends support to using carbon pricing as a means to improve
both global climate and local air quality, pointing to a co-benefit of climate policy.

Title: Accurate and (almost) tuning parameter free inference in cointegrating regressions
Handle: http://hdl.handle.net/2003/39964
Date: 2020-01-01
Authors: Reichold, Karsten; Jentsch, Carsten
Abstract: Tuning parameter choices complicate statistical inference in cointegrating
regressions and affect finite sample distributions of test statistics. As commonly
used asymptotic theory fails to capture these effects, tests often suffer
from severe size distortions. We propose a novel self-normalized test statistic
for general linear hypotheses, which avoids the choice of tuning parameters.
Its limiting null distribution is nonstandard, but simulating asymptotically
valid critical values is straightforward. To further improve the performance
of the test in small to medium samples, we employ the vector autoregressive
sieve bootstrap to construct critical values. To show its consistency, we
establish a bootstrap invariance principle result under conditions that go
beyond the assumptions commonly imposed in the literature. Simulation
results demonstrate that our new test outperforms competing approaches,
as it has good power properties and is considerably less prone to size distortions.

Title: Monetary policy and the stock market - A partly recursive SVAR estimator
Handle: http://hdl.handle.net/2003/39831
Date: 2020-01-01
Authors: Keweloh, Sascha Alexander; Seepe, Andre
Abstract: This study analyzes the interdependence of monetary policy and the stock market in a structural
VAR model. We argue that commonly used short- and long-run restrictions on the interaction
of both variables might not hold and propose an estimator not requiring any of these restrictions
on the interaction of monetary policy and the stock market. The proposed estimator combines
a data driven and restriction based identification approach. In particular, the estimator allows
the researcher to order and identify some shocks recursively, while other shocks can remain unrestricted
and are identified based on independence and non-Gaussianity. We find that a positive
stock market shock contemporaneously increases the nominal interest rate, while a contractionary
monetary policy shock leads to lower stock returns on impact. Furthermore, we present evidence
that monetary policy is non-neutral with respect to long-run real stock prices.

Title: “The mother of all political problems?” On asylum seekers and elections
Handle: http://hdl.handle.net/2003/39817
Date: 2020-01-01
Authors: Tomberg, Lukas; Smith Stegen, Karen; Vance, Colin
Abstract: As immigration to Europe has increased, so has support for extremist parties. While many studies
have examined the effect of immigration on election outcomes, few have probed the effect of asylum
seekers – those fleeing strife and persecution – on voting, nor has there been much research on the
mediating role of local economic conditions. Drawing on county level panel data from Germany, our
study fills both gaps. We find that economic circumstances, as measured by the unemployment rate
and the level of disposable income, condition voters’ responses to the presence of asylum seekers, but
the effects for parties on the far right and left diverge markedly. Under economic prosperity, immigration
increases support on both sides of the political spectrum. As economic conditions worsen,
however, the effect of asylum seekers on the vote share for the far right remains stable, but weakens
for the left, eventually becoming negative. This divergence – which has not yet been reported in the
literature – suggests that an influx of asylum seekers, particularly when coupled with an economic
downturn, could tilt a political system rightwards. From a policy perspective, these results suggest
that heterogeneity arising from local economic conditions has important implications for the regional
allocation of asylum seekers.

Title: Determining the efficiency of residential electricity consumption
Handle: http://hdl.handle.net/2003/39809
Date: 2020-01-01
Authors: Andor, Mark A.; Bernstein, David H.; Sommer, Stephan
Abstract: Increasing energy efficiency is a key global policy goal for climate protection.
An important step towards an optimal reduction of energy consumption is the identification
of energy saving potentials in different sectors and the best strategies for increasing
efficiency. This paper analyzes these potentials in the household sector by estimating the
degree of inefficiency in the use of electricity and its determinants. Using stochastic frontier
analysis and disaggregated household data, we estimate an input requirement function
and inefficiency on a sample of 2,000 German households. Our results suggest that the
mean inefficiency amounts to around 20%, indicating a notable potential for energy savings.
Moreover, we find that the household size and income are among the main determinants
of individual inefficiency. This information can be used to increase the cost-efficiency of
programs aimed at enhancing energy efficiency.

Title: Weak convergence of sample covariance matrices and testing for seasonal unit roots
Handle: http://hdl.handle.net/2003/39808
Date: 2020-01-01
Authors: Kawka, Rafael
Abstract: The paper has two main contributions. First, weak convergence results are derived for
sample moments of processes that contain a unit root at an arbitrary frequency, where,
in contrast to the previous literature, the proofs are mainly based on algebraic manipulations
and well-known weak convergence results for martingale difference sequences. These
convergence results are used to derive the limiting distribution of the ordinary least squares
estimator for unit root autoregressions. As a second contribution, a Phillips-Perron type
test for a unit root at an arbitrary frequency is introduced and its limiting distributions are
derived. This test is further extended to a joint test for multiple unit roots and seasonal
integration. The limiting distributions of these test statistics are asymptotically equivalent
to various statistics presented earlier in the seasonal unit root literature.

Title: Integrated modified OLS and fixed-b inference for seasonally cointegrated processes
Handle: http://hdl.handle.net/2003/39807
Date: 2020-01-01
Authors: Kawka, Rafael
Abstract: Many economic time series exhibit persistent seasonal patterns. One approach to model
this phenomenon is given by models including seasonal unit roots and, if several time series
are considered jointly, seasonal cointegration. For quarterly time series, e.g., unit roots may
be present at frequencies π/2 and π, in addition to the “standard unit root” at frequency
zero. Gregoir (2010) has extended the fully modified OLS estimator of Phillips and Hansen
(1990) from the cointegrating regression to the seasonally cointegrating regression case. In
this paper, we have a similar agenda, in that we undertake the corresponding extension for
the IM-OLS estimator of Vogelsang and Wagner (2014). The benefit of the seasonal IM-OLS
estimator, or SIM-OLS estimator, is that it forms the basis not only for asymptotic
standard inference but also allows for fixed-b inference. The paper furthermore proposes a
test for seasonal cointegration at all unit root frequencies. Note here that the cointegrating
spaces in general differ across frequencies and have to be estimated separately for each
frequency. The theoretical analysis is complemented by a simulation study.

Title: Pivotal tests for relevant differences in the second order dynamics of functional time series
Handle: http://hdl.handle.net/2003/39791
Date: 2020-01-01
Authors: van Delft, Anne; Dette, Holger
Abstract: Motivated by the need to statistically quantify differences between modern (complex)
datasets which commonly result as high-resolution measurements of stochastic processes varying
over a continuum, we propose novel testing procedures to detect relevant differences between the
second order dynamics of two functional time series. In order to take the between-function
dynamics into account that characterize this type of functional data, a frequency domain
approach is taken. Test statistics are developed to compare differences in the spectral density
operators and in the primary modes of variation as encoded in the associated eigenelements.
Under mild moment conditions, we show convergence of the underlying statistics to Brownian
motions and obtain pivotal test statistics via a self-normalization approach. The latter is
essential because the nuisance parameters can be unwieldy and their robust estimation
infeasible, especially if the two functional time series are dependent. Besides these novel
features, the properties of the tests are robust to any choice of frequency band, enabling also
to compare energy contents at a single frequency. The finite sample performance of the tests is
verified through a simulation study and illustrated with an application to fMRI data.

Title: Photovoltaics and the solar rebound: Evidence for Germany
Handle: http://hdl.handle.net/2003/39758
Date: 2020-01-01
Authors: Frondel, Manuel; Kaestner, Kathrin; Sommer, Stephan; Vance, Colin
Abstract: Recent research suggests that households would increase their electricity consumption
in the aftermath of installing photovoltaics (PV) panels, a behavioral
change commonly referred to as the solar rebound. Drawing on panel data originating
from the German Residential Energy Consumption Survey (GRECS), we
employ panel estimation methods and the dynamic system estimator developed
by Blundell and Bond (1998) to investigate the solar rebound effect, thereby accounting
for simultaneity and endogeneity issues relating to PV installation and
the electricity price. Our empirical results suggest that PV panel adoption of households
hardly reduces the amount of electricity taken from the grid. As we derive
theoretically, this outcome implies that the rebound reaches a maximum that is
bounded by about 30% for German households. Yet, we are skeptical of whether
there is such a large solar rebound effect given the strong economic incentives to
feed solar electricity into the public grid in the past.

Title: Testing for nonlinear cointegration under heteroskedasticity
Handle: http://hdl.handle.net/2003/39533
Date: 2020-01-01
Authors: Hanck, Christoph; Massing, Till
Abstract: This article discusses tests for nonlinear cointegration in the presence of variance
breaks in the errors. We build on approaches of Cavaliere and Taylor (2006, Journal of
Time Series Analysis) for heteroskedastic cointegration tests and of Choi and Saikkonen (2010,
Econometric Theory) for nonlinear cointegration tests. We propose a bootstrap test and prove
its consistency.
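For intuition, the bootstrap idea behind such heteroskedasticity-robust tests can be sketched with a generic wild bootstrap, which multiplies residuals by i.i.d. noise and thereby preserves a non-constant variance profile in each bootstrap sample. The statistic `kpss_stat`, the sample size and the break point below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def wild_bootstrap_pvalue(resid, stat_fn, n_boot=999):
    """Approximate the null distribution of a residual-based statistic
    by the wild bootstrap: each bootstrap sample multiplies the residuals
    by i.i.d. standard normal draws, which preserves a non-constant
    variance profile (e.g. a variance break). Illustrative sketch only."""
    stat_obs = stat_fn(resid)
    n = len(resid)
    count = 0
    for _ in range(n_boot):
        resid_b = resid * rng.standard_normal(n)
        if stat_fn(resid_b) >= stat_obs:
            count += 1
    return (count + 1) / (n_boot + 1)

def kpss_stat(u):
    """Toy KPSS-type statistic based on cumulated demeaned residuals."""
    s = np.cumsum(u - u.mean())
    return (s @ s) / (len(u) ** 2 * u.var())

# Residuals with a variance break in the second half (hypothetical data).
u = rng.standard_normal(500)
u[250:] *= 3.0
p = wild_bootstrap_pvalue(u, kpss_stat)
```

The key design point is that the multiplier noise leaves the variance pattern of the original residuals intact, so the bootstrap distribution reflects the same heteroskedasticity as the data.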
A Monte Carlo study shows the approach to have appealing finite sample properties and
to work better than an approach using subresiduals. We provide an empirical application to
the environmental Kuznets curve (EKC), finding that the cointegration tests do not reject the EKC hypothesis in most cases.

http://hdl.handle.net/2003/39304
Title: A portmanteau-type test for detecting serial correlation in locally stationary functional time series
Authors: Bücher, Axel; Dette, Holger; Heinrichs, Florian
Abstract: The portmanteau test is the standard method for detecting serial correlations in classical univariate time series analysis. The method is extended to the case of observations from a locally stationary functional time series. Asymptotic critical values are obtained by a suitable block multiplier bootstrap procedure. The test is shown to asymptotically hold its level and to be consistent against general alternatives.

http://hdl.handle.net/2003/39301
Title: A note on optimal designs for estimating the slope of a polynomial regression
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In this note we consider the optimal design problem for estimating the slope of a polynomial regression with no intercept at a given point, say z. In contrast to previous work, which considers symmetric design spaces, we investigate the model on the interval [0, a] and characterize those values of z for which an explicit solution of the optimal design problem is possible.

http://hdl.handle.net/2003/39275
Title: Correcting intraday periodicity bias in realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: Diurnal fluctuations in volatility are a well-documented stylized fact of intraday price data. We investigate how this intraday periodicity (IP) affects both the finite sample and the asymptotic properties of several popular realized estimators of daily integrated volatility that are based on functionals of M intraday returns. We demonstrate that most of the estimators considered in our study exhibit a finite-sample bias due to IP, which, however, becomes negligible as the number of intraday returns diverges to infinity. We suggest appropriate correction factors for this bias based on estimates of the IP. The adequacy of the new corrections is evaluated by means of a Monte Carlo simulation study and an empirical example.

http://hdl.handle.net/2003/39251
Title: Data-based priors for vector error correction models
Authors: Prüser, Jan
Abstract: We propose two data-based priors for vector error correction models. Both
priors lead to highly automatic approaches which require only minimal user
input. An empirical investigation reveals that Bayesian vector error correction
(BVEC) models equipped with our proposed priors turn out to scale well to
higher dimensions and to forecast well. In addition, we find that exploiting
information in the level variables has the potential for improving long-term
forecasts. Thus, working with VARs in first differences may ignore valuable
information. A simulation study reveals that it is beneficial, in terms of estimation
accuracy, to use BVEC in the presence of cointegration. But if there is
no cointegration, the proposed priors provide a sufficient amount of shrinkage
so that the BVEC model achieves estimation accuracy similar to that of the Bayesian vector autoregressive (BVAR) model estimated in first differences.

http://hdl.handle.net/2003/39250
Title: Difference-in-differences estimation under non-parallel trends
Authors: Dette, Holger; Schumann, Martin
Abstract: Classic difference-in-differences estimation relies on the validity of the "parallel
trends assumption" (PTA), which ensures that the evolution of the variable of interest in the
control group can be used to determine its counterfactual development in the treatment group
in the absence of treatment. The plausibility of the PTA is usually assessed by a test of the
null hypothesis that the difference between the means of both groups is constant over time
before the treatment. However, this procedure is problematic as failure to reject the null
hypothesis does not imply the absence of differences in time trends between both groups due
to low power to detect economically relevant differences. We provide three tests of equivalence
leading to a "common range" (CR) condition that replaces the PTA and which naturally reflects
differences between treatment and control. We combine the CR with standard confidence
intervals to capture both design and sampling uncertainty in the data and show that the
combined confidence intervals yield more reliable inference when the PTA is violated.

http://hdl.handle.net/2003/39209
Title: Correcting intraday periodicity bias in realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: Diurnal fluctuations in volatility are a well-documented stylized fact of intraday price data. We
investigate how this intraday periodicity (IP) affects both finite sample as well as asymptotic
properties of several popular realized estimators of daily integrated volatility which are based on
functionals of M intraday returns. We demonstrate that most of the estimators considered in our
study exhibit a finite-sample bias due to IP, which, however, becomes negligible as the number of intraday returns diverges to infinity. We suggest appropriate correction factors for this bias based on estimates of the IP. The adequacy of the new corrections is evaluated by means of a Monte Carlo simulation study and an empirical example.

http://hdl.handle.net/2003/39208
Title: New model-based bioequivalence statistical approaches for pharmacokinetic studies with sparse sampling
Authors: Loingeville, Florence; Bertrand, Julie; Nguyen, Thu Thuy; Sharan, Satish; Feng, Kairui; Sun, Wanjie; Han, Jing; Grosser, Stella; Zhao, Liang; Fang, Lanyan; Möllenhoff, Kathrin; Dette, Holger; Mentré, France
Abstract: In traditional pharmacokinetic (PK) bioequivalence analysis, two one-sided tests (TOST) are conducted on the area under the concentration-time curve and the maximal concentration derived using a non-compartmental approach. When rich sampling is unfeasible, a model-based (MB) approach using nonlinear mixed effect models (NLMEM) is possible. However, MB-TOST using asymptotic standard errors (SE) presents an increased type I error when asymptotic conditions do not hold. Methods: In this work, we propose three alternative calculations of the SE based on (i) an adaptation to NLMEM of the correction proposed by Gallant, (ii) the a posteriori distribution of the treatment coefficient using the Hamiltonian Monte Carlo algorithm, and (iii) a parametric bootstrap of the random effects and residual errors. We evaluate these approaches by simulations, for a two-arm parallel design and a two-period, two-sequence cross-over design with rich (n=10) and sparse (n=3) sampling, under the null and the alternative hypotheses, with MB-TOST. Results: All new approaches correct the inflation of the MB-TOST type I error in PK studies with sparse designs. The approach based on the a posteriori distribution appears to be the best compromise between controlled type I errors and computing times. Conclusion: MB-TOST using non-asymptotic SE controls the type I error rate better than when using asymptotic SE estimates for bioequivalence in PK studies with sparse sampling.

http://hdl.handle.net/2003/39207
Title: A global-local prior for time-varying parameter VARs and monetary policy
Authors: Prüser, Jan
Abstract: Time-varying parameter VARs have become the workhorse models in empirical
macroeconomics. These models are usually equipped with tightly
parametrized prior distributions which favor a small and gradual change in
parameters. Do such prior distributions suppress some degree of time variation
in the VAR coefficients? We address this question by proposing a flexible global-local prior. It turns out that the conventional prior may suppress economically relevant patterns of time variation. Using the global-local prior, we observe that parameter change can be abrupt rather than smooth. We find that, during the chairmanship of Paul Volcker, the Fed fought inflation pressures by raising the interest rate in response to a negative supply shock. During the chairmanship of Alan Greenspan, however, this policy came to an end. In contrast, using the conventional prior, we do not detect this pattern.

http://hdl.handle.net/2003/39181
Title: Detecting relevant differences in the covariance operators of functional time series - a sup-norm approach
Authors: Dette, Holger; Kokot, Kevin
Abstract: In this paper we propose statistical inference tools for the covariance operators of functional
time series in the two sample and change point problem. In contrast to most of the literature, the focus of our approach is not on testing the null hypothesis of exact equality of the covariance operators. Instead, we propose to formulate the null hypotheses in the form that "the distance between the operators is small", where deviations are measured by the sup-norm. We provide powerful bootstrap tests for these types of hypotheses, investigate their asymptotic properties and study their finite sample properties by means of a simulation study.

http://hdl.handle.net/2003/39180
Title: Dekarbonisierung bis zum Jahr 2050? Klimapolitische Maßnahmen und Energieprognosen für Deutschland, Österreich und die Schweiz
Authors: Frondel, Manuel; Thomas, Tobias
Abstract: In view of growing climate policy challenges, many European countries aim for decarbonization by 2050, that is, a phase-out of fossil fuel use. Against this background, this article presents forecasts of energy demand and energy mixes for Germany, Austria and Switzerland for the year 2030, together with an outlook to 2050. A comparison of these countries' energy policies to date reveals substantial differences: while Germany has so far relied primarily on massive subsidies for alternative electricity generation technologies, Austria's approach has rather been to reduce energy consumption and greenhouse gas emissions through regulatory measures, in particular mandates and bans, but also through subsidies. Switzerland, in contrast, has relied on the market-based instrument of a CO2 levy since 2008. The energy demand forecasts presented here indicate that, by continuing their current policies, Germany and Austria in particular are unlikely to reach the long-term goal of far-reaching decarbonization, whereas Switzerland has already seen a noticeable decline in primary energy consumption. Against this background, the CO2 pricing of emissions in the transport and heating sectors recently adopted in Germany gains particular importance. Austria also intends to introduce CO2 pricing in these sectors. It remains to be seen, however, how consistently the market-based instrument of CO2 pricing will actually be pursued.

http://hdl.handle.net/2003/39167
Title: Sequential change point detection in high dimensional time series
Authors: Gösmann, Josua; Stoehr, Christina; Dette, Holger
Abstract: Change point detection in high dimensional data has found considerable interest
in recent years. Most of the literature designs methodology for a retrospective
analysis, where the whole sample is already available when the statistical inference begins.
This paper takes a different point of view and develops monitoring schemes for the
online scenario, where high dimensional data arrives steadily and the goal is to detect
changes as fast as possible while at the same time controlling the probability of a type I error, that is, of a false alarm. We develop sequential procedures capable of detecting changes in the mean vector of a successively observed high dimensional time series with spatial and temporal dependence. The statistical properties of the methods are analyzed in the case where both the sample size and the dimension converge to infinity. In this scenario it is shown that
the new monitoring schemes have asymptotic level alpha under the null hypothesis of no
change and are consistent under the alternative of a change in at least one component
of the high dimensional mean vector. Moreover, we also prove that the new detection
scheme identifies all components affected by a change. The finite sample properties of the
new methodology are illustrated by means of a simulation study and in the analysis of a
data example.
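The one-dimensional sequential monitoring idea can be illustrated with a toy CUSUM-type detector that compares newly arriving observations against a training sample and raises an alarm once a standardized statistic crosses a threshold. The training/monitoring split, the statistic and the threshold below are illustrative assumptions, not the authors' new scheme.

```python
import numpy as np

def cusum_monitor(initial, stream, threshold):
    """Toy sequential mean-change monitor: accumulate standardized
    deviations of the incoming observations from the training-sample
    mean and raise an alarm once |S_t|/sqrt(t) exceeds the threshold.
    Returns the alarm time, or None if no alarm is raised."""
    mu0 = initial.mean()
    sigma = initial.std(ddof=1)
    s = 0.0
    for t, x in enumerate(stream, start=1):
        s += (x - mu0) / sigma
        if abs(s) / np.sqrt(t) > threshold:
            return t
    return None

rng = np.random.default_rng(1)
train = rng.standard_normal(200)          # historical (in-control) sample
stream = np.concatenate([rng.standard_normal(100),
                         rng.standard_normal(100) + 2.0])  # mean shift at t = 101
alarm = cusum_monitor(train, stream, threshold=3.0)
```

In a high dimensional setting one would run such a detector per component and aggregate the component-wise statistics, e.g. by their maximum; the sketch above only shows the one-dimensional building block.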
Our approach is based on a new type of monitoring scheme for one-dimensional data which often turns out to be more powerful than the commonly used CUSUM and Page-CUSUM methods, and the component-wise statistics are aggregated by the maximum
statistic. From a mathematical point of view we use Gaussian approximations for high
dimensional time series to prove our main results and derive extreme value convergence for
the maximum of the maximal increment of dependent Brownian motions. In particular
we show that the range of a Brownian motion on a given interval is in the domain of
attraction of the Gumbel distribution.2020-01-01T00:00:00ZA distribution free test for changes in the trend function of locally stationary processesHeinrichs, FlorianDette, Holgerhttp://hdl.handle.net/2003/391542020-05-28T01:40:49Z2020-01-01T00:00:00ZTitle: A distribution free test for changes in the trend function of locally stationary processes
Authors: Heinrichs, Florian; Dette, Holger
Abstract: In the common time series model X_{i,n} = μ(i/n) + ε_{i,n} with non-stationary errors, we consider the problem of detecting a significant deviation of the mean function μ from a benchmark g(μ) (such as the initial value μ(0) or the average trend ∫₀¹ μ(t) dt). The problem is motivated by a more realistic modelling of change point analysis, where one is interested in identifying relevant deviations in a smoothly varying sequence of means (μ(i/n))_{i=1,...,n} and cannot assume that the sequence is piecewise constant. A test for this type of hypotheses is developed using an appropriate estimator of the integrated squared deviation of the mean function from the benchmark. By a new concept of self-normalization adapted to non-stationary processes, an asymptotically pivotal test for the hypothesis of a relevant deviation is constructed. The results are illustrated by means of a simulation study and a data example.

http://hdl.handle.net/2003/39100
Title: K-sign depth: From asymptotics to efficient implementation
Authors: Malcherczyk, Dennis; Leckey, Kevin; Müller, Christine H.
Abstract: The K-sign depth (K-depth) of a model parameter θ in a data set is the relative number of K-tuples among its residual vector that have alternating signs. The K-depth test based on K-depth, recently proposed by Leckey et al. (2019), is equivalent to the classical residual-based sign test for K = 2, but is much more powerful for K ≥ 3. This test has two major drawbacks: first, the computation of the K-depth is fairly time consuming, and second, the test requires knowledge of the quantiles of the test statistic, which previously had to be obtained by simulation for each sample size individually. We tackle both of these drawbacks by presenting a limit theorem for the distribution of the test statistic and by deriving an (asymptotically equivalent) form of the K-depth which can be computed efficiently. For K = 3, such a limit theorem was already derived in Kustosz et al. (2016a) by mimicking the proof for U-statistics. We provide here a much shorter proof based on Donsker's theorem and extend it to any K ≥ 3. As part of the proof, we derive an asymptotically equivalent form of the K-depth which can be computed in linear time. This alternative and the original implementation of the K-depth are compared with respect to their runtimes and absolute differences.

http://hdl.handle.net/2003/39099
Title: Powerful generalized sign tests based on sign depth
Authors: Leckey, Kevin; Malcherczyk, Dennis; Müller, Christine H.
Abstract: The classical sign test usually provides very low power for certain alternatives. We present a generalization which is similarly easy to comprehend but much more powerful. It is based on K-sign depth, denoted by K-depth for short. These so-called K-depth tests are motivated by simplicial regression depth, but are not restricted to regression problems. They can be applied as soon as the true model leads to independent residuals with median equal to zero. Moreover, general hypotheses on the unknown parameter vector can be tested. Since they depend only on the signs of the residuals, these test statistics are outlier robust. While the 2-depth test, i.e. the K-depth test for K = 2, is equivalent to the classical sign test, K-depth tests with K ≥ 3 turn out to be more powerful in many applications. As we briefly discuss, these tests are also related to runs tests. A drawback of the K-depth test is its fairly high computational effort when implemented naively. However, we show how this inherent computational complexity can be reduced. In order to see why K-depth tests with K ≥ 3 are more powerful than the classical sign test, we discuss the asymptotic behaviour of the test statistic for residual vectors with only few sign changes, which is in particular the case for some non-fits that the classical sign test cannot reject. In contrast, we also consider residual vectors with alternating signs, representing models that fit the data very well. Finally, we demonstrate the good power of the K-depth tests for quadratic regression.

http://hdl.handle.net/2003/39098
Title: Market premia for renewables in Germany: The effect on electricity prices
Authors: Frondel, Manuel; Kaeding, Matthias; Sommer, Stephan
Abstract: Due to the growing share of "green" electricity generated by renewable energy technologies, the frequency of negative price spikes has substantially increased in Germany. To reduce the frequency of such events, a market premium scheme (MPS) was introduced in 2012 as an alternative to feed-in tariffs for the promotion of green electricity. Drawing
on hourly day-ahead spot prices for the time period spanning 2009 to 2016 and
employing a nonparametric modeling strategy called Bayesian Additive Regression
Trees, this paper empirically evaluates the efficacy of Germany’s MPS. Via counterfactual
analyses, we demonstrate that the introduction of the MPS decreased the number
of hours with negative prices by some 70%.

http://hdl.handle.net/2003/39097
Title: Efficient tests for bio-equivalence in functional data
Authors: Dette, Holger; Kokot, Kevin
Abstract: We study the problem of testing the equivalence of functional parameters (such as the
mean or variance function) in the two sample functional data problem. In contrast to
previous work, which reduces the functional problem to a multiple testing problem for the
equivalence of scalar data by comparing the functions at each point, our approach is based
on an estimate of a distance measuring the maximum deviation between the two functional
parameters. Equivalence is claimed if the estimate for the maximum deviation does not
exceed a given threshold. A bootstrap procedure is proposed to obtain quantiles for the
distribution of the test statistic and consistency of the corresponding test is proved in the
large sample scenario. As the methods proposed here avoid the use of the intersection-union principle, they are less conservative and more powerful than the currently available methodology.

http://hdl.handle.net/2003/39096
Title: Providing Information by Resource-Constrained Data Analysis
Authors: Morik, Katharina; Rhode, Wolfgang
Abstract: The Collaborative Research Center SFB 876 (Providing Information by Resource-Constrained Data Analysis) brings together the research fields of data analysis (Data Mining, Knowledge Discovery in Databases, Machine Learning, Statistics) and embedded systems, and enhances their methods such that information from distributed, dynamic masses of data becomes available anytime and anywhere. The research center approaches these problems with new algorithms respecting the resource constraints in the different scenarios. This Technical Report presents the work of the members of the integrated graduate school.

http://hdl.handle.net/2003/39075
Title: Quantifying deviations from separability in space-time functional processes
Authors: Dette, Holger; Dierickx, Gauthier; Kutta, Tim
Abstract: The estimation of covariance operators of spatio-temporal data is in many applications only computationally feasible under simplifying assumptions, such as separability of the covariance into strictly temporal and spatial factors. Powerful tests for this assumption have been proposed in the literature. However, as real-world systems such as climate data are notoriously inseparable, validating this assumption by statistical tests seems inherently questionable. In this paper we present an alternative approach: by virtue of separability measures, we quantify how strongly the data's covariance operator diverges from a separable approximation. Confidence intervals localize these measures with statistical guarantees. This method provides users with a flexible tool to weigh the computational gains of a separable model against the associated increase in bias. As separable approximations we consider the established methods of partial traces and partial products, and develop weak convergence principles for the corresponding estimators. Moreover, we also prove such results for estimators of optimal separable approximations, which are arguably of most interest in applications. In particular, we present for the first time statistical inference for this object, which had previously been confined to estimation. Besides confidence intervals, our results encompass tests for approximate separability. All methods proposed in this paper are free of nuisance parameters and require neither computationally expensive resampling procedures nor the estimation of nuisance parameters.
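On finite grids, an optimal separable approximation of a covariance is closely related to the nearest Kronecker product problem, which can be solved via the Van Loan–Pitsianis rearrangement. The matrix-level sketch below is an illustration of that classical idea under the assumption of fully observed, discretized covariances; it is not the authors' operator-level estimator.

```python
import numpy as np

def nearest_kronecker(C, p, q):
    """Best separable (Kronecker) approximation A ⊗ B of a pq x pq
    matrix C in the Frobenius norm: rearrange C into a p^2 x q^2 matrix
    whose best rank-one approximation (leading SVD term) yields A and B
    (Van Loan-Pitsianis)."""
    # After this rearrangement, Kronecker structure becomes rank-one structure.
    R = C.reshape(p, q, p, q).transpose(0, 2, 1, 3).reshape(p * p, q * q)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(s[0]) * U[:, 0].reshape(p, p)
    B = np.sqrt(s[0]) * Vt[0].reshape(q, q)
    return A, B

def separability_gap(C, p, q):
    """Relative Frobenius distance of C from its best separable
    approximation; 0 means C is exactly separable."""
    A, B = nearest_kronecker(C, p, q)
    return np.linalg.norm(C - np.kron(A, B)) / np.linalg.norm(C)

# An exactly separable covariance has (numerically) zero gap.
p, q = 3, 4
At = np.eye(p) + 0.5                    # spatial factor (hypothetical)
Bt = np.diag(np.arange(1.0, q + 1))     # temporal factor (hypothetical)
C = np.kron(At, Bt)
gap = separability_gap(C, p, q)
```

The measure `separability_gap` mirrors the spirit of the paper's approach: rather than a binary test of exact separability, it quantifies how far the covariance is from the separable class.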
A simulation study underlines the advantages of our approach, and its applicability is demonstrated by an investigation of German annual temperature data.

http://hdl.handle.net/2003/39070
Title: Design admissibility and de la Garza phenomenon in multi-factor experiments
Authors: Dette, Holger; Liu, Xin; Yue, Rong-Xian
Abstract: The determination of an optimal design for a given regression problem is an intricate
optimization problem, especially for models with multivariate predictors. Design
admissibility and invariance are main tools to reduce the complexity of the optimization
problem and have been successfully applied for models with univariate predictors.
In particular several authors have developed sufficient conditions for the existence of
saturated designs in univariate models, where the number of support points of the optimal
design equals the number of parameters. These results generalize the celebrated de
la Garza phenomenon (de la Garza, 1954), which states that for a polynomial regression model of degree p - 1 any optimal design can be based on at most p points.
This paper provides, for the first time, extensions of these results to models with a multivariate predictor. In particular, we study a geometric characterization of the support points of an optimal design to provide sufficient conditions for the occurrence of the de la Garza phenomenon in models with multivariate predictors, and we characterize properties of admissible designs in terms of admissibility of designs in conditional univariate regression models.

http://hdl.handle.net/2003/39066
Title: CO2-Bepreisung in den Sektoren Verkehr und Wärme: Optionen für eine sozial ausgewogene Ausgestaltung
Authors: Frondel, Manuel
Abstract: The introduction of national CO2 pricing from 2021 onwards is a done deal: in the transport and heating sectors, a national emissions trading system is to be established in which CO2 prices are fixed for the years 2021 to 2025 and rise successively, starting at 25 euros per tonne. This entails higher cost burdens for consumers. To nevertheless gain broad acceptance for CO2 pricing, a promising approach would be to return the resulting revenues fully to consumers. Against this background, this article discusses three alternatives for redistributing the additional government revenues: a) a flat per-capita rebate for private households, b) lowering electricity costs by (i) financing the industry exemptions from the EEG levy out of taxes and (ii) reducing the electricity tax, and c) targeted subsidies for particularly affected consumers, for instance in the form of an increase in housing benefits. With respect to relieving needy households, the third alternative would be the most accurate. The remaining funds could be used to reduce the electricity tax, which is becoming increasingly obsolete from an environmental perspective. Although there are good reasons both for a per-capita rebate and for an electricity tax reduction, the latter has several advantages over a per-capita lump sum, in particular with regard to sector coupling and the transaction costs of the redistribution effort, which would be negligible in the case of an electricity tax reduction.

http://hdl.handle.net/2003/39065
Title: Tests based on sign depth for multiple regression
Authors: Horn, Melanie; Müller, Christine H.
Abstract: The extension of simplicial depth to robust regression, the so-called simplicial regression depth,
provides an outlier robust test for the parameter vector of regression models. Since simplicial regression
depth often reduces to counting the subsets with alternating signs of the residuals, this led recently to
the notion of sign depth and the sign depth test. In this way, sign depth tests generalize the classical sign test.
Since sign depth depends on the order of the residuals, one generally assumes that the D-dimensional
regressors (explanatory variables) can be ordered with respect to an inherent order. While the one-dimensional
real space possesses such a natural order, one cannot order these regressors that easily for
D > 1 because there exists no canonical order of the data in most cases.
For this scenario, we present orderings according to the Shortest Hamiltonian Path and an approximation
of it. We compare them with more naive approaches like taking the order in the data set or ordering
on the basis of a single quantity of the regressor. The comparison is based on computational runtime, the stability of the order under transformations of the data, and the power of the resulting sign depth tests for testing the parameter vector of different multiple regression models. Moreover, we compare the power of our new tests with the power of the classical sign test and the F-test. The sign depth tests based on our distance-based approaches show power similar to that of the F-test for normally distributed residuals, with the additional benefit of being much more robust against outliers.

http://hdl.handle.net/2003/39057
Title: An asymptotic test for constancy of the variance under short-range dependence
Authors: Schmidt, Sara; Wornowizki, Max; Fried, Roland; Dehling, Herold
Abstract: We present a novel approach to test for heteroscedasticity of
a non-stationary time series that is based on Gini's mean difference of
logarithmic local sample variances. In order to analyse the large sample behaviour
of our test statistic, we establish new limit theorems for U-statistics
of dependent triangular arrays. We derive the asymptotic distribution of the
test statistic under the null hypothesis of a constant variance and show that
the test is consistent against a large class of alternatives, including multiple
structural breaks in the variance. Our test is applicable even in the case
of non-stationary processes, assuming a locally stationary mean function.
The performance of the test and its comparatively low computation time
are illustrated in an extensive simulation study. As an application, we analyse
data from civil engineering, monitoring crack widths in concrete bridge
surfaces.

http://hdl.handle.net/2003/39020 (2020-01-01)
Title: Statistical inference for high dimensional panel functional time series
Authors: Zhou, Zhou; Dette, Holger
Abstract: In this paper we develop statistical inference tools for high dimensional functional
time series. We introduce a new concept of physically dependent processes in
the space of square integrable functions, which adopts the idea of basis decomposition
of functional data in these spaces, and derive Gaussian and multiplier bootstrap
approximations for sums of high dimensional functional time series. These results
have numerous important statistical consequences. As examples, we consider the development
of joint simultaneous confidence bands for the mean functions and the
construction of tests for the hypotheses that the mean functions in the spatial dimension
are parallel. The results are illustrated by means of a small simulation study
and in the analysis of Canadian temperature data.

http://hdl.handle.net/2003/38720 (2020-01-01)
Title: Are deviations in a gradually varying mean relevant? A testing approach based on sup-norm estimators
Authors: Bücher, Axel; Dette, Holger; Heinrichs, Florian
Abstract: Classical change point analysis aims at (1) detecting abrupt changes
in the mean of a possibly non-stationary time series and at (2) identifying regions
where the mean exhibits a piecewise constant behavior. In many applications, however,
it is more reasonable to assume that the mean changes gradually in a smooth
way. Those gradual changes may either be non-relevant (i.e., small), or relevant
for a specific problem at hand, and the present paper presents statistical methodology
to detect the latter. More precisely, we consider the common nonparametric
regression model X_i = μ(i/n) + ε_i with possibly non-stationary errors and propose
a test for the null hypothesis that the maximum absolute deviation of the
regression function μ from a functional g(μ) (such as the value μ(0) or the
integral ∫₀¹ μ(t) dt) is smaller than a given threshold on a given interval [x₀, x₁] ⊆ [0, 1]. A
test for this type of hypothesis is developed using an appropriate estimator d̂_{∞,n}
for the maximum deviation d_∞ = sup_{t∈[x₀,x₁]} |μ(t) − g(μ)|. We derive the
limiting distribution of an appropriately standardized version of d̂_{∞,n}, where the
standardization depends on the Lebesgue measure of the set of extremal points of
the function μ(·) − g(μ). A refined procedure based on an estimate of this set is
developed and its consistency is proved. The results are illustrated by means of a
simulation study and a data example.

http://hdl.handle.net/2003/38571 (2018-12-31)
Title: Providing Information by Resource-Constrained Data Analysis
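A plug-in version of the sup-norm estimator described in the abstract above can be sketched over a finite grid; this is a minimal illustration, and the grid-based evaluation is an assumption rather than the paper's construction:

```python
def sup_deviation(mu_hat, g_of_mu, grid):
    """Plug-in estimate of d_inf = sup_{t in [x0, x1]} |mu(t) - g(mu)|:
    evaluate the estimated regression function on a grid covering the
    interval and take the maximal absolute deviation from g(mu)."""
    return max(abs(mu_hat(t) - g_of_mu) for t in grid)
```

With, say, `grid = [i / 100 for i in range(101)]`, the relevant-deviation decision would then compare this estimate (suitably standardized) to the prescribed threshold.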
Authors: Morik, Katharina; Rhode, Wolfgang
Abstract: The Collaborative Research Center SFB 876 (Providing Information by Resource-Constrained Data Analysis) brings together the research fields of data analysis (Data Mining, Knowledge Discovery in Databases, Machine Learning, Statistics) and embedded systems, and enhances their methods such that information from distributed, dynamic masses of data becomes available anytime and anywhere. The research center approaches these problems with new algorithms respecting the resource constraints in the different scenarios. This technical report presents the work of the members of the integrated graduate school.

http://hdl.handle.net/2003/38570 (2020-01-01)
Title: Explicit results on conditional distributions of generalized exponential mixtures
Authors: Klüppelberg, Claudia; Seifert, Miriam Isabel
Abstract: For independent exponentially distributed random variables X_i, i ∈ ℕ, with distinct rates λ_i we consider sums ∑_{i∈A} X_i for A ⊆ ℕ, which follow generalized exponential mixture (GEM) distributions. We provide novel explicit results on the conditional distribution of the total sum ∑_{i∈ℕ} X_i given that a subset sum ∑_{j∈A} X_j exceeds a certain threshold value t > 0, and vice versa. Moreover, we investigate the characteristic tail behavior of these conditional distributions for t → ∞. Finally, we illustrate how our probabilistic results can be applied in practice by providing examples from both reliability theory and risk management.

http://hdl.handle.net/2003/38530 (2020-01-01)
Title: Prediction in locally stationary time series
Authors: Dette, Holger; Wu, Weichi
Abstract: We develop an estimator for the high-dimensional covariance matrix of a locally
stationary process with a smoothly varying trend and use this statistic to derive consistent
predictors in non-stationary time series. In contrast to the currently available
methods for this problem, the predictor developed here does not rely on fitting an
autoregressive model and does not require a vanishing trend. The finite sample properties
of the new methodology are illustrated by means of a simulation study and a
data example.

http://hdl.handle.net/2003/38386 (2019-01-01)
Title: Detecting structural breaks in eigensystems of functional time series
Authors: Dette, Holger; Kutta, Tim
Abstract: Detecting structural changes in functional data is a prominent topic in statistical
literature. However, not all trends in the data are important in applications, but only
those of large enough influence. In this paper we address the problem of identifying
relevant changes in the eigenfunctions and eigenvalues of covariance kernels of L^2[0,1]-
valued time series. By self-normalization techniques we derive pivotal, asymptotically
consistent tests for relevant changes in these characteristics of the second order structure
and investigate their finite sample properties in a simulation study. The applicability of
our approach is demonstrated by analyzing German annual temperature data.

http://hdl.handle.net/2003/38379 (2019-01-01)
Title: Equivalence tests for binary efficacy-toxicity responses
Authors: Möllenhoff, Kathrin; Dette, Holger; Bretz, Frank
Abstract: Clinical trials often aim to compare a new drug with a reference treatment in terms of efficacy and/or toxicity depending on covariates such as, for example, the dose level of the drug. Equivalence of these treatments can be claimed if the difference in average outcome is below a certain threshold over the covariate range. In this paper we assume that the efficacy and toxicity of the treatments are measured as binary outcome variables and we address two problems. First, we develop a new test procedure for the assessment of equivalence of two treatments over the entire covariate range for a single binary endpoint. Our approach is based on a parametric bootstrap, which generates data under the constraint that the distance between the curves is equal to the pre-specified equivalence threshold. Second, we address equivalence for bivariate binary (correlated) outcomes by extending the previous approach for a univariate response. For this purpose we use a 2-dimensional Gumbel model for binary efficacy-toxicity responses. We investigate the operating characteristics of the proposed approaches by means of a simulation study and present a case study as an illustration.

http://hdl.handle.net/2003/38260 (2019-01-01)
Title: Convergence of spectral density estimators in the locally stationary framework
Authors: Kawka, Rafael
Abstract: Locally stationary processes are characterised by spectral densities that are functions
of rescaled time. We study the asymptotic properties of spectral density
estimators in the locally stationary framework. In particular, we show that for a
locally stationary process with time-varying spectral density function f(u, λ), standard
spectral density estimators consistently estimate the time-averaged spectral
density ∫₀¹ f(u, λ) du. This result is complemented by some illustrative examples
and applications, including HAC inference in the multiple linear regression model
and a simple visual tool for the detection of unconditional heteroskedasticity.

http://hdl.handle.net/2003/38259 (2019-01-01)
Title: Steuer versus Emissionshandel: Optionen für die Ausgestaltung einer CO2-Bepreisung
Authors: Frondel, Manuel
Abstract: In the view of economists, greenhouse gases in Europe can be abated most cost-efficiently by extending the EU emissions trading system, which so far covers only the energy sector and industry, to all sectors not yet integrated into it. Extending emissions trading, however, requires majorities in the European Union. As long as this extension does not win the approval of all member states, the introduction of a national CO2 price in these sectors could be considered, implemented in principle in two ways: via emissions trading, established either as a separate national trading system or through an opt-in of Germany's not yet integrated sectors into the existing EU emissions trading system, or via the introduction of a national CO2 tax. The weighing of the advantages and disadvantages of the two options, CO2 tax versus emissions trading, undertaken in this contribution shows that a CO2 tax has serious drawbacks, above all a lack of precision in meeting given emission targets.

http://hdl.handle.net/2003/38258 (2019-01-01)
Title: Cognitive reflection and the valuation of energy efficiency
Authors: Andor, Mark A.; Frondel, Manuel; Gerster, Andreas; Sommer, Stephan
Abstract: Based on a stated-choice experiment among about 3,600 German household
heads on the purchase of electricity-using durables, this paper explores the impact
of cognitive reflection on consumers’ valuation of energy efficiency, as well as its
interaction with consumers’ response to the EU energy label. Using a standard
cognitive reflection test, our results indicate that consumers with low cognitive
reflection scores value energy efficiency less than those with high scores. Furthermore,
we find that consumers with a low level of cognitive reflection respond more
strongly to grade-like energy efficiency classes than to detailed information on
annual energy use.

http://hdl.handle.net/2003/38256 (2019-01-01)
Title: Two-sample tests for relevant differences in the eigenfunctions of covariance operators
Authors: Aue, Alexander; Dette, Holger; Rice, Gregory
Abstract: This paper deals with two-sample tests for functional time series data, which have become widely
available in conjunction with the advent of modern complex observation systems. Here, particular interest
is in evaluating whether two sets of functional time series observations share the shape of their primary
modes of variation as encoded by the eigenfunctions of the respective covariance operators. To this end,
a novel testing approach is introduced that connects with, and extends, existing literature in two main
ways. First, tests are set up in the relevant testing framework, where interest is not in testing an exact
null hypothesis but rather in detecting deviations deemed sufficiently relevant, with relevance determined
by the practitioner and perhaps guided by domain experts. Second, the proposed test statistics rely on
a self-normalization principle that helps to avoid the notoriously difficult task of estimating the long-run
covariance structure of the underlying functional time series. The main theoretical result of this paper is
the derivation of the large-sample behavior of the proposed test statistics. Empirical evidence, indicating
that the proposed procedures work well in finite samples and compare favorably with competing methods,
is provided through a simulation study, and an application to annual temperature data.

http://hdl.handle.net/2003/38224 (2019-09-11)
Title: A generalized method of moments estimator for structural vector autoregressions based on higher moments
Authors: Keweloh, Alexander Sascha
Abstract: I propose a generalized method of moments estimator for structural vector
autoregressions with independent and non-Gaussian shocks. The shocks are
identified by exploiting information contained in higher moments of the
data. Extending the standard identification approach, which relies on the
covariance, to the coskewness and cokurtosis makes it possible to identify and
estimate the simultaneous interaction without any further restrictions. I
analyze the finite sample properties of the estimator and apply it to
illustrate the simultaneous interaction between economic activity, oil and
stock prices.

http://hdl.handle.net/2003/38213 (2019-01-01)
Title: Efficient model-based bioequivalence testing
Authors: Möllenhoff, Kathrin; Loingeville, Florence; Bertrand, Julie; Nguyen, Thu Thuy; Sharan, Satish; Sun, Guoying; Grosser, Stella; Zhao, Liang; Fang, Lanyan; Mentré, France; Dette, Holger
Abstract: The classical approach to analyze pharmacokinetic (PK) data in bioequivalence studies
aiming to compare two different formulations is to perform noncompartmental analysis
(NCA) followed by two one-sided tests (TOST). In this regard the PK parameters AUC
and Cmax are obtained for both treatment groups and their geometric mean ratios are
considered. According to current guidelines by the U.S. Food and Drug Administration
and the European Medicines Agency the formulations are deemed to be similar if the
90% confidence interval for these ratios falls between 0.8 and 1.25. As NCA is not a
reliable approach in case of sparse designs, a model-based alternative has already been
proposed for the estimation of AUC and Cmax using non-linear mixed effects models.
Here we propose an alternative to the TOST, called BOT, and evaluate it through a
simulation study for both NCA and model-based approaches. For products with high
variability in PK parameters, this method appears to have type I errors closer to the
conventionally accepted significance level of 0.05, suggesting its potential use in situations
where conventional bioequivalence analysis is not applicable.

http://hdl.handle.net/2003/38207 (2019-01-01)
Title: A note on Herglotz’s theorem for time series on function spaces
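The TOST acceptance rule referenced in the abstract above, concluding bioequivalence when the 90% confidence interval for the geometric mean ratio lies within [0.8, 1.25], can be sketched on the log scale. This is a simplified illustration, not the paper's procedure: it uses a large-sample normal quantile in place of the t quantile, and the function name is an assumption.

```python
import math
from statistics import NormalDist, mean

def tost_decision(log_test, log_ref, lo=0.8, hi=1.25):
    """90% CI for the geometric mean ratio test/reference of a log-normal
    PK parameter (e.g. AUC); equivalence is concluded if the CI lies
    strictly within [lo, hi]. Normal quantile used for simplicity."""
    d = mean(log_test) - mean(log_ref)          # log geometric mean ratio
    n1, n2 = len(log_test), len(log_ref)
    sq = lambda xs: sum((x - mean(xs)) ** 2 for x in xs)
    s2 = (sq(log_test) + sq(log_ref)) / (n1 + n2 - 2)   # pooled variance
    se = math.sqrt(s2 * (1 / n1 + 1 / n2))
    z = NormalDist().inv_cdf(0.95)              # 90% CI = two 5% one-sided tests
    ci = (math.exp(d - z * se), math.exp(d + z * se))
    return ci, lo < ci[0] and ci[1] < hi
```

The equivalence of the 90% CI inclusion rule with two one-sided 5% tests is what gives the TOST its name.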
Authors: van Delft, Anne; Eichler, Michael
Abstract: In this article, we prove Herglotz’s theorem for Hilbert-valued time series. This requires the notion of an operator-valued measure, which we make precise for our setting. Herglotz’s theorem for functional time series allows us to generalize existing results that are central to frequency domain analysis on the function space. In particular, we use this result to prove the existence of a functional Cramér representation of a large class of processes, including those with jumps in the spectral distribution and long-memory processes. We furthermore obtain an optimal finite dimensional reduction of the time series under weaker assumptions than available in the literature. The results of this paper therefore enable Fourier analysis for processes whose spectral density operator does not necessarily exist.

http://hdl.handle.net/2003/38206 (2019-01-01)
Title: Testing for stationarity of functional time series in the frequency domain
Authors: Aue, Alexander; van Delft, Anne
Abstract: Interest in functional time series has spiked in the recent past, with papers covering both methodology and applications being published at a much increased pace. This article contributes to the research in this area by proposing a new stationarity test for functional time series based on frequency domain methods. The proposed test statistic is based on joint dimension reduction via functional principal components analysis across the spectral density operators at all Fourier frequencies, explicitly allowing for frequency-dependent levels of truncation to adapt to the dynamics of the underlying functional time series. The properties of the test are derived both under the null hypothesis of stationary functional time series and under the smooth alternative of locally stationary functional time series. The methodology is theoretically justified through asymptotic results. Evidence from simulation studies and an application to annual temperature curves suggests that the test works well in finite samples.

http://hdl.handle.net/2003/38205 (2019-01-01)
Title: A note on quadratic forms of stationary functional time series under mild conditions
Authors: van Delft, Anne
Abstract: We study the distributional properties of a quadratic form of a stationary functional time series under mild moment conditions. As an important application, we obtain consistency rates of estimators of spectral density operators and prove joint weak convergence to a vector of complex Gaussian random operators. Weak convergence is established based on an approximation of the form via transforms of Hilbert-valued martingale difference sequences. As a side result, the distributional properties of the long-run covariance operator are established.

http://hdl.handle.net/2003/38204 (2019-01-01)
Title: Sampling distributions of optimal portfolio weights and characteristics in low and large dimensions
Authors: Bodnar, Taras; Dette, Holger; Parolya, Nestor; Thorsén, Erik
Abstract: Optimal portfolio selection problems are determined by the (unknown) parameters of
the data generating process. If an investor wants to realise the position suggested by the
optimal portfolios, he/she needs to estimate the unknown parameters and to account for
parameter uncertainty in the decision process. Most often, the parameters of interest
are the population mean vector and the population covariance matrix of the asset
return distribution. In this paper we characterise the exact sampling distribution of the
estimated optimal portfolio weights and their characteristics by deriving their sampling
distribution, which is presented in terms of a stochastic representation. This approach
possesses several advantages: (i) it determines the sampling distribution of the estimated
optimal portfolio weights by expressions which can be used to draw samples from this
distribution efficiently; (ii) the application of the derived stochastic representation
provides an easy way to obtain the asymptotic approximation of the sampling distribution.
The latter property is used to show that the high-dimensional asymptotic distribution
of optimal portfolio weights is multivariate normal and to determine its parameters.
Moreover, a consistent estimator of optimal portfolio weights and their characteristics
is derived under high-dimensional settings. Via an extensive simulation study, we
investigate the finite-sample performance of the derived asymptotic approximation and
study its robustness to the violation of the model assumptions used in the derivation of
the theoretical results.

http://hdl.handle.net/2003/38196 (2019-01-01)
Title: Identifying shifts between two regression curves
Authors: Dette, Holger; Sankar Dhar, Subhra; Wu, Weichi
Abstract: This article studies the question whether two convex (concave) regression functions
modelling the relation between a response and a covariate in two samples differ by a shift
in the horizontal and/or vertical axis. We consider a nonparametric situation assuming
only smoothness of the regression functions. A graphical tool based on the derivatives
of the regression functions and their inverses is proposed to answer this question and
studied in several examples. We also formalize this question as a corresponding hypothesis
and develop a statistical test. The asymptotic properties of the corresponding
test statistic are investigated under the null hypothesis and local alternatives. In contrast
to most of the literature on comparing shape invariant models, which requires
independent data, the procedure is applicable to dependent and non-stationary data.
We also illustrate the finite sample properties of the new test by means of a small
simulation study and a real data example.

http://hdl.handle.net/2003/38195 (2019-01-01)
Title: Prediction in regression models with continuous observations
Authors: Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly
Abstract: We consider the problem of predicting values of a random process or field satisfying a linear model y(x) = θ⊤f(x) + ε(x), where the errors ε(x) are correlated. This is a common problem in kriging, where the case of discrete observations is standard. By focussing on the case of continuous observations, we derive expressions for the best linear unbiased predictors and their mean squared error. Our results are also applicable in the case where the derivatives of the process y are available, and either a response or one of its derivatives needs to be predicted. The theoretical results are illustrated by several examples, in particular for the popular Matérn 3/2 kernel.

http://hdl.handle.net/2003/38165 (2019-01-01)
Title: Volatility forecasting accuracy for Bitcoin
Authors: Köchling, Gerrit; Schmidtke, Philipp; Posch, Peter N.
Abstract: We analyse the quality of Bitcoin volatility forecasting of GARCH-type
models applying the commonly used volatility proxy based on squared daily
returns as well as a jump-robust proxy based on intra-day returns and vary
the degrees of asymmetry in robust loss functions. We construct model
confidence sets (MCS) which contain superior models with a high probability
and find them to be systematically smaller for asymmetric loss functions
and the jump-robust proxy. Our findings suggest a cautious use of GARCH
models in forecasting Bitcoin's volatility.

http://hdl.handle.net/2003/38137 (2019-01-01)
Title: Optimal designs for estimating individual coefficients in polynomial regression with no intercept
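The two volatility proxies contrasted in the Bitcoin abstract above can be written down directly; the bipower-variation form of the jump-robust proxy shown here is one standard choice and is an assumption, not necessarily the exact proxy used by the authors:

```python
import math

def realized_variance(intraday_returns):
    """Sum of squared intra-day returns: a finer-grained analogue of the
    squared-daily-return proxy, but still inflated by price jumps."""
    return sum(r * r for r in intraday_returns)

def bipower_variation(intraday_returns):
    """(pi/2) * sum of |r_i| * |r_{i-1}|: jump-robust, since a single
    large jump enters only linearly through its two neighbouring terms."""
    pairs = zip(intraday_returns[1:], intraday_returns[:-1])
    return (math.pi / 2) * sum(abs(a) * abs(b) for a, b in pairs)
```

On a day with one large jump, realized variance is dominated by the squared jump, while bipower variation stays close to the diffusive level.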
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In a seminal paper Studden (1968) characterized c-optimal designs in regression
models, where the regression functions form a Chebyshev system. He used these
results to determine the optimal design for estimating the individual coefficients in a
polynomial regression model on the interval [−1, 1] explicitly. In this note we identify
the optimal design for estimating the individual coefficients in a polynomial regression
model with no intercept (here the regression functions do not form a Chebyshev
system).

http://hdl.handle.net/2003/38088 (2019-01-01)
Title: Financial risk measures for a network of individual agents holding portfolios of light-tailed objects
Authors: Klüppelberg, Claudia; Seifert, Miriam Isabel
Abstract: We investigate a financial network of agents holding portfolios of independent
light-tailed risky objects whose losses are asymptotically exponentially
distributed with distinct tail parameters. We show that the
asymptotic distributions of portfolio losses belong to the class of functional
exponential mixtures which we introduce in this paper. We also
provide statements for Value-at-Risk and Expected Shortfall risk measures
as well as for their conditional counterparts. Compared to heavy-tail
settings, we establish important qualitative differences in the asymptotic
behavior of portfolio risks under a light-tail assumption which have
to be accounted for in practical risk management.

http://hdl.handle.net/2003/38081 (2019-01-01)
Title: A new approach for open-end sequential change point monitoring
Authors: Gösmann, Josua; Kley, Tobias; Dette, Holger
Abstract: We propose a new sequential monitoring scheme for changes in the parameters of
a multivariate time series. In contrast to procedures proposed in the literature which
compare an estimator from the training sample with an estimator calculated from the
remaining data, we suggest dividing the sample at each time point after the training
sample. Estimators from the sample before and after all separation points are then
continuously compared by calculating a maximum of norms of their differences. For open-end
scenarios our approach yields an asymptotic level α procedure, which is consistent
under the alternative of a change in the parameter.

http://hdl.handle.net/2003/38076 (2019-01-01)
Title: Wirtschaftliche Aktivität und Emissionen: Die Umweltkuznetskurve
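The split-at-every-point comparison described in the monitoring abstract above can be caricatured for a univariate mean. This sketch omits the weighting and threshold function that calibrate the asymptotic level, so it illustrates the idea only, and the function name is an assumption:

```python
def max_split_detector(sample, m):
    """For each split point k after the training sample of size m, compare
    the mean estimated from observations before k with the mean estimated
    from observations after k; return the maximal absolute difference."""
    n = len(sample)
    diffs = []
    for k in range(m, n):
        pre = sum(sample[:k]) / k
        post = sum(sample[k:]) / (n - k)
        diffs.append(abs(pre - post))
    return max(diffs)
```

A change point in the mean produces at least one split at which the pre- and post-split estimates differ strongly, which is what makes the maximum over all splits a natural detector.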
Authors: Wagner, Martin; Knorre, Fabian
Abstract: Since the beginning of the industrial revolution, the mean global temperature has risen by about one degree Celsius. There is no doubt that this increase is to a substantial extent driven by human activities, namely by emissions of carbon dioxide and other greenhouse gases. What do the relationships between economic activity and emissions look like? Do emissions necessarily rise with growing economic activity? In this chapter we highlight some fundamental problems that arise in the statistical (more precisely, econometric) analysis of these relationships. These problems are symptomatic of economic relationships and one reason why econometrics has established itself as a discipline in its own right.

http://hdl.handle.net/2003/38046 (2019-01-01)
Title: Limit theorems for locally stationary processes
Authors: Kawka, Rafael
Abstract: We present limit theorems for locally stationary processes that have a one sided
time-varying moving average representation. In particular, we prove a central limit
theorem (CLT), a weak and a strong law of large numbers (WLLN, SLLN) and a
law of the iterated logarithm (LIL) under mild assumptions that are closely related
to those originally imposed by Dahlhaus and Polonik (2006).

http://hdl.handle.net/2003/38039 (2019-01-01)
Title: Some explicit solutions of c-optimal design problems for polynomial regression
Authors: Dette, Holger; Melas, Viatcheslav B.; Shpilev, Petr
Abstract: In this paper we consider the optimal design problem for extrapolation and estimation
of the slope at a given point, say z, in a polynomial regression with no intercept.
We provide explicit solutions of these problems in many cases and characterize those
values of z, where this is not possible.

http://hdl.handle.net/2003/38014 (2019-01-01)
Title: On scale estimation under shifts in the mean
Authors: Axt, Ieva; Fried, Roland
Abstract: In many situations it is crucial to estimate the variance properly. Ordinary variance estimators
perform poorly in the presence of shifts in the mean. We investigate an approach
based on non-overlapping blocks, which yields good results in this change-point scenario.
We show the strong consistency and the asymptotic normality of such blocks-estimators
of the variance under rather general conditions. For estimation of the standard deviation
a blocks-estimator based on average standard deviations turns out to be preferable over
the square root of the average variances. We provide recommendations on the appropriate
choice of the block size and compare this blocks-approach with difference-based
estimators. If level shifts occur rather frequently even better results can be obtained by
adaptive trimming of the blocks under the assumption of normality.

http://hdl.handle.net/2003/37979 (2019-01-01)
Title: Optimal designs for model averaging in non-nested models
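A minimal version of the non-overlapping blocks variance estimator discussed in the abstract above can be sketched as follows; the block alignment and the silent dropping of a leftover partial block are simplifications:

```python
def blocks_variance(x, b):
    """Average the sample variances of non-overlapping blocks of length b.
    A level shift in the mean contaminates only the block(s) containing
    the change point, unlike the ordinary overall sample variance."""
    blocks = [x[i:i + b] for i in range(0, len(x) - b + 1, b)]
    def sample_var(block):
        m = sum(block) / len(block)
        return sum((v - m) ** 2 for v in block) / (len(block) - 1)
    return sum(sample_var(block) for block in blocks) / len(blocks)
```

For a series with one large level shift, the ordinary sample variance is inflated by the shift, while the blocks estimator stays close to the within-regime variance.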
Authors: Alhorn, Kira; Dette, Holger; Schorning, Kirsten
Abstract: In this paper we construct optimal designs for frequentist model averaging estimation.
We derive the asymptotic distribution of the model averaging estimate with fixed weights
in the case where the competing models are non-nested and none of these models is correctly
specified. A Bayesian optimal design minimizes an expectation of the asymptotic
mean squared error of the model averaging estimate calculated with respect to a suitable
prior distribution. We demonstrate that Bayesian optimal designs can improve the
accuracy of model averaging substantially. Moreover, the derived designs also improve
the accuracy of estimation in a model selected by model selection, as well as of model averaging
estimates with random weights.

Title: WTA-WTP disparity: The role of perceived realism of the valuation setting
URI: http://hdl.handle.net/2003/37944
Date: 2019-01-01
Authors: Frondel, Manuel; Sommer, Stephan; Tomberg, Lukas
Abstract: Based on a survey among more than 5,000 German households and a single-binary
choice experiment in which we randomly split the respondents into two groups, this
paper elicits both households’ willingness to pay (WTP) for power supply security
and their willingness to accept (WTA) compensations for a reduced security level.
In accord with numerous empirical studies, we find that the mean WTA value substantially
exceeds the mean WTP bid, in our empirical example by a factor of 3.56.
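Read as plain arithmetic, the reported disparity is simply the ratio of the two group means. A toy illustration in which only the 3.56 factor comes from the abstract; the euro amounts are hypothetical:

```python
# Hypothetical group means chosen so their ratio matches the reported factor.
mean_wta = 35.6  # mean compensation demanded (WTA), in euros
mean_wtp = 10.0  # mean willingness to pay (WTP), in euros

ratio = mean_wta / mean_wtp
print(round(ratio, 2))  # 3.56
```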
Yet, the WTA-WTP ratio decreases to 2.35 among respondents who believe that the
hypothetical valuation setting is likely to become true. Conversely, the WTA-WTP
ratio increases to 3.81 among respondents who deem the setting unlikely. Given this
discrepancy, we conclude that to diminish the WTA-WTP disparity resulting from
stated-preference surveys at least to some extent, inquiring about respondents’ perception
of the realism of the valuation setting is an essential element of any survey
design.

Title: Employee representation and innovation – disentangling the effect of legal and voluntary representation institutions in Germany
URI: http://hdl.handle.net/2003/37916
Date: 2019-01-01
Authors: Kraft, Kornelius; Lammers, Alexander
Abstract: This paper studies the effect of employee representation bodies provided by management on product and process innovations. In contrast to statutory forms of co-determination such as works councils, participative practices initiated by management are not equipped with any legally granted rights at all. Such alternative forms of employee representation are far less frequently and thoroughly analyzed than works councils. We compare the effects of these co-determination institutions established voluntarily with those initiated on a legal basis on different kinds of innovation measures. We differentiate between process and product (incremental and radical) innovations. To tackle endogeneity, the estimations are based on recursive bivariate and multivariate probit models. Results show that employee representation provided voluntarily by management supports incremental as well as radical product and process innovations. The effect is much more pronounced when endogeneity is taken into account. Works councils, however, only exhibit a positive effect on incremental innovations. Moreover, the results point to a substitutive relationship between both types of employee representation.

Title: Equivalence of regression curves sharing common parameters
URI: http://hdl.handle.net/2003/37915
Date: 2019-01-01
Authors: Möllenhoff, Kathrin; Bretz, Frank; Dette, Holger
Abstract: In clinical trials the comparison of two different populations is a frequently addressed
problem. Non-linear (parametric) regression models are commonly used to
describe the relationship between covariates, such as the dose, and a response variable in
the two groups. In some situations it is reasonable to assume some model parameters
to be the same, for instance the placebo effect or the maximum treatment effect. In
this paper we develop a (parametric) bootstrap test to establish the similarity of two
regression curves sharing some common parameters. We show by theoretical arguments
and by means of a simulation study that the new test controls its level and
achieves a reasonable power. Moreover, it is demonstrated that under the assumption
of common parameters a considerably more powerful test can be constructed compared
to the test which does not use this assumption. Finally, we illustrate potential
applications of the new methodology by a clinical trial example.

Title: The empirical process of residuals from an inverse regression
URI: http://hdl.handle.net/2003/37904
Date: 2019-01-01
Authors: Kutta, Tim; Bissantz, Nicolai; Chown, Justin; Dette, Holger
Abstract: In this paper we investigate an indirect regression model characterized by the
Radon transformation. This model is useful for recovery of medical images obtained by computed tomography scans. The indirect regression function is estimated using a series estimator
motivated by a spectral cut-off technique. Further, we investigate the empirical process of
residuals from this regression, and show that it satisfies a functional central limit theorem.
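The spectral cut-off idea can be sketched on a toy indirect model: estimate the empirical series coefficients of the transformed function, invert the operator's eigenvalues, and discard frequencies beyond a cut-off where the inversion blows up. Everything below is an illustrative assumption; the paper's model uses the Radon transform, not this diagonal toy operator, and all names are hypothetical:

```python
import numpy as np

def cosine_basis(k, x):
    """Orthonormal cosine basis on [0, 1] (for x uniform on [0, 1])."""
    return np.ones_like(x) if k == 0 else np.sqrt(2.0) * np.cos(np.pi * k * x)

def spectral_cutoff_coefs(y, x, eigvals, cutoff):
    """Series coefficients of f in the toy indirect model
    y_i = (A f)(x_i) + eps_i, with A diagonal in the cosine basis.

    Sketch under strong assumptions: a known diagonal operator stands in
    for the Radon transform used in the paper.
    """
    # Empirical coefficient of g = A f, then eigenvalue inversion; the
    # cut-off discards high frequencies where 1 / eigvals[k] is large.
    return np.array([np.mean(y * cosine_basis(k, x)) / eigvals[k]
                     for k in range(cutoff)])

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 5000)
eigvals = 1.0 / (1.0 + np.arange(10)) ** 2  # decaying spectrum of A
# True f is cosine_basis(1, .), i.e. coefficients (0, 1, 0, ...).
y = eigvals[1] * cosine_basis(1, x) + rng.normal(0.0, 0.1, x.size)
print(spectral_cutoff_coefs(y, x, eigvals, cutoff=3))  # roughly [0, 1, 0]
```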