Eldorado Community: http://hdl.handle.net/2003/9

URI: http://hdl.handle.net/2003/36169
Date: 2017-11-08
Title: A test for separability in covariance operators of random surfaces
Authors: Bagchi, Pramita; Dette, Holger
Abstract: The assumption of separability is a simplifying and very popular assumption in
the analysis of spatio-temporal or hypersurface data structures. It is often made in
situations where the covariance structure cannot be easily estimated, for example
because of a small sample size or because of computational storage problems. In
this paper we propose a new and very simple test to validate this assumption. Our
approach is based on a measure of separability which is zero in the case of separability
and positive otherwise. The measure can be estimated without calculating
the full non-separable covariance operator. We prove asymptotic normality of the
corresponding statistic with a limiting variance, which can easily be estimated from
the available data. As a consequence quantiles of the standard normal distribution
can be used to obtain critical values and the new test of separability is very easy to
implement. In particular, our approach requires neither projections onto subspaces
generated by the eigenfunctions of the covariance operator, nor resampling
procedures to obtain critical values, nor distributional assumptions as recently used
by Aston et al. (2017) and Constantinou et al. (2017) to construct tests for separability.
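A test of this form reduces to comparing a standardized statistic with a standard-normal quantile. The sketch below is ours, not the authors' implementation; `measure_hat`, `sigma_hat` and `n` are placeholder names for the estimated separability measure, its estimated limiting standard deviation and the sample size.

```python
import math

def normal_quantile(p, tol=1e-10):
    """Standard normal quantile, computed by bisection on the normal CDF."""
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        cdf = 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0)))
        if cdf < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def separability_test(measure_hat, sigma_hat, n, alpha=0.05):
    """Reject separability if the standardized measure exceeds z_{1-alpha}."""
    z = math.sqrt(n) * measure_hat / sigma_hat
    return z > normal_quantile(1.0 - alpha)

# Under separability the measure is zero, so a small standardized value
# should not lead to rejection, while a large one should.
print(separability_test(0.01, 1.0, 100), separability_test(0.5, 1.0, 100))
```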
We investigate the finite sample performance by means of a simulation study
and also provide a comparison with the currently available methodology. Finally,
the new procedure is illustrated analyzing wind speed and temperature data.

URI: http://hdl.handle.net/2003/36168
Date: 2017-11-08
Title: Optimal designs for regression with spherical data
Authors: Dette, Holger; Konstantinou, Maria; Schorning, Kirsten; Gösmann, Josua
Abstract: In this paper optimal designs for regression problems with spherical predictors of
arbitrary dimension are considered. Our work is motivated by applications in materials
science, where crystallographic textures such as the misorientation distribution
or the grain boundary distribution (depending on a four dimensional spherical predictor)
are represented by series of hyperspherical harmonics, which are estimated
from experimental or simulated data.
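A uniform design on the m-dimensional sphere, which the paper shows to be optimal for this setting, is easy to sample: normalized independent Gaussian vectors are uniformly distributed on the sphere. This sketch is ours, not the paper's construction; m = 3 matches the four-dimensional predictor mentioned above.

```python
import numpy as np

def uniform_sphere_design(n_points, m, seed=0):
    """Draw n_points uniformly distributed on the m-sphere embedded in R^{m+1}."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_points, m + 1))
    # Normalizing a rotation-invariant distribution yields the uniform law on the sphere.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

design = uniform_sphere_design(1000, m=3)
print(design.shape)  # each of the 1000 rows is a unit vector in R^4
```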
For this type of estimation problems we explicitly determine optimal designs with
respect to Kiefer's Φp-criteria and a class of orthogonally invariant information criteria
recently introduced in the literature. In particular, we show that the uniform
distribution on the m-dimensional sphere is optimal and construct discrete and implementable
designs with the same information matrices as the continuous optimal
designs. Finally, we illustrate the advantages of the new designs for series estimation
by hyperspherical harmonics, which are symmetric with respect to the first and
second crystallographic point group.

URI: http://hdl.handle.net/2003/36129
Date: 2017-10-19
Title: Functional data analysis in the Banach space of continuous functions
Authors: Dette, Holger; Kokot, Kevin; Aue, Alexander
Abstract: Functional data analysis is typically conducted within the L2-Hilbert space framework. There is by now a fully developed statistical toolbox allowing for the principled application of the functional data machinery to real-world problems, often based on dimension reduction techniques such as functional principal component analysis. At the same time, there have recently been a number of publications that sidestep dimension reduction steps and focus on a fully functional L2-methodology. This paper goes one step further and develops data analysis methodology for functional time series in the space of all continuous functions. The work is motivated by the fact that objects with rather different shapes may still have a small L2-distance and are therefore identified as similar when using an L2-metric. However, in applications it is often desirable to
use metrics reflecting the visualization of the curves in the statistical analysis. The methodological contributions are focused on developing two-sample and change-point tests as well as confidence bands, as these procedures appear to be conducive to the proposed setting. Particular interest is put on relevant differences; that is, on not trying to test for exact equality, but rather for pre-specified deviations under the null hypothesis.
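The motivating observation, that curves can be close in L2 yet far apart in the sup-norm of C[0,1], is easy to reproduce numerically. The example below is ours, not taken from the paper: a flat curve and a curve with a narrow spike.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10001)
dt = t[1] - t[0]

f = np.zeros_like(t)                             # flat curve
g = np.where(np.abs(t - 0.5) < 0.001, 1.0, 0.0)  # narrow spike of height 1

# Riemann-sum approximation of the L2 distance versus the sup distance
l2_dist = np.sqrt(np.sum((f - g) ** 2) * dt)
sup_dist = np.max(np.abs(f - g))
print(l2_dist, sup_dist)  # L2 distance is tiny, sup distance equals 1
```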
The procedures are justified through large-sample theory. To ensure practicability, nonstandard bootstrap procedures are developed and investigated, addressing particular features that arise in the problem of testing relevant hypotheses. The finite sample properties are explored through a simulation study and an application to annual temperature profiles.

URI: http://hdl.handle.net/2003/36099
Date: 2017-09-15
Title: Optimal designs for enzyme inhibition kinetic models
Authors: Schorning, Kirsten; Dette, Holger; Kettelhake, Katrin; Möller, Tilman
Abstract: In this paper we present a new method for determining optimal designs for enzyme
inhibition kinetic models, which are used to model the influence of the concentration of a
substrate and an inhibitor on the velocity of a reaction. The approach uses a nonlinear
transformation of the vector of predictors such that the model in the new coordinates is
given by an incomplete response surface model. Although no explicit solutions
of the optimal design problem for incomplete response surface models exist so far, the
corresponding design problem in the new coordinates is substantially more transparent, such
that explicit or numerical solutions can be determined more easily. The designs for the
original problem can finally be found by an inverse transformation of the optimal designs
determined for the response surface model. We illustrate the method by determining explicit
solutions for the D-optimal design and for the optimal design problem for estimating the
individual coefficients in a non-competitive enzyme inhibition kinetic model.

URI: http://hdl.handle.net/2003/36098
Date: 2017-09-15
Title: Combining cumulative sum change-point detection tests for assessing the stationarity of univariate time series
Authors: Bücher, Axel; Fermanian, Jean-David; Kojadinovic, Ivan
Abstract: We derive tests of stationarity for continuous univariate time series by combining change-point
tests sensitive to changes in the contemporary distribution with tests sensitive to
changes in the serial dependence. Rank-based cumulative sum tests based on the empirical
distribution function and on the empirical autocopula at a given lag are considered first.
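As a rough illustration of the first ingredient, a CUSUM statistic built on the empirical distribution function can be computed as below. This is our simplified sketch, not the authors' exact rank-based statistic.

```python
import numpy as np

def cusum_edf_statistic(x):
    """Max-CUSUM of centred indicators 1{x_i <= x_j}; large values suggest a change."""
    n = len(x)
    ind = (x[:, None] <= x[None, :]).astype(float)  # ind[i, j] = 1{x_i <= x_j}
    fn = ind.mean(axis=0)                           # empirical df at the sample points
    partial = np.cumsum(ind - fn, axis=0)           # CUSUM over the time index
    return np.max(np.abs(partial)) / np.sqrt(n)

rng = np.random.default_rng(1)
no_change = rng.normal(size=200)
with_change = np.concatenate([rng.normal(size=100), rng.normal(3.0, size=100)])
print(cusum_edf_statistic(no_change), cusum_edf_statistic(with_change))
```

A shift in the contemporary distribution inflates the statistic markedly relative to the stationary sample.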
The combination of their dependent p-values relies on a joint dependent multiplier bootstrap
of the two underlying statistics. Conditions under which the proposed combined testing
procedure is asymptotically valid under stationarity are provided. After discussing the
choice of the maximum lag to investigate, extensions based on tests solely focusing on second-order
characteristics are proposed. The finite-sample behaviors of all the derived statistical
procedures are investigated in large-scale Monte Carlo experiments and illustrations on two
real data sets are provided. Extensions to multivariate time series are briefly discussed as
well.

URI: http://hdl.handle.net/2003/36083
Date: 2017-09-06
Title: A nonparametric test for stationarity in functional time series
Authors: van Delft, Anne; Bagchi, Pramita; Characiejus, Vaidotas; Dette, Holger
Abstract: We propose a new measure for stationarity of a functional time series, which is based on an explicit representation of the L2-distance between the spectral density operator of a non-stationary process and its best (L2-)approximation by a spectral density operator corresponding to a stationary process. This distance can easily be estimated by sums of Hilbert-Schmidt inner products of periodogram operators (evaluated at different frequencies), and asymptotic normality of an appropriately standardised version of the estimator can be established under both the null hypothesis and the alternative. As a result we obtain confidence intervals for the discrepancy of the underlying process from a stationary functional process and a simple asymptotic frequency domain level α test (using the quantiles of the normal distribution) for the hypothesis of stationarity of a functional time series. Moreover, the new methodology also allows testing precise hypotheses of the form "the functional time series is approximately stationary", meaning that the new measure of stationarity is smaller than a given threshold. Thus, in contrast to methods proposed in the literature, our approach also allows testing for "relevant" deviations from stationarity. We demonstrate in a small simulation study that the new method has very good finite sample properties and compare it with the currently available alternative procedures. Moreover, we apply our test to annual temperature curves.

URI: http://hdl.handle.net/2003/36037
Date: 2017-07-28
Title: Behavioral economics and energy conservation - a systematic review of nonprice interventions and their causal effects
Authors: Andor, Mark; Fels, Katja
Abstract: Research from economics and psychology suggests that behavioral
interventions can be a powerful climate policy instrument. This paper
provides a systematic review of the existing empirical evidence on non-price
interventions targeting energy conservation behavior of private households.
Specifically, we analyze the four nudge-like interventions referred to as social
comparison, pre-commitment, goal setting and labeling in 38 international
studies comprising 91 treatments. This paper differs from previous systematic
reviews by solely focusing on studies that permit the identification of causal
effects. We find that all four interventions have the potential to significantly
reduce energy consumption of private households, yet effect sizes vary
immensely. We conclude by emphasizing the importance of impact
evaluations before rolling out behavioral policy interventions at scale.

URI: http://hdl.handle.net/2003/35989
Date: 2017-06-10
Title: A note on conditional versus joint unconditional weak convergence in bootstrap consistency results
Authors: Bücher, Axel; Kojadinovic, Ivan
Abstract: The consistency of a bootstrap or resampling scheme is classically validated by weak convergence of conditional laws. However, when working with stochastic processes in the space of bounded functions and their weak convergence in the Hoffmann-Jørgensen sense, an obstacle occurs: due to possible non-measurability, neither laws nor conditional laws are well-defined. Starting from an equivalent formulation of weak convergence based on the bounded Lipschitz metric, a classical workaround is to formulate bootstrap consistency in terms of the latter distance between what might be called a conditional law of the (non-measurable) bootstrap process and the law of the limiting process. The main contribution of this note is to provide an equivalent formulation of bootstrap consistency in the space of bounded functions which is more intuitive and easier to work with. Essentially, the equivalent formulation consists of (unconditional) weak convergence of the original process jointly with an arbitrarily large number of bootstrap replicates. As a by-product, we provide two equivalent formulations of bootstrap consistency for Rd-valued statistics: the first in terms of (unconditional) weak convergence of the statistic jointly with its bootstrap replicates, the second in terms of convergence in probability of the empirical distribution function of the bootstrap replicates. Finally, the asymptotic validity of bootstrap-based confidence intervals and tests is briefly revisited, with particular emphasis on the, in practice unavoidable, Monte Carlo approximation of conditional quantiles.

URI: http://hdl.handle.net/2003/35988
Date: 2017-06-10
Title: Inference for heavy tailed stationary time series based on sliding blocks
Authors: Bücher, Axel; Segers, Johan
Abstract: The block maxima method in extreme value theory consists of fitting an extreme value distribution to a sample of block maxima extracted from a time series. Traditionally, the maxima are taken over disjoint blocks of observations. Alternatively, the blocks can be chosen to slide through the observation period, yielding a
larger number of overlapping blocks. Inference based on sliding blocks is found to be more efficient than inference based on disjoint blocks. The asymptotic variance of the maximum likelihood estimator of the Fréchet shape parameter is reduced by more than 18%. Interestingly, the amount of the efficiency gain is the same whatever the serial dependence of the underlying time series: as for disjoint blocks, the asymptotic
distribution depends on the serial dependence only through the sequence of scaling constants. The findings are illustrated by simulation experiments and are applied to the estimation of high return levels of the daily log-returns of the Standard & Poor's 500 stock market index.

URI: http://hdl.handle.net/2003/35978
Date: 2017-06-01
Title: Die Gerechtigkeitslücke in der Verteilung der Kosten der Energiewende auf die privaten Haushalte
Authors: Frondel, Manuel; Kutzschbauch, Ole; Sommer, Stephan; Traub, Stefan
Abstract: The Energiewende places growing burdens on consumers. Relative to
their income, these burdens fall more heavily on low-income households than on
high-income households. The results of our empirical survey of more than 11,000
households show, however, that respondents generally favour a distribution of the
costs of the Energiewende that places a comparatively larger share on high-income
households than on low-income households. The justice gap we identify on this
basis, between the desired and the actual cost burden of households, will
presumably widen further as the costs of the Energiewende grow. In principle,
however, this gap could easily be closed, as the empirical estimates of households'
willingness to pay for the promotion of renewables presented in this paper, based
on discrete-choice models, suggest: higher-income households could be required to
contribute more to financing the Energiewende than at present, since according to
our estimates households in the top income third show statistically significantly
higher approval of future increases in the EEG surcharge than households in the
bottom income third.

URI: http://hdl.handle.net/2003/35942
Date: 2017-05-02
Title: The speed of transition revisited
Authors: Naevdal, Eric; Wagner, Martin
Abstract: The speed of transition literature appears to have overlooked the fact that due to the
dynamic nature of the economy, post-transition economic performance already influences optimal
behavior during transition. We illustrate the implications of this neglect
using the well-known model of Aghion and Blanchard (1994, Section 6.4). The correct
solution differs in several respects from the "approximate" solution presented by Aghion
and Blanchard. First, unemployment is increasing up to a certain endogenous point in
time, when, second, the remaining state sector is closed down. This point in time can be
defined as the end of transition. The correct solution is based on transforming the problem
to a type of dynamic optimization problem often encountered in resource economics: a
scrap value problem with free terminal time.

URI: http://hdl.handle.net/2003/35937
Date: 2017-04-25
Title: Consequentiality and the Willingness-To-Pay for Renewables: Evidence from Germany
Authors: Andor, Mark A.; Frondel, Manuel; Horvath, Marco
Abstract: Based on hypothetical responses originating from a large-scale survey among
about 7,000 German households, this study investigates the discrepancy in willingness-to-pay
(WTP) estimates for green electricity across discrete-choice and open-ended valuation
formats, thereby accounting for perceived consequentiality: respondents self-select
into two groups distinguished by their belief in the consequentiality of their
answers for policy making. Recognizing that consequentiality status and WTP might
be jointly influenced by unobservable factors, we employ a switching regression model
that accounts for the potential endogeneity of respondents’ belief in consequences and,
hence, biases from sample selectivity. In contrast to the received literature, we find
that WTP bids tend to be higher among respondents who received questions
in the open-ended format rather than single binary-choice questions. This difference
shrinks, however, when focusing on individuals who perceive the survey as politically
consequential.

URI: http://hdl.handle.net/2003/35934
Date: 2017-04-19
Title: Relevant change points in high dimensional time series
Authors: Dette, Holger; Gösmann, Josua

URI: http://hdl.handle.net/2003/35914
Date: 2017-04-06
Title: Sequential detection of parameter changes in dynamic conditional correlation models
Authors: Pape, Katharina; Galeano, Pedro; Wied, Dominik
Abstract: A multivariate monitoring procedure is presented to detect changes in the parameter vector of
the dynamic conditional correlation model proposed by Robert Engle in 2002. The benefit of
the proposed procedure is that it can be used to detect changes in both the conditional and
unconditional variance as well as in the correlation structure of the model. The detector is based
on quasi log likelihood scores. More precisely, standardized derivatives of quasi log likelihood
contributions of points in the monitoring period are evaluated at parameter estimates calculated
from a historical period. The null hypothesis of a constant parameter vector is rejected if these
standardized terms differ too much from those that were expected under the assumption of a
constant parameter vector. Under appropriate assumptions on moments and the structure of
the parameter space, limit results are derived both under the null hypothesis and under alternatives. In a
simulation study, size and power properties of the procedure are examined in various scenarios.

URI: http://hdl.handle.net/2003/35853
Date: 2017-03-15
Title: Fourier analysis of serial dependence measures
Authors: Van Hecke, Ria; Volgushev, Stanislav; Dette, Holger
Abstract: Classical spectral analysis is based on the discrete Fourier transform of the auto-covariances.
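One alternative serial dependence measure of the kind discussed here is Kendall's tau between the series and its lag-h shift, a U-statistic that can be computed directly. The sketch below is ours, not the paper's estimator.

```python
import numpy as np

def serial_kendall_tau(x, h):
    """Kendall's tau of the pairs (x_t, x_{t+h}), computed as a U-statistic."""
    a, b = x[:-h], x[h:]
    n = len(a)
    s = 0.0
    for i in range(n - 1):
        # concordant pairs contribute +1, discordant pairs -1
        s += np.sum(np.sign(a[i] - a[i + 1:]) * np.sign(b[i] - b[i + 1:]))
    return 2.0 * s / (n * (n - 1))

rng = np.random.default_rng(0)
e = rng.normal(size=500)
x = e + np.roll(e, 1)   # moving-average structure: serial dependence at lag 1
print(serial_kendall_tau(x, 1), serial_kendall_tau(x, 10))
```

For this toy series the lag-1 tau is clearly positive while the lag-10 tau is close to zero.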
In this paper we investigate the asymptotic properties of new frequency domain methods where the auto-covariances in the spectral density are replaced by alternative dependence measures which can be estimated by U-statistics. An interesting example is given by
Kendall's τ, for which the limiting variance exhibits a surprising behavior.

URI: http://hdl.handle.net/2003/35778
Date: 2017-02-03
Title: Cointegration in singular ARMA models
Authors: Deistler, Manfred; Wagner, Martin
Abstract: We consider the cointegration properties of singular ARMA processes integrated of order one.
Such processes are necessarily cointegrated as opposed to the regular case. We show that in the
left coprime case the cointegrating space only depends upon the autoregressive polynomial at
one.

URI: http://hdl.handle.net/2003/35772
Date: 2017-02-01
Title: Risk estimators for choosing regularization parameters in ill-posed problems - properties and limitations
Authors: Lucka, Felix; Proksch, Katharina; Brune, Christoph; Bissantz, Nicolai; Burger, Martin; Dette, Holger; Wübbeling, Frank
Abstract: This paper discusses the properties of certain risk estimators recently proposed to
choose regularization parameters in ill-posed problems. A simple approach is Stein's unbiased
risk estimator (SURE), which estimates the risk in the data space, while a recent
modification (GSURE) estimates the risk in the space of the unknown variable. It seems
intuitive that the latter is more appropriate for ill-posed problems, since the properties
in the data space do not tell much about the quality of the reconstruction. We provide
theoretical studies of both estimators for linear Tikhonov regularization in a finite
dimensional setting and estimate the quality of the risk estimators, which also leads to
asymptotic convergence results as the dimension of the problem tends to infinity. Unlike
previous papers, which studied image processing problems with a very low degree of
ill-posedness, we are interested in the behavior of the risk estimators for increasing ill-posedness.
Interestingly, our theoretical results indicate that the quality of the GSURE
risk estimate can deteriorate asymptotically for ill-posed problems, which is confirmed by a detailed
numerical study. The latter shows that in many cases the GSURE estimator leads
to extremely small regularization parameters, which obviously cannot stabilize the reconstruction.
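For a concrete toy case (ours, with arbitrary dimensions and noise level), SURE for linear Tikhonov regularization has the familiar closed form: residual sum of squares, minus n·σ², plus 2σ² times the trace of the hat matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 50, 20, 0.1
A = rng.normal(size=(n, p))
x_true = rng.normal(size=p)
y = A @ x_true + sigma * rng.normal(size=n)

def sure(lam):
    """SURE for the Tikhonov smoother y_hat = H(lam) y, risk in the data space."""
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(p), A.T)
    resid = y - H @ y
    return resid @ resid - n * sigma**2 + 2.0 * sigma**2 * np.trace(H)

lams = [10.0**k for k in range(-6, 3)]
best = min(lams, key=sure)   # grid search for the SURE-minimizing parameter
print(best, sure(best))
```

This well-conditioned toy problem is benign; the paper's point is precisely that such risk estimators become unreliable as the ill-posedness grows.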
Similar but less severe issues with respect to robustness also appear for the
SURE estimator, which in comparison to the rather conservative discrepancy principle
leads to the conclusion that regularization parameter choice based on unbiased risk estimation
is not a reliable procedure for ill-posed problems. A similar numerical study for
sparsity regularization demonstrates that the same issue appears in nonlinear variational
regularization approaches.

URI: http://hdl.handle.net/2003/35771
Date: 2017-01-31
Title: Climate change, population ageing and public spending: Evidence on individual preferences
Authors: Andor, Mark; Schmidt, Christoph M.; Sommer, Stephan
Abstract: Economic theory, as well as empirical research, suggests that elderly people
prefer public spending on policies yielding short-term benefits. This might be bad
news for policies aimed at combating climate change: while the unavoidable costs of
these policies arise today, the expected benefits occur in the distant future. Drawing
on data from over 12,000 households and using the ordered logit and the generalized
ordered logit model, we analyze whether attitudes towards climate change and climate
policies, as well as public spending preferences, differ with respect to age. Our
estimates show that elderly people are less concerned about climate change, but more
concerned about other global challenges. Furthermore, they are less likely to support
climate-friendly policies, such as the subsidization of renewables, and allocate less
public resources to environmental policies. Thus, our results suggest that the ongoing
demographic change in industrialized countries may undermine climate policies.

URI: http://hdl.handle.net/2003/35748
Date: 2017-01-11
Title: Robust estimation of change-point location
Authors: Gerstenberger, Carina
Abstract: We introduce a robust estimator of the location parameter for the change-point in the
mean based on the Wilcoxon statistic and establish its consistency for L1 near epoch
dependent processes. It is shown that the consistency rate depends on the magnitude
of change. A simulation study is performed to evaluate finite sample properties of the
Wilcoxon-type estimator in standard cases, as well as under heavy-tailed distributions and
disturbances by outliers, and to compare it with a CUSUM-type estimator. It shows that
the Wilcoxon-type estimator is equivalent to the CUSUM-type estimator in standard cases,
but outperforms the CUSUM-type estimator in the presence of heavy tails or outliers in the
data.

URI: http://hdl.handle.net/2003/35743
Date: 2017-01-06
Title: On MSE-optimal crossover designs
Authors: Neumann, Christoph; Kunert, Joachim
Abstract: In crossover designs, each subject receives a series of treatments
one after the other. Most papers on optimal crossover designs consider an
estimate which is corrected for carryover effects. We look at the estimate
for direct effects of treatment, which is not corrected for carryover effects.
If there are carryover effects, this estimate will be biased. We try to find a
design that minimizes the mean square error, that is, the sum of the squared
bias and the variance. It turns out that the designs which are optimal for
the corrected estimate are highly efficient for the uncorrected estimate.

URI: http://hdl.handle.net/2003/35732
Date: 2016-12-22
Title: Ordinal pattern dependence between hydrological time series
Authors: Fischer, Svenja; Schumann, Andreas; Schnurr, Alexander
Abstract: Ordinal patterns provide a method to measure correlation between time series. In
contrast to classical correlation measures like the Pearson correlation coefficient, they
are able to measure not only linear but also non-linear correlation, even
in the presence of non-stationarity. Hence, they are a noteworthy alternative to the
classical approaches when considering discharge series. Discharge series naturally
show a high variation as well as single extraordinary extreme events and, caused by
anthropogenic and climatic impacts, non-stationary behaviour. Here, the method
of ordinal patterns is used to compare pairwise discharge series derived from macro-
and mesoscale catchments in Germany. Differences of coincident groups were detected
for winter and summer annual maxima. Hydrological series, which are mainly
driven by annual climatic conditions (yearly discharges and low water discharges)
showed different and in some cases surprising interdependencies between macroscale
catchments. Anthropogenic impacts such as the construction of a reservoir or different
flood conditions caused by urbanization could be detected.

URI: http://hdl.handle.net/2003/35731
Date: 2016-12-22
Title: A simple test for white noise in functional time series
Authors: Bagchi, Pramita; Characiejus, Vaidotas; Dette, Holger
Abstract: We propose a new procedure for white noise testing of a functional time series.
Our approach is based on an explicit representation of the L2-distance between the
spectral density operator and its best (L2-)approximation by a spectral density operator
corresponding to a white noise process. The estimation of this distance can be
easily accomplished by sums of periodogram kernels and it is shown that an appropriately
standardized version of the estimator is asymptotically normally distributed
under the null hypothesis (of functional white noise) and under the alternative. As a
consequence we obtain a very simple test (using the quantiles of the normal distribution)
for the hypothesis of a white noise functional process. In particular, the test
requires neither the estimation of a long-run variance (including a fourth-order
cumulant) nor resampling procedures to calculate critical values. Moreover, in contrast
to all other methods proposed in the literature, our approach also allows testing for
"relevant" deviations from white noise and constructing confidence intervals for the
discrepancy of the underlying process from a functional white noise process.

URI: http://hdl.handle.net/2003/35699
Date: 2016-12-14
Title: A multivariate approach for onset detection using supervised classification
Authors: Bauer, Nadja; Friedrichs, Klaus; Weihs, Claus
Abstract: In this paper we introduce a new onset detection approach which incorporates a
supervised classification model for estimating the tone onset probability in signal
frames. In contrast to most classical strategies, where only one detection
function is applied for signal feature extraction, the classification model
can be fitted on a large feature set. This is useful because, depending on the
characteristics of the music, some detection functions can be more advantageous
than others.
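The frame-wise combination can be caricatured in a few lines. Everything below is hypothetical (made-up detection functions, fixed weights standing in for a fitted classifier, and a crude peak-picking step); it is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames = 200
true_onsets = [40, 90, 150]

# two made-up detection functions (think spectral flux and energy change)
f1 = rng.normal(0.0, 0.05, n_frames)
f2 = rng.normal(0.0, 0.05, n_frames)
for t in true_onsets:
    f1[t] += 1.0
    f2[t] += 1.0

X = np.column_stack([f1, f2])   # multivariate feature matrix, one row per frame
w = np.array([0.6, 0.4])        # stand-in for fitted classifier weights
score = X @ w                   # frame-wise onset score

def pick_onsets(score, threshold=0.5):
    """Report frames whose score is a local maximum above the threshold."""
    return [t for t in range(1, len(score) - 1)
            if score[t] > threshold and score[t] >= score[t - 1] and score[t] >= score[t + 1]]

print(pick_onsets(score))
```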
Although the idea of considering many detection functions is not new
in the literature, these functions have so far been treated in a univariate way,
e.g., by building weighted sums. This is probably due to the difficulty of directly
transferring classification ideas to the onset detection task: the goodness
measure of onset detection is based on the comparison of two time
vectors, whereas in classification such a measure is derived from the frame-wise
matches of predicted and true labels.
In this work we first construct, based on several recent publications, a
comprehensive univariate onset detection algorithm which depends on many freely
settable parameters. Then the new multivariate approach, which also depends on
many free parameters, is introduced. The parameters of both onset detection
strategies are optimized for the online and offline cases by utilizing an appropriate
validation technique. The main finding is that the multivariate strategy significantly
outperforms the univariate one with respect to the F-measure. Furthermore,
the multivariate approach seems to be especially beneficial in the online case, since
it requires only half of the future signal information compared to the best
setting of the univariate onset detection.

URI: http://hdl.handle.net/2003/35698
Date: 2016-12-14
Title: Time efficient optimization of instance based problems with application to tone onset detection
Authors: Bauer, Nadja; Friedrichs, Klaus; Weihs, Claus
Abstract: A time efficient optimization technique for instance based problems is proposed,
where for each parameter setting the target function has to be evaluated on a
large set of problem instances. Computational time is reduced by beginning with
a performance estimation based on the evaluation of a representative subset of
instances. Subsequently, only promising settings are evaluated on the whole
data set.
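The two-stage idea described above can be sketched as follows. The subset size, the fraction of settings kept, and the purely rank-based screening rule are illustrative assumptions; the paper embeds this idea in sequential model-based optimization rather than a fixed cutoff.

```python
import random

def two_stage_optimize(settings, instances, evaluate,
                       subset_size=5, keep_frac=0.3, seed=0):
    """Screen all candidate settings on a representative subset of instances,
    then evaluate only the most promising ones on the full instance set.
    `evaluate(setting, instance)` returns a cost (lower is better)."""
    rng = random.Random(seed)
    subset = rng.sample(instances, min(subset_size, len(instances)))
    # Stage 1: cheap screening on the subset.
    screened = sorted(settings, key=lambda s: sum(evaluate(s, i) for i in subset))
    promising = screened[:max(1, int(keep_frac * len(settings)))]
    # Stage 2: full evaluation of the survivors only.
    return min(promising, key=lambda s: sum(evaluate(s, i) for i in instances))
```

The saving comes from the fact that most parameter settings are discarded after the cheap first stage and never see the full instance set.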
As an application, a comprehensive music onset detection algorithm is introduced
in which several numerical and categorical algorithm parameters are optimized
simultaneously. Here, problem instances are music pieces from a database.
Sequential model based optimization is an appropriate technique to solve this
optimization problem. The proposed optimization strategy is compared to the
usual model based approach with respect to the goodness measure for tone onset
detection. The performance of the proposed method appears to be competitive
with the usual one while saving more than 84% of instance evaluation time
on average. Another aspect is a comparison of two strategies for handling
categorical parameters in Kriging-based optimization.

http://hdl.handle.net/2003/35678 (2016-11-30T12:01:00Z)
Title: A Bayesian heterogeneous coefficients spatial autoregressive panel data model of retail fuel duopoly pricing
Authors: LeSage, James P.; Vance, Colin; Chih, Yao-Yu
Abstract: We apply a heterogenous coefficient spatial autoregressive panel model to explore
competition/cooperation by duopoly pairs of German fueling stations in setting prices
for diesel and E5 fuel. We rely on a Markov Chain Monte Carlo (MCMC) estimation
methodology applied with non-informative priors, which produces estimates equivalent
to those from (quasi-) maximum likelihood. We explore station-level pricing behavior
using pairs of proximately situated fueling stations with no nearby neighbors. Our sample
data represent average daily diesel and E5 fuel prices, and refinery cost information
covering more than 487 days.
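To illustrate what a station-level price reaction means, a deliberately simplified least-squares sketch is given below; the paper instead estimates a Bayesian heterogeneous-coefficient spatial autoregressive panel model by MCMC, and the function name and regression form here are illustrative assumptions only.

```python
import numpy as np

def reaction_coefficient(own_price, rival_price, cost):
    """OLS sketch of a station-level price reaction function:
    own_t = a + rho * rival_t + b * cost_t + e_t; returns rho,
    the station's estimated reaction to its rival's price."""
    X = np.column_stack([np.ones(len(own_price)), rival_price, cost])
    beta, *_ = np.linalg.lstsq(X, own_price, rcond=None)
    return float(beta[1])
```

A rho near one would indicate price following (cooperation-like behavior), while a rho near zero would indicate independent pricing.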
The heterogeneous coefficients spatial autoregressive panel data model uses the large
sample of daily time periods to produce spatial autoregressive model estimates for each
fueling station. These estimates provide information regarding the price reaction function
of each station to its duopoly rival station. This is in contrast to conventional
estimates of price reaction functions that average over the entire cross-sectional sample
of stations. We show how these estimates can be used to infer competition versus
cooperation in price setting by individual stations.

http://hdl.handle.net/2003/35667 (2016-11-28T12:55:24Z)
Title: Joint modeling of annual maximum precipitation across different duration levels
Authors: Gräler, Benedikt; Fischer, Svenja; Schumann, Andreas
Abstract: Summarizing a series of rainfall events for different duration levels by their annual maxima provides
valuable information. These statistics are e.g. the design base of urban drainage systems. Investigating
an entire set of duration levels, the dependence among them has to be taken into account. We propose
an approach where a set of generalized extreme value distributions and a D-vine copula are
flexibly parameterized by the set of duration levels of interest. A priori, it is not necessary to fix the duration
levels nor the number of duration levels. This joint model produces increasing values for both longer
duration levels and larger return periods. In a sample application, we show that this model is flexible
enough to capture variations across the duration levels while reproducing the correlation structure of
the data. A joint probabilistic model allows one to study a new set of design questions where conditional
probabilities or joint return periods are of interest. This is for instance the case when nested
sub-basins are studied. An urban area within a larger catchment will be sensitive to annual maxima of
shorter durations due to high intensities while the enclosing catchment is prone to annual maxima of
long durations due to huge volumes. A risk analysis of the entire catchment requires a joint study of
both and an approach where the duration levels' dependence is taken into account.

http://hdl.handle.net/2003/35630 (2016-11-24T15:46:45Z)
Title: Tests for scale changes based on pairwise differences
Authors: Gerstenberger, Carina; Vogel, Daniel; Wendler, Martin
Abstract: In many applications it is important to know whether the amount of fluctuation
in a series of observations changes over time. In this article, we investigate different tests for
detecting change in the scale of mean-stationary time series. The classical approach, based
on the CUSUM test applied to the squared centered observations, is very vulnerable to outliers and
impractical for heavy-tailed data, which leads us to contemplate test statistics based on
alternative, less outlier-sensitive scale estimators.
It turns out that the tests based on Gini's mean difference (the average of all pairwise
distances) or generalized Qn estimators (sample quantiles of all pairwise distances) are very
suitable candidates. They improve upon the classical test not only under heavy tails or in
the presence of outliers, but also under normality. An explanation for this counterintuitive
result is that the corresponding long-run variance estimates are less affected by a scale
change than in the case of the sample-variance-based test.
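The two pairwise-difference scale estimators mentioned above can be computed naively as follows; this is a sketch only, omitting the efficient O(n log n) algorithms and the consistency constants usually attached to Qn-type estimators.

```python
from itertools import combinations

def gini_mean_difference(x):
    """Gini's mean difference: the average absolute difference
    over all pairs of observations."""
    n = len(x)
    return sum(abs(a - b) for a, b in combinations(x, 2)) / (n * (n - 1) / 2)

def generalized_qn(x, p=0.25):
    """A generalized Qn-type estimator: a sample p-quantile of all
    pairwise distances (naive O(n^2 log n) version)."""
    d = sorted(abs(a - b) for a, b in combinations(x, 2))
    k = min(len(d) - 1, int(p * len(d)))
    return d[k]
```

Both statistics depend on the data only through pairwise distances, which is what makes them far less sensitive to single outliers than the sample variance.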
We use recent results on the process convergence of U-statistics and U-quantiles for
dependent sequences to derive the limiting distribution of the test statistics and propose
estimators for the long-run variance. We perform a simulation study to investigate the
finite sample behavior of the tests and their power. Furthermore, we demonstrate the
applicability of the new change-point detection methods at two real-life data examples
from hydrology and finance.

http://hdl.handle.net/2003/35629 (2016-11-24T15:35:05Z)
Title: On Wigner-Ville spectra and the unicity of time-varying quantile-based spectral densities
Authors: Birr, Stefan; Dette, Holger; Hallin, Marc; Kley, Tobias; Volgushev, Stanislav
Abstract: The unicity of the time-varying quantile-based spectrum
proposed in Birr et al. (2016) is established via an asymptotic representation
result involving Wigner-Ville spectra.

http://hdl.handle.net/2003/35627 (2016-11-24T15:32:24Z)
Title: Heterogeneity of regional growth in the EU: A recursive partitioning approach
Authors: Wagner, Martin; Zeileis, Achim
Abstract: We use model-based recursive partitioning as a technique to assess heterogeneity
of growth and convergence processes based on an economic growth regression for
255 European Union NUTS2 regions from 1995 to 2005. The starting point of the
analysis is a human-capital-augmented Solow-type growth equation similar in spirit
to Mankiw, Romer, and Weil (1992). Initial GDP and the share of highly educated
in the working age population are found to be important for explaining economic
growth, whereas the investment share in physical capital is only significant for coastal
regions in the PIIGS countries. Recursive partitioning leads to a regression tree with
four terminal nodes, with splits according to (i) capital regions, (ii) non-capital
regions in or outside the so-called PIIGS countries, and (iii) within the respective
PIIGS regions, furthermore between coastal and non-coastal regions.

http://hdl.handle.net/2003/35626 (2016-11-24T15:30:35Z)
Title: Multiscale inference for multivariate deconvolution
Authors: Eckle, Konstantin; Bissantz, Nicolai; Dette, Holger
Abstract: In this paper we provide new methodology for inference of the geometric features of
a multivariate density in deconvolution. Our approach is based on multiscale tests to
detect significant directional derivatives of the unknown density at arbitrary points in
arbitrary directions. The multiscale method is used to identify regions of monotonicity
and to construct a general procedure for the detection of modes of the multivariate density.
Moreover, as an important application a significance test for the presence of a local
maximum at a pre-specified point is proposed. The performance of the new methods is investigated
from a theoretical point of view and the finite sample properties are illustrated
by means of a small simulation study.

http://hdl.handle.net/2003/35394 (2016-11-23T13:01:56Z)
Title: Predictive, finite-sample model choice for time series under stationarity and non-stationarity
Authors: Kley, Tobias; Preuß, Philip; Fryzlewicz, Piotr
Abstract: In statistical research there usually exists a choice between structurally simpler or
more complex models. We argue that, even if a more complex, locally stationary time
series model were true, a simple, stationary time series model may be advantageous
to work with under parameter uncertainty. We present a new model choice
methodology, where one of two competing approaches is chosen based on its empirical
finite-sample performance with respect to prediction. A rigorous, theoretical analysis
of the procedure is provided. As an important side result we prove, for possibly diverging
model order, that the localised Yule-Walker estimator is strongly, uniformly
consistent under local stationarity. An R package, forecastSNSTS, is provided and
used to apply the methodology to financial and meteorological data in empirical examples.
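In spirit, the choice can be sketched as comparing empirical one-step-ahead prediction errors of a globally fitted model against a locally fitted one on past data. The AR(1) setting, the window length, and the squared-error criterion below are simplifying assumptions for illustration, not the procedure implemented in forecastSNSTS.

```python
import numpy as np

def choose_model(x, window=50, burn=100):
    """Pick 'stationary' vs 'local' AR(1) forecasting by comparing
    accumulated one-step-ahead squared prediction errors on past data."""
    def ar1_coef(y):
        # Least-squares AR(1) coefficient for the segment y.
        y = np.asarray(y, dtype=float)
        return float(np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1]))
    err_stat = err_loc = 0.0
    for t in range(burn, len(x) - 1):
        phi_stat = ar1_coef(x[: t + 1])                 # all past observations
        phi_loc = ar1_coef(x[t + 1 - window : t + 1])   # recent window only
        err_stat += (x[t + 1] - phi_stat * x[t]) ** 2
        err_loc += (x[t + 1] - phi_loc * x[t]) ** 2
    return "stationary" if err_stat <= err_loc else "local"
```

The key point mirrored here is that the decision is driven by finite-sample forecasting performance, not by whether the more complex model is "true".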
We further provide an extensive simulation study and discuss when it is
preferable to base forecasts on the more volatile time-varying estimates and when it
is advantageous to forecast as if the data were from a stationary process, even though
they might not be.

http://hdl.handle.net/2003/35393 (2016-11-23T12:59:02Z)
Title: “Linear” fully modified OLS estimation of cointegrating polynomial regressions
Authors: Stypka, Oliver; Grabarczyk, Peter; Kawka, Rafael; Wagner, Martin
Abstract: A large part of the empirical environmental Kuznets curve literature uses cointegrating regressions
involving a unit root process and its powers as regressors. In this literature the unit root
process and its powers are, incorrectly, all treated as integrated processes and modified least
squares estimation methods for linear cointegrating regressions are routinely employed. We
show that, surprisingly, this approach to estimation leads for the Fully Modified OLS estimator
to the same limiting distribution as obtained for the version of the Fully Modified OLS estimator
adapted to the cointegrating polynomial regression setting of Wagner and Hong (2016).

http://hdl.handle.net/2003/35392 (2016-11-23T12:56:25Z)
Title: A note on functional equivalence between intertemporal and multisectoral investment adjustment costs
Authors: Ivashchenko, Sergey; Mutschler, Willi
Abstract: Kim (2003, JEDC) shows functional equivalence between intertemporal and multisectoral
investment adjustment costs in a linearized RBC model. From an identification point
of view, two parameters are not separately distinguishable; they enter as a sum into the
linearized solution. We demonstrate that estimating the quadratic approximation of the
model provides means to extract more information on the structural parameters from
the data and thus to estimate both parameters that are unidentifiable under the log-linearized
model.

http://hdl.handle.net/2003/35391 (2016-11-23T12:53:27Z)
Title: The environmental Kuznets curve for carbon dioxide emissions: A seemingly unrelated cointegrating polynomial regressions approach
Authors: Wagner, Martin; Grabarczyk, Peter
Abstract: We present estimation and inference techniques for systems of seemingly unrelated cointegrating
polynomial regressions. In particular, we present two fully modified-type estimators and
Wald-type hypothesis tests based upon them. We develop tests for poolability of subsets of
coefficients over subsets of equations. For the case that these restrictions are not rejected, we
provide the correspondingly pooled estimators. This group-wise pooling turns out to be very
useful in our application where we analyze the environmental Kuznets curve for CO2 emissions
for seven early industrialized countries. Group-wise pooled estimation leads to almost the same
results as unrestricted estimation whilst reducing the number of estimated parameters by about
one third. Fully pooled, panel-data type estimation performs poorly in comparison.

http://hdl.handle.net/2003/35390 (2016-11-23T12:50:25Z)
Title: Integrated modified OLS estimation for cointegrating polynomial regressions - with an application to the environmental Kuznets curve for CO2 emissions
Authors: Frondel, Manuel; Grabarczyk, Peter; Wagner, Martin
Abstract: This paper considers the integrated modified OLS (IM-OLS) estimator for cointegrating
polynomial regressions recently developed in Vogelsang and Wagner (2014a; 2014b).
Cointegrating polynomial regressions include deterministic variables, integrated processes
and integer powers of integrated processes as explanatory variables. The stochastic
regressors are allowed to be endogenous and the stationary errors are allowed to
be serially correlated. The IM-OLS estimator allows for asymptotically standard inference
in this framework when using consistent estimators of the long run variance.
Additionally, we provide fixed-b asymptotic theory for the case of full design to
capture the impact of kernel and bandwidth choice on the sampling distributions of
estimators and test statistics. We investigate the properties of the IM-OLS estimator
and hypothesis tests based upon it by means of a simulation study to compare its
performance with fully modified OLS (FM-OLS) and dynamic OLS (D-OLS). Finally,
we apply the method to estimate the environmental Kuznets curve for CO2 emissions
over the period 1870-2009.

http://hdl.handle.net/2003/35389 (2016-11-23T12:47:34Z)
Title: Change point detection in autoregressive models with no moment assumptions
Authors: Akashi, Fumiya; Dette, Holger; Liu, Yan
Abstract: In this paper we consider the problem of detecting a change in the parameters
of an autoregressive process, where the moments of the innovation process
do not necessarily exist. An empirical likelihood ratio test for the existence
of a change point is proposed and its asymptotic properties are studied. In
contrast to other work on change point tests using empirical likelihood, we do
not assume knowledge of the location of the change point. In particular, we
prove that the maximizer of the empirical likelihood is a consistent estimator
for the parameters of the autoregressive model in the case of no change point
and derive the limiting distribution of the corresponding test statistic under
the null hypothesis. We also establish consistency of the new test. A nice
feature of the method is that the resulting test is asymptotically
distribution free and does not require an estimate of the long run
variance. The asymptotic properties of the test are investigated by means of
a small simulation study, which demonstrates good finite sample properties of
the proposed method.

http://hdl.handle.net/2003/35382 (2016-11-23T12:24:15Z)
Title: A computational study of auditory models in music recognition tasks for normal-hearing and hearing-impaired listeners
Authors: Friedrichs, Klaus; Bauer, Nadja; Martin, Rainer; Weihs, Claus
Abstract: The utility of auditory models for solving three music recognition
tasks (onset detection, pitch estimation and instrument recognition)
is analyzed. Appropriate features are introduced which enable the
use of supervised classification. The auditory model-based approaches are tested in a comprehensive study and compared to state-of-the-art methods, which usually do not employ an auditory model. For this study, music data is selected according to an experimental design, which enables statements about performance differences with respect to specific music characteristics. The results confirm that the performance of music classification using the auditory model is at least comparable to the traditional methods. Furthermore, the auditory model is modified to exemplify the decrease of recognition rates in the presence of hearing deficits. The resulting system is a basis for estimating the intelligibility of music which in the future might be used for the automatic assessment of hearing instruments.

http://hdl.handle.net/2003/35381 (2016-11-23T12:15:06Z)
Title: A cointegrating polynomial regression analysis of the material Kuznets curve hypothesis
Authors: Frondel, Manuel; Grabarczyk, Peter; Sommer, Stephan; Wagner, Martin
Abstract: Employing consumption data for aluminum, lead and zinc for eight OECD countries
spanning from 1900 to 2006, this paper tests the hypothesis underlying the notion
of the Material Kuznets Curve (MKC), which postulates an inverted U-shaped
relationship between a country’s level of economic development and its intensity
of metal use. Applying the tests and estimation techniques for nonlinear cointegration
developed by Saikkonen and Choi (2004), Wagner (2013), as well as Wagner
and Hong (2016), we find that the MKC hypothesis is less strongly supported by
the data than when employing the standard methods that have been used in the
empirical Environmental Kuznets Curve (EKC) literature so far. The evidence for a
cointegrating MKC is mixed, at best.

http://hdl.handle.net/2003/35380 (2016-11-23T12:09:07Z)
Title: An asymptotic test on the stationarity of the variance
Authors: Dehling, Herold; Fried, Roland; Wornowizki, Max
Abstract: We reconsider a statistic introduced in Wornowizki et al. (2016) that allows testing
the stationarity of the variance for a sequence of independent random variables. Instead
of determining rejection regions via the permutation principle as proposed before, we
provide asymptotic critical values leading to huge savings in computation time. To prove
the required limit theorems, the test statistic is viewed as a U-statistic constructed from
blockwise variance estimates. Since the distribution of the test statistic depends on the
sample size, a suitable new law of large numbers as well as a central limit theorem are
developed. These asymptotic results are illustrated on artificial data. The permutation and
asymptotic version of the test are compared to alternative procedures in extensive Monte
Carlo experiments. The simulation results suggest that the methods offer similar results
and high power when compared to their competitors, particularly in the case of multiple
structural breaks. They also estimate the structural break positions adequately.2016-11-23T12:09:07ZTrimmed likelihood estimators for stochastic differential equations with an application to crack growth analysis from photosMüller, Christine H.Meinke, Stefan H.http://hdl.handle.net/2003/353582016-11-10T03:00:38Z2016-11-09T14:58:35ZTitle: Trimmed likelihood estimators for stochastic differential equations with an application to crack growth analysis from photos
Authors: Müller, Christine H.; Meinke, Stefan H.
Abstract: We introduce trimmed likelihood estimators for processes given by a
stochastic differential equation for which a transition density is known or can
be approximated and present an algorithm to calculate them. To measure the
fit of the observations to a given stochastic process, two performance measures
based on the trimmed likelihood estimator are proposed. The approach is applied
to crack growth data which are obtained from a series of photos by backtracking
large cracks which were detected in the last photo. Such crack growth
data are contaminated by several outliers caused by errors in the automatic
image analysis. We show that trimming 20% of the data of a growth curve
leads to good results when 100 obtained crack growth curves are fitted with
the Ornstein-Uhlenbeck and the Cox-Ingersoll-Ross processes, while
the fit of the Geometric Brownian Motion is significantly worse. The method
is sensitive in the sense that crack curves obtained under different stress conditions
provide significantly different parameter estimates.

http://hdl.handle.net/2003/35319 (2016-11-08T09:39:50Z)
Title: A new method for adaptive spectral complexity reduction of music signals
Authors: Krymova, Ekaterina; Nagathil, Anil; Belomestny, Denis; Martin, Rainer
Abstract: In this discussion paper we present a novel unsupervised segmentation
procedure for music signals which relies on an explained variance criterion in the eigenspace of the constant-Q spectral domain. The procedure
is used in the context of a spectral complexity reduction method which
mitigates effects of cochlear hearing loss. It is compared to a segmentation based on equidistant boundaries. The results demonstrate that the
proposed segmentation procedure gives an improvement in terms of signal-to-artefacts
ratio in comparison to a segmentation based on equidistant boundaries.

http://hdl.handle.net/2003/35318 (2016-11-08T09:38:12Z)
Title: Change point estimation based on the Wilcoxon test in the presence of long-range dependence
Authors: Betken, Annika
Abstract: We consider an estimator, based on the two-sample Wilcoxon statistic, for the location of a
shift in the mean of long-range dependent sequences. Consistency and the rate of convergence for the
estimated change point are established. In particular, the 1/n convergence rate (with n denoting the number
of observations), which is typical under the assumption of independent observations, is also achieved for
long memory sequences in case of a constant shift height. It is proved that after a suitable normalization
the estimator converges in distribution to a functional of a fractional Brownian motion, if the change point
height decreases to 0 with a certain rate. The estimator is tested on two well-known data sets. Finite sample
behaviors are investigated in a Monte Carlo simulation study.

http://hdl.handle.net/2003/35317 (2016-11-08T09:22:08Z)
Title: Testing for change in stochastic volatility with long range dependence
Authors: Betken, Annika; Kulik, Rafal
Abstract: In this paper we consider a change point problem for long memory stochastic
volatility models. We show that the limiting behavior for the CUSUM test statistics
may not be affected by long memory, unlike the Wilcoxon test statistic, which is
influenced by long range dependence. We compare our results to subordinated long
memory Gaussian processes. Theoretical properties are accompanied by simulation
studies.

http://hdl.handle.net/2003/35316 (2016-11-08T09:18:54Z)
Title: A computational study of auditory models in music recognition tasks for normal-hearing and hearing-impaired listeners
Authors: Friedrichs, Klaus; Bauer, Nadja; Martin, Rainer; Weihs, Claus
Abstract: The utility of auditory models for solving three music recognition tasks (onset detection, pitch estimation and
instrument recognition) is analyzed. Appropriate features are introduced which enable the use of supervised
classification. The auditory model-based approaches are tested in a comprehensive study and compared to
state-of-the-art methods, which usually do not employ an auditory model. For this study, music data is selected
according to an experimental design, which enables statements about performance differences with respect to
specific music characteristics. The results confirm that the performance of music classification using the
auditory model is at least comparable to the traditional methods. Furthermore, the auditory model is modified
to exemplify the decrease of recognition rates in the presence of hearing deficits. The resulting system is a
basis for estimating the intelligibility of music which in the future might be used for the automatic assessment
of hearing instruments.

http://hdl.handle.net/2003/35315 (2016-11-08T09:15:59Z)
Title: Efficient global optimization: Motivation, variations and applications
Authors: Weihs, Claus; Herbrandt, Swetlana; Bauer, Nadja; Friedrichs, Klaus; Horn, Daniel
Abstract: A popular optimization method of a black box objective function is
Efficient Global Optimization (EGO), also known as Sequential Model Based
Optimization, SMBO, with kriging and expected improvement. EGO is a sequential
design of experiments aiming at gaining as much information as possible
from as few experiments as feasible by a skillful choice of the factor
settings in a sequential way. In this paper we will introduce the standard procedure
and some of its variants. In particular, we will propose some new variants
like regression as a modeling alternative to kriging and two simple methods for
the handling of categorical variables, and we will discuss focus search for the
optimization of the infill criterion. Finally, we will give relevant examples for
the application of the method. Moreover, in our group, we implemented all the
described methods in the publicly available R package mlrMBO.

http://hdl.handle.net/2003/35311 (2016-11-03T13:42:40Z)
Title: On the method of probability weighted moments in regional frequency analysis
Authors: Lilienthal, Jona; Kinsvater, Paul; Fried, Roland
Abstract: In regional flood frequency analysis it is of interest to estimate high quantiles of a local river
flow distribution by gathering information from similar stations in the neighborhood. E.g., the
popular Index Flood (IF) approach is based on an assumption termed regional homogeneity,
which states that the quantile curves of those stations only differ by a site-specific factor, the
so-called index flood, and it is assumed that the station's distribution is known up to some
finite-dimensional parameter. In this context the method of probability weighted moments (or
equivalently L-moments) is most popular for parameter estimation. While the observations
often can be regarded as independent in time, a challenge arises from the fact that river
flows from nearby stations are strongly dependent in space. To the best of our knowledge, none of the
approaches from the literature based on the IF-model and on L-moments is able to take spatial
dependence adequately into account. Our goal is to fill this gap. We present asymptotic theory
that does not ignore inter-site dependence, which, for instance, allows one to evaluate estimation
uncertainty. As an application of this theory, a test procedure to check for regional homogeneity
under index-flood assumptions is given and reviewed in a simulation study.

http://hdl.handle.net/2003/35310 (2016-11-03T13:40:17Z)
Title: Multiscale inference for multivariate deconvolution
Authors: Eckle, Konstantin; Bissantz, Nicolai; Dette, Holger
Abstract: We propose multiscale tests for deconvolution in order to detect geometric features of
an unknown multivariate density. Our approach uses simultaneous tests on all scales for
the monotonicity of the density at arbitrary points in arbitrary directions. We consider
the situation of polynomial decay of the Fourier transform of the error density in the
deconvolution model (moderately ill-posed). We develop multiscale methods for identifying
regions of monotonicity and a general procedure to detect the modes of a multivariate
density. The theoretical results are illustrated by means of a simulation study.

http://hdl.handle.net/2003/35309 (2016-11-03T13:36:21Z)
Title: Consumer inattention, heuristic thinking and the role of energy labels
Authors: Andor, Mark; Gerster, Andreas; Sommer, Stephan
Abstract: Energy labels have been introduced in many countries to increase consumers’
attention to energy use in purchase decisions of durables. In a discrete-choice experiment
among about 5,000 households, we implement randomized information
treatments to explore the effects of various kinds of energy labels on purchasing decisions.
Our results show that adding annual operating cost information to the EU
energy label promotes the choice of energy-efficient durables. In addition, we find
that a majority of participants value efficiency classes beyond the economic value
of the underlying energy use differences. Our results further indicate that displaying
operating cost affects choices through two distinct channels: it increases the
attention to operating cost and reduces the valuation of efficiency class differences.2016-11-03T13:36:21ZLocally adaptive confidence bandsPatschkowski, TimRohde, Angelikahttp://hdl.handle.net/2003/353082016-11-04T03:00:07Z2016-11-03T13:34:14ZTitle: Locally adaptive confidence bands
Authors: Patschkowski, Tim; Rohde, Angelika
Abstract: We develop honest and locally adaptive confidence bands for probability
densities. They provide substantially improved confidence statements in
case of inhomogeneous smoothness, and are easily implemented and visualized.
The article contributes conceptual work on locally adaptive inference,
since a straightforward modification of the global setting imposes severe obstacles
for statistical purposes. Among other contributions, we introduce a statistical notion
of local Hölder regularity and prove a correspondingly strong version of local
adaptivity. We substantially relax the straightforward localization of
the self-similarity condition in order not to rule out prototypical densities.
The set of densities permanently excluded from the consideration is shown
to be pathological in a mathematically rigorous sense. On a technical level,
the crucial component for the verification of honesty is the identification
of an asymptotically least favorable stationary case by means of Slepian's
comparison inequality.2016-11-03T13:34:14ZOptimal discrimination designs for semi-parametric modelsDette, HolgerGuchenko, RomanMelas, ViatcheslavWong, Weng Keehttp://hdl.handle.net/2003/353062016-10-29T02:00:12Z2016-10-28T12:33:36ZTitle: Optimal discrimination designs for semi-parametric models
Authors: Dette, Holger; Guchenko, Roman; Melas, Viatcheslav; Wong, Weng Kee
Abstract: Much of the work in the literature on optimal discrimination designs assumes that the
models of interest are fully specified, apart from unknown parameters in some models.
Recent work allows errors in the models to be non-normally distributed but still requires
the specification of the mean structures. This research is motivated by the interesting
work of Otsu (2008) to discriminate among semi-parametric models by generalizing
the KL-optimality criterion proposed by Lopez-Fidalgo et al. (2007) and Tommasi and
Lopez-Fidalgo (2010). We provide further insights into this
optimality criterion. In particular, we propose a practical strategy for finding
optimal discrimination designs among semi-parametric models that can also be verified
using an equivalence theorem. In addition, we study properties of such optimal designs
and identify important cases where the proposed semi-parametric optimal discrimination
designs coincide with the celebrated T-optimal designs.2016-10-28T12:33:36ZThe impact of disclosure obligations on executive compensation - A policy evaluation using quantile treatment estimatorsDyballa, KatharinaKraft, Korneliushttp://hdl.handle.net/2003/353052016-10-29T02:00:14Z2016-10-28T12:30:25ZTitle: The impact of disclosure obligations on executive compensation - A policy evaluation using quantile treatment estimators
Authors: Dyballa, Katharina; Kraft, Kornelius
Abstract: This empirical study analyses the effects of the introduction of strongly increased
disclosure requirements in Germany on the level of executive compensation. One
innovative aspect is the comparison of companies which voluntarily followed a
recommendation of the German Governance Code before the relevant law was
implemented and published detailed information on executive compensation with
other firms which did not. Conditional and unconditional quantile difference-in-differences
models are estimated. The companies which refused to publish data
before it became mandatory show a reduction in compensation levels for the upper
quantiles. Hence, the mandatory requirement to publish detailed information
reduced the higher levels of executive compensation, but did not affect executive
compensation at lower or medium levels.2016-10-28T12:30:25ZBest linear unbiased estimators in continuous time regression modelsDette, HolgerPepelyshev, AndreyZhigljavsy, Anatolyhttp://hdl.handle.net/2003/353042016-10-29T02:00:10Z2016-10-28T12:27:26ZTitle: Best linear unbiased estimators in continuous time regression models
Authors: Dette, Holger; Pepelyshev, Andrey; Zhigljavsy, Anatoly
Abstract: In this paper the problem of best linear unbiased estimation is
investigated for continuous-time regression models. We prove several
general statements concerning the explicit form of the best linear unbiased
estimator (BLUE), in particular when the error process is a
smooth process with one or several derivatives of the response process
available for construction of the estimators. We derive the explicit
form of the BLUE for many specific models including the cases
of continuous autoregressive errors of order two and integrated error
processes (such as integrated Brownian motion). The results are
illustrated by several examples.2016-10-28T12:27:26ZRegularization parameter selection in indirect regression by residual based bootstrapBissantz, NicolaiChown, JustinDette, Holgerhttp://hdl.handle.net/2003/353022016-10-29T01:40:57Z2016-10-28T09:00:10ZTitle: Regularization parameter selection in indirect regression by residual based bootstrap
Authors: Bissantz, Nicolai; Chown, Justin; Dette, Holger
Abstract: Residual-based analysis is generally considered a cornerstone of statistical methodology.
For a special case of indirect regression, we investigate the residual-based empirical distribution
function and provide a uniform expansion of this estimator, which is also shown to
be asymptotically most precise. This investigation naturally leads to a completely data-driven
technique for selecting a regularization parameter used in our indirect regression function estimator.
The resulting methodology is based on a smooth bootstrap of the model residuals. A
simulation study demonstrates the effectiveness of our approach.2016-10-28T09:00:10ZThe effect of intraday periodicity on realized volatility measuresDette, HolgerGolosnoy, VasylKellermann, Janoschhttp://hdl.handle.net/2003/353012016-10-29T01:40:57Z2016-10-28T08:57:43ZTitle: The effect of intraday periodicity on realized volatility measures
Authors: Dette, Holger; Golosnoy, Vasyl; Kellermann, Janosch
Abstract: U-shaped intraday periodicity (IP) is a typical stylized fact characterizing intraday returns
on risky assets. In this study we focus on the realized volatility and bipower variation
estimators for daily integrated volatility (IV) which are based on intraday returns following
a discrete-time model with IP. We demonstrate that neglecting the impact of IP on
realized estimators may lead to invalid statistical inference concerning IV for the commonly
available number of intraday returns; moreover, the size of daily jump tests may be
distorted. Given the functional form of IP, we derive corrections for the realized measures
of IV. We show in a Monte Carlo and an empirical study that the proposed corrections
improve commonly used point and interval estimators of IV and tests for jumps.2016-10-28T08:57:43ZSwitching on electricity demand response: Evidence for German householdsFrondel, ManuelKussel, Gerhardhttp://hdl.handle.net/2003/353002016-10-29T01:40:58Z2016-10-28T08:55:10ZTitle: Switching on electricity demand response: Evidence for German households
Authors: Frondel, Manuel; Kussel, Gerhard
Abstract: Empirical evidence on the response of German households to electricity price
changes is sparse. Using panel data originating from Germany’s Residential Energy
Consumption Survey (GRECS), we fill this void by employing an instrumental variable
approach to cope with the endogeneity of the consumers’ tariff choice. By additionally
exploiting our information on the households’ knowledge about power prices, we also
employ an Endogenous Switching Regression Model to estimate price elasticities for
two groups of households, finding that only those households that are informed about
prices are sensitive to price changes, whereas the electricity demand of uninformed
households is entirely price-inelastic.2016-10-28T08:55:10Z'Change in space’-point estimation, Part I: Lower bound for rates of consistencyBrauer, MarcelRohde, Angelikahttp://hdl.handle.net/2003/352992016-10-29T01:40:55Z2016-10-28T08:52:16ZTitle: 'Change in space’-point estimation, Part I: Lower bound for rates of consistency
Authors: Brauer, Marcel; Rohde, Angelika
Abstract: Given n discrete observations of a homogeneous diffusion process with a
piecewise constant diffusion coefficient containing one point of discontinuity
p_0, we study the semiparametric problem of estimating its 'change in space'-
point p_0 in the high-frequency setting. We establish a lower bound of n^(-3/4) for the
minimax rate of convergence, which is slower than the n^(-1) rate in
traditional change-point problems.2016-10-28T08:52:16ZNew backtests for unconditional coverage of the expected shortfallLöser, RobertWied, DominikZiggel, Danielhttp://hdl.handle.net/2003/352862016-10-15T02:00:13Z2016-10-14T10:24:48ZTitle: New backtests for unconditional coverage of the expected shortfall
Authors: Löser, Robert; Wied, Dominik; Ziggel, Daniel
Abstract: We present a new backtest for the unconditional coverage property of the ES. The test statistic is available
for finite out-of-sample size which leads to better size and power properties compared to existing tests.
Moreover, it can be easily extended to a multivariate test.2016-10-14T10:24:48ZEfficient estimation of the error distribution function in heteroskedastic nonparametric regression with missing dataChown, Justinhttp://hdl.handle.net/2003/352852016-10-15T02:00:10Z2016-10-14T10:22:43ZTitle: Efficient estimation of the error distribution function in heteroskedastic nonparametric regression with missing data
Authors: Chown, Justin
Abstract: We propose a residual-based empirical distribution function to estimate the distribution function
of the errors of a heteroskedastic nonparametric regression with responses missing at random based on
completely observed data, and we show this estimator is asymptotically most precise.2016-10-14T10:22:43ZDetecting heteroskedasticity in nonparametric regression using weighted empirical processesChown, JustinMüller, Ursula U.http://hdl.handle.net/2003/352842016-10-15T02:00:08Z2016-10-14T10:19:04ZTitle: Detecting heteroskedasticity in nonparametric regression using weighted empirical processes
Authors: Chown, Justin; Müller, Ursula U.
Abstract: Heteroskedastic errors can lead to inaccurate statistical conclusions if they are
not properly handled. We introduce a test for heteroskedasticity for the nonparametric regression
model with multiple covariates. It is based on a suitable residual-based empirical
distribution function. The residuals are constructed using local polynomial smoothing. Our
test statistic involves a "detection function" that can verify heteroskedasticity by exploiting
just the independence-dependence structure between the detection function and model
errors, i.e. we do not require a specific model of the variance function. The procedure is
asymptotically distribution free: inferences made from it do not depend on unknown parameters.
It is consistent at the parametric (root-n) rate of convergence. Our results are
extended to the case of missing responses and illustrated with simulations.2016-10-14T10:19:04ZNonparametric inference of gradual changes in the jump behaviour of time-continuous processesHoffmann, MichaelVetter, MathiasDette, Holgerhttp://hdl.handle.net/2003/352332016-10-11T02:00:07Z2016-10-10T10:46:10ZTitle: Nonparametric inference of gradual changes in the jump behaviour of time-continuous processes
Authors: Hoffmann, Michael; Vetter, Mathias; Dette, Holger
Abstract: In applications, the properties of a stochastic feature often change gradually
rather than abruptly; that is, after a constant phase for some time they slowly start to
change. Efficient analysis of change points should address the specific features of such a
smooth change. In this paper we discuss statistical inference for localizing and detecting
gradual changes in the jump characteristic of a discretely observed Itô semimartingale. We
propose a new measure of time variation for the jump behaviour of the process. The statistical
uncertainty of a corresponding estimate is analyzed deriving new results on the weak
convergence of a sequential empirical tail integral process and a corresponding multiplier
bootstrap procedure.2016-10-10T10:46:10ZHigher-order statistics for DSGE modelsMutschler, Willihttp://hdl.handle.net/2003/352152016-09-20T02:00:16Z2016-09-19T10:14:48ZTitle: Higher-order statistics for DSGE models
Authors: Mutschler, Willi
Abstract: Closed-form expressions for unconditional moments, cumulants and polyspectra of order
higher than two are derived for non-Gaussian or nonlinear (pruned) solutions to DSGE
models. Apart from the existence of moments and the white noise property, no distributional
assumptions are needed. The accuracy and utility of the formulas for computing
skewness and kurtosis are demonstrated by three prominent models: Smets and Wouters
(AER, 586-606, 97, 2007) (first-order approximation), An and Schorfheide (Econom.
Rev., 113-172, 26, 2007) (second-order approximation) and the neoclassical growth model
(third-order approximation). Both the Gaussian and the Student's t-distribution are
considered as the underlying stochastic processes. Lastly, the efficiency gain of including
higher-order statistics is demonstrated by the estimation of an RBC model within a
Generalized Method of Moments framework.2016-09-19T10:14:48ZWeak convergence of a pseudo maximum likelihood estimator for the extremal indexBerghaus, BetinaBücher, Axelhttp://hdl.handle.net/2003/352142016-09-20T02:00:13Z2016-09-19T10:06:11ZTitle: Weak convergence of a pseudo maximum likelihood estimator for the extremal index
Authors: Berghaus, Betina; Bücher, Axel
Abstract: The extremes of a stationary time series typically occur in clusters. A
primary measure for this phenomenon is the extremal index, representing the reciprocal
of the expected cluster size. Both a disjoint and a sliding blocks estimator for the
extremal index are analyzed in detail. In contrast to many competitors, the estimators
only depend on the choice of one parameter sequence. We derive an asymptotic
expansion, prove asymptotic normality and show consistency of an estimator for the
asymptotic variance. Explicit calculations in certain models and a finite-sample Monte
Carlo simulation study reveal that the sliding blocks estimator outperforms other
blocks estimators, and that it is competitive with runs and inter-exceedance estimators
in various models. The methods are applied to a variety of financial time series.2016-09-19T10:06:11ZLow-frequency estimation of continuous-time moving average Lévy processesBelomestny, DenisPanov, VladimirWoerner, Jeannette H. C.http://hdl.handle.net/2003/351972016-09-03T02:00:13Z2016-09-02T09:00:44ZTitle: Low-frequency estimation of continuous-time moving average Lévy processes
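The disjoint-blocks approach in the Berghaus and Bücher abstract above can be sketched in a few lines. The specific form Ŷ_i = -b log F̂_n(M_i) with θ̂ = 1/mean(Ŷ) follows Northrop-style pseudo-maximum-likelihood blocks estimation; treating this as the paper's exact estimator is an assumption of this sketch.

```python
import numpy as np

def blocks_extremal_index(x, b):
    """Disjoint-blocks sketch of an extremal index estimator.

    x : 1-d array of observations; b : block length.
    Returns 1 / mean(Y_i) with Y_i = -b * log F_n(M_i), where M_i are the
    disjoint block maxima and F_n is the empirical cdf of the full sample.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = n // b                                  # number of complete disjoint blocks
    maxima = x[:k * b].reshape(k, b).max(axis=1)
    # empirical cdf at the block maxima, scaled by (n + 1) to avoid log(0)
    ranks = np.searchsorted(np.sort(x), maxima, side="right")
    Fn = ranks / (n + 1)
    Y = -b * np.log(Fn)
    return 1.0 / Y.mean()
```

For iid data the extremal index is 1, so the estimate should be close to one; for series with clustered extremes it drops below one, roughly to the reciprocal of the mean cluster size.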
Authors: Belomestny, Denis; Panov, Vladimir; Woerner, Jeannette H. C.
Abstract: In this paper we study the problem of statistical inference for a continuoustime
moving average Lévy process of the form
Zt=∫ℝκ(t-s)dLs, t∈ℝ
with a deterministic kernel κ and a Lévy process L. Especially the estimation
of the Lévy measure v of L from low-frequency observations of the process
Z is considered. We construct a consistent estimator, derive its convergence
rates and illustrate its performance by a numerical example. On the technical
level, the main challenge is to establish a kind of exponential mixing for
continuous-time moving average Lévy processes.2016-09-02T09:00:44ZSimulation free prediction intervals for a state dependent failure process using accelerated lifetime experimentsMüller, Christine H.Szugat, SebastianMaurer, Reinhardhttp://hdl.handle.net/2003/351932016-09-01T02:00:08Z2016-08-31T13:11:44ZTitle: Simulation free prediction intervals for a state dependent failure process using accelerated lifetime experiments
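The moving-average representation Z_t = ∫ κ(t - s) dL_s in the Belomestny, Panov and Woerner abstract above can be approximated by a plain Riemann sum over a grid of Lévy increments. The exponential kernel and the deterministic increments in the usage example are illustrative choices, not taken from the paper.

```python
import numpy as np

def ma_levy_path(kernel, increments, dt):
    """Riemann-sum sketch of Z_t = ∫ κ(t - s) dL_s on the grid t_j = j * dt.

    kernel     : vectorized function κ(u), used only where u > 0 (causal case)
    increments : Lévy increments ΔL_i over the grid cells [i*dt, (i+1)*dt)
    Returns the approximate path (Z_{t_1}, ..., Z_{t_m}).
    """
    m = len(increments)
    t = np.arange(1, m + 1) * dt          # evaluation points t_1, ..., t_m
    s = np.arange(m) * dt                 # left endpoints of the grid cells
    diff = t[:, None] - s[None, :]
    K = np.where(diff > 0, kernel(diff), 0.0)   # discrete convolution matrix
    return K @ increments                 # Z_{t_j} ≈ Σ_i κ(t_j - s_i) ΔL_i
```

With κ(u) = e^(-u) and deterministic increments ΔL_i = dt (i.e. L_s = s), the sum approximates ∫_0^t e^(-(t-s)) ds = 1 - e^(-t); replacing the increments by, say, compound-Poisson jumps yields a genuinely Lévy-driven path.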
Authors: Müller, Christine H.; Szugat, Sebastian; Maurer, Reinhard
Abstract: We consider the problem of constructing prediction intervals for the
time point at which a given number of components of a system exposed to
degradation fails. The failure process with respect to the failure times of
the components is modeled by a state dependent point process which is an
alternative to the nonhomogeneous Poisson process often used in failure
analysis. Several failure processes observed at different usually higher
stress conditions are incorporated by a link function. Two new simulation-free
prediction intervals are proposed. One is constructed with the delta
method and the implicit function theorem applied to the hypoexponential
distribution and does not need the construction of confidence sets for the
unknown parameters. The other is based on data depth using a recent
result for constructing outlier robust confidence sets for general regression.
The two new methods are compared with two methods based on classical
confidence sets for generalized linear models. The comparison is done by
leave-one-out analysis of data coming from failure processes observed at
prestressed concrete beams exposed to different cyclic loading where the
time points of breaking tension wires were reported.2016-08-31T13:11:44ZCycling on the extensive and intensive margin: The role of paths and pricesFrondel, ManuelVance, ColinWagner, Martinhttp://hdl.handle.net/2003/351792016-08-17T02:00:10Z2016-08-16T07:24:56ZTitle: Cycling on the extensive and intensive margin: The role of paths and prices
Authors: Frondel, Manuel; Vance, Colin; Wagner, Martin
Abstract: Drawing on a panel of German survey data spanning 1997-2013, this paper
identifies the correlates of non-recreational bicycling, focusing specifically on the roles
of bicycle paths and fuel prices. Our approach conceptualizes ridership as a two stage
decision process comprising the discrete choice of whether to use the bike (i.e. the extensive
margin) and the continuous choice of how far to ride (i.e. the intensive margin).
To the extent that these two choices are related and, moreover, potentially influenced by
factors unobservable to the researcher, we explore alternative estimators using two-stage
censored regression techniques to assess whether the results are subject to biases from
sample selectivity. A key finding is that while higher fuel costs are associated with an
increased probability of undertaking non-recreational bike trips, this effect is predicated
on residence in an urbanized region. We also find evidence for a positive association with
the extent of bike paths, both in increasing the probability of non-recreational bike travel
as well as the distance traveled.2016-08-16T07:24:56ZFourier methods for analysing piecewise constant volatilitiesWornowizki, MaxFried, RolandMeintanis, Simos G.http://hdl.handle.net/2003/351782016-08-17T02:00:07Z2016-08-16T07:22:49ZTitle: Fourier methods for analysing piecewise constant volatilities
Authors: Wornowizki, Max; Fried, Roland; Meintanis, Simos G.
Abstract: We develop procedures for testing the hypothesis that a parameter of
a distribution is constant throughout a sequence of independent random
variables. Our proposals are illustrated considering the variance and the
kurtosis. Under the null hypothesis of constant variance, the modulus
of a Fourier type transformation of the volatility process is identically
equal to one. The approach proposed utilizes this property considering
a canonical estimator for this modulus under the assumption of independent
and piecewise identically distributed observations with zero mean.
Using blockwise estimators we introduce several test statistics resulting
from different weight functions, which are all given by simple explicit
formulae. The methods are compared to other tests for constant volatility
in extensive Monte Carlo experiments. Our proposals offer comparatively
good power, particularly in the case of multiple structural breaks, and allow
adequate estimation of the positions of the structural breaks. An
application to process control data is given, and it is shown how the methods
can be adapted to test for constancy of other quantities like the kurtosis.2016-08-16T07:22:49ZNonparametric estimation and testing on discontinuity of positive supported densities: A kernel truncation approachFunke, BenediktHirukawa, Masayukihttp://hdl.handle.net/2003/351682016-08-04T02:00:10Z2016-08-03T13:06:16ZTitle: Nonparametric estimation and testing on discontinuity of positive supported densities: A kernel truncation approach
Authors: Funke, Benedikt; Hirukawa, Masayuki
Abstract: Discontinuity in density functions is of economic importance and interest.
For instance, in studies on regression discontinuity designs, discontinuity in
the density of a running variable suggests violation of the no-manipulation
assumption. In this paper we develop estimation and testing procedures on
discontinuity in densities with positive support. Our approach is built on splitting
the gamma kernel (Chen, 2000) into two parts at a given (dis)continuity
point and constructing two truncated kernels. The jump-size magnitude of the
density at the point can be estimated nonparametrically by two kernels and a
multiplicative bias correction method. The estimator is easy to implement, and
its convergence properties are delivered by various approximation techniques on
incomplete gamma functions. Based on the jump-size estimator, two versions
of test statistics for the null of continuity at a given point are also proposed.
Moreover, estimation theory of the entire density in the presence of a discontinuity
point is explored. Monte Carlo simulations confirm the good finite-sample
properties of the jump-size estimator and the test statistics.2016-08-03T13:06:16ZNonparametric IV regression with an Archimedean dependence structurevan Kampen, Maartenhttp://hdl.handle.net/2003/351662016-08-02T02:00:10Z2016-08-01T10:28:58ZTitle: Nonparametric IV regression with an Archimedean dependence structure
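The jump-size idea in the Funke and Hirukawa abstract above can be illustrated with a deliberately crude one-sided uniform-kernel version. The paper itself splits a gamma kernel at the (dis)continuity point and applies a multiplicative bias correction; this sketch omits both refinements.

```python
import numpy as np

def jump_size(x, p, h):
    """Crude estimate of the jump f(p+) - f(p-) via one-sided uniform kernels.

    x : sample from the density; p : candidate discontinuity point;
    h : one-sided bandwidth. (The paper uses truncated gamma kernels with a
    multiplicative bias correction instead of these uniform windows.)
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    right = np.sum((x > p) & (x <= p + h)) / (n * h)   # ≈ f(p+)
    left = np.sum((x > p - h) & (x <= p)) / (n * h)    # ≈ f(p-)
    return right - left
```

For a continuous density the estimate fluctuates around zero; a genuine jump of size d shows up as an estimate near d once n is large and h small.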
Authors: van Kampen, Maarten
Abstract: This paper provides a characterization of the completeness of a family of distributions
in terms of the copula between the random variables. We give sufficient conditions
for a family of Archimedean copulas to be (boundedly) complete. Some
copulas are typically excluded in nonparametric IV regression since they have
non-square-integrable densities. We provide conditions under which we can identify
the nonparametric IV regression model if the dependence structure between
the regressors and instrumental variables can be described by an Archimedean
copula.2016-08-01T10:28:58ZEfficient sampling in materials simulation - exploring the parameter space of grain boundariesDette, HolgerGoesmann, JosuaGreiff, ChristianJanisch, Rebeccahttp://hdl.handle.net/2003/351632016-07-30T02:00:09Z2016-07-29T10:59:20ZTitle: Efficient sampling in materials simulation - exploring the parameter space of grain boundaries
Authors: Dette, Holger; Goesmann, Josua; Greiff, Christian; Janisch, Rebecca
Abstract: In the framework of materials design there is the demand for extensive databases of specific materials
properties. In this work we suggest an improved strategy for creating future databases, especially for
extrinsic properties that depend on several material parameters. As an example we choose the energy of
grain boundaries as a function of their geometric degrees of freedom. The construction of existing databases
of grain boundary energies in face-centred and body-centred cubic metals relied on a priori knowledge of
the location of important cusps and maxima in the five-dimensional energy landscape, and on an
as-dense-as-possible sampling strategy. We introduce two methods to improve the current state of the art. The
location and number of the energy minima along which the hierarchical sampling takes place are predicted
from existing data points without any a priori knowledge, using a predictor function. Furthermore we
show that it is more efficient to use sequential sampling in a "design of experiment" scheme rather than
sampling all observations homogeneously in one batch. This sequential design exhibits a smaller error than
the simultaneous one, and thus can provide the same accuracy with fewer data points. The new strategy
should be particularly beneficial in the exploration of grain boundary energies in new alloys and/or non-cubic
structures.2016-07-29T10:59:20ZA focused information criterion for quantile regression: Evidence for the rebound effectBehl, PeterDette, HolgerFrondel, ManuelVance, Colinhttp://hdl.handle.net/2003/351602016-07-28T02:00:10Z2016-07-27T14:11:20ZTitle: A focused information criterion for quantile regression: Evidence for the rebound effect
Authors: Behl, Peter; Dette, Holger; Frondel, Manuel; Vance, Colin
Abstract: In contrast to conventional model selection criteria, the Focused Information
Criterion (FIC) allows for the purpose-specific choice of model specifications.
This accommodates the idea that one kind of model might be highly
appropriate for inferences on a particular focus parameter, but not for another.
Using the FIC concept developed by Behl, Claeskens and Dette (2014)
for quantile regression analysis, and the estimation of the rebound effect in individual
mobility behavior as an example, this paper provides an empirical
application of the FIC in the selection of quantile regression models.2016-07-27T14:11:20ZModel robust designs for survival trialsKonstantinou, MariaBiedermann, StefanieKimber, Alanhttp://hdl.handle.net/2003/351562016-07-26T02:00:07Z2016-07-25T11:23:48ZTitle: Model robust designs for survival trials
Authors: Konstantinou, Maria; Biedermann, Stefanie; Kimber, Alan
Abstract: The exponential-based proportional hazards model is often assumed in
time-to-event experiments but may only approximately hold. We consider deviations
in different neighbourhoods of this model that include other widely used
parametric proportional hazards models and we further assume that the data are subject
to censoring. Minimax designs are then found explicitly based on criteria
corresponding to classical c- and D-optimality. We provide analytical characterisations
of optimal designs which, unlike optimal designs for related problems in the
literature, have finite support and thus avoid the issues of implementing a density-based
design in practice. Finally, our designs are compared with the balanced design that
is traditionally used in practice, and recommendations for practitioners are given.2016-07-25T11:23:48ZResidual-based inference on moment hypotheses, with an application to testing for constant correlationDemetrescu, MateiWied, Dominikhttp://hdl.handle.net/2003/351552016-07-23T02:01:09Z2016-07-22T13:37:55ZTitle: Residual-based inference on moment hypotheses, with an application to testing for constant correlation
Authors: Demetrescu, Matei; Wied, Dominik
Abstract: Often, inference on moment properties of unobserved processes is conducted on the basis of estimated
counterparts obtained in a preliminary step. In some situations, the use of residuals instead of the
true quantities affects inference even in the limit, while in others there is no asymptotic residual effect.
For the case of statistics based on partial sums of nonlinear functions of the residuals, we give here a
characterization of the conditions under which the residual effect does not vanish as the sample size goes
to infinity (generic regularity conditions provided). An overview of methods to account for the residual
effect is also provided. The analysis extends to models with change points in parameters at estimated
time, in spite of the discontinuous manner in which the break time enters the model of interest. To
illustrate the usefulness of the results, we propose a test for constant correlations allowing for breaks
at unknown time in the marginal means and variances. We find, in Monte Carlo simulations and in an
application to US and German stock returns, that not accounting for changes in the marginal moments
has severe consequences.2016-07-22T13:37:55ZAssessing the similarity of dose response and target doses in two non-overlapping subgroupsBretz, FrankMöllenhoff, KathrinDette, HolgerLiu, WeiTrampisch, Matthiashttp://hdl.handle.net/2003/351382016-07-14T02:00:14Z2016-07-13T13:02:57ZTitle: Assessing the similarity of dose response and target doses in two non-overlapping subgroups
Authors: Bretz, Frank; Möllenhoff, Kathrin; Dette, Holger; Liu, Wei; Trampisch, Matthias
Abstract: We consider two problems that are attracting increasing attention in clinical dose
finding studies. First, we assess the similarity of two non-linear regression models
for two non-overlapping subgroups of patients over a restricted covariate space. To
this end, we derive a confidence interval for the maximum difference between the two
given models. If this confidence interval excludes the equivalence margins, similarity
of dose response can be claimed. Second, we address the problem of demonstrating
the similarity of two target doses for two non-overlapping subgroups, again using a
confidence-interval-based approach. We illustrate the proposed methods with a real
case study and investigate their operating characteristics (coverage probabilities, Type
I error rates, power) via simulation.2016-07-13T13:02:57ZConditional heavy-tail behavior with applications to precipitation and river flow extremesKinsvater, PaulFried, Rolandhttp://hdl.handle.net/2003/351312016-07-06T01:40:50Z2016-07-04T09:48:33ZTitle: Conditional heavy-tail behavior with applications to precipitation and river flow extremes
Authors: Kinsvater, Paul; Fried, Roland
Abstract: This article deals with the right-tail behavior of a response distribution F_Y conditional on a regressor vector X = x, restricted to the heavy-tailed case of Pareto-type conditional distributions F_Y(y|x) = P(Y ≤ y | X = x), with heaviness of the right tail characterized by the conditional extreme value index γ(x) > 0. We particularly focus on testing the hypothesis H_{0,tail}: γ(x) = γ_0 of constant tail behavior for some
γ_0 > 0 and all possible x.
When considering x as a time index, the term trend analysis is commonly used. In the recent past several such trend analyses of extreme value data have been published, mostly focusing on time-varying modeling of location and scale parameters of the response distribution. In many such environmental studies a simple test against trend based on Kendall's tau statistic is applied. This test is powerful when the center of the conditional distribution F_Y(y|x) changes monotonically in x, for instance, in a simple location model μ(x) = μ_0 + μ_1 x, but the test is rather insensitive to monotonic tail behavior, say, γ(x) = η_0 + η_1 x. This has to be considered, since for many environmental applications the main interest is in the tail rather than the center of a distribution. Our work is motivated by this problem, and it is our goal to demonstrate the opportunities and the limits of detecting and estimating non-constant conditional heavy-tail behavior with regard to applications from hydrology. We present and compare four different procedures by simulations and illustrate our findings on real data from hydrology: weekly maxima of hourly precipitation from France and monthly maximal river
flows from Germany.2016-07-04T09:48:33ZModeling of Gibbs energies of pure elements down to 0K using segmented regressionRoslyakova, IrinaSundmann, BoDette, HolgerZhang, LijunSteinbach, Ingohttp://hdl.handle.net/2003/351302016-07-05T12:55:24Z2016-07-04T08:51:14ZTitle: Modeling of Gibbs energies of pure elements down to 0K using segmented regression
Authors: Roslyakova, Irina; Sundmann, Bo; Dette, Holger; Zhang, Lijun; Steinbach, Ingo
Abstract: A novel thermodynamic modeling strategy for stable solid alloy phases is
proposed, based on a segmented regression approach. The model considers several
physical effects (e.g. electronic, vibrational) and is valid from 0K up to
the melting temperature. The proposed approach has been applied to several
pure elements. Results show good agreement with experimental data at low
and high temperatures. Since this is not the first attempt to propose a "universal"
physics-based model down to 0K for the pure elements as an alternative to the
current SGTE description, we also compare the results to existing models.
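Segmented regression in general fits separate functional pieces joined at an estimated breakpoint. A minimal continuous two-segment sketch with a grid search for the breakpoint (purely illustrative; the paper's model uses physically motivated terms, not straight-line segments):

```python
import numpy as np

def fit_segmented(x, y, breakpoints):
    """Fit y ~ a + b*x + c*max(x - t, 0) for each candidate breakpoint t
    and return the t (and coefficients) with the smallest residual sum
    of squares: a basic continuous two-segment regression."""
    best = None
    for t in breakpoints:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - t, 0.0)])
        coef, _res, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, t, coef)
    return best[1], best[2]

# Synthetic data with a kink (slope change of +2) at x = 4
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - 4.0, 0) + rng.normal(0, 0.1, x.size)
t_hat, coef = fit_segmented(x, y, breakpoints=np.linspace(1, 9, 81))
```

The continuity constraint at the breakpoint is what distinguishes this from fitting two independent regressions on either side.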
Analysis of the obtained results shows that the newly proposed model delivers
a more accurate description down to 0K for all studied pure elements according
to several statistical tests.2016-07-04T08:51:14ZRisk perception of climate change: Empirical evidence for GermanyFrondel, ManuelSimora, MichaelSommer, Stephanhttp://hdl.handle.net/2003/351252016-07-05T12:55:24Z2016-06-29T12:09:35ZTitle: Risk perception of climate change: Empirical evidence for Germany
Authors: Frondel, Manuel; Simora, Michael; Sommer, Stephan
Abstract: The perception of risks resulting from climate change is a key factor in motivating
individual adaptation and prevention behavior, as well as for the support of climate
policy measures. Using a generalized ordered logit approach and drawing on a
unique data set originating from two surveys conducted in 2012 and 2014, each among
more than 6,000 German households, we analyze the determinants of individual risk
perception associated with three kinds of natural hazards: heat waves, storms, and
floods. Our focus is on the role of objective risk measures and experience with these
natural hazards, whose frequency is likely to be affected by climate change. In line
with the received literature, the results suggest that personal experience with adverse
events and, even more importantly, personal damage therefrom are strong drivers of
individual risk perception.2016-06-29T12:09:35ZEstimation methods for the LRD parameter under a change in the meanRooch, AeneasZelo, IevaFried, Rolandhttp://hdl.handle.net/2003/351242016-07-06T01:40:50Z2016-06-28T14:50:45ZTitle: Estimation methods for the LRD parameter under a change in the mean
Authors: Rooch, Aeneas; Zelo, Ieva; Fried, Roland
Abstract: When analyzing time series which are supposed to exhibit long-range dependence (LRD), a basic
issue is the estimation of the LRD parameter, for example the Hurst parameter H ∈ (1/2, 1). Conventional
estimators of H easily lead to spurious detection of long memory if the time series includes a shift in the
mean. This defect has fatal consequences in change-point problems: Tests for a level shift rely on H, which
needs to be estimated before, but this estimation is distorted by the level shift.
We investigate two blocks approaches to adapt estimators of H to the case that the time series includes
a jump and compare them with other natural techniques as well as with estimators based on the trimming
idea via simulations. These techniques improve the estimation of H if there is indeed a change in the mean.
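The spurious-long-memory effect and the blocks idea can be illustrated with the classical aggregated-variance estimator (a generic choice; the paper studies other estimators, and its blocks approaches do not assume the change location to be known, as this toy split does):

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(5, 10, 20, 40)):
    """Aggregated-variance estimate of the Hurst parameter H: for an
    LRD series the variance of block means scales like m**(2H - 2),
    so H is read off a log-log regression against the block size m."""
    x = np.asarray(x, dtype=float)
    logm, logv = [], []
    for m in block_sizes:
        k = len(x) // m
        block_means = x[:k * m].reshape(k, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(block_means.var()))
    slope = np.polyfit(logm, logv, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(2)
noise = rng.normal(size=4000)                     # true H = 1/2, no long memory
jump = np.r_[np.zeros(2000), np.full(2000, 3.0)]  # level shift in the mean
shifted = noise + jump

h_naive = hurst_aggvar(shifted)   # spuriously large due to the shift
h_blocks = np.mean([hurst_aggvar(shifted[:2000]),
                    hurst_aggvar(shifted[2000:])])  # estimate within shift-free blocks
```

Here the naive estimate is inflated well above 1/2 because the level shift mimics long memory, while estimating within homogeneous blocks and averaging stays close to the true value.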
In the absence of such a change, the methods affect the usual estimation only slightly. As an adaptation, we recommend
an overlapping blocks approach: if one uses a consistent estimator, the adaptation will preserve this property,
and it performs well in simulations.2016-06-28T14:50:45ZGermany’s Energiewende: A tale of increasing costs and decreasing willingness-to-payAndor, Mark A.Frondel, ManuelVance, Colinhttp://hdl.handle.net/2003/351232016-07-05T12:55:24Z2016-06-28T14:48:24ZTitle: Germany’s Energiewende: A tale of increasing costs and decreasing willingness-to-pay
Authors: Andor, Mark A.; Frondel, Manuel; Vance, Colin
Abstract: This paper presents evidence that the accumulating costs of Germany’s ambitious
plan to transform its system of energy provision – the so-called Energiewende –
are butting up against consumers’ decreased willingness-to-pay (WTP) for it. Following
a descriptive presentation that traces the German promotion of renewable energy
technologies since 2000, we draw on two stated-preference surveys conducted in 2013
and 2015 that elicit the households’ WTP for green electricity. To deal with the bias
that typifies hypothetical responses, a switching regression model is estimated that
distinguishes respondents according to whether they express definite certainty in their
reported WTP. Our results reveal a strong contrast between the households’ general
acceptance of supporting renewable energy technologies and their own WTP for green
electricity.2016-06-28T14:48:24ZAdaptive grid semidefinite programming for finding optimal designsDuarte, Belmiro P.M.Wong, Weng KeeDette, Holgerhttp://hdl.handle.net/2003/351222016-07-06T01:40:50Z2016-06-28T14:33:05ZTitle: Adaptive grid semidefinite programming for finding optimal designs
Authors: Duarte, Belmiro P.M.; Wong, Weng Kee; Dette, Holger
Abstract: We find optimal designs for linear models using a novel algorithm that iteratively combines a Semidefinite
Programming (SDP) approach with adaptive grid (AG) techniques. The search space is first discretized
and SDP is applied to find the optimal design based on the initial grid. The points in the next grid set are
points that maximize the dispersion function of the SDP-generated optimal design using Nonlinear Programming
(NLP). The procedure is repeated until a user-specified stopping rule is reached. The proposed
algorithm is broadly applicable and we demonstrate its flexibility using (i) models with one or more variables,
and (ii) differentiable design criteria, such as A- and D-optimality, and non-differentiable criteria like
E-optimality, including the mathematically more challenging case when the minimum eigenvalue of the
information matrix of the optimal design has geometric multiplicity larger than 1. Our algorithm is computationally
efficient because it is based on mathematical programming tools and so optimality is assured at
each stage; it also exploits the convexity of the problems whenever possible. Using several linear models,
we show the proposed algorithm can efficiently find both old and new optimal designs.2016-06-28T14:33:05ZBeyond inequality: A novel measure of skewness and its propertiesKrämer, WalterDette, Holgerhttp://hdl.handle.net/2003/350872016-07-05T12:55:24Z2016-06-13T09:22:31ZTitle: Beyond inequality: A novel measure of skewness and its properties
Authors: Krämer, Walter; Dette, Holger
Abstract: We show that a recent amendment of the Gini coefficient, intended to make
the latter more sensitive to asymmetric income distributions, can be
viewed as an abstract measure of skewness. We develop some of its
properties and apply it to the US income distribution in 1974 and
2010.2016-06-13T09:22:31ZBaPreStoPro: an R package for Bayesian prediction of stochastic processesHermann, Simonehttp://hdl.handle.net/2003/350662016-07-05T12:55:24Z2016-06-07T11:49:58ZTitle: BaPreStoPro: an R package for Bayesian prediction of stochastic processes
Authors: Hermann, Simone
Abstract: In many applications, stochastic processes are used for modeling. Bayesian
analysis is a strong tool for inference as well as for prediction. We here present
an R package for a large class of models, all based on the definition of a jump
diffusion with a non-homogeneous Poisson process. Special cases, such as the Poisson
process itself, a general diffusion process or a hierarchical (mixed) diffusion model,
are considered. The package is a general toolbox, because it is based on the
stochastic differential equation, approximated with the Euler scheme. Functions
for simulation, estimation and prediction are provided for each considered model.2016-06-07T11:49:58ZBayesian prediction for stochastic processesHermann, Simonehttp://hdl.handle.net/2003/350192016-07-05T12:55:24Z2016-06-03T11:24:37ZTitle: Bayesian prediction for stochastic processes
Authors: Hermann, Simone
Abstract: In many fields of statistical analysis, one is not only interested in estimation
of model parameters, but in a prediction for future observations. For stochastic
processes, on the one hand, one can be interested in the prediction for the further
development of the current, i.e. observed, series. On the other hand, prediction
for a new series can be of interest. This work presents two Bayesian prediction
procedures based on the transition density of the Euler approximation, that include
estimation uncertainty as well as the model variance. In the first algorithm,
the pointwise predictive distribution is calculated; in the second, trajectories are
drawn. Both methods are compared and analyzed with respect to their
advantages and drawbacks and set in contrast to two commonly used prediction
approaches.2016-06-03T11:24:37ZAsymmetry and performance metrics for equity returnsBowden, Roger J.Posch, Peter N.Ullmann, Danielhttp://hdl.handle.net/2003/350162016-07-05T12:55:24Z2016-06-03T08:35:29ZTitle: Asymmetry and performance metrics for equity returns
Authors: Bowden, Roger J.; Posch, Peter N.; Ullmann, Daniel
Abstract: An assumption of symmetric asset returns, together with globally risk averse utility
functions, is unappealing for fund managers and other activist investors, whose preferences
switch between risk aversion on the downside and risk seeking on the upside. A performance
return criterion is originated that is more consistent with the implicit Friedman-Savage utility
ordering. Adapted from recent developments in the income distribution literature, the proposed
metric weights the lower versus upper conditional expected returns, while a dual spread or
dispersion metric also exists. The resulting performance metric is easy to compute. A point of
departure is the conventional Sharpe performance ratio, with the empirical comparisons extending
to a range of existing performance criteria. In contrast, the proposed W-metric results in
different and more embracing performance rankings.2016-06-03T08:35:29ZDual disadvantage and dispersion dynamics for income distributionsBowden, Roger J.Posch, Peter N.Ullmann, Danielhttp://hdl.handle.net/2003/350152016-07-05T12:55:24Z2016-06-03T08:33:06ZTitle: Dual disadvantage and dispersion dynamics for income distributions
Authors: Bowden, Roger J.; Posch, Peter N.; Ullmann, Daniel
Abstract: Income distribution has been a longstanding focus of social and economic interest,
but never more so than in recent times. New metrics for disadvantage and spread enable a
more precise differentiation of directional asymmetry and dispersion, drawing on an internal
contextual perspective. The dual metrics for asymmetry and spread can be plotted over time
into a phase plane, enabling comparative social welfare perspectives over time and between
countries. The methods are utilised to study the dramatic changes that took place in Europe
prior to and after the GFC. Major differences are revealed. In terms of asymmetry and spread,
some countries have been fallers (lower in both) while other countries are risers.2016-06-03T08:33:06ZBayesian D-optimal designs for error-in-variables modelsKonstantinou, MariaDette, Holgerhttp://hdl.handle.net/2003/349662016-07-05T12:55:24Z2016-05-18T10:31:47ZTitle: Bayesian D-optimal designs for error-in-variables models
Authors: Konstantinou, Maria; Dette, Holger
Abstract: Bayesian optimality criteria provide a design strategy that is robust against parameter
misspecification. We develop an approximate design theory for Bayesian D-optimality for
non-linear regression models with covariates subject to measurement errors. Both maximum
likelihood and least squares estimation are studied and explicit characterisations of the
Bayesian D-optimal saturated designs for the Michaelis-Menten, Emax and exponential
regression models are provided. Several data examples are considered for the case of no
preference for specific parameter values, where Bayesian D-optimal saturated designs are
calculated using the uniform prior and compared to several other designs, including the
corresponding locally D-optimal designs, which are often used in practice.2016-05-18T10:31:47ZNatural (non-)informative priors for skew-symmetric distributionsDette, HolgerLey, ChristopheRubio, Francisco J.http://hdl.handle.net/2003/349622016-05-12T02:00:13Z2016-05-11T11:28:49ZTitle: Natural (non-)informative priors for skew-symmetric distributions
Authors: Dette, Holger; Ley, Christophe; Rubio, Francisco J.
Abstract: In this paper, we present an innovative method for constructing proper priors for the
skewness parameter in the skew-symmetric family of distributions. The proposed method is
based on assigning a prior distribution on the perturbation effect of the skewness parameter,
which is quantified in terms of the Total Variation distance. We discuss strategies to translate
prior beliefs about the asymmetry of the data into an informative prior distribution of this
class. We show that our priors induce posterior distributions with good frequentist properties
via a Monte Carlo simulation study. We also propose a scale- and location-invariant prior
structure for models with unknown location and scale parameters and provide sufficient
conditions for the propriety of the corresponding posterior distribution. Illustrative examples
are presented using simulated and real data.2016-05-11T11:28:49ZTesting asymmetry in dependence with copula-coskewnessBücher, AxelIrresberger, FelixWeiss, Gregor N. F.http://hdl.handle.net/2003/349612016-05-12T02:00:11Z2016-05-11T10:14:08ZTitle: Testing asymmetry in dependence with copula-coskewness
Authors: Bücher, Axel; Irresberger, Felix; Weiss, Gregor N. F.
Abstract: A new measure of asymmetry in dependence is proposed which is based on
taking the difference between the margin-free coskewness parameters of the underlying
copula. The new measure and a related test are applied to both a hydrological and
a financial market data sample and we show that both samples exhibit systematic
asymmetric dependence.2016-05-11T10:14:08ZControl charts for the mean based on robust two-sample testsAbbas, SermadFried, Rolandhttp://hdl.handle.net/2003/349512016-05-03T02:01:03Z2016-05-02T13:33:57ZTitle: Control charts for the mean based on robust two-sample tests
Authors: Abbas, Sermad; Fried, Roland
Abstract: We propose and investigate robust control charts for the detection of sudden shifts in sequences of very noisy observations with a naturally slowly varying mean. They sequentially
apply local two-sample tests for the location problem. Thus, no previous knowledge about
the in-control behaviour is necessary.
We identify critical values for the tests to achieve a desired in-control average run length
(ARL_0) with extensive simulations. Control charts based on nonparametric tests or a randomization principle provide a satisfactory run length behaviour for different error distributions. They possess a nearly distribution-free ARL_0 and are fast in detecting jumps present in a time series.
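The chart statistic can be illustrated with the two-sample Hodges-Lehmann estimator mentioned later in this abstract, computed over a moving window split into two halves (a sketch only; the paper calibrates critical values by simulation to reach a desired ARL_0, which is omitted here):

```python
import numpy as np

def hodges_lehmann_2s(x, y):
    """Two-sample Hodges-Lehmann estimator: the median of all
    pairwise differences y_j - x_i (a robust location-shift estimate)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.median(np.subtract.outer(y, x))

def chart_statistics(series, width=20):
    """Slide a window of 2*width over the series, split it into two
    halves, and compute the HL shift estimate between the halves."""
    s = np.asarray(series, float)
    stats = []
    for t in range(2 * width, len(s) + 1):
        window = s[t - 2 * width:t]
        stats.append(hodges_lehmann_2s(window[:width], window[width:]))
    return np.array(stats)

rng = np.random.default_rng(3)
series = np.r_[rng.normal(0, 1, 100), rng.normal(4, 1, 50)]  # jump at t = 100
stats = chart_statistics(series)  # |stat| grows once the jump enters the window
```

An alarm would be raised once |stat| exceeds a critical value chosen to meet the target in-control run length.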
In our simulations and exemplary real-world applications from biosignal analysis, a test based
on the two-sample Hodges-Lehmann estimator leads to very promising results regarding distribution independence, robustness and detection speed.2016-05-02T13:33:57ZThe power of mandatory quality disclosureFrondel, ManuelGerster, AndreasVance, Colinhttp://hdl.handle.net/2003/349122016-04-27T02:00:10Z2016-04-26T13:48:45ZTitle: The power of mandatory quality disclosure
Authors: Frondel, Manuel; Gerster, Andreas; Vance, Colin
Abstract: Many countries have introduced Energy Performance Certificates (EPCs) to mitigate
information asymmetry problems with respect to the thermal quality of houses. Using big data
on real estate advertisements that cover large parts of the German housing market, this paper
empirically investigates the consequences of a shift from a voluntary to a mandatory quality
disclosure regime on the offer prices of houses. Illustrated by a stylized theoretical model, we
test the following key hypothesis: Prices for houses whose owners would not voluntarily disclose
their house’s energy consumption in real estate advertisements should decrease upon a
shift to a mandatory disclosure scheme. Employing an instrumental variable approach to cope
with the endogeneity of disclosure decisions, our analysis demonstrates the relative advantage
of mandatory over voluntary disclosure rules.2016-04-26T13:48:45ZResistance to fatigue and prediction of lifetime of wire tendons cast into concrete up to 10^8 cyclesHeinrich, JensHeeke, GuidoMaurer, ReinhardMüller, Christine H.http://hdl.handle.net/2003/349112016-04-27T02:00:08Z2016-04-26T13:46:14ZTitle: Resistance to fatigue and prediction of lifetime of wire tendons cast into concrete up to 10^8 cycles
Authors: Heinrich, Jens; Heeke, Guido; Maurer, Reinhard; Müller, Christine H.
Abstract: Usually for verification of compliance, the fatigue resistance of prestressing steel is determined from tests of
naked specimens at 2 million cycles. However, the fatigue resistance of tendons cast into concrete, which is
relevant for design, is substantially lower. To verify the resistance of existing older prestressed concrete bridges and for the
design of new bridges, S-N curves of prestressing steel in curved steel ducts embedded into concrete are
needed. In bridges, the number of load cycles due to heavy vehicles may rise up to about 10^8 or even more.
Previous tests with curved tendons in steel ducts primarily cover a range of up to about 20 million cycles.
So far, no real endurance limit has been established yet. Hence the S-N curves given in Eurocode 2 and
Model Code 2010 are defined hypothetically for a range from 10^6 up to 10^8 cycles and are not based on test results.
The reason is that experimental investigations in a range up to 10^8 cycles are very expensive and also demand
a very long duration.
Essential progress results from the development of an optimized test setup that allows a frequency of 10 Hz
for the applied load cycles. With this setup, the experimental investigations up to 10^8 cycles have been carried out by
means of prestressed concrete beams with embedded curved tendons in steel ducts.
Furthermore, procedures to forecast the lifetime in the case of very low stress ranges, as well as the
remaining lifetime of a running test, have been developed in conjunction with an interdisciplinary research
project. The procedures are based on a refined statistical analysis of the extensively measured data, including
the increase of crack width, strains, sound emission, etc. Additionally, the analysis of the latter leads to some
interesting new insights.2016-04-26T13:46:14ZNeue Erkenntnisse zur Ermüdungsfestigkeit und Prognose der Lebensdauer von einbetonierten Spannstählen bei sehr hohen LastwechselzahlenHeeke, GuidoHeinrich, JensMaurer, ReinhardMüller, Christinehttp://hdl.handle.net/2003/349102016-04-27T02:00:12Z2016-04-26T13:43:10ZTitle: Neue Erkenntnisse zur Ermüdungsfestigkeit und Prognose der Lebensdauer von einbetonierten Spannstählen bei sehr hohen Lastwechselzahlen
Authors: Heeke, Guido; Heinrich, Jens; Maurer, Reinhard; Müller, Christine
Abstract: For the fatigue verification of existing older prestressed concrete bridges or of new structures, design S-N curves for prestressing steel in the embedded state are required. Previous tests with curved post-tensioned tendons in bond essentially cover the S-N curve in the finite-life regime. Only very few tests reach load cycle numbers above 10 million, and no true endurance limit is discernible. The course of the S-N curve in the endurance regime has therefore essentially been fixed hypothetically. Within a subproject of SFB 823 "Statistik nichtlinearer dynamischer Prozesse" at TU Dortmund, the fatigue strength of embedded prestressing steels is investigated experimentally in the range up to about 10^8 load cycles. Furthermore, on the basis of the very extensive data of various measured physical quantities, several prediction procedures were developed using
mathematical-statistical methods. For a prescribed confidence level, these allow the estimation of a prediction interval with mean and quantile curves for the damage progression.2016-04-26T13:43:10ZPrediction intervals for the failure time of prestressed concrete beamsSzugat, SebastianHeinrich, JensMaurer, ReinhardMüller, Christine H.http://hdl.handle.net/2003/348962016-04-19T02:00:10Z2016-04-18T09:20:13ZTitle: Prediction intervals for the failure time of prestressed concrete beams
Authors: Szugat, Sebastian; Heinrich, Jens; Maurer, Reinhard; Müller, Christine H.
Abstract: The aim is the prediction of the failure time of prestressed concrete beams under low
cyclic load. Since experiments under low load take very long, accelerated failure tests with higher
load are conducted. However, the accelerated tests are expensive, so that only few tests are
available. To obtain a more precise failure time prediction, the additional information of
time points of breakage of tension wires is used. These breakage time points are modeled
by a nonlinear birth process. This allows not only point prediction of a critical number
of broken tension wires but also prediction intervals which express the uncertainty of the
prediction.2016-04-18T09:20:13ZHomogeneity testing for skewed and cross-correlated data in regional flood frequency analysisLilienthal, JonaFried, RolandSchumann, Andreas H.http://hdl.handle.net/2003/348942016-04-16T02:00:11Z2016-04-15T11:35:30ZTitle: Homogeneity testing for skewed and cross-correlated data in regional flood frequency analysis
Authors: Lilienthal, Jona; Fried, Roland; Schumann, Andreas H.
Abstract: In regional frequency analysis the homogeneity of a group of multiple
stations is an essential pre-assumption. A standard procedure in hydrology to
evaluate this condition is the test based on the homogeneity measure of Hosking and
Wallis, which applies L-moments. Its disadvantages are a lack of power when
analysing highly skewed data and the implicit assumption of spatial independence.
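The sample L-moments on which the Hosking-Wallis measure is built can be computed from probability-weighted moments; a minimal sketch (the heterogeneity statistic itself additionally involves regional averaging and simulation from a fitted distribution, which is omitted):

```python
import numpy as np

def sample_lmoments(data):
    """First three sample L-moments (l1, l2, l3) via probability-
    weighted moments, plus the L-CV t = l2/l1 used in the
    Hosking-Wallis heterogeneity measure."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3, l2 / l1

# For an exponential distribution with scale beta: l1 = beta, l2 = beta/2,
# so with beta = 2 we expect l1 = 2, l2 = 1, t = 1/2, and l3/l2 = 1/3
rng = np.random.default_rng(4)
l1, l2, l3, t = sample_lmoments(rng.exponential(scale=2.0, size=100_000))
```

Trimmed L-moments, as used in the paper, downweight or drop the extreme order statistics in these weighted sums to gain robustness.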
To address these issues we generalize this procedure in two ways. Copulas are utilised to
model intersite dependence, and trimmed L-moments are employed as a more robust alternative to
ordinary L-moments. The results of simulation studies are presented to discuss the
influence of different copula models and different trimming parameters. It turns out
that the usage of asymmetrically trimmed L-moments improves the heterogeneity
detection in skewed data, while maintaining a reasonable error rate. Simple copula
models are sufficient to incorporate the dependence structure of the data in the
procedure. Additionally, a more robust behaviour against extreme events at single
stations is achieved with the use of trimmed L-moments. Strong intersite dependence
and skewed data reveal the need for a modified procedure in a case study with data
from Saxony, Germany.2016-04-15T11:35:30ZSieve maximum likelihood estimation in a semi-parametric regression model with errors in variablesBelomestny, DenisKlochkov, EgorSpokoiny, Vladimirhttp://hdl.handle.net/2003/348882016-04-13T02:00:11Z2016-04-12T13:01:21ZTitle: Sieve maximum likelihood estimation in a semi-parametric regression model with errors in variables
Authors: Belomestny, Denis; Klochkov, Egor; Spokoiny, Vladimir
Abstract: The paper deals with a semi-parametric regression problem under deterministic
and regular design which is observed with errors. We first
linearise the problem using a sieve approach and then apply the total
penalised maximum likelihood estimator to the linearised model. Sufficient conditions for
√n-consistency and efficiency under parametric
assumption are derived and a possible misspecification bias under different
smoothness assumptions on the design is analysed. The Monte
Carlo simulations show the performance of the estimator with simulated
data.2016-04-12T13:01:21ZMultiscale inference for a multivariate density with applications to X-ray astronomyEckle, KonstantinBissantz, NicolaiDette, HolgerProksch, KatharinaEinecke, Sabrinahttp://hdl.handle.net/2003/348872016-04-13T02:00:08Z2016-04-12T12:49:07ZTitle: Multiscale inference for a multivariate density with applications to X-ray astronomy
Authors: Eckle, Konstantin; Bissantz, Nicolai; Dette, Holger; Proksch, Katharina; Einecke, Sabrina
Abstract: In this paper we propose methods for inference of the geometric features of a multivariate density. Our approach uses multiscale tests for the monotonicity of the density
at arbitrary points in arbitrary directions. In particular, a significance test for a mode at
a specific point is constructed. Moreover, we develop multiscale methods for identifying
regions of monotonicity and a general procedure for detecting the modes of a multivariate
density. It is shown that the latter method localizes the modes with an effectively
optimal rate. The theoretical results are illustrated by means of a simulation study and
a data example. The new method is applied to and motivated by the determination
and verification of the position of high-energy sources from X-ray observations by the
Swift satellite, which is important for a multiwavelength analysis of objects such as Active
Galactic Nuclei.2016-04-12T12:49:07ZOptimal designs for dose response curves with common parametersFeller, ChrystelSchorning, KirstenDette, HolgerBermann, GeorginaBornkamp, Björnhttp://hdl.handle.net/2003/348402016-03-16T03:00:14Z2016-03-15T14:46:31ZTitle: Optimal designs for dose response curves with common parameters
Authors: Feller, Chrystel; Schorning, Kirsten; Dette, Holger; Bermann, Georgina; Bornkamp, Björn
Abstract: A common problem in Phase II clinical trials is the comparison of dose response curves
corresponding to different treatment groups. If the effect of the dose level is described by
parametric regression models and the treatments differ in the administration frequency (but
not in the type of drug), a reasonable assumption is that the regression models for the different
treatments share common parameters.
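For context, the D-optimality criterion developed below maximizes the determinant of the information matrix M(ξ) = Σ w_i f(x_i) f(x_i)'. A generic multiplicative-algorithm sketch on a fixed grid for a plain quadratic model (illustrative only; it ignores the common-parameter structure that is the point of the paper):

```python
import numpy as np

def d_optimal_weights(F, iters=2000):
    """Multiplicative algorithm for approximate D-optimal designs:
    w_i <- w_i * d_i / p, where d_i = f_i' M(w)^{-1} f_i is the
    variance function and p the number of parameters."""
    n, p = F.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        M = F.T @ (w[:, None] * F)
        d = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)
        w *= d / p
    return w

# Quadratic regression f(x) = (1, x, x^2) on a grid over [-1, 1];
# the known D-optimal design puts equal mass 1/3 at x = -1, 0, 1
x = np.linspace(-1, 1, 21)
F = np.column_stack([np.ones_like(x), x, x**2])
w = d_optimal_weights(F)
```

The update preserves the total mass of the design and, by the general equivalence theorem, converges to a design whose variance function d(x) attains its maximum p on the support points.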
This paper develops optimal design theory for the comparison of different regression models
with common parameters. We derive upper bounds on the number of support points of
admissible designs, and explicit expressions for D-optimal designs are derived for frequently
used dose response models with a common location parameter. If the location and scale
parameter in the different models coincide, minimally supported designs are determined and
sufficient conditions for their optimality in the class of all designs are derived. The results are
illustrated in a dose-finding study comparing monthly and weekly administration.2016-03-15T14:46:31ZModel based optimization of a statistical simulation model for single diamond grindingHerbrandt, SwetlanaLigges, UwePinho Ferreira, ManuelKansteiner, MichaelBiermann, DirkTillmann, WolfgangWeihs, Claushttp://hdl.handle.net/2003/345212016-02-27T03:00:10Z2016-02-26T12:35:25ZTitle: Model based optimization of a statistical simulation model for single diamond grinding
Authors: Herbrandt, Swetlana; Ligges, Uwe; Pinho Ferreira, Manuel; Kansteiner, Michael; Biermann, Dirk; Tillmann, Wolfgang; Weihs, Claus
Abstract: We present a model for simulating normal forces arising during a grinding
process in cement for single diamond grinding. Assuming the diamond to have
the shape of a pyramid, a very fast calculation of force and removed volume can be
achieved. The basic approach is the simulation of the scratch track. Its triangle profile
is determined by the shape of the diamond. The approximation of the scratch
track is realized by stringing together polyhedra. Their sizes depend on both the
actual cutting depth and an error implicitly describing the material brittleness.
Each scratch track part can be subdivided into three three-dimensional simplices
for a straightforward calculation of the removed volume. Since the scratched mineral
subsoil is generally inhomogeneous, the forces at different positions of the
workpiece are expected to vary. This heterogeneous nature is considered by sampling
from a Gaussian random field.
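The model-based optimization step with a noisy Kriging surrogate can be sketched with a toy 1-D Gaussian process and a lower-confidence-bound rule (the authors' kernel and acquisition criterion may differ, and `objective` merely stands in for the deviation between modelled and observed forces):

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

def gp_posterior(x_obs, y_obs, x_new, noise=0.1):
    """Posterior mean/sd of a noisy Kriging (GP) surrogate at x_new."""
    K = rbf_kernel(x_obs, x_obs) + noise**2 * np.eye(len(x_obs))
    Ks = rbf_kernel(x_new, x_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    var = rbf_kernel(x_new, x_new).diagonal() - np.einsum(
        "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, np.sqrt(np.maximum(var, 1e-12))

def objective(x, rng):
    """Stand-in for the noisy force deviation to be minimized (min at 2)."""
    return (x - 2.0) ** 2 + rng.normal(0, 0.1, np.shape(x))

rng = np.random.default_rng(5)
x_obs = np.array([0.0, 1.0, 3.0, 4.0])
y_obs = objective(x_obs, rng)
grid = np.linspace(0, 4, 81)
for _ in range(15):  # sequentially evaluate the lower-confidence-bound minimizer
    mean, sd = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmin(mean - sd)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next, rng))
best = x_obs[np.argmin(y_obs)]
```

The surrogate keeps the number of expensive evaluations small by concentrating new points where the predicted deviation, minus an exploration bonus, is lowest.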
To achieve a realistic outcome the model parameters are adjusted by applying
model-based optimization methods. A noisy Kriging model is chosen as surrogate
to approximate the deviation between modelled and observed forces. This deviation
is minimized, and the modelled forces and the actual forces from the
conducted experiments are rather similar.2016-02-26T12:35:25ZOptimal designs for regression models with autoregressive errors structureDette, HolgerPepelyshev, AndreyZhigljavsky, Anatolyhttp://hdl.handle.net/2003/345122016-02-16T03:00:10Z2016-02-15T12:56:47ZTitle: Optimal designs for regression models with autoregressive errors structure
Authors: Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly
Abstract: In the one-parameter regression model with AR(1) and AR(2) errors we find explicit
expressions and a continuous approximation of the optimal discrete design for the signed
least squares estimator. The results are used to derive the optimal variance of the best
linear estimator in the continuous time model and to construct efficient estimators and
corresponding optimal designs for finite samples. The resulting procedure (estimator and
design) provides nearly the same efficiency as the weighted least squares and its variance
is close to the optimal variance in the continuous time model. The results are illustrated
by several examples demonstrating the feasibility of our approach.2016-02-15T12:56:47ZGovernment spending shocks and labor productivityLinnemann, LudgerGábor, B. UhrinWagner, Martinhttp://hdl.handle.net/2003/345112016-02-16T03:00:08Z2016-02-15T12:54:19ZTitle: Government spending shocks and labor productivity
Authors: Linnemann, Ludger; Uhrin, Gábor B.; Wagner, Martin
Abstract: A central question in the empirical fiscal policy literature is the magnitude, in fact even
the sign, of the fiscal multiplier. Standard identification schemes for fiscal VAR models
typically imply positive output as well as labor productivity responses to expansionary
government spending shocks. The standard macro assumption of decreasing returns to
labor, however, implies that expansionary government spending shocks should lead to
increasing output and hours, but to decreasing labor productivity. To potentially reconcile
theory and empirical analysis we impose, amongst other sign restrictions, opposite signs
of the impulse responses of output and labor productivity to government spending shocks
in eight- to ten-variable VAR models, estimated on quarterly US data. Doing so leads to
contractionary effects of positive government spending shocks. This potentially surprising
finding is robust to the inclusion of variable capital utilization rates and total factor
productivity.2016-02-15T12:54:19ZComparing default predictions in the rating industry for different sets of obligorsKrämer, WalterNeumärker, Simonhttp://hdl.handle.net/2003/344992016-02-03T03:00:23Z2016-02-02T14:54:59ZTitle: Comparing default predictions in the rating industry for different sets of obligors
Authors: Krämer, Walter; Neumärker, Simon
Abstract: We generalize the refinement ordering for well calibrated probability forecasters to the
case where the debtors under consideration are not necessarily identical. This ordering is
consistent with many well known skill scores used in practice. We also add an illustration
using default predictions made by the leading rating agencies Moody's and S&P.2016-02-02T14:54:59Z