Explainable online ensemble of deep neural network pruning for time series forecasting

dc.contributor.author: Saadallah, Amal
dc.contributor.author: Jakobs, Matthias
dc.contributor.author: Morik, Katharina
dc.date.accessioned: 2023-08-30T10:18:32Z
dc.date.available: 2023-08-30T10:18:32Z
dc.date.issued: 2022-08-02
dc.description.abstract: The complex and evolving nature of time series data makes forecasting one of the most challenging tasks in machine learning. Typical forecasting methods are designed to model time-evolving dependencies between data observations. However, it is generally accepted that none of them is universally valid for every application. Therefore, learning heterogeneous ensembles that combine a diverse set of forecasters appears to be a promising way to tackle this task. While several approaches to time series forecasting have focused on how to combine individual models in an ensemble, ranging from simple and enhanced averaging strategies to meta-learning methods, few works have tackled ensemble pruning, i.e., selecting which individual models take part in the ensemble. In addition, in the classical machine learning literature, ensemble pruning techniques are mostly restricted to operating in a static manner. To deal with changes in the relative performance of models as well as changes in the data distribution, we employ gradient-based saliency maps for online ensemble pruning of deep neural networks. The method generates performance saliency maps for the individual models, which are subsequently used to prune the ensemble while taking both accuracy and diversity into account. In addition, the saliency maps can be exploited to explain why specific models were selected to construct the ensemble that acts as the forecaster at a certain time interval or instant. An extensive empirical study on many real-world datasets demonstrates that our method achieves results that are excellent or on par with state-of-the-art approaches as well as several baselines. Our code is available on GitHub (https://github.com/MatthiasJakobs/os-pgsm/tree/ecml_journal_2022). [en]
dc.identifier.uri: http://hdl.handle.net/2003/42086
dc.identifier.uri: http://dx.doi.org/10.17877/DE290R-23919
dc.language.iso: en [de]
dc.relation.ispartofseries: Machine learning;111(9)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [de]
dc.subject: Ensemble pruning [en]
dc.subject: Deep learning [en]
dc.subject: Online learning [en]
dc.subject: Time series forecasting [en]
dc.subject: Concept drift [en]
dc.subject: Explainability [en]
dc.subject: Saliency maps [en]
dc.subject.ddc: 004
dc.subject.rswk: Deep learning [de]
dc.subject.rswk: Zeitreihe [de]
dc.subject.rswk: Maschinelles Lernen [de]
dc.subject.rswk: Salienz [de]
dc.title: Explainable online ensemble of deep neural network pruning for time series forecasting [en]
dc.type: Text [de]
dc.type.publicationtype: Article [de]
dcterms.accessRights: open access
eldorado.secondarypublication: true [de]
eldorado.secondarypublication.primarycitation: Saadallah, A., Jakobs, M. & Morik, K. Explainable online ensemble of deep neural network pruning for time series forecasting. Mach Learn 111, 3459–3487 (2022). https://doi.org/10.1007/s10994-022-06218-4 [de]
eldorado.secondarypublication.primaryidentifier: https://doi.org/10.1007/s10994-022-06218-4 [de]
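
To illustrate the gradient-based saliency idea summarized in the abstract, the following is a minimal sketch, assuming PyTorch, of scoring candidate forecasters on the current input window via the gradient of their forecast loss and keeping only the most competent ones. The function names (performance_saliency, prune_ensemble) and the saliency-weighted error score are hypothetical illustrations, not the authors' OS-PGSM algorithm, which additionally handles diversity and regions of competence; see the linked repository for the actual implementation.

```python
import torch


def performance_saliency(model, window, target, loss_fn=torch.nn.functional.mse_loss):
    """Absolute gradient of the forecast loss w.r.t. the input window.

    Large values mark the time steps that drive this model's error on the
    current window, giving a per-model 'performance saliency map'.
    """
    window = window.clone().detach().requires_grad_(True)
    loss = loss_fn(model(window), target)
    loss.backward()
    return window.grad.abs()


def prune_ensemble(models, window, target, k=3):
    """Keep the k candidate forecasters with the lowest saliency-weighted error."""
    scores = []
    for model in models:
        saliency = performance_saliency(model, window, target)
        with torch.no_grad():
            error = torch.nn.functional.mse_loss(model(window), target)
        # A model that is both accurate and not overly sensitive to the
        # current window gets a low score and stays in the ensemble.
        scores.append((saliency.mean() * error).item())
    ranked = sorted(range(len(models)), key=scores.__getitem__)
    return [models[i] for i in ranked[:k]]


if __name__ == "__main__":
    # Toy example: three small fully connected forecasters on a window of 10 steps.
    torch.manual_seed(0)
    models = [
        torch.nn.Sequential(torch.nn.Linear(10, 8), torch.nn.ReLU(), torch.nn.Linear(8, 1))
        for _ in range(3)
    ]
    window = torch.randn(1, 10)   # one input window of length 10
    target = torch.randn(1, 1)    # next-step ground truth
    pruned = prune_ensemble(models, window, target, k=2)
    print(len(pruned), "models kept")
```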

Files

Original bundle
Name: s10994-022-06218-4.pdf
Size: 1.55 MB
Format: Adobe Portable Document Format
Description: DNB
License bundle
Name: license.txt
Size: 4.85 KB
Format: Item-specific license agreed to upon submission
Description: