|Title:||Boosting classifiers for drifting concepts|
|Abstract:||This paper proposes a boosting-like method for training a classifier ensemble from data streams. It naturally adapts to concept drift and makes it possible to quantify the drift in terms of its base learners. The algorithm is empirically shown to outperform learning algorithms that ignore concept drift. It performs no worse than advanced adaptive time window and example selection strategies, which store all the data and are thus unsuited for mining massive streams.|
|Subject Headings:||Base learners; Mining massive streams|
|Appears in Collections:||Sonderforschungsbereich (SFB) 475|