Author(s): Bockermann, Christian
Lee, Sangkyun
Title: Scalable stochastic gradient descent with improved confidence
Language (ISO): en
Abstract: Stochastic gradient descent methods have been quite successful for solving large-scale and online learning problems. We provide a simple parallel framework to obtain solutions of high confidence, where the confidence can be easily controlled by the number of processes, independently of the length of the learning processes. Our framework is implemented as scalable open-source software that can be configured for a single multicore machine or for a cluster of computers; the training outcomes from independent parallel processes are combined to produce the final output.
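A minimal sketch of the scheme the abstract describes, assuming squared-loss SGD and plain parameter averaging as the combination step; the loss, step size, and combination rule below are illustrative assumptions, not the paper's actual implementation.

    import numpy as np
    from multiprocessing import Pool

    def sgd_run(args):
        # One independent SGD process: its own seed, its own random data order.
        seed, X, y, epochs, eta = args
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):
                grad = (X[i] @ w - y[i]) * X[i]   # squared-loss stochastic gradient
                w -= eta * grad
        return w

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.standard_normal((1000, 10))
        w_true = rng.standard_normal(10)
        y = X @ w_true + 0.1 * rng.standard_normal(1000)

        k = 8  # number of independent processes; confidence is controlled by k
        with Pool(k) as pool:
            models = pool.map(sgd_run, [(s, X, y, 5, 0.01) for s in range(k)])

        # Combination step (an assumption here: averaging the k parameter vectors).
        w_final = np.mean(models, axis=0)
        print("distance to true parameters:", np.linalg.norm(w_final - w_true))

Because the k runs are independent, the confidence of the combined output can be raised simply by increasing k, without lengthening any individual learning process.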
URI: http://hdl.handle.net/2003/29345
http://dx.doi.org/10.17877/DE290R-3293
Publication date: 2012-02-28
Is part of: NIPS Workshop on Big Learning -- Algorithms, Systems, and Tools for Learning at Scale
Appears in collections: Sonderforschungsbereich (SFB) 876

Files in this item:
File                        Description  Size       Format
lee_bockermann_2011a_2.pdf  DNB          185.65 kB  Adobe PDF


This resource is protected by copyright. rightsstatements.org