Bibtype |
Inproceedings |
Bibkey |
Pfahler/Morik/2018a |
Author |
Pfahler, Lukas and Morik, Katharina |
Title |
Nyström-SGD: Rapidly Learning Kernel-Classifiers with Conditioned Stochastic Gradient Descent |
Booktitle |
Machine Learning and Knowledge Discovery in Databases - European Conference, {ECML} {PKDD} 2018, Dublin, Ireland |
Abstract |
Kernel methods are a popular choice for classification problems, but when solving large-scale learning tasks, computing the quadratic kernel matrix quickly becomes infeasible. To circumvent this problem, the Nyström method, which approximates the kernel matrix using only a small sample of its rows and columns, has been proposed. Other techniques to speed up kernel learning include stochastic first-order optimization and conditioning. We introduce Nyström-SGD, a learning algorithm that trains kernel classifiers by minimizing a convex loss function with conditioned stochastic gradient descent while exploiting the low-rank structure of a Nyström kernel approximation. Our experiments suggest that Nyström-SGD enables us to rapidly train high-accuracy classifiers for large-scale classification tasks.
|
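The low-rank structure the abstract refers to can be illustrated with the classic Nyström construction: sample m landmark points, then build a rank-m feature map Phi such that Phi Phi^T ≈ K, so a linear classifier trained on Phi by (conditioned) SGD approximates the kernel classifier without ever forming the full n × n kernel matrix. The numpy sketch below shows only this standard approximation step, not the authors' Nyström-SGD algorithm; the function names, the RBF kernel choice, and the gamma value are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Pairwise RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystroem_features(X, m, gamma, seed=0):
    # Sample m landmarks and build a feature map Phi with
    # Phi @ Phi.T = C W^+ C.T ~ K  (the standard Nystroem approximation),
    # where C = k(X, landmarks) and W = k(landmarks, landmarks).
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)          # n x m cross-kernel
    W = rbf_kernel(X[idx], X[idx], gamma)     # m x m landmark kernel
    vals, vecs = np.linalg.eigh(W)            # W is symmetric PSD
    keep = vals > 1e-10                       # drop numerically zero modes
    Phi = C @ vecs[:, keep] / np.sqrt(vals[keep])
    return Phi, idx

# Toy data: the rank-m feature map stands in for the full n x n kernel.
X = np.random.default_rng(1).normal(size=(200, 5))
Phi, idx = nystroem_features(X, m=50, gamma=0.5)
K = rbf_kernel(X, X, 0.5)
rel_err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
```

On the landmark rows the approximation is exact (W W^+ W = W), and any convex loss can then be minimized over the m-dimensional features Phi with stochastic gradient descent at a cost independent of n².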
Year |
2018 |
Project |
SFB876-C3 |