Online kernel learning with nearly constant support vectors
Abstract
The Nyström method has been widely used to improve the computational efficiency of batch kernel learning. Its key idea is to randomly sample M support vectors from the collection of T training instances and to learn a kernel classifier in the space spanned by the sampled support vectors. In this work, we study online regularized kernel learning using the Nyström method, with a focus on the sample complexity, i.e., the number of randomly sampled support vectors needed to attain the optimal convergence rate O(1/T), where T is the number of training instances received in online learning. We show that, when the loss function is smooth and strongly convex, only O(log T) randomly sampled support vectors are needed to guarantee an O(log T/T) convergence rate, which is nearly optimal except for the log T factor. We further validate our theory with an extensive empirical study.
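The Nyström construction the abstract describes can be sketched as follows: sample M landmark points ("support vectors") from the T training instances, and build an M-dimensional feature map whose inner products approximate the full T x T kernel matrix. This is a minimal, generic sketch in NumPy, not the paper's online algorithm; the RBF kernel, the function names, and all parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, M, gamma=1.0, seed=None):
    """Map X (T x d) to M-dimensional Nystrom features Z such that
    Z @ Z.T approximates the full kernel matrix K(X, X)."""
    rng = np.random.default_rng(seed)
    T = X.shape[0]
    # Randomly sample M training instances as support vectors (landmarks).
    idx = rng.choice(T, size=M, replace=False)
    landmarks = X[idx]
    K_MM = rbf_kernel(landmarks, landmarks, gamma)  # M x M landmark kernel
    K_TM = rbf_kernel(X, landmarks, gamma)          # T x M cross-kernel
    # Form K_MM^{-1/2} via an eigendecomposition; floor tiny eigenvalues
    # for numerical stability (a pseudo-inverse square root).
    vals, vecs = np.linalg.eigh(K_MM)
    vals = np.maximum(vals, 1e-12)
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return K_TM @ W  # T x M feature map

# Compare the Nystrom approximation to the exact kernel matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Z = nystrom_features(X, M=50, gamma=0.5, seed=0)
K_full = rbf_kernel(X, X, gamma=0.5)
rel_err = np.linalg.norm(Z @ Z.T - K_full) / np.linalg.norm(K_full)
```

A linear model trained on the M-dimensional features Z then plays the role of the kernel classifier in the span of the sampled support vectors; the abstract's result concerns how small M can be (roughly logarithmic in T, under smoothness and strong convexity) while preserving a near-optimal convergence rate.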