Re: [Dclib-devel] On svm_pegasos
From: Davis K. <dav...@us...> - 2009-09-23 11:48:47
It's really just a combination of the SVM solver from the paper "Pegasos: Primal Estimated sub-GrAdient SOlver for SVM" (2007) by Shai Shalev-Shwartz, Yoram Singer, and Nathan Srebro, and the sparsification method from another paper, "The Kernel Recursive Least Squares Algorithm" by Yaakov Engel. The kcentroid implements Engel's sparsification trick in a nice reusable way, and svm_pegasos is basically straight out of the Pegasos paper. There are also a lot of comments in the code.

I also added something to automatically adjust the tolerance (this is the parameter of Engel's ALD condition). So when you ask a kcentroid to use only so many dictionary vectors, all it really does is estimate the tolerance that would have limited it to that many vectors and then discard vectors above that threshold. But this is a minor and straightforward addition and doesn't change the results. It just makes life easier on the user.

Cheers,
Davis

2009/9/23 Q. W. Xiao <qw...@li...>
> Dear Mr. King,
>
> I came across the dlib C++ library today, and I am very interested in the
> implementation of svm_pegasos. You said that "this object uses the kcentroid
> object to maintain a sparse approximation of the learned decision function",
> and then "the number of support vectors in the resulting decision function
> is also unrelated to the size of the dataset". This is a very attractive
> property. Could you please provide some materials giving the details of this
> method?
>
> Best regards,
>
> Quan-Wu Xiao