#8 libsvm cache_size

open
nobody
None
5
2007-12-10
2007-12-10
David Doukhan
No

Hi, it seems that the argument cache_size is not correctly set in the svm_parameter structure passed to libsvm.svm_train_one_pyml in the trainLibsvm method of the PyML.svm module.

After adding some debug output to the svm_train_one_pyml function, it turned out that the cache_size field of the svm_parameter structure is always set to 0.

I think I found a way to fix this bug:
in the call to misc.update in the trainLibsvm method of the PyML.svm module, the current code is the following:

> misc.update(param,
> kernel_type = LINEAR,
> svm_type = self.svm_type,
> cach_size = self.cacheSize,
> eps = self.eps,
> C = self.C,
> nu = 0.5,
> degree = 2,
> p = 0.1,
> shrinking = 1,
> nr_weight = 0,
> coef0 = 0)

Changing the line:
> cach_size = self.cacheSize,
to:
> cache_size = self.cacheSize,

fixes the bug.
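To illustrate why the typo goes unnoticed, here is a minimal sketch (not PyML's actual code) assuming misc.update copies keyword arguments onto the parameter object via setattr: a misspelled keyword then silently creates a new attribute instead of raising an error, and the real field keeps its default of 0.

```python
class SvmParameter:
    """Hypothetical stand-in for libsvm's svm_parameter struct."""
    def __init__(self):
        self.cache_size = 0.0  # default; the typo leaves this untouched
        self.C = 1.0

def update(obj, **kwargs):
    # Copies each keyword onto obj, creating attributes as needed --
    # which is exactly what hides a misspelled parameter name.
    for name, value in kwargs.items():
        setattr(obj, name, value)

param = SvmParameter()
update(param, cach_size=256, C=10)  # note the misspelled keyword

print(param.cache_size)  # still 0.0 -- libsvm sees no cache
print(param.cach_size)   # 256 -- silently stored under the wrong name
```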

A safeguard against similar errors would be to call libsvm's svm_check_parameter function to validate the parameters before passing them to libsvm: in this case, that function would have reported that a cache size of 0 is not an acceptable value.
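A pre-flight check in the spirit of svm_check_parameter could look like the sketch below (the function and field names here are assumptions for illustration, not PyML's API); libsvm's own check rejects cache_size <= 0, which is precisely the value the typo leaves behind.

```python
def check_parameter(param):
    """Return an error message, or None if the parameters look sane.

    Mirrors the kind of range checks libsvm's svm_check_parameter
    performs; only a few representative fields are checked here.
    """
    if getattr(param, "cache_size", 0) <= 0:
        return "cache_size <= 0"
    if getattr(param, "eps", 0) <= 0:
        return "eps <= 0"
    if getattr(param, "C", 0) <= 0:
        return "C <= 0"
    return None

class Params:
    cache_size = 0  # the value the misspelled keyword leaves behind
    eps = 0.001
    C = 10

print(check_parameter(Params()))  # -> "cache_size <= 0"
```

Running such a check right after misc.update would have turned this silent slowdown into a loud, immediate error.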

Fortunately, even with this bug, the predictions of the SVM classifier are unchanged; it likely only affects the speed of the training algorithm. I still don't know the exact impact of this argument.

Discussion