Hi,

Your question caught my attention because I was just reading a CVPR'12 paper today about calibrating SVM classifier scores into a probability distribution using Extreme Value Theory.
The authors provide their code wrapped in Python via SWIG; I tried to get it working today but haven't succeeded yet and probably need to explore it further. Give it a try!
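In case it helps while the wrapper isn't building: as far as I understand it (much simplified compared to the paper), the idea is to fit a Weibull distribution to the extreme tail of the SVM scores and use its CDF as a calibrated confidence. A toy sketch with scipy, not the authors' code, and with made-up data:

import numpy as np
from scipy.stats import weibull_min
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy data and a linear SVM (placeholders, not from the paper).
X, y = make_classification(n_samples=500, random_state=0)
scores = LinearSVC().fit(X, y).decision_function(X).ravel()

# Fit a Weibull to the largest positive scores (the "extreme" tail) and use
# its CDF as a rough confidence that a score belongs to the positive class.
tail = np.sort(scores[scores > 0])[-50:]
shape, loc, scale = weibull_min.fit(tail, floc=0)
confidence = weibull_min.cdf(np.clip(scores, 0.0, None), shape, loc=loc, scale=scale)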

Best,
A.Eweiwi

On Mon, Aug 13, 2012 at 9:49 PM, Andreas Mueller <amueller@ais.uni-bonn.de> wrote:
On 08/13/2012 08:29 PM, Abhi wrote:
> Andreas Müller <amueller@...> writes:
>
>> Alternatively you could look at the output of "decision_function" in
>> LinearSVC.
>> These do not represent probabilities, though.
>>
>> Andy
>>
>
> Hi Andy, thanks for pointing me towards that. I looked around online but I'm
> still not sure how I can use the decision_function method to determine how good
> the match was (i.e. how confident LinearSVC's prediction that the input is in
> this category is). Could you shed some light on this?
>
The problem with these values is that they are not normalized, so the
range is hard to interpret. For a two-class problem, a higher value means
more confidence in the positive class and a lower value means more
confidence in the negative class.
This value can be used, for example, to plot ROC curves and make
precision-recall trade-offs.

I am using "confident" here in an informal way.
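
For example, something like this (just a quick sketch with made-up toy data):

from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_curve, precision_recall_curve

X, y = make_classification(n_samples=500, random_state=0)
clf = LinearSVC().fit(X, y)
scores = clf.decision_function(X).ravel()  # unnormalized; higher = more "positive"

fpr, tpr, roc_thresholds = roc_curve(y, scores)
precision, recall, pr_thresholds = precision_recall_curve(y, scores)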

If you want real probabilities, try LogisticRegression, as I said in my
other mail.
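
Again just a toy sketch:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=0)
lr = LogisticRegression().fit(X, y)
proba = lr.predict_proba(X)  # one row per sample, columns sum to 1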

Cheers,
Andy
