Scored Annotation confidence value - explanation needed

Q3Varnam
2014-07-15
2015-01-26
  • Q3Varnam

    Q3Varnam - 2014-07-15

I have managed to get a face recognition program working using LBPH, with chi-square comparison of the histograms, K=1 and a threshold of 100. The other parameters are gridX = gridY = 8, radius = 1 and samples = 8.

    When I print the value returned by the recognise method, the list

    List<IndependentPair<DetectedFace, List<ScoredAnnotation<String>>>>

    contains the image name and a score; this appears as

    [[org.openimaj.image.processing.face.detection.DetectedFace@15ebe925,[(Current7, 1.0)]]]

    What does the float value signify?

    Because if I set K=2, I get two images returned with equal values:

    [[org.openimaj.image.processing.face.detection.DetectedFace@41f99142,[(man, 0.5), (Current9, 0.5)]]]

    In OpenCV I am used to getting a distance value which is normally below the threshold, i.e. in this case a value less than 100.

    Can you please explain how this value returned by the OpenIMAJ recogniser is used?

     
  • Jonathon Hare

    Jonathon Hare - 2014-07-15

    The values are the probability of each class (or person), computed from the number of nearest-neighbour instances requested (or the number of selected neighbours with distance below the threshold, if that is smaller) and the number of those neighbours belonging to each class. If K were 3, and two of the neighbours were 'Jon' and one was 'Sina', then the values would be 0.6667 and 0.3333 respectively. Normally, you'd just pick the class with the highest probability as the best guess.
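    This counting scheme can be sketched in plain Java, independently of the OpenIMAJ classes (the class and method names here are just for illustration, not part of the library):

    ```java
    import java.util.Arrays;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class KnnProbabilities {
        // Probability of each class = (neighbours with that label) / (total neighbours)
        static Map<String, Double> classProbabilities(List<String> neighbourLabels) {
            final Map<String, Double> probs = new LinkedHashMap<>();
            for (final String label : neighbourLabels)
                probs.merge(label, 1.0 / neighbourLabels.size(), Double::sum);
            return probs;
        }

        public static void main(String[] args) {
            // K = 3: two neighbours labelled "Jon", one labelled "Sina"
            final Map<String, Double> probs = classProbabilities(Arrays.asList("Jon", "Jon", "Sina"));
            System.out.println(probs); // Jon -> 0.666..., Sina -> 0.333...
        }
    }
    ```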

     
  • Q3Varnam

    Q3Varnam - 2014-07-16

    Many thanks, I did guess that it was the probability. I wanted to compare the distance value computed by your code against that of OpenCV on the same set of images. Is there a method that will expose this value? I looked in the KNNAnnotator class; the distance value is discarded after the comparison.

     
  • Jonathon Hare

    Jonathon Hare - 2014-07-16

    There's no way to do it using the KNNAnnotator, but you could compute it directly as follows:

        import java.io.File;
    
        import org.openimaj.feature.FloatFV;
        import org.openimaj.feature.FloatFVComparison;
        import org.openimaj.image.FImage;
        import org.openimaj.image.ImageUtilities;
        import org.openimaj.image.processing.face.alignment.IdentityAligner;
        import org.openimaj.image.processing.face.detection.DetectedFace;
        import org.openimaj.image.processing.face.detection.FaceDetector;
        import org.openimaj.image.processing.face.detection.HaarCascadeDetector;
        import org.openimaj.image.processing.face.feature.LocalLBPHistogram;
    
        // detector and LBP-histogram extractor configured as in the recogniser
        final FaceDetector<DetectedFace, FImage> faceDetector = HaarCascadeDetector.BuiltInCascade.frontalface_alt2
                .load();
    
        final LocalLBPHistogram.Extractor<DetectedFace> extr = new LocalLBPHistogram.Extractor<DetectedFace>(
                new IdentityAligner<DetectedFace>());
    
        final FImage img1 = ImageUtilities.readF(new File("image1.jpg"));
        final FImage img2 = ImageUtilities.readF(new File("image2.jpg"));
    
        // assuming a single face in each image
        final FloatFV feature1 = extr.extractFeature(faceDetector.detectFaces(img1).get(0)).getFeatureVector();
        final FloatFV feature2 = extr.extractFeature(faceDetector.detectFaces(img2).get(0)).getFeatureVector();
    
        // print the raw chi-square distance between the two LBP histograms
        System.out.println(FloatFVComparison.CHI_SQUARE.compare(feature1, feature2));
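    As a cross-check against OpenCV, the chi-square distance itself is easy to compute by hand. Below is a standalone sketch using one common definition, the sum over bins of (a - b)^2 / (a + b); note that OpenIMAJ's and OpenCV's exact normalisations may differ by a constant factor, so this is illustrative only:

    ```java
    public class ChiSquareSketch {
        // Chi-square distance between two histograms:
        // sum over bins of (a - b)^2 / (a + b), skipping bins where both are zero
        static double chiSquare(double[] a, double[] b) {
            double d = 0;
            for (int i = 0; i < a.length; i++) {
                final double s = a[i] + b[i];
                if (s > 0)
                    d += (a[i] - b[i]) * (a[i] - b[i]) / s;
            }
            return d;
        }

        public static void main(String[] args) {
            final double[] h1 = { 0.2, 0.5, 0.3 };
            final double[] h2 = { 0.3, 0.4, 0.3 };
            // identical histograms give distance 0; these two differ slightly
            System.out.println(chiSquare(h1, h2));
            // A match decision would then compare the distance against the
            // threshold (100 in this thread), as the KNNAnnotator does internally.
        }
    }
    ```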
    
     
  • Q3Varnam

    Q3Varnam - 2014-07-16

    Many thanks for your quick reply.

     
  • Anonymous

    Anonymous - 2015-01-26
    Post awaiting moderation.
