OpenIMAJ vs iPhone confidence values

Anonymous
2015-09-16
  • Anonymous

    Anonymous - 2015-09-16

    Hi,

    I am working on a simple face detection application. The workflow is straightforward: if an image containing a person was taken with an iPhone camera, I infer the face data from the xmp/mwg metadata; otherwise, I use OpenIMAJ's HaarCascadeDetector.

    For the same image, the reported confidence values are different. For example, the value of the metadata attribute apple-fi:ConfidenceLevel is 283, whilst the value from the corresponding DetectedFace object is 47.
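
    For reference, this is roughly how I'm getting the OpenIMAJ value. It's only a minimal sketch: the image path and the minimum face size are placeholders, not my real settings.

        import java.io.File;
        import java.util.List;

        import org.openimaj.image.FImage;
        import org.openimaj.image.ImageUtilities;
        import org.openimaj.image.processing.face.detection.DetectedFace;
        import org.openimaj.image.processing.face.detection.HaarCascadeDetector;

        public class ConfidenceDump {
            public static void main(String[] args) throws Exception {
                // Load the image as a single-band (grey-level) FImage.
                FImage image = ImageUtilities.readF(new File("photo.jpg"));

                // 40 is an arbitrary minimum face size in pixels.
                HaarCascadeDetector detector = new HaarCascadeDetector(40);
                List<DetectedFace> faces = detector.detectFaces(image);

                for (DetectedFace face : faces) {
                    // getConfidence() returns the detector's raw, unnormalised score.
                    System.out.println(face.getBounds() + " -> confidence " + face.getConfidence());
                }
            }
        }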

    Clearly the two values are being calculated differently; I just can't see whether or where Apple have published their methodology.

    Does anyone know how I could approximately transform one value to the other?

    (Apologies in advance if this question has already been answered, or indeed is too naive.)

    Cheers,

    Anthony

     
  • Jonathon Hare

    Jonathon Hare - 2015-09-16

    As far as I'm aware Apple don't publish any info on their implementation. The OpenIMAJ implementation of confidence for the HaarCascadeDetector is unnormalised and is based on the number of independent detections from the grouping stage of the cascade process.

    It might be possible to figure out some kind of approximate relationship between the confidences of the two implementations; however, you might find they are almost completely uncorrelated.
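
    If you do want to try, one rough approach would be to collect pairs of confidences reported for the same faces by both systems and fit a simple mapping to them. The sketch below does an ordinary least-squares straight-line fit; the paired values are placeholders (only the 47/283 pair comes from your example), and a linear model is just one guess at the relationship.

        public class ConfidenceMapping {
            public static void main(String[] args) {
                // Paired confidences for the same detected faces.
                // Only the first pair (47 -> 283) comes from the example above;
                // the remaining pairs are made-up placeholders so the sketch runs.
                double[] openimaj = { 47, 52, 61, 70 };
                double[] apple = { 283, 310, 355, 401 };

                // Ordinary least-squares fit of: apple ~= slope * openimaj + intercept
                int n = openimaj.length;
                double sx = 0, sy = 0, sxx = 0, sxy = 0;
                for (int i = 0; i < n; i++) {
                    sx += openimaj[i];
                    sy += apple[i];
                    sxx += openimaj[i] * openimaj[i];
                    sxy += openimaj[i] * apple[i];
                }
                double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
                double intercept = (sy - slope * sx) / n;

                System.out.printf("apple ~= %.2f * openimaj + %.2f%n", slope, intercept);
            }
        }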

     
