
Machine learning and video tracking

Anonymous - 2016-02-07

    First, thanks a lot to the developers of this awesome program.

    I'd like to identify the following elements in every frame of a video of a tennis match: the players (2), the ball in play (1), the net, the court and the background. These elements can overlap (players can be on the court, the ball can be in front of the background).

    To track the ball first, I tried a FourierTemplateMatcher (see the code below), but the results are not accurate; the ball is mistaken for other elements.

    I've also tried SIFT matching, with similar results: the ball gets mistaken for other elements.

    Given the difficulty of recognizing these elements, which move and rotate in 3D, I believe my best chance of success is to use machine learning in a similar fashion to the Caltech 101 example: give the algorithm enough images of players, balls and courts so that it learns to recognize each of them separately (a rough sketch of what I have in mind follows the questions below). The problem is that I don't know how to correctly annotate the images or how to parametrize the algorithm. Sorry for not having more specific questions, but:

    1. How should I start? Where can I find documentation? Any help would be much appreciated.

    2. Would another possibility be to use the FelzenszwalbHuttenlocherSegmenter? This, at least, sees the ball as a separate object. See here (and the per-frame sketch after the FHS code at the end of the post).
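
    To make question 1 more concrete, below is a rough sketch of the kind of training I have in mind, loosely following the structure of the OpenIMAJ Caltech 101 classification tutorial. The directory layout (one folder of cropped example images per class under D:/down/tennis/classes), the class name PatchClassifierSketch, the tiny-thumbnail feature extractor and all the parameters are placeholders I made up; a bag-of-visual-words extractor as in the tutorial would presumably work better.

    import java.io.IOException;
    import java.util.List;

    import org.openimaj.data.dataset.VFSGroupDataset;
    import org.openimaj.experiment.dataset.split.GroupedRandomSplitter;
    import org.openimaj.feature.DoubleFV;
    import org.openimaj.feature.FeatureExtractor;
    import org.openimaj.image.FImage;
    import org.openimaj.image.ImageUtilities;
    import org.openimaj.image.processing.resize.ResizeProcessor;
    import org.openimaj.ml.annotation.ScoredAnnotation;
    import org.openimaj.ml.annotation.linear.LiblinearAnnotator;
    import org.openimaj.ml.annotation.linear.LiblinearAnnotator.Mode;

    import de.bwaldvogel.liblinear.SolverType;

    public class PatchClassifierSketch {

        // Placeholder feature: a 16x16 grey-level thumbnail flattened into a vector.
        // The bag-of-visual-words extractor from the Caltech 101 tutorial would likely work better.
        static class TinyImageExtractor implements FeatureExtractor<DoubleFV, FImage> {
            @Override
            public DoubleFV extractFeature(FImage image) {
                FImage small = ResizeProcessor.resample(image, 16, 16);
                double[] vec = new double[16 * 16];
                for (int y = 0; y < 16; y++)
                    for (int x = 0; x < 16; x++)
                        vec[y * 16 + x] = small.pixels[y][x];
                return new DoubleFV(vec);
            }
        }

        public static void main(String[] args) throws IOException {
            // Hypothetical layout: one sub-directory per class (ball, player, court, background),
            // each containing cropped example images
            VFSGroupDataset<FImage> data = new VFSGroupDataset<FImage>(
                    "D:/down/tennis/classes", ImageUtilities.FIMAGE_READER);

            // 15 training and 15 test images per class (adjust to the size of the dataset)
            GroupedRandomSplitter<String, FImage> splits =
                    new GroupedRandomSplitter<String, FImage>(data, 15, 0, 15);

            // Multiclass linear SVM over the extracted features
            LiblinearAnnotator<FImage, String> ann = new LiblinearAnnotator<FImage, String>(
                    new TinyImageExtractor(), Mode.MULTICLASS, SolverType.L2R_L2LOSS_SVC, 1.0, 0.00001);
            ann.train(splits.getTrainingDataset());

            // Label one held-out image and print the scored annotations
            FImage test = splits.getTestDataset().getRandomInstance();
            List<ScoredAnnotation<String>> labels = ann.annotate(test);
            System.out.println(labels);
        }
    }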

    Thanks again, Cheers

    Gauthier

    query image: here
    short video stream: https://youtu.be/zWz-f5jW_gY

    *** Fourier Template Matcher code:

    import java.io.File;
    import java.io.IOException;

    import org.openimaj.image.DisplayUtilities;
    import org.openimaj.image.ImageUtilities;
    import org.openimaj.image.MBFImage;
    import org.openimaj.image.analysis.algorithm.FourierTemplateMatcher;
    import org.openimaj.image.colour.RGBColour;
    import org.openimaj.image.pixel.FValuePixel;
    import org.openimaj.math.geometry.shape.Ellipse;
    import org.openimaj.video.Video;
    import org.openimaj.video.xuggle.XuggleVideo;

    public class templatevid {
        public static void main(String[] args) throws IOException {
            // Ball template and the video to search through
            MBFImage query = ImageUtilities.readMBF(new File("D:/down/tennis/TENNISBALLV4.jpg"));
            Video<MBFImage> video = new XuggleVideo(new File("D:/down/tennis/video/feddjoko.mkv"));

            for (MBFImage target : video) {
                // Correlate the flattened (grey-level) template against the current frame
                FourierTemplateMatcher tm = new FourierTemplateMatcher(query.flatten(),
                        FourierTemplateMatcher.Mode.CORRELATION_COEFFICIENT);
                tm.analyseImage(target.flatten());

                // Mark the three strongest responses on the frame
                for (FValuePixel best : tm.getBestResponses(3)) {
                    target.drawShape(new Ellipse(best.getX(), best.getY(), 10f, 10f, 0), RGBColour.RED);
                    System.out.print(best.getX() + " " + best.getY() + "\t");
                }
                System.out.println();

                // Reuse a single named window instead of opening a new one per frame
                DisplayUtilities.displayName(target, "result");
            }
        }
    }
    

    *** SIFT code:

    import java.io.File;
    import java.io.IOException;

    import org.openimaj.feature.local.list.LocalFeatureList;
    import org.openimaj.feature.local.matcher.BasicTwoWayMatcher;
    import org.openimaj.feature.local.matcher.LocalFeatureMatcher;
    import org.openimaj.feature.local.matcher.MatchingUtilities;
    import org.openimaj.feature.local.matcher.consistent.ConsistentLocalFeatureMatcher2d;
    import org.openimaj.image.DisplayUtilities;
    import org.openimaj.image.ImageUtilities;
    import org.openimaj.image.MBFImage;
    import org.openimaj.image.colour.RGBColour;
    import org.openimaj.image.feature.local.engine.DoGSIFTEngine;
    import org.openimaj.image.feature.local.keypoints.Keypoint;
    import org.openimaj.math.geometry.transforms.HomographyRefinement;
    import org.openimaj.math.geometry.transforms.estimation.RobustHomographyEstimator;
    import org.openimaj.math.model.fit.RANSAC;
    import org.openimaj.video.Video;
    import org.openimaj.video.xuggle.XuggleVideo;

    public class App {
        public static void main(String[] args) throws IOException {
            Video<MBFImage> video = new XuggleVideo(new File("D:/down/tennis/video/feddjoko.mkv"));
            DoGSIFTEngine engine = new DoGSIFTEngine();

            // Extract SIFT features from the ball template once, outside the frame loop
            MBFImage query = ImageUtilities.readMBF(new File("D:/down/tennis/TENNISBALLV4.jpg"));
            LocalFeatureList<Keypoint> queryKeypoints = engine.findFeatures(query.flatten());

            for (MBFImage target : video) {
                LocalFeatureList<Keypoint> targetKeypoints = engine.findFeatures(target.flatten());

                // Two-way matching, filtered for geometric consistency with a RANSAC homography
                RobustHomographyEstimator modelFitter = new RobustHomographyEstimator(1, 50,
                        new RANSAC.NumberInliersStoppingCondition(1), HomographyRefinement.NONE);
                LocalFeatureMatcher<Keypoint> matcher = new ConsistentLocalFeatureMatcher2d<Keypoint>(
                        new BasicTwoWayMatcher<Keypoint>(), modelFitter);
                matcher.setModelFeatures(queryKeypoints);
                matcher.findMatches(targetKeypoints);

                // Draw the consistent matches between template and frame
                MBFImage consistentMatches = MatchingUtilities.drawMatches(query, target,
                        matcher.getMatches(), RGBColour.WHITE);
                DisplayUtilities.displayName(consistentMatches, "RANSAC");
            }
        }
    }

    *** FHS code:

    public class App {
        public static void main(String[] args) throws IOException {
            MBFImage input = ImageUtilities.readMBF(new File("D:/down/tennis/capture.jpg"));
            MBFImage clone = input.clone();

            // Graph-based Felzenszwalb-Huttenlocher segmentation (smoothing, threshold k, minimum segment size)
            FelzenszwalbHuttenlocherSegmenter test = new FelzenszwalbHuttenlocherSegmenter(2f, 5f, 100);
            List<ConnectedComponent> comp = test.segment(clone);

            // Paint each segment in a random colour, then display and save the result
            SegmentationUtilities.renderSegments(clone, comp);
            MBFImage clone2 = ColourSpace.convert(clone, ColourSpace.RGB);
            DisplayUtilities.display(clone2);
            ImageUtilities.write(clone, new File("D:/down/tennis/export-FHS.jpg"));
        }
    }
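
    Related to question 2, here is roughly how I picture applying the segmenter to every frame and keeping only small segments as ball candidates. The class name FhsBallCandidates, the area bounds (40-400 pixels) and the idea of marking each candidate's centroid are assumptions of mine and would need tuning against the real footage.

    import java.io.File;
    import java.io.IOException;
    import java.util.List;

    import org.openimaj.image.DisplayUtilities;
    import org.openimaj.image.MBFImage;
    import org.openimaj.image.colour.RGBColour;
    import org.openimaj.image.pixel.ConnectedComponent;
    import org.openimaj.image.pixel.Pixel;
    import org.openimaj.image.segmentation.FelzenszwalbHuttenlocherSegmenter;
    import org.openimaj.math.geometry.shape.Ellipse;
    import org.openimaj.video.Video;
    import org.openimaj.video.xuggle.XuggleVideo;

    public class FhsBallCandidates {
        public static void main(String[] args) throws IOException {
            Video<MBFImage> video = new XuggleVideo(new File("D:/down/tennis/video/feddjoko.mkv"));
            FelzenszwalbHuttenlocherSegmenter segmenter = new FelzenszwalbHuttenlocherSegmenter(2f, 5f, 100);

            for (MBFImage frame : video) {
                // Segment the frame and keep only small components as possible ball locations
                List<ConnectedComponent> components = segmenter.segment(frame);

                for (ConnectedComponent cc : components) {
                    int area = cc.calculateArea();
                    if (area < 40 || area > 400)
                        continue; // guessed size range for the ball, to be tuned

                    // Mark the candidate's centroid on the frame
                    Pixel centroid = cc.calculateCentroidPixel();
                    frame.drawShape(new Ellipse(centroid.x, centroid.y, 10f, 10f, 0), RGBColour.GREEN);
                }

                DisplayUtilities.displayName(frame, "ball candidates");
            }
        }
    }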
    
     