From: Pierre M. <sid...@us...> - 2005-07-31 18:53:08
Update of /cvsroot/robotflow/RobotFlow/Probes/src
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv3307

Modified Files:
	Makefile.am
Log Message:
Added include path for OpenCV required by VisualROISelection.

Index: Makefile.am
===================================================================
RCS file: /cvsroot/robotflow/RobotFlow/Probes/src/Makefile.am,v
retrieving revision 1.14
retrieving revision 1.15
diff -C2 -d -r1.14 -r1.15
*** Makefile.am	23 Jun 2005 20:58:56 -0000	1.14
--- Makefile.am	31 Jul 2005 18:52:59 -0000	1.15
***************
*** 34,38 ****
  INCLUDES = -I../include $(OVERFLOW_INCLUDE) $(GNOME_INCLUDE) \
! 	-I../../Vision/include $(SDL_INCLUDE) -I$(OVERFLOW_DATA)/HMM $(OPENCV_INCLUDES)
  
  LDADD = $(SDL_LIB)
--- 34,39 ----
  INCLUDES = -I../include $(OVERFLOW_INCLUDE) $(GNOME_INCLUDE) \
! 	-I../../Vision/include -I ../../OpenCV/include \
! 	$(SDL_INCLUDE) -I$(OVERFLOW_DATA)/HMM $(OPENCV_INCLUDES)
  
  LDADD = $(SDL_LIB)
Update of /cvsroot/robotflow/RobotFlow/Vision/src
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv9230/src

Modified Files:
	Makefile.am
Removed Files:
	ColorHistExtraction.cc CvFaceDetection.cc DrawPFParticle.cc
	GetVisualROIParam.cc IntegralColorExtraction.cc
	IntegralEdgesOriExtraction.cc IntegralLBPExtraction.cc
	MeanShiftTracker.cc MultiIntegralCuesPFTracker.cc
	PFGenericParticle.cc PFGenericParticleFilter.cc PFPMRandomWalk.cc
	PFState2VisualROI.cc PFUtilityFct.cc TextSignDetect.cc
	VisualFeatureDesc.cc VisualHistogramDesc.cc VisualIntegralDesc.cc
	VisualROI.cc VisualTarget.cc VisualTargetManager.cc
	VisualTargetPFInterface.cc
Log Message:
moved OpenCV stuff into the OpenCV directory

--- CvFaceDetection.cc DELETED ---
--- VisualFeatureDesc.cc DELETED ---
--- PFUtilityFct.cc DELETED ---
--- IntegralColorExtraction.cc DELETED ---
--- VisualIntegralDesc.cc DELETED ---
--- VisualTarget.cc DELETED ---
--- MultiIntegralCuesPFTracker.cc DELETED ---
--- VisualHistogramDesc.cc DELETED ---
--- MeanShiftTracker.cc DELETED ---
--- PFGenericParticle.cc DELETED ---
--- IntegralEdgesOriExtraction.cc DELETED ---
--- DrawPFParticle.cc DELETED ---
--- TextSignDetect.cc DELETED ---
--- IntegralLBPExtraction.cc DELETED ---
--- PFGenericParticleFilter.cc DELETED ---
--- VisualROI.cc DELETED ---
--- ColorHistExtraction.cc DELETED ---
--- PFPMRandomWalk.cc DELETED ---

Index: Makefile.am
===================================================================
RCS file: /cvsroot/robotflow/RobotFlow/Vision/src/Makefile.am,v
retrieving revision 1.37
retrieving revision 1.38
diff -C2 -d -r1.37 -r1.38
*** Makefile.am	1 Jul 2005 17:31:07 -0000	1.37
--- Makefile.am	28 Jul 2005 15:09:19 -0000	1.38
***************
*** 6,36 ****
  lib_LTLIBRARIES = libVision.la
  
- if WITH_OPENCV
- OPENCV_SOURCES = ColorHistExtraction.cc \
- 	CvFaceDetection.cc \
- 	DrawPFParticle.cc \
- 	GetVisualROIParam.cc \
- 	IntegralColorExtraction.cc \
- 	IntegralEdgesOriExtraction.cc \
- 	IntegralLBPExtraction.cc \
- 	MeanShiftTracker.cc \
- 	MultiIntegralCuesPFTracker.cc \
- 	PFGenericParticle.cc \
- 	PFGenericParticleFilter.cc \
- 	PFPMRandomWalk.cc \
- 	PFUtilityFct.cc \
- 	PFState2VisualROI.cc \
- 	TextSignDetect.cc \
- 	VisualFeatureDesc.cc \
- 	VisualHistogramDesc.cc \
- 	VisualIntegralDesc.cc \
- 	VisualROI.cc \
- 	VisualTarget.cc \
- 	VisualTargetManager.cc \
- 	VisualTargetPFInterface.cc
- else
- OPENCV_SOURCES =
- endif
- 
  # Sources for compilation in the library
  libVision_la_SOURCES = BMPLoad.cc \
--- 6,9 ----
***************
*** 91,102 ****
  	RGB5552RGB24.cc \
  	SkinColorGMMSegm.cc \
! 	SkinColorHistSegm.cc $(OPENCV_SOURCES)
! 
! 
! 
! libVision_la_LDFLAGS = -release $(LT_RELEASE) $(PIXBUF_LIBS) $(GNOME_LIB) $(JPEG_LIB) $(OPENCV_LIBS)
! INCLUDES = -I../include $(OVERFLOW_INCLUDE) -I$(OVERFLOW_DATA)/HMM $(PIXBUF_INCLUDE) $(JPEG_INCLUDE) $(OPENCV_INCLUDES)
  
  install-data-local:
--- 64,72 ----
  	RGB5552RGB24.cc \
  	SkinColorGMMSegm.cc \
! 	SkinColorHistSegm.cc
! 
! libVision_la_LDFLAGS = -release $(LT_RELEASE) $(PIXBUF_LIBS) $(GNOME_LIB) $(JPEG_LIB)
! INCLUDES = -I../include $(OVERFLOW_INCLUDE) -I$(OVERFLOW_DATA)/HMM $(PIXBUF_INCLUDE) $(JPEG_INCLUDE)
  
  install-data-local:

--- VisualTargetPFInterface.cc DELETED ---
--- PFState2VisualROI.cc DELETED ---
--- VisualTargetManager.cc DELETED ---
--- GetVisualROIParam.cc DELETED ---
Update of /cvsroot/robotflow/RobotFlow/Vision/include
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv9230/include

Modified Files:
	Makefile.am
Removed Files:
	ColorHistExtraction.h IntegralColorExtraction.h
	IntegralEdgesOriExtraction.h IntegralLBPExtraction.h
	MeanShiftTracker.h PFGenericParticle.h PFGenericParticleFilter.h
	PFPMRandomWalk.h PFParticle.h PFParticleFilter.h
	PFPredictionModel.h PFUtilityFct.h VisualFeatureDesc.h
	VisualFeaturesExtraction.h VisualHistogramDesc.h
	VisualIntegralDesc.h VisualROI.h VisualTarget.h
	VisualTargetManager.h VisualTracker.h
Log Message:
moved OpenCV stuff into the OpenCV directory

--- VisualIntegralDesc.h DELETED ---
--- VisualFeaturesExtraction.h DELETED ---
--- IntegralEdgesOriExtraction.h DELETED ---
--- VisualFeatureDesc.h DELETED ---

Index: Makefile.am
===================================================================
RCS file: /cvsroot/robotflow/RobotFlow/Vision/include/Makefile.am,v
retrieving revision 1.12
retrieving revision 1.13
diff -C2 -d -r1.12 -r1.13
*** Makefile.am	2 Jun 2005 16:51:35 -0000	1.12
--- Makefile.am	28 Jul 2005 15:09:18 -0000	1.13
***************
*** 1,30 ****
  AUTOMAKE_OPTIONS = no-dependencies
  
- if WITH_OPENCV
- OPENCV_FILES = VisualFeaturesExtraction.h \
- 	ColorHistExtraction.h \
- 	IntegralColorExtraction.h \
- 	IntegralEdgesOriExtraction.h \
- 	IntegralLBPExtraction.h \
- 	MeanShiftTracker.h \
- 	PFGenericParticle.h \
- 	PFGenericParticleFilter.h \
- 	PFParticle.h \
- 	PFParticleFilter.h \
- 	PFPMRandomWalk.h \
- 	PFPredictionModel.h \
- 	PFUtilityFct.h \
- 	VisualFeatureDesc.h \
- 	VisualHistogramDesc.h \
- 	VisualIntegralDesc.h \
- 	VisualROI.h \
- 	VisualTarget.h \
- 	VisualTargetManager.h \
- 	VisualTracker.h
- else
- OPENCV_FILES =
- endif
- 
  include_HEADERS = Bt848.h \
  	ColorLookup.h \
--- 1,4 ----
***************
*** 41,44 ****
  	lines.h \
  	CSymbol.h \
! 	CTextLine.h $(OPENCV_FILES)
--- 15,18 ----
  	lines.h \
  	CSymbol.h \
! 	CTextLine.h

--- VisualTarget.h DELETED ---
--- IntegralLBPExtraction.h DELETED ---
--- MeanShiftTracker.h DELETED ---
--- PFParticleFilter.h DELETED ---
--- VisualTargetManager.h DELETED ---
--- PFParticle.h DELETED ---
--- VisualTracker.h DELETED ---
--- ColorHistExtraction.h DELETED ---
--- PFPMRandomWalk.h DELETED ---
--- IntegralColorExtraction.h DELETED ---
--- VisualHistogramDesc.h DELETED ---
--- VisualROI.h DELETED ---
--- PFGenericParticleFilter.h DELETED ---
--- PFGenericParticle.h DELETED ---
--- PFPredictionModel.h DELETED ---
--- PFUtilityFct.h DELETED ---
From: Dominic L. <ma...@us...> - 2005-07-28 15:01:24
Update of /cvsroot/robotflow/RobotFlow
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv7035

Modified Files:
	Makefile.am configure.in
Log Message:
removed Player compilation until we update to Player 1.6, added OpenCV compilation

Index: Makefile.am
===================================================================
RCS file: /cvsroot/robotflow/RobotFlow/Makefile.am,v
retrieving revision 1.20
retrieving revision 1.21
diff -C2 -d -r1.20 -r1.21
*** Makefile.am	27 Jun 2005 17:46:36 -0000	1.20
--- Makefile.am	28 Jul 2005 15:01:15 -0000	1.21
***************
*** 10,16 ****
  endif
  
  EXTRA_DIST = RobotFlow.spec RobotFlow.spec.in
  
! SUBDIRS = Pioneer2 Generic Behaviors Audio Vision Control Devices Probes FSM $(PLAYER_DIR) $(MARIE_DIR)
  
  rpm: dist
--- 10,20 ----
  endif
  
+ if WITH_OPENCV
+ OPENCV_DIR = OpenCV
+ endif
+ 
  EXTRA_DIST = RobotFlow.spec RobotFlow.spec.in
  
! SUBDIRS = Pioneer2 Generic Behaviors Audio Vision Control Devices Probes FSM $(MARIE_DIR) $(OPENCV_DIR)
  
  rpm: dist

Index: configure.in
===================================================================
RCS file: /cvsroot/robotflow/RobotFlow/configure.in,v
retrieving revision 1.51
retrieving revision 1.52
diff -C2 -d -r1.51 -r1.52
*** configure.in	27 Jun 2005 17:46:34 -0000	1.51
--- configure.in	28 Jul 2005 15:01:15 -0000	1.52
***************
*** 313,316 ****
--- 313,320 ----
  	MARIE/include/Makefile \
  	MARIE/src/Makefile \
+ 	OpenCV/Makefile \
+ 	OpenCV/include/Makefile \
+ 	OpenCV/src/Makefile \
+ 	OpenCV/n-files/Makefile \
  	RobotFlow.spec)
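The commit above relies on a common automake idiom: a conditional that leaves a directory variable empty unless configure enabled the feature, so the optional directory simply drops out of SUBDIRS. A minimal sketch of that idiom, with illustrative names (WITH_OPTFEATURE, OptFeature) rather than the project's actual ones:

```makefile
# Makefile.am fragment: OPTFEATURE_DIR stays empty unless the
# WITH_OPTFEATURE automake conditional was enabled by configure
# (e.g. via AM_CONDITIONAL(WITH_OPTFEATURE, test "$have_optfeature" = yes)).
if WITH_OPTFEATURE
OPTFEATURE_DIR = OptFeature
endif

# An empty variable expands to nothing, so the optional
# directory is skipped entirely on builds without the feature.
SUBDIRS = Core Drivers $(OPTFEATURE_DIR)
```

This is why the RobotFlow top-level Makefile.am can list $(OPENCV_DIR) unconditionally: recursion into OpenCV only happens when configure defined WITH_OPENCV.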
Update of /cvsroot/robotflow/RobotFlow/OpenCV/src
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv6226/OpenCV/src

Added Files:
	ColorHistExtraction.cc CvFaceDetection.cc DrawPFParticle.cc
	GetVisualROIParam.cc IntegralColorExtraction.cc
	IntegralEdgesOriExtraction.cc IntegralLBPExtraction.cc Makefile.am
	MeanShiftTracker.cc MultiIntegralCuesPFTracker.cc
	PFGenericParticle.cc PFGenericParticleFilter.cc PFPMRandomWalk.cc
	PFState2VisualROI.cc PFUtilityFct.cc TextSignDetect.cc
	VisualFeatureDesc.cc VisualHistogramDesc.cc VisualIntegralDesc.cc
	VisualROI.cc VisualTarget.cc VisualTargetManager.cc
	VisualTargetPFInterface.cc
Log Message:
moved from Vision directory

--- NEW FILE: CvFaceDetection.cc ---

#ifndef _CVFACEDETECTION_CC_
#define _CVFACEDETECTION_CC_

#include "BufferedNode.h"
#include "cv.h"
#include "cvaux.h"
#include "Image.h"
#include "VisualROI.h"
#include <stdlib.h>
#include <sys/timeb.h>

using namespace FD;
using namespace std;

namespace RobotFlow {

class CvFaceDetection;

DECLARE_NODE(CvFaceDetection)
/*Node
 *
 * @name CvFaceDetection
 * @category RobotFlow:Vision:Detection
 * @description Haar cascade classifier applied to face recognition using OpenCV.
 *
 * @parameter_name FRAME_WIDTH
 * @parameter_type int
 * @parameter_value 320
 * @parameter_description Video frame width.
 *
 * @parameter_name FRAME_HEIGHT
 * @parameter_type int
 * @parameter_value 240
 * @parameter_description Video frame height.
 *
 * @parameter_name MAX_NUM_SKIN_REGIONS
 * @parameter_type int
 * @parameter_value 20
 * @parameter_description Maximum number of skin regions to search for faces in a frame.
 *
 * @parameter_name MIN_SKIN_ROI_WIDTH
 * @parameter_type int
 * @parameter_value 24
 * @parameter_description Minimum width of a skin region to consider for face detection.
 *
 * @parameter_name MIN_SKIN_ROI_HEIGHT
 * @parameter_type int
 * @parameter_value 24
 * @parameter_description Minimum height of a skin region to consider for face detection.
 *
 * @input_name IN_IMAGE
 * @input_type Image
 * @input_description Current grayscale frame to process.
 *
 * @input_name IN_SKIN_MASK
 * @input_type Image
 * @input_description Current skin color segmentation result.
 *
 * @input_name IN_TRACKED_ROI
 * @input_type VisualROI
 * @input_description Region of interest currently tracked.
 *
 * @input_name ACTIVATION
 * @input_type bool
 * @input_description Node activation flag.
 *
 * @output_name OUT_ROI
 * @output_type VisualROI
 * @output_description Region of interest corresponding to current model.
 *
END*/

class CvFaceDetection : public BufferedNode {
public:
    CvFaceDetection(string nodeName, ParameterSet params)
        : BufferedNode(nodeName, params),
          m_storage1(NULL), m_storage2(NULL),
          m_haarClassifier1(NULL), m_haarClassifier2(NULL)
    {
        m_imageInID = addInput("IN_IMAGE");
        m_maskInID = addInput("IN_SKIN_MASK");
        m_roiInID = addInput("IN_TRACKED_ROI");
        m_activatedInID = addInput("ACTIVATION");
        m_roiOutID = addOutput("OUT_ROI");

        m_width = dereference_cast<int>(parameters.get("FRAME_WIDTH"));
        m_height = dereference_cast<int>(parameters.get("FRAME_HEIGHT"));
        m_maxNumSkinRegions = dereference_cast<int>(parameters.get("MAX_NUM_SKIN_REGIONS"));
        m_minROIWidth = dereference_cast<int>(parameters.get("MIN_SKIN_ROI_WIDTH"));
        m_minROIHeight = dereference_cast<int>(parameters.get("MIN_SKIN_ROI_HEIGHT"));

        m_numChannels = 3;
        m_numPixels = m_width*m_height;
        m_numBytesInFrame = m_numPixels*m_numChannels;
        m_imgXCen = m_width/2;
        m_imgYCen = m_height/2;

        CvSize imgSize;
        imgSize.width = m_width;
        imgSize.height = m_height;
        m_curImage = cvCreateImage(imgSize, IPL_DEPTH_8U, m_numChannels);
        m_skinMask = cvCreateImage(imgSize, IPL_DEPTH_8U, 1);
        m_filtMask = cvCreateImage(imgSize, IPL_DEPTH_8U, 1);

        m_storage1 = cvCreateMemStorage(0);
        m_storage2 = cvCreateMemStorage(0);
        m_haarClassifier1 = (CvHaarClassifierCascade*)cvLoad( "haarcascade_frontalface_alt2.xml", 0, 0, 0 );
        m_haarClassifier2 = (CvHaarClassifierCascade*)cvLoad( "haarcascade_profileface.xml", 0, 0, 0 );

        m_curSkinROI = new CvRect[m_maxNumSkinRegions];
        m_contourStorage = cvCreateMemStorage(0);
        m_faceROI = RCPtr<VisualROI>(new VisualROI(e_VISUALROI_rectangular, 1, 1, 1, 1, 0));
    }

    ~CvFaceDetection()
    {
        delete [] m_curSkinROI;
        cvReleaseImage(&m_filtMask);
        cvReleaseImage(&m_skinMask);
        cvReleaseImage(&m_curImage);
    }

    void calculate(int output_id, int count, Buffer &out)
    {
        try {
            // Get activation flag
            m_activated = getInput(m_activatedInID, count);
            if (!(*m_activated)) {
                // Output nilObjects and return
                (*outputs[m_roiOutID].buffer)[count] = nilObject;
                return;
            }

            //struct timeb t1, t2;
            bool foundFace = false;
            RCPtr<Image> imageRef = getInput(m_imageInID, count);

            // Verify input image sanity
            if (imageRef->get_width() != m_width ||
                imageRef->get_height() != m_height ||
                imageRef->get_pixelsize() != m_numChannels) {
                throw new GeneralException("CvFaceDetection::calculate : image parameters do not correspond to given input.",__FILE__,__LINE__);
            }

            // Copy input image
            memcpy(m_curImage->imageData, imageRef->get_data(), m_numBytesInFrame);

            // Start timer
            //ftime(&t1);

            // Get IN_TRACKED_ROI
            ObjectRef roiRef = getInput(m_roiInID, count);
            if (roiRef->isNil()) {
                // No tracked target, so do standard detection
                // Try to get a skin mask
                ObjectRef maskRef = getInput(m_maskInID, count);
                if (!maskRef->isNil()) {
                    // Initialize target at current ROI
                    RCPtr<Image> skinMaskImgRef = maskRef;
                    // Verify input image sanity
                    if (skinMaskImgRef->get_width() != m_width ||
                        skinMaskImgRef->get_height() != m_height ||
                        skinMaskImgRef->get_pixelsize() != 1) {
                        throw new GeneralException("CvFaceDetection::calculate : skin mask image parameters do not correspond to given input.",__FILE__,__LINE__);
                    }
                    // Copy input image
                    memcpy(m_skinMask->imageData, skinMaskImgRef->get_data(), m_numPixels);
                    foundFace = FaceDetectionWSkin();
                }
                else {
                    foundFace = FaceDetectionStd();
                }
            }
            else {
                // Need to validate tracked ROI
                RCPtr<VisualROI> roiRefPtr = roiRef;
                foundFace = FaceDetectionWROI(roiRefPtr.get());
            }

            // End timer
            //ftime(&t2);
            // Display time used
            //double timeDiff=(t2.time-t1.time)+((t2.millitm-t1.millitm)/1000.0);
            //cout << "Total run time (sec): " << timeDiff << " found face=" << foundFace << endl;

            if (foundFace) {
                (*outputs[m_roiOutID].buffer)[count] = m_faceROI;
            }
            else {
                (*outputs[m_roiOutID].buffer)[count] = nilObject;
            }
        }
        catch (BaseException *e) {
            throw e->add(new GeneralException("Exception in CvFaceDetection::calculate:",__FILE__,__LINE__));
        }
    }

    bool FaceDetectionStd()
    {
        try {
            cvClearMemStorage(m_storage1);
            // Detect frontal faces
            CvSeq* faces = cvHaarDetectObjects( m_curImage, m_haarClassifier1, m_storage1,
                1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );
            cvClearMemStorage(m_storage2);
            // Detect face profiles
            CvSeq* profiles = cvHaarDetectObjects( m_curImage, m_haarClassifier2, m_storage2,
                1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );

            float bestDist = sqrt((float)(m_width*m_width + m_height*m_height));
            int curXCen = 0;
            int curYCen = 0;
            int curHSX = 0;
            int curHSY = 0;
            int i, hsX, hsY, xCen, yCen, xDist, yDist;
            float curDist;
            CvPoint pt1, pt2;

            // Display identified objects
            for(i = 0; i < (faces ? faces->total : 0); i++ ) {
                CvRect* r = (CvRect*)cvGetSeqElem( faces, i );
                pt1.x = r->x; pt2.x = (r->x+r->width);
                pt1.y = r->y; pt2.y = (r->y+r->height);
                hsX = r->width/2;
                hsY = r->height/2;
                xCen = r->x + hsX;
                yCen = r->y + hsY;
                hsX = (int)((float)(hsX)*0.7f);
                xDist = xCen-m_imgXCen;
                yDist = yCen-m_imgYCen;
                curDist = sqrt((float)(xDist*xDist + yDist*yDist));
                if (curDist < bestDist) {
                    bestDist = curDist;
                    curXCen = xCen;
                    curYCen = yCen;
                    curHSX = hsX;
                    curHSY = hsY;
                    isFrontFace = true;
                    isProfileFace = false;
                }
            }

            for(i = 0; i < (profiles ? profiles->total : 0); i++ ) {
                CvRect* r = (CvRect*)cvGetSeqElem( profiles, i );
                pt1.x = r->x; pt2.x = (r->x+r->width);
                pt1.y = r->y; pt2.y = (r->y+r->height);
                hsX = r->width/2;
                hsY = r->height/2;
                xCen = r->x + hsX;
                yCen = r->y + hsY;
                hsX = (int)((float)(hsX)*0.6f);
                hsY = (int)((float)(hsY)*0.9f);
                xDist = xCen-m_imgXCen;
                yDist = yCen-m_imgYCen;
                curDist = sqrt((float)(xDist*xDist + yDist*yDist));
                if (curDist < bestDist) {
                    bestDist = curDist;
                    curXCen = xCen;
                    curYCen = yCen;
                    curHSX = hsX;
                    curHSY = hsY;
                    isFrontFace = false;
                    isProfileFace = true;
                }
            }

            if (curHSX > 0 && curHSY > 0) {
                m_faceROI->SetXCen(curXCen);
                m_faceROI->SetYCen(curYCen);
                m_faceROI->Reset(curHSX, curHSY, 0);
                return true;
            }
            else {
                return false;
            }
        }
        catch (BaseException *e) {
            throw e->add(new GeneralException("Exception in CvFaceDetection::FaceDetectionStd:",__FILE__,__LINE__));
        }
    }

    bool FaceDetectionWSkin()
    {
        try {
            // First identify skin blobs that could be appropriate for faces
            int r;
            int curNumSkinRegions = 0;
            float bestDist = sqrt((float)(m_width*m_width + m_height*m_height));
            int curXCen = 0;
            int curYCen = 0;
            int curHSX = 0;
            int curHSY = 0;
            int i, hsX, hsY, xCen, yCen, xDist, yDist;
            float curDist;
            CvPoint pt1, pt2;
            // Contour (connected components) sequence
            CvSeq *contour;
            // Pointer to skin regions parameters
            CvRect *p_regions = m_curSkinROI;

            // Smooth mask
            cvSmooth(m_skinMask, m_filtMask, CV_MEDIAN, 3, 0);
            // Find connected components contours using a simple chain code approximation
            cvFindContours(m_filtMask, m_contourStorage, &contour, sizeof(CvContour),
                CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);

            // Process each contour to determine if it is a potential face region of interest
            for( ; contour != 0; contour = contour->h_next ) {
                // Get rectangular bounding box of current contour
                CvRect bbox = cvContourBoundingRect(contour, 1);
                // Accept only regions fitting the desired scale
                if (bbox.width > m_minROIWidth && bbox.height > m_minROIHeight) {
                    if (curNumSkinRegions < m_maxNumSkinRegions) {
                        // Copy current region information
                        // but increase its size to be sure to fit all the head
                        (*p_regions).x = bbox.x - (int)((float)(bbox.width)*0.2f);
                        if ((*p_regions).x < 0) { (*p_regions).x = 0; }
                        (*p_regions).y = bbox.y - (int)((float)(bbox.height)*0.1f);
                        if ((*p_regions).y < 0) { (*p_regions).y = 0; }
                        (*p_regions).width = (int)((float)(bbox.width)*1.4f);
                        (*p_regions).height = (int)((float)(bbox.height)*1.2f);
                        curNumSkinRegions++;
                        p_regions++;
                    }
                    else {
                        // No more storage, warn and continue to next step
                        cerr << "CvFaceDetection::FaceDetectionWSkin: WARNING: could not process all potential skin regions because of the given limit of " << m_maxNumSkinRegions << " regions." << endl;
                    }
                }
            }

            for (r=0; r<curNumSkinRegions; r++) {
                // Apply Haar cascade face detection only on selected ROIs
                // Set current image ROI to the corresponding skin region
                cvSetImageROI(m_curImage, m_curSkinROI[r]);
                // Since we are setting ROI with the OpenCV image
                // we need to add an offset to the returned position
                int xOffset = m_curSkinROI[r].x;
                int yOffset = m_curSkinROI[r].y;

                cvClearMemStorage(m_storage1);
                // Detect frontal faces
                CvSeq* faces = cvHaarDetectObjects( m_curImage, m_haarClassifier1, m_storage1,
                    1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );
                cvClearMemStorage(m_storage2);
                // Detect face profiles
                CvSeq* profiles = cvHaarDetectObjects( m_curImage, m_haarClassifier2, m_storage2,
                    1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );

                // Display identified objects
                for(int i = 0; i < (faces ? faces->total : 0); i++ ) {
                    CvRect* r = (CvRect*)cvGetSeqElem( faces, i );
                    pt1.x = r->x; pt2.x = (r->x+r->width);
                    pt1.y = r->y; pt2.y = (r->y+r->height);
                    hsX = r->width/2;
                    hsY = r->height/2;
                    // Since we are setting ROI with the OpenCV image
                    // we need to add an offset to the returned position
                    xCen = r->x + hsX + xOffset;
                    yCen = r->y + hsY + yOffset;
                    hsX = (int)((float)(hsX)*0.7f);
                    xDist = xCen-m_imgXCen;
                    yDist = yCen-m_imgYCen;
                    curDist = sqrt((float)(xDist*xDist + yDist*yDist));
                    if (curDist < bestDist) {
                        bestDist = curDist;
                        curXCen = xCen;
                        curYCen = yCen;
                        curHSX = hsX;
                        curHSY = hsY;
                        isFrontFace = true;
                        isProfileFace = false;
                    }
                }

                for(int i = 0; i < (profiles ? profiles->total : 0); i++ ) {
                    CvRect* r = (CvRect*)cvGetSeqElem( profiles, i );
                    pt1.x = r->x; pt2.x = (r->x+r->width);
                    pt1.y = r->y; pt2.y = (r->y+r->height);
                    hsX = r->width/2;
                    hsY = r->height/2;
                    // Since we are setting ROI with the OpenCV image
                    // we need to add an offset to the returned position
                    xCen = r->x + hsX + xOffset;
                    yCen = r->y + hsY + yOffset;
                    hsX = (int)((float)(hsX)*0.6f);
                    hsY = (int)((float)(hsY)*0.9f);
                    xDist = xCen-m_imgXCen;
                    yDist = yCen-m_imgYCen;
                    curDist = sqrt((float)(xDist*xDist + yDist*yDist));
                    if (curDist < bestDist) {
                        bestDist = curDist;
                        curXCen = xCen;
                        curYCen = yCen;
                        curHSX = hsX;
                        curHSY = hsY;
                        isFrontFace = false;
                        isProfileFace = true;
                    }
                }
            }

            // Clear contour information
            cvClearMemStorage(m_contourStorage);

            // Reset current image ROI
            cvSetImageROI(m_curImage, cvRect(0, 0, m_width, m_height));

            if (curHSX > 0 && curHSY > 0) {
                m_faceROI->SetXCen(curXCen);
                m_faceROI->SetYCen(curYCen);
                m_faceROI->Reset(curHSX, curHSY, 0);
                return true;
            }
            else {
                return false;
            }
        }
        catch (BaseException *e) {
            throw e->add(new GeneralException("Exception in CvFaceDetection::FaceDetectionWSkin:",__FILE__,__LINE__));
        }
    }

    bool FaceDetectionWROI(VisualROI *i_trackedROI)
    {
        try {
            int trackedXCen = i_trackedROI->GetXCen();
            int trackedYCen = i_trackedROI->GetYCen();

            cvClearMemStorage(m_storage1);
            // Detect frontal faces
            CvSeq* faces = cvHaarDetectObjects( m_curImage, m_haarClassifier1, m_storage1,
                1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );
            cvClearMemStorage(m_storage2);
            // Detect face profiles
            CvSeq* profiles = cvHaarDetectObjects( m_curImage, m_haarClassifier2, m_storage2,
                1.2, 2, CV_HAAR_DO_CANNY_PRUNING, cvSize(0,0) );

            int curXCen = 0;
            int curYCen = 0;
            int curHSX = 0;
            int curHSY = 0;
            int i;
            CvPoint pt1, pt2;

            // Find a face near the given ROI
            for(i = 0; i < (faces ? faces->total : 0); i++ ) {
                CvRect* r = (CvRect*)cvGetSeqElem( faces, i );
                pt1.x = r->x; pt2.x = (r->x+r->width);
                pt1.y = r->y; pt2.y = (r->y+r->height);
                if (trackedXCen > pt1.x && trackedXCen < pt2.x &&
                    trackedYCen > pt1.y && trackedYCen < pt2.y) {
                    // Found a face in given ROI
                    curHSX = r->width/2;
                    curHSY = r->height/2;
                    curXCen = r->x + curHSX;
                    curYCen = r->y + curHSY;
                    m_faceROI->SetXCen(curXCen);
                    m_faceROI->SetYCen(curYCen);
                    m_faceROI->Reset(curHSX, curHSY, 0);
                    isFrontFace = true;
                    isProfileFace = false;
                    return true;
                }
            }

            for(i = 0; i < (profiles ? profiles->total : 0); i++ ) {
                CvRect* r = (CvRect*)cvGetSeqElem( profiles, i );
                pt1.x = r->x; pt2.x = (r->x+r->width);
                pt1.y = r->y; pt2.y = (r->y+r->height);
                if (trackedXCen > pt1.x && trackedXCen < pt2.x &&
                    trackedYCen > pt1.y && trackedYCen < pt2.y) {
                    // Found a face in given ROI
                    curHSX = r->width/2;
                    curHSY = r->height/2;
                    curXCen = r->x + curHSX;
                    curYCen = r->y + curHSY;
                    m_faceROI->SetXCen(curXCen);
                    m_faceROI->SetYCen(curYCen);
                    m_faceROI->Reset(curHSX, curHSY, 0);
                    isFrontFace = false;
                    isProfileFace = true;
                    return true;
                }
            }

            // If we get here, no face was detected near the given ROI
            return false;
        }
        catch (BaseException *e) {
            throw e->add(new GeneralException("Exception in CvFaceDetection::FaceDetectionWROI:",__FILE__,__LINE__));
        }
    }

private:
    int m_imageInID;
    int m_maskInID;
    int m_roiInID;
    int m_activatedInID;
    int m_roiOutID;

    RCPtr<Bool> m_activated;

    int m_width;
    int m_height;
    int m_numChannels;
    int m_numPixels;
    int m_numBytesInFrame;
    int m_imgXCen;
    int m_imgYCen;
    int m_maxNumSkinRegions;
    int m_minROIWidth;
    int m_minROIHeight;

    CvMemStorage *m_storage1;
    CvMemStorage *m_storage2;
    CvHaarClassifierCascade *m_haarClassifier1;
    CvHaarClassifierCascade *m_haarClassifier2;
    CvRect *m_curSkinROI;
    IplImage *m_curImage;
    IplImage *m_skinMask;
    IplImage *m_filtMask;
    // Internal contour memory storage
    CvMemStorage *m_contourStorage;

    RCPtr<VisualROI> m_faceROI;

    bool isFrontFace;
    bool isProfileFace;
};

}

#endif

--- NEW FILE: VisualFeatureDesc.cc ---

/* Copyright (C) 2005 Pierre Moisan (Pie...@US...)

   This library is free software; you can redistribute it and/or modify
   it under the terms of the GNU Lesser General Public License as published by
   the Free Software Foundation; either version 2.1 of the License, or
   (at your option) any later version.

   This library is distributed in the hope that it will be useful, but
   WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
   Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #include "VisualFeatureDesc.h" #include "operators.h" #include "conversion.h" #include "vmethod.h" #include "Vector.h" using namespace FD; using namespace std; namespace RobotFlow { // // Declaring known types // DECLARE_TYPE(VisualFeatureDesc<double>) typedef VisualFeatureDesc<double> VisualDoubleFeatureDesc; typedef Vector<VisualFeatureDesc<double> *> VisualDoubleFeatureDescVec; // // Adding operators // // Concat operator using two VisualFeatureDesc inputs template <class X, class Y, class Z> ObjectRef concatVFDFunction(ObjectRef op1, ObjectRef op2) { RCPtr<X> op1Value = op1; RCPtr<Y> op2Value = op2; //creating output vector RCPtr<Z> resultValue = RCPtr<Z>(Z::alloc(2)); //concat 2 values into a vector (*resultValue)[0] = &(*op1Value); (*resultValue)[1] = &(*op2Value); return resultValue; } REGISTER_DOUBLE_VTABLE_TEMPLATE( concatVtable, concatVFDFunction, VisualFeatureDesc<double>, VisualFeatureDesc<double>, Vector<VisualFeatureDesc<double> *>, 0) // Concat operator using two Vector<VisualFeatureDesc *> inputs template <class X, class Y, class Z> ObjectRef concatVFDVecVecFunction(ObjectRef op1, ObjectRef op2) { RCPtr<X> op1Value = op1; RCPtr<Y> op2Value = op2; RCPtr<Z> resultValue(Z::alloc(op1Value->size() + op2Value->size())); //copy first part for (int i =0; i < op1Value->size(); i++) { (*resultValue)[i] = (*op1Value)[i]; } //copy last part for (int i =0; i < op2Value->size(); i++) { (*resultValue)[i + op1Value->size()] = (*op2Value)[i]; } return resultValue; } REGISTER_DOUBLE_VTABLE_TEMPLATE( concatVtable, concatVFDVecVecFunction, Vector<VisualFeatureDesc<double> *>, Vector<VisualFeatureDesc<double> *>, Vector<VisualFeatureDesc<double> *>, 0) // Concat operator using a Vector<VisualFeatureDesc *> input // followed by a VisualFeatureDesc input 
template<class X, class Y, class Z> ObjectRef concatVectorVFDFunction(ObjectRef op1, ObjectRef op2) { RCPtr<X> op1Value = op1; RCPtr<Y> op2Value = op2; //creating new vector RCPtr<Z> resultValue(Z::alloc(op1Value->size() + 1)); //copying values from vector for (int i = 0; i < resultValue->size(); i++) { (*resultValue)[i] = (*op1Value)[i]; } //adding last element (*resultValue)[resultValue->size() - 1] = &(*op2Value); return resultValue; } REGISTER_DOUBLE_VTABLE_TEMPLATE( concatVtable, concatVectorVFDFunction, Vector<VisualFeatureDesc<double> *>, VisualFeatureDesc<double>, Vector<VisualFeatureDesc<double> *>, 0) // Concat operator using a VisualFeatureDesc input // followed by a Vector<VisualFeatureDesc *> input template<class X, class Y, class Z> ObjectRef concatVFDVectorFunction(ObjectRef op1, ObjectRef op2) { RCPtr<X> op1Value = op1; RCPtr<Y> op2Value = op2; //creating new vector RCPtr<Z> resultValue(Z::alloc(op2Value->size() + 1)); //copying values from vector for (int i = 1; i < resultValue->size(); i++) { (*resultValue)[i] = (*op2Value)[i - 1]; } //adding first element (*resultValue)[0] = &(*op1Value); return resultValue; } REGISTER_DOUBLE_VTABLE_TEMPLATE( concatVtable, concatVFDVectorFunction, VisualFeatureDesc<double>, Vector<VisualFeatureDesc<double> *>, Vector<VisualFeatureDesc<double> *>, 0) // ToVect conversion template <class T, class U> ObjectRef VisualFeatDesc2VectorConversion (ObjectRef in) { RCPtr<T> FromVFD = in; RCPtr<U> ToVector(U::alloc(1)); (*ToVector)[0] = &(*FromVFD); return ToVector; } static int dummy_init_for_visual_feature_to_vect = FD::vmethod()->registerFunct0( VisualFeatDesc2VectorConversion<VisualFeatureDesc<double>,Vector<VisualFeatureDesc<double>* > >, &typeid(VisualFeatureDesc<double>), "toVect"); //TODO working conversion! 
//static int dummy_init_for_visual_feature_conversion = Conversion::addConvFunction //<VisualFeatureDesc<double>, Vector<VisualFeatureDesc<double> *> > //(VisualFeatDesc2VectorConversion<Vector<VisualFeatureDesc<double> *> >); }//namespace RobotFlow --- NEW FILE: PFUtilityFct.cc --- // // Utility functions mainly for random numbers generation // Copyright (C) 2004 Jean-Marc Valin // #include "PFUtilityFct.h" using namespace std; namespace RobotFlow { static float PFUTIL_logtable2[PFUTIL_FLOGLOOKUP2SIZE]; void PFUTIL_build_flog_table() { static bool init=false; if (!init) { PFUTIL_FloatManip m; for (int i=0;i<PFUTIL_FLOGLOOKUP2SIZE;i++) { m.i = (i<<PFUTIL_FLOGLOOKUP2SHIFT) | 0x3f800000; PFUTIL_logtable2[i]=log(m.f); } init=true; } } float PFUTIL_flog(const float in) { PFUTIL_build_flog_table(); PFUTIL_FloatManip m; m.f = in; //Extract the mantissa and perform lookup for log(mantissa) float tb = PFUTIL_logtable2[(m.i & 0x007fffff)>>PFUTIL_FLOGLOOKUP2SHIFT]; //Extract the exponent int id = (m.i>>23)-127; return id*M_LN2 + tb; } float PFUTIL_randn() { static bool avail = false; static float cached=0.f; if (avail) { avail = false; return cached; } else { float U1, U2, S, x; do { U1 = float(rand())/float(RAND_MAX); U2 = float(rand())/float(RAND_MAX); U1 = 2*U1-1; U2 = 2*U2-1; S = U1*U1 + U2*U2; } while (S >= 1 || S == 0.0f); float tmp = sqrt(-2 * PFUTIL_flog(S) / S); cached = tmp * U1; avail = true; return tmp * U2; } } float PFUTIL_ran() { const unsigned int jflone = 0x3f800000; const unsigned int jflmsk = 0x007fffff; static int idum = 10000; union {int i; float f;} ran; idum = 1664525*idum + 1013904223; ran.i = jflone | (jflmsk & idum); ran.f -= 1; return ran.f; } int PFUTIL_find_range(float i_x, const float *i_cumul, int i_cumulSize) { int low=0; int high=i_cumulSize-1; int cur = high>>1; while (1) { if (i_cumul[cur] > i_x) { if (cur == 0 || i_cumul[cur-1] <= i_x) { break; } else { high = cur-1; cur = (low+cur) >> 1; } } else { low = cur; cur = (high+cur+1) >> 1; if 
(cur==high) { break; } } } return cur; } } --- NEW FILE: IntegralColorExtraction.cc --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #include "IntegralColorExtraction.h" using namespace FD; using namespace std; namespace RobotFlow { DECLARE_NODE(IntegralColorExtraction) DECLARE_TYPE(IntegralColorExtraction) /*Node * * @name IntegralColorExtraction * @category RobotFlow:Vision:FeaturesExtraction * @description Integral color features extraction. * * @parameter_name FRAME_WIDTH * @parameter_type int * @parameter_value 320 * @parameter_description Video frame width. * * @parameter_name FRAME_HEIGHT * @parameter_type int * @parameter_value 240 * @parameter_description Video frame height. * * @parameter_name NUM_CHANNELS * @parameter_type int * @parameter_value 3 * @parameter_description Number of channels in video frame. * * @parameter_name NUM_HORI_RECT * @parameter_type int * @parameter_value 2 * @parameter_description Number of horizontal rectangle to use for integral features. * * @parameter_name NUM_VERT_RECT * @parameter_type int * @parameter_value 2 * @parameter_description Number of vertical rectangle to use for integral features. 
 *
 * @parameter_name USE_RECT_DIFF
 * @parameter_type bool
 * @parameter_value true
 * @parameter_description Features are differences between rectangle regions.
 *
 * @parameter_name USE_BOUNDARY_DIFF
 * @parameter_type bool
 * @parameter_value true
 * @parameter_description Flag indicating whether to detect a color boundary for better scale adaptation.
 *
 * @parameter_name BOUNDARY_DIFF_THRESH
 * @parameter_type float
 * @parameter_value 25.0
 * @parameter_description Minimum difference between boundary and ROI rectangle mean values.
 *
 * @input_name IN_IMAGE
 * @input_type Image
 * @input_description Current frame to process.
 *
 * @input_name CURRENT_ROI
 * @input_type VisualROI
 * @input_description The current region of interest.
 *
 * @input_name PREPROCESS_FRAME
 * @input_type bool
 * @input_description Flag indicating whether to preprocess a new image.
 *
 * @output_name OUT_FEATURES
 * @output_type Vector<VisualFeatureDesc<double> *>
 * @output_description Output features descriptor.
 *
 * @output_name PREPROCESS_COMPLETED
 * @output_type int
 * @output_description Output to force preprocessing.
* END*/ // // Default constructor for Object // IntegralColorExtraction::IntegralColorExtraction() : m_width(-1), m_height(-1), m_numChannels(-1), m_numHoriIntRect(0), m_numVertIntRect(0), m_maxValue(0), m_useRectDiff(false), m_useBoundary(false), m_boundaryMeanDiffThresh(0.0), m_tmpMeanFeatures(NULL), m_curMeanVal(NULL), m_featVect(NULL) { } // // Constructor with complete intialisation // IntegralColorExtraction::IntegralColorExtraction(int i_width, int i_height, int i_numChannels, int i_numHoriIntRect, int i_numVertIntRect, double i_maxValue, bool i_useRectDiff, bool i_useBoundary, double i_boundaryMeanDiffThresh) : VisualFeaturesExtraction<double>(string("IntegralColorExtraction"), ParameterSet()), m_width(i_width), m_height(i_height), m_numChannels(i_numChannels), m_numHoriIntRect(i_numHoriIntRect), m_numVertIntRect(i_numVertIntRect), m_maxValue(i_maxValue), m_useRectDiff(i_useRectDiff), m_useBoundary(i_useBoundary), m_boundaryMeanDiffThresh(i_boundaryMeanDiffThresh), m_tmpMeanFeatures(NULL), m_curMeanVal(NULL), m_featVect(NULL) { Initialize(); } // // BufferedNode constructor // IntegralColorExtraction::IntegralColorExtraction(string nodeName, ParameterSet params) : VisualFeaturesExtraction<double>(nodeName, params), m_tmpMeanFeatures(NULL), m_curMeanVal(NULL), m_featVect(NULL) { m_imageInID = addInput("IN_IMAGE"); m_roiInID = addInput("CURRENT_ROI"); m_useNextImgInID = addInput("PREPROCESS_FRAME"); m_featuresOutID = addOutput("OUT_FEATURES"); m_ppCompletedOutID = addOutput("PREPROCESS_COMPLETED"); m_width = dereference_cast<int>(parameters.get("FRAME_WIDTH")); m_height = dereference_cast<int>(parameters.get("FRAME_HEIGHT")); m_numChannels = dereference_cast<int>(parameters.get("NUM_CHANNELS")); m_numHoriIntRect = dereference_cast<int>(parameters.get("NUM_HORI_RECT")); m_numVertIntRect = dereference_cast<int>(parameters.get("NUM_VERT_RECT")); m_useRectDiff = dereference_cast<bool>(parameters.get("USE_RECT_DIFF")); m_useBoundary = 
dereference_cast<bool>(parameters.get("USE_BOUNDARY_DIFF")); m_boundaryMeanDiffThresh = dereference_cast<float>(parameters.get("BOUNDARY_DIFF_THRESH")); if (m_useRectDiff) { m_maxValue = 510.0; } else { m_maxValue = 255.0; } Initialize(); } IntegralColorExtraction::~IntegralColorExtraction() { for (int c=0; c<m_numChannels; c++) { cvReleaseImage(&(m_sumImage[c])); cvReleaseImage(&(m_chImage[c])); } cvReleaseImage(&m_curImage); delete [] m_sumPixPtr; delete [] m_chPixPtr; delete [] m_sumImage; delete [] m_chImage; delete [] m_curMeanVal; delete [] m_tmpMeanFeatures; } // Modified BufferedNode request method to support cyclic node connection void IntegralColorExtraction::request(int output_id, const ParameterSet &req) { if (req.exist("LOOKAHEAD")) { outputs[output_id].lookAhead = max(outputs[output_id].lookAhead,dereference_cast<int> (req.get("LOOKAHEAD"))); } if (req.exist("LOOKBACK")) { outputs[output_id].lookBack = max(outputs[output_id].lookBack,dereference_cast<int> (req.get("LOOKBACK"))); } if (req.exist("INORDER")) { inOrder = true; } int outputLookAhead=0, outputLookBack=0; outputLookAhead=max(outputLookAhead, outputs[output_id].lookAhead); outputLookBack =max(outputLookBack, outputs[output_id].lookBack); // Every output usually requires these inputs ParameterSet nextImgReq; nextImgReq.add("LOOKAHEAD", ObjectRef(Int::alloc(inputsCache[m_useNextImgInID].lookAhead+outputLookAhead))); nextImgReq.add("LOOKBACK", ObjectRef(Int::alloc(inputsCache[m_useNextImgInID].lookBack+outputLookBack))); inputs[m_useNextImgInID].node->request(inputs[m_useNextImgInID].outputID,nextImgReq); if (output_id == m_featuresOutID) { ParameterSet myReq; myReq.add("LOOKAHEAD", ObjectRef(Int::alloc(inputsCache[m_roiInID].lookAhead+outputLookAhead))); myReq.add("LOOKBACK", ObjectRef(Int::alloc(inputsCache[m_roiInID].lookBack+outputLookBack))); inputs[m_roiInID].node->request(inputs[m_roiInID].outputID,myReq); } else if (output_id == m_ppCompletedOutID) { ParameterSet myReq; 
myReq.add("LOOKAHEAD", ObjectRef(Int::alloc(inputsCache[m_imageInID].lookAhead+outputLookAhead))); myReq.add("LOOKBACK", ObjectRef(Int::alloc(inputsCache[m_imageInID].lookBack+outputLookBack))); inputs[m_imageInID].node->request(inputs[m_imageInID].outputID, myReq); } else { throw new GeneralException ("IntegralColorExtraction::request : unknown output ID.",__FILE__,__LINE__); } } void IntegralColorExtraction::calculate(int output_id, int count, Buffer &out) { try { bool useNext = dereference_cast<bool>(getInput(m_useNextImgInID, count)); if (useNext) { RCPtr<Image> imageRef = getInput(m_imageInID, count); // Verify input image sanity if (imageRef->get_width() != m_width || imageRef->get_height() != m_height || imageRef->get_pixelsize() != m_numChannels) { throw new GeneralException ("IntegralColorExtraction::calculate : image parameters do not correspond to given input.",__FILE__,__LINE__); } // Preprocess input image Preprocess((const unsigned char *)(imageRef->get_data())); } if (output_id == m_featuresOutID) { ObjectRef roiRef = getInput(m_roiInID, count); if (!roiRef->isNil()) { RCPtr<VisualROI> roiPtr = roiRef; ExtractFeatures(roiPtr.get()); (*outputs[m_featuresOutID].buffer)[count] = m_featVect; } else { (*outputs[m_featuresOutID].buffer)[count] = ObjectRef(nilObject); } } else if (output_id == m_ppCompletedOutID) { // Preprocess image than output true RCPtr<Image> imageRef = getInput(m_imageInID, count); // Verify input image sanity if (imageRef->get_width() != m_width || imageRef->get_height() != m_height || imageRef->get_pixelsize() != m_numChannels) { throw new GeneralException ("IntegralColorExtraction::calculate : image parameters do not correspond to given input.",__FILE__,__LINE__); } // Preprocess input image Preprocess((const unsigned char *)(imageRef->get_data())); (*outputs[m_ppCompletedOutID].buffer)[count] = ObjectRef(Int::alloc(1)); } } catch (BaseException *e) { throw e->add(new GeneralException("Exception in 
IntegralColorExtraction::calculate:",__FILE__,__LINE__)); } } void IntegralColorExtraction::ExtractFeatures(VisualROI *i_roi) { try { CvPoint ulcLimit, lrcLimit, curULC, curLRC; int roiWidth, roiHeight, deltaX, deltaY; double numPixelsInSubROI; int c,i,j; double *featPtr = (*m_featVect)[0]->GetFeatures(); int cenX = i_roi->GetXCen(); int cenY = i_roi->GetYCen(); if (cenX < 0 || cenX >= m_width || cenY < 0 || cenY >= m_height) { // ROI center is out of image plane // Ignore current ROI (*m_featVect)[0]->SetValidity(false); return; } ulcLimit.x = cenX - i_roi->GetHSX(); if (ulcLimit.x < 0) { ulcLimit.x = 0; } else if (ulcLimit.x >= m_width) { ulcLimit.x = m_width-1; } ulcLimit.y = cenY - i_roi->GetHSY(); if (ulcLimit.y < 0) { ulcLimit.y = 0; } else if (ulcLimit.y >= m_height) { ulcLimit.y = m_height-1; } lrcLimit.x = cenX + i_roi->GetHSX(); if (lrcLimit.x < 0) { lrcLimit.x = 0; } else if (lrcLimit.x >= m_width) { lrcLimit.x = m_width-1; } lrcLimit.y = cenY + i_roi->GetHSY(); if (lrcLimit.y < 0) { lrcLimit.y = 0; } else if (lrcLimit.y >= m_height) { lrcLimit.y = m_height-1; } roiWidth = lrcLimit.x - ulcLimit.x; roiHeight = lrcLimit.y - ulcLimit.y; if (roiWidth==0 || roiHeight==0) { // Invalid ROI (*m_featVect)[0]->SetValidity(false); return; } else { (*m_featVect)[0]->SetValidity(true); } //cout << "ROI Info: ulc=(" << ulcLimit.x << "," << ulcLimit.y << ") lrc=(" << lrcLimit.x << "," << lrcLimit.y << endl; if (m_numHoriIntRect) { deltaX = roiWidth/m_numHoriIntRect; } else { deltaX = roiWidth; } if (m_numVertIntRect) { deltaY = roiHeight/m_numVertIntRect; } else { deltaY = roiHeight; } for (c=0; c<m_numChannels; c++) { m_sumPixPtr[c] = (int *)(m_sumImage[c]->imageData); } numPixelsInSubROI = deltaX*deltaY; if (numPixelsInSubROI == 0) { // Invalid ROI (*m_featVect)[0]->SetValidity(false); return; } curULC.y = ulcLimit.y; curLRC.y = curULC.y + deltaY; if (m_useRectDiff) { double *p_tmpMeanFeatures = m_tmpMeanFeatures; // First pass: compute mean values for (i=0; 
i<m_numVertIntRect; i++) { curULC.x = ulcLimit.x; curLRC.x = curULC.x + deltaX; for (j=0; j<m_numHoriIntRect; j++) { for (c=0; c<m_numChannels; c++) { // Compute mean color channel *p_tmpMeanFeatures = (double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; *featPtr++ = *p_tmpMeanFeatures++; } curULC.x += deltaX; curLRC.x += deltaX; } curULC.y += deltaY; curLRC.y += deltaY; } // Second pass: compute differences for (i=0; i<m_numIntRect; i++) { for (c=0; c<m_numChannels; c++) { m_curMeanVal[c] = m_tmpMeanFeatures[i*m_numChannels + c]; } for (j=0; j<m_numIntRect; j++) { if (j != i) { for (c=0; c<m_numChannels; c++) { *featPtr++ = m_tmpMeanFeatures[j*m_numChannels + c] - m_curMeanVal[c] + 255.0; } } } } } else { for (i=0; i<m_numVertIntRect; i++) { curULC.x = ulcLimit.x; curLRC.x = curULC.x + deltaX; for (j=0; j<m_numHoriIntRect; j++) { for (c=0; c<m_numChannels; c++) { // Compute mean color channel double tmpVal = (double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; *featPtr++ = tmpVal; } curULC.x += deltaX; curLRC.x += deltaX; } curULC.y += deltaY; curLRC.y += deltaY; } } if (m_useBoundary) { CvPoint boundULC, boundLRC; double numPixelsInBoundary; int boundWidth = cvRound((float)(roiWidth)*0.1f); int boundHeight = cvRound((float)(roiHeight)*0.1f); // Check top boundary if (ulcLimit.y >= boundHeight) { int numDiff = 0; boundULC.y = ulcLimit.y - boundHeight; boundLRC.y = ulcLimit.y; boundULC.x = ulcLimit.x; boundLRC.x = lrcLimit.x; curULC.y = ulcLimit.y; curLRC.y = ulcLimit.y + deltaY; curULC.x = ulcLimit.x; curLRC.x = lrcLimit.x; numPixelsInBoundary = roiWidth*boundHeight; numPixelsInSubROI = roiWidth*deltaY; for (c=0; 
c<m_numChannels; c++) { double boundVal = (double)(m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundLRC.x] - m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundULC.x] - m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundLRC.x] + m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundULC.x])/numPixelsInBoundary; double roiVal = (double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; //cout << "Top boundary: roiVal=" << roiVal << " boundVal=" << boundVal << endl; if (abs(boundVal-roiVal) > m_boundaryMeanDiffThresh) { numDiff++; } } if (numDiff > 0) { *featPtr++ = 2.0; } else { *featPtr++ = 0.0; } } else { // Boundary out of image // Mark as unknown boundary *featPtr++ = 1.0; } // Check bottom boundary if (lrcLimit.y+boundHeight < m_height) { int numDiff = 0; boundULC.y = lrcLimit.y; boundLRC.y = lrcLimit.y + boundHeight; boundULC.x = ulcLimit.x; boundLRC.x = lrcLimit.x; curULC.y = lrcLimit.y - deltaY; curLRC.y = lrcLimit.y; curULC.x = ulcLimit.x; curLRC.x = lrcLimit.x; numPixelsInBoundary = roiWidth*boundHeight; numPixelsInSubROI = roiWidth*deltaY; for (c=0; c<m_numChannels; c++) { double boundVal = (double)(m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundLRC.x] - m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundULC.x] - m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundLRC.x] + m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundULC.x])/numPixelsInBoundary; double roiVal = (double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; //cout << "Bottom boundary: roiVal=" << roiVal << " boundVal=" << boundVal << endl; if (abs(boundVal-roiVal) > m_boundaryMeanDiffThresh) { numDiff++; } } if (numDiff > 0) { *featPtr++ = 2.0; } else { *featPtr++ = 0.0; } } else { // Boundary out of 
image // Mark as unknown boundary *featPtr++ = 1.0; } // Check left boundary if (ulcLimit.x >= boundWidth) { int numDiff = 0; boundULC.y = ulcLimit.y; boundLRC.y = lrcLimit.y; boundULC.x = ulcLimit.x - boundWidth; boundLRC.x = ulcLimit.x; curULC.y = ulcLimit.y; curLRC.y = lrcLimit.y; curULC.x = ulcLimit.x; curLRC.x = ulcLimit.x + deltaX; numPixelsInBoundary = roiHeight*boundWidth; numPixelsInSubROI = roiHeight*deltaX; for (c=0; c<m_numChannels; c++) { double boundVal = (double)(m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundLRC.x] - m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundULC.x] - m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundLRC.x] + m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundULC.x])/numPixelsInBoundary; double roiVal = (double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; //cout << "Left boundary: roiVal=" << roiVal << " boundVal=" << boundVal << endl; if (abs(boundVal-roiVal) > m_boundaryMeanDiffThresh) { numDiff++; } } if (numDiff > 0) { *featPtr++ = 2.0; } else { *featPtr++ = 0.0; } } else { // Boundary out of image // Mark as unknown boundary *featPtr++ = 1.0; } // Check right boundary if (lrcLimit.x+boundWidth < m_width) { int numDiff = 0; boundULC.y = ulcLimit.y; boundLRC.y = lrcLimit.y; boundULC.x = lrcLimit.x; boundLRC.x = lrcLimit.x + boundWidth; curULC.y = ulcLimit.y; curLRC.y = lrcLimit.y; curULC.x = lrcLimit.x - deltaX; curLRC.x = lrcLimit.x; numPixelsInBoundary = roiHeight*boundWidth; numPixelsInSubROI = roiHeight*deltaX; for (c=0; c<m_numChannels; c++) { double boundVal = (double)(m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundLRC.x] - m_sumPixPtr[c][boundLRC.y*m_imgSumWidth+boundULC.x] - m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundLRC.x] + m_sumPixPtr[c][boundULC.y*m_imgSumWidth+boundULC.x])/numPixelsInBoundary; double roiVal = 
(double)(m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curLRC.x] - m_sumPixPtr[c][curLRC.y*m_imgSumWidth+curULC.x] - m_sumPixPtr[c][curULC.y*m_imgSumWidth+curLRC.x] + m_sumPixPtr[c][curULC.y*m_imgSumWidth+curULC.x])/numPixelsInSubROI; //cout << "Right boundary: roiVal=" << roiVal << " boundVal=" << boundVal << endl; if (abs(boundVal-roiVal) > m_boundaryMeanDiffThresh) { numDiff++; } } if (numDiff > 0) { *featPtr++ = 2.0; } else { *featPtr++ = 0.0; } } else { // Boundary out of image // Mark as unknown boundary *featPtr++ = 1.0; } } } catch (BaseException *e) { throw e->add(new GeneralException("Exception in IntegralColorExtraction::ExtractFeatures:",__FILE__,__LINE__)); } } void IntegralColorExtraction::ExtractFeatures(IplImage *i_input, VisualROI *i_roi) { try { Preprocess((const unsigned char *)(i_input->imageData)); ExtractFeatures(i_roi); } catch (BaseException *e) { throw e->add(new GeneralException("Exception in IntegralColorExtraction::ExtractFeatures:",__FILE__,__LINE__)); } } void IntegralColorExtraction::Preprocess(const unsigned char *i_src) { int c, p; const unsigned char *p_srcPix = i_src; // First extract all channels of image for (c=0; c<m_numChannels; c++) { m_chPixPtr[c] = (unsigned char *)(m_chImage[c]->imageData); } for (p=0; p<m_numPixels; p++) { for (c=0; c<m_numChannels; c++) { *(m_chPixPtr[c])++ = *p_srcPix++; } } // Preprocessing consists of computing the sum of pixels // in order to have the integral images. 
for (c=0; c<m_numChannels; c++) { cvIntegral(m_chImage[c], m_sumImage[c], NULL, NULL); } } void IntegralColorExtraction::Preprocess(IplImage *i_srcImg) { int c, p; const unsigned char *p_srcPix = (const unsigned char *)i_srcImg->imageData; // First extract all channels of image for (c=0; c<m_numChannels; c++) { m_chPixPtr[c] = (unsigned char *)(m_chImage[c]->imageData); } for (p=0; p<m_numPixels; p++) { for (c=0; c<m_numChannels; c++) { *(m_chPixPtr[c])++ = *p_srcPix++; } } // Preprocessing consists of computing the sum of pixels // in order to have the integral images. for (c=0; c<m_numChannels; c++) { cvIntegral(m_chImage[c], m_sumImage[c], NULL, NULL); } } // // Private methods // void IntegralColorExtraction::Initialize() { m_numPixels = m_width*m_height; m_numBytesInFrame = m_numPixels*m_numChannels; m_numIntRect = m_numHoriIntRect*m_numVertIntRect; int numIntRectFeatures; if (m_useRectDiff) { numIntRectFeatures = m_numIntRect+m_numIntRect*(m_numIntRect-1); m_tmpMeanFeatures = new double[m_numIntRect*m_numChannels]; m_curMeanVal = new double[m_numChannels]; } else { numIntRectFeatures = m_numIntRect; } m_featVect = RCPtr<Vector<VisualFeatureDesc<double> *> >(Vector<VisualFeatureDesc<double> *>::alloc(1)); (*m_featVect)[0] = new VisualIntegralDesc<double>( e_VISUALINTDESC_EuclideanDist, m_numChannels, numIntRectFeatures, m_maxValue, m_useRectDiff, m_useBoundary); m_chImage = new IplImage *[m_numChannels]; m_sumImage = new IplImage *[m_numChannels]; m_chPixPtr = new unsigned char*[m_numChannels]; m_sumPixPtr = new int *[m_numChannels]; CvSize imgSize; imgSize.width = m_width; imgSize.height = m_height; m_curImage = cvCreateImage(imgSize, IPL_DEPTH_8U, m_numChannels); // Sum images are of size (m_width+1)x(m_height+1) m_imgSumWidth = m_width+1; CvSize sumSize; sumSize.width = m_imgSumWidth; sumSize.height = m_height+1; for (int c=0; c<m_numChannels; c++) { m_chImage[c] = cvCreateImage(imgSize, IPL_DEPTH_8U, 1); m_sumImage[c] = cvCreateImage(sumSize, 
IPL_DEPTH_32S, 1); } } }//namespace RobotFlow --- NEW FILE: VisualIntegralDesc.cc --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #include "VisualIntegralDesc.h" using namespace FD; using namespace std; namespace RobotFlow { DECLARE_TYPE(VisualIntegralDesc<double>) }//namespace RobotFlow --- NEW FILE: VisualTarget.cc --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. 
   You should have received a copy of the GNU Lesser General Public
   License along with this library; if not, write to the Free Software
   Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
*/

#include "VisualTarget.h"

using namespace FD;
using namespace std;

namespace RobotFlow {

DECLARE_TYPE(VisualTarget<double>)

}//namespace RobotFlow

--- NEW FILE: MultiIntegralCuesPFTracker.cc ---
/* Copyright (C) 2005 Pierre Moisan (Pie...@US...)

   This library is free software; you can redistribute it and/or modify
   it under the terms of the GNU Lesser General Public License as published by
   the Free Software Foundation; either version 2.1 of the License, or
   (at your option) any later version.

   This library is distributed in the hope that it will be useful, but
   WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
   Lesser General Public License for more details.

   You should have received a copy of the GNU Lesser General Public
   License along with this library; if not, write to the Free Software
   Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
*/

#include "BufferedNode.h"
#include <iostream>
#include "Image.h"
#include "cv.h"
#include "VisualTarget.h"
#include "PFGenericParticleFilter.h"
#include "PFPMRandomWalk.h"
#include "IntegralColorExtraction.h"
#include "IntegralEdgesOriExtraction.h"
#include "IntegralLBPExtraction.h"
#include <stdlib.h>
#include <sys/timeb.h>

using namespace FD;
using namespace std;

namespace RobotFlow {

class MultiIntegralCuesPFTracker;

DECLARE_NODE(MultiIntegralCuesPFTracker)

/*Node
 *
 * @name MultiIntegralCuesPFTracker
 * @category RobotFlow:Vision:Tracking
 * @description Full tracker implementation using different types of integral features.
 *
 * @parameter_name FRAME_WIDTH
 * @parameter_type int
 * @parameter_value 320
 * @parameter_description Video frame width.
 *
 * @parameter_name FRAME_HEIGHT
 * @parameter_type int
 * @parameter_value 240
 * @parameter_description Video frame height.
 *
 * @parameter_name NUM_CHANNELS
 * @parameter_type int
 * @parameter_value 3
 * @parameter_description Number of channels in video frame.
 *
 * @parameter_name TARGET_MATCH_THRES
 * @parameter_type float
 * @parameter_value 0.6
 * @parameter_description Minimum confidence to assume a target match.
 *
 * @parameter_name TARGET_ADAPT_THRES
 * @parameter_type float
 * @parameter_value 0.9
 * @parameter_description Adaptation threshold for each target cue model.
 *
 * @parameter_name TARGET_ADAPT_RATE
 * @parameter_type float
 * @parameter_value 0.1
 * @parameter_description Adaptation rate for each target cue model.
 *
 * @parameter_name CUE_ADAPT_RATE
 * @parameter_type float
 * @parameter_value 0.2
 * @parameter_description Adaptation rate for each cue weight.
 *
 * @parameter_name LIKELIHOOD_SIGMA
 * @parameter_type float
 * @parameter_value 200.0
 * @parameter_description Exponential sigma value to discriminate likelihood.
 *
 * @parameter_name NUM_HORI_RECT
 * @parameter_type int
 * @parameter_value 2
 * @parameter_description Number of horizontal rectangles to use for integral features.
 *
 * @parameter_name NUM_VERT_RECT
 * @parameter_type int
 * @parameter_value 2
 * @parameter_description Number of vertical rectangles to use for integral features.
 *
 * @parameter_name USE_RECT_DIFF
 * @parameter_type bool
 * @parameter_value true
 * @parameter_description Features are differences between rectangle regions.
 *
 * @parameter_name USE_COLOR_BOUNDARY
 * @parameter_type bool
 * @parameter_value true
 * @parameter_description Flag indicating whether to detect a color boundary for better scale adaptation.
 *
 * @parameter_name BOUNDARY_DIFF_THRESH
 * @parameter_type float
 * @parameter_value 20.0
 * @parameter_description Minimum difference between boundary and ROI rectangle mean values.
 *
 * @parameter_name NUM_ORIENTATIONS
 * @parameter_type int
 * @parameter_value 8
 * @parameter_description Number of edge orientations.
 *
 * @parameter_name MIN_EDGES_STRENGTH
 * @parameter_type float
 * @parameter_value 100.0
 * @parameter_description Threshold to remove noisy/weak edges.
 *
 * @parameter_name MAX_EDGES_STRENGTH
 * @parameter_type float
 * @parameter_value 1000.0
 * @parameter_description Limit on the edge strength.
 *
 * @parameter_name NUM_PARTICLES
 * @parameter_type int
 * @parameter_value 1000
 * @parameter_description Number of particles (samples) to use.
 *
 * @parameter_name PARTICLE_STATE_SIZE
 * @parameter_type int
 * @parameter_value 4
 * @parameter_description Particle state size.
 *
 * @parameter_name COLOR_CUE_WEIGHT
 * @parameter_type float
 * @parameter_value 0.4
 * @parameter_description Color features weight.
 *
 * @parameter_name EDGES_CUE_WEIGHT
 * @parameter_type float
 * @parameter_value 0.3
 * @parameter_description Edges features weight.
 *
 * @parameter_name LBP_CUE_WEIGHT
 * @parameter_type float
 * @parameter_value 0.3
 * @parameter_description LBP features weight.
 *
 * @parameter_name REDETECTION_FREQUENCY
 * @parameter_type int
 * @parameter_value 30
 * @parameter_description Frequency of redetection of the current tracked region (in number of frames).
 *
 * @input_name INIT_VARIANCE
 * @input_type Vector<float>
 * @input_description Variance for initialization of each of the particle states.
 *
 * @input_name NOISE_VARIANCE
 * @input_type Vector<float>
 * @input_description Noise variance for each of the particle states.
 *
 * @input_name IMAGE_IN
 * @input_type Image
 * @input_description Current video frame.
 *
 * @input_name DETECTED_ROI
 * @input_type VisualROI
 * @input_description A region of interest detected as a potential target region.
 *
 * @input_name EDGES_ORI_SUM_IMG
 * @input_type Vector<double *>
 * @input_description Reference to edges orientation integral images. While the reference is not null, no edges preprocessing will be done.
 *
 * @input_name SHOW_TRACKED_ROI
 * @input_type bool
 * @input_description Flag indicating whether to show the current tracked region.
 *
 * @input_name FRAME_BASENAME
 * @input_type string
 * @input_description Path and basename for the frame filenames that will be produced as output.
 *
 * @input_name ACTIVATION
 * @input_type bool
 * @input_description Node activation flag.
 *
 * @output_name IMAGE_OUT
 * @output_type Image
 * @output_description Current image with identified target.
 *
 * @output_name CURRENT_ROI
 * @output_type VisualROI
 * @output_description Current region of interest corresponding to the tracked target.
 *
 * @output_name CURRENT_TARGET
 * @output_type VisualTarget<double>
 * @output_description Current target to track.
 *
 * @output_name TARGET_PROB
 * @output_type float
 * @output_description Current target likelihood at the current tracked position.
 *
 * @output_name ROI_FOR_DETECTION
 * @output_type VisualROI
 * @output_description Current region of interest corresponding to the tracked target. IMPORTANT NOTE: Do not pull the ROI_FOR_DETECTION output first, since this would leave the output unsynchronized with the frames and the tracking process.
 *
 * @output_name FRAME_FILENAME
 * @output_type string
 * @output_description Filename for the current frame to save.
* END*/ class MultiIntegralCuesPFTracker : public BufferedNode { public: // // BufferedNode constructor // MultiIntegralCuesPFTracker(string nodeName, ParameterSet params) : BufferedNode(nodeName, params), m_initDone(false), m_needInitTargetFeat(false), m_tmpSample(NULL), m_particleFilter(NULL), m_predModel(NULL), m_intClrExtract(NULL), m_intEdgExtract(NULL), m_intLBPExtract(NULL) { try { m_initVarianceInID = addInput("INIT_VARIANCE"); m_noiseVarianceInID = addInput("NOISE_VARIANCE"); m_imageInID = addInput("IMAGE_IN"); m_roiInID = addInput("DETECTED_ROI"); m_edgesOriInID = addInput("EDGES_ORI_SUM_IMG"); m_showROIInID = addInput("SHOW_TRACKED_ROI"); m_baseNameInID = addInput("FRAME_BASENAME"); m_activatedInID = addInput("ACTIVATION"); m_imageOutID = addOutput("IMAGE_OUT"); m_roiOutID = addOutput("CURRENT_ROI"); m_targetOutID = addOutput("CURRENT_TARGET"); m_targetProbOutID = addOutput("TARGET_PROB"); m_detectROIOutID = addOutput("ROI_FOR_DETECTION"); m_frameNameOutID = addOutput("FRAME_FILENAME"); m_width = dereference_cast<int>(parameters.get("FRAME_WIDTH")); m_height = dereference_cast<int>(parameters.get("FRAME_HEIGHT")); m_numChannels = dereference_cast<int>(parameters.get("NUM_CHANNELS")); m_targetMatchThres = dereference_cast<float>(parameters.get("TARGET_MATCH_THRES")); m_targetAdaptThres = dereference_cast<float>(parameters.get("TARGET_ADAPT_THRES")); m_targetAdaptRate = dereference_cast<float>(parameters.get("TARGET_ADAPT_RATE")); m_cueAdaptRate = dereference_cast<float>(parameters.get("CUE_ADAPT_RATE")); m_lSigma = dereference_cast<float>(parameters.get("LIKELIHOOD_SIGMA")); m_numHoriIntRect = dereference_cast<int>(parameters.get("NUM_HORI_RECT")); m_numVertIntRect = dereference_cast<int>(parameters.get("NUM_VERT_RECT")); m_useRectDiff = dereference_cast<bool>(parameters.get("USE_RECT_DIFF")); m_useBoundary = dereference_cast<bool>(parameters.get("USE_COLOR_BOUNDARY")); m_boundaryMeanDiffThresh = 
dereference_cast<float>(parameters.get("BOUNDARY_DIFF_THRESH")); m_numOriBins = dereference_cast<int>(parameters.get("NUM_ORIENTATIONS")); m_edgesStrTresh = dereference_cast<float>(parameters.get("MIN_EDGES_STRENGTH")); m_maxStrengthValue = dereference_cast<float>(parameters.get("MAX_EDGES_STRENGTH")); m_numSamples = dereference_cast<int>(parameters.get("NUM_PARTICLES")); m_sampleStateSize = dereference_cast<int>(parameters.get("PARTICLE_STATE_SIZE")); m_colorCueWeight = dereference_cast<float>(parameters.get("COLOR_CUE_WEIGHT")); m_edgesCueWeight = dereference_cast<float>(parameters.get("EDGES_CUE_WEIGHT")); m_LBPCueWeight = dereference_cast<float>(parameters.get("LBP_CUE_WEIGHT")); m_redetectFreq = dereference_cast<int>(parameters.get("REDETECTION_FREQUENCY")); m_numPixels = m_width*m_height; m_numBytesInFrame = m_numPixels*m_numChannels; CvSize imgSize; imgSize.width = m_width; imgSize.height = m_height; m_curImage = cvCreateImage(imgSize, IPL_DEPTH_8U, m_numChannels); m_tmpSample = new PFGenericParticle(m_sampleStateSize); m_refTarget = RCPtr<VisualTarget<double> >(new VisualTarget<double>()); m_featVecRef = RCPtr<Vector<VisualFeatureDesc<double> *> >(Vector<VisualFeatureDesc<double> *>::alloc(3)); m_intClrExtract = new IntegralColorExtraction(m_width, m_height, m_numChannels, m_numHoriIntRect, m_numVertIntRect, 255.0, m_useRectDiff, m_useBoundary, m_boundaryMeanDiffThresh); m_intEdgExtract = new IntegralEdgesOriExtraction(m_width, m_height, m_numChannels, m_numHoriIntRect, m_numVertIntRect, m_numOriBins, m_edgesStrTresh, m_maxStrengthValue, m_useRectDiff); m_intLBPExtract = new IntegralLBPExtraction(m_width, m_height, m_numChannels, m_numHoriIntRect, m_numVertIntRect, 8, 1, false, true, 0, 255.0, m_useRectDiff); m_curSampleROIRef = RCPtr<VisualROI>(new VisualROI()); m_curSampleROIRef->SetType(e_VISUALROI_rectangular); m_roiColor[0] = 255; m_roiColor[1] = 0; m_roiColor[2] = 0; m_numSkipForRedetect = 0; } catch (BaseException *e) { throw e->add(new 
GeneralException("Exception in MultiIntegralCuesPFTracker::MultiIntegralCuesPFTracker:",__FILE__,__LINE__)); } } virtual ~MultiIntegralCuesPFTracker() { delete m_predModel; delete m_particleFilter; delete m_intLBPExtract; delete m_intEdgExtract; delete m_intClrExtract; // This should be deleted by the particle filter //delete m_tmpSample; cvReleaseImage(&m_curImage); } // Modified BufferedNode request method to support cyclic node connection virtual void request(int output_id, const ParameterSet &req) { if (req.exist("LOOKAHEAD")) { outputs[output_id].lookAhead = max(outputs[output_id].lookAhead,dereference_cast<int> (req.g... [truncated message content] |
Update of /cvsroot/robotflow/RobotFlow/OpenCV/include
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv6226/OpenCV/include

Added Files:
	ColorHistExtraction.h IntegralColorExtraction.h
	IntegralEdgesOriExtraction.h IntegralLBPExtraction.h Makefile.am
	MeanShiftTracker.h PFGenericParticle.h PFGenericParticleFilter.h
	PFPMRandomWalk.h PFParticle.h PFParticleFilter.h
	PFPredictionModel.h PFUtilityFct.h VisualFeatureDesc.h
	VisualFeaturesExtraction.h VisualHistogramDesc.h
	VisualIntegralDesc.h VisualROI.h VisualTarget.h
	VisualTargetManager.h VisualTracker.h
Log Message:
moved from Vision directory

--- NEW FILE: VisualIntegralDesc.h ---
/* Copyright (C) 2005 Pierre Moisan (Pie...@US...)

   This library is free software; you can redistribute it and/or modify
   it under the terms of the GNU Lesser General Public License as published by
   the Free Software Foundation; either version 2.1 of the License, or
   (at your option) any later version.

   This library is distributed in the hope that it will be useful, but
   WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
   Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _VISUALINTEGRALCOLORDESC_H_ #define _VISUALINTEGRALCOLORDESC_H_ #include <math.h> #include "VisualFeatureDesc.h" namespace RobotFlow { typedef enum { e_VISUALINTDESC_EuclideanDist = 0, // TODO: maybe add other similarity methods e_VISUALINTDESC_UnknownSimilarity } e_VISUALINTDESC_similarityType; // // Descriptor for integral image features // A typical integral image feature consists of a mean value // for a given region of interest in each channel of the image // template <class FeatType> class VisualIntegralDesc : public VisualFeatureDesc<FeatType> { public: VisualIntegralDesc() : VisualFeatureDesc<FeatType>(e_VISUALDESCRIPTOR_integral), m_simType(e_VISUALINTDESC_UnknownSimilarity), m_numClrChannels(0), m_numIntValues(0), m_featSize(0), m_valid(false), m_useRectDiff(false), m_useBoundary(false), m_intFeatures(NULL) { SetSimilarityFct(); } VisualIntegralDesc(e_VISUALINTDESC_similarityType i_simType, unsigned int i_numClrChannels, unsigned int i_numIntValues, FeatType i_maxValue, bool i_useRectDiff, bool i_useBoundary) : VisualFeatureDesc<FeatType>(e_VISUALDESCRIPTOR_integral), m_simType(i_simType), m_numClrChannels(i_numClrChannels), m_numIntValues(i_numIntValues), m_maxValue(i_maxValue), m_valid(true), m_useRectDiff(i_useRectDiff), m_useBoundary(i_useBoundary), m_intFeatures(NULL) { m_featSize = m_numClrChannels*m_numIntValues; if (m_useRectDiff) { m_maxDiffValue = m_maxValue; m_maxValue = (FeatType)((double)(m_maxDiffValue)*0.5); m_numMeanValues = (unsigned int)(sqrt((float)(m_numIntValues))); m_numIntValues -= m_numMeanValues; } if (m_useBoundary) { m_featSize += 4; } m_intFeatures = new FeatType[m_featSize]; SetSimilarityFct(); } VisualIntegralDesc(const VisualIntegralDesc<FeatType> &i_ref) { try { this->SetType(i_ref.GetType()); m_simType = 
i_ref.m_simType; m_numClrChannels = i_ref.m_numClrChannels; m_numIntValues = i_ref.m_numIntValues; m_featSize = i_ref.m_featSize; m_maxValue = i_ref.m_maxValue; m_valid = i_ref.m_valid; m_useRectDiff = i_ref.m_useRectDiff; m_useBoundary = i_ref.m_useBoundary; m_maxDiffValue = i_ref.m_maxDiffValue; m_numMeanValues = i_ref.m_numMeanValues; m_intFeatures = new FeatType[m_featSize]; for (int i=0; i<m_featSize; i++) { m_intFeatures[i] = i_ref.m_intFeatures[i]; } SetSimilarityFct(); } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception caught in VisualIntegralDesc::VisualIntegralDesc:",__FILE__,__LINE__)); } } ~VisualIntegralDesc() { delete [] m_intFeatures; } VisualIntegralDesc<FeatType> & operator =(const VisualIntegralDesc<FeatType> &i_ref) { // Avoid self assignment if (&i_ref == this) { return *this; } this->SetType(i_ref.GetType()); m_simType = i_ref.m_simType; m_numClrChannels = i_ref.m_numClrChannels; m_numIntValues = i_ref.m_numIntValues; m_featSize = i_ref.m_featSize; m_maxValue = i_ref.m_maxValue; m_valid = i_ref.m_valid; m_useRectDiff = i_ref.m_useRectDiff; m_useBoundary = i_ref.m_useBoundary; m_maxDiffValue = i_ref.m_maxDiffValue; m_numMeanValues = i_ref.m_numMeanValues; delete [] m_intFeatures; m_intFeatures = new FeatType[m_featSize]; for (int i=0; i<m_featSize; i++) { m_intFeatures[i] = i_ref.m_intFeatures[i]; } SetSimilarityFct(); return *this; } VisualIntegralDesc<FeatType>* clone() const { return new VisualIntegralDesc<FeatType>(*this); } // Default routine to print a VisualIntegralDesc object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in VisualIntegralDesc::printOn: cannot use base class routine.",__FILE__,__LINE__); } // Default routine to read a VisualIntegralDesc object from an input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in VisualIntegralDesc::readFrom: cannot use base class routine.",__FILE__,__LINE__); } double 
Similarity(const FeatType *i_candidate, unsigned int i_size) const
	{
		try {
			if (!i_candidate) {
				throw new FD::GeneralException("VisualIntegralDesc::Similarity: invalid (NULL) candidate features.",__FILE__,__LINE__);
			}
			if (i_size != m_featSize) {
				throw new FD::GeneralException("VisualIntegralDesc::Similarity: candidate features size differs from current features descriptor size.",__FILE__,__LINE__);
			}
			if (!m_similarityFct) {
				throw new FD::GeneralException("VisualIntegralDesc::Similarity: invalid or unknown similarity type.",__FILE__,__LINE__);
			}
			if (!m_valid) {
				// Invalid descriptor, output maximal distance
				return 0.0;
			}
			// Use appropriate function
			return (this->*m_similarityFct)(i_candidate);
		}
		catch (FD::BaseException *e) {
			throw e->add(new FD::GeneralException("Exception caught in VisualIntegralDesc::Similarity:",__FILE__,__LINE__));
		}
	}

	void Adapt(const FeatType *i_candidate, unsigned int i_size, double i_rate)
	{
		try {
			if (!i_candidate) {
				throw new FD::GeneralException("VisualIntegralDesc::Adapt: invalid (NULL) candidate features.",__FILE__,__LINE__);
			}
			if (i_size != m_featSize) {
				throw new FD::GeneralException("VisualIntegralDesc::Adapt: candidate features size differs from current features descriptor size.",__FILE__,__LINE__);
			}
			if (i_rate < 0.0 || i_rate > 1.0) {
				throw new FD::GeneralException("VisualIntegralDesc::Adapt: adaptation rate must be in the interval [0.0,1.0]",__FILE__,__LINE__);
			}
			if (i_rate == 0.0) {
				// Nothing to do
				return;
			}
			if (i_rate == 1.0) {
				SetFeatures(i_candidate, i_size);
				return;
			}
			int i;
			const FeatType *p_adaptFeat = i_candidate;
			double compRate = 1.0 - i_rate;
			for (i=0; i<m_featSize; i++) {
				m_intFeatures[i] = (FeatType)(compRate*(double)(m_intFeatures[i]) + i_rate*(double)(p_adaptFeat[i]));
			}
		}
		catch (FD::BaseException *e) {
			throw e->add(new FD::GeneralException("Exception caught in VisualIntegralDesc::Adapt:",__FILE__,__LINE__));
		}
	}

	unsigned int GetSize() const { return m_featSize; }

	FeatType *GetFeatures() { return m_intFeatures;
} const FeatType *GetCstFeatures() const { return (const FeatType *)m_intFeatures; } bool GetValidity() const { return m_valid; } void SetSize(unsigned int i_size) { throw new FD::GeneralException("Exception in VisualIntegralDesc::SetSize: cannot use base class routine.",__FILE__,__LINE__); } void SetFeatures(const FeatType *i_ref, unsigned int i_size) { try { if (!i_ref) { throw new FD::GeneralException("VisualIntegralDesc::SetFeatures: invalid (NULL) input features.",__FILE__,__LINE__); } if (i_size != m_featSize) { throw new FD::GeneralException("VisualIntegralDesc::SetFeatures: candidate features size differs from current features descriptor size.",__FILE__,__LINE__); } int i; const FeatType *p_inFeat = i_ref; for (i=0; i<m_featSize; i++) { m_intFeatures[i] = *p_inFeat++; } } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception caught in VisualIntegralDesc::SetFeatures:",__FILE__,__LINE__)); } } void SetValidity(bool i_flag) { m_valid = i_flag; } private: void SetSimilarityFct() { if (m_simType == e_VISUALINTDESC_EuclideanDist) { if (m_useRectDiff) { if (m_useBoundary) { m_similarityFct = &VisualIntegralDesc::EuclideanDistWDiffWBound; } else { m_similarityFct = &VisualIntegralDesc::EuclideanDistWDiff; } } else { if (m_useBoundary) { m_similarityFct = &VisualIntegralDesc::EuclideanDistWBound; } else { m_similarityFct = &VisualIntegralDesc::EuclideanDist; } } } else { m_similarityFct = NULL; } } double EuclideanDist(const FeatType *i_candidate) const { FeatType diff; double dist = 0.0; const FeatType *p_curFeat = (const FeatType *)m_intFeatures; const FeatType *p_candFeat = i_candidate; for (int i=0; i<m_numIntValues; i++) { double clrDist = 0.0; for (int c=0; c<m_numClrChannels; c++) { diff = ((*p_curFeat) - (*p_candFeat))/m_maxValue; clrDist += (double)(diff*diff); p_curFeat++; p_candFeat++; } dist += sqrt(clrDist); } dist /= m_numIntValues; if (dist > 1.0) { return 0.0; } return 1.0-dist; } double EuclideanDistWDiff(const FeatType 
*i_candidate) const
	{
		int i, c;
		FeatType diff;
		double dist, dist1 = 0.0, dist2 = 0.0;
		const FeatType *p_curFeat = (const FeatType *)m_intFeatures;
		const FeatType *p_candFeat = i_candidate;
		for (i=0; i<m_numMeanValues; i++) {
			double clrDist = 0.0;
			for (c=0; c<m_numClrChannels; c++) {
				diff = ((*p_curFeat) - (*p_candFeat))/m_maxValue;
				clrDist += (double)(diff*diff);
				p_curFeat++;
				p_candFeat++;
			}
			dist1 += sqrt(clrDist);
		}
		dist1 /= m_numMeanValues;
		if (dist1 > 1.0) {
			return 0.0;
		}
		for (i=0; i<m_numIntValues; i++) {
			double clrDist = 0.0;
			for (c=0; c<m_numClrChannels; c++) {
				diff = ((*p_curFeat) - (*p_candFeat))/m_maxDiffValue;
				clrDist += (double)(diff*diff);
				p_curFeat++;
				p_candFeat++;
			}
			dist2 += sqrt(clrDist);
		}
		dist2 /= m_numIntValues;
		if (dist2 > 1.0) {
			return 0.0;
		}
		dist = (dist1+dist2)*0.5;
		if (dist > 1.0) {
			return 0.0;
		}
		return 1.0-dist;
	}

	double EuclideanDistWBound(const FeatType *i_candidate) const
	{
		int i;
		FeatType diff;
		double dist = 0.0;
		const FeatType *p_curFeat = (const FeatType *)m_intFeatures;
		const FeatType *p_candFeat = i_candidate;
		for (i=0; i<m_numIntValues; i++) {
			double clrDist = 0.0;
			for (int c=0; c<m_numClrChannels; c++) {
				diff = ((*p_curFeat) - (*p_candFeat))/m_maxValue;
				clrDist += (double)(diff*diff);
				p_curFeat++;
				p_candFeat++;
			}
			dist += sqrt(clrDist);
		}
		dist /= m_numIntValues;
		if (dist > 1.0) {
			return 0.0;
		}
		// Check for an obvious boundary in candidate
		int numBoundary = 0;
		int numUnknown = 0;
		for (i=0; i<4; i++) {
			if (*p_candFeat == 1) {
				numUnknown++;
			}
			else if (*p_candFeat == 2) {
				numBoundary++;
			}
			// Advance to the next of the 4 boundary flags
			p_candFeat++;
		}
		if (numUnknown < 4) {
			int numValid = 4-numUnknown;
			dist += (double)(numValid-numBoundary)/(double)(numValid);
			if (dist > 1.0) {
				return 0.0;
			}
		}
		return 1.0-dist;
	}

	double EuclideanDistWDiffWBound(const FeatType *i_candidate) const
	{
		int i, c;
		FeatType diff;
		double dist, dist1 = 0.0, dist2 = 0.0;
		const FeatType *p_curFeat = (const FeatType *)m_intFeatures;
		const FeatType *p_candFeat = i_candidate;
		for (i=0; i<m_numMeanValues; i++) {
			double clrDist =
0.0;
			for (c=0; c<m_numClrChannels; c++) {
				diff = ((*p_curFeat) - (*p_candFeat))/m_maxValue;
				clrDist += (double)(diff*diff);
				p_curFeat++;
				p_candFeat++;
			}
			dist1 += sqrt(clrDist);
		}
		dist1 /= m_numMeanValues;
		if (dist1 > 1.0) {
			return 0.0;
		}
		for (i=0; i<m_numIntValues; i++) {
			double clrDist = 0.0;
			for (c=0; c<m_numClrChannels; c++) {
				diff = ((*p_curFeat) - (*p_candFeat))/m_maxDiffValue;
				clrDist += (double)(diff*diff);
				p_curFeat++;
				p_candFeat++;
			}
			dist2 += sqrt(clrDist);
		}
		dist2 /= m_numIntValues;
		if (dist2 > 1.0) {
			return 0.0;
		}
		dist = (dist1+dist2)*0.5;
		if (dist > 1.0) {
			return 0.0;
		}
		// Check for an obvious boundary in candidate
		int numBoundary = 0;
		int numUnknown = 0;
		for (i=0; i<4; i++) {
			if (*p_candFeat == 1) {
				numUnknown++;
			}
			else if (*p_candFeat == 2) {
				numBoundary++;
			}
			// Advance to the next of the 4 boundary flags
			p_candFeat++;
		}
		if (numUnknown < 4) {
			int numValid = 4-numUnknown;
			dist += (double)(numValid-numBoundary)/(double)(numValid);
			if (dist > 1.0) {
				return 0.0;
			}
		}
		return 1.0-dist;
	}

private:
	// Similarity/Distance type to use
	e_VISUALINTDESC_similarityType m_simType;
	// Number of color channels of the color space used
	unsigned int m_numClrChannels;
	// Number of integral rectangle values/features
	unsigned int m_numIntValues;
	// The size of the feature descriptor
	unsigned int m_featSize;
	// Maximum channel value
	FeatType m_maxValue;
	unsigned int m_numMeanValues;
	FeatType m_maxDiffValue;
	bool m_useRectDiff;
	bool m_useBoundary;
	// Validity flag in cases where ROI was invalid for features extraction
	bool m_valid;
	// Integral features
	FeatType *m_intFeatures;
	// Function pointer to the appropriate private similarity routine
	double (VisualIntegralDesc::*m_similarityFct)(const FeatType *) const;
};

}

#endif

--- NEW FILE: VisualFeaturesExtraction.h ---
/* Copyright (C) 2005 Pierre Moisan (Pie...@US...)
This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _VISUALFEATURESEXTRACTION_H_ #define _VISUALFEATURESEXTRACTION_H_ #include "BufferedNode.h" #include <iostream> #include "Image.h" #include "cv.h" #include "VisualFeatureDesc.h" #include "VisualROI.h" namespace RobotFlow { template <class FeatBaseType> class VisualFeaturesExtraction : public FD::BufferedNode { public: VisualFeaturesExtraction() { } VisualFeaturesExtraction(std::string nodeName, FD::ParameterSet params) : BufferedNode(nodeName, params) { } virtual ~VisualFeaturesExtraction() { } // Default routine to print a VisualFeaturesExtraction object to an output stream virtual void printOn(std::ostream &out) const = 0; // Default routine to read a VisualFeaturesExtraction object from an input stream virtual void readFrom(std::istream &in) = 0; virtual void calculate(int output_id, int count, FD::Buffer &out) = 0; virtual void ExtractFeatures(IplImage *i_frame, VisualROI *i_roi) = 0; virtual VisualFeatureDesc<FeatBaseType> *GetDescriptor() = 0; virtual const VisualFeatureDesc<FeatBaseType> *GetCstDescriptor() const = 0; }; } #endif --- NEW FILE: IntegralEdgesOriExtraction.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) 
This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _INTEGRALEDGESORIEXTRACTION_H_ #define _INTEGRALEDGESORIEXTRACTION_H_ #include "VisualFeaturesExtraction.h" #include "VisualIntegralDesc.h" #include "VisualTarget.h" #include <stdlib.h> #include <sys/timeb.h> namespace RobotFlow { // // Integral Edges Orientation Features Extraction for RobotFlow // class IntegralEdgesOriExtraction : public VisualFeaturesExtraction<double> { public: // // Default constructor for Object // IntegralEdgesOriExtraction(); // // Constructor with complete intialisation // IntegralEdgesOriExtraction(int i_width, int i_height, int i_numChannels, int i_numHoriIntRect, int i_numVertIntRect, int i_numOriBins, double i_edgesStrTresh, double i_maxStrengthValue, bool i_useRectDiff); // // BufferedNode constructor // IntegralEdgesOriExtraction(std::string nodeName, FD::ParameterSet params); // // Constructor using input stream // IntegralEdgesOriExtraction(std::istream &in) { readFrom(in); } virtual ~IntegralEdgesOriExtraction(); // Default routine to print a IntegralEdgesOriExtraction object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in IntegralEdgesOriExtraction::printOn: method not yet implemented.",__FILE__,__LINE__); } // Default routine to read a IntegralEdgesOriExtraction object from an 
input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in IntegralEdgesOriExtraction::readFrom: method not yet implemented.",__FILE__,__LINE__); } virtual void request(int output_id, const FD::ParameterSet &req); void calculate(int output_id, int count, FD::Buffer &out); void Preprocess(IplImage *i_srcImg); void ExtractFeatures(VisualROI *i_roi); void ExtractFeatures(IplImage *i_input, VisualROI *i_roi); VisualFeatureDesc<double> *GetDescriptor() { return (*m_featVect)[0]; } const VisualFeatureDesc<double> *GetCstDescriptor() const { return (const VisualIntegralDesc<double> *)(*m_featVect)[0]; } FD::ObjectRef GetEdgesOriSumRef() { return m_edgesOriSumRef; } void SetEdgesOriSumRef(FD::ObjectRef i_ref) { m_edgesOriSumRef = i_ref; if (!m_edgesOriSumRef->isNil()) { FD::RCPtr<FD::Vector<double *> > imgVecRef = m_edgesOriSumRef; if (imgVecRef->size() != m_numOriBins) { throw new FD::GeneralException("Exception in IntegralEdgesOriExtraction::SetEdgesOriSumRef: given reference does not seem to have the same number of orientations.",__FILE__,__LINE__); } } } private: void Initialize(); private: // Input IDs (for BufferedNode) int m_imageInID; int m_roiInID; int m_useNextImgInID; // Output IDs (for BufferedNode) int m_featuresOutID; int m_ppCompletedOutID; // Width of images int m_width; int m_imgSumWidth; // Height of images int m_height; // Number of channels in an image int m_numChannels; // Number of pixels in an image int m_numPixels; // Number of bytes in an image int m_numBytesInFrame; // Number of orientations to use int m_numOriBins; // Number of independant rectangular region to compute // the integral edges orientation features int m_numHoriIntRect; int m_numVertIntRect; int m_numIntRect; // Edge strength threshold to remove noisy edges double m_edgesStrTresh; // Maximum strength channel value double m_maxStrengthValue; double m_maxFeatValue; bool m_useRectDiff; double *m_tmpMeanFeatures; double *m_curMeanVal; // Integral color 
descriptor for region of interest FD::RCPtr<FD::Vector<VisualFeatureDesc<double> *> > m_featVect; // Temporary image copy IplImage *m_curImage; // Grayscale version of current image IplImage *m_grayImage; // Result from filtering With Sobel (X) IplImage *m_oriXImage; // Result from filtering With Sobel (Y) IplImage *m_oriYImage; // Multi-channel edges orientation map IplImage **m_edgesOri; // Multi-channel sum of edges orientation map IplImage **m_edgesOriSum; FD::ObjectRef m_edgesOriSumRef; // Pixel/value pointers float **m_edgesOriPix; double **m_edgesOriSumPix; }; } #endif --- NEW FILE: VisualFeatureDesc.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _VISUALFEATUREDESC_H_ #define _VISUALFEATUREDESC_H_ #include "Object.h" #include <iostream> namespace RobotFlow { typedef enum { e_VISUALDESCRIPTOR_histogram = 0, e_VISUALDESCRIPTOR_integral, e_VISUALDESCRIPTOR_unknown } e_VISUALDESCRIPTOR_type; // // This should be an abstract base class. Since many containers (like Vector) // do not allow abstract functions, each abstract function throws an exception // to avoid the direct use of the base class "abstract" routines. 
// template <class FeatBaseType> class VisualFeatureDesc : public FD::Object { friend std::ostream &operator<<(std::ostream &o_out, const VisualFeatureDesc<FeatBaseType> &i_ref) { try { i_ref.printOn(o_out); // Enable cascading return o_out; } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception in VisualFeatureDesc::operator<<:",__FILE__,__LINE__)); } } friend std::istream &operator>>(std::istream &i_in, VisualFeatureDesc<FeatBaseType> &o_ref) { try { o_ref.readFrom(i_in); // Enable cascading return i_in; } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception in VisualFeatureDesc::operator>>:",__FILE__,__LINE__)); } } public: VisualFeatureDesc() : m_descType(e_VISUALDESCRIPTOR_unknown) { } VisualFeatureDesc(e_VISUALDESCRIPTOR_type i_descType) : m_descType(i_descType) { } VisualFeatureDesc(const VisualFeatureDesc<FeatBaseType> &i_ref) { m_descType = i_ref.m_descType; } virtual ~VisualFeatureDesc() { } virtual VisualFeatureDesc<FeatBaseType> & operator =(const VisualFeatureDesc<FeatBaseType> &i_ref) { // Avoid self assignment if (&i_ref != this) { this->m_descType = i_ref.m_descType; } return *this; } virtual VisualFeatureDesc<FeatBaseType>* clone() const { return new VisualFeatureDesc<FeatBaseType>(*this); } // Default routine to print a VisualFeatureDesc object to an output stream virtual void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in VisualFeatureDesc::printOn: cannot use base class routine.",__FILE__,__LINE__); } // Default routine to read a VisualFeatureDesc object from an input stream virtual void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in VisualFeatureDesc::readFrom: cannot use base class routine.",__FILE__,__LINE__); } virtual double Similarity(const FeatBaseType *i_candidate, unsigned int i_size) const { throw new FD::GeneralException("Exception in VisualFeatureDesc::Similarity: cannot use base class routine.",__FILE__,__LINE__); } 
virtual void Adapt(const FeatBaseType *i_candidate, unsigned int i_size, double i_rate) { throw new FD::GeneralException("Exception in VisualFeatureDesc::Adapt: cannot use base class routine.",__FILE__,__LINE__); } virtual unsigned int GetSize() const { throw new FD::GeneralException("Exception in VisualFeatureDesc::GetSize: cannot use base class routine.",__FILE__,__LINE__); } virtual FeatBaseType *GetFeatures() { throw new FD::GeneralException("Exception in VisualFeatureDesc::GetFeatures: cannot use base class routine.",__FILE__,__LINE__); } virtual const FeatBaseType *GetCstFeatures() const { throw new FD::GeneralException("Exception in VisualFeatureDesc::GetCstFeatures: cannot use base class routine.",__FILE__,__LINE__); } virtual bool GetValidity() const { throw new FD::GeneralException("Exception in VisualFeatureDesc::GetValidity: cannot use base class routine.",__FILE__,__LINE__); } e_VISUALDESCRIPTOR_type GetType() const { return m_descType; } void SetType(e_VISUALDESCRIPTOR_type i_type) { m_descType = i_type; } virtual void SetSize(unsigned int i_size) { throw new FD::GeneralException("Exception in VisualFeatureDesc::SetSize: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetFeatures(const FeatBaseType *i_ref, unsigned int i_size) { throw new FD::GeneralException("Exception in VisualFeatureDesc::SetFeatures: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetValidity(bool i_flag) { throw new FD::GeneralException("Exception in VisualFeatureDesc::SetValidity: cannot use base class routine.",__FILE__,__LINE__); } private: e_VISUALDESCRIPTOR_type m_descType; }; } #endif --- NEW FILE: Makefile.am --- AUTOMAKE_OPTIONS = no-dependencies include_HEADERS = VisualFeaturesExtraction.h \ ColorHistExtraction.h \ IntegralColorExtraction.h \ IntegralEdgesOriExtraction.h \ IntegralLBPExtraction.h \ MeanShiftTracker.h \ PFGenericParticle.h \ PFGenericParticleFilter.h \ PFParticle.h \ PFParticleFilter.h \ PFPMRandomWalk.h \ 
PFPredictionModel.h \
	PFUtilityFct.h \
	VisualFeatureDesc.h \
	VisualHistogramDesc.h \
	VisualIntegralDesc.h \
	VisualROI.h \
	VisualTarget.h \
	VisualTargetManager.h \
	VisualTracker.h

--- NEW FILE: VisualTarget.h ---
/* Copyright (C) 2005 Pierre Moisan (Pie...@US...)

This library is free software; you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as published
by the Free Software Foundation; either version 2.1 of the License, or
(at your option) any later version.

This library is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation,
Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/

#ifndef _VISUALTARGET_H_
#define _VISUALTARGET_H_

#include "BufferedNode.h"
#include <iostream>
#include "Image.h"
#include "cv.h"
#include "Vector.h"
#include "VisualFeatureDesc.h"
#include "VisualROI.h"

namespace RobotFlow {

template <class FeatBaseType>
class VisualTarget : public FD::Object
{
public:
	VisualTarget()
	: m_valid(false), m_id(0), m_activeAge(0), m_passiveAge(0),
	m_curDescIdx(0), m_refROI(NULL), m_numDesc(-1), m_targetDesc(NULL),
	m_cueWeights(NULL), m_tmpCueProb(NULL)
	{
	}

	VisualTarget(int i_id, VisualROI *i_roi,
		FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_targetDesc)
	: m_valid(true), m_id(i_id), m_activeAge(0), m_passiveAge(0),
	m_curDescIdx(0), m_targetDesc(NULL), m_cueWeights(NULL), m_tmpCueProb(NULL)
	{
		m_refROI = FD::RCPtr<VisualROI>(new VisualROI(*i_roi));
		m_numDesc = i_targetDesc->size();
		m_targetDesc = new FD::Vector<VisualFeatureDesc<FeatBaseType> *>(m_numDesc);
		for (int i=0; i<m_numDesc; i++) {
			// Use clone() to preserve the derived descriptor type;
			// copying into the base class would slice the object
			(*m_targetDesc)[i] = (*i_targetDesc)[i]->clone();
		}
		m_cueWeights = new
double[m_numDesc]; m_tmpCueProb = new double[m_numDesc]; } VisualTarget(const VisualTarget<FeatBaseType> & i_ref) :m_id(-1), m_activeAge(-1), m_passiveAge(-1), m_curDescIdx(0), m_numDesc(-1), m_targetDesc(NULL), m_cueWeights(NULL), m_tmpCueProb(NULL) { try { m_valid = i_ref.m_valid; m_id = i_ref.m_id; m_activeAge = i_ref.m_activeAge; m_passiveAge = i_ref.m_passiveAge; m_curDescIdx = i_ref.m_curDescIdx; if (!i_ref.m_refROI.isNil()) { if (!this->m_refROI.isNil()) { *(this->m_refROI) = *(i_ref.m_refROI); } else { this->m_refROI = FD::RCPtr<VisualROI>(new VisualROI(*(i_ref.m_refROI))); } } else { std::cout << "VisualTarget::VisualTarget: reference to copy has NULL ROI" << std::endl; } if (i_ref.m_targetDesc) { SetDescriptorsVec(i_ref.m_targetDesc); } else { this->m_targetDesc = NULL; this->m_numDesc = 0; this->m_cueWeights = NULL; this->m_tmpCueProb = NULL; } for (int i=0; i<m_numDesc; i++) { m_cueWeights[i] = i_ref.m_cueWeights[i]; m_tmpCueProb[i] = i_ref.m_tmpCueProb[i]; } } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception caught in VisualTarget::VisualTarget:",__FILE__,__LINE__)); } } ~VisualTarget() { for (int i=0; i<m_numDesc; i++) { delete (*m_targetDesc)[i]; } delete m_targetDesc; delete [] m_cueWeights; delete [] m_tmpCueProb; } VisualTarget<FeatBaseType> & operator =(VisualTarget<FeatBaseType> &i_ref) { try { // Avoid self assignment if (&i_ref != this) { this->m_valid = i_ref.m_valid; this->m_id = i_ref.m_id; this->m_activeAge = i_ref.m_activeAge; this->m_passiveAge = i_ref.m_passiveAge; this->m_curDescIdx = i_ref.m_curDescIdx; if (!i_ref.m_refROI.isNil()) { if (!this->m_refROI.isNil()) { *(this->m_refROI) = *(i_ref.m_refROI); } else { this->m_refROI = FD::RCPtr<VisualROI>(new VisualROI(*(i_ref.m_refROI))); } } else { std::cout << "VisualTarget::VisualTarget: reference to copy has NULL ROI" << std::endl; } if (i_ref.m_targetDesc) { SetDescriptorsVec(i_ref.m_targetDesc); } else { this->m_targetDesc = NULL; this->m_cueWeights = 
NULL; this->m_tmpCueProb = NULL; this->m_numDesc = 0; } for (int i=0; i<m_numDesc; i++) { m_cueWeights[i] = i_ref.m_cueWeights[i]; m_tmpCueProb[i] = i_ref.m_tmpCueProb[i]; } } return *this; } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception caught in VisualTarget::operator=:",__FILE__,__LINE__)); } } // Default routine to print a VisualTarget object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in VisualTarget::printOn: method not yet implemented.",__FILE__,__LINE__); } // Default routine to read a VisualTarget object from an input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in VisualTarget::readFrom: method not yet implemented.",__FILE__,__LINE__); } void Adapt(FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_desc, double i_rate) { if (m_numDesc != i_desc->size()) { throw new FD::GeneralException("Exception in VisualTarget::Adapt: input descriptor vector size differs from current object's vector.",__FILE__,__LINE__); } for (int i=0; i<m_numDesc; i++) { if ((*m_targetDesc)[i]->GetType() != (*i_desc)[i]->GetType()) { throw new FD::GeneralException("Exception in VisualTarget::Adapt: features descriptor must have the same type.",__FILE__,__LINE__); } (*m_targetDesc)[i]->Adapt((*i_desc)[i]->GetCstFeatures(), (*i_desc)[i]->GetSize(), i_rate); } } void Adapt(FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_desc, double *i_rate) { if (m_numDesc != i_desc->size()) { throw new FD::GeneralException("Exception in VisualTarget::Adapt: input descriptor vector size differs from current object's vector.",__FILE__,__LINE__); } for (int i=0; i<m_numDesc; i++) { if ((*m_targetDesc)[i]->GetType() != (*i_desc)[i]->GetType()) { throw new FD::GeneralException("Exception in VisualTarget::Adapt: features descriptor must have the same type.",__FILE__,__LINE__); } (*m_targetDesc)[i]->Adapt((*i_desc)[i]->GetCstFeatures(), (*i_desc)[i]->GetSize(), i_rate[i]); } } 
double Similarity(FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_desc) { if (m_numDesc != i_desc->size()) { throw new FD::GeneralException("Exception in VisualTarget::Similarity: input descriptor vector size differs from current object's vector.",__FILE__,__LINE__); } double sim = 0.0; for (int i=0; i<m_numDesc; i++) { if ((*m_targetDesc)[i]->GetType() != (*i_desc)[i]->GetType()) { throw new FD::GeneralException("Exception in VisualTarget::Similarity: features descriptor must have the same type.",__FILE__,__LINE__); } sim += m_cueWeights[i]*(*m_targetDesc)[i]->Similarity((*i_desc)[i]->GetCstFeatures(), (*i_desc)[i]->GetSize()); } return sim; } double SimilarityWCueAdapt(FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_desc, double i_rate) { if (m_numDesc != i_desc->size()) { throw new FD::GeneralException("Exception in VisualTarget::SimilarityWCueAdapt: input descriptor vector size differs from current object's vector.",__FILE__,__LINE__); } int i; double sim = 0.0, cueSim, sumCueSim = 0.0; for (i=0; i<m_numDesc; i++) { if ((*m_targetDesc)[i]->GetType() != (*i_desc)[i]->GetType()) { throw new FD::GeneralException("Exception in VisualTarget::SimilarityWCueAdapt: features descriptor must have the same type.",__FILE__,__LINE__); } m_tmpCueProb[i] = (*m_targetDesc)[i]->Similarity((*i_desc)[i]->GetCstFeatures(), (*i_desc)[i]->GetSize()); sumCueSim += m_tmpCueProb[i]; sim += m_cueWeights[i]*m_tmpCueProb[i]; std::cout << "Cue Weights [" << i << "]=" << m_cueWeights[i] << " Current Cue Prob=" << m_tmpCueProb[i] << std::endl; } if (sumCueSim == 0.0) { std::cerr << "WARNING: VisualTarget::SimilarityWCueAdapt: cannot adapt cue weights with a total probability of 0.0." 
<< std::endl;
			return sim;
		}
		// Adapt cue weight
		double rateInv = 1.0 - i_rate;
		double sumCueSimInv = 1.0/sumCueSim;
		sumCueSim = 0.0;
		for (i=0; i<m_numDesc; i++) {
			m_cueWeights[i] = i_rate*m_tmpCueProb[i]*sumCueSimInv + rateInv*m_cueWeights[i];
			sumCueSim += m_cueWeights[i];
		}
		// Normalize weights
		if (sumCueSim == 0.0) {
			throw new FD::GeneralException("Exception in VisualTarget::SimilarityWCueAdapt: cannot have a target with all cue weights equal to zero.",__FILE__,__LINE__);
		}
		sumCueSimInv = 1.0/sumCueSim;
		for (i=0; i<m_numDesc; i++) {
			m_cueWeights[i] *= sumCueSimInv;
		}
		return sim;
	}

	void AgeTarget(bool i_matched)
	{
		if (i_matched) {
			if (m_activeAge < 15) {
				m_activeAge++;
			}
			if (m_passiveAge > 0) {
				m_passiveAge--;
			}
		}
		else {
			if (m_passiveAge < 15) {
				m_passiveAge++;
			}
			if (m_activeAge > 0) {
				m_activeAge--;
			}
		}
	}

	bool IsValid() const { return m_valid; }
	int GetID() const { return m_id; }
	int GetActiveAge() const { return m_activeAge; }
	int GetPassiveAge() const { return m_passiveAge; }
	int GetCurrentAge() const { return m_activeAge - m_passiveAge; }
	VisualROI *GetROI() { return m_refROI.get(); }
	const VisualROI *GetCstROI() const { return (const VisualROI *)(m_refROI.get()); }
	FD::RCPtr<VisualROI> GetROIRCPtr() { return m_refROI; }
	int GetNumDescriptors() const { return m_numDesc; }
	int GetCurDescIdx() const { return m_curDescIdx; }
	FD::Vector<VisualFeatureDesc<FeatBaseType> *> *GetDescriptorsVec() { return m_targetDesc; }
	const FD::Vector<VisualFeatureDesc<FeatBaseType> *> *GetCstDescriptorsVec() const { return (const FD::Vector<VisualFeatureDesc<FeatBaseType> *> *)m_targetDesc; }

	VisualFeatureDesc<FeatBaseType> *GetDescriptor(int i_idx)
	{
		if (i_idx >= m_numDesc) {
			throw new FD::GeneralException("Exception in VisualTarget::GetDescriptor: descriptor index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__);
		}
		return (*m_targetDesc)[i_idx];
	}

	const VisualFeatureDesc<FeatBaseType> *GetCstDescriptor(int i_idx) const
	{
		if (i_idx >= m_numDesc)
{ throw new FD::GeneralException("Exception in VisualTarget::GetCstDescriptor: descriptor index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__); } return (const VisualFeatureDesc<FeatBaseType> *)((*m_targetDesc)[i_idx]); } double *GetCueWeights() { return m_cueWeights; } const double *GetCueWeights() const { return (const double *)m_cueWeights; } double GetCueWeight(int i_idx) const { if (i_idx >= m_numDesc) { throw new FD::GeneralException("Exception in VisualTarget::GetCueWeight: index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__); } return m_cueWeights[i_idx]; } void SetValidity(bool i_valid) { m_valid = i_valid; } void SetID(int i_id) { m_id = i_id; } void InitAges() { m_activeAge = 0; m_passiveAge = 0; } void SetActiveAge(int i_age) { m_activeAge = i_age; } void SetPassiveAge(int i_age) { m_passiveAge = i_age; } void SetROI(VisualROI *i_roi) { if (!m_refROI.isNil()) { *m_refROI = *i_roi; } else { m_refROI = FD::RCPtr<VisualROI>(new VisualROI(*i_roi)); } } // Requires to set descriptors afterwards void SetNumDescriptors(int i_numDesc) { if (m_targetDesc) { // First deallocate memory for (int i=0; i<m_numDesc; i++) { delete (*m_targetDesc)[i]; } } if (m_numDesc != i_numDesc) { delete m_targetDesc; delete [] m_cueWeights; delete [] m_tmpCueProb; // Reset vector size m_numDesc = i_numDesc; m_targetDesc = new FD::Vector<VisualFeatureDesc<FeatBaseType> *>(m_numDesc); m_cueWeights = new double[m_numDesc]; m_tmpCueProb = new double[m_numDesc]; } } void SetCurDescIdx(int i_idx) { if (i_idx >= m_numDesc) { throw new FD::GeneralException("Exception in VisualTarget::SetCurDescIdx: descriptor index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__); } m_curDescIdx = i_idx; } void SetDescriptorsVec(FD::Vector<VisualFeatureDesc<FeatBaseType> *> *i_descVec) { // Reset the number of descriptors if (i_descVec->size() > 0 ) { 
this->SetNumDescriptors(i_descVec->size()); for (int i=0; i<m_numDesc; i++) { (*m_targetDesc)[i] = (*i_descVec)[i]->clone(); } } else { throw new FD::GeneralException("Exception in VisualTarget::SetDescriptorsVec: input feature vector must have at least one element.",__FILE__,__LINE__); } } void SetDescriptor(VisualFeatureDesc<FeatBaseType> *i_desc, int i_idx) { if (i_idx >= m_numDesc) { throw new FD::GeneralException("Exception in VisualTarget::SetDescriptor: descriptor index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__); } *((*m_targetDesc)[i_idx]) = *i_desc; } void InitCueWeights() { double initWeight = 1.0/(double)m_numDesc; for (int i=0; i<m_numDesc; i++) { m_cueWeights[i] = initWeight; } } void SetCueWeights(double *i_weight) { for (int i=0; i<m_numDesc; i++) { m_cueWeights[i] = i_weight[i]; } } void SetCueWeight(double i_weight, int i_idx) { if (i_idx >= m_numDesc) { throw new FD::GeneralException("Exception in VisualTarget::SetCueWeight: index is greater than the actual number of descriptors in current vector.",__FILE__,__LINE__); } m_cueWeights[i_idx] = i_weight; } private: bool m_valid; int m_id; int m_activeAge; int m_passiveAge; FD::RCPtr<VisualROI> m_refROI; int m_numDesc; int m_curDescIdx; FD::Vector<VisualFeatureDesc<FeatBaseType> *> *m_targetDesc; double *m_cueWeights; double *m_tmpCueProb; }; } #endif --- NEW FILE: IntegralLBPExtraction.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. 
You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _INTEGRALLBPEXTRACTION_H_ #define _INTEGRALLBPEXTRACTION_H_ #include "VisualFeaturesExtraction.h" #include "VisualIntegralDesc.h" #include "VisualTarget.h" #include <stdlib.h> #include <sys/timeb.h> namespace RobotFlow { // // Integral LBP (Local Binary Pattern) Features Extraction for RobotFlow // class IntegralLBPExtraction : public VisualFeaturesExtraction<double> { public: // // Default constructor for Object // IntegralLBPExtraction(); // // Constructor with complete initialisation // IntegralLBPExtraction(int i_width, int i_height, int i_numChannels, int i_numHoriIntRect, int i_numVertIntRect, int i_numSamples, int i_predicate, bool i_doInterpolation, bool i_useUniform, int i_startAngle, double i_maxValue, bool i_useRectDiff); // // BufferedNode constructor // IntegralLBPExtraction(std::string nodeName, FD::ParameterSet params); // // Constructor using input stream // IntegralLBPExtraction(std::istream &in) { readFrom(in); } virtual ~IntegralLBPExtraction(); // Default routine to print an IntegralLBPExtraction object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in IntegralLBPExtraction::printOn: method not yet implemented.",__FILE__,__LINE__); } // Default routine to read an IntegralLBPExtraction object from an input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in IntegralLBPExtraction::readFrom: method not yet implemented.",__FILE__,__LINE__); } virtual void request(int output_id, const FD::ParameterSet &req); void calculate(int output_id, int count, FD::Buffer &out); void Preprocess(IplImage *i_src); void ExtractFeatures(VisualROI *i_roi); void ExtractFeatures(IplImage *i_input, VisualROI *i_roi); VisualFeatureDesc<double> *GetDescriptor() { return
(*m_featVect)[0]; } const VisualFeatureDesc<double> *GetCstDescriptor() const { return (const VisualIntegralDesc<double> *)(*m_featVect)[0]; } private: void Initialize(); void InitSamplePoints(); void ExtractLBPGeneralWithInterpolation(unsigned char *i_pixPtr); void ExtractLBPGeneralWithoutInterpolation(unsigned char *i_pixPtr); void ExtractLBPGeneralRIU2WithInterpolation(unsigned char *i_pixPtr); void ExtractLBPGeneralRIU2WithoutInterpolation(unsigned char *i_pixPtr); void ExtractLBP8WithInterpolation(unsigned char *i_pixPtr); void ExtractLBP8WithoutInterpolation(unsigned char *i_pixPtr); void ExtractLBP8RIU2WithInterpolation(unsigned char *i_pixPtr); void ExtractLBP8RIU2WithoutInterpolation(unsigned char *i_pixPtr); int ComputeBitTransitions(unsigned int i_val); int CountOneBits(unsigned int i_val); private: // Input IDs (for BufferedNode) int m_imageInID; int m_roiInID; int m_useNextImgInID; // Output IDs (for BufferedNode) int m_featuresOutID; int m_ppCompletedOutID; // Width of images int m_width; int m_imgSumWidth; // Height of images int m_height; // Number of channels in an image int m_numChannels; // Number of pixels in an image int m_numPixels; // Number of bytes in an image int m_numBytesInFrame; // Number of independent rectangular regions to compute // the integral color features int m_numHoriIntRect; int m_numVertIntRect; int m_numIntRect; // The number of samples in the local neighborhood. int m_numSamples; // The current predicate (radius), i.e. the distance of the // neighborhood from its center. int m_predicate; // Interpolation flag bool m_doInterpolation; // Flag to use only uniform patterns i.e. patterns with 2 or fewer // bit transitions bool m_useUniform; // The angle of the first neighbor.
int m_startAngle; // Maximum pixel channel value double m_maxValue; // Number of valid local binary patterns int m_numValidPattern; double m_maxFeatValue; bool m_useRectDiff; double *m_tmpMeanFeatures; double *m_curMeanVal; // Precalculated table of interpolation points. CvPoint *m_samplePoints; // Precalculated table of interpolation offsets. CvPoint2D32f *m_pointsOffsets; // Precalculated values for interpolation multiplication. double *m_BiLiMultipliers; // Temporary pixel pointers corresponding to the neighborhood samples unsigned char **m_tmpSamples; // Integral color descriptor for region of interest FD::RCPtr<FD::Vector<VisualFeatureDesc<double> *> > m_featVect; // Temporary image copy IplImage *m_curImage; // Grayscale version of current image IplImage *m_grayImage; // Each local pattern is an image IplImage **m_patternImage; // Pointer to channel image pixels unsigned char **m_patternPixPtr; int **m_sumPixPtr; // Sum of pixels (integral) image IplImage **m_sumImage; // Function pointer to the appropriate private extraction routine void (IntegralLBPExtraction::*m_extractionFct)(unsigned char *); }; } #endif --- NEW FILE: MeanShiftTracker.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. 
You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _MEANSHIFTTRACKER_H_ #define _MEANSHIFTTRACKER_H_ #include "VisualTracker.h" namespace RobotFlow { class MeanShiftTracker : public VisualTracker { public: MeanShiftTracker(); MeanShiftTracker(int i_maxNumMSIter, double i_minMSDistEpsilon); MeanShiftTracker(std::string nodeName, FD::ParameterSet params); virtual ~MeanShiftTracker(); // Default routine to print a MeanShiftTracker object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in MeanShiftTracker::printOn: method not yet implemented.",__FILE__,__LINE__); } // Default routine to read a MeanShiftTracker object from an input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in MeanShiftTracker::readFrom: method not yet implemented.",__FILE__,__LINE__); } virtual void request(int output_id, const FD::ParameterSet &req); void calculate(int output_id, int count, FD::Buffer &out); private: // Input IDs (for BufferedNode) int m_imageInID; int m_targetInID; int m_msLocVecInID; int m_ppCompletedInID; // Output IDs (for BufferedNode) int m_finishedOutID; int m_imageOutID; int m_curTargetOutID; int m_targetOutID; int m_width; int m_height; int m_numChannels; int m_numPixels; int m_numBytesInFrame; int m_numIter; int m_maxNumMSIter; double m_minMSDistEpsilon; bool m_finished; bool m_initMS; VisualTarget<double> *m_curTarget; // Temporary image copy IplImage *m_curImage; }; } #endif --- NEW FILE: PFParticleFilter.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _PFPARTICLEFILTER_H_ #define _PFPARTICLEFILTER_H_ #include "VisualTracker.h" #include "PFParticle.h" #include "PFUtilityFct.h" namespace RobotFlow { typedef enum { e_PF_GenericFilter = 0, e_PF_Unknown } e_PF_type; // // Abstract base class for particle filter // class PFParticleFilter : public VisualTracker { public: PFParticleFilter() : m_filterType(e_PF_Unknown) { } PFParticleFilter(e_PF_type i_modelType) : m_filterType(i_modelType) { } PFParticleFilter(e_PF_type i_modelType, std::string nodeName, FD::ParameterSet params) : VisualTracker(nodeName, params), m_filterType(i_modelType) { } virtual ~PFParticleFilter() { } // Default routine to print a PFParticleFilter object to an output stream void printOn(std::ostream &out) const = 0; // Default routine to read a PFParticleFilter object from an input stream void readFrom(std::istream &in) = 0; void calculate(int output_id, int count, FD::Buffer &out) = 0; e_PF_type GetType() const { return m_filterType; } void SetType(e_PF_type i_type) { m_filterType = i_type; } private: e_PF_type m_filterType; }; } #endif --- NEW FILE: VisualTargetManager.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. 
This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _VISUALTARGETMANAGER_H_ #define _VISUALTARGETMANAGER_H_ #include "BufferedNode.h" #include <iostream> #include "Image.h" #include "cv.h" #include "VisualTarget.h" #include <stdlib.h> #include <sys/timeb.h> namespace RobotFlow { class VisualTargetManager : public FD::BufferedNode { friend class FD::BufferedNode; public: VisualTargetManager(); VisualTargetManager(std::string nodeName, FD::ParameterSet params); virtual ~VisualTargetManager(); // Default routine to print a VisualTargetManager object to an output stream void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in VisualTargetManager::printOn: method not yet implemented.",__FILE__,__LINE__); } // Default routine to read a VisualTargetManager object from an input stream void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in VisualTargetManager::readFrom: method not yet implemented.",__FILE__,__LINE__); } virtual void request(int output_id, const FD::ParameterSet &req); void calculate(int output_id, int count, FD::Buffer &out); private: // Input IDs (for BufferedNode) int m_imageInID; int m_roiInID; int m_featVecInID; int m_targetInID; int m_ppCompletedInID; // Output IDs (for BufferedNode) int m_imageOutID; int m_roiOutID; int m_targetOutID; int m_targetProbOutID; int m_targetDXOutID; int m_targetDYOutID; int m_nameOutID; int m_width; int m_height; int m_numChannels; int m_numPixels; int m_numBytesInFrame; float m_imgXCen; float m_imgYCen; unsigned char m_roiColor[3]; float m_lSigma; struct timeb m_t1, m_t2; int
m_maxNumTargets; double m_targetMatchThres; double m_targetAdaptThres; double m_targetAdaptRate; double m_cueAdaptRate; bool m_ppCompleted; FD::RCPtr<VisualTarget<double> > m_refTarget; //RCPtr<VisualROI> m_refROI; // Temporary image copy IplImage *m_curImage; }; } #endif --- NEW FILE: PFParticle.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _PFPARTICLE_H_ #define _PFPARTICLE_H_ #include "Object.h" #include <iostream> namespace RobotFlow { typedef enum { e_PFP_Generic = 0, e_PFP_Unknown } e_PFP_type; // // This should be an abstract base class. Since many containers (like Vector) // do not allow abstract functions, each abstract function throws an exception // to avoid the direct use of the base class "abstract" routines. 
// class PFParticle : public FD::Object { friend std::ostream &operator<<(std::ostream &o_out, const PFParticle &i_ref) { try { i_ref.printOn(o_out); // Enable cascading return o_out; } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception in PFParticle::operator<<:",__FILE__,__LINE__)); } } friend std::istream &operator>>(std::istream &i_in, PFParticle &o_ref) { try { o_ref.readFrom(i_in); // Enable cascading return i_in; } catch (FD::BaseException *e) { throw e->add(new FD::GeneralException("Exception in PFParticle::operator>>:",__FILE__,__LINE__)); } } public: PFParticle() : m_pfpType(e_PFP_Unknown) { } PFParticle(e_PFP_type i_pfpType) : m_pfpType(i_pfpType) { } PFParticle(const PFParticle &i_ref) { m_pfpType = i_ref.m_pfpType; } virtual ~PFParticle() { } virtual PFParticle & operator =(const PFParticle &i_ref) { // Avoid self assignment if (&i_ref != this) { this->m_pfpType = i_ref.m_pfpType; } return *this; } virtual PFParticle* clone() const { return new PFParticle(*this); } // Default routine to print a PFParticle object to an output stream virtual void printOn(std::ostream &out) const { throw new FD::GeneralException("Exception in PFParticle::printOn: cannot use base class routine.",__FILE__,__LINE__); } // Default routine to read a PFParticle object from an input stream virtual void readFrom(std::istream &in) { throw new FD::GeneralException("Exception in PFParticle::readFrom: cannot use base class routine.",__FILE__,__LINE__); } e_PFP_type GetType() const { return m_pfpType; } void SetType(e_PFP_type i_type) { m_pfpType = i_type; } virtual unsigned int GetStateSize() const { throw new FD::GeneralException("Exception in PFParticle::GetStateSize: cannot use base class routine.",__FILE__,__LINE__); } virtual float *GetState() { throw new FD::GeneralException("Exception in PFParticle::GetState: cannot use base class routine.",__FILE__,__LINE__); } virtual const float *GetCstState() const { throw new FD::GeneralException("Exception in 
PFParticle::GetCstState: cannot use base class routine.",__FILE__,__LINE__); } virtual float GetStateIdx(int i_idx) const { throw new FD::GeneralException("Exception in PFParticle::GetStateIdx: cannot use base class routine.",__FILE__,__LINE__); } virtual float GetWeight() const { throw new FD::GeneralException("Exception in PFParticle::GetWeight: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetStateSize(unsigned int i_size) { throw new FD::GeneralException("Exception in PFParticle::SetStateSize: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetState(const float *i_state) { throw new FD::GeneralException("Exception in PFParticle::SetState: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetStateIdx(int i_idx, float i_val) { throw new FD::GeneralException("Exception in PFParticle::SetStateIdx: cannot use base class routine.",__FILE__,__LINE__); } virtual void SetWeight(float i_weight) { throw new FD::GeneralException("Exception in PFParticle::SetWeight: cannot use base class routine.",__FILE__,__LINE__); } private: e_PFP_type m_pfpType; }; } #endif --- NEW FILE: VisualTracker.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. 
You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _VISUALTRACKER_H_ #define _VISUALTRACKER_H_ #include "BufferedNode.h" #include <iostream> #include "Image.h" #include "cv.h" #include "VisualFeatureDesc.h" #include "VisualROI.h" #include "VisualTarget.h" namespace RobotFlow { class VisualTracker : public FD::BufferedNode { public: VisualTracker() { } VisualTracker(std::string nodeName, FD::ParameterSet params) : BufferedNode(nodeName, params) { } virtual ~VisualTracker() { } // Default routine to print a VisualTracker object to an output stream virtual void printOn(std::ostream &out) const = 0; // Default routine to read a VisualTracker object from an input stream virtual void readFrom(std::istream &in) = 0; virtual void calculate(int output_id, int count, FD::Buffer &out) = 0; //virtual void TrackTarget() = 0; }; } #endif --- NEW FILE: ColorHistExtraction.h --- /* Copyright (C) 2005 Pierre Moisan (Pie...@US...) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #ifndef _COLORHISTEXTRACTION_H_ #define _COLORHISTEXTRACTION_H_ #include "VisualFeaturesExtraction.h" #include "VisualHistogramDesc.h" #include "Vi... [truncated message content] |
From: Dominic L. <ma...@us...> - 2005-07-28 14:59:54
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV/n-files In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv6226/OpenCV/n-files Added Files: Makefile.am Log Message: moved from Vision directory --- NEW FILE: Makefile.am --- #AUTOMAKE_OPTIONS = no-dependencies toolboxdir = $(prefix) toolbox_DATA = EXTRA_DIST = $(toolbox_DATA) |
From: Dominic L. <ma...@us...> - 2005-07-28 14:59:54
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv6226/OpenCV Added Files: Makefile.am Log Message: moved from Vision directory --- NEW FILE: Makefile.am --- ## Process this file with automake to produce Makefile.in. -*-Makefile-*- # Disable automatic dependency tracking if using other tools than gcc and gmake #AUTOMAKE_OPTIONS = no-dependencies SUBDIRS = src include n-files |
From: Dominic L. <ma...@us...> - 2005-07-28 14:39:18
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV/n-files In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv879/n-files Log Message: Directory /cvsroot/robotflow/RobotFlow/OpenCV/n-files added to the repository |
From: Dominic L. <ma...@us...> - 2005-07-28 14:39:18
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv879/src Log Message: Directory /cvsroot/robotflow/RobotFlow/OpenCV/src added to the repository |
From: Dominic L. <ma...@us...> - 2005-07-28 14:39:17
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV/include In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv879/include Log Message: Directory /cvsroot/robotflow/RobotFlow/OpenCV/include added to the repository |
From: Dominic L. <ma...@us...> - 2005-07-28 14:38:56
|
Update of /cvsroot/robotflow/RobotFlow/OpenCV In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv724/OpenCV Log Message: Directory /cvsroot/robotflow/RobotFlow/OpenCV added to the repository |
From: Dominic L. <ma...@us...> - 2005-07-28 14:01:49
|
Update of /cvsroot/robotflow/RobotFlow/Vision/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv24114/src Modified Files: VisualHistogramDesc.cc Log Message: fixed static member definition Index: VisualHistogramDesc.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/src/VisualHistogramDesc.cc,v retrieving revision 1.4 retrieving revision 1.5 diff -C2 -d -r1.4 -r1.5 *** VisualHistogramDesc.cc 12 Apr 2005 19:58:10 -0000 1.4 --- VisualHistogramDesc.cc 28 Jul 2005 14:01:39 -0000 1.5 *************** *** 31,33 **** --- 31,38 ---- ("<VisualHistogramDesc<double,unsigned int>>", new ObjectFactory<VisualHistogramDesc<double, unsigned int> >("<VisualHistogramDesc<double,unsigned int>>")); + template<> + const double VisualHistogramDesc<double, unsigned char>::k_VISUALHIST_2DIVPI = 2.0/3.14159265358979323846; + + template<> + const double VisualHistogramDesc<double, unsigned int>::k_VISUALHIST_2DIVPI = 2.0/3.14159265358979323846; }//namespace RobotFlow |
From: Dominic L. <ma...@us...> - 2005-07-28 14:01:48
|
Update of /cvsroot/robotflow/RobotFlow/Vision/include In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv24114/include Modified Files: VisualHistogramDesc.h Log Message: fixed static member definition Index: VisualHistogramDesc.h =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/include/VisualHistogramDesc.h,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** VisualHistogramDesc.h 2 Jun 2005 16:41:59 -0000 1.6 --- VisualHistogramDesc.h 28 Jul 2005 14:01:39 -0000 1.7 *************** *** 52,56 **** { public: ! static const double k_VISUALHIST_2DIVPI = 2.0/3.14159265358979323846; VisualHistogramDesc() --- 52,56 ---- { public: ! static const double k_VISUALHIST_2DIVPI; VisualHistogramDesc() |
From: Dominic L. <ma...@us...> - 2005-07-28 13:32:35
|
Update of /cvsroot/robotflow/RobotFlow/Devices/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv17642/src Modified Files: Makefile.am SNCRZ30.cc Log Message: fixed includes Index: Makefile.am =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Devices/src/Makefile.am,v retrieving revision 1.13 retrieving revision 1.14 diff -C2 -d -r1.13 -r1.14 *** Makefile.am 25 Apr 2005 19:51:13 -0000 1.13 --- Makefile.am 28 Jul 2005 13:32:25 -0000 1.14 *************** *** 15,18 **** --- 15,19 ---- SNCRZ30.cc \ SNCRZ30RS232.cc \ + SNCRZ30TCP.cc \ IMU400CC_200.cc Index: SNCRZ30.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Devices/src/SNCRZ30.cc,v retrieving revision 1.23 retrieving revision 1.24 diff -C2 -d -r1.23 -r1.24 *** SNCRZ30.cc 4 May 2005 19:22:36 -0000 1.23 --- SNCRZ30.cc 28 Jul 2005 13:32:25 -0000 1.24 *************** *** 679,682 **** --- 679,683 ---- if (jpeg_start_marker_pos != string::npos && jpeg_end_marker_pos != string::npos) { + int jpeg_start_comment_pos = input_string.find(string(JPEG_COMMENT_START,2)); if (jpeg_start_comment_pos != string::npos) { *************** *** 692,695 **** --- 693,697 ---- } } + //cerr<<"input string size (before): "<<input_string.size()<<endl; |
From: Dominic L. <ma...@us...> - 2005-07-28 13:24:12
|
Update of /cvsroot/robotflow/RobotFlow/Vision/include In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv15789/include Modified Files: VisualTarget.h Log Message: fixed template Index: VisualTarget.h =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/include/VisualTarget.h,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** VisualTarget.h 2 Jun 2005 16:42:43 -0000 1.5 --- VisualTarget.h 28 Jul 2005 13:24:04 -0000 1.6 *************** *** 58,62 **** } ! VisualTarget::VisualTarget(const VisualTarget& i_ref) :m_id(-1), m_activeAge(-1), m_passiveAge(-1), m_curDescIdx(0), m_numDesc(-1), m_targetDesc(NULL), m_cueWeights(NULL), --- 58,62 ---- } ! VisualTarget(const VisualTarget<FeatBaseType> & i_ref) :m_id(-1), m_activeAge(-1), m_passiveAge(-1), m_curDescIdx(0), m_numDesc(-1), m_targetDesc(NULL), m_cueWeights(NULL), |
From: MagellanPro <mag...@us...> - 2005-07-10 23:05:47
|
Update of /cvsroot/robotflow/RobotFlow/Vision/n-files In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv21176 Modified Files: ChallengeVision.n Log Message: parameters exported Index: ChallengeVision.n =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/n-files/ChallengeVision.n,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** ChallengeVision.n 6 Jul 2005 20:57:34 -0000 1.8 --- ChallengeVision.n 10 Jul 2005 23:05:36 -0000 1.9 *************** *** 48,52 **** <Parameter name="MAX_NUM_TEXT_ROI" type="int" value="100" description="Maximum number of text region of interest to use."/> <Parameter name="MAX_STD" type="float" value="0" description="Low threshold used for edge searching."/> ! <Parameter name="MIN_DIST" type="float" value="60" description="High threshold used for edge searching."/> <Parameter name="HORIZONTAL_SYMMETRY_THRESHOLD" type="float" value="80.0" description="Horizontal edge orientation symmetry treshold. Text region should have lower values than threshold."/> <Parameter name="VERTICAL_SYMMETRY_THRESHOLD" type="float" value="80.0" description="Vertical edge orientation symmetry treshold. Text region should have lower values than threshold."/> --- 48,52 ---- <Parameter name="MAX_NUM_TEXT_ROI" type="int" value="100" description="Maximum number of text region of interest to use."/> <Parameter name="MAX_STD" type="float" value="0" description="Low threshold used for edge searching."/> ! <Parameter name="MIN_DIST" type="float" value="40" description="High threshold used for edge searching."/> <Parameter name="HORIZONTAL_SYMMETRY_THRESHOLD" type="float" value="80.0" description="Horizontal edge orientation symmetry treshold. Text region should have lower values than threshold."/> <Parameter name="VERTICAL_SYMMETRY_THRESHOLD" type="float" value="80.0" description="Vertical edge orientation symmetry treshold. 
Text region should have lower values than threshold."/> *************** *** 83,93 **** </Node> <Node name="node_READv2_1" type="READv2" x="1357.000000" y="581.000000"> ! <Parameter name="MIN_SYMBOL_HEIGHT" type="int" value="30" description="No description available"/> ! <Parameter name="MIN_SYMBOL_WIDTH" type="int" value="5" description="No description available"/> ! <Parameter name="MIN_DICT_SCORE" type="float" value="0.5" description="No description available"/> ! <Parameter name="MIN_DIGIT_SCORE" type="float" value="0.8" description="No description available"/> ! <Parameter name="DICT_ONLY" type="bool" value="true" description="No description available"/> ! <Parameter name="VERTICAL_TOLERANCE" type="float" value="0.8" description="No description available"/> ! <Parameter name="HORIZONTAL_TOLERANCE" type="float" value="0.8" description="No description available"/> </Node> <Node name="node_CAMERA_CTRL_1" type="CAMERA_CTRL" x="1908.000000" y="134.000000"/> --- 83,93 ---- </Node> <Node name="node_READv2_1" type="READv2" x="1357.000000" y="581.000000"> ! <Parameter name="MIN_SYMBOL_HEIGHT" type="subnet_param" value="MIN_SYMBOL_HEIGHT" description="No description available"/> ! <Parameter name="MIN_SYMBOL_WIDTH" type="subnet_param" value="MIN_SYMBOL_WIDTH" description="No description available"/> ! <Parameter name="MIN_DICT_SCORE" type="subnet_param" value="MIN_DICT_SCORE" description="No description available"/> ! <Parameter name="MIN_DIGIT_SCORE" type="subnet_param" value="MIN_DIGIT_SCORE" description="No description available"/> ! <Parameter name="DICT_ONLY" type="subnet_param" value="DICT_ONLY" description="No description available"/> ! <Parameter name="VERTICAL_TOLERANCE" type="subnet_param" value="VERTICAL_TOLERANCE" description="No description available"/> ! 
<Parameter name="HORIZONTAL_TOLERANCE" type="subnet_param" value="HORIZONTAL_TOLERANCE" description="No description available"/> </Node> <Node name="node_CAMERA_CTRL_1" type="CAMERA_CTRL" x="1908.000000" y="134.000000"/> *************** *** 146,149 **** --- 146,150 ---- <Node name="node_NOP_13" type="NOP" x="1513.000000" y="239.000000"/> <Node name="node_NOP_14" type="NOP" x="1513.000000" y="286.000000"/> + <Node name="node_NOP_3" type="NOP" x="937.000000" y="52.000000"/> <Link from="node_NOP_1" output="OUTPUT" to="node_CvFaceDetection_1" input="IN_IMAGE">17 122 117 122 117 167 438 167.5 </Link> <Link from="node_NOP_1" output="OUTPUT" to="node_MultiIntegralCuesPFTracker_1" input="IMAGE_IN">17 122 116 122 116 -82 839 -82.5 </Link> *************** *** 203,206 **** --- 204,208 ---- <Link from="node_NOP_13" output="OUTPUT" to="node_CAMERA_CTRL_1" input="CURRENT_TILT"/> <Link from="node_NOP_14" output="OUTPUT" to="node_CAMERA_CTRL_1" input="CURRENT_ZOOM"/> + <Link from="node_TextSignDetect_1" output="TEXT_ROI_IMG" to="node_NOP_3" input="INPUT"/> <NetInput name="NNET" node="node_READv2_1" terminal="NNET" object_type="any" description="No description available"/> <NetInput name="DICT" node="node_READv2_1" terminal="DICT" object_type="any" description="No description available"/> *************** *** 228,231 **** --- 230,234 ---- <NetInput name="CURRENT_TILT" node="node_NOP_13" terminal="INPUT" object_type="any" description="The input"/> <NetInput name="CURRENT_ZOOM" node="node_NOP_14" terminal="INPUT" object_type="any" description="The input"/> + <NetOutput name="GRAYSCALE_IMAGE_TEXT" node="node_NOP_3" terminal="OUTPUT" object_type="any" description="The output = The input"/> <Note x="0" y="0" visible="0" text="Created with FlowDesigner 0.8.2"/> </Network> *************** *** 500,502 **** --- 503,512 ---- <Parameter name="DEFAULT_ABS_PAN_COMMAND" type="int" value="0"/> <Parameter name="DEFAULT_ABS_TILT_COMMAND" type="int" value="-300"/> + <Parameter 
name="MIN_SYMBOL_HEIGHT" type="int" value="30"/> + <Parameter name="MIN_SYMBOL_WIDTH" type="int" value="5"/> + <Parameter name="MIN_DICT_SCORE" type="float" value="0.5"/> + <Parameter name="MIN_DIGIT_SCORE" type="float" value="0.8"/> + <Parameter name="DICT_ONLY" type="bool" value="true"/> + <Parameter name="VERTICAL_TOLERANCE" type="float" value="0.8"/> + <Parameter name="HORIZONTAL_TOLERANCE" type="float" value="0.8"/> </Document> |
From: MagellanPro <mag...@us...> - 2005-07-08 19:58:24
|
Update of /cvsroot/robotflow/RobotFlow/Devices/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv20978/Devices/src Modified Files: SNCRZ30RS232.cc Log Message: front or back insertion of command Index: SNCRZ30RS232.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Devices/src/SNCRZ30RS232.cc,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** SNCRZ30RS232.cc 1 Jul 2005 15:03:16 -0000 1.5 --- SNCRZ30RS232.cc 8 Jul 2005 19:58:14 -0000 1.6 *************** *** 308,312 **** m_zoom_position = message[2]<<12 | message[3]<<8 | message[4]<<4 | message[5]; ! //cerr<<"Zoom inquiry reply"<<endl; pthread_mutex_unlock(&m_ack_lock); --- 308,312 ---- m_zoom_position = message[2]<<12 | message[3]<<8 | message[4]<<4 | message[5]; ! cerr<<"Zoom inquiry reply"<<endl; pthread_mutex_unlock(&m_ack_lock); *************** *** 408,412 **** } ! void Add_message_to_send(string msg_to_send) { bool duplicate_item_found = false; --- 408,412 ---- } ! void Add_message_to_send(string msg_to_send, bool front = false) { bool duplicate_item_found = false; *************** *** 465,469 **** if(!(duplicate_item_found)){ // add the new item into the list ! m_sending.push_back(msg_to_send); // Unlock the mutex pthread_mutex_unlock(&m_list_lock); --- 465,474 ---- if(!(duplicate_item_found)){ // add the new item into the list ! if (front) { ! m_sending.push_front(msg_to_send); ! } ! else { ! m_sending.push_back(msg_to_send); ! } // Unlock the mutex pthread_mutex_unlock(&m_list_lock); *************** *** 487,491 **** message += 0xFF; ! Add_message_to_send(message); } --- 492,496 ---- message += 0xFF; ! Add_message_to_send(message,true); } *************** *** 499,503 **** message += 0xFF; ! Add_message_to_send(message); } --- 504,508 ---- message += 0xFF; ! Add_message_to_send(message,true); } *************** *** 511,515 **** message += 0xFF; ! 
Add_message_to_send(message); } //New MODIFICATION STOP HERE ############################################################################# --- 516,520 ---- message += 0xFF; ! Add_message_to_send(message,true); } //New MODIFICATION STOP HERE ############################################################################# *************** *** 552,556 **** // add it to the list ! Add_message_to_send(message); } --- 557,561 ---- // add it to the list ! Add_message_to_send(message,false); } *************** *** 595,599 **** // add it to the list ! Add_message_to_send(message); } --- 600,604 ---- // add it to the list ! Add_message_to_send(message,false); } *************** *** 627,631 **** ! Add_message_to_send(message); } --- 632,636 ---- ! Add_message_to_send(message,false); } *************** *** 643,647 **** ! Add_message_to_send(PanTiltMessage); } --- 648,652 ---- ! Add_message_to_send(PanTiltMessage,true); } *************** *** 658,662 **** ! Add_message_to_send(ZoomMessage); } --- 663,667 ---- ! Add_message_to_send(ZoomMessage,false); } *************** *** 670,674 **** //open serial port fd = Port_initialisation(m_serial_port.c_str(),BAUDRATE_38400); ! cerr << "SNCRZ30RS232 : serial port opened " << m_serial_port << endl; --- 675,679 ---- //open serial port fd = Port_initialisation(m_serial_port.c_str(),BAUDRATE_38400); ! cerr << "SNCRZ30RS232 : serial port opened " << m_serial_port << endl; *************** *** 869,873 **** // add it to the list dont worry be happy ! ! Add_message_to_send(AutoOFF); } --- 874,878 ---- // add it to the list dont worry be happy ! ! Add_message_to_send(AutoOFF,true); } |
From: Carle C. <car...@us...> - 2005-07-07 04:56:54
|
Update of /cvsroot/robotflow/RobotFlow/Behaviors/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv7030/Behaviors/src Modified Files: AvoidPat.cc Log Message: velocity correction Index: AvoidPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/AvoidPat.cc,v retrieving revision 1.13 retrieving revision 1.14 diff -C2 -d -r1.13 -r1.14 *** AvoidPat.cc 6 Jul 2005 19:34:15 -0000 1.13 --- AvoidPat.cc 7 Jul 2005 04:56:45 -0000 1.14 *************** *** 401,405 **** // Compute velocity and rotation ! velocity = sign*computeVel(position, min_margin); rotation = sign*computeRot(position, min_margin); } --- 401,405 ---- // Compute velocity and rotation ! velocity = computeVel(position, min_margin); rotation = sign*computeRot(position, min_margin); } |
From: Dominic L. <ma...@us...> - 2005-07-06 20:57:45
|
Update of /cvsroot/robotflow/RobotFlow/Vision/n-files In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv4727 Modified Files: ChallengeVision.n Log Message: tracking faster Index: ChallengeVision.n =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/n-files/ChallengeVision.n,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** ChallengeVision.n 1 Jul 2005 18:22:15 -0000 1.7 --- ChallengeVision.n 6 Jul 2005 20:57:34 -0000 1.8 *************** *** 248,252 **** </Node> <Node name="node_GenericPID_1" type="GenericPID" x="42.000000" y="-300.000000"> ! <Parameter name="P_GAIN" type="float" value="0.1" description="Proportionnal gain."/> <Parameter name="I_GAIN" type="float" value="0.0" description="Integral gain."/> <Parameter name="D_GAIN" type="float" value="0.0" description="Derivative gain."/> --- 248,252 ---- </Node> <Node name="node_GenericPID_1" type="GenericPID" x="42.000000" y="-300.000000"> ! <Parameter name="P_GAIN" type="float" value="0.2" description="Proportionnal gain."/> <Parameter name="I_GAIN" type="float" value="0.0" description="Integral gain."/> <Parameter name="D_GAIN" type="float" value="0.0" description="Derivative gain."/> *************** *** 263,267 **** </Node> <Node name="node_GenericPID_2" type="GenericPID" x="45.000000" y="-184.000000"> ! <Parameter name="P_GAIN" type="float" value="-0.1" description="Proportionnal gain."/> <Parameter name="I_GAIN" type="float" value="0.0" description="Integral gain."/> <Parameter name="D_GAIN" type="float" value="0.0" description="Derivative gain."/> --- 263,267 ---- </Node> <Node name="node_GenericPID_2" type="GenericPID" x="45.000000" y="-184.000000"> ! <Parameter name="P_GAIN" type="float" value="-0.2" description="Proportionnal gain."/> <Parameter name="I_GAIN" type="float" value="0.0" description="Integral gain."/> <Parameter name="D_GAIN" type="float" value="0.0" description="Derivative gain."/> |
From: Fernyqc <fe...@us...> - 2005-07-06 19:34:26
|
Update of /cvsroot/robotflow/RobotFlow/Behaviors/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv26147 Modified Files: AvoidPat.cc SafeVelocityPat.cc Log Message: - Minors corrections Index: SafeVelocityPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/SafeVelocityPat.cc,v retrieving revision 1.10 retrieving revision 1.11 diff -C2 -d -r1.10 -r1.11 *** SafeVelocityPat.cc 6 Jul 2005 09:21:55 -0000 1.10 --- SafeVelocityPat.cc 6 Jul 2005 19:34:15 -0000 1.11 *************** *** 194,204 **** float marginFront=1, marginBack=1; // Backward or forward motion ! if(rotation == 0){ ! marginBack = getMinMargin(securityMargin, 5, 7); ! marginFront = getMinMargin(securityMargin, 0, 1); ! if(marginFront > securityMargin[11]) marginFront = securityMargin[11]; ! if(signVelIn > 0) ! marginBack = 1; ! else marginFront = 1; } else if(signVelIn > 0){ --- 194,204 ---- float marginFront=1, marginBack=1; // Backward or forward motion ! if(rotation == 0){ ! marginBack = getMinMargin(securityMargin, 5, 7); ! marginFront = getMinMargin(securityMargin, 0, 1); ! if(marginFront > securityMargin[11]) marginFront = securityMargin[11]; ! if(signVelIn > 0) ! marginBack = 1; ! else marginFront = 1; } else if(signVelIn > 0){ Index: AvoidPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/AvoidPat.cc,v retrieving revision 1.12 retrieving revision 1.13 diff -C2 -d -r1.12 -r1.13 *** AvoidPat.cc 6 Jul 2005 09:21:55 -0000 1.12 --- AvoidPat.cc 6 Jul 2005 19:34:15 -0000 1.13 *************** *** 318,322 **** velIn = dereference_cast<int>(velInRef); // Add a dead zone near zero ! if(abs(velIn) <= 5) velIn = 0; } --- 318,322 ---- velIn = dereference_cast<int>(velInRef); // Add a dead zone near zero ! if(abs(velIn) <= 10) velIn = 0; } *************** *** 357,362 **** // If avoidance is needed if((min_margin < 1) || (tempo != 0)){ ! 
if(min_margin < 1) tempo = m_backward_iter; ! else tempo--; // Turning left or right? --- 357,362 ---- // If avoidance is needed if((min_margin < 1) || (tempo != 0)){ ! if(min_margin < 1) tempo = m_backward_iter; ! else tempo--; // Turning left or right? *************** *** 377,381 **** if(pos_min_margin == 0){ backward = (rangeValue[pos_min_margin] < m_backward_dist); ! rotation = m_backward_rot/2; } else if(pos_min_margin == 1){ --- 377,381 ---- if(pos_min_margin == 0){ backward = (rangeValue[pos_min_margin] < m_backward_dist); ! rotation = -1*sign*m_backward_rot/2; } else if(pos_min_margin == 1){ *************** *** 393,397 **** backward_iter = m_backward_iter-1; backward_vel = velocity; ! backward_rot = rotation * sign; } else{ --- 393,397 ---- backward_iter = m_backward_iter-1; backward_vel = velocity; ! backward_rot = rotation; } else{ *************** *** 401,413 **** // Compute velocity and rotation ! velocity = computeVel(position, min_margin); ! rotation = computeRot(position, min_margin); ! } - // Define FD output - velocity *= sign; - rotation *= sign; - // Saturate velocity correction if(abs(velIn - velocity) > m_max_vel_correction) --- 401,408 ---- // Compute velocity and rotation ! velocity = sign*computeVel(position, min_margin); ! rotation = sign*computeRot(position, min_margin); } // Saturate velocity correction if(abs(velIn - velocity) > m_max_vel_correction) |
From: Dominic L. <ma...@us...> - 2005-07-06 13:33:38
|
Update of /cvsroot/robotflow/RobotFlow/Vision/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv12526 Modified Files: VisualROI.cc Log Message: added VisualROI toVect function Index: VisualROI.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Vision/src/VisualROI.cc,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** VisualROI.cc 17 Jun 2005 21:16:33 -0000 1.7 --- VisualROI.cc 6 Jul 2005 13:33:30 -0000 1.8 *************** *** 1,3 **** --- 1,5 ---- #include "VisualROI.h" + #include "vmethod.h" + #include "Vector.h" using namespace FD; *************** *** 614,616 **** --- 616,637 ---- } + ObjectRef VisualROIToVect(ObjectRef VisualROIValue) { + try { + RCPtr<VisualROI> ROI = VisualROIValue; + Vector<float> *vect = Vector<float>::alloc(6); + (*vect)[0] = ROI->GetXCen(); + (*vect)[1] = ROI->GetYCen(); + (*vect)[2] = ROI->GetHSX(); + (*vect)[3] = ROI->GetHSY(); + (*vect)[4] = ROI->GetAngle(); + (*vect)[5] = ROI->GetArea(); + return ObjectRef(vect); + } + catch(BaseException *e) { + throw e->add(new GeneralException("Unable to convert VisualROI into Vector",__FILE__,__LINE__)); + } + } + REGISTER_VTABLE0(toVect, VisualROI, VisualROIToVect, 1); + + }//namespace RobotFlow |
From: MagellanPro <mag...@us...> - 2005-07-06 09:22:08
|
Update of /cvsroot/robotflow/RobotFlow/Behaviors/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv12141 Modified Files: AvoidPat.cc RangeFashion.cc SafeVelocityPat.cc Log Message: - Final version for the Challenge Index: RangeFashion.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/RangeFashion.cc,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** RangeFashion.cc 5 Jul 2005 23:04:29 -0000 1.7 --- RangeFashion.cc 6 Jul 2005 09:21:55 -0000 1.8 *************** *** 148,152 **** // Get the 3 mins of the range ! int min1=m_maxDist, min2=m_maxDist, min3=m_maxDist; for(int j=begin; j<end; ++j){ if(laserReading[j] <= min1){ --- 148,152 ---- // Get the 3 mins of the range ! int min1=m_maxDistLaser, min2=m_maxDistLaser, min3=m_maxDistLaser; for(int j=begin; j<end; ++j){ if(laserReading[j] <= min1){ *************** *** 169,186 **** // Transform of the data according to the physical dimension of the robot int offset = 0; ! switch (ind_vectPercept){ ! case 0: offset = 100; break; ! case 1: offset = 150; break; ! case 2: offset = 150; break; ! case 3: offset = 300; break; ! case 9: offset = 300; break; ! case 10: offset = 150; break; ! case 11: offset = 150; break; ! } // Save the value if(((*vectPercept)[ind_vectPercept] == -1) && (x_min < m_maxDistLaser)){ (*vectPercept)[ind_vectPercept] = x_min - offset; ! } // Next position --- 169,186 ---- // Transform of the data according to the physical dimension of the robot int offset = 0; ! /*switch (ind_vectPercept){ ! case 0: offset = 200; break; ! case 1: offset = 150; break; ! case 2: offset = 150; break; ! case 3: offset = 300; break; ! case 9: offset = 300; break; ! case 10: offset = 150; break; ! case 11: offset = 150; break; ! }*/ // Save the value if(((*vectPercept)[ind_vectPercept] == -1) && (x_min < m_maxDistLaser)){ (*vectPercept)[ind_vectPercept] = x_min - offset; ! 
} // Next position Index: SafeVelocityPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/SafeVelocityPat.cc,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** SafeVelocityPat.cc 6 Jul 2005 05:24:20 -0000 1.9 --- SafeVelocityPat.cc 6 Jul 2005 09:21:55 -0000 1.10 *************** *** 87,90 **** --- 87,95 ---- * @parameter_description Factor to scale de security distance for the side * + * @parameter_name MAX_VEL_CORRECTION + * @parameter_type int + * @parameter_value 30 + * @parameter_description Maximum velocity correction to apply + * END*/ *************** *** 106,109 **** --- 111,115 ---- int m_min_distance; float m_side_factor; + int m_max_vel_correction; // Slope of the line used to determine security distance *************** *** 118,121 **** --- 124,128 ---- } } + return margin; } *************** *** 139,142 **** --- 146,150 ---- m_min_distance = dereference_cast<int> (parameters.get("MIN_DISTANCE")); m_side_factor = dereference_cast<float> (parameters.get("SIDE_FACTOR")); + m_max_vel_correction = dereference_cast<int> (parameters.get("MAX_VEL_CORRECTION")); m_slope = (1.0*m_max_distance - m_min_distance) / (m_max_vel - m_min_vel); *************** *** 182,188 **** // According to the side of the rotation evaluate the margin of security int rotation = dereference_cast<int>(rotationRef); float marginFront=1, marginBack=1; // Backward or forward motion ! if(signVelIn > 0){ if (rotation < 0){ // Back margin of the robot (range 6,7,8,9) --- 190,206 ---- // According to the side of the rotation evaluate the margin of security int rotation = dereference_cast<int>(rotationRef); + if(abs(rotation) < 5) rotation = 0; // Dead Zone + float marginFront=1, marginBack=1; // Backward or forward motion ! if(rotation == 0){ ! marginBack = getMinMargin(securityMargin, 5, 7); ! marginFront = getMinMargin(securityMargin, 0, 1); ! 
if(marginFront > securityMargin[11]) marginFront = securityMargin[11]; ! if(signVelIn > 0) ! marginBack = 1; ! else marginFront = 1; ! } ! else if(signVelIn > 0){ if (rotation < 0){ // Back margin of the robot (range 6,7,8,9) *************** *** 229,234 **** // Backward/Forward velOut *= signVelIn; ! ! // Debugging purpose only std::cout << "Safe Velocity" << std::endl; if(velOut != velIn) --- 247,258 ---- // Backward/Forward velOut *= signVelIn; ! velIn *= signVelIn; ! ! // Limit acceleration ! if(abs(velIn - velOut) > m_max_vel_correction) ! if(velIn > velOut) velOut = velIn - m_max_vel_correction; ! else velOut = velIn + m_max_vel_correction; ! ! // Debug purpose only std::cout << "Safe Velocity" << std::endl; if(velOut != velIn) Index: AvoidPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/AvoidPat.cc,v retrieving revision 1.11 retrieving revision 1.12 diff -C2 -d -r1.11 -r1.12 *** AvoidPat.cc 6 Jul 2005 05:24:20 -0000 1.11 --- AvoidPat.cc 6 Jul 2005 09:21:55 -0000 1.12 *************** *** 195,200 **** case 3: case 9: ! max_speed *= 0.75; ! min_speed *= 0.75; break; } --- 195,200 ---- case 3: case 9: ! max_speed *= 0.5; ! min_speed *= 0.5; break; } *************** *** 287,290 **** --- 287,292 ---- ObjectRef rangeValueRef = getInput(rangeID,count); ObjectRef velInRef = getInput(velocityInID,count); + + static int tempo=0; // Static variabe to manage backward movement *************** *** 302,306 **** --backward_iter; backward = !(backward_iter == 0); ! // Debugging purpose std::cout << "Avoid Behavior" << std::endl; --- 304,308 ---- --backward_iter; backward = !(backward_iter == 0); ! // Debugging purpose std::cout << "Avoid Behavior" << std::endl; *************** *** 354,359 **** // If avoidance is needed ! if(min_margin < 1){ ! // Turning left or right? int sign = 1; --- 356,363 ---- // If avoidance is needed ! if((min_margin < 1) || (tempo != 0)){ ! 
if(min_margin < 1) tempo = m_backward_iter; ! else tempo--; ! // Turning left or right? int sign = 1; |
From: Fernyqc <fe...@us...> - 2005-07-06 05:24:32
|
Update of /cvsroot/robotflow/RobotFlow/Behaviors/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv26832 Modified Files: AvoidPat.cc BumperStall.cc SafeVelocityPat.cc Log Message: - BumperStall -> Parameters to allow the robot to get out of the stall situation; - AvoidPat -> Deadzone on velIn, saturate thee output variation - SafeVelocity -> Code review and indentation Index: AvoidPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/AvoidPat.cc,v retrieving revision 1.10 retrieving revision 1.11 diff -C2 -d -r1.10 -r1.11 *** AvoidPat.cc 6 Jul 2005 03:27:15 -0000 1.10 --- AvoidPat.cc 6 Jul 2005 05:24:20 -0000 1.11 *************** *** 89,92 **** --- 89,97 ---- * @parameter_description Maximum distance of the protection (mm) * + * @parameter_name MAX_VEL_CORRECTION + * @parameter_type int + * @parameter_value 10 + * @parameter_description Maximum correction of the velocity per iteration (mm/s) + * * @parameter_name MAX_DISTANCE * @parameter_type int *************** *** 145,148 **** --- 150,154 ---- int m_min_avoid_vel; int m_max_robot_vel; + int m_max_vel_correction; int m_max_rot; int m_min_rot; *************** *** 228,233 **** case 3: case 9: ! max_speed *= 0.55; ! min_speed *= 0.60; break; } --- 234,239 ---- case 3: case 9: ! max_speed *= 0.40; ! min_speed *= 0.40; break; } *************** *** 271,274 **** --- 277,281 ---- m_backward_rot = dereference_cast<int> (parameters.get("BACKWARD_ROTATION")); m_backward_iter = dereference_cast<int> (parameters.get("BACKWARD_ITERATION")); + m_max_vel_correction = dereference_cast<int> (parameters.get("MAX_VEL_CORRECTION")); if(m_backward_iter<1) m_backward_iter = 1; *************** *** 300,305 **** std::cout << "Backward -> true" << std::endl; std::cout << "backward_iter -> " << backward_iter << std::endl; ! std::cout << "backward_vel -> " << backward_vel << std::endl; ! 
std::cout << "backward_rot -> " << backward_rot << std::endl; } else{ --- 307,312 ---- std::cout << "Backward -> true" << std::endl; std::cout << "backward_iter -> " << backward_iter << std::endl; ! std::cout << "backward_vel -> " << backward_vel << std::endl; ! std::cout << "backward_rot -> " << backward_rot << std::endl; } else{ *************** *** 308,311 **** --- 315,320 ---- if(!velInRef->isNil()){ velIn = dereference_cast<int>(velInRef); + // Add a dead zone near zero + if(abs(velIn) <= 5) velIn = 0; } *************** *** 362,381 **** // Moving backward ? ! switch(pos_min_margin){ ! case 0: ! case 1: ! case 11: ! backward = (rangeValue[pos_min_margin] < m_backward_dist); ! velocity = -1*m_backward_vel; ! rotation = -1*m_backward_rot; ! backward_iter = m_backward_iter-1; ! backward_vel = velocity; ! backward_rot = rotation * sign; ! break; } ! // Moving forward ! if(!backward){ ! // Symetrie of the range value int position = pos_min_margin; --- 371,395 ---- // Moving backward ? ! if(pos_min_margin == 0){ ! backward = (rangeValue[pos_min_margin] < m_backward_dist); ! rotation = m_backward_rot/2; ! } ! else if(pos_min_margin == 1){ ! backward = (rangeValue[pos_min_margin] < m_backward_dist); ! rotation = m_backward_rot; ! } ! else if(pos_min_margin == 11){ ! backward = (rangeValue[pos_min_margin] < m_backward_dist); ! rotation = -1*m_backward_rot; } ! // Finalize command ! if(backward){ ! velocity = -1*m_backward_vel; ! backward_iter = m_backward_iter-1; ! backward_vel = velocity; ! backward_rot = rotation * sign; ! } ! 
else{ // Symetrie of the range value int position = pos_min_margin; *************** *** 385,388 **** --- 399,403 ---- velocity = computeVel(position, min_margin); rotation = computeRot(position, min_margin); + } *************** *** 390,393 **** --- 405,414 ---- velocity *= sign; rotation *= sign; + + // Saturate velocity correction + if(abs(velIn - velocity) > m_max_vel_correction) + if(velIn > velocity) velocity = velIn - m_max_vel_correction; + else velocity = velIn + m_max_vel_correction; + (*outputs[velocityOutID].buffer)[count] = ObjectRef(Int::alloc(velocity)); (*outputs[rotationID].buffer)[count] = ObjectRef(Int::alloc(rotation)); Index: SafeVelocityPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/SafeVelocityPat.cc,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** SafeVelocityPat.cc 6 Jul 2005 03:27:15 -0000 1.8 --- SafeVelocityPat.cc 6 Jul 2005 05:24:20 -0000 1.9 *************** *** 69,74 **** * @parameter_name MIN_VELOCITY * @parameter_type int ! * @parameter_value 75 ! * @parameter_description Minimum velocity to beat friction (mm/s). * * @parameter_name MAX_DISTANCE --- 69,74 ---- * @parameter_name MIN_VELOCITY * @parameter_type int ! * @parameter_value -1 ! * @parameter_description Minimum velocity to beat friction (mm/s) (-1 to disable). * * @parameter_name MAX_DISTANCE *************** *** 220,224 **** else velOut = static_cast<int>(velIn * marginBack); ! if((velOut < m_min_vel) && (velIn > m_min_vel)) velOut = m_min_vel; --- 220,225 ---- else velOut = static_cast<int>(velIn * marginBack); ! // To be sure to beat friction ! 
if((m_min_vel>0) && (velOut < m_min_vel) && (velIn > m_min_vel)) velOut = m_min_vel; *************** *** 228,232 **** // Backward/Forward velOut *= signVelIn; - velOut = signVelIn * velIn; // Debugging purpose only --- 229,232 ---- Index: BumperStall.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/BumperStall.cc,v retrieving revision 1.1 retrieving revision 1.2 diff -C2 -d -r1.1 -r1.2 *** BumperStall.cc 4 Jul 2005 06:47:32 -0000 1.1 --- BumperStall.cc 6 Jul 2005 05:24:20 -0000 1.2 *************** *** 64,67 **** --- 64,72 ---- * @parameter_description Type of action to do on bumper contact * + * @parameter_name NB_ITER_ALLOWED + * @parameter_type int + * @parameter_value 5 + * @parameter_description Number of allowed iterations with a bumper pressed (TYPE=0) + * END*/ *************** *** 73,82 **** int stateID; //outputs int rotationID; int velocityID; ! //parameters int m_type; public: --- 78,89 ---- int stateID; + //outputs int rotationID; int velocityID; ! //parameters int m_type; + int m_nbAllowed; public: *************** *** 95,98 **** --- 102,107 ---- // Parameters m_type = dereference_cast<int> (parameters.get("BEHAVIOR_TYPE")); + m_nbAllowed = dereference_cast<int> (parameters.get("NB_ITER_ALLOWED")); + if(m_nbAllowed < 1) m_nbAllowed = 1; } *************** *** 102,105 **** --- 111,115 ---- ObjectRef nbRef = getInput(nbID,count); ObjectRef stateRef = getInput(stateID,count); + static int nbAllowed=m_nbAllowed; if (!nbRef->isNil() && !stateRef->isNil()){ *************** *** 116,120 **** case 0: // Stall the robot if a bumper is pressed ! nilOutputs = (state == 0); break; default: std::cout << "BumperStall -> Unknown behavior type" << std::endl; --- 126,135 ---- case 0: // Stall the robot if a bumper is pressed ! if(state != 0){ ! if(nbAllowed > 0) ! nbAllowed--; ! else nilOutputs = false; ! } ! 
else nbAllowed=m_nbAllowed; break; default: std::cout << "BumperStall -> Unknown behavior type" << std::endl; |
From: MagellanPro <mag...@us...> - 2005-07-06 03:27:47
|
Update of /cvsroot/robotflow/RobotFlow/Behaviors/src In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv4334 Modified Files: AvoidPat.cc SafeVelocityPat.cc Log Message: - Defect version of the SafeVelocity - Sign correction for the backward velocity Index: SafeVelocityPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/SafeVelocityPat.cc,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** SafeVelocityPat.cc 5 Jul 2005 23:04:29 -0000 1.7 --- SafeVelocityPat.cc 6 Jul 2005 03:27:15 -0000 1.8 *************** *** 228,231 **** --- 228,232 ---- // Backward/Forward velOut *= signVelIn; + velOut = signVelIn * velIn; // Debugging purpose only Index: AvoidPat.cc =================================================================== RCS file: /cvsroot/robotflow/RobotFlow/Behaviors/src/AvoidPat.cc,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** AvoidPat.cc 6 Jul 2005 01:12:57 -0000 1.9 --- AvoidPat.cc 6 Jul 2005 03:27:15 -0000 1.10 *************** *** 300,305 **** std::cout << "Backward -> true" << std::endl; std::cout << "backward_iter -> " << backward_iter << std::endl; ! std::cout << "backward_vel -> " << backward_vel << std::endl; ! std::cout << "backward_rot -> " << backward_rot << std::endl; } else{ --- 300,305 ---- std::cout << "Backward -> true" << std::endl; std::cout << "backward_iter -> " << backward_iter << std::endl; ! std::cout << "backward_vel -> " << backward_vel << std::endl; ! std::cout << "backward_rot -> " << backward_rot << std::endl; } else{ *************** *** 367,371 **** case 11: backward = (rangeValue[pos_min_margin] < m_backward_dist); ! velocity = m_backward_vel; rotation = -1*m_backward_rot; backward_iter = m_backward_iter-1; --- 367,371 ---- case 11: backward = (rangeValue[pos_min_margin] < m_backward_dist); ! velocity = -1*m_backward_vel; rotation = -1*m_backward_rot; backward_iter = m_backward_iter-1; |