Calculating distance with OpenCV (stereo vision)

Problem description

For my project I am using parts of the code from this example: link

To track an object of a specific colour, I implemented the method below:

My question is: how do I calculate the distance to the tracked coloured object?

  • The application calls the method for the left and right frame. This is not efficient... (see the caller sketch after the code below)
  • I need to calculate DetectedObject.Zcor
DetectedObject Detect(IplImage *frame)
{
     //Track object (left frame and right frame)
     //Calculate average position
     //Show X,Y,Z coordinate and detected color

    color_image = frame;
    
    imgThreshold = cvCreateImage(cvSize(color_image->width,color_image->height), IPL_DEPTH_8U, 1);
    cvInitFont(&font, CV_FONT_HERSHEY_PLAIN, 1, 1, 0, 1.4f, CV_AA);
    
    imgdraw = cvCreateImage(cvGetSize(color_image),8,3);
    cvSetZero(imgdraw);

    cvFlip(color_image, color_image, 1);

    cvSmooth(color_image, color_image, CV_GAUSSIAN, 3, 0);

    threshold = getThreshold(color_image);
    cvErode(threshold, threshold, NULL, 3);
    cvDilate(threshold, threshold, NULL, 10);
    imgThreshold = cvCloneImage(threshold);

    storage = cvCreateMemStorage(0);
    contours = cvCreateSeq(0, sizeof(CvSeq), sizeof(CvPoint), storage);
    cvFindContours(threshold, storage, &contours, sizeof(CvContour), CV_RETR_CCOMP, CV_CHAIN_APPROX_NONE, cvPoint(0,0));
    final = cvCreateImage(cvGetSize(color_image),8,3);
    
    for(; contours!=0; contours = contours->h_next)
    {
        CvRect rect = cvBoundingRect(contours, 0);  
    
        cvRectangle(color_image, 
            cvPoint(rect.x, rect.y),
            cvPoint(rect.x+rect.width, rect.y+rect.height),
            cvScalar(0,0,255,0),
            2,8,0);
        
        string s = to_string(rect.x) + "," +  to_string(rect.y);
        char const* pchar = s.c_str();

        cvPutText(frame, pchar, cvPoint(rect.x, rect.y), &font, cvScalar(0,0,255,0));
        detectedObject.Xcor = rect.x;
        detectedObject.Ycor = rect.y;
    }

    cvShowImage("Threshold", imgThreshold);

    cvAdd(final,imgdraw,final);
    detectedObject.Zcor = 0;
    return detectedObject;
}
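
For context, the setup described in the bullet list above might look roughly like this; a minimal sketch, where DetectLeftRight, leftFrame and rightFrame are hypothetical names that are not part of the original code:

// Hypothetical caller: run the colour tracker once per camera.
void DetectLeftRight(IplImage *leftFrame, IplImage *rightFrame)
{
    DetectedObject left  = Detect(leftFrame);   // X,Y of the object in the left view
    DetectedObject right = Detect(rightFrame);  // X,Y of the object in the right view

    // left.Zcor / right.Zcor are still 0 here; computing them is the open question.
}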
c++ opencv computer-vision distance
1 Answer

For depth estimation you will need a calibrated stereo pair (known camera matrices for the left and right cameras). Then, using the camera matrices and the corresponding points/contours in the stereo pair, you can compute the depth.
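
For a rectified stereo pair, the usual relation is Z = f * B / d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the disparity, i.e. the difference between the x-coordinates of the same object in the left and right images. A minimal sketch on top of the question's code, assuming the pair is already calibrated and rectified; focalPx and baselineM are hypothetical names for values that would come from your own calibration (e.g. cvStereoCalibrate / cvStereoRectify):

double EstimateZ(const DetectedObject &left, const DetectedObject &right,
                 double focalPx, double baselineM)
{
    // Disparity: horizontal shift of the same object between the two views.
    // This assumes the bounding-box corner (Xcor) is a usable corresponding
    // point in both images; note that the cvFlip() in Detect() mirrors the
    // frames, which flips the sign of the disparity.
    double disparity = (double)left.Xcor - (double)right.Xcor;
    if (disparity <= 0.0)
        return 0.0;                      // no valid depth without positive disparity

    // Rectified stereo: Z = f * B / d, in the same units as baselineM.
    return focalPx * baselineM / disparity;
}

The focal length comes from the camera matrix produced by the calibration and the baseline from the translation between the two cameras; with those in hand, detectedObject.Zcor can be filled in after the object has been detected in both frames.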
