I am trying to project a given 3D point onto the image plane. I have posted several questions about this and received a lot of help, and I have read many related links, but the projection still does not come out right for me.
I have a 3D point (-455, -150, 0), where x is the depth axis, z is the upward axis, and y is the horizontal axis. I have roll (rotation about the front-back axis, x), pitch (rotation about the left-right axis, y), and yaw (rotation about the vertical axis, z). I also have the camera position (x, y, z) = (-50, 0, 100). So I do the following: first, I go from world coordinates to camera coordinates using the extrinsic parameters:
```cpp
double pi = 3.14159265358979323846;
double yp   = 0.033716827630996704 * pi / 180; // roll
double thet = 67.362312316894531   * pi / 180; // pitch
double k    = 89.7135009765625     * pi / 180; // yaw

double rotxm[9] = { 1, 0, 0,
                    0, cos(yp), -sin(yp),
                    0, sin(yp),  cos(yp) };
double rotym[9] = { cos(thet), 0, sin(thet),
                    0,         1, 0,
                   -sin(thet), 0, cos(thet) };
double rotzm[9] = { cos(k), -sin(k), 0,
                    sin(k),  cos(k), 0,
                    0,       0,      1 };
cv::Mat rotx(3, 3, CV_64F, rotxm);
cv::Mat roty(3, 3, CV_64F, rotym);
cv::Mat rotz(3, 3, CV_64F, rotzm);
cv::Mat rotationm = rotz * roty * rotx;   // combined rotation matrix

double pointdata[3] = { -455, -150, 0 };  // (cv::Mat cannot take a braced
double posdata[3]   = { -50, 0, 100 };    //  initializer list as its data
                                          //  pointer, so use arrays)
cv::Mat mpoint3(1, 3, CV_64F, pointdata); // the 3D point (row vector)
mpoint3 = mpoint3 * rotationm;            // rotation
cv::Mat position(1, 3, CV_64F, posdata);  // the camera position
mpoint3 = mpoint3 - position;             // translation
```
Now I want to go from camera coordinates to image coordinates.

The first solution, as I read in some sources:

```cpp
Mat myimagepoint3 = mpoint3 * mycameraMatrix;
```
This does not work.
The second solution:

```cpp
double fx  = cameraMatrix.at<double>(0, 0);
double fy  = cameraMatrix.at<double>(1, 1);
double cx1 = cameraMatrix.at<double>(0, 2);
double cy1 = cameraMatrix.at<double>(1, 2);

xt = mpoint3.at<double>(0) / mpoint3.at<double>(2);
yt = mpoint3.at<double>(1) / mpoint3.at<double>(2);

double u = xt * fx + cx1;
double v = yt * fy + cy1;
```
But that does not work either.
I also tried the OpenCV function cv::fisheye::projectPoints (which goes from world to image coordinates):
```cpp
Mat recv2;
cv::Rodrigues(rotationm, recv2); // rotation matrix -> rotation vector
// inputpoints:  a vector containing the single 3D world coordinate
// outputpoints: a vector to receive the projected 2D point
cv::fisheye::projectPoints(inputpoints, outputpoints, recv2, position,
                           mycameraMatrix, mydiscoff);
```
But this does not work either.
(By "does not work" I mean: I know where the point should appear in the image, but when I draw it, it always lands somewhere else, not even close, and sometimes I even get negative values.)
Note: there are no syntax errors or exceptions, though I may have made typos while writing the code here. Can someone suggest what I am doing wrong?
Have you tried swapping the coordinates?