Solving the Essential Matrix from the Fundamental Matrix to Obtain the Camera Pose (3D Reconstruction Task 1-5)
2022-07-21 23:31:00 【North and north of the red fox】
The code
#include <iostream>
#include <vector>
#include <utility>
#include <math/matrix.h>
#include <math/matrix_svd.h>
typedef math::Matrix<double, 3, 3> FundamentalMatrix;
typedef math::Matrix<double, 3, 3> EssentialMatrix;
// A pair of matched image points used to test the correctness of the camera pose
math::Vec2d p1 = {0.18012331426143646, -0.15658402442932129};
math::Vec2d p2 = {0.2082643061876297, -0.035404585301876068};
/* Focal length of the first camera */
double f1 = 0.972222208;
/* Focal length of the second camera */
double f2 = 0.972222208;
/**
 * \description Triangulate a pair of matched points to obtain the 3D point.
 * @param p1 -- feature point in the first image
 * @param p2 -- feature point in the second image
 * @param K1 -- intrinsic matrix of the first camera
 * @param R1 -- rotation matrix of the first camera
 * @param t1 -- translation vector of the first camera
 * @param K2 -- intrinsic matrix of the second camera
 * @param R2 -- rotation matrix of the second camera
 * @param t2 -- translation vector of the second camera
 * @return the triangulated 3D point
 */
math::Vec3d triangulation(math::Vec2d const & p1
, math::Vec2d const & p2
, math::Matrix3d const &K1
, math::Matrix3d const &R1
, math::Vec3d const & t1
, math::Matrix3d const &K2
, math::Matrix3d const &R2
, math::Vec3d const& t2){
// Construct the 3x4 projection matrices P = K[R|t]
math::Matrix<double, 3, 4> P1, P2;
math::Matrix<double, 3, 3> KR1 = K1 * R1;
math::Matrix<double, 3, 1> Kt1(*(K1 * t1));
P1 = KR1.hstack(Kt1);
math::Matrix<double, 3, 3> KR2 = K2 * R2;
math::Matrix<double, 3, 1> Kt2(*(K2 * t2));
P2 = KR2.hstack(Kt2);
std::cout<<"P1: "<<P1<<std::endl;
std::cout<<"P1 for first pose should be\n"
<<"0.972222 0 0 0\n"
<<"0 0.972222 0 0\n"
<<"0 0 1 0\n";
std::cout<<"P2: "<<P2<<std::endl;
std::cout<<"P2 for first pose should be\n"
<<" -0.957966 0.165734 -0.00707496 0.0774496\n"
<<"0.164089 0.952816 0.102143 0.967341\n"
<<"0.0250416 0.102292 -0.994439 0.0605768\n";
/* Construct the matrix A of the linear system */
math::Matrix<double, 4, 4> A;
// Fill A column by column
for(int i=0; i<4; i++){
// constraints from the point in the first image
A(0, i) = p1[0]*P1(2, i) - P1(0, i);
A(1, i) = p1[1]*P1(2, i) - P1(1, i);
// constraints from the point in the second image
A(2, i) = p2[0]*P2(2, i) - P2(0, i);
A(3, i) = p2[1]*P2(2, i) - P2(1, i);
}
std::cout<<"A: "<<A<<std::endl;
std::cout<<"A for first pose should be:\n"
<<"-0.972222 0 0.180123 0\n"
<<"-0 -0.972222 -0.156584 -0\n"
<<"0.963181 -0.14443 -0.200031 -0.0648336\n"
<<"-0.164975 -0.956437 -0.0669352 -0.969486\n";
math::Matrix<double, 4, 4> V;
math::matrix_svd<double, 4, 4> (A, nullptr, nullptr, &V);
math::Vec3d X;
X[0] = V(0, 3)/V(3, 3);
X[1] = V(1, 3)/V(3, 3);
X[2] = V(2, 3)/V(3, 3);
std::cout<<"X for first pose should be:\n"
<<"3.2043116948585566 -2.7710180887818652 17.195578538234088\n";
return X;
}
/**
 * \description Check whether a candidate camera pose is correct: triangulate
 * the matched point, transform it into both camera frames, and require its z
 * coordinate to be positive in each, i.e. the 3D point lies in front of both
 * cameras at the same time.
 * @param R1, t1 -- pose of the first camera
 * @param R2, t2 -- pose of the second camera
 * @return true if the triangulated point is in front of both cameras
 */
bool is_correct_pose (math::Matrix3d const &R1, math::Vec3d const & t1
,math::Matrix3d const &R2, math::Vec3d const & t2) {
/* Camera intrinsic matrices */
math::Matrix3d K1(0.0), K2(0.0);
K1(0, 0) = K1(1, 1) = f1;
K2(0, 0) = K2(1, 1) = f2;
K1(2,2) = 1.0;
K2(2,2) = 1.0;
math::Vec3d p3d = triangulation(p1, p2, K1, R1, t1, K2, R2, t2);
math::Vector<double, 3> x1 = R1 * p3d + t1;
math::Vector<double, 3> x2 = R2 * p3d + t2;
return x1[2] > 0.0 && x2[2] > 0.0;
}
bool calc_cam_poses(FundamentalMatrix const &F
, const double f1, const double f2
, math::Matrix3d& R
, math::Vec3d & t)
{
/* Camera intrinsic matrices */
math::Matrix3d K1(0.0), K2(0.0);
K1(0, 0) = K1(1, 1) = f1; K1(2,2)=1.0;
K2(0, 0) = K2(1, 1) = f2; K2(2,2) =1.0;
EssentialMatrix E = K2.transpose() * F * K1;
std::cout<<"EssentialMatrix result is "<<E<<std::endl;
std::cout<<"EssentialMatrix should be: \n"
<<"-0.00490744 -0.0146139 0.34281\n"
<<"0.0212215 -0.000748851 -0.0271105\n"
<<"-0.342111 0.0315182 -0.00552454\n";
/* The essential matrix encodes the relative pose between the cameras. The pose
 * of the first camera can be fixed to [I|0]; the pose [R|t] of the second
 * camera is obtained by decomposing the essential matrix, E = U*S*V', where S,
 * after normalizing the scale, is diag(1,1,0). */
math::Matrix<double, 3, 3> W(0.0);
W(0, 1) = -1.0; W(1, 0) = 1.0; W(2, 2) = 1.0;
math::Matrix<double, 3, 3> Wt(0.0);
Wt(0, 1) = 1.0; Wt(1, 0) = -1.0; Wt(2, 2) = 1.0;
math::Matrix<double, 3, 3> U, S, V;
math::matrix_svd(E, &U, &S, &V);
// Ensure that the rotation matrix det(R) = 1 (instead of -1).
if (math::matrix_determinant(U) < 0.0)
for (int i = 0; i < 3; ++i)
U(i,2) = -U(i,2);
if (math::matrix_determinant(V) < 0.0)
for (int i = 0; i < 3; ++i)
V(i,2) = -V(i,2);
/* There are four candidate poses for the second camera */
V.transpose(); // in-place transpose, so the products below use V^T
std::vector<std::pair<math::Matrix3d, math::Vec3d> > poses(4);
poses[0].first = U * W * V;
poses[1].first = U * W * V;
poses[2].first = U * Wt * V;
poses[3].first = U * Wt * V;
poses[0].second = U.col(2);
poses[1].second = -U.col(2);
poses[2].second = U.col(2);
poses[3].second = -U.col(2);
std::cout<<"Result of 4 candidate camera poses should be \n"
<<"R0:\n"
<<"-0.985336 0.170469 -0.0072771\n"
<<"0.168777 0.980039 0.105061\n"
<<"0.0250416 0.102292 -0.994439\n"
<<"t0:\n"
<<" 0.0796625 0.99498 0.0605768\n"
<<"R1: \n"
<<"-0.985336 0.170469 -0.0072771\n"
<<"0.168777 0.980039 0.105061\n"
<<"0.0250416 0.102292 -0.994439\n"
<<"t1:\n"
<<"-0.0796625 -0.99498 -0.0605768\n"
<<"R2: \n"
<<"0.999827 -0.0119578 0.0142419\n"
<<"0.0122145 0.999762 -0.0180719\n"
<<"-0.0140224 0.0182427 0.999735\n"
<<"t2:\n"
<<"0.0796625 0.99498 0.0605768\n"
<<"R3: \n"
<<"0.999827 -0.0119578 0.0142419\n"
<<"0.0122145 0.999762 -0.0180719\n"
<<"-0.0140224 0.0182427 0.999735\n"
<<"t3: \n"
<<"-0.0796625 -0.99498 -0.0605768\n";
// Fix the first camera at the origin: rotation R1 = identity, translation t1 = 0
math::Matrix3d R1;
math::matrix_set_identity(&R1);
math::Vec3d t1;
t1.fill(0.0);
// Check which candidate poses are valid
bool flags[4];
for(int i=0; i<4; i++){
flags[i] = is_correct_pose(R1, t1, poses[i].first, poses[i].second);
}
// Select a valid pose
if(flags[0]||flags[1]||flags[2]||flags[3]){
for(int i=0; i<4; i++) {
if(!flags[i])continue;
R = poses[i].first;
t = poses[i].second;
}
return true;
}
return false;
}
int main(int argc, char* argv[]){
FundamentalMatrix F;
F[0] = -0.0051918668202215884;
F[1] = -0.015460923969578466;
F[2] = 0.35260470328319654;
F[3] = 0.022451443619913483;
F[4] = -0.00079225386526248181;
F[5] = -0.027885130552744289;
F[6] = -0.35188558059920161;
F[7] = 0.032418724757766811;
F[8] = -0.005524537443406155;
math::Matrix3d R;
math::Vec3d t;
if(calc_cam_poses(F, f1,f2,R, t)){
std::cout<<"Correct pose found!"<<std::endl;
std::cout<<"R: "<<R<<std::endl;
std::cout<<"t: "<<t<<std::endl;
}
std::cout<<"Result should be: \n";
std::cout<<"R: \n"
<< "0.999827 -0.0119578 0.0142419\n"
<< "0.0122145 0.999762 -0.0180719\n"
<< "-0.0140224 0.0182427 0.999735\n";
std::cout<<"t: \n"
<<"0.0796625 0.99498 0.0605768\n";
return 0;
}
Code walkthrough
1. We are given the fundamental matrix F and the camera intrinsics K1 and K2; what we want are the extrinsic parameters of the second camera, i.e. its pose R and t (the first camera is fixed at [I|0]).
2. The essential matrix satisfies E = K2' * F * K1, and it also equals the cross-product matrix of t multiplied by R. It therefore links the unknown pose (R, t) to the known quantities: since K1, K2 and F are known, we can compute E and decompose it by SVD, which yields expressions for R and t. The decomposition, however, produces four candidate (R, t) pairs, so how do we decide which one is correct?
3. With a candidate R and t we can triangulate a matched point to get its 3D world coordinates. For the pose to be correct, that point must lie in front of both cameras at the same time: a point can only be seen when it is in front of a camera, not when it is behind it.
4. So the final step is to check, for each candidate pose, whether the triangulated point has positive depth in both cameras; the candidate that passes this test is the true pose.