
In my Python / OpenCV code I'm using random homographies to simulate viewpoint changes (I'm evaluating interest point detectors and wanted to avoid searching for image pairs).

Is it possible to compute epipolar lines and the Essential matrix between one image and its warped version? The only information I have is the homography that I applied.

There are other posts somewhat related to my problem: How to calculate Rotation and Translation matrices from homography? and Find Homography matrix from Fundamental matrix, but I have no idea how to go from Homography matrix -> Essential matrix.

If what I'm trying to do doesn't make sense, please tell me why.

Thanks

2 Answers


I don't think you need the Essential matrix: "Fundamental Matrix contains the same information as Essential Matrix in addition to the information about the intrinsics of both cameras so that we can relate the two cameras in pixel coordinates."

This tutorial can help you understand how to calculate epipolar lines from a homography. First of all, you need a few point correspondences to feed cv.findFundamentalMat. These can be generated from the homography matrix: take eight or more random points in the first image (pts1) and apply the homography to them to get the matching points in the warped image (pts2).
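For example, a minimal sketch of that step (assuming the original image is in img and the 3x3 homography you applied is in H; both names are placeholders):

import cv2 as cv
import numpy as np

# Sample random pixel locations in the original image and map them through
# the known homography H to get their positions in the warped image.
h_img, w_img = img.shape[:2]
pts1 = np.random.uniform([0, 0], [w_img, h_img], size=(20, 2)).astype(np.float32)
pts2 = cv.perspectiveTransform(pts1.reshape(-1, 1, 2), H).reshape(-1, 2)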

F, mask = cv.findFundamentalMat(pts1, pts2, cv.FM_LMEDS)

Finally, you can compute the epiline parameters for the corresponding images:

epilines1 = cv.computeCorrespondEpilines(pts2.reshape(-1,1,2), 2, F)
epilines2 = cv.computeCorrespondEpilines(pts1.reshape(-1,1,2), 1, F)
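If you want to visualise the result, here is a short sketch along the lines of the drawlines helper in the OpenCV tutorial (img1, the colour and line width are my own assumptions): each epiline is a*x + b*y + c = 0 and is drawn across the full image width.

# Draw the epilines computed for image 1 on top of image 1.
h, w = img1.shape[:2]
vis = cv.cvtColor(img1, cv.COLOR_GRAY2BGR) if img1.ndim == 2 else img1.copy()
for a, b, c in epilines1.reshape(-1, 3):
    if abs(b) < 1e-9:          # skip (near-)vertical lines to avoid division by zero
        continue
    x0, y0 = 0, int(round(-c / b))
    x1, y1 = w, int(round(-(c + a * w) / b))
    cv.line(vis, (x0, y0), (x1, y1), (0, 255, 0), 1)
cv.imshow('epilines on image 1', vis)
cv.waitKey(0)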
mr NAE

Your assumption that you can just generate image pairs for stereo in this manner is flawed.

You need two images with different centers of projection (i.e. the camera must move between them).

A 2D homography applied to an existing image does not change the center of projection. It can simulate a 3D rotation about the camera center, but not a translation.

Intuitively, for epipolar geometry to be defined, you need a baseline. The baseline passes through both camera centers. If the two images have the same camera center, the baseline is not defined: you get no disparity or depth information from such images, and neither the essential nor the fundamental matrix is defined in your setup.

To see this mathematically, consider that the 1D null-space of the projection matrix P of your image gives the camera center C in homogeneous coordinates

C=null(P)

with

P*C = 0

For the warped image the projection matrix becomes H*P, and you still have

H*P*C = H*(P*C) = H*0 = 0

so the warped image has the same camera center.
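A quick numerical sketch of this argument (P and H below are made-up example matrices, purely for illustration):

import numpy as np

P = np.array([[1000., 0., 320., 0.],
              [0., 1000., 240., 0.],
              [0., 0., 1., 0.]])   # an example 3x4 projection matrix
H = np.array([[1.1, 0.05, 10.],
              [-0.02, 0.95, 5.],
              [1e-4, 2e-4, 1.]])   # an example invertible 3x3 homography

# C = null(P): the right singular vector for the smallest singular value.
_, _, Vt = np.linalg.svd(P)
C = Vt[-1]

print(P @ C)        # ~ [0, 0, 0]
print(H @ P @ C)    # also ~ [0, 0, 0] -> the warped image has the same camera center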

André Aichert