If I'm doing pose estimation with a single camera from 3D-2D correspondences (e.g. the PnP algorithm), I have read that reprojecting the points (cv::projectPoints) can give me the Jacobian of the reprojection with respect to the pose parameters, which can then be used to compute an estimate of the covariance of the pose.
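For context, this is roughly what I mean for the single-camera case. It is only a sketch, assuming isotropic pixel noise with standard deviation sigma_px and using the fact that the Jacobian returned by cv::projectPoints has the derivatives with respect to rvec and tvec in its first six columns:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Approximate 6x6 covariance of the [rvec; tvec] pose from solvePnP,
// via cov ≈ sigma^2 * (J^T J)^-1 on the pose columns of the Jacobian.
cv::Mat poseCovarianceFromPnP(const std::vector<cv::Point3f>& objectPoints,
                              const std::vector<cv::Point2f>& imagePoints,
                              const cv::Mat& K, const cv::Mat& dist,
                              double sigma_px = 1.0)
{
    cv::Mat rvec, tvec;
    cv::solvePnP(objectPoints, imagePoints, K, dist, rvec, tvec);

    // projectPoints returns a 2N x (10 + #distCoeffs) Jacobian; the first
    // three columns are d/d(rvec), the next three are d/d(tvec).
    std::vector<cv::Point2f> reprojected;
    cv::Mat J;
    cv::projectPoints(objectPoints, rvec, tvec, K, dist, reprojected, J);

    cv::Mat Jpose = J.colRange(0, 6);                       // 2N x 6
    cv::Mat cov = (Jpose.t() * Jpose).inv(cv::DECOMP_SVD);  // (J^T J)^-1
    cov *= sigma_px * sigma_px;                              // scale by pixel noise
    return cov;                                              // 6x6 covariance
}
```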
But if instead I have two cameras and I am estimating the relative pose between them from the fundamental/essential matrix (cv::findEssentialMat) and its subsequent decomposition, how can I compute the covariance of that relative pose?
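A minimal sketch of the two-view pipeline I am referring to, assuming matched pixel coordinates pts1/pts2 and a shared intrinsic matrix K:

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

void relativePose(const std::vector<cv::Point2f>& pts1,
                  const std::vector<cv::Point2f>& pts2,
                  const cv::Mat& K)
{
    cv::Mat inlierMask;
    cv::Mat E = cv::findEssentialMat(pts1, pts2, K, cv::RANSAC,
                                     0.999, 1.0, inlierMask);

    // Decompose E and resolve the four-fold ambiguity via the cheirality check.
    cv::Mat R, t;  // t is only recovered up to scale
    cv::recoverPose(E, pts1, pts2, K, R, t, inlierMask);

    // Open question: how to obtain a covariance for (R, t) from this estimate,
    // analogous to the Jacobian-based covariance in the PnP case?
}
```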