Python: calibrateCamera ret value vastly different from self-calculated reprojection error
In this question, I am referring to the documentation example given here: https://docs.opencv.org/4.1.0/dc/dbb/...
To give a short summary: it is an example of how to calibrate a camera using a chessboard pattern. In the example, the author calibrates the camera like this:
ret, mtx, dist, rvecs, tvecs = cv.calibrateCamera(objpoints, imgpoints, gray.shape[::-1], None, None)
The documentation states that the ret value is supposed to be the overall RMS of the reprojection error (see https://docs.opencv.org/4.1.0/d9/d0c/...).
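For reference, here is a minimal sketch of how I understand that RMS, namely the square root of the mean squared point-to-point distance over all points of all images. The helper name rms_reprojection_error and the reshape handling are my own, not part of the tutorial, and it assumes the same variables (objpoints, imgpoints, rvecs, tvecs, mtx, dist) as in the example:

import numpy as np
import cv2 as cv

# My own helper (not from the tutorial): RMS taken as
# sqrt( sum of squared point distances / total number of points )
def rms_reprojection_error(objpoints, imgpoints, rvecs, tvecs, mtx, dist):
    total_sq_error = 0.0
    total_points = 0
    for objp, imgp, rvec, tvec in zip(objpoints, imgpoints, rvecs, tvecs):
        # reproject the 3D chessboard corners into the image
        projected, _ = cv.projectPoints(objp, rvec, tvec, mtx, dist)
        # squared Euclidean distances, summed over all corners of this image
        diff = imgp.reshape(-1, 2) - projected.reshape(-1, 2)
        total_sq_error += np.sum(diff ** 2)
        total_points += len(objp)
    return np.sqrt(total_sq_error / total_points)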
However, at the end of the script, the author calculates the reprojection error like this:
mean_error = 0
for i in xrange(len(objpoints)):
    imgpoints2, _ = cv.projectPoints(objpoints[i], rvecs[i], tvecs[i], mtx, dist)
    error = cv.norm(imgpoints[i], imgpoints2, cv.NORM_L2)/len(imgpoints2)
    mean_error += error

print( "self-calculated error: {}".format(mean_error/len(objpoints)) )
print( "ret-value: {}".format(ret) )
To my understanding, this calculates the average normed reprojection error per point, averaged over the images. However, it is vastly different from the ret value returned by calibrateCamera. Running the code and comparing the two gives:
self-calculated error: 0.02363595176460404
ret-value: 0.15511421684649151
These differ by an order of magnitude, and I think that should not be the case, ...right (?!) And the more important question: it is often stated that the most important indicator of "a good calibration" is a reprojection error < 1 and close to zero. Which of the two reprojection errors should be used for that?
I really hope someone can answer this question, as it has been bugging me for a week now.
Cheers,
Dennis