Ripples in stereo disparity image?
I have a stereo image pair captured over relatively flat ground, with the camera mounted above the ground and pointed down at an angle relative to the horizontal. The source images are already rectified and aligned by the camera API.
When I compute the disparity image in OpenCV using either BM or SGBM, the result shows "ripples". When that disparity is re-projected into 3D space with cv2.reprojectImageTo3D(), the 3D points show a curved ground surface (i.e. warped around the view axis) with ripples along the view axis. The re-projection matrix Q comes from cv2.stereoRectify(), using the focal lengths (fx, fy) and baseline of the rectified images but zero distortion coefficients, since the source images were already rectified and aligned.
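For reference, this is essentially the pipeline I'm describing (a minimal sketch; the intrinsics, baseline, block-matcher parameters, and file names below are placeholders, not my actual calibration or code):

```python
import cv2
import numpy as np

# Placeholder intrinsics and baseline (stand-ins for the third-party calibration values)
fx, fy = 700.0, 700.0
cx, cy = 640.0, 360.0
B = 0.12          # baseline in metres
w, h = 1280, 720  # image size

K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # zero distortion: the camera API already rectified/aligned the images

# Right camera translated by -B along x relative to the left camera
R = np.eye(3)
T = np.array([[-B], [0.0], [0.0]])

# Only Q is used here; it encodes fx, cx, cy and the baseline for re-projection
R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K, dist, K, dist, (w, h), R, T)

# Already-rectified input pair (placeholder file names)
left_gray = cv2.imread("left_rect.png", cv2.IMREAD_GRAYSCALE)
right_gray = cv2.imread("right_rect.png", cv2.IMREAD_GRAYSCALE)

# Disparity via SGBM (BM shows the same ripple behaviour)
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0  # fixed-point -> pixels

points_3d = cv2.reprojectImageTo3D(disp, Q)
```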
My best guess is that the warping comes from the camera "calibration" parameters I'm using (fx, fy, B), which are based on third-party calibration values, and that the ripples come from the limits of subpixel interpolation. (I do not see this effect when using the Point Grey libraries, so it is not the images themselves.)
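To make the subpixel guess concrete: BM/SGBM report disparity in 1/16-pixel steps, and since Z = f·B/d, one disparity quantum corresponds to a depth step of roughly Z²/(f·B)·(1/16). A back-of-the-envelope check (the numbers are illustrative placeholders, not my calibration):

```python
fx = 700.0  # focal length in pixels (placeholder)
B = 0.12    # baseline in metres (placeholder)
Z = 3.0     # distance to the ground along the view axis (placeholder)

d_step = 1.0 / 16.0  # smallest disparity increment reported by BM/SGBM

# Depth sensitivity: dZ ≈ Z^2 / (f * B) * d_disparity
dZ = (Z ** 2) / (fx * B) * d_step
print(f"depth quantum at {Z} m: {dZ * 100:.1f} cm")  # ~0.7 cm per 1/16 px here
```

On a slanted, nearly textureless ground plane, those discrete depth steps would show up as bands across the surface, which is consistent with the ripples I'm seeing along the view axis.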
Any suggestions on how to improve?
(Unfortunately, I can't post the relevant images.)