How to calibrate rotation and translation between a camera and a see-through display

asked 2019-06-27 16:01:24 -0600

eric_engineer

This is a follow-on to my question here. Basically I have a camera and a see-through display, both rigidly mounted to the same metal frame, about an inch and a half apart. What I want is to do the best job I can of aligning an overlay with the real world by translating what the camera sees onto the display. The FOV and resolution of the two do not match.

I see that this is sort of like the stereo camera problem, where you take pictures of the same test pattern from both cameras and then somehow come up with the translation and rotation matrix between them. Here the second "camera" is my eye, so I was thinking of printing a checkerboard and putting it on the wall, then rendering that same checkerboard on my display. Next I'd position myself so that the displayed checkerboard lined up on top of the one on the wall, and then I'd take a picture with the camera. At that point I'd at least know something about what the camera and the display see at the same time.
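
For that single aligned photo, here is roughly what I imagine doing with OpenCV. The pattern size, file name, and display pixel positions below are just stand-ins for my setup. Since I render the display checkerboard myself, I know exactly which display pixel each inner corner sits at, so I can pair those with the corners detected in the camera photo and fit a homography:

```python
import cv2
import numpy as np

# Inner-corner count of the checkerboard (placeholder for my pattern).
pattern_size = (9, 6)

# Photo taken while the display checkerboard was visually aligned
# over the printed one on the wall.
img = cv2.imread("aligned_shot.png", cv2.IMREAD_GRAYSCALE)
found, cam_corners = cv2.findChessboardCorners(img, pattern_size)
assert found, "checkerboard not detected in the camera image"
cam_corners = cv2.cornerSubPix(
    img, cam_corners, (11, 11), (-1, -1),
    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01))

# Display-side coordinates: I drew the board, so I know where each
# inner corner landed, in display pixels (values below are made up).
square_px = 40       # size of one square on the display, in pixels
origin = (100, 80)   # display position of the first inner corner
disp_corners = np.array(
    [[origin[0] + x * square_px, origin[1] + y * square_px]
     for y in range(pattern_size[1]) for x in range(pattern_size[0])],
    dtype=np.float32)

# Homography mapping camera pixels -> display pixels for points on the
# wall plane. NOTE: this assumes the detected corner ordering matches
# the order disp_corners was generated in; in practice I'd verify/flip.
# It's also only valid for this one eye position and wall distance.
H, mask = cv2.findHomography(cam_corners.reshape(-1, 2), disp_corners,
                             cv2.RANSAC)
print(H)
```
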

After that I'm not sure what to do with that data :) Am I on the right track here? Should I be taking photos from different angles? And then what do I do with the data once I have it? I'm not sure about the math here. I've been trying to read about how stereo cameras are calibrated, how cameras are modeled, and how you project 3D objects into 2D space, but I haven't come up with a solution to this problem yet.
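
From what I've read, the camera half at least looks like standard intrinsic calibration: photos of the printed board from several angles fed to cv2.calibrateCamera, which gives the camera matrix plus the board's pose in every shot. A rough sketch of what I've pieced together so far (file names and square size are placeholders; this says nothing yet about the display/eye side, which is the part I'm stuck on):

```python
import glob
import cv2
import numpy as np

pattern_size = (9, 6)
square_mm = 25.0  # printed square size; a guess at my printout

# 3D corner positions on the printed board, in the board's own frame.
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = (np.mgrid[0:pattern_size[0], 0:pattern_size[1]]
               .T.reshape(-1, 2) * square_mm)

obj_points, img_points = [], []
# Photos of the wall board taken from different angles.
for path in glob.glob("calib_*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics + distortion for the camera alone; rvecs/tvecs are the
# board's pose per shot, i.e. the 3D -> 2D projection model.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```
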

Thank you
