2017-02-24 19:33:55 -0600 | commented answer | Using cv::solvePnP on Lighthouse Data Sorry if I sound a bit repetitive, but I really can't understand what's wrong with tan... |
2017-02-24 09:55:47 -0600 | received badge | ● Scholar (source) |
2017-02-24 04:29:14 -0600 | commented answer | Using cv::solvePnP on Lighthouse Data Ok, but can you tell me what/where my error is on this thought: |
2017-02-23 21:19:23 -0600 | commented answer | Using cv::solvePnP on Lighthouse Data Ok, first of all thanks for your help. But isn't this redundant if I define the camera matrix as the identity matrix? And about that angle conversion: |
2017-02-23 17:14:28 -0600 | asked a question | Using cv::solvePnP on Lighthouse Data Hello, the basestations provide laser sweeps across the tracking volume, and my tracker records the timings at which a laser plane hits one of its IR sensors. These timings can then be converted into degrees, based on the fact that the laser planes rotate at exactly 3600 RPM. Since I know exactly where my sensors are placed on the tracker, I should be able to get the pose using cv::solvePnP, but I can't figure out what kind of camera matrix and distortion coefficients I should use.
Since a basestation has neither a lens nor a 2D image sensor, I can't think of a way to calculate the focal length needed for the camera matrix. But since there is no lens, there should be no distortion, or am I wrong about that? If anyone could give me a hint I would be very grateful. |
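The conversion the question describes (sweep timings at 3600 RPM into angles, then into points suitable for a pinhole model with an identity camera matrix) can be sketched as below. This is a minimal illustration, not the asker's actual code: the sync-pulse reference, the 90° centering offset, and the tangent projection are assumptions about how Lighthouse angles would be mapped onto a virtual image plane at distance 1.

```python
import math

# 3600 RPM = 60 full rotations per second, i.e. 21600 degrees per second.
ROTOR_HZ = 3600 / 60.0

def sweep_angle_deg(elapsed_seconds):
    """Angle swept since the sync pulse, given the rotor's 3600 RPM rate."""
    return elapsed_seconds * 360.0 * ROTOR_HZ

def angles_to_image_point(h_deg, v_deg, center_deg=90.0):
    """Map a (horizontal, vertical) sweep-angle pair to a point on a virtual
    image plane one unit in front of the basestation.

    Assumption: the optical axis corresponds to `center_deg` on each rotor,
    so the angle from the axis is (angle - center_deg) and the projection is
    a plain tangent.  With this mapping the camera matrix passed to
    cv::solvePnP can be the identity, and distortion coefficients stay zero.
    """
    x = math.tan(math.radians(h_deg - center_deg))
    y = math.tan(math.radians(v_deg - center_deg))
    return (x, y)
```

A hit recorded 1/240 s after the sync pulse lands at 90°, i.e. on the assumed optical axis, which projects to (x, y) near the image-plane origin. The resulting 2D points, paired with the known 3D sensor positions on the tracker, would be what gets fed to cv::solvePnP.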