2017-02-24 19:33:55 -0500 commented answer Using cv::solvePnP on Lighthouse Data

Sorry if I sound a bit repetitive, but I really can't understand what's wrong with tan... If you think of this image as the view along the y-axis, with the z-axis pointing to the right and the x-axis pointing up, and I am interested in the x-offset for x-degrees (which is Theta in this image) at z=1, the image tells me this length is tan(Theta). I don't rotate my point, but rather slide it on the projection plane, keeping its z=1. And if I change my view to be oriented along the x-axis, the same scheme applies for getting the y-offset at z=1.

2017-02-24 09:55:47 -0500 received badge ● Scholar

2017-02-24 04:29:14 -0500 commented answer Using cv::solvePnP on Lighthouse Data

Ok, but can you tell me what/where my error is in this reasoning: I want the intersection of my LOS ray at z=1. I know that the LOS is the intersection of the yz-plane rotated around the y-axis by x-degrees and the xz-plane rotated around the x-axis by y-degrees. The rotation around the y-axis should therefore not affect the y-component of my intersection point. The x-coordinate should be the distance from the yz-plane, which (correct me if I am wrong here) is tan(x-degrees) * z. Since this should be analogous for the rotation around the x-axis, it should boil down to: the intersection point for x-degrees, y-degrees and z=1 is (tan(y-degrees), tan(x-degrees), 1).

2017-02-23 21:19:23 -0500 commented answer Using cv::solvePnP on Lighthouse Data

Ok, first of all thanks for your help. "Multiply your vector by the camera matrix. LOS = camMat*LOS;" But isn't this redundant if I define the camera matrix as the identity matrix? And about that angle conversion: if I take my x/y angles, translate them into a LOS vector and then divide by the z-component, wouldn't that be equal to simply taking the tangent of x and y as the new x and y?
2017-02-23 17:14:28 -0500 asked a question Using cv::solvePnP on Lighthouse Data

Hello, I've got a project going where I try to get the pose of a 3D tracker that utilizes the Lighthouse base stations from Valve. The base stations sweep laser planes across the tracking volume, and my tracker records the timings at which a laser plane hits one of its IR sensors. These timings can then be converted into degrees, based on the fact that the laser planes rotate at exactly 3600 RPM.

Since I know exactly where my sensors are placed on the tracker, I should be able to get the pose using the cv::solvePnP function. But I can't figure out what kind of camera matrix and distortion coefficients I should use. Since a base station has neither a lens nor a 2D image sensor, I can't think of a way to calculate the focal length needed for the camera matrix.

First I tried the imagewidth/2 * cot(fov/2) formula, assuming an "image width" of 120, since this is the "domain" of my readings, which leads to a focal length of 34.641 px. But the results were completely off. I then tried to calculate a focal length for a given scenario (tracker 1 m in front of the base station), which gave me a focal length of 56.62 px. If I place my tracker about 1 meter in front of a base station the results are plausible, but if I move away from that "sweet spot" the results are again completely off.

But since I have no lens there should be no distortion, or am I wrong about that? If anyone could give me a hint I would be very grateful.