I am trying to calculate the real-world length of an arbitrary line segment drawn along the field of view in a one-point-perspective, single-camera setup.
I will have a known reference distance running parallel to the measuring line. How can I find the compensation factor I need to apply to the pixel length of the measuring line?
Do I have to take the distance from the vanishing point into account, since the real-world length per pixel increases the nearer you get to the vanishing point? Do I need to use the gradient of the known line to give me a rate of change?
I have been reading up on cross-ratios, but I don't understand whether they are applicable in this scenario, as I seem to have to measure in the opposite direction with respect to the vanishing point in order to apply them.
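For what it's worth, here is a minimal sketch (Python/NumPy) of how I currently understand the cross-ratio approach would work, assuming I can first transfer the endpoints of the known parallel distance onto the measuring line itself, so that the reference marks, the segment I want to measure, and the vanishing point are all collinear in the image. The function names and the example coordinates are just mine for illustration, not code I have verified against real footage.

```python
import numpy as np

def coord_along_line(pt, origin, direction):
    """Signed 1D coordinate (in pixels) of an image point along the measuring line."""
    return float(np.dot(np.asarray(pt, dtype=float) - origin, direction))

def real_length(p_a, p_b, ref_a, ref_b, ref_length, vanish):
    """
    Estimate the real-world length of the segment p_a--p_b via the cross-ratio.

    Assumptions (mine): all arguments are pixel coordinates lying on the same
    image line through the vanishing point `vanish`; ref_a and ref_b are the
    endpoints of the known distance, already transferred onto this line; and
    ref_length is that distance in real-world units.
    """
    origin = np.asarray(ref_a, dtype=float)
    direction = np.asarray(vanish, dtype=float) - origin
    direction /= np.linalg.norm(direction)

    # Reduce everything to signed 1D pixel coordinates along the line.
    a  = coord_along_line(p_a,    origin, direction)
    b  = coord_along_line(p_b,    origin, direction)
    r0 = coord_along_line(ref_a,  origin, direction)
    r1 = coord_along_line(ref_b,  origin, direction)
    v  = coord_along_line(vanish, origin, direction)

    def world_coord(x):
        # The vanishing point is the image of the point at infinity, so the
        # image cross-ratio (r0, x; r1, v) equals the world cross-ratio
        # (R0, X; R1, inf) = (R1 - R0) / (R1 - X).
        cr = ((r1 - r0) * (v - x)) / ((r1 - x) * (v - r0))
        # Taking world coordinates R0 = 0 and R1 = ref_length and solving for X:
        return ref_length - ref_length / cr

    return abs(world_coord(b) - world_coord(a))

# Made-up example: vanishing point at (640, 120), reference marks 2.0 m apart,
# all points chosen to be collinear with the vanishing point.
print(real_length(p_a=(100, 660), p_b=(250, 510),
                  ref_a=(150, 610), ref_b=(300, 460),
                  ref_length=2.0, vanish=(640, 120)))
```

Does this look like the right way to apply the cross-ratio here, rather than a single per-pixel compensation factor?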