Hey folks,
So I am detecting a line (a fluorescent painted line). It is quite easy to isolate, and I get a pretty much perfect mask of the line.
I take my 10 best lines and average them together, then use the average line for all further calculations. (I actually do some time-weighted averaging as well, but that isn't important here.) This works great for all orientations except close to vertical.
The issue I'm having is that the Hough transform only gives a theta between 0 and 180 degrees. So when averaging a theta of 170 degrees with, say, 0 degrees, the output is 85, but what I really want is 175.
The only somewhat acceptable solution I can come up with is to take the average of all lines below 90 degrees and, separately, the average of all lines above 90 degrees. Then, if the difference between the two averages is greater than 90 degrees, I add 180 degrees to the above-90 average and take the weighted average of that and the below-90 average.
I hope that all makes sense.
Is there a better way that I am overlooking? Maybe a way to compare the sine average and the cosine average to decide whether I should subtract 90 degrees from the result?
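For what it's worth, one standard approach to averaging undirected line angles is exactly along the sin/cos lines you mention: double each angle (so 0 and 180 map to the same direction), average the resulting unit vectors, then halve the angle of the mean vector. Here's a sketch in Python, assuming angles are in degrees; the function name and optional weights are just for illustration:

```python
import math

def mean_line_angle(thetas_deg, weights=None):
    """Average undirected line angles in [0, 180) degrees.

    Doubles each angle so that theta and theta + 180 coincide,
    averages the unit vectors, then halves the resulting angle.
    """
    if weights is None:
        weights = [1.0] * len(thetas_deg)
    # Sum weighted unit vectors at the doubled angles.
    s = sum(w * math.sin(math.radians(2 * t)) for t, w in zip(thetas_deg, weights))
    c = sum(w * math.cos(math.radians(2 * t)) for t, w in zip(thetas_deg, weights))
    # Halve the mean direction and wrap back into [0, 180).
    return (math.degrees(math.atan2(s, c)) / 2.0) % 180.0

print(mean_line_angle([170, 0]))   # 175.0, not 85
print(mean_line_angle([10, 30]))   # 20.0
```

This handles any mix of near-0 and near-180 lines without the special-case bucketing, and the weights slot in naturally where your time-weighted averaging currently sits.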
Thanks