I'm doing a project at work where we need to have a user hold a stick. We need to track one end of the stick in real time and place a 3D particle system at that point. An HD webcam will record a video that the user will receive.
My plan: Unity as the 3D engine (it has an OpenCV plugin and can use multiple cameras). We'll place an IR LED at the end of the stick. I'm thinking of using two webcams, one modified to see only IR: I removed the IR filter and put a piece of floppy disk over the CCD, and it picks up IR well. I'll use OpenCV's SimpleBlobDetector to track the IR point and get its X,Y position, then use that to place the particle system in the actual video being recorded by the second, HD webcam.
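For reference, the tracking side currently looks roughly like this. It's a standalone Python/OpenCV sketch rather than the Unity plugin code, and the camera index and threshold values are just placeholders:

```python
import cv2

# Open the IR-modified webcam (index 0 is a placeholder).
cap = cv2.VideoCapture(0)

# Configure SimpleBlobDetector for bright blobs (the IR LED).
params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255      # look for bright blobs, not the default dark ones
params.minThreshold = 200   # only consider near-saturated pixels
params.maxThreshold = 255
detector = cv2.SimpleBlobDetector_create(params)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints = detector.detect(gray)
    if keypoints:
        # Take the first blob for now; x, y are the pixel coordinates
        # I'd hand over to Unity to position the particle system.
        x, y = keypoints[0].pt
        print(x, y)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
```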
Hoping for a little guidance now. I need the tracking to be as spot on as possible. I'd love to avoid using two cameras, but I'm not sure about the tracking accuracy with just one. Color tracking is out since I can't know what the user will wear, who else will be in frame, etc. IR seems good... but there are a couple of problems.
I have the IR tracking working now, but the camera also picks up some daylight, IR reflections, etc., and I don't know how to isolate just the one point I need. Would I need to set a min/max blob area so only blobs of the right size are accepted? That would mean the user has to stand at the correct distance from the camera, wouldn't it?
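To make the question concrete, this is the kind of filtering I'm imagining: threshold hard, reject blobs outside an area range, then keep the brightest remaining candidate. Again a Python/OpenCV 4 sketch, and the function name and all the numbers are just guesses to show the idea:

```python
import cv2
import numpy as np

def find_led(gray, min_area=10, max_area=400):
    """Return (x, y) of the most likely LED blob in a grayscale frame, or None.

    Thresholds near-saturated pixels, filters contours by area, then keeps
    the candidate with the highest mean brightness. All numbers are guesses
    and would need tuning for the real camera distance and exposure.
    """
    _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_brightness = None, -1.0
    for c in contours:
        area = cv2.contourArea(c)
        if not (min_area <= area <= max_area):
            continue  # reject tiny specks and large daylight patches
        blob_mask = np.zeros_like(mask)
        cv2.drawContours(blob_mask, [c], -1, 255, -1)
        brightness = cv2.mean(gray, mask=blob_mask)[0]
        if brightness > best_brightness:
            best_brightness = brightness
            m = cv2.moments(c)
            if m["m00"] > 0:
                best = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return best
```

The worry is that the area limits tie everything to the user's distance from the camera, which is why I'm asking if there's a better way.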
Any help much appreciated.