Crashalot's profile - activity

2016-10-10 12:58:09 -0600 commented question Would shot detection and ball tracking be easier with stereo camera versus solo camera?

Thanks @Der Luftmensch for the reply! To help future readers and myself, could you elaborate on the advantages of a stereo camera? The drawbacks are added complexity in hardware and software, but for tracking players and shots, could a solo camera do everything a stereo camera could? Regarding the textureless issue for stereo cameras, could you solve that problem by adding IR texture to help match the two images?

2016-10-09 23:08:57 -0600 commented question Would shot detection and ball tracking be easier with stereo camera versus solo camera?

Thanks for the response! So you're saying a stereo camera offers no advantages over a solo camera for tracking players and shots?

2016-10-09 15:41:29 -0600 received badge  Editor (source)
2016-10-09 15:38:51 -0600 commented question Would shot detection and ball tracking be easier with stereo camera versus solo camera?

Sorry, I should have been clearer: the smartphones were meant to simplify the example, but what if you could use a stereo camera like the Bumblebee? @Der Luftmensch

2016-10-07 19:24:56 -0600 asked a question Would shot detection and ball tracking be easier with stereo camera versus solo camera?

Assume two camera configurations on a basketball court:

1) Solo: One smartphone on a tripod.

2) Stereo: A stereo camera like the Bumblebee (or a custom rig) with two sensors/lenses 12 inches apart.

The computer vision goals are: (1) track the basketball; (2) track a player; (3) detect made shots; (4) detect shot distance; and (5) detect shot angle.

Are any of these goals easier with a stereo camera (configuration #2), or are they as easily achievable with a solo camera (configuration #1)?

EDIT: This paper (http://www.ai.sri.com/~beymer/vsam/ic...) suggests a stereo camera would offer advantages over a solo camera, but the paper is quite old (1999). Is player & shot tracking better with a stereo camera, or are solo cameras equally effective?
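To make the comparison concrete, here is the kind of thing I understand a calibrated stereo pair would add: metric depth per frame, so the ball's 3D position falls out of the disparity directly. This is only a sketch under my own assumptions; the file names, focal length, baseline, and ball pixel below are placeholders, and the SGBM parameters are rough starting points.

```python
# Sketch: metric ball depth from a rectified stereo pair (placeholder values).
import cv2
import numpy as np

FOCAL_PX = 1400.0      # focal length in pixels, from calibration (assumed)
BASELINE_M = 0.30      # 12-inch baseline is roughly 0.30 m

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)     # placeholder file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching on the rectified pair.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

# Suppose a detector already found the ball at pixel (u, v) in the left image.
u, v = 640, 360
d = disparity[v, u]
if d > 0:
    depth_m = FOCAL_PX * BASELINE_M / d   # Z = f * B / disparity
    print(f"Ball is roughly {depth_m:.1f} m from the camera")
```

With a solo camera you would instead have to infer depth from known geometry (court lines, rim height, ball size), which is the trade-off I am trying to understand.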

2016-09-26 21:14:44 -0600 asked a question Capture 180 HFOV but produce video without distortion?

The goal is to create a camera system that offers a 180-degree HFOV but produces video focused on only one part of the scene -- for instance, to cover a whole basketball court but only produce video focused on one player.

The video must be distortion free -- i.e., it should not look as if it were shot with a fisheye lens. It should resemble video captured by someone panning a smartphone camera to keep the player centered in the frame.

  • Could you use a single 180-deg sensor for the camera system and remove the fisheye distortion through software (see the sketch after this list)? Dropcam offers a 130-deg FOV and seemingly produces undistorted video, so this seems possible. If yes, what are the drawbacks of using a single, wide-angle sensor?

  • If not, which is the better sensor configuration: two 90-deg sensors or three 60-deg sensors? From a manufacturing perspective fewer sensors seems better, but 90-deg sensors would also introduce their own distortions, right?
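For the single-sensor option, this is roughly what I imagine the software side looking like: undistort each fisheye frame with OpenCV's fisheye model, then crop a "virtual camera" window around the player. The intrinsics, distortion coefficients, file name, and player position below are placeholders (real values would come from cv2.fisheye.calibrate and a tracker).

```python
# Sketch: undistort a fisheye frame, then crop a window around the player.
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 960.0],       # placeholder intrinsics
              [0.0, 600.0, 540.0],
              [0.0, 0.0, 1.0]])
D = np.array([0.05, -0.01, 0.0, 0.0])    # placeholder k1..k4 fisheye coefficients

frame = cv2.imread("fisheye_frame.png")  # placeholder file
h, w = frame.shape[:2]

# Build the undistortion maps once and reuse them for every frame.
new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(K, D, (w, h), np.eye(3))
map1, map2 = cv2.fisheye.initUndistortRectifyMap(K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
undistorted = cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

# Crop a 1280x720 window around the detected player (placeholder center), clamped to the image.
cx, cy = 700, 500
x0 = min(max(cx - 640, 0), max(w - 1280, 0))
y0 = min(max(cy - 360, 0), max(h - 720, 0))
view = undistorted[y0:y0 + 720, x0:x0 + 1280]
```

One caveat I am aware of: a strictly rectilinear projection cannot cover a full 180 degrees in a single image, so near the edges you would likely reproject into a rotated virtual camera (or accept a narrower working FOV) rather than simply cropping.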

2016-09-10 23:53:46 -0600 received badge  Enthusiast
2016-09-09 20:13:37 -0600 commented answer Understanding capability of camera with two lenses: measuring speed & distance?

Hmmm, OK, so is the angular precision a byproduct of the camera or of the algorithms? How can you minimize angular error? Does a higher FOV help (e.g., 180 degrees)? +/- 250 feet wouldn't help much for measuring the distance of a golf ball. So I can work out the math: in the example where you're off by 1 pixel, did you mean off by 1 degree (i.e., the actual FOV is 35 or 37 degrees, not 36)? I'm trying to understand the equation so I can determine the specs needed for a tolerance of +/- 10 feet.

2016-09-09 19:32:32 -0600 commented answer Understanding capability of camera with two lenses: measuring speed & distance?

Cool, thanks! You're too kind. Will follow your GitHub repo. Tracking the ball just depends on resolution and your detection & tracking algorithms, right? Because you can assume some things to simplify the task, like the camera knowing where the ball is at the start of its flight (i.e., it doesn't need to pick a golf ball out of a scene full of other balls). What defines widely separated -- feet, inches, or cm? The idea is to have both cameras in the same system, like the iPhone 7+. What constitutes an "off" camera position, or what do you mean by that?

2016-09-09 19:01:58 -0600 commented question Understanding capability of camera with two lenses: measuring speed & distance?

Actually, isn't a stereo camera what I mean? I used "lenses" because that's how people described the iPhone 7+ camera system.

2016-09-09 19:01:09 -0600 commented question Understanding capability of camera with two lenses: measuring speed & distance?

OK, I updated the question, thanks for helping! Do you know the answer by chance?

2016-09-09 18:36:12 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Well thank you for your generosity and kindness!

2016-09-09 18:35:47 -0600 commented question Understanding capability of camera with two lenses: measuring speed & distance?

Apologies for the poor terminology. My understanding was that if you wanted to measure distance with a camera, you needed a minimum of two cameras in a system (to create a stereo view like human vision), and I used "lens" to represent a camera. What's the right way to phrase this?

2016-09-09 18:25:10 -0600 received badge  Scholar (source)
2016-09-09 18:21:52 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Ugh, obvious now in hindsight. Thanks so much. Are you a teacher by chance?

2016-09-09 18:20:59 -0600 asked a question Understanding capability of camera with two lenses: measuring speed & distance?

Assume you have a camera with two lenses and you want to measure the distance a golf ball traveled, up to 1000 feet. Also assume the camera system offers an HFOV of 180 degrees.

1) What are the drawbacks to measuring distance with a single camera (with two lenses)?

2) What are the drawbacks to measuring speed with a camera system like this?

3) In particular, how does a camera system like this compare to a laser device like this for measurement and a radar device like this for speed?

Accuracy presumably depends in part on resolution, i.e., higher resolution yields more accurate results. So assume acceptable tolerances of +/- 10 feet for distance and +/- 5 mph for speed. If resolution is the only variable, how do you determine the minimum resolution required to achieve those tolerances?
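My current (possibly wrong) understanding of the math, to help frame the question: with stereo triangulation, depth error for a one-pixel disparity error grows roughly as Z^2 / (f * B), where f is the focal length in pixels and B the baseline, so the tolerance fixes a minimum f and hence a minimum horizontal resolution for a given FOV. This is my own assumption of the relevant formula, not something from the answer, and all numbers below are placeholders.

```python
# Back-of-envelope sketch: resolution needed for +/- 10 ft depth error at 1000 ft,
# assuming stereo triangulation with a 1-pixel disparity error (placeholder numbers).
import math

Z = 1000.0        # target distance, feet
B = 1.0           # stereo baseline, feet (two lenses roughly 12 in apart)
tol = 10.0        # acceptable depth error, feet
disp_err = 1.0    # assumed disparity error, pixels

# depth_error ~= Z**2 / (f * B) * disp_err  =>  solve for the focal length f in pixels
f_px = Z**2 * disp_err / (B * tol)

# Convert f (pixels) to sensor width for a rectilinear HFOV. A true 180-deg rectilinear
# projection is impossible, so use a narrower example FOV for the arithmetic.
hfov_deg = 60.0
width_px = 2.0 * f_px * math.tan(math.radians(hfov_deg) / 2.0)
print(f"f ~ {f_px:,.0f} px -> ~{width_px:,.0f} px across for a {hfov_deg:.0f} deg HFOV")
```

With these assumptions the numbers come out enormous (roughly 100,000-pixel focal length), which would explain why the answer mentioned errors on the order of hundreds of feet for realistic hardware.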

2016-09-09 18:03:46 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Yup, once it was clear you meant physical dimensions and not pixels, that part became apparent. Though if you wouldn't mind explaining the formula: why does dividing by FOV_sub give the resolution? The other parts of the equation are clear. Thanks again for your help!

2016-09-09 02:08:44 -0600 commented question Understanding relationship between image resolution and "zoom" capability

@StevenPuttemans thanks for the clarification. Yes, this wouldn't be used for professional sporting events, so the camera will be perched right next to the court. If I understand you, the downside to the lower-resolution camera is that you must do real-time player tracking on the camera device (let's ignore the manual option where a human pans the camera to follow the player) -- wouldn't that also increase the cost? The question then becomes: is it cheaper to do on-device tracking or to increase the camera resolution?

2016-09-09 02:03:34 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Thanks again for your help. I was initially confused by the "draw a line" comment, thinking you meant the area around the player as measured in pixels. But parsing your comments again, it's clear you meant physical distance (i.e., feet around the player) and using trig to compute the FOV if you know the distance from the camera to the player and the area (in feet) you want to capture around the player.
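For anyone else reading, here is the trig as I understand it; this is my own restatement with placeholder numbers, not the answerer's exact formula.

```python
# Sketch: sub-FOV needed to frame a given physical width around the player
# at a known camera-to-player distance (placeholder numbers).
import math

distance_ft = 40.0   # camera-to-player distance
width_ft = 12.0      # physical width to keep around the player

fov_sub_deg = math.degrees(2.0 * math.atan((width_ft / 2.0) / distance_ft))
print(f"sub-FOV ~ {fov_sub_deg:.1f} degrees")   # ~17 degrees for these numbers
```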

2016-09-08 19:14:47 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Yes, so I'm trying to work backwards to determine the minimum resolution needed for a custom camera (which may require multiple lenses to produce the resolution and FOV needed) if the goal is to produce a 1280x720 video where an arbitrary player is always centered. Could I email more specifics that might make this question easier to answer? These comment boxes are not ideal. Thanks again for your assistance!

2016-09-08 18:56:51 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Thanks for your patience in explaining this; it's very much appreciated. My understanding is the area around the player would be defined in the production of the video, since the camera view is supposed to capture the whole court. Put another way, the image captured by the camera(s) would cover both ends of the court. But say Player A stays on one end. Then for frame 1 of the video, the algorithm would take the original image and crop away everything except Player A (plus enough surrounding area to yield the 1280x720 frame).
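Roughly what I'm picturing for that crop step, assuming a tracker already gives the player's pixel position each frame; the function name and sizes are placeholders and the full-court frame is assumed to be at least 1280x720.

```python
# Sketch: crop a fixed 1280x720 window centered on the tracked player.
import numpy as np

OUT_W, OUT_H = 1280, 720

def crop_on_player(full_frame: np.ndarray, player_x: int, player_y: int) -> np.ndarray:
    """Return a 1280x720 view of the full-court frame centered on the player,
    clamped so the window never leaves the image."""
    h, w = full_frame.shape[:2]
    x0 = min(max(player_x - OUT_W // 2, 0), w - OUT_W)
    y0 = min(max(player_y - OUT_H // 2, 0), h - OUT_H)
    return full_frame[y0:y0 + OUT_H, x0:x0 + OUT_W]
```

Which is why the source frame needs enough pixels over the player's end of the court to fill 1280x720 without upscaling -- that's the resolution question I'm trying to pin down.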

2016-09-08 18:30:53 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Sorry for the confusion. What I meant was the video produced should have the player centered and have a resolution of 1280x720. I thought that's what you meant by "area to cover the player"? If you're referring to the camera covering the player, then 180 degrees will cover the player, since that FOV already covers the whole court, and we can discard frames where the player is off the court.

2016-09-08 18:08:41 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Thanks for the prompt response! Let's assume you want the player always centered in a 1280x720 video. What would FOV #1 be?

2016-09-08 13:09:28 -0600 received badge  Supporter (source)
2016-09-08 12:35:33 -0600 commented answer Understanding relationship between image resolution and "zoom" capability

Thanks so much, this is helpful! Assume a custom camera with two sensors combining to provide a 180-degree FOV so the entire court is covered (assume the camera is at half-court). To answer your questions, it seems like the FOV would be 180 degrees for both; does that seem right?

2016-09-08 12:31:45 -0600 commented question Understanding relationship between image resolution and "zoom" capability

@StevenPuttemans could you please elaborate? Why is his answer not helpful -- isn't a basketball game considered a stable scene? What do you mean by "get yourself a PTZ setup and apply a tracker combined with a detector"? Are you referring to hardware or software? Sorry for not understanding, but thanks for your patience!

2016-09-08 00:11:53 -0600 asked a question Understanding relationship between image resolution and "zoom" capability

New to CV so sorry if this question lacks proper terminology or is completely nonsensical. :)

The goal is to track a specific player during a basketball game and produce a video that follows him around during the game. More specifically, every frame of the video will be centered on him instead of the ball.

One CV expert suggested the following:

The general approach is to have a very high definition video feed with a still camera and zoom into and around the still image and create the illusion that you are zooming/panning a regular camera.

Could someone elaborate on how "high definition" this camera needs to be? 4K? And more abstractly, is there a relationship between the resolution of the video feed and how much "zoom" capability it yields? For instance, with a 4K feed, could you zoom around a 90-foot basketball court? Could you zoom around a 100-yard football field?
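In case it clarifies the question, here is the kind of relationship I'm imagining (all numbers are placeholder assumptions): if the still camera sees the whole 90-foot court and the output video should show only a 12-foot window around the player at 1280 pixels wide, then the full frame needs roughly 90/12 times that many pixels across.

```python
# Back-of-envelope sketch relating full-scene coverage to "digital zoom" capability.
# All numbers are placeholder assumptions.
court_width_ft = 90.0     # physical width the still camera must cover
window_width_ft = 12.0    # physical width shown in the output video
out_width_px = 1280       # output video width in pixels

required_px = out_width_px * court_width_ft / window_width_ft
print(f"Need roughly {required_px:,.0f} px across")   # ~9,600 px, i.e. well beyond 4K
```

This ignores perspective (the far end of the court covers fewer pixels per foot than the near end), so it's only a lower bound, but even so it suggests a single still camera covering the whole court would need far more than 4K. Is that the right way to think about it?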

2016-09-07 23:31:41 -0600 commented question Measure shot distance for basketball and golf?

@David_86 thanks for your help!

2016-08-01 16:30:21 -0600 commented question Measure shot distance for basketball and golf?

@David_86 thanks for the comment! So two cameras (similar to human eyes) would let you gauge distance accurately? How do iPhone apps like https://itunes.apple.com/us/app/easym... measure distance? And by shot tracking, assume it's video analysis -- ideally real-time, but asynchronous is fine as a last resort.

2016-07-31 23:14:16 -0600 asked a question Measure shot distance for basketball and golf?

Let's say you're using OpenCV to do shot tracking for basketball and golf. For basketball, could you use OpenCV to estimate where on the court a shot was taken and the distance of the shot? For golf, could you measure the distance of a shot?

For the basketball scenario, assume the height of the rim is 10 feet.
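One approach I've been imagining for the basketball case, though I'm not sure it's right: since the court dimensions are known, register a single calibrated camera to the court with solvePnP using a few floor landmarks, then back-project the pixel at the shooter's feet onto the floor plane to get the shot position and distance. The landmark coordinates, image points, intrinsics, and shooter pixel below are all placeholders; the known 10-foot rim height would matter for made-shot detection rather than for this floor-plane distance.

```python
# Sketch: estimate shot distance with ONE calibrated camera using known court
# geometry. All coordinates and intrinsics below are placeholder values.
import cv2
import numpy as np

# Four floor landmarks in feet (e.g., corners of the lane), all on the plane z = 0,
# and the pixels where they appear in the image (placeholders).
object_pts = np.array([[0, 0, 0], [16, 0, 0], [16, 19, 0], [0, 19, 0]], dtype=np.float64)
image_pts = np.array([[420, 880], [1480, 900], [1280, 520], [560, 510]], dtype=np.float64)

K = np.array([[1400.0, 0, 960], [0, 1400.0, 540], [0, 0, 1]], dtype=np.float64)  # placeholder intrinsics
dist = np.zeros((5, 1))

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)

def court_point_from_pixel(u, v):
    """Intersect the viewing ray through pixel (u, v) with the floor plane z = 0."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_world = R.T @ ray_cam
    cam_center = -R.T @ tvec.ravel()
    t = -cam_center[2] / ray_world[2]          # solve for z = 0
    return cam_center + t * ray_world

shooter = court_point_from_pixel(900, 700)      # pixel of shooter's feet (placeholder)
hoop_floor = np.array([8.0, 5.25])              # floor point below the rim (placeholder)
print(f"Shot distance ~ {np.linalg.norm(shooter[:2] - hoop_floor):.1f} ft")
```

Would something along these lines work for basketball, or is stereo really needed? Golf seems harder since the ball leaves any fixed set of landmarks behind.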