OpenCV outputting blurred/interlaced video using V4L2

I'm trying to read in a video feed from an EasyCap (UTV007) capture device on a Raspberry Pi 3B running the latest Raspbian, using OpenCV (cv2) with Python 2.7.

So far I am able to get a clear feed using V4L2 and mplayer with the following command:

mplayer tv:// -tv driver=v4l2:norm=NTSC:device=/dev/video0

However, I need to read the feed in OpenCV because I need to do some post-processing on the frames. I'm doing it like this:

import cv2

cap = cv2.VideoCapture(0)        # the EasyCap shows up as /dev/video0
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:                  # stop if the grab fails
        break
    cv2.imshow('frame', frame)
    cv2.waitKey(1)               # imshow needs this to refresh the window

However, the output from OpenCV looks blurred/interlaced, even when saving the frames. A comparison of the same shot is attached (two screenshots).
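For reference, this is roughly how I save a frame to confirm the problem isn't just in the imshow window (the filename is only a placeholder); the saved image shows the same artefacts:

import cv2

cap = cv2.VideoCapture(0)
ret, frame = cap.read()
if ret:
    cv2.imwrite('frame.png', frame)   # saved frame shows the same combing
cap.release()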

I don't want to simply de-interlace the frames by averaging over the lines.
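Just to show what I mean by averaging over the lines, here is a rough numpy sketch of that naive approach (exactly the kind of thing I'd like to avoid):

import numpy as np

def deinterlace_by_line_average(frame):
    # Overwrite each odd scan line with the average of the lines
    # directly above and below it (assumes an even frame height).
    out = frame.copy()
    avg = (frame[0:-2:2].astype(np.uint16) + frame[2::2].astype(np.uint16)) // 2
    out[1:-1:2] = avg.astype(np.uint8)
    return out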

If mplayer is able to read the feed correctly, why is OpenCV struggling? I've already made sure to set the video standard to NTSC with v4l2-ctl.
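For reference, this is the kind of thing I can check from the OpenCV side to see what it negotiates with the V4L2 driver (a sketch; the CAP_PROP_* names assume OpenCV 3.x, on 2.4 they live under cv2.cv.CV_CAP_PROP_* instead):

import cv2

cap = cv2.VideoCapture(0)
# what frame size / rate did OpenCV negotiate with the driver?
print(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
print(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print(cap.get(cv2.CAP_PROP_FPS))
# try requesting the full NTSC frame size explicitly
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 720)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
cap.release()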

Any help will be greatly appreciated!