
### Resolution Issues When Setting Video Capture Device

I’m getting odd behavior when setting the capture dimensions for an HD video signal using two different capture cards. The source is an AHD video camera set to 1920x1080p @ 30 fps, and I have two EZCAP USB3 video capture devices:

• EZCAP267 (AHD-to-USB3)
• EZCAP311 (HDMI-to-USB3, fed by an AHD-to-HDMI converter that outputs 1920x1080 @ 30 fps)

If I don’t set the capture width and height, the EZCAP311 captures the video stream at 1920x1080, but the EZCAP267 captures it at 640x480 (with black borders top and bottom, so the image itself keeps a 16:9 aspect ratio).

I then experimented with setting the capture dimensions. In all cases the EZCAP311 captured at 1920x1080, but the EZCAP267 captured at unexpected dimensions. Here are the EZCAP267 results:

• No setting: 640x480 (4:3 aspect ratio with black borders top and bottom)
• Set to 1920x1080. Achieved 1024x768 (4:3 aspect ratio with black borders top and bottom)
• Set to 2048x1152. Achieved 1280x720 (16:9 aspect ratio)
• Set to 2560x1440. Achieved 1360x768 (16:9 aspect ratio)
• Set to 3840x2160. Achieved 1920x1080 (16:9 aspect ratio)

Does anyone know why the granted dimensions differ from what I request, and why the dimensions need to be set explicitly for one capture device but not the other?

I’m running OpenCV 3.4.2 with Python 3.7 on Windows 10. Both capture devices are compatible with DirectShow and require no additional Windows drivers. Here is my script:

```python
import cv2

cap = cv2.VideoCapture(0)

cap.set(cv2.CAP_PROP_FRAME_WIDTH, 2048)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1152)

frame_width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
frame_height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print('Width x Height = ', frame_width, 'x', frame_height)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow('frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```