Unsupported or Unrecognized array type error in OpenCV 3.1 in Python

Hi there! I'm getting this unexpected error with the code below. I found some suggested solutions after googling for a while and tried all of them, but none worked. Here is the code:

import cv2
import numpy as np
img = cv2.imread('circleTest.jpg',0)
img = cv2.medianBlur(img,5)
cimg = cv2.cvtColor(img,cv2.COLOR_GRAY2BGR)

circles = cv2.HoughCircles(img,cv2.HOUGH_GRADIENT,1,20, param1=50,param2=30,minRadius=0,maxRadius=0)

circles = np.uint16(np.around(circles))
for i in circles[0,:]:
    # draw the outer circle
    cv2.circle(cimg,(i[0],i[1]),i[2],(0,255,0),2)
    # draw the center of the circle
    cv2.circle(cimg,(i[0],i[1]),2,(0,0,255),3)

cv2.imshow('detected circles',cimg)
cv2.waitKey(0)
cv2.destroyAllWindows()

The code is borrowed from the HoughCircles tutorial here: http://opencv-python-tutroals.readthedocs.org/en/latest/py_tutorials/py_imgproc/py_houghcircles/py_houghcircles.html

Whenever I try to run the code I get an error saying "Unsupported or Unrecognized array type" (you have probably seen the full error message before). I tried copying the DLLs, putting the image in the working directory, and giving the full path to the image, but with no luck. I can run other programs, such as object detection, just fine in both Python and C/C++.
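
In case it helps, here is a minimal check I can add right after imread to rule out a failed image load (the file name is just the one from my script). As far as I know, cv2.imread returns None instead of raising an error when it cannot read the file, and passing None on to medianBlur or cvtColor produces exactly this kind of array type error:

import cv2

img = cv2.imread('circleTest.jpg', 0)  # 0 = load as grayscale
if img is None:
    # imread silently returns None when the file is missing or unreadable
    raise IOError("could not read 'circleTest.jpg' - check the path")
print(img.shape, img.dtype)  # sanity check that a real array was loaded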

Any help would be greatly appreciated.
Thanks!