
Unsupported or Unrecognized array type error in OpenCV 3.1 in Python

asked 2016-03-27 08:38:04 -0500

YaddyVirus

Hi there! I'm getting this unexpected error from the code below. I found some solutions after googling for a while and tried all of them, but none work. Here is the code:

import cv2
import numpy as np
img = cv2.imread('circleTest.jpg',0)
img = cv2.medianBlur(img,5)
cimg = cv2.cvtColor(img,cv2.COLOR_GRAY2BGR)

circles = cv2.HoughCircles(img,cv2.HOUGH_GRADIENT,1,20, param1=50,param2=30,minRadius=0,maxRadius=0)

circles = np.uint16(np.around(circles))
for i in circles[0,:]:
    # draw the outer circle
    cv2.circle(cimg,(i[0],i[1]),i[2],(0,255,0),2)
    # draw the center of the circle
    cv2.circle(cimg,(i[0],i[1]),2,(0,0,255),3)

cv2.imshow('detected circles',cimg)
cv2.waitKey(0)
cv2.destroyAllWindows()

The code is borrowed from a tutorial on HoughCircles here: http://opencv-python-tutroals.readthe...

Whenever I try to run the code I get an error saying "Unsupported or Unrecognized array type" (you must have seen the entire error). I tried copying DLLs, putting the image in the working directory, and giving the full path to the image, but no luck. I can run other programs, such as object detection, fine in both Python and C/C++.

Any help would be greatly appreciated.
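As a side note, a common cause of this exact OpenCV error is that cv2.imread silently returns None when it cannot read the file, and the None then blows up inside medianBlur or cvtColor. A minimal guard can catch that early (the helper name ensure_loaded is hypothetical, just for illustration):

```python
def ensure_loaded(img, path):
    # cv2.imread returns None (no exception) when it cannot read a file;
    # feeding that None into later OpenCV calls is what produces the
    # "Unsupported or Unrecognized array type" error.
    if img is None:
        raise FileNotFoundError(f"could not read image: {path!r}")
    return img
```

Usage would look like `img = ensure_loaded(cv2.imread('circleTest.jpg', 0), 'circleTest.jpg')`, which turns a confusing downstream error into a clear message about the missing file.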



"you must have seen the entire error" -- yes, indeed. so, please edit, and add the exact error msg.

berak ( 2016-03-28 01:55:56 -0500 )

1 answer


answered 2016-03-31 13:26:56 -0500

YaddyVirus

It turned out to be nothing more than an indentation error in the for loop. :P
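For anyone hitting the same thing: Python refuses to even compile a for loop whose body is not indented, so the problem can be reproduced without OpenCV at all. A minimal illustration:

```python
# Python rejects a loop whose body is not indented; compiling the two
# snippets (without executing them) shows the difference.
good = "for i in range(3):\n    print(i)\n"
bad = "for i in range(3):\nprint(i)\n"

compile(good, "<snippet>", "exec")  # compiles fine

try:
    compile(bad, "<snippet>", "exec")
except IndentationError as e:
    print("IndentationError:", e.msg)
```

This is why pasting multi-line tutorial code into a forum or editor that strips leading whitespace so often breaks it.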



Seen: 237 times

Last updated: Mar 27 '16