
How to exclude outliers from detected ORB features?

asked 2019-11-18 12:55:25 -0600

postlude

updated 2019-11-19 03:01:39 -0600

I am using the approach shown below (see bottom of post) to detect an object within a video stream.

As can be seen from the image below (red arrows), I get a number of false points / outliers outside the detected area, especially if I move the detected object. What I would like to do is draw a rectangle around the main cluster of points as returned by cv2.perspectiveTransform(), excluding the outlying points. What is the best way to achieve this?

UPDATE: I have updated the image below to hopefully show more clearly what I'm trying to achieve

[image: ORB matches in the scene, with outlier points outside the detected area marked by red arrows]

#!/usr/bin/python3
# 2017.11.26 23:27:12 CST

## Find object by ORB feature matching

import numpy as np
import cv2
imgname = "box.png"          # query image (small object)
imgname2 = "box_in_scene.png" # train image (large scene)

MIN_MATCH_COUNT = 4

## Create ORB object and BF object (using HAMMING)
orb = cv2.ORB_create()
img1 = cv2.imread(imgname)
img2 = cv2.imread(imgname2)

gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)

## Find the keypoints and descriptors with ORB
kpts1, descs1 = orb.detectAndCompute(gray1,None)
kpts2, descs2 = orb.detectAndCompute(gray2,None)

## match descriptors and sort them in the order of their distance
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(descs1, descs2)
dmatches = sorted(matches, key = lambda x:x.distance)

## extract the matched keypoints
src_pts  = np.float32([kpts1[m.queryIdx].pt for m in dmatches]).reshape(-1,1,2)
dst_pts  = np.float32([kpts2[m.trainIdx].pt for m in dmatches]).reshape(-1,1,2)

## find homography matrix (the RANSAC mask flags the inlier matches) and do perspective transform
M, mask = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 5.0)
h,w = img1.shape[:2]
pts = np.float32([ [0,0],[0,h-1],[w-1,h-1],[w-1,0] ]).reshape(-1,1,2)
dst = cv2.perspectiveTransform(pts,M)

## draw found regions
img2 = cv2.polylines(img2, [np.int32(dst)], True, (0,0,255), 1, cv2.LINE_AA)
cv2.imshow("found", img2)

## draw match lines
res = cv2.drawMatches(img1, kpts1, img2, kpts2, dmatches[:20],None,flags=2)

cv2.imshow("orb_match", res);

cv2.waitKey();cv2.destroyAllWindows()

Comments

I don't see anything wrong. I have used the same ORB approach myself. Can you post the original image?

supra56 ( 2019-11-18 14:52:20 -0600 )

"to detect an object"

mind you, this will only work IF your object is actually in the scene. You cannot detect the absence of it like that.

(this is NOT "object detection")

berak ( 2019-11-19 01:26:14 -0600 )

@supra56 The code is working as expected; I simply want to add an additional step to remove points from the matches. I've updated the image to hopefully explain this better.

postlude ( 2019-11-19 03:04:09 -0600 )

@postlude I tested it, and it is now excluding the points you mentioned. But I need the 2 original images so I can test it with your code before I post an answer.

supra56 ( 2019-11-19 06:15:17 -0600 )
postlude ( 2019-11-19 07:27:32 -0600 )

Thanks for the images. I will use them for cv2.perspectiveTransform, etc.

supra56 ( 2019-11-19 08:58:53 -0600 )

2 answers


answered 2019-11-19 01:22:46 -0600

berak

you can use the "ratio test".

from the tutorial:

# Apply ratio test
good = []
for m,n in matches:
    if m.distance < 0.75*n.distance:
        good.append([m])

# now use the 'good' matches ...
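Note that the ratio test needs two candidate matches per descriptor, so it pairs with bf.knnMatch(..., k=2) and crossCheck left at False, rather than the bf.match / crossCheck=True call in the question. A minimal sketch, reusing the variable names from the question's script:

## Hamming matcher; crossCheck stays False (the default) so knnMatch can return 2 candidates
bf = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = bf.knnMatch(descs1, descs2, k=2)

## Lowe's ratio test: keep a match only if it is clearly better than the runner-up
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

if len(good) >= MIN_MATCH_COUNT:
    src_pts = np.float32([kpts1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst_pts = np.float32([kpts2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    M, mask = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 5.0)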

Comments

Thanks, but isn't this the purpose of dmatches = sorted(matches, key = lambda x:x.distance)? I'm not looking for the closest feature matches to the original image; I'm looking for the n closest points in dst to the cluster centre.

postlude ( 2019-11-19 02:42:24 -0600 )

that only sorts by distance, but does not exclude outliers at all (you'd have to manually chop off items at one end)

have another look at the ratio test: it compares the distance of the best match against that of the second-best match, which is not what your sorting does, right?

I'm looking for the n closest points in dst to the cluster centre.

hehe, you don't have a cluster center yet.

berak ( 2019-11-19 02:50:44 -0600 )

@berak OK, thanks! I assumed that crossCheck=True with dmatches = sorted and e.g. matches[:10] would work, but I just tried with crossCheck=False and the ratio test, and it does work better.

postlude ( 2019-11-19 03:30:09 -0600 )

@berak Out of interest, how would I approach finding a cluster centre if I wanted to do that? Would kmeans be overkill?

postlude ( 2019-11-19 03:31:35 -0600 )

"finding a cluster centre"

center of mass: just sum up all point coords and divide by the number of points.

geometric center: iteratively find a bounding box and take the center of that.

but again, you'll have to remove the outliers first, so do this after the matching (a minimal sketch of both is below).
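A minimal sketch of both ideas, reusing dst_pts and the RANSAC mask that the question's script already computes (using the RANSAC inlier mask as the outlier filter here is just one option; the ratio-test matches from the answer above would work the same way):

good_pts = dst_pts[mask.ravel() == 1].reshape(-1, 2)   # scene points kept as inliers by RANSAC

centroid = good_pts.mean(axis=0)                       # center of mass of the point cluster

x, y, w, h = cv2.boundingRect(good_pts)                # axis-aligned bounding box of the points
box_center = (x + w / 2.0, y + h / 2.0)                # geometric center of that box

## draw a rectangle around the main cluster on the scene image
cv2.rectangle(img2, (x, y), (x + w, y + h), (0, 255, 0), 2)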

berak ( 2019-11-19 03:56:00 -0600 )

answered 2020-01-04 12:57:06 -0600

supra56

updated 2020-01-04 13:06:34 -0600

I had to redo it. Just a minor change to @postlude's code (tested with OpenCV 4.2.0): replace

orb = cv2.ORB_create()

with:

orb = cv2.BRISK_create()
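For reference, a sketch of how that swap fits into the original script; nothing else needs to change, since BRISK descriptors are binary like ORB's and the Hamming-distance matcher still applies:

## swap the detector; the rest of the pipeline from the question stays the same
orb = cv2.BRISK_create()
kpts1, descs1 = orb.detectAndCompute(gray1, None)
kpts2, descs2 = orb.detectAndCompute(gray2, None)

## BRISK descriptors are binary, so Hamming distance is still the right metric
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)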

Image: orb_match.jpg


Comments

Just play around with the matches[:28] value.

supra56 ( 2020-01-04 13:20:20 -0600 )
