Is template matching the best approach for finding the "exact" image on the screen multiple times? [closed]

asked 2017-07-03 13:06:11 -0600 by jas

updated 2017-07-09 11:26:45 -0600 by pklab

I gather that template matching only gives us the "best possible location of the image" and not the location of the exact image.


Closed for the following reason: the question is answered, right answer was accepted by pklab
close date 2017-07-12 03:05:53.371908

2 answers


answered 2017-07-04 04:47:12 -0600 by pklab

updated 2017-07-10 04:58:11 -0600

If the template is present in the image at point P(x,y), template matching returns P(x+epsilon, y+epsilon), where epsilon might be 0 or very small if the match is close. An exact match returns the exact position.

BTW, if "exact" means T - I = 0, you can use a simple sliding difference like the one below:

#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int ExactMatch(const Mat& I, const Mat &T, std::vector<Point> &matches)
{
    //Mat I; // the image
    //Mat T; // the template
    //std::vector<Point> matches; // vector of exact match points

    // works only on grayscale images because minMaxLoc requires a single-channel Mat
    CV_Assert((I.channels() == 1) && (T.channels() == 1));
    matches.clear();
    Rect R(Point(0, 0), T.size()); //a rect roi over the image
    Mat delta;
    int nx = I.cols - T.cols;   // numbers of horiz sliding ROI
    int ny = I.rows - T.rows;   // numbers of vert sliding ROI
    double maxValue;
    Point maxLoc,match;
    for (int x = 0; x < nx; x++)
        for (int y = 0; y < ny; y++)
        {
            R.x = x; R.y = y;
            absdiff(T, I(R), delta);  //delta = |T - I(R) |
            // search for max in the difference
            minMaxLoc(delta, NULL, &maxValue, NULL, &maxLoc);
            //max==0 means delta == 0 means exact match
            if (maxValue == 0)
            {
                match = Point(x, y);
                cout << "Exact found at " << match << endl;
                matches.push_back(match);
            }
        }
    return (int)matches.size();
}

UPDATE:

Although the code above works (on 1-channel images), it is really inefficient. The matchTemplate/CV_TM_SQDIFF way is much faster and more general.

UPDATE2: This should work in Java. UPDATE3: small fix (please be tolerant of syntax errors; I don't have a Java machine):

Mat resultMatrix = new Mat();
int result_cols =  source.cols() - template.cols() + 1;
int result_rows = source.rows() - template.rows() + 1;
resultMatrix.create( result_rows, result_cols, CvType.CV_32FC1 );
Imgproc.matchTemplate(source, template, resultMatrix, Imgproc.TM_SQDIFF_NORMED);

double epsilon = 0.1; // increase to be more tolerant
while (true)
{
    MinMaxLocResult mmr = Core.minMaxLoc(resultMatrix);
    if (mmr.minVal > epsilon )
            break; // no more matches

    Point matchLoc = mmr.minLoc;
    System.out.print("\nMatch found at: "); System.out.print(matchLoc);
    System.out.print("\tDifference: "); System.out.print(mmr.minVal);
    // clean scores around current match
    Imgproc.rectangle(resultMatrix,
        // clean around center
        new Point(matchLoc.x - template.cols()/2, matchLoc.y - template.rows()/2),
        new Point(matchLoc.x + template.cols()/2, matchLoc.y + template.rows()/2),
        // set to a value greater than your threshold
        new Scalar(epsilon+1, epsilon+1, epsilon+1), -1);
    // draw a rectangle around match
    Imgproc.rectangle(source, 
        matchLoc,
        new Point(matchLoc.x + template.cols(), matchLoc.y + template.rows()),
        new Scalar(0, 255, 0), 1);
}

Comments

I'm trying to do this in Java and am facing an issue at this point:

Core.absdiff(T, I(R), delta);

I(R) is not supported in Java as such. Can you help me out?

jas ( 2017-07-04 07:21:24 -0600 )

this is "How to set a ROI in a Mat in Java". It should be Ir = I.submat(R) or you can use Ir = new Mat(I,R). And of course Core.absdiff(T, Ir, delta)

pklab ( 2017-07-04 10:16:55 -0600 )
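For reference, a minimal, untested Java sketch of that ROI-based difference check (the helper name exactMatchAt and the single-channel assumption are mine, not from the thread; assumes the usual org.opencv.core imports):

// true if template T matches image I exactly at (x, y); both Mats assumed single-channel
static boolean exactMatchAt(Mat I, Mat T, int x, int y) {
    Rect R = new Rect(x, y, T.cols(), T.rows());   // ROI over the image
    Mat Ir = I.submat(R);                          // Java equivalent of I(R)
    Mat delta = new Mat();
    Core.absdiff(T, Ir, delta);                    // delta = |T - I(R)|
    Core.MinMaxLocResult mm = Core.minMaxLoc(delta);
    return mm.maxVal == 0;                         // max of |T - I(R)| == 0 -> exact match
}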

I have tried using the method you suggested. However, minMaxLoc doesn't return valid values (minLoc, maxLoc, minVal, maxVal all return 0.0). Is there something I'm doing wrong or missing?

    for (int x = 0; x < nx; x++) {
        for (int y = 0; y < ny; y++) {
            Mat Ir = templateGrayed.submat(R);
            Core.absdiff(templateGrayed, Ir, delta);
            MinMaxLocResult co = Core.minMaxLoc(delta);
            // max==0 means delta == 0 means exact match
            if (co.maxVal == 0) {
                match = co.maxLoc;
                System.out.print("Exact found at ");
                System.out.print(match);
                System.out.print("\n");
                matches.add(match);
            }
        }
    }
    return matches.size();
jas ( 2017-07-04 12:50:04 -0600 )

If max == 0 then min == 0 as well; this means an exact match at (0,0) of the ROI. But I wrote match = Point(x, y); not maxLoc!

pklab ( 2017-07-04 13:58:34 -0600 )

Sorry, my bad. I have made the change. Still, I'm getting the final size as 274510!! (definitely wrong!) (nx = 485, ny = 566), whereas the target image is present only three times on the screen. The exact match is not at (0,0).

I forgot to mention in my previous comment that I converted my matrices to grayscale in order to avoid this error (it occurs at Core.minMaxLoc(delta)): OpenCV Error: Assertion failed ((cn == 1 && (_mask.empty() || _mask.type() == CV_8U)) || (cn > 1 && _mask.empty() && !minIdx && !maxIdx)) in cv::minMaxIdx, file C:\builds\master_PackSlaveAddon-win32-vc12-static\opencv\modules\core\src\stat.cpp

jas ( 2017-07-04 16:01:29 -0600 )
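As a side note, a minimal sketch of that grayscale conversion (assuming the colour Mats are named source and template as in the answer above, and that they are BGR):

// minMaxLoc needs single-channel input, so convert both Mats first
Mat sourceGrayed = new Mat();
Mat templateGrayed = new Mat();
Imgproc.cvtColor(source, sourceGrayed, Imgproc.COLOR_BGR2GRAY);
Imgproc.cvtColor(template, templateGrayed, Imgproc.COLOR_BGR2GRAY);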

Sorry, it's my fault now... you should move R in x and y at each iteration of the second for loop, therefore R.x = x; R.y = y;. BTW, matchTemplate/CV_TM_SQDIFF should be faster, especially in the case of BGR images.

pklab ( 2017-07-05 14:35:14 -0600 )
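To make that fix concrete, a rough, untested Java sketch of the corrected loop (using the sourceGrayed/templateGrayed names from the comments above, java.util.List/ArrayList, and the usual org.opencv.core imports); note that the ROI is taken from the source image, not the template, and that it moves at every iteration:

List<Point> matches = new ArrayList<>();
Mat delta = new Mat();
int nx = sourceGrayed.cols() - templateGrayed.cols();   // horizontal sliding positions
int ny = sourceGrayed.rows() - templateGrayed.rows();   // vertical sliding positions
for (int x = 0; x < nx; x++) {
    for (int y = 0; y < ny; y++) {
        // move the ROI over the *source* image at every iteration
        Rect R = new Rect(x, y, templateGrayed.cols(), templateGrayed.rows());
        Mat Ir = sourceGrayed.submat(R);
        Core.absdiff(templateGrayed, Ir, delta);        // delta = |T - I(R)|
        Core.MinMaxLocResult mm = Core.minMaxLoc(delta);
        if (mm.maxVal == 0) {                           // exact match at the top-left of the ROI
            matches.add(new Point(x, y));               // record (x, y), not mm.maxLoc
        }
    }
}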

This works!!

jas ( 2017-07-05 16:34:48 -0600 )

For UPDATE3, suppose I would like to use the template matching methods CCOEFF_NORMED/CCORR_NORMED. I would be using maxVal: if it's a proper match then maxVal is 1.0, and 0.0 if it's not a good match. Are there any other changes I should make when trying to find the next match or drawing the rectangle?

jas ( 2017-07-11 14:08:24 -0600 )

If "exact match" means Template ~= Image(ROI), then TM_CCOEFF/TM_CCORR is a waste of time and resources because of the more complex calculations they require.

If "exact match" means the exact location of the template, each method provides the best location. But if Template ~ Image(ROI) (only similar), the location is not perfect due to image differences. In these cases TM_CCOEFF/TM_CCORR offer better robustness.

Finally, for multiple matches you have to clean the scores around the current match. Read: set the scores to a value outside your threshold. With TM_SQDIFF use any value higher than your threshold; for TM_CCOEFF/TM_CCORR use any value lower.

NORMED versions provide output in a fixed range -1..0..1. See my answers here http://answers.opencv.org/question/51486 and http://answers.opencv.org/question/63587

pklab ( 2017-07-12 03:03:59 -0600 )
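To illustrate that last point, a rough, untested Java sketch of the UPDATE3 loop adapted to TM_CCOEFF_NORMED (the 0.9 threshold is just an example value, not from the thread):

Mat resultMatrix = new Mat();
Imgproc.matchTemplate(source, template, resultMatrix, Imgproc.TM_CCOEFF_NORMED);

double threshold = 0.9;                         // accept matches scoring above this (1.0 = perfect)
while (true)
{
    MinMaxLocResult mmr = Core.minMaxLoc(resultMatrix);
    if (mmr.maxVal < threshold)
        break;                                  // no more good matches

    Point matchLoc = mmr.maxLoc;                // with CCOEFF/CCORR the best score is the maximum
    // clean scores around the current match with a value LOWER than the threshold
    Imgproc.rectangle(resultMatrix,
        new Point(matchLoc.x - template.cols()/2, matchLoc.y - template.rows()/2),
        new Point(matchLoc.x + template.cols()/2, matchLoc.y + template.rows()/2),
        new Scalar(threshold - 1), -1);
    // draw a rectangle around the match on the source image
    Imgproc.rectangle(source,
        matchLoc,
        new Point(matchLoc.x + template.cols(), matchLoc.y + template.rows()),
        new Scalar(0, 255, 0), 1);
}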

answered 2017-07-07 04:57:14 -0600 by jas

updated 2017-07-09 10:30:41 -0600

I'm unable to figure out a proper threshold value (this is an experimental value) in order to get the coordinates of the image on the screen multiple times, so I can't use matchTemplate/TM_SQDIFF. I guess I'll have to stick with the approach you have given.

Having said that, I'm facing another issue. Consider the attached images as my template image and source image respectively (attached: Template, Source).

With your approach I'm able to identify the images on the right: the top-right has index 1 and the bottom-right has index 2.

The image on the left is a bit different from the others (I was able to see the difference only when I zoomed in to 800%!). I have highlighted the differences (attached: Difference).

Ideally, I should be able to identify all 3 images. Is there a way I can identify the image on the left as well with the current approach? I would like to know if there's some way we can smooth out the differences before doing the matching.

UPDATE:

Tried your suggestion. Unable to get the coordinates of the other similar images on the screen (also tried other thresholding methods: TRUNC, BINARY). Able to get the coordinates of the top-right image.

Imgproc.matchTemplate(sourceGrayed, templateGrayed, resultMatrix, Imgproc.TM_SQDIFF_NORMED);
// Core.normalize(resultMatrix, resultMatrix, 0, 1, Core.NORM_MINMAX, -1);
Imgproc.threshold(resultMatrix, resultMatrix, 0, 1, Imgproc.THRESH_TOZERO);

while (true) {
    MinMaxLocResult mmr = Core.minMaxLoc(resultMatrix);
    Point matchLoc = mmr.minLoc;
    if (mmr.minVal <= 1.0) {
        Imgproc.rectangle(templateGrayed, matchLoc,
                new Point(matchLoc.x + sourceGrayed.cols(), matchLoc.y + sourceGrayed.rows()),
                new Scalar(0, 255, 0));
        Imgproc.rectangle(resultMatrix, matchLoc,
                new Point(matchLoc.x + sourceGrayed.cols(), matchLoc.y + sourceGrayed.rows()),
                new Scalar(0, 255, 0), -1);

        mmrList.add(mmr);

Comments

@jas The 1st requirement in my answer is that "exact" means T - I = 0. Your image isn't exact but similar, hence the difference is not 0... c'mon!

If you are looking for very close similarity you can use CV_TM_SQDIFF_NORMED. It always returns a value between 0...1, where 0 = exact match and 1 = completely different. This helps to set a threshold like 0.1 or 0.05... a lower threshold -> higher sensitivity.

pklab ( 2017-07-07 13:14:38 -0600 )

@pklab I'm sorry I didn't check the dissimilarity of the image in the first place. My bad!

jas ( 2017-07-08 13:22:29 -0600 )

@jas

  1. cv::threshold must not be used here because it cuts off all the matching scores!
  2. Iterate while minValue < epsilon, where epsilon is a small value.
  3. For each iteration, clean the scores around the match... BUT the matching point must be at the centre of your cleaning rectangle.

Check UPDATE2 in my answer

pklab ( 2017-07-09 11:00:11 -0600 )

I'm using OpenCV version 3.1.0, which has these Imgproc.rectangle methods:

1) (Mat img, Point pt1, Point pt2,Scalar color,int thickness, int lineType, int shift)

2) (Mat img, Point pt1, Point pt2,Scalar color,int thickness)

3) (Mat img, Point pt1, Point pt2,Scalar color)

I'm unable to utilise your solution because of this.

jas ( 2017-07-09 13:23:55 -0600 )

I think you are able to switch from Rect/Size to pt1/pt2. Please check UPDATE3, also for the right cleaning value.

pklab ( 2017-07-10 04:59:48 -0600 )

Just to get a bit more clarity: in case I need to identify only the "exact" matches, is this the only way? "Although the code above works (on 1-channel images), it is really inefficient. The matchTemplate/CV_TM_SQDIFF way is much faster and more general."

jas ( 2017-07-11 00:24:31 -0600 )
