OpenCV 3.1 Stitch images in order they were taken

asked 2016-09-21 by skm

I am building an Android app to create panoramas. The user captures a set of images and those images are sent to my native stitch function that was based on https://github.com/opencv/opencv/blob.... Since the images are in order, I would like to match each image only to the next image in the vector.

I found an Intel article that was doing exactly that with the following code:

vector<MatchesInfo> pairwise_matches;
BestOf2NearestMatcher matcher(try_gpu, match_conf);

// Mask out everything except consecutive pairs (i, i+1), so each image
// is matched only against the next image in the vector.
Mat matchMask(features.size(), features.size(), CV_8U, Scalar(0));
for (int i = 0; i < num_images - 1; ++i)
{
    matchMask.at<char>(i, i + 1) = 1;
}
matcher(features, pairwise_matches, matchMask);
matcher.collectGarbage();

The problem is, this won't compile. I'm guessing it's because I'm using OpenCV 3.1. Then I found somewhere that this code would do the same:

int range_width = 2; // with range_width = 2, image i is only matched against image i+1
BestOf2NearestRangeMatcher matcher(range_width, try_cuda, match_conf);
matcher(features, pairwise_matches);
matcher.collectGarbage();

And for most of my samples this works fine. However, sometimes, especially when I'm stitching a large set of images (around 15), some objects appear on top of each other and in places they shouldn't. I've also noticed that the "beginning" (left side) of the end result is not the first image in the vector either, which is strange.
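
I suspect the Mat-based mask snippet fails to build because the matchers in OpenCV 3.x take the mask as a cv::UMat rather than a Mat. Something along these lines might compile under 3.1 (untested on my side, so treat it as a sketch; features, num_images, try_cuda and match_conf are the same variables as above):

vector<MatchesInfo> pairwise_matches;
BestOf2NearestMatcher matcher(try_cuda, match_conf);

Mat matchMask((int)features.size(), (int)features.size(), CV_8U, Scalar(0));
for (int i = 0; i < num_images - 1; ++i)
{
    matchMask.at<uchar>(i, i + 1) = 1; // allow only the pair (i, i+1)
}

matcher(features, pairwise_matches, matchMask.getUMat(ACCESS_READ));
matcher.collectGarbage();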

I am using "orb" as features_type and "ray" as ba_cost_func. Seems like I cant use SURF on OpenCV 3.1. The rest of my initial parameters look like this:

bool try_cuda = false;
double compose_megapix = -1; //keeps resolution for final panorama
float match_conf = 0.3f; //0.3 default for orb
string ba_refine_mask = "xxxxx";
bool do_wave_correct = true;
WaveCorrectKind wave_correct = detail::WAVE_CORRECT_HORIZ;
int blend_type = Blender::MULTI_BAND;
float blend_strength = 5;

double work_megapix = 0.6;
double seam_megapix = 0.08;
float conf_thresh = 0.5f;
int expos_comp_type = ExposureCompensator::GAIN_BLOCKS;
string seam_find_type = "dp_colorgrad";
string warp_type = "spherical";

So could anyone enlighten me as to why this is not working and how I should match my features? Any help or direction would be much appreciated!

TL;DR: I want to stitch images in the order they were taken, but the code above is not working for me. How can I do that?


Comments

"Seems like I can't use SURF on OpenCV 3.1." You have to enable OPENCV_ENABLE_NONFREE in CMake.

LBerger (2016-09-21)

But is that a module that is supposed to be non-free, meaning we can't freely use it?

skm (2016-09-21)
You are right; if you want to sell your application on a store, you have to pay to use SURF.

LBerger (2016-09-21)

Then that is not an option. Anyway, that would not solve the issue, I guess.

skm (2016-09-22)

1 answer

answered 2016-10-11 by skm

So I found out that the issue here is not the order in which the images are stitched, but rather the rotations that are estimated for the camera parameters by the homography-based estimator and the ray bundle adjuster.

Those rotation angles are estimated assuming a purely self-rotating camera, while my use case involves a user rotating the camera around themselves (which means there is some translation too). Because of that (I guess), the horizontal angles (around the Y axis) are highly overestimated, so the algorithm considers that the set of images covers 360 degrees or more, which results in overlapping areas that shouldn't overlap.
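
For reference, this is roughly the estimation stage I am talking about, simplified from stitching_detailed.cpp (features, pairwise_matches and conf_thresh are the variables from my question; a sketch, not my exact code):

HomographyBasedEstimator estimator;
vector<CameraParams> cameras;
estimator(features, pairwise_matches, cameras); // rotations from pairwise homographies

for (size_t i = 0; i < cameras.size(); ++i)
{
    Mat R;
    cameras[i].R.convertTo(R, CV_32F); // bundle adjustment expects float rotations
    cameras[i].R = R;
}

Ptr<BundleAdjusterBase> adjuster = makePtr<BundleAdjusterRay>(); // ba_cost_func == "ray"
adjuster->setConfThresh(conf_thresh);
(*adjuster)(features, pairwise_matches, cameras); // refines the rotations (and focals)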

I still haven't found a solution to that problem, though.

