
skm's profile - activity

2016-11-18 04:07:59 -0600 received badge  Enthusiast
2016-10-11 05:52:14 -0600 asked a question OpenCV stitching - Bundle Ray Adjuster adjusts total rotation to > 360º when it shouldn't

Hi OpenCV community!

I am building an Android app that lets the user take panorama photos horizontally up to 360 degrees. I am currently using a version of the detailed example at: https://github.com/opencv/opencv/blob....

Since I can use the smartphone's sensors to get the camera parameters, I don't need homography estimation; I simply create the rotation matrix from those parameters. The problem I am having is that these sensor readings are subject to slight drift and other noise. I want to optimise those parameters using the ray-based bundle adjuster available in OpenCV. The problem is that when I do that, it increases the rotation angles a lot, and if the sum of those rotations reaches 360 degrees or more (when it should not), the end result has overlapping areas where it shouldn't.
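For reference, this is roughly how I build the rotation matrix from the sensor angles. The axis order and signs here are my own assumption (they depend on the device's sensor conventions), so treat it as a sketch rather than the exact code:

```cpp
#include <cmath>

// Compose R = Ry(yaw) * Rx(pitch) * Rz(roll). The axis order and signs are
// an assumption here; they depend on the phone's sensor coordinate
// conventions, so adapt them to your device.
static void rotationFromAngles(double yaw, double pitch, double roll,
                               double R[3][3]) {
    const double cy = std::cos(yaw),   sy = std::sin(yaw);
    const double cp = std::cos(pitch), sp = std::sin(pitch);
    const double cr = std::cos(roll),  sr = std::sin(roll);

    // Elementary rotations about Y (yaw), X (pitch), and Z (roll).
    double Ry[3][3] = {{cy, 0, sy}, {0, 1, 0}, {-sy, 0, cy}};
    double Rx[3][3] = {{1, 0, 0}, {0, cp, -sp}, {0, sp, cp}};
    double Rz[3][3] = {{cr, -sr, 0}, {sr, cr, 0}, {0, 0, 1}};

    // tmp = Rx * Rz
    double tmp[3][3];
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            tmp[i][j] = 0.0;
            for (int k = 0; k < 3; ++k) tmp[i][j] += Rx[i][k] * Rz[k][j];
        }
    // R = Ry * tmp
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            R[i][j] = 0.0;
            for (int k = 0; k < 3; ++k) R[i][j] += Ry[i][k] * tmp[k][j];
        }
}
```

The resulting matrix is what I feed into the camera parameters before running the adjuster.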

I am wondering whether this is caused by the translation the user introduces while capturing, and what I can do to solve it. Has anyone faced these problems, or any input that could point me in the right direction? Any help would be highly appreciated!

2016-10-11 05:47:42 -0600 received badge  Scholar (source)
2016-10-11 05:47:21 -0600 answered a question OpenCV 3.1 Stitch images in order they were taken

So I found out that the issue here is not the order in which the images are stitched, but rather the rotation that is estimated for the camera parameters in the homography-based estimator and the ray-based bundle adjuster.

Those rotation angles are estimated assuming a camera that rotates about its own centre, whereas my use case involves a user rotating the camera (which means there will be some translation too). Because of that (I guess), the horizontal angles (around the Y axis) are highly overestimated, so the algorithm considers the set of images to cover 360 degrees or more, which results in some overlapped areas that shouldn't be overlapped.
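As a diagnostic (my own sketch, not an OpenCV API), I check how far the estimated yaws actually span: extract the yaw of each camera from its rotation matrix, unwrap the successive differences, and sum them. If the total exceeds 2*pi, the adjuster has stretched the panorama past a full turn. Extracting yaw as atan2(R[0][2], R[2][2]) assumes the rotation is close to a pure rotation about the Y axis (small pitch/roll):

```cpp
#include <cmath>
#include <vector>

// Yaw of a rotation matrix, assuming R ~ Ry(yaw) (small pitch/roll).
static double yawFromR(const double R[3][3]) {
    return std::atan2(R[0][2], R[2][2]);
}

// Sum of unwrapped successive yaw differences. A result above 2*pi means
// the estimated cameras span more than 360 degrees.
static double totalYawSpan(const std::vector<double>& yaws) {
    const double PI = 3.14159265358979323846;
    double total = 0.0;
    for (int i = 1; i < (int)yaws.size(); ++i) {
        double d = yaws[i] - yaws[i - 1];
        while (d > PI)   d -= 2.0 * PI;  // unwrap into (-pi, pi]
        while (d <= -PI) d += 2.0 * PI;
        total += d;
    }
    return total;
}
```

Running this on the cameras before and after bundle adjustment makes the overestimation visible, though it doesn't fix it.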

Still haven't found a solution to that problem, though.

2016-10-11 05:24:02 -0600 commented answer Utilise known extrinsic parameters when stitching panoramas

Hi emiswelt, did you succeed? I am trying to do a similar thing; the difference is that I only need to reach 360º horizontally. But if I use the ray-based bundle adjuster, the rotation for each camera is greatly increased, and if the total reaches 360º the end result has overlapped areas where it should not. I am wondering if this is because of the translation introduced while capturing, and what I can do to solve it. My use case also uses a smartphone and involves a user rotating the camera, but the camera translates a bit too, of course, since the user holds it in front of their face :P

2016-09-22 04:43:31 -0600 commented question OpenCV 3.1 Stitch images in order they were taken

Then that is not an option. Anyway, that would not solve the issue, I guess.

2016-09-21 11:36:55 -0600 commented question OpenCV 3.1 Stitch images in order they were taken

But is that a module that is supposed to be non-free, meaning we can't freely use it?

2016-09-21 07:45:07 -0600 asked a question OpenCV 3.1 Stitch images in order they were taken

I am building an Android app to create panoramas. The user captures a set of images, and those images are sent to my native stitching function, which is based on https://github.com/opencv/opencv/blob.... Since the images are in order, I would like to match each image only to the next image in the vector.

I found an Intel article that did just that with the following code:

vector<MatchesInfo> pairwise_matches;
BestOf2NearestMatcher matcher(try_gpu, match_conf);
Mat matchMask(features.size(), features.size(), CV_8U, Scalar(0));
for (int i = 0; i < num_images - 1; ++i)
{
    matchMask.at<char>(i, i + 1) = 1;
}
matcher(features, pairwise_matches, matchMask);
matcher.collectGarbage();

Problem is, this won't compile. I'm guessing it's because I'm using OpenCV 3.1. Then I found somewhere that this code would do the same:

int range_width = 2;
BestOf2NearestRangeMatcher matcher(range_width, try_cuda, match_conf);
matcher(features, pairwise_matches);
matcher.collectGarbage();

And for most of my samples this works fine. However, sometimes, especially when I'm stitching a large set of images (around 15), some objects appear on top of each other, in places they shouldn't. I've also noticed that the "beginning" (left side) of the end result is not the first image in the vector either, which is strange.

I am using "orb" as features_type and "ray" as ba_cost_func. It seems I can't use SURF on OpenCV 3.1. The rest of my initial parameters look like this:

bool try_cuda = false;
double compose_megapix = -1; //keeps resolution for final panorama
float match_conf = 0.3f; //0.3 default for orb
string ba_refine_mask = "xxxxx";
bool do_wave_correct = true;
WaveCorrectKind wave_correct = detail::WAVE_CORRECT_HORIZ;
int blend_type = Blender::MULTI_BAND;
float blend_strength = 5;

double work_megapix = 0.6;
double seam_megapix = 0.08;
float conf_thresh = 0.5f;
int expos_comp_type = ExposureCompensator::GAIN_BLOCKS;
string seam_find_type = "dp_colorgrad";
string warp_type = "spherical";

So could anyone enlighten me as to why this is not working and how I should match my features? Any help or direction would be much appreciated!

TL;DR: I want to stitch images in the order they were taken, but the code above is not working for me. How can I do that?
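For anyone answering: here is a dependency-free sketch of the matching pattern I am after, with a plain 0/1 array standing in for the Mat mask in the Intel snippet (no OpenCV needed to see the idea; the function name is mine):

```cpp
#include <utility>
#include <vector>

// Enumerate the (i, j) pairs a consecutive-only mask would allow: an N x N
// 0/1 mask with only the (i, i+1) entries set, mirroring the matchMask in
// the Intel snippet above.
static std::vector<std::pair<int, int>> allowedPairs(int numImages) {
    std::vector<std::vector<unsigned char>> mask(
        numImages, std::vector<unsigned char>(numImages, 0));
    for (int i = 0; i < numImages - 1; ++i)
        mask[i][i + 1] = 1;  // match image i only against image i + 1

    std::vector<std::pair<int, int>> pairs;
    for (int i = 0; i < numImages; ++i)
        for (int j = 0; j < numImages; ++j)
            if (mask[i][j]) pairs.emplace_back(i, j);
    return pairs;
}
```

So for N images I expect exactly N - 1 matched pairs, (0,1) through (N-2, N-1), instead of the full all-pairs matching the default matcher does.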