Bernardo_Biesseck's profile - activity

2018-09-06 02:28:47 -0600 received badge  Notable Question (source)
2017-06-12 08:34:00 -0600 received badge  Popular Question (source)
2015-03-03 20:41:31 -0600 commented question Out of Memory (but I have too much memory!)

Using "sysctl -w vm.overcommit_ratio = 100" on Ubuntu's terminal has not worked, I think you're right berak, when Mat::push_back is executed it tries (and needs) to double the size of Mat object. Probably this is my problem!

Now I'm trying another solution: preallocate more memory than necessary and remove the excess later with Mat::pop_back()! I create a new Mat with Mat::zeros(30000000, 128, CV_32F) at the beginning and use Mat::row(x).copyTo(Mat::row(y)) to copy the SIFT descriptors into the right rows. I hope it works now!!!
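
In code, the idea looks roughly like this (a minimal sketch, not my exact program; the row count and variable names are illustrative):

    // Preallocate the whole matrix once: 30,000,000 * 128 * 4 bytes ~ 15 GB
    cv::Mat all_descriptors = cv::Mat::zeros(30000000, 128, CV_32F);
    int next_row = 0;

    // ... inside the per-image loop, after computing that image's descriptors in `desc` ...
    for (int i = 0; i < desc.rows; i++) {
        desc.row(i).copyTo(all_descriptors.row(next_row));
        next_row++;
    }

    // After the last image, drop the unused rows from the bottom.
    all_descriptors.pop_back(all_descriptors.rows - next_row);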

I ran the htop command in Ubuntu's terminal and my program starts out using 16 GB of memory and doesn't increase any more. That's fine, but I still can't understand it: even if the program doubled this space it would be 32 GB, much less than 50 GB!

2015-03-03 14:45:24 -0600 commented question Out of Memory (but I have too much memory!)

I printed the dimensions: my cv::Mat object had 19,572,392 rows x 128 cols and the program crashed when it tried to add one more row with push_back(). Its type is CV_32F. My calculation is: 19,572,392 rows * 128 cols * 4 bytes = 10,021,064,704 bytes = 9.33 GB.

I ran "sysctl -w vm.overcommit_ratio = 100" for trying to make Ubuntu allows my program to use more memory. I'm going to wait and see what happen. Thanks!!!

2015-03-03 12:41:00 -0600 commented question Out of Memory (but I have too much memory!)

I'm sorry, I meant to write "thirty thousand"; it's corrected now, thanks! Some images are 18 KB, others 100 KB, others 158 KB... well, they aren't big. The whole dataset is 1.2 GB on disk. But I'm not working with the original images (pixels): I've described each one as a set of keypoints, so I'm working with SIFT descriptors. My problem isn't too little memory, my machine has 50 GB; I thought it could be the size of the stack or heap.

2015-03-03 12:35:03 -0600 received badge  Editor (source)
2015-03-03 10:40:52 -0600 asked a question Out of Memory (but I have too much memory!)

Hi everyone, I got an "out of memory" problem: my program needs to read and describe almost 30,000 (yes, thirty thousand!) images using the SIFT descriptor. For each image I find the keypoints, describe them in a num_keypoints x 128 cv::Mat object, and concatenate that onto another, big matrix with the push_back() function. When my program gets to roughly the 26,000th image it crashes with an "out of memory" error!
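
Roughly, the per-image loop looks like this (a simplified sketch, not my exact code; I'm showing it with the xfeatures2d SIFT interface, and the helper names are illustrative):

    #include <opencv2/opencv.hpp>
    #include <opencv2/xfeatures2d.hpp>
    #include <string>
    #include <vector>

    cv::Mat describeAll(const std::vector<std::string> &image_paths)
    {
        cv::Ptr<cv::Feature2D> sift = cv::xfeatures2d::SIFT::create();
        cv::Mat all_descriptors;                       // grows with every image
        for (const std::string &path : image_paths) {
            cv::Mat img = cv::imread(path, cv::IMREAD_GRAYSCALE);
            std::vector<cv::KeyPoint> keypoints;
            cv::Mat desc;                              // num_keypoints x 128, CV_32F
            sift->detectAndCompute(img, cv::noArray(), keypoints, desc);
            all_descriptors.push_back(desc);           // reallocates as the matrix grows
        }
        return all_descriptors;
    }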

I know that's a lot of images, but what surprises me is that I'm running this program on a cluster machine with Ubuntu, almost 50 GB of memory and 16 processors (a virtual machine on Microsoft Azure), so the problem ISN'T a lack of physical memory, I'm sure of that. I ran the "htop" command and my program hadn't even used 10% of the total available memory.

Why am I getting this "out of memory" error? Should I allocate the cv::Mat object using the malloc() function instead?