Out of Memory (but I have too much memory!)
Hi everyone, I'm getting an "out of memory" error. My program needs to read and describe almost 30.000 (yes, thirty thousand!) images using the SIFT descriptor. For each image I read, I find the keypoints, describe them in a num_keypoints x 128 cv::Mat, and concatenate that onto one big matrix with push_back(). When my program reaches roughly the 26.000th image it crashes with an "out of memory" error!
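For reference, the loop is roughly like the sketch below (paths and variable names are placeholders, and I'm assuming SIFT from the opencv_contrib xfeatures2d module, so adjust the include/namespace to your build):

    #include <opencv2/opencv.hpp>
    #include <opencv2/xfeatures2d.hpp>   // SIFT lives here in opencv_contrib builds
    #include <vector>
    #include <string>

    // Sketch of the pipeline: detect SIFT keypoints per image, describe them as a
    // num_keypoints x 128 CV_32F matrix, and append the rows to one big Mat.
    cv::Mat describeAll(const std::vector<std::string>& imagePaths)
    {
        cv::Ptr<cv::Feature2D> sift = cv::xfeatures2d::SIFT::create();
        cv::Mat allDescriptors;                    // grows to total_keypoints x 128

        for (const std::string& path : imagePaths)
        {
            cv::Mat img = cv::imread(path, cv::IMREAD_GRAYSCALE);
            if (img.empty()) continue;

            std::vector<cv::KeyPoint> keypoints;
            cv::Mat descriptors;                   // num_keypoints x 128, CV_32F
            sift->detectAndCompute(img, cv::noArray(), keypoints, descriptors);

            allDescriptors.push_back(descriptors); // concatenate onto the big matrix
        }
        return allDescriptors;
    }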
I know there are a lot of images, but to my surprise I'm running this program on a cluster with Ubuntu, almost 50 GB of memory and 16 processors (a virtual machine in Microsoft Azure), so the problem ISN'T memory, I'm sure of it. I ran the "htop" command and my program hadn't used even 10% of the total available memory.
Why am I getting this "out of memory" error? Should I allocate the cv::Mat object with the "malloc()" function instead?
30.000 != three thousand... And no, you shall not use malloc! It is so C... How big are your images? 30000 * 4M = 120000M = 120G...
I'm sorry, I meant to write "thirty thousand"; it's corrected now, thanks! Some images are 18 KB, others 100 KB, others 158 KB... they aren't big. The whole dataset is 1.2 GB on disk. But I'm not working with the original images (pixels); I've described each one as a set of keypoints, so I'm working with SIFT descriptors. My problem isn't too little memory, my computer has 50 GB; I thought it could be the size of the stack or heap.
It can still happen. Mat::push_back() works pretty much like a std::vector: if it runs out of preallocated space, it tries to double its capacity, giving you nasty spikes in the memory profile.
Also, my calculation would be 30000 images * 128 * 4 bytes for SIFT * num_keypoints * 2 for the reallocation. You'll still get close to, or even beyond, 50 GB like this.
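If that is the cause, one rough workaround (just a sketch; the 20,000,000-row estimate is a guess you would tune to the expected total keypoint count) is to reserve row capacity once, so push_back() appends in place instead of doubling and copying:

    #include <opencv2/core.hpp>
    #include <vector>

    // Sketch: reserve row capacity up front so Mat::push_back() never reallocates.
    cv::Mat concatenateWithReserve(const std::vector<cv::Mat>& perImageDescriptors)
    {
        cv::Mat all(0, 128, CV_32F);   // 0 rows, 128 cols, float
        all.reserve(20000000);         // capacity is given in rows, not bytes

        for (const cv::Mat& d : perImageDescriptors)
            all.push_back(d);          // no reallocation while all.rows stays below capacity
        return all;
    }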
I printed the dimensions: my cv::Mat object had 19.572.392 rows x 128 cols, and the program crashed when it tried to add one more row with push_back(). Its type is CV_32F. My calculation is: 19.572.392 rows * 128 cols * 4 bytes = 10.021.064.704 bytes = 9.33 GB.
I ran "sysctl -w vm.overcommit_ratio = 100" for trying to make Ubuntu allows my program to use more memory. I'm going to wait and see what happen. Thanks!!!
Using "sysctl -w vm.overcommit_ratio = 100" on Ubuntu's terminal has not worked, I think you're right berak, when Mat::push_back is executed it tries (and needs) to double the size of Mat object. Probably this is my problem!
Now I'm trying another solution: preallocate more memory than necessary and remove the excess later with Mat::pop_back()! I create a new Mat with Mat::zeros(30000000, 128, CV_32F) at the beginning and use row(x).copyTo(row(y)) to copy the SIFT descriptors into the right rows. I hope it works now!!!
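In code, that approach looks roughly like this (the 30,000,000-row upper bound and the names are placeholders; each element of perImageDescriptors stands for one image's num_keypoints x 128 CV_32F descriptor matrix):

    #include <opencv2/core.hpp>
    #include <vector>

    // Sketch of the preallocate-then-trim approach: allocate the big Mat once,
    // copy each descriptor row into its slot, then drop the unused rows.
    cv::Mat concatenatePreallocated(const std::vector<cv::Mat>& perImageDescriptors)
    {
        cv::Mat all = cv::Mat::zeros(30000000, 128, CV_32F);   // allocated once, ~15 GB
        int nextRow = 0;

        for (const cv::Mat& descriptors : perImageDescriptors)
            for (int i = 0; i < descriptors.rows; ++i)
                descriptors.row(i).copyTo(all.row(nextRow++)); // copy into place, no reallocation

        all.pop_back(all.rows - nextRow);                      // trim the excess rows at the end
        return all;
    }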
I ran the htop command in Ubuntu's terminal: my program starts off using 16 GB of memory and doesn't increase any more. That's OK, but I still can't understand it: even if the program doubled this space it would be 32 GB, much less than 50 GB!
Yes, and 32 + 16 = 48, which is very close to 50: first it creates the new space (32 GB), and then it copies the first matrix into the first part of the second, so you'd have 48 GB in use just for that Mat...