Reading a large number of images
I have a dataset containing 101,000 images that I want to use for an image classification task, but I run out of memory whenever I try to read them all using imread. Is there a way to do this? I am using Python.
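One common way around this is to read and preprocess the images lazily, in batches, so that only a small slice of the dataset is ever in memory. Below is a minimal sketch using OpenCV and NumPy; the directory layout, batch size, and target image size are assumptions for illustration and should be adapted to the actual dataset:

```python
import os
import cv2
import numpy as np

def image_batches(image_dir, filenames, batch_size=100, size=(64, 64)):
    """Yield batches of images instead of loading all 101,000 at once."""
    for start in range(0, len(filenames), batch_size):
        batch = []
        for name in filenames[start:start + batch_size]:
            img = cv2.imread(os.path.join(image_dir, name))
            if img is None:  # skip unreadable or missing files
                continue
            img = cv2.resize(img, size)
            batch.append(img.astype(np.float32) / 255.0)
        if batch:
            yield np.stack(batch)

# usage: only one batch is held in memory at a time
# for batch in image_batches("images", all_filenames):
#     process(batch)
```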
Use a neural network, and train it consecutively with small image batches?
But won't my previous model training be overwritten every time I use a different batch?
There's an UPDATE flag that can be passed to the train method on the 2nd (and subsequent) runs; I will make a more detailed post tomorrow.
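Ahead of the more detailed post, here is a minimal sketch of incremental training with OpenCV's ANN_MLP from Python. The layer sizes and the driver loop are assumptions; the key point is passing cv2.ml.ANN_MLP_UPDATE_WEIGHTS on the second and later calls to train, so the existing weights are updated rather than re-initialized:

```python
import cv2
import numpy as np

# a tiny MLP for illustration: 4 inputs, one hidden layer of 8, 3 output classes
nn = cv2.ml.ANN_MLP_create()
nn.setLayerSizes(np.array([4, 8, 3], dtype=np.int32))
nn.setActivationFunction(cv2.ml.ANN_MLP_SIGMOID_SYM)
nn.setTrainMethod(cv2.ml.ANN_MLP_BACKPROP)
nn.setTermCriteria((cv2.TERM_CRITERIA_MAX_ITER | cv2.TERM_CRITERIA_EPS, 100, 1e-4))

def train_batch(samples, responses, first_batch):
    # samples: float32 rows; responses: float32 one-hot rows
    data = cv2.ml.TrainData_create(samples, cv2.ml.ROW_SAMPLE, responses)
    # on the 2nd and later batches, UPDATE_WEIGHTS continues training
    # from the current weights instead of starting over
    flags = 0 if first_batch else cv2.ml.ANN_MLP_UPDATE_WEIGHTS
    nn.train(data, flags)

# hypothetical driver: feed the batches one after the other
# for i, (x, y) in enumerate(batches):
#     train_batch(x, y, first_batch=(i == 0))
```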
Here is some code where noisy images are fed into the neural network:
https://github.com/sjhalayka/opencv_i...
@sjhalayka, why not make a complete answer here, mentioning (and explaining the use of) ANN_MLP::TrainFlags::UPDATE_WEIGHTS and so on?
@Akash... how about you generate a smaller set of images, zip them up, and upload them to GitHub?
Also, while you're at it, make a text file with each line containing an image name and its classification. How many classes are there in total?
Once I have that I can write you a full example.
Thank you for your assistance, sjhalayka. I have 101 classes, each with 1,000 images. https://github.com/akashjoshi123/Imag... is the link; it is a very small subset of my dataset, and the meta folder has all 101 class labels of the original dataset.
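For reference, a minimal sketch of how such a labels file could be parsed and turned into one-hot response rows for ANN_MLP; the "image_name class_name" per-line format and the file name labels.txt are assumptions, not the repository's actual layout:

```python
import numpy as np

def read_labels(path="labels.txt"):
    """Parse a hypothetical 'image_name class_name' file, one pair per line."""
    filenames, class_names = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:
                filenames.append(parts[0])
                class_names.append(parts[1])
    return filenames, class_names

def one_hot(class_names):
    """Map class names to one-hot float32 rows (101 classes on the full dataset)."""
    classes = sorted(set(class_names))
    index = {c: i for i, c in enumerate(classes)}
    out = np.zeros((len(class_names), len(classes)), dtype=np.float32)
    for row, name in enumerate(class_names):
        out[row, index[name]] = 1.0
    return out
```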
Alright. In the meantime, check out the source code that I linked to.
Also, have you tried writing your own C++ code?
Ahh, too bad... you're using Python. I have no idea how to write a neural network in Python. Sorry about that!
Thank you for your valuable time, @sjhalayka. I'm extremely sorry that I missed the Python tag while posting the question.