Hi all,
I am working with a set of .tif images to which I have to apply a denoising algorithm. I found OpenCV's fastNlMeansDenoisingMulti() function and it works very well.
The problem is the time the function needs to process my whole data set. I ran some tests and it takes 0.4-0.7 seconds per frame per unit of temporal window size. With 100 frames and a temporal window size of 3, I need, in the best case, 0.4 s * 3 * 100 frames = 120 seconds. At 100 fps, 100 frames are just 1 second of acquisition, so I cannot wait 2 minutes. A full acquisition of 2000 frames would need between 2400 and 4200 seconds. That is way too much time.
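Here is roughly how I am calling it at the moment (a simplified sketch; the file pattern and the h / window parameters are placeholders for my actual values):

```python
import glob
import cv2

# Load the .tif stack as 8-bit grayscale (placeholder path/pattern)
files = sorted(glob.glob("acquisition/*.tif"))
frames = [cv2.imread(f, cv2.IMREAD_GRAYSCALE) for f in files]

temporal_window = 3            # frames used around each target frame (must be odd)
half = temporal_window // 2

denoised = []
for i in range(half, len(frames) - half):
    # Denoise frame i using its temporal neighbours in the stack
    out = cv2.fastNlMeansDenoisingMulti(
        frames, i, temporal_window,
        h=10, templateWindowSize=7, searchWindowSize=21
    )
    denoised.append(out)
```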
Is there a way, or a "trick", to reduce this processing time? Does it depend on my PC (i7-8750H @ 2.2 GHz, 16 GB RAM)? Unfortunately I don't know exactly which noise I have in my images; there is more than one kind, and it is not white (unluckily).
Thanks, Giulio