# Efficient way to apply a minimum filter

Hi,

I am trying to implement a function which takes an image (type: CV_64FC3) and applies two operations to it:

1. Each pixel is replaced by the minimum of its three channel values, resulting in a CV_64FC1 image.
2. A minimum filter of size blockSize is applied.

The function works as required, but it takes approximately 1600 ms to process a single image of resolution 720 x 576 with a blockSize of 15. My question is whether I can do anything to make it more computationally efficient, particularly the nested loops. Code:

```cpp
Mat darkChannel(Mat im, int blockSize)
{
    int padSize = (blockSize - 1) / 2;
    double minVal;

    Mat temp, minChannel, borderMinChannel, bgr[3];

    split(im, bgr);                        // split into B, G and R channels
    cv::min(bgr[0], bgr[1], temp);         // element-wise minimum of B and G
    cv::min(temp, bgr[2], minChannel);     // ...then of that result and R

    // pad the single-channel image so every pixel has a full neighbourhood
    copyMakeBorder(minChannel, borderMinChannel, padSize, padSize,
                   padSize, padSize, BORDER_REPLICATE);

    Mat dc = Mat::zeros(im.rows, im.cols, CV_64FC1); // Mat to store final result
    double* p;

    for (int i = 0; i < im.rows; i++)
    {
        p = dc.ptr<double>(i);
        for (int j = 0; j < im.cols; j++)
        {
            // find the minimum value in the blockSize x blockSize window
            minMaxLoc(borderMinChannel(Rect(j, i, blockSize, blockSize)),
                      &minVal, NULL);
            p[j] = minVal; // put the minimum value in the result Mat
        }
    }

    return dc;
}
```



Maybe you can use erode instead of searching for the minimum yourself.

(2016-06-06 02:41:00 -0500)

Yep. Your min of the three channels looks fine, but the second step is the morphological erode operation, for which OpenCV has a dedicated function, documented HERE. I am happy to note that it does accept CV_64F images.

(2016-06-06 20:19:01 -0500)

Thank you @LBerger and @Tetragramm. cv::erode works fine! I also noticed the speed difference between images of type CV_64FC3 and CV_8UC3, so I think it would be computationally better to keep my images in CV_8UC3.

(2016-06-06 23:29:02 -0500)