
vzamboni's profile - activity

2016-09-08 02:26:30 -0600 received badge  Enthusiast
2016-09-06 10:55:25 -0600 received badge  Critic (source)
2016-09-06 09:39:28 -0600 received badge  Editor (source)
2016-09-06 08:53:51 -0600 asked a question ZCA implementation in java

Hello everyone,

I'm trying to implement the ZCA whitening algorithm as shown here: http://stackoverflow.com/questions/31... with OpenCV in Scala (using the Java API), but I cannot find most of the functions used there (Python with numpy).

So far I have tried this:

    // Covariance matrix
    val covar, mean = new Mat()
    Core.calcCovarMatrix(input, covar, mean, Core.COVAR_NORMAL | Core.COVAR_ROWS)
    Core.divide(covar, new Scalar(input.rows - 1), covar)

    // Singular Value Decomposition
    val w, u, vt = new Mat()
    Core.SVDecomp(covar, w, u, vt)

    // Whitening constant; prevents division by zero
    val epsilon = 1e-5

To implement the last transformation

      ZCAMatrix = np.dot(U, np.dot(np.diag(1.0/np.sqrt(S + epsilon)), U.T))

I tried with:

    var ZCAMatrix = new Mat()
    Core.add(w, new Scalar(epsilon), ZCAMatrix)                      // S + epsilon
    Core.sqrt(ZCAMatrix, ZCAMatrix)                                  // sqrt(S + epsilon)
    Core.divide(1.0, ZCAMatrix, ZCAMatrix)                           // 1 / sqrt(S + epsilon)
    Core.gemm(Mat.diag(ZCAMatrix), u.t, 1, new Mat, 0, ZCAMatrix)    // diag(...) * U^T
    Core.gemm(ZCAMatrix, u, 1, new Mat, 0, ZCAMatrix)                // previous result * U

The result of this transformation is: this

which is not exactly what it's supposed to be. Can someone help?
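
To make the goal explicit, this is the product I am trying to compute, spelled out as individual OpenCV calls building on the w, u and epsilon above (the intermediate names dInv, tmp and zca are only for illustration, and I have not verified this against real data):

    // Target: ZCAMatrix = U * diag(1 / sqrt(S + epsilon)) * U^T
    val dInv = new Mat()
    Core.add(w, new Scalar(epsilon), dInv)        // S + epsilon
    Core.sqrt(dInv, dInv)                         // sqrt(S + epsilon)
    Core.divide(1.0, dInv, dInv)                  // 1 / sqrt(S + epsilon)

    val tmp = new Mat()
    Core.gemm(Mat.diag(dInv), u.t(), 1.0, new Mat(), 0.0, tmp)   // diag(...) * U^T
    val zca = new Mat()
    Core.gemm(u, tmp, 1.0, new Mat(), 0.0, zca)                  // U * (diag(...) * U^T)

In particular, I am not sure the order of the two gemm calls in my attempt matches this.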

2016-07-15 09:04:16 -0600 commented question memory leak in java CascadeClassifier

Me neither, but memory fills up if I execute detectMultiScale many times. I have to call System.gc manually to free memory and avoid crashes, but I would like to avoid that. Releasing objects_mat would solve the problem.

2016-07-15 05:10:05 -0600 asked a question memory leak in java CascadeClassifier

Hi everyone,

I'm facing a memory-related issue with the Java implementation of CascadeClassifier. I'm reading frames from a video and running detectMultiScale on every Mat object obtained, but memory usage increases quickly until memory is full and the program crashes.

I have faced similar problems every time I use a Mat object: I need to release it (through the mat.release() function) when it is no longer needed, otherwise memory usage keeps increasing until the program crashes.
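
For context, the loop looks roughly like this, assuming OpenCV 3.x where VideoCapture lives in org.opencv.videoio (the file paths are placeholders and the detection results are used elsewhere):

    import org.opencv.core.{Mat, MatOfRect}
    import org.opencv.objdetect.CascadeClassifier
    import org.opencv.videoio.VideoCapture

    // Simplified sketch of the detection loop; paths are placeholders.
    val cascade = new CascadeClassifier("haarcascade_frontalface_default.xml")
    val capture = new VideoCapture("video.avi")
    val frame = new Mat()

    while (capture.read(frame)) {
      val objects = new MatOfRect()
      cascade.detectMultiScale(frame, objects)
      // ... use objects.toArray for this frame ...
      objects.release() // I release the Mats I create myself
    }

    frame.release()
    capture.release()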

Looking at the CascadeClassifier class, I can see that the detectMultiScale function is:

    public void detectMultiScale(Mat image, MatOfRect objects)
    {
        Mat objects_mat = objects;
        detectMultiScale_1(nativeObj, image.nativeObj, objects_mat.nativeObj);

        return;
    }

which instantiates a Mat (objects_mat) object that is never released and generates a memory leak.

I tried to extend the CascadeClassifier class with my own implementation of detectMultiScale, but the method detectMultiScale_1 is private and cannot be referenced.

In my opinion, with the following version we can avoid the memory leak in this method:

    public void detectMultiScale(Mat image, MatOfRect objects)
    {
        detectMultiScale_1(nativeObj, image.nativeObj, objects.nativeObj);

        return;
    }

since we do not create a new Mat every time we perform a detection.

What do you think? Is there a simpler way to do it? (Probably calling System.gc would solve the problem, but it's not an optimal solution.)
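
For completeness, the stop-gap I use now is roughly a helper like the one below, called once per processed frame (maybeCollect and the interval of 100 are arbitrary names and values, just to illustrate the idea):

    // Hypothetical helper: trigger a GC pass every `interval` calls so that
    // finalizers of unreferenced Mats get a chance to free their native memory.
    object GcHelper {
      private var calls = 0
      def maybeCollect(interval: Int = 100): Unit = {
        calls += 1
        if (calls % interval == 0) System.gc()
      }
    }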