
VCS_DEV's profile - activity

2021-07-01 03:23:27 -0600 received badge  Notable Question (source)
2018-08-24 04:48:40 -0600 received badge  Popular Question (source)
2016-11-07 06:38:25 -0600 asked a question Camera opencv pixelated and lower fps

I need a quick way to find a camera resolution that gets close to 30 fps while the preview does not look pixelated. I'm using OpenCV 2.4.13.1 on Android, with the CameraBridgeViewBase and PortraitCameraView classes.

I have tried hardcoding a resolution much smaller than the device's maximum, for example 720x1280. But I ran into problems on some devices: even though 720x1280 appears in the device's list of supported resolutions, I end up with a different, smaller resolution such as 720x720. And I still cannot get the fps to an acceptable value (without lag).

mPortraitCameraView.setMaxFrameSize(720, 1280);
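
A minimal sketch of one way to avoid hardcoding the size, assuming the underlying android.hardware.Camera parameters can be reached (for example from a JavaCameraView/PortraitCameraView subclass); the class and method names here are illustrative, not part of the original code:

import android.hardware.Camera;
import java.util.List;

// Illustrative helper: pick a supported preview size close to a target
// instead of hardcoding 720x1280, so devices that do not offer that exact
// size still end up with a sensible resolution.
public final class PreviewSizePicker {

    // Returns the supported preview size whose pixel count is closest to the target.
    public static Camera.Size pickClosestSize(Camera.Parameters params,
                                              int targetWidth, int targetHeight) {
        List<Camera.Size> sizes = params.getSupportedPreviewSizes();
        long targetArea = (long) targetWidth * targetHeight;
        Camera.Size best = sizes.get(0);
        long bestDiff = Long.MAX_VALUE;
        for (Camera.Size s : sizes) {
            long diff = Math.abs((long) s.width * s.height - targetArea);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }

    // Returns the supported fps range whose upper bound is closest to 30 fps
    // (ranges are expressed in milli-fps, e.g. 30000 == 30 fps).
    public static int[] pickFpsRangeNear30(Camera.Parameters params) {
        int[] best = null;
        for (int[] range : params.getSupportedPreviewFpsRange()) {
            if (best == null
                    || Math.abs(range[Camera.Parameters.PREVIEW_FPS_MAX_INDEX] - 30000)
                       < Math.abs(best[Camera.Parameters.PREVIEW_FPS_MAX_INDEX] - 30000)) {
                best = range;
            }
        }
        return best;
    }
}

The size chosen this way can then be passed to setMaxFrameSize. Note that the OpenCV view treats those arguments as maximum width and height (in the camera's landscape orientation) and still negotiates the final size from the device's own list, which may explain why a hardcoded (720, 1280) falls back to something like 720x720.
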
2016-06-16 06:55:30 -0600 commented question Write and read Mat descriptor in SQLite

Yes, the idea is to convert the float data to a byte array so it can be saved in SQLite, then read the value back from the database and rebuild the CV_32F matrix!

2016-06-14 19:42:24 -0600 asked a question Write and read Mat descriptor in SQLite

I'm new to OpenCV and I need to store Mats of type CV_32F in SQLite. I found the following post and applied its idea, but it returns an error.

// Mat type is CV_32F

long matBytes = matDescriptor.total() * matDescriptor.elemSize();
byte[] bytes = new byte[(int) matBytes];
matDescriptor.get(0, 0, bytes);     // --> error line

Error: java.lang.UnsupportedOperationException: Mat data type is not compatible: 5

How can I write and read a Mat in this format?
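
For reference, the exception happens because Mat.get(row, col, byte[]) only accepts 8-bit Mats; a CV_32F Mat has to be read as floats first. A minimal sketch of the float-to-byte round trip described in the comment above (the MatBlob helper name is illustrative, and rows/cols would have to be stored next to the BLOB):

import org.opencv.core.CvType;
import org.opencv.core.Mat;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Illustrative helpers: serialize a CV_32F Mat to a byte[] (e.g. for a SQLite
// BLOB column) and rebuild it afterwards.
public final class MatBlob {

    public static byte[] toBytes(Mat descriptor) {
        float[] floats = new float[(int) descriptor.total() * descriptor.channels()];
        descriptor.get(0, 0, floats);                       // float access is allowed for CV_32F
        ByteBuffer buffer = ByteBuffer.allocate(floats.length * 4)
                .order(ByteOrder.LITTLE_ENDIAN);
        buffer.asFloatBuffer().put(floats);
        return buffer.array();
    }

    public static Mat fromBytes(byte[] blob, int rows, int cols) {
        float[] floats = new float[blob.length / 4];
        ByteBuffer.wrap(blob).order(ByteOrder.LITTLE_ENDIAN).asFloatBuffer().get(floats);
        Mat descriptor = new Mat(rows, cols, CvType.CV_32F);
        descriptor.put(0, 0, floats);                       // rebuild the CV_32F matrix
        return descriptor;
    }
}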

2016-06-14 10:52:42 -0600 asked a question Recording and reading keypoints data file on Android

I'm trying to write the keypoint data from SURF on Android so that I can later read it back and run matching operations. I tried each of the approaches below, and each one returns an error:

Trying to save the Mat to JSON (http://stackoverflow.com/questions/95...). Error:

UnsupportedOperationException: Mat data type is not compatible: 5

Trying to hardcode the data in a Java variable:

private static final double[][] boxDescriptor = new double[][]{
            {-0.0028871582, -0.0017856462, 0.0078000603, 0.003144495, 0.0042561297, -0.026877755, 0.019066211, 0.050337251, -0.00062063418, -0.022234244, 0.016169874, 0.04401375, 0.0013359544, -0.0018637108,...} };

Error:

Error: code too large
Error: too many constants

Trying to save YML using C++:

FileStorage fsWrite(fileName, FileStorage::WRITE);
write(fsWrite, "test", keypoints); //--error line
fsWrite.release();

Error: Fatal signal 6 (SIGABRT), code -6 in tid 23802 Signal = SIGABRT (signal SIGABRT) t = {std::type_info *}
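
Since the OpenCV Java bindings do not expose FileStorage, one workaround is to persist the keypoint fields manually, for example to a small binary file. A sketch, assuming the OpenCV 3.x bindings where KeyPoint lives in org.opencv.core (the KeyPointFile name is illustrative):

import org.opencv.core.KeyPoint;
import org.opencv.core.MatOfKeyPoint;

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Illustrative keypoint persistence without FileStorage: each KeyPoint is
// written out as its public fields and reconstructed on read.
public final class KeyPointFile {

    public static void save(MatOfKeyPoint keypoints, File file) throws IOException {
        KeyPoint[] array = keypoints.toArray();
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(file))) {
            out.writeInt(array.length);
            for (KeyPoint kp : array) {
                out.writeFloat((float) kp.pt.x);
                out.writeFloat((float) kp.pt.y);
                out.writeFloat(kp.size);
                out.writeFloat(kp.angle);
                out.writeFloat(kp.response);
                out.writeInt(kp.octave);
                out.writeInt(kp.class_id);
            }
        }
    }

    public static MatOfKeyPoint load(File file) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
            KeyPoint[] array = new KeyPoint[in.readInt()];
            for (int i = 0; i < array.length; i++) {
                array[i] = new KeyPoint(in.readFloat(), in.readFloat(), in.readFloat(),
                        in.readFloat(), in.readFloat(), in.readInt(), in.readInt());
            }
            MatOfKeyPoint keypoints = new MatOfKeyPoint();
            keypoints.fromArray(array);
            return keypoints;
        }
    }
}

The CV_32F descriptor Mat itself could be stored separately with the same float-to-byte conversion shown in the SQLite question above.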

2016-06-10 07:38:59 -0600 received badge  Scholar (source)
2016-06-10 07:38:54 -0600 answered a question SURF image matcher with distorted results

I fixed it by changing how the frame data received from the camera is handled and how the result Mat and the operations are set up.

@Override
    public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {

        Camera.Parameters parameters = mCamera.getParameters();

        Camera.Size size = parameters.getPreviewSize();
        previewSizeHeight = size.height;
        previewSizeWidth = size.width;

        // Set the camera preview size
        parameters.setPreviewSize(previewSizeWidth, previewSizeHeight);

        imageFormat = parameters.getPreviewFormat();

        mCamera.setParameters(parameters);

        mCamera.startPreview();

        mCamera.setPreviewCallback(new Camera.PreviewCallback() {

            public void onPreviewFrame(final byte[] data, final Camera camera) {

                synchronized (this) {
                    totalFrames++;

                    if (imageFormat == ImageFormat.NV21) {

                        //We only accept the NV21(YUV420) format.
                        if (!bProcessing) {

                            if(matFrameCamera != null) {
                                matFrameCamera.release();
                            }

                            matFrameCamera = new Mat(previewSizeHeight + (previewSizeHeight / 2), previewSizeWidth, CvType.CV_8UC1);

                            totalProcFrames++;

                            matFrameCamera.put(0, 0, data);
                            mRgba2Gray = new Mat();
                            Imgproc.cvtColor(matFrameCamera, mRgba2Gray, Imgproc.COLOR_YUV2RGBA_NV21, 4);
                            mHandler.post(doImageProcessing);

                        }
                    }

                    this.notify();
                }

            }
        });
    }

    private Runnable doImageProcessing = new Runnable() {
        public void run() {

            if (!stop) {
                bProcessing = true;

                Imgproc.resize(mRgba2Gray, mRgba2Gray, new Size(mRgba2Gray.cols() / 3, mRgba2Gray.rows() / 3));
                Bitmap bitmapCameraPortrait = Bitmap.createBitmap(mRgba2Gray.cols(), mRgba2Gray.rows(),
                        Bitmap.Config.ARGB_8888);
                Utils.matToBitmap(mRgba2Gray, bitmapCameraPortrait);
                bitmap = scaleDown(bitmapCameraPortrait, mRgba2Gray.height() / 3, true);
                Utils.bitmapToMat(bitmapCameraPortrait, mRgba2Gray);
                Imgproc.cvtColor(mRgba2Gray, mRgba2Gray, Imgproc.COLOR_BGRA2GRAY);

                Mat matImageMatcher = new Mat();

                SurfMatcher(mRgba2Gray.getNativeObjAddr(), matTarget.getNativeObjAddr(),
                        matImageMatcher.getNativeObjAddr());

                int newWidthBitmap = bitmapCameraPortrait.getWidth() +
                        (matImageMatcher.cols() - bitmapCameraPortrait.getWidth());
                int newHeightBitmap = bitmapCameraPortrait.getHeight() +
                        (matImageMatcher.rows() - bitmapCameraPortrait.getHeight());

                bitmapCameraPortrait = Bitmap.createScaledBitmap(bitmapCameraPortrait, newWidthBitmap,
                        newHeightBitmap, true);

                Utils.matToBitmap(matImageMatcher, bitmapCameraPortrait);

                bitmap = bitmapCameraPortrait;

                ((Activity) context).runOnUiThread(new TimerTask() {
                    @Override
                    public void run() {
                        ivCameraPreview.setImageBitmap(bitmap);
                    }
                });

                stop = true;

                bProcessing = false;

            }

        }

    };
2016-06-07 12:47:40 -0600 asked a question SURF image matcher with distorted results

I have the following class, which uses the Android camera to capture frames and match them against another image using OpenCV 3.1 with SURF.

public class MyCameraPreview implements SurfaceHolder.Callback, Camera.PreviewCallback {

    private Camera mCamera = null;
    private ImageView ivCameraPreview = null;
    private int[] pixels;
    private Mat matFrameCamera;
    private Mat matFrameOutputCamera;
    private Mat mRgba2Gray;
    private int imageFormat;
    private int previewSizeWidth;
    private int previewSizeHeight;
    private boolean bProcessing = false;

    private int totalFrames = 0;
    private int totalProcFrames = 0;

    private Mat matTarget;
    private Bitmap bmpFixedImage;
    private Bitmap bitmap;

    private Handler mHandler = new Handler(Looper.getMainLooper());
    private Context context;

    private boolean findFeaturesMatch;
    private boolean surfCompletedWithMatcher;
    private boolean stop;

    MyCameraPreview(int previewLayoutWidth, int previewLayoutHeight,
                  ImageView ivCameraPreview, Context context) {

        previewSizeWidth = previewLayoutWidth;
        previewSizeHeight = previewLayoutHeight;
        this.ivCameraPreview = ivCameraPreview;
        this.context = context;

        try {
            matTarget = Utils.loadResource(context, R.drawable.outback, Imgcodecs.CV_LOAD_IMAGE_GRAYSCALE);
        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    @Override
    public void onPreviewFrame(byte[] arg0, Camera arg1) {    }

    void onPause() {
        mCamera.stopPreview();
    }

    @Override
    public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {

        Camera.Parameters parameters = mCamera.getParameters();

        matFrameCamera = new Mat(previewSizeHeight + (previewSizeHeight / 2), previewSizeWidth, CvType.CV_8UC1);
        matFrameOutputCamera = new Mat(previewSizeHeight, previewSizeWidth, CvType.CV_8UC4);

        Camera.Size size = parameters.getPreviewSize();
        previewSizeHeight = size.height;
        previewSizeWidth = size.width;

        // Set the camera preview size
        parameters.setPreviewSize(previewSizeWidth, previewSizeHeight);

        imageFormat = parameters.getPreviewFormat();

        mCamera.setParameters(parameters);

        mCamera.startPreview();

        mCamera.setPreviewCallback(new Camera.PreviewCallback() {

            public void onPreviewFrame(final byte[] data, final Camera camera) {

                synchronized (this) {
                    totalFrames++;

                    if (imageFormat == ImageFormat.NV21) {

                        //We only accept the NV21(YUV420) format.
                        if (!bProcessing) {

                            totalProcFrames++;

                            matFrameCamera.put(0, 0, data);
                            mRgba2Gray = new Mat();
                            Imgproc.cvtColor(matFrameCamera, mRgba2Gray, Imgproc.COLOR_YUV420sp2GRAY);

                            mHandler.post(doImageProcessing);

                        }
                    }

                    this.notify();
                }

            }
        });
    }

    @Override
    public void surfaceCreated(SurfaceHolder arg0) {
        mCamera = Camera.open();
        try {
            // If did not set the SurfaceHolder, the preview area will be black.
            mCamera.setPreviewDisplay(arg0);
            mCamera.setDisplayOrientation(90);
            mCamera.setPreviewCallback(this);
        } catch (IOException e) {
            mCamera.release();
            mCamera = null;
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder arg0) {
        mCamera.setPreviewCallback(null);
        mCamera.stopPreview();
        mCamera.release();
        mRgba2Gray.release();
        matFrameCamera.release();
        matFrameOutputCamera.release();
        matTarget.release();
        mCamera = null;
    }

    //
    // Native JNI
    //
    public native void SurfMatcher(long matAddrFrameCamera, long matAddrFeature,
                                      long matAddrMatch);

    static {
        System.loadLibrary("native");
    }

    private Runnable doImageProcessing = new Runnable() {
        public void run() {

            if(!stop) {
                bProcessing = true;

                int bitmapWidth =  mRgba2Gray.cols() + matTarget.cols();
                int bitmapHeight = mRgba2Gray.rows() > matTarget.rows() ? mRgba2Gray.rows() : matTarget.rows();

                Mat matImageMatcher = new Mat(mRgba2Gray.rows(), bitmapWidth, CvType.CV_8UC1);

                SurfMatcher(mRgba2Gray.getNativeObjAddr(), matTarget.getNativeObjAddr(),
                            matImageMatcher.getNativeObjAddr());

                bitmap = Bitmap.createBitmap(bitmapWidth, bitmapHeight,
                        Bitmap.Config.ARGB_8888);
                Utils.matToBitmap(matImageMatcher, bitmap);

                ivCameraPreview.setImageBitmap(bitmap);

                stop = true;

                bProcessing = false;

            }

        }

    };


}

When the images are matched, the result comes out distorted (the left image is the camera frame and the right is the target image; see the attached screenshot). I made a test showing only the camera preview (the mRgba2Gray variable) and that displays fine.

2016-06-01 05:32:49 -0600 received badge  Enthusiast
2016-05-30 12:22:27 -0600 asked a question Convert Mat to int array and byte array

I need to convert a Mat to an int array and a byte array in C++. How can I do this?

2016-05-27 15:51:27 -0600 asked a question Build OpenCV 3.1 and contrib

I am a beginner and would like to build OpenCV 3.1 together with the contrib modules as .so (shared library) files for Android, so I can use SURF. I have Android Studio 2.2 and Windows 8.1. I cannot find a good post to follow for building these .so files; the posts I keep finding use older OpenCV versions such as 2.4.10 and below.

2016-05-14 15:55:06 -0600 received badge  Editor (source)
2016-05-14 15:45:53 -0600 asked a question ImageView being positioned over the camera preview

I'm having problems trying to position an ImageView over a JavaCameraView. I'm using an ImageView instead of drawing into a Mat to gain performance, since there is a PNG image that has to stay fixed around the eyes. I managed to locate the eye area with OpenCV and can draw it:

Core.rectangle(mRgba, eyeArea.tl(), eyeArea.br(),
                new Scalar(255, 0, 0, 255), 2);

But when I try to position and resize the ImageView, nothing comes out right. Even using the eye-area position, the ImageView ends up smaller and away from the correct location, while the drawing command above renders correctly. I noticed that the JavaCameraView's size differs from the frame it displays; I managed to adjust the ImageView's container to the same size as the frame, but there are still problems with the ImageView...

Fixing the container size and position of the ImageView:

RelativeLayout rlMaskContainer = (RelativeLayout) findViewById(R.id.rlMaskContainer);
rlMaskContainer.getLayoutParams().height = mOpenCvCameraView.dstRect.height();
rlMaskContainer.getLayoutParams().width = mOpenCvCameraView.dstRect.width();
rlMaskContainer.setX(mOpenCvCameraView.dstLeftRect);
rlMaskContainer.setY(mOpenCvCameraView.dstTopRect);

Resizing and positioning the ImageView using the eyeArea Rect:

runOnUiThread(new Runnable() {
    @Override
    public void run() {
        RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(eyeArea.width, eyeArea.height);
        params.leftMargin = eyeArea.x;
        params.topMargin = eyeArea.y;
        ivMask.setLayoutParams(params);
        ivMask.setScaleType(ImageView.ScaleType.FIT_XY);
    }
});

Camera preview (the red area is the container RelativeLayout, the rectangle is the eyeArea Rect): see the attached screenshot.
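
A sketch of the coordinate mapping that is usually missing in this situation, assuming the preview frame is scaled uniformly to fill the container (the helper and parameter names are illustrative): the eyeArea Rect is in camera-frame pixels, while the ImageView is laid out in view pixels, so both the size and the margins have to be scaled by the same factors.

import android.widget.ImageView;
import android.widget.RelativeLayout;

import org.opencv.core.Rect;

// Illustrative mapping from OpenCV frame coordinates to layout coordinates.
// frameWidth/frameHeight are the camera frame dimensions, viewWidth/viewHeight
// the size of the container that actually displays the preview.
public final class EyeOverlayHelper {

    public static void positionMask(ImageView ivMask, Rect eyeArea,
                                    int frameWidth, int frameHeight,
                                    int viewWidth, int viewHeight) {
        float scaleX = (float) viewWidth / frameWidth;
        float scaleY = (float) viewHeight / frameHeight;

        RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
                Math.round(eyeArea.width * scaleX),
                Math.round(eyeArea.height * scaleY));
        params.leftMargin = Math.round(eyeArea.x * scaleX);
        params.topMargin = Math.round(eyeArea.y * scaleY);

        ivMask.setLayoutParams(params);
        ivMask.setScaleType(ImageView.ScaleType.FIT_XY);
    }
}

If the view letterboxes the frame, the offsets of the drawn area (dstLeftRect/dstTopRect in the code above) would also need to be added to the margins.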

2016-04-29 15:16:35 -0600 commented answer How to overlay a png with alpha / transparency on a frame in realtime

Thanks for the reply, but I kept searching to understand it (in code form) and could not turn your answer into a solution...

2016-04-29 12:04:14 -0600 received badge  Supporter (source)
2016-04-28 12:39:29 -0600 asked a question How to overlay a png with alpha / transparency on a frame in realtime

I'm working from the OpenCV for Android 2.4.11 sample that detects faces with the camera. Instead of drawing a rectangle on the detected face, I'm trying to put a mask (a PNG image) over it. But when the image is displayed on the face, the PNG comes out with a black background where there was transparency.

FdActivity.java

public void onCameraViewStarted(int width, int height) {
        mGray = new Mat();
        mRgba = new Mat();

        //Load my mask png
        Bitmap image = BitmapFactory.decodeResource(getResources(), R.drawable.mask_1);

        mask = new Mat();

        Utils.bitmapToMat(image, mask);

}

public Mat onCameraFrame(CvCameraViewFrame inputFrame) {

        mRgba = inputFrame.rgba();
        mGray = inputFrame.gray();

        if (mAbsoluteFaceSize == 0) {
            int height = mGray.rows();
            if (Math.round(height * mRelativeFaceSize) > 0) {
                mAbsoluteFaceSize = Math.round(height * mRelativeFaceSize);
            }
            mNativeDetector.setMinFaceSize(mAbsoluteFaceSize);
        }

        MatOfRect faces = new MatOfRect();

        if (mDetectorType == JAVA_DETECTOR) {
            if (mJavaDetector != null)
                mJavaDetector.detectMultiScale(mGray, faces, 1.1, 2, 2,
                        new Size(mAbsoluteFaceSize, mAbsoluteFaceSize), new Size());
        }
        else if (mDetectorType == NATIVE_DETECTOR) {
            if (mNativeDetector != null)
                mNativeDetector.detect(mGray, faces);
        }
        else {
            Log.e(TAG, "Detection method is not selected!");
        }

        Rect[] facesArray = faces.toArray();


        for (int i = 0; i < facesArray.length; i++) {

              overlayImage(mRgba, mask, facesArray[i]);

        }

        return mRgba;
    }

    public Mat overlayImage(Mat background, Mat foregroundMask, Rect faceRect)
    {
        Mat mask = new Mat();

        Imgproc.resize(this.mask, mask, faceRect.size());

        Mat source = new Mat();
        Imgproc.resize(foregroundMask, source, background.size());

        mask.copyTo( background.submat( new Rect((int) faceRect.tl().x, (int) faceRect.tl().y, mask.cols(), mask.rows())) );

        source.release();
        mask.release();
        return background;
    }
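
The black background appears because copyTo without a mask copies all four channels, so formerly transparent pixels are pasted in as opaque black. One common fix (a sketch, not verified against the code above) is to use the PNG's alpha channel as the mask argument of copyTo:

import java.util.ArrayList;
import java.util.List;

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;

// Illustrative alpha-aware overlay: only pixels where the PNG is opaque are
// copied into the face region, so transparent areas keep the camera frame.
public final class MaskOverlay {

    public static void overlay(Mat frameRgba, Mat maskRgba, Rect faceRect) {
        // Resize the RGBA mask to the detected face rectangle.
        Mat resized = new Mat();
        Imgproc.resize(maskRgba, resized, faceRect.size());

        // Split out the alpha channel and use it as the copy mask.
        List<Mat> channels = new ArrayList<>();
        Core.split(resized, channels);
        Mat alpha = channels.get(3);

        // Copy the mask into the face ROI only where alpha is non-zero.
        Mat roi = frameRgba.submat(faceRect);
        resized.copyTo(roi, alpha);

        for (Mat c : channels) {
            c.release();
        }
        resized.release();
    }
}

copyTo with a mask gives a hard cut-out; for soft edges the pixels would need to be blended with the alpha values instead.
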
2016-04-28 12:38:27 -0600 asked a question Draw on face features using OpenCV

I'm a newbie with OpenCV and would like a step-by-step guide on how to use AAM in Android Studio. How do I create an app with these characteristics?

link 1

link 2

Unfortunately I cannot find on the Internet a step-by-step guide or material with good content to study or base the work on.

2016-04-28 12:38:26 -0600 asked a question OpenCV Android wrong front camera orientation

I'm trying to use the front camera with OpenCV 3.1 (or even 2.4.11), keep the correct orientation for the device, and avoid the image being blurred (stretched) or having any other problem that prevents it from displaying correctly! I have checked various approaches from other posts and none of them work or produce a correct, functional result!

Here are some of my attempts, but there are many others on the Internet that also do not work!

Link 1

Link 2
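
For reference, the rotation calculation from the android.hardware.Camera.setDisplayOrientation() documentation handles the front camera mirroring. A sketch of it, usable when managing the Camera directly rather than through JavaCameraView:

import android.app.Activity;
import android.hardware.Camera;
import android.view.Surface;

// Sketch based on the android.hardware.Camera.setDisplayOrientation() docs:
// computes the preview rotation for the current display orientation and
// compensates for the front camera mirror.
public final class CameraOrientation {

    public static void setCameraDisplayOrientation(Activity activity, int cameraId, Camera camera) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);

        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }

        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;  // compensate for the front camera mirror
        } else {
            result = (info.orientation - degrees + 360) % 360;
        }
        camera.setDisplayOrientation(result);
    }
}

With JavaCameraView, setDisplayOrientation does not affect the Mats delivered to onCameraFrame, so the frame itself would still have to be rotated, for example with Core.transpose plus Core.flip.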