AlexNet Opset6 to Opset7 Eltwise Change

asked 2020-04-24 08:27:12 -0500 by dbrenn

updated 2020-04-24 09:42:32 -0500

Hi,

I have been given two AlexNet ONNX files generated from Matlab: one for Opset6 and the other for Opset7. OpenCV's activation numbers match our Matlab inference for Opset6 but not for Opset7. The only difference between the two models is that the data layer uses Scale vs. Eltwise, respectively.

In tracing through the code, I see that the onnx_importer falls into the "Sub" branch and correctly chooses Eltwise for Opset7, but it never applies the subtraction operation.

I've also seen on the "Supported Layers" page that only Eltwise (+, *, max) is supported. Is there a reason why the onnx_importer cannot be modified to support Eltwise (-)?

Thanks

Edit: It seems like changing one line below would allow Eltwise (-). However, I've only been working with OpenCV DNNs for a few weeks, so I don't know the ramifications. I tested the change, and it matched our Matlab output for Opset7.

    else if (layer_type == "Add" || layer_type == "Sum" || layer_type == "Sub")
    {
        bool isSub = layer_type == "Sub";
        CV_CheckEQ(node_proto.input_size(), 2, "");
        bool is_const_0 = layer_id.find(node_proto.input(0)) == layer_id.end();
        bool is_const_1 = layer_id.find(node_proto.input(1)) == layer_id.end();
        if (is_const_0 && is_const_1)
        {
            Mat blob_0 = getBlob(node_proto, constBlobs, 0);
            Mat blob_1 = getBlob(node_proto, constBlobs, 1);
            CV_Assert(blob_0.size == blob_1.size);
            Mat output = isSub ? (blob_0 - blob_1) : (blob_0 + blob_1);
            constBlobs.insert(std::make_pair(layerParams.name, output));
            continue;
        }
        else if (is_const_0 || is_const_1)
        {
            int const_blob_id = is_const_0 ? 0 : 1;
            Mat blob = getBlob(node_proto, constBlobs, const_blob_id);
            int blob_total = blob.total();
            if (blob_total == 1) {
                layerParams.type = "Power";
                layerParams.set("shift", (isSub ? -1 : 1) * blob.at<float>(0));
            }
            else {
                MatShape inpShape = outShapes[node_proto.input(1 - const_blob_id)];
                if (shape(blob) == inpShape)
                {
                    LayerParams constParams;
                    constParams.name = layerParams.name + "/const";
                    constParams.type = "Const";
                    // Change => negate the constant blob so the Eltwise sum
                    // computes a subtraction, i.e. Eltwise (-)
                    constParams.blobs.push_back((isSub ? -1 : 1) * blob);
                    // was: constParams.blobs.push_back(blob);
                    int id = dstNet.addLayer(constParams.name, constParams.type, constParams);
                    layer_id.insert(std::make_pair(constParams.name, LayerInfo(id, 0)));
                    outShapes[constParams.name] = shape(blob);

                    layerParams.type = "Eltwise";
                    node_proto.set_input(const_blob_id, constParams.name);
                }
                else
                {
                    layerParams.type = "Scale";
                    layerParams.set("bias_term", true);
                    blob = blob.reshape(1, 1);
                    layerParams.blobs.push_back((isSub ? -1 : 1) * blob);
                }
            }
        }
    }