
When I tried to load an FP16 ONNX model I also got an error. What I would try is to use an FP32 version of the model, or convert the model to FP32. After that you can set the OpenCL_FP16 target for inference.

There could be another way, but from what I know this seems to be the only possibility.
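
Something like this is what I had in mind, as a rough Python sketch (the file names are placeholders, and the manual cast only covers the weights and the declared tensor types; a model with Cast nodes or FP16 tensors stored in node attributes would need extra handling):

```python
import numpy as np
import onnx
from onnx import numpy_helper

import cv2

# --- Step 1: convert the FP16 ONNX model to FP32 ---
# "model_fp16.onnx" / "model_fp32.onnx" are placeholder names.
model = onnx.load("model_fp16.onnx")

# Cast every FP16 weight tensor (initializer) to FP32.
for init in model.graph.initializer:
    if init.data_type == onnx.TensorProto.FLOAT16:
        arr32 = numpy_helper.to_array(init).astype(np.float32)
        init.CopyFrom(numpy_helper.from_array(arr32, init.name))

# Update the declared tensor types on the graph inputs, outputs,
# and any recorded intermediate value_info to FP32 as well.
for vi in (list(model.graph.input) + list(model.graph.output)
           + list(model.graph.value_info)):
    tt = vi.type.tensor_type
    if tt.elem_type == onnx.TensorProto.FLOAT16:
        tt.elem_type = onnx.TensorProto.FLOAT

onnx.save(model, "model_fp32.onnx")

# --- Step 2: load the FP32 model in OpenCV and request the OpenCL FP16 target ---
net = cv2.dnn.readNetFromONNX("model_fp32.onnx")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_OPENCV)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_OPENCL_FP16)
```

The idea is that OpenCV loads the FP32 weights without the error, and the DNN_TARGET_OPENCL_FP16 target then lets the backend run the actual inference in half precision.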