I want to perform inference using the dnn module of OpenCV. This is my code:
static cv::Mat inputBlob2;
static std::vector<cv::Mat> outputs2;
// Pack all images into a single 4D NCHW blob: scale to [0,1], resize to modelShape, swap BGR->RGB, no crop.
cv::dnn::blobFromImages(images, inputBlob2, 1.0 / 255.0, modelShape, cv::Scalar(), true, false);
std::cout << "Input Blob Size: " << inputBlob2.size << std::endl;
net.setInput(inputBlob2);
// Run one forward pass and collect every unconnected output layer.
net.forward(outputs2, net.getUnconnectedOutLayersNames());
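For context, this is roughly how net, modelShape and images are set up (a sketch, not my exact code; the model file and image names are placeholders):

// Rough setup sketch (assumed, not verbatim): load the exported ONNX model and two 640x640 test images.
cv::dnn::Net net = cv::dnn::readNetFromONNX("yolov8n.onnx");   // YOLOv8 pretrained weights exported to ONNX
cv::Size modelShape(640, 640);                                  // network input resolution
std::vector<cv::Mat> images;
images.push_back(cv::imread("frame0.jpg"));                     // placeholder file names,
images.push_back(cv::imread("frame1.jpg"));                     // both already 640x640 BGR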
I use YOLOv8's pretrained weights converted to ONNX.
The network expects a 640×640 input, and the images are already 640×640. If the vector contains only one image the program works, but with more than one image I get the following error:
[ERROR:0@...] global net_impl.cpp:1166 getLayerShapesRecursively OPENCV/DNN: [Reshape]:(onnx_node!/model.22/dfl/Reshape): getMemoryShapes() throws exception. inputs=1 outputs=1/1 blobs=0
[ERROR:0@...] global net_impl.cpp:1172 getLayerShapesRecursively input[0] = [ 1 64 42000 ]
[ERROR:0@...] global net_impl.cpp:1176 getLayerShapesRecursively output[0] = [ ]
[ERROR:0@...] global net_impl.cpp:1182 getLayerShapesRecursively Exception message: OpenCV(4.9.0) opt/opencv-4.9.0/modules/dnn/src/layers/reshape_layer.cpp:109: error: (-215:Assertion failed) total(srcShape, srcRange.start, srcRange.end) == maskTotal in function 'computeShapeByReshapeMask'
I understand that the problem lies in the size of the input, but how do I solve it?
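For what it's worth, processing the images one at a time does work; a minimal fallback sketch (using the same net, images and modelShape as above), though I would prefer a single batched forward pass:

// Fallback: one forward pass per image; works, but gives up the batching I am after.
std::vector<std::vector<cv::Mat>> perImageOutputs;
for (const cv::Mat& img : images)
{
    cv::Mat blob;
    cv::dnn::blobFromImage(img, blob, 1.0 / 255.0, modelShape, cv::Scalar(), true, false);
    net.setInput(blob);

    std::vector<cv::Mat> outs;
    net.forward(outs, net.getUnconnectedOutLayersNames());
    perImageOutputs.push_back(outs);
}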