How to get mobile_sam_encoder/decoder.onnx #6
Sorry, I will add the relevant scripts when I have time. For now, you can refer to https://github.com/dinglufe/segment-anything-cpp-wrapper.
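For reference, the wrapper linked above exports the image encoder as a standalone ONNX file. A minimal sketch of that approach, assuming the `mobile_sam` package from the MobileSAM repo is installed and `mobile_sam.pt` is the official checkpoint (the filenames, tensor names, and opset here are assumptions, not the wrapper's exact code):

```python
# Sketch: export MobileSAM's image encoder to its own ONNX file.
# Assumptions: the `mobile_sam` package is installed and `mobile_sam.pt`
# is the official checkpoint; this mirrors the general approach of
# segment-anything-cpp-wrapper, not its exact code.
import torch
from mobile_sam import sam_model_registry

sam = sam_model_registry["vit_t"](checkpoint="mobile_sam.pt")
sam.eval()

# The encoder consumes a preprocessed 1x3x1024x1024 image tensor.
dummy_image = torch.randn(1, 3, 1024, 1024)

torch.onnx.export(
    sam.image_encoder,
    dummy_image,
    "mobile_sam_encoder.onnx",
    input_names=["image"],
    output_names=["image_embeddings"],
    opset_version=17,
)
```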
Have you modified the official ONNX export code? When I export to TensorRT, it reports an error:
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:768: While parsing node number 1209 [Slice -> "/Slice_2_output_0"]:
[04/17/2024-14:59:10] [E] [TRT] ModelImporter.cpp:771: --- End node ---
My TensorRT version is 8.6.0. Can you share your torch and onnx versions?
I converted the ONNX models to a TensorRT engine successfully with this command:
@m-wei you can try these weight files. And @zhudongwork, can you add the code that produces the encoder/decoder ONNX weights from mobile_sam.pt? My TensorRT version is 8.6.1, on Windows 11.
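For anyone who prefers not to use the command-line converter, the same ONNX-to-engine conversion can be sketched with the TensorRT 8.6 Python API (the input and output filenames here are assumptions):

```python
# Sketch: build a TensorRT engine from an exported ONNX file with the
# TensorRT 8.6 Python API. Filenames are assumptions.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("mobile_sam_encoder.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # Parse failures surface node-level errors like the Slice error
        # reported earlier in this thread.
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
serialized = builder.build_serialized_network(network, config)
with open("mobile_sam_encoder.engine", "wb") as f:
    f.write(serialized)
```

Printing the parser errors before raising is useful here because TensorRT otherwise reports only the failing node number, not why it failed.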
@busyyang these weight files work. My problem was that my exported ONNX model came from the official MobileSAM library.
Glad to hear that. I also tried export_onnx_model.py and converted to ONNX successfully, but it does not produce separate encoder and decoder weights.
@busyyang no, that export code produces only the decoder model. The encoder model comes from https://github.com/dinglufe/segment-anything-cpp-wrapper; the model that library exports is named xx_preprocess.onnx, and that is actually the encoder model.
In MobileSAM there is only one weight file, mobile_sam.pt. How can I get the encoder and decoder weights separately? Could you share the script for that?
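A minimal sketch of splitting the single mobile_sam.pt checkpoint into separate encoder and decoder ONNX files. The decoder part wraps the model the way the official export_onnx_model.py does; the `SamOnnxModel` import path, filenames, dummy-input shapes, and opset are assumptions based on MobileSAM mirroring the segment-anything layout:

```python
# Sketch: export encoder and decoder ONNX files from the single
# mobile_sam.pt checkpoint. SamOnnxModel and its module path are
# assumptions based on MobileSAM following the segment-anything layout.
import torch
from mobile_sam import sam_model_registry
from mobile_sam.utils.onnx import SamOnnxModel

sam = sam_model_registry["vit_t"](checkpoint="mobile_sam.pt")
sam.eval()

# 1) Encoder: preprocessed image -> image embeddings.
torch.onnx.export(
    sam.image_encoder,
    torch.randn(1, 3, 1024, 1024),
    "mobile_sam_encoder.onnx",
    input_names=["image"],
    output_names=["image_embeddings"],
    opset_version=17,
)

# 2) Decoder: embeddings + prompts -> masks (what export_onnx_model.py emits).
decoder = SamOnnxModel(sam, return_single_mask=True)
embed_dim = sam.prompt_encoder.embed_dim
embed_size = sam.prompt_encoder.image_embedding_size
dummy_inputs = {
    "image_embeddings": torch.randn(1, embed_dim, *embed_size),
    "point_coords": torch.randint(0, 1024, (1, 5, 2), dtype=torch.float),
    "point_labels": torch.randint(0, 4, (1, 5), dtype=torch.float),
    "mask_input": torch.randn(1, 1, 4 * embed_size[0], 4 * embed_size[1]),
    "has_mask_input": torch.tensor([1], dtype=torch.float),
    "orig_im_size": torch.tensor([1500, 2250], dtype=torch.float),
}
torch.onnx.export(
    decoder,
    tuple(dummy_inputs.values()),
    "mobile_sam_decoder.onnx",
    input_names=list(dummy_inputs.keys()),
    output_names=["masks", "iou_predictions", "low_res_masks"],
    opset_version=17,
)
```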