This repository was archived by the owner on Oct 16, 2019. It is now read-only.
Thanks a lot for the Android project the author posted; it saved me the time of compiling from source.
I use the Caffe2 MobileNetV1-SSD model for object detection and it works. However, the model takes about 2.2 s to infer one image on my Android phone (admittedly with a weak CPU, a Qualcomm Snapdragon 616), whereas the same model takes about 500 ms per image on Windows 7 on CPU.
Does caffe2::Predictor run multithreaded? If not, how can I configure it? Is there any other way to improve performance on Android?
Thanks a lot if someone could help me with this.
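One common speedup on ARM is to switch the convolution (and related) operators to Caffe2's NNPACK engine before creating the Predictor; in the Python tooling this means setting `op.engine = "NNPACK"` on the ops in the loaded `predict_net`. The sketch below shows the rewrite idea using plain dicts standing in for the NetDef op list, so it runs without caffe2 installed — the op-type set and the claim that this helps on your specific phone are assumptions, not verified for Snapdragon 616:

```python
# Sketch: rewrite the engine field of ops that NNPACK can accelerate.
# With caffe2 installed you would iterate predict_net.op (a NetDef proto)
# and set op.engine = "NNPACK" the same way before building the Predictor.

# Op types assumed to have NNPACK implementations (an assumption; check
# your Caffe2 build).
NNPACK_OPS = {"Conv", "Relu", "MaxPool", "AveragePool"}

def use_nnpack(ops):
    """Set engine='NNPACK' on every op whose type NNPACK accelerates."""
    for op in ops:
        if op["type"] in NNPACK_OPS:
            op["engine"] = "NNPACK"
    return ops

# Stand-in for predict_net.op from a MobileNet-SSD NetDef.
predict_net_ops = [
    {"type": "Conv", "engine": ""},
    {"type": "Relu", "engine": ""},
    {"type": "Softmax", "engine": ""},
]
use_nnpack(predict_net_ops)
print([op["engine"] for op in predict_net_ops])  # → ['NNPACK', 'NNPACK', '']
```

NNPACK itself uses an internal thread pool sized to the device's big cores, which is typically where the multithreading gain comes from on Android; building Caffe2 with `-DUSE_NNPACK=ON` is required for the engine to be available.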