8000 Hope to support ONNX Runtime (Training version & Inferencing version) and DirectML · Issue #149 · CuriosAI/sai · GitHub
Hope to support ONNX Runtime (Training version & Inferencing version) and DirectML #149
Open
@Looong01

Description

Hope to support ONNX Runtime (Training version & Inferencing version) and DirectML.

Using them as a back end could speed up both the training and the inference process.

ONNX Runtime runs on any OS and supports many kinds of GPU, including CUDA (NVIDIA), ROCm (AMD), oneDNN (Intel), Metal (Apple M1), and other devices, as shown in the support matrix below. Its performance should also be much better than OpenCL's.
[image: ONNX Runtime execution-provider support matrix]
https://github.com/microsoft/onnxruntime
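As a rough sketch of how such a backend might pick among ONNX Runtime's execution providers at startup (assuming the Python `onnxruntime` package; the `pick_provider` helper and the preference order are illustrative, not part of sai or of onnxruntime's API):

```python
# Sketch: select the best available ONNX Runtime execution provider.
# PREFERRED order and pick_provider are illustrative assumptions.

PREFERRED = [
    "CUDAExecutionProvider",    # NVIDIA GPUs
    "ROCMExecutionProvider",    # AMD GPUs
    "DmlExecutionProvider",     # DirectML (any Windows GPU)
    "CoreMLExecutionProvider",  # Apple devices
    "CPUExecutionProvider",     # always-available fallback
]

def pick_provider(available, preferred=PREFERRED):
    """Return the first preferred provider that is actually available."""
    for p in preferred:
        if p in available:
            return p
    return "CPUExecutionProvider"

if __name__ == "__main__":
    try:
        import onnxruntime as ort
        available = ort.get_available_providers()
    except ImportError:
        available = ["CPUExecutionProvider"]  # onnxruntime not installed
    provider = pick_provider(available)
    print(provider)
    # A network would then be loaded roughly as:
    # sess = ort.InferenceSession("model.onnx", providers=[provider])
```

The fallback chain means the same code path works on NVIDIA, AMD, Intel, and Apple hardware, falling back to the CPU provider when no GPU provider is installed.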

DirectML supports any kind of GPU on Windows, and the cost of migrating code to it is much lower than for ONNX Runtime. Its performance should also be better than OpenCL's.
https://github.com/microsoft/DirectML
