# wonnx


Wonnx aims to run blazing-fast AI on any device.

## Supported Platforms (enabled by wgpu)

| API    | Windows                          | Linux & Android    | macOS & iOS        |
| ------ | -------------------------------- | ------------------ | ------------------ |
| Vulkan | :white_check_mark:               | :white_check_mark: |                    |
| Metal  |                                  |                    | :white_check_mark: |
| DX12   | :white_check_mark: (W10 only)    |                    |                    |
| DX11   | :construction:                   |                    |                    |
| GLES3  |                                  | :ok:               |                    |

:white_check_mark: = First Class Support — :ok: = Best Effort Support — :construction: = Unsupported, but support in progress

## Getting Started

```bash
git clone https://github.com/haixuanTao/wonnx.git
```

```bash
cargo run --example squeeze --release
```

To run a model from scratch:

```bash
pip install -U pip && pip install onnx-simplifier
python -m onnxsim mnist-8.onnx opt-mnist.onnx
```

```bash
cargo run --example mnist --release
```

## To use

```rust
use std::collections::HashMap;

async fn execute_gpu() -> Vec<f32> {
    // USER INPUT
    let n: usize = 512 * 512 * 128;
    let mut input_data = HashMap::new();
    let data = vec![-1.0f32; n];
    input_data.insert("x", data.as_slice());

    let mut session = wonnx::Session::from_path("examples/data/models/single_relu.onnx")
        .await
        .unwrap();

    wonnx::run(&mut session, input_data).await.unwrap()
}
```
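The model loaded above, `single_relu.onnx`, applies an element-wise ReLU. As a sanity check on what the GPU session should return, here is a CPU-only sketch of the same computation and input layout, using only the standard library (no wonnx calls; the `relu` helper and the small input are ours for illustration, not part of the wonnx API):

```rust
use std::collections::HashMap;

// CPU reference for the element-wise ReLU that single_relu.onnx computes on the GPU.
fn relu(input: &[f32]) -> Vec<f32> {
    input.iter().map(|x| x.max(0.0)).collect()
}

fn main() {
    // Same input layout as the wonnx example: tensor name -> f32 slice.
    let data = vec![-1.0f32, 0.0, 2.5, -3.0];
    let mut input_data: HashMap<&str, &[f32]> = HashMap::new();
    input_data.insert("x", data.as_slice());

    let output = relu(input_data["x"]);
    println!("{:?}", output); // [0.0, 0.0, 2.5, 0.0]
}
```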

Examples are available in the `examples` folder.

## Test

```bash
cargo test
```

## Test WASM (not yet implemented)

```bash
export RUSTFLAGS=--cfg=web_sys_unstable_apis
wasm-pack test --node
```

## Language interface

Aiming to be widely usable through: