# rust_llama.cpp


Rust bindings for LLama.cpp.

These bindings are largely based on https://github.com/go-skynet/go-llama.cpp/

## Building Locally

Note: This repository uses git submodules to keep track of LLama.cpp.

Clone the repository locally (with submodules, so LLama.cpp is pulled in):

```bash
git clone --recurse-submodules https://github.com/mdrokz/rust-llama.cpp
```

Then build:

```bash
cargo build
```

## Usage

```toml
[dependencies]
llama_cpp_rs = "0.1.2"
```

```rs
use llama_cpp_rs::{
    options::{ModelOptions, PredictOptions},
    LLama,
};

fn main() {
    let model_options = ModelOptions::default();

    let llama = LLama::new(
        "../wizard-vicuna-13B.ggmlv3.q4_0.bin".into(),
        &model_options,
    )
    .unwrap();

    let predict_options = PredictOptions {
        // Called once per generated token; return true to keep generating.
        token_callback: Some(Box::new(|token| {
            println!("token1: {}", token);

            true
        })),
        ..Default::default()
    };

    llama
        .predict(
            "what are the national animals of india".into(),
            predict_options,
        )
        .unwrap();
}
```
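The `token_callback` above is a boxed closure that receives each token as it is generated and returns a `bool` (returning `true` continues generation). Because it is a `Box<dyn Fn>`, sharing mutable state with it takes an `Arc<Mutex<..>>`. Below is a minimal self-contained sketch of that pattern; the `drive` and `collect` helpers are hypothetical stand-ins for `LLama::predict` and are not part of the crate:

```rs
use std::sync::{Arc, Mutex};

// Shape of the crate's token callback: one token in, `true` to continue.
type TokenCallback = Box<dyn Fn(String) -> bool>;

// Hypothetical driver standing in for `LLama::predict`: feeds tokens to the
// callback until the callback asks to stop.
fn drive(tokens: &[&str], callback: TokenCallback) {
    for t in tokens {
        if !callback(t.to_string()) {
            break;
        }
    }
}

// Collect streamed tokens into one String through shared state.
fn collect(tokens: &[&str]) -> String {
    let output = Arc::new(Mutex::new(String::new()));
    let sink = Arc::clone(&output);
    drive(
        tokens,
        Box::new(move |token| {
            sink.lock().unwrap().push_str(&token);
            true // return false here to cut generation short
        }),
    );
    let result = output.lock().unwrap().clone();
    result
}

fn main() {
    // prints "The Bengal tiger"
    println!("{}", collect(&["The ", "Bengal ", "tiger"]));
}
```

The same `Arc<Mutex<String>>` approach works with the real `PredictOptions { token_callback, .. }` shown above when you want the full response as a single string instead of printing tokens as they stream.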

## TODO

## LICENSE

MIT