# rust-llama.cpp

LLama.cpp Rust bindings.

The Rust bindings are mostly based on https://github.com/go-skynet/go-llama.cpp/.
Note: This repository uses git submodules to keep track of LLama.cpp.
Clone the repository locally:
```bash
git clone --recurse-submodules https://github.com/mdrokz/rust-llama.cpp
```
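If the repository was already cloned without `--recurse-submodules`, the LLama.cpp sources can still be fetched afterwards with a standard git command (not specific to this crate):

```shell
# Inside an existing clone: fetch and check out the llama.cpp submodule.
git submodule update --init --recursive
```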
Build the bindings:

```bash
cargo build
```
To use the bindings in your own project, add the crate to your `Cargo.toml`:

```toml
[dependencies]
llama_cpp_rs = "0.1.0"
```
A minimal example that loads a model and streams generated tokens through a callback:

```rs
use llama_cpp_rs::{
    options::{ModelOptions, PredictOptions},
    LLama,
};

fn main() {
    let model_options = ModelOptions::default();

    let llama = LLama::new(
        "../wizard-vicuna-13B.ggmlv3.q4_0.bin".into(),
        &model_options,
    )
    .unwrap();

    let mut predict_options = PredictOptions {
        token_callback: Some(Box::new(|token| {
            println!("token1: {}", token);
            true
        })),
        ..Default::default()
    };

    llama
        .predict(
            "what are the national animals of india".into(),
            &mut predict_options,
        )
        .unwrap();
}
```
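The `token_callback` receives each generated token as it is produced and returns a `bool`; returning `false` stops generation early. The callback contract can be sketched in isolation (the driver function `run_with_callback` below is hypothetical, for illustration only, and not part of the crate):

```rust
// Hypothetical driver mimicking how a Fn(String) -> bool token callback
// is consumed: each token is passed to the callback, and generation
// stops as soon as the callback returns false.
fn run_with_callback(tokens: Vec<&str>, callback: Box<dyn Fn(String) -> bool>) -> usize {
    let mut emitted = 0;
    for t in tokens {
        emitted += 1;
        if !callback(t.to_string()) {
            break; // callback returned false: stop early
        }
    }
    emitted
}

fn main() {
    // Stop as soon as a token containing "stop" is seen.
    let n = run_with_callback(
        vec!["hello", "stop", "world"],
        Box::new(|token| !token.contains("stop")),
    );
    println!("{}", n); // prints "2": "hello" and "stop" were emitted before stopping
}
```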
## LICENSE

MIT