OpenAI API for Rust


A community-maintained library that provides a simple and convenient way to interact with the OpenAI API, with no complex async machinery or redundant dependencies.

API

Check the official API reference.

| API | Support |
|---|---|
| Models | ✔️ |
| Completions | ✔️ |
| Chat | ✔️ |
| Edits | ✔️ |
| Images | ✔️ |
| Embeddings | ✔️ |
| Audio | ✔️ |
| Files | ❌ |
| Fine-tunes | ❌ |
| Moderations | ❌ |
| Engines | ❌ |


Usage

Add the following to your Cargo.toml file:

```toml
[dependencies]
openai_api_rust = "0.1.8"
```

Export your API key as an environment variable:

```bash
export OPENAI_API_KEY=<your_api_key>
```
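For example, to set the key for the current shell session and confirm it is visible to child processes (the key value below is a placeholder, not a real key):

```shell
# Export a placeholder key for the current shell session (substitute your real key).
export OPENAI_API_KEY="sk-example-key"
# Verify it is visible to child processes such as your compiled Rust binary.
printenv OPENAI_API_KEY
```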

Then use the crate in your Rust code:

```rust
use openai_api_rust::*;
use openai_api_rust::chat::*;
use openai_api_rust::completions::*;

fn main() {
    // Load the API key from the OPENAI_API_KEY environment variable.
    // You can also hardcode it with Auth::new(<your_api_key>), but that is not recommended.
    let auth = Auth::from_env().unwrap();
    let openai = OpenAI::new(auth, "https://api.openai.com/v1/");
    let body = ChatBody {
        model: "gpt-3.5-turbo".to_string(),
        max_tokens: Some(7),
        temperature: Some(0f32),
        top_p: Some(0f32),
        n: Some(2),
        stream: Some(false),
        stop: None,
        presence_penalty: None,
        frequency_penalty: None,
        logit_bias: None,
        user: None,
        messages: vec![Message { role: Role::User, content: "Hello!".to_string() }],
    };
    let rs = openai.chat_completion_create(&body);
    let choice = rs.unwrap().choices;
    let message = &choice[0].message.as_ref().unwrap();
    assert!(message.content.contains("Hello"));
}
```

Use proxy

Load the proxy settings from the environment:

```rust
let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
    .use_env_proxy();
```

Or set the proxy manually:

```rust
let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
    .set_proxy("http://127.0.0.1:1080");
```
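Putting the pieces together, a minimal sketch of a proxied chat request, reusing only the APIs shown in this README (the proxy address is a placeholder, and `use_env_proxy()` is assumed to read the usual `http_proxy`/`https_proxy` variables):

```rust
use openai_api_rust::*;
use openai_api_rust::chat::*;

fn main() {
    // Load the API key from OPENAI_API_KEY.
    let auth = Auth::from_env().unwrap();
    // Route requests through a local proxy (placeholder address).
    // Swap in .use_env_proxy() to pick the proxy up from the environment instead.
    let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
        .set_proxy("http://127.0.0.1:1080");
    let body = ChatBody {
        model: "gpt-3.5-turbo".to_string(),
        max_tokens: Some(16),
        temperature: None,
        top_p: None,
        n: None,
        stream: Some(false),
        stop: None,
        presence_penalty: None,
        frequency_penalty: None,
        logit_bias: None,
        user: None,
        messages: vec![Message { role: Role::User, content: "Hello!".to_string() }],
    };
    // The request goes out through the configured proxy.
    let rs = openai.chat_completion_create(&body);
    println!("choices returned: {:?}", rs.map(|r| r.choices.len()));
}
```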

License

This library is distributed under the terms of the MIT license. See LICENSE for details.