A Rust library for interacting with OpenAI's ChatGPT API. This library simplifies the process of making requests to the ChatGPT API and parsing responses.
Add the following line to your `Cargo.toml` file under the `[dependencies]` section:
```toml
chat-gpt-lib-rs = "0.1.3"
```
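The client's methods are async, so your project also needs an async runtime; the sketches in this README assume Tokio. Under that assumption, you would add something like:

```toml
# Assumed runtime choice; any async runtime compatible with the crate works.
tokio = { version = "1", features = ["full"] }
```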
Then, run `cargo build` to download and compile the dependencies.
First, import the necessary components:
```rust
use chat_gpt_lib_rs::{ChatGPTClient, ChatInput, Message, Model, Role};
```
Next, create a new client with your API key:
```rust
let api_key = "your_api_key_here";
let base_url = "https://api.openai.com";
let client = ChatGPTClient::new(api_key, base_url);
```
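In practice you would likely read the key from an environment variable rather than hard-coding it, and drive the async client from a runtime. A minimal sketch, assuming Tokio and an `OPENAI_API_KEY` environment variable:

```rust
use chat_gpt_lib_rs::ChatGPTClient;

#[tokio::main]
async fn main() {
    // Read the key from the environment instead of embedding it in source.
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
    let _client = ChatGPTClient::new(api_key.as_str(), "https://api.openai.com");
    // ...use the client to send chat requests, as shown in the next section.
}
```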
To send a chat message, create a `ChatInput` structure and call the `chat` method:
```rust
let chat_input = ChatInput {
    model: Model::Gpt35Turbo,
    messages: vec![
        Message {
            role: Role::System,
            content: "You are a helpful assistant.".to_string(),
        },
        Message {
            role: Role::User,
            content: "Who won the world series in 2020?".to_string(),
        },
    ],
    ..Default::default()
};

let response = client.chat(chat_input).await.unwrap();
```

The response will be a `ChatResponse` structure containing the API response data.
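To get at the assistant's reply, read it out of the response. The field names below are an assumption based on the OpenAI chat format (a `choices` vector whose entries hold a `message` with `content`), so check the crate's `ChatResponse` definition:

```rust
// Assumed field layout; verify against the crate's ChatResponse type.
if let Some(choice) = response.choices.first() {
    println!("Assistant: {}", choice.message.content);
}
```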
Two example CLI chat applications are provided in the examples folder:
The `cli-simple-chat-example.rs` example demonstrates how to use the chat-gpt-lib-rs library to interact with an AI model based on the GPT-3 architecture through a command-line interface. To run it, first set your `OPENAI_API_KEY` in the `.env` file or as an environment variable, and then execute the following command:
```sh
cargo run --example cli-simple-chat-example
```
The example will prompt the user to enter a question, and the AI chatbot will respond with an answer. The conversation will continue until the user exits the program.
Optionally, you can provide initial user input as a command-line argument:
```sh
cargo run --example cli-simple-chat-example "Hello, computer!"
```
The `cli-chat-example.rs` example demonstrates how to use the chat-gpt-lib-rs library to create an interactive AI chatbot with a command-line interface. To run it, first set your `OPENAI_API_KEY` in the `.env` file or as an environment variable, and then execute the following command:
```sh
cargo run --example cli-chat-example
```
The example will prompt the user to enter a message, and the AI chatbot will respond with an answer. The conversation will continue until the user exits the program.
Optionally, you can provide initial user input as a command-line argument:
```sh
cargo run --example cli-chat-example "Hello, computer!"
```
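If you want to build a similar loop yourself rather than running the shipped examples, the sketch below shows one way to do it. It assumes Tokio, an `OPENAI_API_KEY` environment variable, a `Role::Assistant` variant, `Clone` on `Message`, and the `ChatResponse` field names used above; it is not the code of the bundled examples.

```rust
use std::io::{self, BufRead, Write};

use chat_gpt_lib_rs::{ChatGPTClient, ChatInput, Message, Model, Role};

#[tokio::main]
async fn main() {
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
    let client = ChatGPTClient::new(api_key.as_str(), "https://api.openai.com");

    // Keep the full conversation so the model sees earlier turns.
    let mut messages = vec![Message {
        role: Role::System,
        content: "You are a helpful assistant.".to_string(),
    }];

    let stdin = io::stdin();
    loop {
        print!("You: ");
        io::stdout().flush().unwrap();

        let mut line = String::new();
        if stdin.lock().read_line(&mut line).unwrap() == 0 {
            break; // EOF (Ctrl-D) ends the conversation.
        }
        messages.push(Message {
            role: Role::User,
            content: line.trim().to_string(),
        });

        let input = ChatInput {
            model: Model::Gpt35Turbo,
            messages: messages.clone(),
            ..Default::default()
        };
        let response = client.chat(input).await.expect("chat request failed");

        // Assumed ChatResponse layout; adjust to the crate's actual types.
        let reply = response.choices[0].message.content.clone();
        println!("Assistant: {}", reply);
        messages.push(Message {
            role: Role::Assistant,
            content: reply,
        });
    }
}
```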
For an enhanced experience with icons, use a terminal that supports Nerd Fonts. To enable this feature, set `USE_ICONS=true` in the `.env` file or as an environment variable.
For more details about the request parameters and response structure, refer to the OpenAI API documentation.
This project is licensed under the Apache License 2.0. See the LICENSE file for details.