llm-chain 🚀

llm-chain is a collection of Rust crates designed to help you work with Large Language Models (LLMs) more effectively. Our primary focus is robust support for prompt templates and for chaining prompts together in multi-step chains, enabling tasks that LLMs can't handle in a single step, such as summarizing lengthy texts or performing advanced data processing.


Examples 💡

To help you get started, here is an example demonstrating how to use llm-chain. You can find more examples in the examples folder in the repository.

```rust
let exec = Executor::new_default();
let res = Step::for_prompt(prompt!(
    "You are a robot assistant for making personalized greetings",
    "Make a personalized greeting for Joe"
))
.run(&Parameters::new(), &exec)
.await?;
println!("{}", res);
```
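
The snippet above awaits the executor, so it needs to run inside an async context. Here is a minimal sketch of a complete program, assuming tokio as the async runtime and that `Executor`, `Step`, `prompt!`, and `Parameters` are brought into scope from llm-chain and llm-chain-openai (the exact module paths depend on your crate versions):

```rust
// Minimal sketch of a full program around the example above.
// Assumes tokio is a dependency and that Executor, Step, prompt! and
// Parameters are imported from llm-chain / llm-chain-openai.
#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let exec = Executor::new_default();
    let res = Step::for_prompt(prompt!(
        "You are a robot assistant for making personalized greetings",
        "Make a personalized greeting for Joe"
    ))
    .run(&Parameters::new(), &exec)
    .await?;
    println!("{}", res);
    Ok(())
}
```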

➡️ tutorial: get started with llm-chain

Features 🌟

- Prompt templates: create reusable, parameterized prompts for your models.
- Chains: link prompts together into multi-step chains for tasks a single prompt can't handle, such as summarizing lengthy texts.
- OpenAI support: ready-made integration through the llm-chain-openai crate.

Getting Started 🚀

To start using llm-chain, add it, along with the OpenAI driver llm-chain-openai, as dependencies in your Cargo.toml:

```toml
[dependencies]
llm-chain = "0.1.0"
llm-chain-openai = "0.1.0"
```
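
Alternatively, if your toolchain includes `cargo add` (Cargo 1.62 or later), you can add the same dependencies from the command line instead of editing Cargo.toml by hand:

```bash
cargo add llm-chain llm-chain-openai
```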

The examples for llm-chain-openai require the OPENAI_API_KEY environment variable to be set, which you can do like this:

```bash
export OPENAI_API_KEY="sk-YOUR_OPEN_AI_KEY_HERE"
```

Then, refer to the documentation and examples to learn how to create prompt templates, chains, and more.
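
For instance, prompt templates let you fill in parameters at run time. The sketch below builds on the example above; the `{{name}}` placeholder syntax and the `with` builder on `Parameters` are assumptions here, so check the documentation for the exact template API in your version:

```rust
// Hypothetical sketch of a parameterized prompt template.
// The "{{name}}" placeholder syntax and Parameters::with are assumptions;
// consult the llm-chain docs for the exact API of your version.
let exec = Executor::new_default();
let res = Step::for_prompt(prompt!(
    "You are a robot assistant for making personalized greetings",
    "Make a personalized greeting for {{name}}"
))
.run(&Parameters::new().with("name", "Joe"), &exec)
.await?;
println!("{}", res);
```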

Contributing 🤝

We warmly welcome contributions from everyone! If you're interested in helping improve llm-chain, please check out our CONTRIBUTING.md file for guidelines and best practices.

License 📄

llm-chain is licensed under the MIT License.

Connect with Us 🌐

If you have any questions, suggestions, or feedback, feel free to open an issue or join our community Discord. We're always excited to hear from you and learn about your experiences with llm-chain.

We hope you enjoy using llm-chain to unlock the full potential of Large Language Models in your projects. Happy coding! 🎉