# Rust client for txtai
txtai is an all-in-one embeddings database for semantic search, LLM orchestration and language model workflows.
This repository contains Rust bindings for the txtai API.
Add the following lines to your project `Cargo.toml` file:

```toml
[dependencies]
txtai = { version = "6.0" }
tokio = { version = "0.2", features = ["full"] }
```
This adds txtai as a dependency, along with tokio, since txtai uses async I/O.
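Because every client call in txtai.rs is async, it has to run inside a tokio runtime. The skeleton below only shows that runtime setup; the txtai-specific calls shown later go inside `main`.

```rust
// Minimal async entry point: the `#[tokio::main]` attribute (enabled by the
// "full" feature set above) builds a tokio runtime and runs `main` on it.
#[tokio::main]
async fn main() {
    // txtai client calls are `.await`ed here once an API instance is running.
    println!("tokio runtime started");
}
```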
The examples directory contains a series of examples that give an overview of txtai, listed below.
| Example | Description |
|:----------|:-------------|
| Introducing txtai | Overview of the functionality provided by txtai |
| Extractive QA with txtai | Extractive question-answering with txtai |
| Labeling with zero-shot classification | Labeling with zero-shot classification |
| Pipelines and workflows | Pipelines and workflows |
txtai.rs connects to a txtai API instance. See this link for details on how to start a new API instance.
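As a rough sketch of what a client call looks like, the snippet below connects to a local API instance and runs a semantic search. The `Embeddings` type, its `new` constructor, the `search(query, limit)` method, and the result type are assumptions based on the examples directory; check the examples for the exact signatures, and replace the URL with wherever your API instance is listening.

```rust
use std::error::Error;

use txtai::embeddings::Embeddings;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Assumed client type/constructor: point it at a running txtai API instance.
    let embeddings = Embeddings::new("http://localhost:8000");

    // Assumed method: run a semantic search and print the top result.
    let results = embeddings.search("feel good story", 1).await?;
    println!("{:?}", results);

    Ok(())
}
```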
Once an API instance is running, do the following to run the examples.
```
git clone https://github.com/neuml/txtai.rs
cd txtai.rs/examples/demo
cargo run
```