llmvm-chat


An llmvm frontend that acts as a CLI chat interface.

Demo

(asciinema demo recording)

Installation

Install this application using cargo.

cargo install llmvm-chat

The llmvm core must also be installed. If you have not already done so, it can be installed via:

cargo install llmvm-core

A backend must also be installed and configured. The llmvm-outsource backend is recommended for OpenAI requests. The default model preset is currently gpt-3.5-chat, which uses this backend.
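Taken together, a fresh setup might look like the following. This is a sketch: the `llmvm-outsource` install command is an assumption based on the crate name, and configuring API keys for the backend is covered by that backend's own documentation.

```shell
# Install the core, a backend, and this chat frontend
cargo install llmvm-core       # the llmvm core
cargo install llmvm-outsource  # assumed crate name for the outsource backend
cargo install llmvm-chat       # this chat frontend
```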

Usage

Run llmvm-chat to start the interface. Press CTRL-C when you are finished with your chat. The chat thread will be persisted, and its thread ID will be printed.

Use the -h flag to see all options.

Use the -l flag to load the most recent chat thread.
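A typical session might look like the sketch below (assuming the core and a backend are already installed and configured; the exact output format is illustrative, not authoritative):

```shell
# Start a new chat session; type messages, then press CTRL-C to exit.
# The persisted thread ID is printed on exit.
llmvm-chat

# Resume the most recent chat thread
llmvm-chat -l

# List all available options
llmvm-chat -h
```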

Configuration

Run the chat executable once to generate a configuration file. The file supports the following settings:

|Key|Required?|Description|
|--|--|--|
|stdio_core|No|Stdio client configuration for communicating with llmvm core. See llmvm-protocol for details.|
|http_core|No|HTTP client configuration for communicating with llmvm core. See llmvm-protocol for details.|
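As a rough illustration, a configuration using an HTTP connection to the core might look like the fragment below. This is a hedged sketch: the TOML format and the `base_url` field name are assumptions; consult llmvm-protocol for the authoritative client configuration schema.

```toml
# Hypothetical example: reach an llmvm core over HTTP instead of stdio.
# Both sections are optional; omit them to use the defaults.
[http_core]
base_url = "http://localhost:8080"  # assumed field name, see llmvm-protocol
```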

License

Mozilla Public License, version 2.0