An llmvm frontend that acts as a CLI chat interface.
Install this application using cargo.

```
cargo install llmvm-chat
```
The llmvm core must be installed. If you have not done so, the core may be installed via

```
cargo install llmvm-core
```
A backend must be installed and configured. The llmvm-outsource backend is recommended for OpenAI requests. Currently, the default model preset is `gpt-3.5-chat`, which uses this backend.
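For example, assuming the backend is published as the `llmvm-outsource` crate (check the llmvm repository for the exact package name), it can be installed the same way:

```
cargo install llmvm-outsource
```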
Run `llmvm-chat` to use the interface. Press CTRL-C when you are finished with your chat. The chat thread will be persisted, and its thread ID will be printed.
Use the `-h` flag to see all options.
Use the `-l` flag to load the last chat thread.
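For example, to resume the most recent thread:

```
llmvm-chat -l
```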
Run the chat executable to generate a configuration file at:

- Linux: `~/.config/llmvm/chat.toml`
- macOS: `~/Library/Application Support/com.djandries.llmvm/chat.toml`
- Windows: `AppData\Roaming\djandries\llmvm\config\chat.toml`
|Key|Required?|Description|
|--|--|--|
|`stdio_core`|No|Stdio client configuration for communicating with llmvm core. See llmvm-protocol for details.|
|`http_core`|No|HTTP client configuration for communicating with llmvm core. See llmvm-protocol for details.|
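As an illustration only, a `chat.toml` that points the chat frontend at an llmvm core over HTTP might look like the sketch below; the `base_url` field name is an assumption for this example, so consult llmvm-protocol for the actual client configuration keys.

```toml
# Hypothetical sketch of chat.toml (field names are assumptions;
# see llmvm-protocol for the real client configuration schema).
[http_core]
base_url = "http://localhost:8080"
```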