GPTO (Geppetto)


An Unofficial OpenAI GPT3 Terminal Client

```bash
gpto -h

A tiny unofficial OpenAI GPT3 client

Usage: gpto [OPTIONS]

Options:
  -p, --prompt ...    The prompt(s) to generate completions for
  -s, --suffix ...    The suffix that comes after a completion of inserted text. Defaults to an empty string
  -t, --temperature   What sampling temperature to use. Higher values mean the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. Defaults to 1.0
  -n, --number        How many completions to generate for each prompt. Defaults to 1
  -k, --topp          An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature but not both. Defaults to 1.0
  -m, --model         Model to use for completion. Defaults to text-davinci-003. Use --models to see the complete list
  -o, --config        Absolute path of configuration. Defaults to $XDG_CONFIG_HOME/gpto.cfg
  -d, --models        Returns a list of models from OpenAI
  -e, --echo          Echo back the prompt in addition to the completion. Defaults to false
  -h, --help          Print help information
  -V, --version       Print version information
```
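To build intuition for what `--temperature` and `--topp` control, here is a minimal Python sketch of temperature scaling and nucleus (top-p) filtering on a toy token distribution. This is an illustration of the sampling concepts only, not gpto's actual implementation:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature before the softmax.
    Lower temperature sharpens the distribution (argmax-like as it
    approaches 0); higher temperature flattens it, so the model
    'takes more risks'."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def nucleus_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p; tokens outside that 'nucleus' are never sampled."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept

logits = [2.0, 1.0, 0.5, 0.1]      # toy scores for 4 candidate tokens
probs = softmax_with_temperature(logits, 1.0)
print(nucleus_filter(probs, 0.9))  # token indices surviving top_p = 0.9
```

With `top_p = 0.1` only the single most likely token would survive, which is why the help text recommends adjusting either this or temperature, but not both.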

Learn more about how to use text completion
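The flags above correspond to parameters of OpenAI's completions endpoint (`model`, `prompt`, `suffix`, `temperature`, `n`, `top_p`, `echo`). As a rough sketch of that mapping (an illustration of the API's request shape, not gpto's actual code), a request body could be assembled like this:

```python
import json

# Assumed flag-to-parameter mapping, for illustration:
#   --prompt -> prompt, --suffix -> suffix, --temperature -> temperature,
#   --number -> n, --topp -> top_p, --model -> model, --echo -> echo
def build_completion_request(prompt, model="text-davinci-003", suffix="",
                             temperature=1.0, n=1, top_p=1.0, echo=False):
    """Assemble the JSON body for a POST to OpenAI's completions endpoint,
    using the same defaults the gpto help text documents."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "suffix": suffix,
        "temperature": temperature,
        "n": n,
        "top_p": top_p,
        "echo": echo,
    })

print(build_completion_request("tell me a joke"))
```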

Install from Crates.io

Install Rust

```bash
# Linux and MacOS
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```

Install gpto

```bash
cargo install gpto
```

Install from AUR

```bash
# Use yay or another AUR helper
yay gpto-bin
```

Install from GitHub

Install Rust

Clone the project

```bash
git clone git@github.com:alanvardy/gpto.git
cd gpto
./test.sh # run the tests
cargo build --release
```

You can then find the binary in target/release/

Usage

Get a completion with default parameters

```bash
gpto --prompt tell me a joke

Q: What did the fish say when it hit the wall?
A: Dam!
```

Get a completion with a different model (this example uses the leading code completion model). And yes, the generated code is not idiomatic!

Read more about models here

```bash
gpto -m code-davinci-002 -p "language is elixir\nwrite a function that raises an error if the argument is not an integer and multiplies it by 2 if it is an integer"

def multiply_by_two(x)
  raise ArgumentError, "Argument is not an integer" unless x.is_a? Integer
  x * 2
end
```

Give an exhaustive list of all models

```bash
gpto --models

Models:

babbage
ada
davinci
babbage-code-search-code
text-similarity-babbage-001
... and so on
```