Hello!
I have created a Rust project, a library for interacting with GPT (the OpenAI API):
https://github.com/gptrust/gptrust
A Rust library: `gptrust_api`
- This follows the community standard, i.e. MIT-licensed, unofficial, and the API key is loaded from an environment variable
- It implements a large section of the OpenAI API spec, and I am actively adding support for the rest
- The API calls mostly pass sensible defaults, but small patches may be required to make every parameter overridable; if bug reports are filed, they'll get prioritised
- Simplified code example:
  - In `Cargo.toml`:

    ```toml
    gptrust_api = "0.1.3"
    ```

    (or `cargo add gptrust_api`)
  - Then, in code:

    ```rust
    let completions = gptrust_api::completions::complete("Once upon a time".to_string(), ...);
    ```
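For a sense of what a call like this wraps, here is a minimal, hypothetical sketch of a raw completions request: load the key from the environment and POST to the standard OpenAI completions endpoint. The env var name, model, and request parameters are my assumptions for illustration, not necessarily what `gptrust_api` uses internally (the sketch needs `reqwest` with the `blocking`/`json` features and `serde_json`):

```rust
// Hypothetical sketch only: roughly what a wrapper library has to do for a
// completion. Assumes the conventional OPENAI_API_KEY env var; gptrust_api
// may read a differently named variable.
use std::env;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = env::var("OPENAI_API_KEY")?;

    // A sensible-defaults request body; model and max_tokens are assumptions.
    let body = serde_json::json!({
        "model": "gpt-3.5-turbo-instruct",
        "prompt": "Once upon a time",
        "max_tokens": 50
    });

    let response: serde_json::Value = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/completions")
        .bearer_auth(&api_key)
        .json(&body)
        .send()?
        .json()?;

    // Print the first completion choice, if the API returned one.
    if let Some(text) = response["choices"][0]["text"].as_str() {
        println!("{text}");
    }
    Ok(())
}
```

The point of the library is that the auth header, JSON body, and default parameters above are hidden behind a single function call.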
A CLI wrapper: `gptrust_cli`
- I have created a CLI wrapper so that there is a reference application using the library. It is keyword-based and has a 1:1 relationship with the API endpoints, e.g. `files [list|upload]`, `chat complete`, etc. (a sketch of such a subcommand layout follows the examples below). Example usage:
  ```sh
  gptrust_cli images generations "A cat is offering a flower to a mouse"
  gptrust_cli chat complete --max-tokens=100 "The first rule of fight club is"
  echo "Broadcast from root:" | gptrust_cli chat complete --model=gpt-4 - | wall
  ```
I also plan to create a proxy that can listen on a socket/FIFO, hide all the complexities of auth/billing, and expose a service to other applications that want to talk to OpenAI (roughly sketched below):
- for example, create `/dev/chatgpt` and let any app open that file and talk to the OpenAI API
- or open a socket (or run as a sidecar) and act as a GPT forward proxy for other microservices
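To make the proxy idea concrete, here is a hypothetical sketch (nothing like this exists in the repo yet): a Unix-domain socket listener that reads one prompt per connection and would answer on the client's behalf, so clients never handle the API key. The socket path and the line-based protocol are assumptions for illustration:

```rust
// Hypothetical sketch of the proposed forward proxy: accept connections on a
// Unix-domain socket, read a prompt, and reply on the client's behalf. The
// actual OpenAI call is stubbed out; auth/billing would live entirely here,
// so clients never see the key.
use std::io::{BufRead, BufReader, Write};
use std::os::unix::net::UnixListener;

fn main() -> std::io::Result<()> {
    let socket_path = "/tmp/gptrust.sock"; // assumed path for illustration
    let _ = std::fs::remove_file(socket_path); // clean up a stale socket
    let listener = UnixListener::bind(socket_path)?;

    for stream in listener.incoming() {
        let mut stream = stream?;

        // Protocol assumption: one newline-terminated prompt per connection.
        let mut prompt = String::new();
        BufReader::new(&stream).read_line(&mut prompt)?;

        // Here the proxy would call the OpenAI API (e.g. via gptrust_api)
        // with a key loaded from its own environment, then return the text.
        let reply = format!("(completion for: {})\n", prompt.trim_end());
        stream.write_all(reply.as_bytes())?;
    }
    Ok(())
}
```

A client could then try something like `echo "Once upon a time" | nc -U /tmp/gptrust.sock` without holding any OpenAI credentials of its own.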
Bug reports, feature requests, patches, forks - everything is welcome. I would also appreciate it if the crate could be listed on the community library page for easy discovery. I'll keep updating this thread with major releases. Final disclaimer: I am a novice Rust programmer, but with your issue reports I'll make it better.