Use CodeGate with Open Interpreter
Open Interpreter lets LLMs run code locally through a ChatGPT-like interface in your terminal.
CodeGate works with OpenAI and OpenAI-compatible APIs through Open Interpreter.
This guide assumes you have already installed Open Interpreter using their installation instructions.
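If you have not installed it yet, Open Interpreter is distributed as a Python package; as a sketch (check Open Interpreter's own documentation for the current instructions and supported Python versions):

```shell
# Install Open Interpreter into the current Python environment
pip install open-interpreter
```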
Configure Open Interpreter to use CodeGate
To configure Open Interpreter to send requests through CodeGate, run interpreter with the API base setting set to CodeGate's local API endpoint, http://localhost:8989/openai.
By default, CodeGate connects to the OpenAI API. To use a different OpenAI-compatible endpoint, set the CODEGATE_OPENAI_URL configuration parameter when you launch CodeGate.
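As a sketch of the override described above, assuming you run CodeGate with its Docker image (the endpoint URL is a placeholder; adjust the container name, ports, and image tag to match how you launched CodeGate):

```shell
# Launch CodeGate with a custom OpenAI-compatible endpoint via
# the CODEGATE_OPENAI_URL configuration parameter
docker run -d --name codegate \
  -p 8989:8989 \
  -e CODEGATE_OPENAI_URL=https://example-openai-compatible-host/v1 \
  ghcr.io/stacklok/codegate:latest
```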
If you are running Open Interpreter v0.4.x:
interpreter --api_base http://localhost:8989/openai --api_key YOUR_API_KEY --model MODEL_NAME
If you are running Open Interpreter's v1.0 development branch, the flags use hyphens instead of underscores:
interpreter --api-base http://localhost:8989/openai --api-key YOUR_API_KEY --model MODEL_NAME
Replace YOUR_API_KEY with your OpenAI API key, and MODEL_NAME with your desired model, like openai/gpt-4o-mini.

The --model parameter value must start with openai/ for CodeGate to properly handle the request.
Verify configuration
To verify that you've successfully connected Open Interpreter to CodeGate, type codegate version into the Open Interpreter chat. You should receive a response like "CodeGate version 0.1.16".
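You can also check that the CodeGate server itself is reachable from another terminal; this sketch assumes CodeGate is listening on its default port 8989 and that it exposes a health endpoint at /health (verify the exact path against the CodeGate documentation for your version):

```shell
# Query the local CodeGate server directly to confirm it is running
curl -s http://localhost:8989/health
```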
Next steps
Learn more about CodeGate's features and explore the dashboard.
Remove CodeGate
If you decide to stop using CodeGate, follow these steps to remove it and revert your environment.
- Quit Open Interpreter (Ctrl+C) and re-run it without the API base parameter.
- Stop and remove the CodeGate container:
  docker stop codegate && docker rm codegate
- If you launched CodeGate with a persistent volume, delete it to remove the CodeGate database and other files:
  docker volume rm codegate_volume