# OpenAI Compatibility
The 8080 API is partially compatible with the OpenAI API specification, so existing applications and commonly used tools can integrate with minimal changes.
## Environment variables

Most applications that are compatible with the OpenAI API, or that require an OpenAI key to run, can be configured to use the 8080 API by setting these two environment variables:
```sh
export OPENAI_API_KEY="your_8080_api_key_here"
export OPENAI_BASE_URL="https://api.8080.io/v1"
```
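A missing variable usually surfaces later as a confusing authentication or connection error, so a small startup check can help. This is a sketch, not part of the 8080 API; the helper name is illustrative:

```python
import os

def missing_openai_vars(env) -> list[str]:
    """Return the names of required OpenAI-compatibility variables not set in env."""
    required = ("OPENAI_API_KEY", "OPENAI_BASE_URL")
    return [name for name in required if not env.get(name)]

# Warn early instead of failing deep inside an API call.
if missing_openai_vars(os.environ):
    print("Set OPENAI_API_KEY and OPENAI_BASE_URL before running.")
```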
## OpenAI Python Client

By pointing your API client's base URL at https://api.8080.io/v1 and replacing your OpenAI access token with an 8080 one, you can use the standard OpenAI Python client as you would normally:
```python
from openai import OpenAI
import os

client = OpenAI(
    base_url="https://api.8080.io/v1",
    # optionally set the OpenAI env var OPENAI_API_KEY and omit this line
    api_key=os.environ.get("_8080_API_KEY"),
)

completion = client.chat.completions.create(
    model="8080/taalas/llama3.1-8b-instruct",
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)

print(completion.choices[0].message.content)
```
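The SDK wraps the response in a typed object, but the underlying JSON follows the OpenAI chat-completions shape. The values below are illustrative, not real API output:

```python
import json

# Illustrative JSON in the OpenAI chat-completions response shape.
raw = json.dumps({
    "choices": [
        {"message": {"role": "assistant", "content": "Use os.listdir()."}}
    ]
})

response = json.loads(raw)
# Equivalent to completion.choices[0].message.content in the SDK.
print(response["choices"][0]["message"]["content"])
```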
## Currently Supported

The 8080 API currently supports the following endpoints compatible with the OpenAI standard:
- `/v1/chat/completions`: chat-based text completions, with support for tool calling and structured outputs using the same format options available in the OpenAI API
- `/v1/responses`: generate text completions using the Responses format
- `/v1/models`: list the models available for text completions
- `/v1/batches`: create batch jobs for completions to be executed at a lower priority
- `/v1/files`: upload batch input files, download results files, and deploy file assets for Edge apps
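For tools that speak raw HTTP rather than the SDK, a request to `/v1/chat/completions` can be sketched with only the standard library. The helper name here is illustrative; the model name and base URL are taken from the examples above:

```python
import json
import os
import urllib.request

API_BASE = "https://api.8080.io/v1"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI chat-completions format."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{API_BASE}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('_8080_API_KEY', '')}",
        },
        method="POST",
    )

req = build_chat_request("8080/taalas/llama3.1-8b-instruct", "Hello!")
# urllib.request.urlopen(req) would send the request; omitted here.
```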
8080 does not currently support any OpenAI API functionality beyond the endpoints listed above.
## Further reading

- Read more about the OpenAI Python library
- Check out the API reference