Tool Calling
You can pass a tools array in chat completion requests. The model may respond with tool_calls instead of text; your client runs the requested functions and sends the results back as tool messages, then requests again until the model returns a final text response.
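The loop can be sketched generically as follows. This is a minimal outline, not a prescribed implementation: `call_model` stands in for your HTTP request and `run_tool` for your function dispatch, both supplied by you:

```python
def tool_call_loop(messages, tools, call_model, run_tool, max_turns=5):
    """Run the request/execute/re-request cycle until the model answers in text.

    call_model(messages, tools) -> assistant message dict (your HTTP call);
    run_tool(name, arguments) -> result string (your function dispatch).
    """
    for _ in range(max_turns):
        message = call_model(messages, tools)
        messages.append(message)
        if not message.get("tool_calls"):
            return message.get("content")  # final text answer
        for tc in message["tool_calls"]:
            messages.append({
                "role": "tool",
                "tool_call_id": tc["id"],
                "content": run_tool(tc["function"]["name"], tc["function"]["arguments"]),
            })
    raise RuntimeError("no final answer within max_turns")
```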
Request format
Include a tools array in the request body. Each tool is an object with type: "function" and a function object that has:
- `name` (required): The function name the model uses when calling the tool.
- `description` (optional): A description for the model; helps it decide when to call the tool.
- `parameters` (optional): A JSON Schema describing the arguments the function accepts.
```json
{
  "model": "8080/taalas/llama3.1-8b-instruct",
  "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current temperature for a location by latitude and longitude.",
        "parameters": {
          "type": "object",
          "properties": {
            "latitude": {"type": "number", "description": "Latitude"},
            "longitude": {"type": "number", "description": "Longitude"}
          },
          "required": ["latitude", "longitude"]
        }
      }
    }
  ]
}
```

Response and the tool-call loop
1. **First request:** Send `messages` and `tools`. The response may be:
   - Normal text: `choices[0].message.content` is set and `finish_reason` is `"stop"`. You're done.
   - Tool calls: `choices[0].message.tool_calls` is set and `finish_reason` is `"tool_calls"`. Each item has `id`, `type: "function"`, and `function` with `name` and `arguments` (a JSON string).
2. **Append assistant and tool messages:** Add the assistant message (including `tool_calls`) to your conversation. For each tool call, append a message with `role: "tool"`, `tool_call_id` (matching the id in the assistant's `tool_calls`), and `content` set to the result of running that function (a string, e.g. JSON).
3. **Second request:** Send the updated `messages` (user + assistant + tool messages) with the same `tools`. Repeat until `finish_reason` is `"stop"` or you hit a max-turns limit.
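To make step 2 concrete, here is a sketch of what the conversation history looks like after appending the assistant and tool messages. The tool-call id, coordinates, and result value are invented for illustration:

```python
import json

# Hypothetical assistant reply from the first request (id and values invented).
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
            "name": "get_weather",
            "arguments": "{\"latitude\": 48.85, \"longitude\": 2.35}",
        },
    }],
}

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
messages.append(assistant_message)  # keep the assistant turn, tool_calls included

# One tool message per tool call, echoing that call's id.
for tc in assistant_message["tool_calls"]:
    messages.append({
        "role": "tool",
        "tool_call_id": tc["id"],
        "content": json.dumps({"temperature_c": 18}),  # your function's result
    })

print([m["role"] for m in messages])  # → ['user', 'assistant', 'tool']
```

This three-message history (plus the same `tools` array) is exactly what the second request sends.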
Python example: get_weather
This example defines a `get_weather` tool, sends a user message, and runs the tool-call loop until the model returns a final answer.
```python
import os
import json
import requests

API_KEY = os.environ.get("_8080_API_KEY")
BASE_URL = "https://api.8080.io"


def get_weather(latitude: float, longitude: float) -> str:
    """Get current temperature for a location (mock implementation)."""
    # In production you might call a real weather API
    return json.dumps({"temperature_c": 18, "conditions": "Partly cloudy"})


def run_tool(name: str, arguments: str) -> str:
    args = json.loads(arguments)
    if name == "get_weather":
        return get_weather(args["latitude"], args["longitude"])
    return json.dumps({"error": f"Unknown tool: {name}"})


def chat_with_tools():
    messages = [
        {"role": "user", "content": "What's the weather in Paris right now?"}
    ]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current temperature for a location by latitude and longitude.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "latitude": {"type": "number", "description": "Latitude"},
                        "longitude": {"type": "number", "description": "Longitude"}
                    },
                    "required": ["latitude", "longitude"]
                }
            }
        }
    ]

    while True:
        resp = requests.post(
            f"{BASE_URL}/v1/chat/completions",
            headers={
                "Authorization": f"Bearer {API_KEY}",
                "Content-Type": "application/json"
            },
            json={"model": "8080/taalas/llama3.1-8b-instruct", "messages": messages, "tools": tools}
        )
        resp.raise_for_status()
        data = resp.json()
        choice = data["choices"][0]
        message = choice["message"]

        messages.append(message)

        if choice.get("finish_reason") == "stop":
            print(message.get("content", ""))
            return

        if choice.get("finish_reason") == "tool_calls" and message.get("tool_calls"):
            for tc in message["tool_calls"]:
                fn = tc["function"]
                result = run_tool(fn["name"], fn["arguments"])
                messages.append({
                    "role": "tool",
                    "tool_call_id": tc["id"],
                    "content": result
                })
        else:
            print(message.get("content", ""))
            return


if __name__ == "__main__":
    chat_with_tools()
```

Run it (after setting _8080_API_KEY):
```sh
export _8080_API_KEY="your-api-key"
python chat_with_tools.py
```

Using the eighty80 SDK
The e80 Python SDK simplifies tool calling by decorating your functions and passing them as tools:
```python
import requests
from eighty80 import chat, tool, Message


@tool
def get_weather(latitude: float, longitude: float) -> str:
    """Get the current temperature for a location by latitude and longitude."""
    response = requests.get(
        f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m"
    )
    data = response.json()
    return str(data["current"]["temperature_2m"])


result = chat(
    model="8080/taalas/llama3.1-8b-instruct",
    messages=[Message("user", "What's the weather in San Francisco?")],
    tools=[get_weather]
)
```

The SDK handles the tool-call loop and argument parsing for you.
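Whichever route you take, remember that tool arguments arrive as a model-generated JSON string and can occasionally be malformed or reference an unknown tool. A defensive dispatcher (a sketch, not part of the SDK; `safe_run_tool` and `registry` are names invented here) can report errors back to the model instead of crashing:

```python
import json


def safe_run_tool(name, arguments, registry):
    """Dispatch a tool call, returning a JSON error string on bad input."""
    try:
        args = json.loads(arguments)
    except json.JSONDecodeError as e:
        return json.dumps({"error": f"Invalid JSON arguments: {e}"})
    fn = registry.get(name)
    if fn is None:
        return json.dumps({"error": f"Unknown tool: {name}"})
    try:
        return fn(**args)
    except TypeError as e:  # missing or unexpected parameters
        return json.dumps({"error": f"Bad arguments for {name}: {e}"})


# Hypothetical registry mapping tool names to local functions.
registry = {
    "get_weather": lambda latitude, longitude: json.dumps({"temperature_c": 18})
}
print(safe_run_tool("get_weather", '{"latitude": 48.85, "longitude": 2.35}', registry))
# → {"temperature_c": 18}
```

Returning the error as the tool message's `content` gives the model a chance to retry with corrected arguments on the next turn.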