Tool calling lets the model request named functions with JSON arguments; your code runs them and returns results in the conversation. SUPA accepts the same tool schemas as OpenAI Chat Completions.

OpenAI SDK

Pass a tools array to chat.completions.create. Define each tool with type: 'function' and a JSON Schema in function.parameters.
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.SUPA_API_KEY,
  baseURL: 'https://api.supa.works/openai',
});

const response = await openai.chat.completions.create({
  model: 'google/gemma-3-27b-it',
  messages: [
    {
      role: 'user',
      content: 'What is the weather in Boston?',
    },
  ],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: {
            city: { type: 'string', description: 'City name' },
            unit: {
              type: 'string',
              enum: ['celsius', 'fahrenheit'],
              description: 'Temperature unit',
            },
          },
          required: ['city'],
        },
      },
    },
  ],
  tool_choice: 'auto',
});

const message = response.choices[0]?.message;
console.log(message);
If the model issues a tool call, message.tool_calls contains the function name and the arguments as a JSON string. Parse the arguments, run your function, then send a follow-up request that appends the assistant message (with its tool_calls) followed by one role: 'tool' message per call, each carrying the matching tool_call_id. Repeat until the model responds with text only.
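The round trip can be sketched without a network call. runToolCalls and localTools below are illustrative names, not part of the SDK; only the message shapes (tool_calls, role: 'tool', tool_call_id) come from the Chat Completions format:

```typescript
// Shape of an entry in message.tool_calls (Chat Completions format).
type ToolCall = {
  id: string;
  type: 'function';
  function: { name: string; arguments: string };
};

// Local implementations keyed by tool name; the weather data here is a stub.
const localTools: Record<string, (args: any) => unknown> = {
  get_weather: ({ city, unit }: { city: string; unit?: string }) => ({
    city,
    unit: unit ?? 'fahrenheit',
    temperature: 72,
    summary: 'Sunny',
  }),
};

// Convert the model's tool_calls into role: 'tool' messages for the follow-up request.
function runToolCalls(toolCalls: ToolCall[]) {
  return toolCalls.map((call) => {
    const args = JSON.parse(call.function.arguments);
    const result = localTools[call.function.name](args);
    return {
      role: 'tool' as const,
      tool_call_id: call.id,
      content: JSON.stringify(result),
    };
  });
}
```

Append the assistant message (including its tool_calls) and the returned role: 'tool' messages to messages, then call chat.completions.create again with the same tools.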

Vercel AI SDK

Define tools with the tool helper and a Zod inputSchema; generateText invokes each tool's execute function automatically when the model calls it. Reuse the same SUPA provider as in the Vercel AI SDK guide.
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const supa = createOpenAI({
  apiKey: process.env.SUPA_API_KEY,
  baseURL: 'https://api.supa.works/openai/v1',
});

const { text, toolCalls, toolResults } = await generateText({
  model: supa('google/gemma-3-27b-it'),
  prompt: 'What is the weather in Boston?',
  tools: {
    get_weather: tool({
      description: 'Get the current weather for a city',
      inputSchema: z.object({
        city: z.string(),
        unit: z.enum(['celsius', 'fahrenheit']).optional(),
      }),
      execute: async ({ city, unit }) => {
        return {
          city,
          unit: unit ?? 'fahrenheit',
          temperature: 72,
          summary: 'Sunny',
        };
      },
    }),
  },
});

console.log(text);
console.log(toolCalls, toolResults);

TanStack AI

Define tools with toolDefinition and .server() for the implementation. Use the same adapter configuration as in the TanStack AI guide.
import { chat, toolDefinition } from '@tanstack/ai';
import { createOpenaiChat } from '@tanstack/ai-openai';
import type { OpenAIChatModel } from '@tanstack/ai-openai';
import { z } from 'zod';

const adapter = createOpenaiChat(
  'google/gemma-3-27b-it' as OpenAIChatModel,
  process.env.SUPA_API_KEY!,
  { baseURL: 'https://api.supa.works/openai/v1' },
);

const getWeather = toolDefinition({
  name: 'get_weather',
  description: 'Get the current weather for a city',
  inputSchema: z.object({
    city: z.string(),
    unit: z.enum(['celsius', 'fahrenheit']).optional(),
  }),
  outputSchema: z.object({
    city: z.string(),
    temperature: z.number(),
    summary: z.string(),
  }),
}).server(async ({ city }) => ({
  city,
  temperature: 72,
  summary: 'Sunny',
}));

await chat({
  adapter,
  stream: false,
  messages: [
    {
      role: 'user',
      content: [{ type: 'text', content: 'What is the weather in Boston?' }],
    },
  ],
  tools: [getWeather],
});