Use TanStack AI with the @tanstack/ai-openai adapter and SUPA’s OpenAI-compatible API.
TanStack’s OpenAI adapter builds on the official openai package. Set baseURL to SUPA’s endpoint and pass your SUPA model id (for example google/gemma-3-27b-it).
TanStack AI currently types chat model names as OpenAI model ids, so cast custom SUPA model strings with a type assertion until the typings cover them. The runtime request still sends your SUPA model name to the API.
Prerequisites
- A SUPA API key from your profile page
- Node.js 18+ (TanStack AI recommends current Node LTS; see their docs for any stricter requirement)
Install
```shell
npm install @tanstack/ai @tanstack/ai-openai zod
```
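The adapter code below reads your key from process.env.SUPA_API_KEY, so it must be set before the process starts. One way to provide it (the placeholder value is yours to replace):

```shell
# Export the key so process.env.SUPA_API_KEY resolves at runtime.
# Substitute the key from your SUPA profile page.
export SUPA_API_KEY="your-supa-api-key"
```

For longer-lived projects, a .env file loaded by your framework or by Node’s --env-file flag works equally well.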
Create the adapter with createOpenaiChat, passing your model id, your SUPA API key, and an options object whose baseURL points at SUPA.
```typescript
import { createOpenaiChat } from '@tanstack/ai-openai';
import type { OpenAIChatModel } from '@tanstack/ai-openai';

const adapter = createOpenaiChat(
  'google/gemma-3-27b-it' as OpenAIChatModel,
  process.env.SUPA_API_KEY!,
  { baseURL: 'https://api.supa.works/openai/v1' },
);
```
Chat (non-streaming)
The TanStack AI chat function accepts your adapter and messages. Set stream: false to collect the full reply as a single string.
```typescript
import { chat } from '@tanstack/ai';

const text = await chat({
  adapter,
  stream: false,
  messages: [
    {
      role: 'user',
      content: [{ type: 'text', content: 'What is a banana?' }],
    },
  ],
});

console.log(text);
```
Streaming
Omit stream or set stream: true to receive an async iterable of stream chunks (see TanStack AI docs for consuming chunks).
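A minimal sketch of accumulating streamed text, assuming each chunk carries a delta string fragment — the real chunk type comes from the TanStack AI docs, and a mock async generator stands in here for the chat(...) call so the pattern is self-contained:

```typescript
// Hypothetical chunk shape: check TanStack AI's docs for the real type.
interface StreamChunk {
  delta: string;
}

// Mock stream standing in for `chat({ adapter, stream: true, ... })`.
async function* mockStream(): AsyncIterable<StreamChunk> {
  for (const delta of ['A banana ', 'is a fruit.']) {
    yield { delta };
  }
}

// Accumulate the streamed fragments into one string with `for await`.
async function collect(stream: AsyncIterable<StreamChunk>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.delta; // append each fragment as it arrives
  }
  return text;
}

collect(mockStream()).then((text) => console.log(text));
// prints "A banana is a fruit."
```

The same for await loop works on any async iterable, so swapping the mock for the real chat call only changes the chunk type.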
Next steps