Commit c3ccc2d
blog: type-safe provider tools in TanStack AI (#847)
* blog: type-safe provider tools in TanStack AI Adds a blog post introducing per-model compile-time type safety for provider tools, announcing the new /tools subpath on every adapter and the phantom-branded ProviderTool type that gates incompatible model + tool pairings at compile time. * ci: apply automated fixes --------- Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
1 parent eb49d0e commit c3ccc2d

---
title: 'Your AI Tool Calls Should Fail at Compile Time, Not in Production'
published: 2026-04-22
excerpt: Provider tools like web search and code execution are supported on some models and silently ignored on others. TanStack AI now gates them per model at the type level, so incompatible pairings fail at compile time instead of in production.
authors:
  - Alem Tuzlak
---

![Type-safe provider tools in TanStack AI](/blog-assets/type-safe-provider-tools-tanstack-ai/header.png)

You wire up `webSearchTool` on your Anthropic adapter. You ship. A week later, a user complains the model's answers feel weirdly out of date. You dig in. Turns out the model you rolled out to half your users doesn't actually support that tool. No error. No exception. The model just quietly pretends the tool doesn't exist and makes up an answer.

This is the worst kind of bug: type-clean, lint-clean, test-clean, production-loud.

As of the latest TanStack AI release, this class of bug is impossible to ship. Provider tools are now gated at the type level, per model, at compile time. If the pairing is invalid, TypeScript tells you on the line where you pass the tool.

## Why provider tools fail silently

A "provider tool" is a native capability the provider hosts for you. Anthropic web search. OpenAI code interpreter. Gemini Google search. Computer use. Bash. Text editor. These are not functions you wrote; they're first-class features the provider's infrastructure runs on the model's behalf.

Here's the catch: support is per model, not per provider.

| Model              | Provider tools supported                                                |
| ------------------ | ----------------------------------------------------------------------- |
| `claude-3-haiku`   | `web_search` only                                                       |
| `claude-3-5-haiku` | web tools only (`web_search`, `web_fetch`)                              |
| `claude-opus-4-6`  | full superset (web, code execution, computer use, bash, editor, memory) |
| `gpt-3.5-turbo`    | none                                                                    |
| `gpt-5` family     | full superset                                                           |
| `gemini-3-pro`     | full superset                                                           |
| `gemini-lite`      | narrower subset                                                         |

If you pass `computerUseTool` to `claude-3-haiku`, the provider either rejects the request or, more commonly, the model ignores the tool and generates a confident-sounding response anyway. No stack trace. No warning. Nothing for your tests to catch, because the response shape is valid, just wrong.

The SDK cannot help you here unless the type system knows which model accepts which tool. That is the gap this release fills.
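
To see why a single structural `Tool` type cannot catch this, here is a minimal hypothetical sketch. `buildRequest` and its signature are illustrative stand-ins, not TanStack AI's API:

```typescript
// With one flat Tool type, the compiler has nothing to distinguish a
// supported provider tool from an unsupported one.
interface Tool {
  name: string
}

// Stand-in for an SDK call; the signature accepts any Tool for any model.
function buildRequest(
  model: string,
  tools: Tool[],
): { model: string; tools: string[] } {
  return { model, tools: tools.map((t) => t.name) }
}

const computerUse: Tool = { name: 'computer_use' }

// Compiles cleanly even though claude-3-haiku does not support computer use;
// the mismatch can only surface (or be silently swallowed) at runtime.
const req = buildRequest('claude-3-haiku', [computerUse])
```

The call compiles because the type system only checks that each argument is *a* tool, not that it is a tool this model accepts.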

## The broken path, before

Before this release, every tool factory returned a plain `Tool`. The compiler treated them as interchangeable:

```typescript
import { chat } from '@tanstack/ai'
import { anthropicText } from '@tanstack/ai-anthropic'
import { computerUseTool } from '@tanstack/ai-anthropic/tools'

const stream = chat({
  adapter: anthropicText('claude-3-haiku'),
  tools: [
    computerUseTool({
      /* ... */
    }),
  ],
})
```

Ships clean. Passes CI. Fails in the wild.
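
Before type gating, the only defense was a hand-rolled runtime guard. A sketch of that workaround, where the capability table and `assertToolSupported` helper are hypothetical, not part of TanStack AI:

```typescript
// Hypothetical pre-gating workaround: maintain your own compatibility
// table and validate each model + tool pairing at runtime.
const SUPPORTED_PROVIDER_TOOLS: Record<string, readonly string[]> = {
  'claude-3-haiku': ['web_search'],
  'claude-3-5-haiku': ['web_search', 'web_fetch'],
  'gpt-3.5-turbo': [],
}

function assertToolSupported(model: string, toolKind: string): void {
  const supported = SUPPORTED_PROVIDER_TOOLS[model] ?? []
  if (!supported.includes(toolKind)) {
    throw new Error(
      `Model "${model}" does not support provider tool "${toolKind}"`,
    )
  }
}
```

A guard like this only fires if it runs on the affected code path, and the hand-maintained table drifts as providers ship new models. The type-level gate removes both failure modes.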

## The fix: phantom-branded ProviderTool

Every provider tool factory now returns a `ProviderTool<TProvider, TKind>` brand. The adapter carries a `toolCapabilities` list in its type channel, derived from each model's `supports.tools` array. TypeScript gates the `tools: [...]` array so only brands in that list are assignable.

Same code, now:

```typescript
const stream = chat({
  adapter: anthropicText('claude-3-haiku'),
  tools: [
    computerUseTool({
      /* ... */
    }),
    // ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    // Type 'AnthropicComputerUseTool' is not assignable to type
    // 'Tool & { "~toolKind"?: never } | ProviderTool<string, "web_search">'.
  ],
})
```

The error lands on the array element, exactly where you would fix it. Swap the model to `claude-opus-4-6` and the same line compiles. Swap `computerUseTool` for `webSearchTool` and it compiles on `claude-3-haiku` too, because that model supports web search.

You don't have to remember the compatibility matrix. The compiler remembers it for you.

## Your own tools stay frictionless

Gating only makes sense for provider-hosted tools, because only those have per-model support differences. Tools you write yourself are your code; they run wherever your code runs. The `toolDefinition(...)` helper in `@tanstack/ai` returns a plain unbranded `Tool`, and the `customTool(...)` factories from the OpenAI and Anthropic adapters do the same. All of them are universally assignable:

```typescript
import { toolDefinition } from '@tanstack/ai'
import { webSearchTool } from '@tanstack/ai-anthropic/tools'

const myTool = toolDefinition({
  /* ... */
})

chat({
  adapter: anthropicText('claude-3-haiku'),
  tools: [
    myTool, // fine, always
    webSearchTool({
      /* ... */
    }), // fine, claude-3-haiku supports it
  ],
})
```

No workarounds. No casts. User-defined tools behave the way they always have.

## How it works under the hood

Three ingredients make this safe:

1. **Per-model capability declarations.** Each model constant declares a `supports.tools` array: `['web_search']`, or the full superset, or `[]`. This is the ground truth.

2. **A capability map lifted into types.** A mapped type (for example `AnthropicChatModelToolCapabilitiesByName`) converts the runtime `supports.tools` into a type map indexed by model name. The text adapter threads this through a `toolCapabilities` generic on its `~types` channel.

3. **A gated discriminated union on `tools`.** `TextActivityOptions['tools']` is declared as:

```typescript
type ToolsFor<TAdapter> = ReadonlyArray<
  | (Tool & { '~toolKind'?: never })
  | ProviderTool<string, TAdapter['~types']['toolCapabilities'][number]>
>
```

Unbranded tools match the first arm. Provider tools only match the second arm if their `TKind` is in the current adapter's capability list.

"Phantom-branded" means the `TKind` tag exists only in the type system. At runtime, a `ProviderTool` is a plain object; the brand is erased by the TypeScript compiler. Zero bundle cost. Zero runtime cost. The guarantee is entirely in the types.

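
The three ingredients can be condensed into one self-contained sketch. Every name below is illustrative; the real declarations live in the adapters:

```typescript
// 1. Per-model capability declarations, kept as literal types via `as const`.
const MODELS = {
  'claude-3-haiku': { supports: { tools: ['web_search'] } },
  'claude-opus-4-6': { supports: { tools: ['web_search', 'computer_use'] } },
} as const

// 2. The capability map lifted into types: model name -> union of tool kinds.
type ToolCapabilitiesByName = {
  [K in keyof typeof MODELS]: (typeof MODELS)[K]['supports']['tools'][number]
}

// 3. The phantom brand and the gated union. The brand property is optional
// and never assigned at runtime, so branded tools stay plain objects.
interface Tool {
  name: string
}
type ProviderTool<TKind extends string> = Tool & { readonly '~toolKind'?: TKind }

type ToolsFor<TModel extends keyof typeof MODELS> = ReadonlyArray<
  (Tool & { '~toolKind'?: never }) | ProviderTool<ToolCapabilitiesByName[TModel]>
>

const webSearch: ProviderTool<'web_search'> = { name: 'web_search' }
const computerUse: ProviderTool<'computer_use'> = { name: 'computer_use' }
const myTool: Tool = { name: 'lookup_order' }

const ok: ToolsFor<'claude-3-haiku'> = [myTool, webSearch] // compiles
const alsoOk: ToolsFor<'claude-opus-4-6'> = [computerUse] // compiles
// @ts-expect-error computer_use is not in claude-3-haiku's capability list
const rejected: ToolsFor<'claude-3-haiku'> = [computerUse]
```

The `@ts-expect-error` line is the whole feature: the invalid pairing is rejected on the line where the tool is passed, at zero runtime cost.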
## Why per-provider, not per-gateway

There is a popular pattern in the AI SDK space: abstract tools behind a gateway layer so one interface "works everywhere." That has real ergonomic value. It also has a ceiling: the lowest common denominator of what every provider supports, with every provider's specific options flattened away.

TanStack AI bets the other direction. You want _Anthropic's_ web search with its actual options (`max_uses`, domain filters, citation formatting), not a generic web-search shape that has to marshal onto four different providers. You want OpenAI's code interpreter with its full container model, not a lowest-common-denominator sandbox.

The tradeoff of the native approach is obvious: more surface to misuse. That is exactly what per-model type gating fixes. You get the full provider-native tool surface, _and_ you get a compiler that refuses to let you misapply it.

## Tree-shakeable `/tools` subpath

Every adapter now exports its provider tools from a dedicated `/tools` subpath:

```typescript
import { webSearchTool } from '@tanstack/ai-anthropic/tools'
import { codeInterpreterTool } from '@tanstack/ai-openai/tools'
import { googleSearchTool } from '@tanstack/ai-gemini/tools'
import { webSearchTool as openRouterWebSearch } from '@tanstack/ai-openrouter/tools'
```

If you don't import a tool, it doesn't end up in your bundle. Adapter roots stay lean; tool factories ship on demand. The same pattern holds across `@tanstack/ai-anthropic`, `@tanstack/ai-openai`, `@tanstack/ai-gemini`, `@tanstack/ai-openrouter`, `@tanstack/ai-grok`, and `@tanstack/ai-groq`.

## Try it

The AI tool-calling surface has always been underspecified at the type level. Models advertise capabilities they partially implement. SDKs promise uniformity they cannot fully deliver. The result is a class of bugs that slips past tests and only shows up when a real user asks a real question.

TanStack AI closes that gap at the exact point it matters: the line where you hand tools to a model. If the pairing is invalid, you know before you commit.

Upgrade your `@tanstack/ai-*` adapters, import provider tools from the `/tools` subpath, and read the [Provider Tools guide](https://tanstack.com/ai/latest/docs/tools/provider-tools) for the per-model matrix.
