A First Look at TanStack AI

Building AI-powered features in frontend applications often means choosing between type safety and flexibility. You either lock into a specific provider’s SDK and lose portability, or you write custom abstractions that sacrifice TypeScript’s compile-time guarantees. TanStack AI offers a different approach: a vendor-neutral AI SDK that prioritizes type safety without forcing you into a particular framework or provider.

This article introduces TanStack AI’s core concepts and explains why frontend developers should pay attention to this early-stage toolkit.

Key Takeaways

  • TanStack AI is a framework-agnostic, vendor-neutral AI toolkit with strong TypeScript support, currently in alpha.
  • Its adapter-based architecture provides per-model type inference, streaming-first design, and modular imports that keep bundle sizes small.
  • Isomorphic tools let you define a tool once and run it on the server or client with full type safety across both environments.
  • While not production-ready, TanStack AI addresses real pain points around provider lock-in and type safety that existing AI SDKs leave unresolved.

What Is TanStack AI?

TanStack AI is a framework-agnostic AI toolkit from the team behind TanStack Query and TanStack Table. It provides a unified interface for working with multiple AI providers—OpenAI, Anthropic, Gemini, Ollama, and others—while maintaining strong TypeScript support throughout.

The library is currently in alpha. APIs are changing rapidly, and the team has already shipped multiple architectural overhauls since the initial release. This isn’t production-ready infrastructure. It’s an emerging pattern worth understanding.

Core Ideas Behind the Type-Safe AI SDK

Schema-Driven Type Safety

TanStack AI treats type safety as a first-class concern. When you specify a model in your adapter, TypeScript immediately knows what options are available:

import { openaiText } from '@tanstack/ai-openai'
import { chat } from '@tanstack/ai' // core package path assumed

chat({
  adapter: openaiText('gpt-4'),
  temperature: 0.6,
})

The model lives inside the adapter call, which means autocomplete works instantly. You get per-model typing for provider-specific options without manual type annotations.

Tool and function definitions use Zod schemas (or JSON Schema), ensuring that inputs and outputs are validated at both compile time and runtime.
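As a concrete illustration of that dual guarantee, here is a plain Zod schema of the kind a tool definition accepts; the schema itself is standard Zod, nothing TanStack-specific:

import { z } from 'zod'

// The schema is the single source of truth for a tool's arguments.
const weatherInput = z.object({
  city: z.string(),
  unit: z.enum(['celsius', 'fahrenheit']),
})

// Compile time: TypeScript derives the argument type from the schema.
type WeatherInput = z.infer<typeof weatherInput>

// Runtime: the same schema rejects malformed arguments produced by the model.
const args: WeatherInput = weatherInput.parse({ city: 'Oslo', unit: 'celsius' })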

Streaming-First Architecture

Streaming is central to TanStack AI’s design. Rather than treating streaming as an afterthought, the SDK is built around it. This matters for chat interfaces, real-time transcription, and any application where users expect immediate feedback.
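In practice that means consuming responses incrementally. The sketch below assumes chat() returns an async iterable of chunks and accepts a messages option; both are assumptions about the alpha API rather than documented behavior:

import { openaiText } from '@tanstack/ai-openai'
import { chat } from '@tanstack/ai' // core package path assumed

async function streamAnswer(prompt: string, onToken: (text: string) => void) {
  const stream = chat({
    adapter: openaiText('gpt-4'),
    messages: [{ role: 'user', content: prompt }], // option name assumed
  })

  // Append tokens to the UI as they arrive instead of waiting for the full reply.
  for await (const chunk of stream) {
    onToken(chunk.text ?? '')
  }
}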

Modular Adapter Architecture

Recent releases split monolithic provider adapters into modality-specific imports:

import { openaiText, openaiImage, openaiVideo } from '@tanstack/ai-openai'

This approach keeps bundle sizes small. Import only what you need. The architecture also makes it easier for the team to add new modalities—image generation, transcription, text-to-speech—without updating every provider simultaneously.

TanStack AI React and Framework Support

While TanStack AI works with vanilla JavaScript and Solid, TanStack AI React provides hooks and patterns familiar to React developers. The library follows the same framework-agnostic philosophy as TanStack Query: core logic stays separate from framework bindings.

Current client libraries include:

  • Vanilla JavaScript
  • Preact
  • React
  • Solid

Additional framework support is planned.
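As a rough idea of what the React bindings look like in use, the sketch below assumes a useChat hook backed by a server endpoint; the hook name, package path, option names, and message shape are illustrative assumptions, not confirmed API:

import { useChat } from '@tanstack/react-ai' // package and hook name assumed

function ChatBox() {
  // The hook owns message state and applies streaming updates as they arrive.
  // '/api/chat' is a placeholder route that would keep the provider API key on the server.
  const { messages, sendMessage } = useChat({ endpoint: '/api/chat' })

  return (
    <div>
      {messages.map((message, index) => (
        <p key={index}>{message.role}: {message.content}</p>
      ))}
      <button onClick={() => sendMessage('Hello!')}>Send</button>
    </div>
  )
}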

Isomorphic Tools: Server and Client Execution

One distinctive feature is the isomorphic tool system. You define a tool once using toolDefinition(), then provide environment-specific implementations with .server() or .client() methods. This delivers type safety across your entire application while letting tools execute in the appropriate context.

This pattern is particularly useful when some operations require server-side API keys while others can run entirely in the browser.
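A sketch of the pattern, following the toolDefinition(), .server(), and .client() methods described above; the field names and handler signatures are assumptions about the alpha API:

import { z } from 'zod'
import { toolDefinition } from '@tanstack/ai' // core package path assumed

// Define the tool once: name, description, and a Zod input schema.
const lookupCustomer = toolDefinition({
  name: 'lookupCustomer',
  description: 'Fetch customer details by id',
  inputSchema: z.object({ customerId: z.string() }), // field name assumed
})

// Server implementation: safe to use a secret API key here.
export const lookupCustomerOnServer = lookupCustomer.server(async ({ customerId }) => {
  const response = await fetch(`https://api.example.com/customers/${customerId}`, {
    headers: { Authorization: `Bearer ${process.env.CRM_API_KEY}` },
  })
  return response.json()
})

// Client implementation: same typed input, runs entirely in the browser.
export const lookupCustomerInBrowser = lookupCustomer.client(async ({ customerId }) => {
  return JSON.parse(sessionStorage.getItem(`customer:${customerId}`) ?? 'null')
})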

How It Compares to Existing AI SDK Patterns

TanStack AI positions itself as a vendor-neutral alternative to the Vercel AI SDK. Key differences include:

  • Framework agnostic: Works with any JavaScript framework, not just Next.js
  • No service layer: Connect directly to providers without intermediaries
  • Modular bundles: Import only the modalities you need
  • Open protocol: Pure open source with no platform dependencies

The tradeoff is maturity. Vercel’s SDK has more production mileage and documentation. TanStack AI is moving fast, which means more flexibility but less stability.

What’s on the Roadmap

The team has outlined several upcoming features:

  • Standard Schema support (removing the Zod requirement)
  • Middleware patterns
  • Headless UI components for AI interfaces
  • Additional provider adapters (AWS Bedrock, OpenRouter)
  • Devtools and usage reporting

Conclusion

TanStack AI isn’t ready for production applications that need stable APIs. But if you’re exploring AI SDK patterns, building prototypes, or evaluating options for future projects, it’s worth experimenting with.

The combination of strong TypeScript support, framework flexibility, and modular architecture addresses real pain points in current AI tooling. As the library matures, these foundations could make it a compelling choice for teams that want control over their AI stack without sacrificing developer experience.

Start with the official documentation and expect things to change.

FAQs

Is TanStack AI ready for production use?

TanStack AI is currently in alpha, and its APIs are changing frequently. The team has shipped multiple breaking architectural changes since launch. It is best suited for prototyping, experimentation, and evaluating future tooling options rather than production workloads that require stable, well-documented interfaces.

How does TanStack AI differ from the Vercel AI SDK?

TanStack AI is framework agnostic and connects directly to AI providers without a service layer. It offers modular imports per modality and has no platform dependencies. The Vercel AI SDK is more mature with broader documentation but is more tightly coupled to the Next.js ecosystem.

Which AI providers does TanStack AI support?

TanStack AI currently supports OpenAI, Anthropic, Gemini, Ollama, and others through its adapter system. The roadmap includes additional providers such as AWS Bedrock and OpenRouter. Each provider adapter is modular, so you only import the modalities you actually use in your application.

Does TanStack AI require Zod for schemas?

Currently, TanStack AI uses Zod schemas for defining tool inputs and outputs, providing both compile-time and runtime validation. However, the team has Standard Schema support on its roadmap, which will allow alternative schema libraries and remove the hard dependency on Zod in future releases.
