## Documentation Index

Fetch the complete documentation index at: https://docs.tensormesh.ai/llms.txt
Use this file to discover all available pages before exploring further.
The public repository includes a Next.js starter under `examples/next-starter`. It is reference code for application integration and is not included in the npm package tarball.
It demonstrates:

- Streaming chat with `streamText`
- Structured output with Zod-backed schemas
- Tool calling with AI SDK tools
- Runtime model selection from `/v1/models`
- Serverless by default, with optional on-demand inference settings
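To make the streaming-chat item concrete, the flow can be sketched at the HTTP level, assuming the backend exposes an OpenAI-compatible `/v1/chat/completions` endpoint. The starter itself goes through the AI SDK's `streamText`; the function names below are illustrative, not part of the provider's API.

```typescript
// Sketch of a streaming chat request against an OpenAI-compatible
// endpoint. Hypothetical helper names; the starter uses streamText
// from the AI SDK instead of raw fetch.

// Build the JSON body for a streaming chat completion request.
function buildChatRequest(model: string, prompt: string) {
  return {
    model,
    stream: true,
    messages: [{ role: "user", content: prompt }],
  };
}

// Issue the request and print server-sent-event chunks as they arrive.
async function streamChat(
  baseUrl: string,
  apiKey: string,
  model: string,
  prompt: string,
): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  });
  const decoder = new TextDecoder();
  // Each SSE chunk carries JSON with an incremental token delta.
  for await (const chunk of res.body as any) {
    console.log(decoder.decode(chunk, { stream: true }));
  }
}
```

In the starter, the equivalent is a `streamText` call wired to the provider's chat model; see the example source for the exact setup.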
Clone the repository and install dependencies:

```bash
git clone https://github.com/Tensormesh-Production/tensormesh-ai-sdk-provider.git
cd tensormesh-ai-sdk-provider/examples/next-starter
npm install
cp .env.local.example .env.local
```
Set at least:

```bash
TENSORMESH_INFERENCE_API_KEY=your-api-key
TENSORMESH_CHAT_MODEL=mistralai/Devstral-2-123B-Instruct-2512
```

Optional:

```bash
TENSORMESH_STRUCTURED_MODEL=mistralai/Devstral-2-123B-Instruct-2512
TENSORMESH_TOOL_MODEL=mistralai/Devstral-2-123B-Instruct-2512
TENSORMESH_BASE_URL=https://YOUR_ON_DEMAND_BASE_URL/v1
TENSORMESH_USER_ID=your-user-id
```
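These variables can be pulled together into one config object. The sketch below is hypothetical (the starter's actual helper names may differ), but it captures the split above: the API key and chat model are required, and the optional model variables fall back to the chat model when unset.

```typescript
// Hypothetical config loader illustrating how the starter's env vars
// relate: required key and chat model, optional overrides with fallbacks.

interface StarterConfig {
  apiKey: string;
  chatModel: string;
  structuredModel: string;
  toolModel: string;
  baseUrl?: string; // undefined → provider's default endpoint
  userId?: string;
}

function loadConfig(env: Record<string, string | undefined>): StarterConfig {
  const apiKey = env.TENSORMESH_INFERENCE_API_KEY;
  const chatModel = env.TENSORMESH_CHAT_MODEL;
  if (!apiKey || !chatModel) {
    throw new Error(
      "TENSORMESH_INFERENCE_API_KEY and TENSORMESH_CHAT_MODEL are required",
    );
  }
  return {
    apiKey,
    chatModel,
    // Optional models fall back to the chat model when unset.
    structuredModel: env.TENSORMESH_STRUCTURED_MODEL ?? chatModel,
    toolModel: env.TENSORMESH_TOOL_MODEL ?? chatModel,
    // Only set for on-demand inference deployments.
    baseUrl: env.TENSORMESH_BASE_URL,
    userId: env.TENSORMESH_USER_ID,
  };
}
```

In a Next.js app, this would typically be called once on the server with `process.env` and the result passed to the provider setup.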
See the starter README for the full setup.