The provider includes direct helpers for Tensormesh endpoints that are not represented by an AI SDK language model object.
Models
```ts
import { tensormesh } from "@tensormesh/ai-sdk-provider";

const models = await tensormesh.models.list();
console.log(models.data.map((model) => model.id));
```
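The list result can also drive model selection at startup. The sketch below assumes the `{ data: [{ id }] }` shape shown above; `pickModel` is a hypothetical helper, not part of the provider.

```ts
type ModelList = { data: { id: string }[] };

// Return the first preferred model id the endpoint actually serves,
// or undefined if none of the candidates are available.
function pickModel(models: ModelList, preferred: string[]): string | undefined {
  const available = new Set(models.data.map((m) => m.id));
  return preferred.find((id) => available.has(id));
}
```

Pass the result of `await tensormesh.models.list()` as the first argument and a ranked list of model ids as the second.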
Responses
```ts
import { tensormesh } from "@tensormesh/ai-sdk-provider";

const response = await tensormesh.responses.create({
  model: "mistralai/Devstral-2-123B-Instruct-2512",
  input: "Write a short product announcement.",
});
console.log(response);
```
For streaming Responses API calls:
```ts
import { tensormesh } from "@tensormesh/ai-sdk-provider";

const stream = await tensormesh.responses.stream({
  model: "mistralai/Devstral-2-123B-Instruct-2512",
  input: "Stream a short product announcement.",
});

const reader = stream.body?.getReader();
const decoder = new TextDecoder();
if (reader) {
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}
```
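If you want the full streamed body as one string rather than printing chunks as they arrive, the read loop above generalizes to a small drain helper. This is a sketch over the standard web `ReadableStream` API (available in Node 18+), independent of Tensormesh's exact response shape; `readAll` is a hypothetical name.

```ts
// Drain a web ReadableStream of bytes into a single decoded string.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered multi-byte sequence
}
```

With the streaming call above, `await readAll(stream.body!)` yields the complete response text.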
Tokenize And Detokenize
```ts
import { tensormesh } from "@tensormesh/ai-sdk-provider";

const tokenized = await tensormesh.tokenize.create({
  model: "mistralai/Devstral-2-123B-Instruct-2512",
  prompt: "Hello from Tensormesh",
});

const detokenized = await tensormesh.detokenize.create({
  model: "mistralai/Devstral-2-123B-Instruct-2512",
  tokens: tokenized.tokens,
});
```
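A common use of this pair is context-window budgeting: tokenize the prompt, trim the token array to fit, then detokenize the trimmed sequence back to text. The sketch below assumes `tokenized.tokens` is a flat `number[]` as used above; `clampToContext` and its default reserve are hypothetical, not provider API.

```ts
// Trim a token sequence so that prompt + reserved reply tokens fit the
// model's context window, keeping the most recent tokens (the tail).
function clampToContext(
  tokens: number[],
  contextLimit: number,
  replyReserve = 512,
): number[] {
  const budget = Math.max(0, contextLimit - replyReserve);
  return tokens.length <= budget ? tokens : tokens.slice(tokens.length - budget);
}
```

The clamped array can be passed straight to `tensormesh.detokenize.create` to recover the truncated prompt text.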
Health And Version
```ts
import { tensormesh } from "@tensormesh/ai-sdk-provider";

const health = await tensormesh.health.get();
const version = await tensormesh.version.get();
console.log({ health, version });
```
The serverless models, health, and version endpoints can be called without an inference API key. Generation, Responses API, tokenize, and detokenize requests require an API key.
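The key-requirement split described above can be captured in a small lookup and checked before dispatching a request. The names mirror the helpers in this section; the lookup and `assertKeyFor` are a hypothetical convenience, not part of the provider.

```ts
// Which helper endpoints need an inference API key, per the note above.
const requiresApiKey: Record<string, boolean> = {
  models: false,
  health: false,
  version: false,
  responses: true,
  tokenize: true,
  detokenize: true,
};

// Fail fast with a clear error instead of a 401 from the server.
function assertKeyFor(endpoint: string, apiKey?: string): void {
  if (requiresApiKey[endpoint] && !apiKey) {
    throw new Error(`Tensormesh endpoint "${endpoint}" requires an inference API key`);
  }
}
```

For example, `assertKeyFor("responses", process.env.TENSORMESH_API_KEY)` would throw early when the key is unset, while `assertKeyFor("health")` always passes.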