
Inference Gateway API

The Tensormesh Inference Gateway provides an OpenAI-compatible chat completions endpoint for running inference on your deployed models. Base URL: https://external.nebius.tensormesh.ai

Authentication

All requests require two headers:
Header           Description
Authorization    Bearer token with your API key
X-User-Id        Your unique user ID (UUID format)
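As a minimal sketch, the two required headers (plus Content-Type for JSON bodies) can be assembled like this in Python; the placeholder values stand in for your own credentials:

```python
# Placeholders -- substitute your real credentials.
API_KEY = "<your-api-key>"
USER_ID = "<your-user-id>"

headers = {
    "Authorization": f"Bearer {API_KEY}",  # API key as a Bearer token
    "X-User-Id": USER_ID,                  # your UUID user ID
    "Content-Type": "application/json",    # needed for JSON request bodies
}
```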

Quick Example

curl https://external.nebius.tensormesh.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "X-User-Id: <your-user-id>" \
  -H "Authorization: Bearer <your-api-key>" \
  -d '{
    "model": "<your-model-deployment-id>",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Hello!" }
    ]
  }'

Tensormesh Control Plane API

The Control Plane API lets you programmatically manage models, billing, observability, support tickets, and user accounts. Base URL: https://api.tensormesh.ai
Authentication for Control Plane APIs uses a JWT Bearer token obtained through your login session.
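A Control Plane call is authenticated the same way as any Bearer-token request; a minimal sketch follows. The path "/v1/models" here is hypothetical, used only to illustrate the header, and the real endpoints are listed in the sidebar:

```python
import urllib.request

# Sketch of an authenticated Control Plane request. JWT is a placeholder
# for the token from your login session; the path "/v1/models" is
# hypothetical -- consult the API groups in the sidebar for real endpoints.
JWT = "<jwt-from-your-login-session>"

req = urllib.request.Request(
    "https://api.tensormesh.ai/v1/models",
    headers={"Authorization": f"Bearer {JWT}"},
)
# resp = urllib.request.urlopen(req)  # uncomment to send the request
```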
Browse the API groups in the sidebar to explore all available endpoints.