GET /v1/models
List Models
curl --request GET \
  --url https://external.nebius.tensormesh.ai/v1/models \
  --header 'Authorization: Bearer <token>' \
  --header 'X-User-Id: <x-user-id>'
{
  "object": "list",
  "data": [
    {
      "id": "openai-gpt-oss-120b-gpu-type-h200x1_8nic16",
      "object": "model",
      "owned_by": "vllm"
    }
  ]
}
Use this page when you need the raw model catalog from the routed On-Demand inference host.
  • Auth: Authorization: Bearer <API_KEY>
  • Routing: requires the X-User-Id: <uuid> header
  • Host: choose the external Tensormesh host for your provider
  • Best for: discovering the exact served model string to reuse in routed On-Demand requests
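As a minimal sketch of the call above (the base URL and headers come from the curl example; the helper name and placeholder values are hypothetical), the following fetches the catalog and pulls out the served model id strings. The parsing step at the bottom runs against the sample response shown on this page, with no network access:

```python
import json
import urllib.request

BASE_URL = "https://external.nebius.tensormesh.ai"  # host from the curl example

def list_models(api_key: str, user_id: str) -> list[str]:
    """Fetch /v1/models and return the served model id strings."""
    req = urllib.request.Request(
        f"{BASE_URL}/v1/models",
        headers={
            "Authorization": f"Bearer {api_key}",  # On-Demand API key
            "X-User-Id": user_id,                  # required routing header (uuid)
        },
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload["data"]]

# Parsing only, using the example response body shown above:
example = {
    "object": "list",
    "data": [
        {
            "id": "openai-gpt-oss-120b-gpu-type-h200x1_8nic16",
            "object": "model",
            "owned_by": "vllm",
        }
    ],
}
ids = [m["id"] for m in example["data"]]
print(ids[0])  # the exact model string to reuse in routed On-Demand requests
```

The extracted id is what you pass as the `model` field in subsequent routed inference requests.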

Authorizations

Authorization
string · header · required

Bearer authentication using your On-Demand API key. Format: Bearer <API_KEY>

Headers

X-User-Id
string<uuid> · required

Tensormesh user id used for attribution and routing.

Response

Successful Response

object
string · required
Example: "list"

data
InferenceModel · object[] · required
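The response fields above can be sketched as typed containers. This is an assumption-laden illustration: the page names the `InferenceModel` object and its `id`/`object`/`owned_by` fields, while the `ModelList` class name and the parser helper are invented here for clarity:

```python
from dataclasses import dataclass

@dataclass
class InferenceModel:
    id: str        # served model string, e.g. the id from the example response
    object: str    # always "model" in the sample above
    owned_by: str  # serving backend, e.g. "vllm"

@dataclass
class ModelList:
    object: str                 # "list"
    data: list[InferenceModel]  # the model catalog entries

def parse_model_list(payload: dict) -> ModelList:
    """Turn the raw JSON body of GET /v1/models into typed objects."""
    return ModelList(
        object=payload["object"],
        data=[InferenceModel(**m) for m in payload["data"]],
    )
```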