cURL

```shell
curl --request POST \
  --url https://serverless.tensormesh.ai/v1/responses \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "openai/gpt-oss-20b",
    "input": "Say hello."
  }'
```
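As a sketch, the cURL request above can be reproduced with only the Python standard library. The URL, headers, and body mirror the example; the `max_output_tokens` field name is an assumption for the optional output-token limit, since this page describes the limit but does not name the field.

```python
import json
import urllib.request

# Base URL documented for Tensormesh Serverless.
BASE_URL = "https://serverless.tensormesh.ai"


def build_request(api_key, model, input_text, max_output_tokens=None):
    """Build a POST request for /v1/responses, mirroring the cURL example."""
    body = {"model": model, "input": input_text}
    if max_output_tokens is not None:
        # Field name is an assumption; the docs only describe an optional
        # limit for generated output tokens.
        body["max_output_tokens"] = max_output_tokens
    return urllib.request.Request(
        f"{BASE_URL}/v1/responses",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_request("YOUR_API_KEY", "openai/gpt-oss-20b", "Say hello.")
print(req.full_url)  # → https://serverless.tensormesh.ai/v1/responses
```

Sending the request (`urllib.request.urlopen(req)`) requires a valid serverless API key in place of `YOUR_API_KEY`.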
```json
{
  "id": "resp_123",
  "object": "response",
  "model": "<string>",
  "output": [
    {
      "id": "out_123",
      "type": "message",
      "role": "<string>",
      "status": "<string>",
      "content": [
        {
          "type": "output_text",
          "text": "hello",
          "annotations": [{}]
        }
      ]
    }
  ],
  "created_at": 123,
  "status": "<string>"
}
```
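Given the response shape shown above, the generated text sits inside `output[].content[]` items of type `output_text`. A small helper can extract it; this is a sketch based only on the documented example, not an official SDK utility.

```python
import json

# Example response body, taken from the documentation above.
response_json = """
{
  "id": "resp_123",
  "object": "response",
  "model": "openai/gpt-oss-20b",
  "output": [
    {
      "id": "out_123",
      "type": "message",
      "role": "assistant",
      "status": "completed",
      "content": [
        {"type": "output_text", "text": "hello", "annotations": [{}]}
      ]
    }
  ],
  "created_at": 123,
  "status": "completed"
}
"""


def first_output_text(resp):
    """Walk output -> message -> content and return the first output_text."""
    for item in resp.get("output", []):
        if item.get("type") != "message":
            continue
        for part in item.get("content", []):
            if part.get("type") == "output_text":
                return part["text"]
    raise ValueError("no output_text found in response")


resp = json.loads(response_json)
print(first_output_text(resp))  # → hello
```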
Create responses on the verified Tensormesh Serverless host.
Authorization header: `Authorization: Bearer <API_KEY>`
Base URL: `https://serverless.tensormesh.ai`
SDK path: `client.inference.serverless.responses`
Authorization: Bearer authentication using your serverless API key. Format: `Bearer <API_KEY>`
model: Serverless model name to use. Example: `"openai/gpt-oss-20b"`
input: Input passed to the responses endpoint.
There is also an optional limit for generated output tokens.
Successful Response