
# Responses

`POST /v1/responses` accepts an OpenAI Responses-style request shape. It is useful for clients that send `input` instead of `messages`, such as Codex-like tools.

```text
POST https://api.rout.my/v1/responses
```

## Request

```bash
curl https://api.rout.my/v1/responses \
  -H "Authorization: Bearer $ROUTMY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "provider/model-id",
    "input": "Explain this API in one sentence."
  }'
```

## Request fields

| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| `model` | string | Yes | Exact model ID from `/v1/models`. |
| `input` | string or array | Yes | Plain text or an array of input messages. |
| `instructions` | string | No | Added as a system instruction before user input. |
| `stream` | boolean | No | Set `true` for Responses-style SSE events. |
| `temperature` | number | No | Sampling temperature. |
| `top_p` | number | No | Nucleus sampling. |
| `max_output_tokens` | integer | No | Maximum number of generated output tokens. |
| `tools` | array | No | Function tools in Responses format. |
| `tool_choice` | any | No | Tool selection. |
| `metadata` | object | No | Client metadata. |
| `previous_response_id` | string | No | Accepted for compatibility; the proxy is stateless. |
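As a convenience, the required and optional fields above can be assembled client-side before sending. The sketch below is a minimal, illustrative Python helper (the function name and argument subset are this page's invention, not part of the API):

```python
import json

# Minimal sketch of building a /v1/responses request body.
# Field names follow the request-fields table; only `model` and
# `input` are required. This helper covers a subset of the fields.

def build_responses_request(model, user_input, instructions=None,
                            stream=False, max_output_tokens=None):
    """Assemble a request body, omitting optional fields left unset."""
    body = {"model": model, "input": user_input}
    if instructions is not None:
        body["instructions"] = instructions
    if stream:
        body["stream"] = True
    if max_output_tokens is not None:
        body["max_output_tokens"] = max_output_tokens
    return body

body = build_responses_request(
    "provider/model-id",
    "Explain this API in one sentence.",
    instructions="Answer briefly.",
)
print(json.dumps(body, indent=2))
```

The resulting body is what the curl example above posts with `-d`.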

## Message input

```json
{
  "model": "provider/model-id",
  "instructions": "Answer in short paragraphs.",
  "input": [
    {
      "role": "user",
      "content": "What should I check before deploying?"
    }
  ]
}
```

Content parts of type `input_text` or `text` are converted to plain chat text internally.
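The conversion can be pictured roughly like this. This is an illustrative sketch of the described behavior, not the proxy's actual code:

```python
# Illustrative sketch: flatten Responses-style input (a string or a
# list of input messages) into plain chat messages, keeping only
# content parts of type "input_text" or "text" as documented above.

def to_chat_messages(input_value):
    """Normalize a string or list of input messages into chat messages."""
    if isinstance(input_value, str):
        return [{"role": "user", "content": input_value}]
    messages = []
    for item in input_value:
        content = item.get("content", "")
        if isinstance(content, list):
            # Text-bearing parts are joined; other part types are dropped
            # in this sketch.
            parts = [p.get("text", "") for p in content
                     if p.get("type") in ("input_text", "text")]
            content = "".join(parts)
        messages.append({"role": item.get("role", "user"),
                         "content": content})
    return messages
```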

## Response

```json
{
  "id": "resp_abc123",
  "object": "response",
  "created_at": 1744000000,
  "status": "completed",
  "model": "provider/model-id",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "status": "completed",
      "content": [
        {
          "type": "output_text",
          "text": "Check environment variables, API keys, and logs."
        }
      ]
    }
  ],
  "usage": {
    "input_tokens": 20,
    "output_tokens": 12,
    "total_tokens": 32
  }
}
```
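Given the response shape above, extracting the assistant text means walking `output` for `message` items and joining their `output_text` parts. A minimal, assumed helper:

```python
# Sketch of pulling assistant text out of a completed response,
# based on the JSON shape shown above. The helper name is this
# page's invention.

def output_text(response):
    """Concatenate every output_text part across message outputs."""
    chunks = []
    for item in response.get("output", []):
        if item.get("type") != "message":
            continue
        for part in item.get("content", []):
            if part.get("type") == "output_text":
                chunks.append(part.get("text", ""))
    return "".join(chunks)
```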

## Streaming

```json
{
  "model": "provider/model-id",
  "input": "Stream a short answer.",
  "stream": true
}
```

Streaming responses are delivered as Server-Sent Events (SSE). Each event carries an object describing a stage of generation, such as response creation, output text deltas, and completion.
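A client consumes the stream by reading `data:` lines and accumulating the text deltas. The sketch below assumes OpenAI-style event type names (e.g. `response.output_text.delta`) and a `[DONE]` terminator; inspect the actual stream to confirm the names your deployment emits:

```python
import json

# Minimal SSE consumption sketch. The event type name
# "response.output_text.delta" and the "[DONE]" sentinel are
# assumptions following the OpenAI Responses streaming convention.

def collect_stream_text(sse_lines):
    """Accumulate output text deltas from an iterable of SSE lines."""
    text = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip event: lines, comments, and blank separators
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        if event.get("type") == "response.output_text.delta":
            text.append(event.get("delta", ""))
    return "".join(text)
```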

## Stateless operations

The proxy returns response IDs for client compatibility. Retrieval, cancellation, and deletion endpoints may exist as compatibility paths, but generation itself is stateless: the proxy does not persist conversation state, so your client must store and resend any context it needs.
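In practice, a multi-turn client keeps the history itself and resends it as the `input` array on every request. A hypothetical sketch of that pattern (class and method names are illustrative):

```python
# Because generation is stateless, a multi-turn client resends the
# full history as the `input` array on each request, rather than
# relying on previous_response_id.

class Conversation:
    def __init__(self, model):
        self.model = model
        self.history = []  # input messages, oldest first

    def next_request(self, user_text):
        """Append the user turn and build the full stateless body."""
        self.history.append({"role": "user", "content": user_text})
        return {"model": self.model, "input": list(self.history)}

    def record_reply(self, assistant_text):
        """Store the assistant's output so the next turn includes it."""
        self.history.append({"role": "assistant",
                             "content": assistant_text})
```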

API documentation for rout.my.