
Documentation Index

Fetch the complete documentation index at: https://docs.apigo.ai/llms.txt

Use this file to discover all available pages before exploring further.

This page keeps only the HTTP endpoint notes plus direct cURL, Python, and Node.js request examples.

Endpoint Summary

  • POST /v1/chat/completions: Standard chat completion endpoint for multi-turn conversation, tool calling, and optional streaming.
  • POST /v1/responses: Newer unified response endpoint for structured output, multimodal input, and future capability expansion.

POST /v1/chat/completions

Standard chat completion endpoint for multi-turn conversation, tool calling, and optional streaming.

Request Notes

  • Authenticate with Authorization: Bearer ${YOUR_API_KEY}; the core payload fields are model and messages.
  • messages should preserve system, user, and assistant history in order; add "stream": true to the body for SSE output.
  • For OpenAI-compatible gateways, this is usually the safest default text endpoint to start with.

Response Notes

  • Synchronous output is typically read from choices[0].message.content.
  • When tool calling is enabled, handle tool_calls and the follow-up tool exchange together.
  • Streaming mode returns SSE chunks rather than one complete JSON response.
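
The streaming note above can be sketched in Python. This is a minimal, hedged example assuming OpenAI-style stream chunks, where each SSE `data:` line carries a JSON object with choices[0].delta.content and the stream ends with a `data: [DONE]` sentinel; parse_sse_lines is a hypothetical helper, not part of the API.

```python
import json

def parse_sse_lines(lines):
    """Collect text deltas from SSE 'data: ...' lines of a chat.completions stream.

    Assumes each chunk carries choices[0].delta.content and the stream
    ends with a '[DONE]' sentinel (the common OpenAI-compatible shape).
    """
    parts = []
    for raw in lines:
        if not raw.startswith("data: "):
            continue  # skip comments and keep-alive lines
        payload = raw[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)
```

With the requests library, the same parser can consume a streamed POST, e.g. `parse_sse_lines(line.decode() for line in resp.iter_lines() if line)` where `resp` was created with `stream=True` in both the JSON body and the requests call.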

Examples

cURL

chat.completions
curl --request POST \
  --url https://api.tokenops.ai/v1/chat/completions \
  --header "Authorization: Bearer ${YOUR_API_KEY}" \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-4.1-mini",
    "messages": [
      { "role": "system", "content": "You are a concise API assistant." },
      { "role": "user", "content": "Give me a contact form field definition." }
    ]
  }'

Python

requests
import os
import requests

# '${YOUR_API_KEY}' is shell syntax and is not expanded inside a Python
# string; read the key from the environment instead of hard-coding it.
api_key = os.environ['YOUR_API_KEY']

response = requests.post(
    'https://api.tokenops.ai/v1/chat/completions',
    headers={
        'Authorization': f'Bearer {api_key}',
        'Content-Type': 'application/json'
    },
    json={
        'model': 'gpt-4.1-mini',
        'messages': [
            {'role': 'system', 'content': 'You are a concise API assistant.'},
            {'role': 'user', 'content': 'Give me a contact form field definition.'}
        ]
    },
    timeout=60
)

print(response.json())

Node.js

fetch
// '${YOUR_API_KEY}' inside single quotes is a literal string in JavaScript;
// use a template literal and read the key from the environment.
const apiKey = process.env.YOUR_API_KEY

const response = await fetch('https://api.tokenops.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${apiKey}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    model: 'gpt-4.1-mini',
    messages: [
      { role: 'system', content: 'You are a concise API assistant.' },
      { role: 'user', content: 'Give me a contact form field definition.' }
    ]
  })
})

console.log(await response.json())

Response Example (200)

response
{
  "id": "chatcmpl_123",
  "object": "chat.completion",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "{\"name\":\"email\",\"type\":\"string\"}"
      }
    }
  ]
}
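
Since the assistant's content in the example above is itself a JSON string, the consumer has to decode it a second time. A minimal Python sketch, using the example payload verbatim:

```python
import json

# The 200 response above, as a Python dict.
payload = {
    "id": "chatcmpl_123",
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "{\"name\":\"email\",\"type\":\"string\"}"
            }
        }
    ]
}

# Synchronous output lives at choices[0].message.content ...
content = payload["choices"][0]["message"]["content"]

# ... and here the model returned JSON as a string, so decode it again.
field = json.loads(content)
print(field["name"])  # email
```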

POST /v1/responses

Newer unified response endpoint for structured output, multimodal input, and future capability expansion.

Request Notes

  • It still uses Bearer auth, but the main payload shape is centered on input and instructions rather than messages.
  • If you want one endpoint shape for text and structured output, prefer this over legacy chat.completions.
  • Newer response-format and multimodal features usually show up here first.

Response Notes

  • Consumers usually read from output[] or output_text instead of choices[0].message.
  • When the workflow becomes async or tool-driven, this endpoint usually exposes richer status fields.
  • Migration work should include field mapping, retries, and server-side logging.
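
The retry half of that migration checklist can be sketched as a small backoff wrapper. This is an illustrative helper under stated assumptions, not official client behavior: post_with_retries is hypothetical, `send` is any callable returning an object with a status_code attribute (for instance a functools.partial around requests.post), and the retryable statuses are the conventional 429 and 5xx set.

```python
import time

def post_with_retries(send, payload, max_attempts=4, base_delay=0.5):
    """Retry a POST on transient failures with exponential backoff.

    `send` is a callable taking the JSON payload and returning a
    response-like object with a `status_code` attribute. Retries on
    429 and common 5xx statuses; anything else returns immediately.
    """
    retryable = {429, 500, 502, 503, 504}
    for attempt in range(max_attempts):
        resp = send(payload)
        if resp.status_code not in retryable:
            return resp
        if attempt < max_attempts - 1:
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return resp  # last failed response, for the caller to log
```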

Examples

cURL

responses
curl --request POST \
  --url https://api.tokenops.ai/v1/responses \
  --header "Authorization: Bearer ${YOUR_API_KEY}" \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-4.1-mini",
    "input": "Return a JSON contact form field definition."
  }'

Python

requests
import os
import requests

# '${YOUR_API_KEY}' is shell syntax and is not expanded inside a Python
# string; read the key from the environment instead of hard-coding it.
api_key = os.environ['YOUR_API_KEY']

response = requests.post(
    'https://api.tokenops.ai/v1/responses',
    headers={
        'Authorization': f'Bearer {api_key}',
        'Content-Type': 'application/json'
    },
    json={
        'model': 'gpt-4.1-mini',
        'input': 'Return a JSON contact form field definition.'
    },
    timeout=60
)

print(response.json())

Node.js

fetch
// '${YOUR_API_KEY}' inside single quotes is a literal string in JavaScript;
// use a template literal and read the key from the environment.
const apiKey = process.env.YOUR_API_KEY

const response = await fetch('https://api.tokenops.ai/v1/responses', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${apiKey}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    model: 'gpt-4.1-mini',
    input: 'Return a JSON contact form field definition.'
  })
})

console.log(await response.json())

Response Example (200)

response
{
  "id": "resp_123",
  "status": "completed",
  "output": [
    {
      "type": "output_text",
      "text": "{\"name\":\"email\",\"type\":\"string\"}"
    }
  ]
}
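
Reading the example above from Python mirrors the choices[0].message.content access used with chat.completions, except the text comes from output[] items. A minimal sketch, assuming the output[] item shape shown in the example; response_output_text is a hypothetical convenience helper:

```python
def response_output_text(payload):
    """Concatenate the text of all output[] items of type "output_text"."""
    return "".join(
        item["text"]
        for item in payload.get("output", [])
        if item.get("type") == "output_text"
    )

# The 200 response above, as a Python dict.
resp = {
    "id": "resp_123",
    "status": "completed",
    "output": [
        {
            "type": "output_text",
            "text": "{\"name\":\"email\",\"type\":\"string\"}"
        }
    ]
}

print(response_output_text(resp))  # {"name":"email","type":"string"}
```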