Mirror of https://github.com/HKUDS/nanobot.git, synced 2026-04-14 23:19:55 +00:00
docs(provider): clarify responses api routing
parent 85c7996766
commit d33bf22e91
24
README.md
@@ -1053,6 +1053,30 @@ Connects directly to any OpenAI-compatible endpoint — LM Studio, llama.cpp, To
```

> For local servers that don't require a key, set `apiKey` to any non-empty string (e.g. `"no-key"`).
>
> `custom` is the right choice for providers that expose an OpenAI-compatible **chat completions** API. It does **not** force third-party endpoints onto the OpenAI/Azure **Responses API**.
>
> If your proxy or gateway is specifically Responses-API-compatible, use the `azure_openai` provider shape instead and point `apiBase` at that endpoint:
>
> ```json
> {
>   "providers": {
>     "azure_openai": {
>       "apiKey": "your-api-key",
>       "apiBase": "https://api.your-provider.com",
>       "defaultModel": "your-model-name"
>     }
>   },
>   "agents": {
>     "defaults": {
>       "provider": "azure_openai",
>       "model": "your-model-name"
>     }
>   }
> }
> ```
>
> In short: **chat-completions-compatible endpoint → `custom`**; **Responses-compatible endpoint → `azure_openai`**.
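>
> For the `custom` route, here is a minimal sketch of the mirrored config. It assumes the `custom` provider accepts the same `apiKey`/`apiBase`/`defaultModel` fields as shown above; the local endpoint URL is purely illustrative, so substitute your server's actual address:
>
> ```json
> {
>   "providers": {
>     "custom": {
>       "apiKey": "no-key",
>       "apiBase": "http://localhost:8080/v1",
>       "defaultModel": "your-model-name"
>     }
>   },
>   "agents": {
>     "defaults": {
>       "provider": "custom",
>       "model": "your-model-name"
>     }
>   }
> }
> ```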

</details>