Adding custom external models is not required to proxy requests to Anthropic, Gemini, or OpenAI models. See our docs on proxying to Anthropic, Gemini, or OpenAI for more information.
To proxy requests to models from unsupported providers, you’ll need to complete the following steps:
1. Add an external model provider

To add an external model provider to your project, follow the instructions in External Models. Once the provider has been added, continue to the next step.

2. Update your chat completion requests

Set the model parameter in your requests to match this format: openpipe:<external-model-provider-slug>/<external-model-slug>. For example, if you're calling gpt-4o-2024-08-06 on Azure, the model parameter should be openpipe:custom-azure-provider/gpt-4o-2024-08-06.
```python
from openpipe import OpenAI

# Find the config values in "Installing the SDK"
client = OpenAI()

completion = client.chat.completions.create(
    model="openpipe:custom-azure-provider/gpt-4o-2024-08-06",
    messages=[{"role": "system", "content": "count to 10"}],
    metadata={"prompt_id": "counting", "any_key": "any_value"},
)
```
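Because the openpipe client mirrors the OpenAI Python SDK, the returned object follows the standard chat completion shape. As a quick sketch, you can read the model's reply from the response above like this:

```python
# Read the assistant's reply from the standard OpenAI response shape.
print(completion.choices[0].message.content)
```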
External models can also be used for filtering and relabeling your data. We currently support custom external models for providers with OpenAI- and Azure-compatible endpoints. If you'd like support for an external provider with a different API format, send a request to hello@openpipe.ai.
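The same model-string format applies to any compatible endpoint you configure. As a further illustration, here is a minimal sketch calling a provider added under the hypothetical slug my-openai-compatible-provider serving a model named my-model; both names are placeholders, not real slugs, so substitute whatever you configured in External Models:

```python
from openpipe import OpenAI

# Find the config values in "Installing the SDK"
client = OpenAI()

# "my-openai-compatible-provider" and "my-model" are hypothetical
# placeholders; replace them with the provider slug and model slug
# you configured in the External Models settings.
completion = client.chat.completions.create(
    model="openpipe:my-openai-compatible-provider/my-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)
```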