You can proxy requests to arbitrary external models through OpenPipe. This is useful if you have a custom model deployed to Azure, or an external model hosted with another cloud provider.

Adding custom external models is not required to proxy requests to OpenAI or Anthropic models. See our docs on proxying to OpenAI or Anthropic for more information.

Proxying chat completions to a custom external model requires a few short steps:

  • Create an external model provider
  • Add a model to the external model provider
  • Update the model parameter in your chat completion request

Create an external model provider

Find the External Model Providers section of your project settings, and click the Add Provider button.

Give your custom provider a slug and an API key, and add a custom base URL if necessary. The slug should be unique; it will be used when we proxy requests to models associated with this provider.
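
For illustration, the provider used in the Azure example later on this page might be configured with placeholder values like these (use the base URL your provider's documentation gives for its OpenAI- or Azure-compatible endpoint):

Slug:     custom-azure-provider
API key:  <your provider's API key>
Base URL: <your provider's endpoint, e.g. https://my-resource.openai.azure.com>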

Add a model to the external model provider

To add a model to the provider you’re creating, click the Add model button.

Give the model a slug that matches the model you’d like to call on your external provider. For instance, to call gpt-4o-2024-08-06 on Azure, the slug should be gpt-4o-2024-08-06.

Setting input cost and output cost is optional, but can be helpful for showing relative costs on the evals page.
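
As a rough sketch of how these values translate into a per-request cost, assuming for illustration that costs are entered as a price per million tokens (the unit and prices below are example assumptions, not OpenPipe defaults):

# Hypothetical illustration of how input/output costs combine into a per-request cost.
input_cost_per_million = 2.50    # $ per 1M input tokens (example value)
output_cost_per_million = 10.00  # $ per 1M output tokens (example value)

prompt_tokens = 1_000
completion_tokens = 200

request_cost = (
    prompt_tokens * input_cost_per_million
    + completion_tokens * output_cost_per_million
) / 1_000_000
print(f"${request_cost:.4f}")  # $0.0045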

Update the model parameter in your chat completion request

Almost done! The last step is to set the model parameter in your requests to match this format: openpipe:<external-model-provider-slug>/<external-model-slug>.

For example, if you’re calling gpt-4o-2024-08-06 on Azure, the model parameter should be openpipe:custom-azure-provider/gpt-4o-2024-08-06.

from openpipe import OpenAI

# Find the config values in "Installing the SDK"
client = OpenAI()

completion = client.chat.completions.create(
    model="openpipe:custom-azure-provider/gpt-4o-2024-08-06",
    messages=[{"role": "system", "content": "count to 10"}],
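    # Optional OpenPipe-specific metadata; tags make these requests easy to filter later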
    openpipe={"tags": {"prompt_id": "counting", "any_key": "any_value"}},
)
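
Since the OpenPipe client mirrors the OpenAI Python SDK, the response is a standard chat completion object:

print(completion.choices[0].message.content)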

External models can also be used for filtering and relabeling your data. We currently support custom external models for providers with OpenAI- and Azure-compatible endpoints. If you’d like support for an external provider with a different API format, send a request to hello@openpipe.ai.