We’ve made fine-tuning via API available through unstable routes that are subject to change. For most users, we highly recommend fine-tuning through the Webapp, which provides the smoothest experience and the best performance. However, some users may prefer to fine-tune via the API for custom or automated use cases.

The following base models are supported for general access:

  • OpenPipe/Hermes-2-Theta-Llama-3-8B-32k
  • meta-llama/Meta-Llama-3-8B-Instruct
  • meta-llama/Meta-Llama-3-70B-Instruct
  • OpenPipe/mistral-ft-optimized-1227
  • mistralai/Mixtral-8x7B-Instruct-v0.1

Learn more about fine-tuning via the API on the route page. Please contact us at hello@openpipe.ai if you would like help getting set up.
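
For orientation, here is a minimal sketch of what starting a fine-tune over HTTP might look like. The endpoint path, request fields, and base URL below are assumptions for illustration only, since the actual routes are unstable and subject to change; consult the route page for the current schema.

```python
# Minimal sketch of kicking off a fine-tune via the API.
# NOTE: the endpoint path ("/fine-tunes"), the base URL, and the request
# fields below are assumptions for illustration only -- consult the route
# page for the actual unstable routes and their schemas.
import os
import requests

OPENPIPE_API_KEY = os.environ["OPENPIPE_API_KEY"]
BASE_URL = "https://api.openpipe.ai/api/v1"  # assumed base URL

response = requests.post(
    f"{BASE_URL}/fine-tunes",  # hypothetical route
    headers={"Authorization": f"Bearer {OPENPIPE_API_KEY}"},
    json={
        # One of the supported base models listed above.
        "baseModel": "meta-llama/Meta-Llama-3-8B-Instruct",
        # Hypothetical fields identifying your training data and model slug.
        "datasetId": "your-dataset-id",
        "slug": "my-fine-tuned-model",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```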