Supported AI Providers
Envoy AI Gateway is designed to provide a unified API for routing and managing LLM/AI traffic, and it supports various AI providers out of the box.
A "support of provider" means two things: the API schema support and the Authentication support.
The former can be configured in the AIServiceBackend resource's schema field, while the latter is configured in the BackendSecurityPolicy.
Below is a table of currently supported providers and their respective configurations.
| Provider Name | API Schema Config on AIServiceBackend | Upstream Authentication Config on BackendSecurityPolicy | Status | Note |
|---|---|---|---|---|
| OpenAI | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| AWS Bedrock | {"name":"AWSBedrock"} | AWS Bedrock Credentials | ✅ | |
| Azure OpenAI | {"name":"AzureOpenAI","version":"2025-01-01-preview"} | Azure Credentials | ✅ | |
| Google Gemini on AI Studio | {"name":"OpenAI","version":"v1beta/openai"} | API Key | ✅ | Only the OpenAI-compatible endpoint |
| Google Vertex AI | {"name":"GCPVertexAI"} | GCP Credentials | ✅ | |
| Anthropic on GCP Vertex AI | {"name":"GCPAnthropic","version":"vertex-2023-10-16"} | GCP Credentials | ✅ | Supports both the native Anthropic Messages endpoint and the OpenAI-compatible endpoint |
| Groq | {"name":"OpenAI","version":"openai/v1"} | API Key | ✅ | |
| Grok | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| Together AI | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| Cohere | {"name":"OpenAI","version":"compatibility/v1"} | API Key | ✅ | Only the OpenAI-compatible endpoint |
| Mistral | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| DeepInfra | {"name":"OpenAI","version":"v1/openai"} | API Key | ✅ | Only the OpenAI-compatible endpoint |
| DeepSeek | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| Hunyuan | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| Tencent LLM Knowledge Engine | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| SambaNova | {"name":"OpenAI","version":"v1"} | API Key | ✅ | |
| Self-hosted models | {"name":"OpenAI","version":"v1"} | N/A | ⚠️ | Depends on the API schema spoken by the self-hosted server; for example, vLLM speaks the OpenAI format. API key authentication can also be configured if needed. |
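
To make the two configuration points concrete, below is a minimal sketch of an AIServiceBackend and BackendSecurityPolicy pair for an OpenAI-style backend. The schema values come straight from the table above; the surrounding field names (backendRef, the API key secret reference) and all resource names are illustrative assumptions and may differ across gateway versions, so treat this as a shape rather than a copy-paste manifest.

```yaml
# Sketch only: the schema values match the table above, but the other
# field and resource names are illustrative assumptions.
apiVersion: aigateway.envoyproxy.io/v1alpha1
kind: AIServiceBackend
metadata:
  name: openai                 # hypothetical name
  namespace: default
spec:
  # API schema support: which provider dialect the upstream speaks.
  schema:
    name: OpenAI
    version: v1
  # Reference to the backend that points at the provider endpoint.
  backendRef:
    name: openai
    kind: Backend
    group: gateway.envoyproxy.io
---
apiVersion: aigateway.envoyproxy.io/v1alpha1
kind: BackendSecurityPolicy
metadata:
  name: openai-apikey          # hypothetical name
  namespace: default
spec:
  # Authentication support: attach an API key stored in a Kubernetes Secret.
  type: APIKey
  apiKey:
    secretRef:
      name: openai-apikey      # Secret holding the provider API key
```

The same shape applies to the other rows in the table: swap the schema name/version for the provider you are targeting, and replace the API key policy with the matching credential type (for example, AWS or GCP credentials).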