Azure OpenAI
Azure OpenAI Service provides REST API access to OpenAI's powerful language models including GPT-4, GPT-3.5-turbo, DALL-E, and Whisper. These models can be easily adapted to your specific task with enterprise-grade security, compliance, and regional availability. With Lamatic, you can seamlessly integrate with Azure OpenAI models and take advantage of features like observability, prompt management, fallbacks, and more.
Learn how to integrate Azure OpenAI with Lamatic to access powerful language models with enhanced observability and reliability features.
Azure OpenAI offers the same models as OpenAI but with additional benefits:
- Enterprise Security: Data residency, private networking, and compliance certifications
- Scalability: Managed infrastructure with guaranteed availability and performance
- Content Filtering: Built-in responsible AI content filtering and safety features
You can learn more about Azure OpenAI Service here (opens in a new tab).
1. Request Access to Azure OpenAI
Azure OpenAI requires approval for access. Fill out the Azure OpenAI Access Form (opens in a new tab) to request access to the service.
2. Create Azure OpenAI Resource
- Go to the Azure Portal (opens in a new tab)
- Click "Create a resource"
- Search for "Azure OpenAI"
- Click "Create"
- Fill in the required information:
- Subscription: Select your Azure subscription
- Resource Group: Create new or select existing
- Region: Choose a supported region
- Name: Enter a unique name for your resource
- Pricing Tier: Select Standard S0
- Click "Review + Create" then "Create"
3. Deploy a Model
- Navigate to Azure OpenAI Studio (opens in a new tab)
- Select your resource
- Go to "Deployments" in the left menu
- Click "Create new deployment"
- Select a model (e.g., GPT-4, GPT-3.5-turbo)
- Enter a Deployment Name (e.g., "gpt-4-chat")
- Configure token limits and settings
- Click "Create"
4. Get API Credentials
- In Azure OpenAI Studio, go to "Chat" playground
- Click "View Code"
- Copy the API Key and Endpoint
Alternatively, in the Azure Portal:
- Navigate to your Azure OpenAI resource
- Go to "Keys and Endpoint" in the left menu
- Copy Key 1 or Key 2 and the Endpoint
Required Information
Azure API Key
Found in the "Keys and Endpoint" section of your Azure OpenAI resource.
Azure Endpoint
The full endpoint URL for your Azure OpenAI resource:
https://[YourResourceName].openai.azure.com/
Deployment Name
The name you assigned when deploying the model (e.g., "gpt-4-chat", "gpt-35-turbo").
Azure API Version
The API version to use. Common versions:
2024-06-01
2024-02-15-preview
2023-12-01-preview
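To see how the API version fits into an actual request, here is a minimal sketch (not official Azure or Lamatic code; the helper name is hypothetical). Azure OpenAI routes chat requests through your deployment name in the URL path and takes the API version as a query-string parameter:

```python
def build_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment."""
    base = endpoint.rstrip("/")  # tolerate a trailing slash on the endpoint
    return (
        f"{base}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

url = build_chat_url(
    "https://my-openai-resource.openai.azure.com/",
    "gpt-4-chat",
    "2024-06-01",
)
# url == "https://my-openai-resource.openai.azure.com/openai/deployments/gpt-4-chat/chat/completions?api-version=2024-06-01"
```

Note that, unlike the OpenAI API, the model is identified by your deployment name in the path, not by a `model` field alone.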
Example Configuration
Azure API Key: your-azure-openai-api-key-here
Azure Endpoint: https://my-openai-resource.openai.azure.com/
Deployment Name: gpt-4-chat
Azure API Version: 2024-06-01
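As a quick sanity check of these values outside Lamatic, the sketch below assembles a chat-completions request with Python's standard library (the values are the illustrative ones above; replace them with your own). One Azure-specific detail: authentication uses an `api-key` header rather than the `Authorization: Bearer` header used by the OpenAI API.

```python
import json
import urllib.request

# Illustrative values from the example configuration above; replace with yours.
AZURE_API_KEY = "your-azure-openai-api-key-here"
AZURE_ENDPOINT = "https://my-openai-resource.openai.azure.com"
DEPLOYMENT_NAME = "gpt-4-chat"
API_VERSION = "2024-06-01"

url = (
    f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT_NAME}"
    f"/chat/completions?api-version={API_VERSION}"
)

# Azure OpenAI authenticates with an "api-key" header, not a Bearer token.
request = urllib.request.Request(
    url,
    data=json.dumps({
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 50,
    }).encode("utf-8"),
    headers={"Content-Type": "application/json", "api-key": AZURE_API_KEY},
    method="POST",
)

# Sending the request requires valid credentials:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["message"]["content"])
```

If the request returns 401, re-check the key; a 404 usually means the deployment name or API version does not match what you created in Azure OpenAI Studio.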
Available Models
Once you provide these details, the foundation models available to your deployment are populated automatically.
For more information, refer to the official Azure OpenAI guide (opens in a new tab).
Follow these general steps in Lamatic.ai:
- Open your Lamatic.ai Studio (opens in a new tab)
- Navigate to Models section
- Select the azure-openai provider
- Provide the following credentials:
- Azure API Key
- Azure Endpoint
- Deployment Name
- Azure API Version
- Save your changes
Model availability varies by region. Check the Azure OpenAI models documentation (opens in a new tab) for region-specific availability.
Important Notes
- Keep your API keys secure and never share them publicly
- Azure OpenAI requires access approval - apply early
- Different regions have different model availability
- Monitor your usage to avoid unexpected charges
- Content filtering is enabled by default but can be customized
- Test your integration after adding each key
- Consider using managed identities for enhanced security
- Review Azure OpenAI pricing (opens in a new tab) before deployment