Microsoft Azure OpenAI Service is a fully managed service offering from Microsoft that enables you to host your own instances of AI models from OpenAI, such as GPT-4 or GPT-35-turbo. Hosting your own instance of the OpenAI models offers security benefits, in addition to the enhanced feature set Microsoft has built into Azure around the entire Azure OpenAI Service.
Let’s take a look at the Azure Bicep code necessary to provision and manage an Azure OpenAI Service instance, as well as to configure OpenAI model deployments, with GPT-4 and GPT-35-turbo examples below.
Create Azure OpenAI Service using Azure Bicep
The Azure ARM Microsoft.CognitiveServices/accounts resource type is used to provision an Azure OpenAI Service account / instance. Then when configuring the service, the kind property needs to be set to OpenAI to tell Microsoft Azure that this “cognitive services” account will be an Azure OpenAI Service.
The following Azure Bicep code can be used to create / manage an Azure OpenAI Service:
resource azureopenai 'Microsoft.CognitiveServices/accounts@2023-10-01-preview' = {
  name: 'b59-eus2-aichatui-oai'
  location: 'eastus2'
  // 'OpenAI' tells Azure this cognitive services account is an Azure OpenAI Service
  kind: 'OpenAI'
  sku: {
    name: 'S0'
  }
  properties: {
    networkAcls: {
      defaultAction: 'Allow'
      virtualNetworkRules: []
      ipRules: []
    }
    publicNetworkAccess: 'Enabled'
  }
}
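Once the account resource is defined, its endpoint URI can be surfaced as a template output so client applications can consume it after deployment. A minimal sketch (the output name here is illustrative):

```bicep
// The Azure OpenAI endpoint URI (e.g. https://<account-name>.openai.azure.com/)
output azureOpenAIEndpoint string = azureopenai.properties.endpoint
```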
The reason the ARM Microsoft.CognitiveServices/accounts resource type is used is because the Azure OpenAI Service has grown out of what was once called Azure Cognitive Services, so that's the backend Azure resource management type. Since there can be multiple kinds of cognitive services accounts in ARM, the kind property needs to be set to OpenAI for the Azure OpenAI Service.
Another example of this can also be found as part of the Build5Nines/AIChatUI project, which provides a simple UI written as a Node.js web app to show how to build your own application that integrates Azure OpenAI Service with GPT-3.5 or GPT-4 for your own use.
Create GPT-35-Turbo Deployment in Azure OpenAI Service using Azure Bicep
A Deployment within Azure OpenAI Service is a deployment of an AI model, such as GPT-35-turbo. The following Azure Bicep code can be used to create / manage a GPT-35-turbo Deployment within Azure OpenAI Service:
resource azureopenaideployment 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  name: 'b59-gpt-35-turbo'
  // Child resource of the Azure OpenAI Service account defined above
  parent: azureopenai
  sku: {
    name: 'Standard'
    capacity: 120
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: 'gpt-35-turbo'
      version: '0613'
    }
    raiPolicyName: 'Microsoft.Default'
    versionUpgradeOption: 'OnceCurrentVersionExpired'
    scaleSettings: {
      capacity: 120
    }
  }
}
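The capacity value on a Standard deployment is expressed in units of 1,000 tokens per minute, so 120 here means 120K TPM of throughput. If you want that value adjustable per environment, it can be pulled out as a Bicep parameter — a sketch, with a hypothetical parameter name:

```bicep
@description('Deployment throughput in units of 1K tokens-per-minute (120 = 120K TPM)')
@minValue(1)
param gpt35Capacity int = 120
```

You would then reference gpt35Capacity in both the sku and scaleSettings capacity fields of the deployment resource.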
Create GPT-4 Deployment in Azure OpenAI Service using Azure Bicep
A Deployment within Azure OpenAI Service is a deployment of an AI model, such as GPT-4. Similar to the above example, Azure Bicep can be used to create a GPT-4 Deployment within Azure OpenAI Service too.
The following Azure Bicep code can be used to create / manage a GPT-4 Deployment within Azure OpenAI Service:
// Note: this uses a distinct symbolic name so it can live in the same
// Bicep file as the GPT-35-turbo deployment above without a name conflict
resource azureopenaideploymentgpt4 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  name: 'b59-gpt4'
  parent: azureopenai
  sku: {
    name: 'Standard'
    capacity: 10
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: 'gpt-4'
      version: '1106-Preview'
    }
    raiPolicyName: 'Microsoft.Default'
    versionUpgradeOption: 'OnceCurrentVersionExpired'
    scaleSettings: {
      capacity: 10
    }
  }
}
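If you want to manage several model deployments from a single template, a Bicep for-loop can iterate over a list of model definitions instead of repeating the resource block. A minimal sketch, assuming the azureopenai account resource above (variable and symbolic names here are illustrative); the @batchSize(1) decorator forces sequential creation, since an Azure OpenAI account only allows one deployment operation at a time:

```bicep
var modelDeployments = [
  { name: 'b59-gpt-35-turbo', model: 'gpt-35-turbo', version: '0613', capacity: 120 }
  { name: 'b59-gpt4', model: 'gpt-4', version: '1106-Preview', capacity: 10 }
]

@batchSize(1) // create deployments one at a time, not concurrently
resource openaideployments 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = [for d in modelDeployments: {
  name: d.name
  parent: azureopenai
  sku: {
    name: 'Standard'
    capacity: d.capacity
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: d.model
      version: d.version
    }
  }
}]
```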
Happy chatting with your GPT-based LLM!
Original Article Source: Azure Bicep: Create Azure OpenAI Service and GPT-4 / GPT-35-turbo Model Deployment written by Chris Pietschmann (If you're reading this somewhere other than Build5Nines.com, it was republished without permission.)
