Microsoft Azure OpenAI Service is a fully managed offering from Microsoft that enables you to host your own instances of AI models from OpenAI, such as GPT-4 or GPT-35-turbo. Hosting your own instance of the OpenAI models brings security benefits, along with the enhanced feature set Microsoft has built into Azure around the service.
Let’s take a look at the HashiCorp Terraform code necessary to provision and manage Azure OpenAI Service, as well as to configure OpenAI model deployments, with GPT-4 and GPT-35-turbo examples below.
Create Azure OpenAI Service using Terraform
The Terraform azurerm resource type used to provision Azure OpenAI Service is the azurerm_cognitive_account resource. When configuring it, the kind property needs to be set to OpenAI to tell Microsoft Azure that this account will be an instance of Azure OpenAI Service.
The following Terraform code can be used to create and manage an Azure OpenAI Service instance:
# Terraform version requirements with AzureRM provider
terraform {
  required_providers {
    # AzureRM provider 3.x
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

locals {
  resource_prefix     = "b59-eus2-aichatui"
  resource_group_name = "b59-eus2-aichatui-rg"
  location            = "eastus2"

  # Name for the model deployment referenced by the deployment examples below
  openai_deployment_name = "gpt-35-turbo"
}

resource "azurerm_cognitive_account" "azureopenai" {
  name                = "${local.resource_prefix}-oai"
  resource_group_name = local.resource_group_name
  location            = local.location
  kind                = "OpenAI"
  sku_name            = "S0"

  tags = {
    project = "https://github.com/build5nines/AIChatUI"
  }
}
The azurerm_cognitive_account resource type is used because the Azure OpenAI Service grew out of what was once called Azure Cognitive Services, so the backend Azure ARM type is still Microsoft.CognitiveServices/accounts. There can be multiple kinds of cognitive services accounts in ARM, so for the Azure OpenAI Service the kind property needs to be set to OpenAI.
Another example of this can be found in the Build5Nines/AIChatUI project, which provides a simple UI written as a Node.js web app showing how to build your own application that integrates Azure OpenAI Service with GPT-3.5 or GPT-4.
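Once the account is provisioned, client applications will need the service endpoint and an access key to call the API. These can be surfaced as Terraform outputs; here's a minimal sketch (the output names are illustrative) using the endpoint and primary_access_key attributes exported by the azurerm_cognitive_account resource:

```hcl
# Expose connection details for the Azure OpenAI Service instance
output "openai_endpoint" {
  value = azurerm_cognitive_account.azureopenai.endpoint
}

output "openai_primary_key" {
  value     = azurerm_cognitive_account.azureopenai.primary_access_key
  sensitive = true # access keys are secrets; mark the output as sensitive
}
```

Marking the key output as sensitive keeps it out of plain-text CLI output, though it is still stored in the Terraform state file.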
Create GPT-35-Turbo Deployment in Azure OpenAI Service using Terraform
A Deployment within Azure OpenAI Service is a deployment of an AI model, such as GPT-35-turbo. Recent versions of the Terraform azurerm provider support this natively through the azurerm_cognitive_deployment resource (https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/cognitive_deployment); alternatively, the azapi provider can be used to call the underlying ARM API directly, as shown below.
Here’s the Terraform provider definition from above with the azapi provider included:
# Terraform version requirements with AzureRM and AzAPI providers
terraform {
  required_providers {
    # AzureRM provider 3.x
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
    azapi = {
      source = "azure/azapi"
    }
  }
}

provider "azurerm" {
  features {}
}

provider "azapi" {
}
The following Terraform code can be used to create and manage a GPT-35-turbo Deployment within Azure OpenAI Service:
resource "azapi_resource" "azureopenaideployment" {
  type      = "Microsoft.CognitiveServices/accounts/deployments@2023-05-01"
  name      = local.openai_deployment_name
  parent_id = azurerm_cognitive_account.azureopenai.id

  body = jsonencode({
    properties = {
      model = {
        format  = "OpenAI"
        name    = "gpt-35-turbo"
        version = "0613"
      }
      versionUpgradeOption = "OnceCurrentVersionExpired"
      raiPolicyName        = "Microsoft.Default"
    }
    sku = {
      # Capacity is measured in units of 1,000 tokens-per-minute (TPM)
      capacity = 120
      name     = "Standard"
    }
  })
}
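As an alternative to azapi, newer releases of the azurerm provider include a first-class azurerm_cognitive_deployment resource. A rough equivalent of the deployment above might look like the following sketch (the nested block name varies between provider versions, scale in azurerm 3.x versus sku in later releases, so check the docs for the version you pin):

```hcl
resource "azurerm_cognitive_deployment" "gpt35" {
  name                 = "gpt-35-turbo"
  cognitive_account_id = azurerm_cognitive_account.azureopenai.id

  model {
    format  = "OpenAI"
    name    = "gpt-35-turbo"
    version = "0613"
  }

  # Named "scale" in azurerm 3.x; renamed to "sku" in later provider versions
  scale {
    type     = "Standard"
    capacity = 120
  }
}
```

Using the native resource avoids hand-writing the ARM request body, at the cost of tracking the provider's release notes for schema changes.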
Create GPT-4 Deployment in Azure OpenAI Service using Terraform
A Deployment within Azure OpenAI Service is a deployment of an AI model, such as GPT-4. Similar to the above example, the azapi Terraform provider can be used to create a GPT-4 Deployment within Azure OpenAI Service.
The following Terraform code can be used to create / manage a GPT-4 Deployment within Azure OpenAI Service:
resource "azapi_resource" "azureopenaideployment" {
  # If deploying this alongside the GPT-35-turbo example above, give this
  # resource (and its deployment) a distinct name to avoid an address collision
  type      = "Microsoft.CognitiveServices/accounts/deployments@2023-05-01"
  name      = local.openai_deployment_name
  parent_id = azurerm_cognitive_account.azureopenai.id

  body = jsonencode({
    properties = {
      model = {
        format  = "OpenAI"
        name    = "gpt-4"
        version = "1106-Preview"
      }
      versionUpgradeOption = "OnceCurrentVersionExpired"
      raiPolicyName        = "Microsoft.Default"
    }
    sku = {
      # Capacity is measured in units of 1,000 tokens-per-minute (TPM)
      capacity = 10
      name     = "Standard"
    }
  })
}
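Client SDKs for Azure OpenAI address a model by its deployment name rather than the underlying model name, so it can be handy to surface that value for wiring into application configuration. A small sketch (the output name is illustrative):

```hcl
# The deployment name is what client applications pass as the deployment /
# "model" parameter when calling the Azure OpenAI API
output "openai_deployment_name" {
  value = azapi_resource.azureopenaideployment.name
}
```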
Happy chatting with your GPT-based LLM!
Original Article Source: Terraform: Create Azure OpenAI Service and GPT-4 / GPT-35-turbo Model Deployment written by Chris Pietschmann (If you're reading this somewhere other than Build5Nines.com, it was republished without permission.)