Wednesday, 17 September 2025

Performing Terraform Import via Azure Pipelines for Existing Azure Resources

Importing state into Terraform may be required when a resource was created manually and now needs to be managed by Terraform. It can also arise when you move code from one repo to another for reorganization and need to map existing Azure resources to the Terraform code in the new repo. The import can be performed manually with the terraform import command. However, performing such a task manually against production environments is not ideal, and practically impossible in automated deployment implementations. In this post, let's discuss an Azure pipeline task that can be used to perform state imports in a rerunnable way.

Such a task to update Terraform state in an Azure pipeline should be placed between the terraform init and terraform plan tasks as shown below.
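As a sketch, the ordering in a YAML pipeline could look like the following (the Terraform task version and the service connection name are illustrative assumptions, based on the common Terraform extension task):

```yaml
steps:
  - task: TerraformTaskV4@4            # terraform init
    displayName: 'Terraform Init'
    inputs:
      command: 'init'

  - task: AzureCLI@2                   # state import script runs between init and plan
    displayName: 'Terraform State Import'
    inputs:
      azureSubscription: 'ch-demo-service-connection'   # assumed service connection name
      scriptType: 'ps'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # inline state import script goes here

  - task: TerraformTaskV4@4            # terraform plan
    displayName: 'Terraform Plan'
    inputs:
      command: 'plan'
```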


We can use an Azure CLI task and write a PowerShell script to achieve the Terraform state import.

The task can be set up with the settings shown below and an inline script. If using Linux build agents, use script type pscore instead of ps.
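A minimal sketch of the Azure CLI task definition (the service connection name and working directory are assumptions):

```yaml
- task: AzureCLI@2
  displayName: 'Terraform State Import'
  inputs:
    azureSubscription: 'ch-demo-service-connection'   # assumed service connection name
    scriptType: 'ps'              # use 'pscore' on Linux build agents
    scriptLocation: 'inlineScript'
    workingDirectory: '$(System.DefaultWorkingDirectory)/terraform'   # assumed path to .tf files
    inlineScript: |
      # inline PowerShell script shown below goes here
```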


The inline script below provides the base functionality. The existence of each Azure resource is verified before importing, and a feature to skip the existence check before the state import is also useful for certain resource types. If a resource is already present in the Terraform state, it will not be imported again. This makes the task rerunnable in Azure pipelines.

$subscriptionId = '$(subscriptionid)';
$varFile = 'env.tfvars';            
$envName = '$(envname)';
$env = '$(env)';
$rgName = 'ch-demo-$(envname)-rg';            

$allTfResources = terraform state list;

Write-Host ('All resources in tf state:');
Write-Host '---------------------------------------------------';
Write-Host $allTfResources;
Write-Host '---------------------------------------------------';

function Test-AzResourceExists 
{
    param (
        [string]$resourceId
    )

    # az CLI reports failures via its exit code rather than PowerShell exceptions,
    # so check $LASTEXITCODE instead of relying on try/catch.
    az resource show --ids $resourceId --output none 2>$null;
    if ($LASTEXITCODE -eq 0) {
        Write-Host (-join('Resource exists: ',$resourceId));
        return $true
    }
    else {
        Write-Host (-join('Resource not found: ',$resourceId)) -ForegroundColor Yellow;
        # reset the exit code so the pipeline task does not fail on a missing resource
        $global:LASTEXITCODE = 0
        return $false
    }
}

function Import-State {
    param (
        [string]$resourceName,
        [string]$resourceId,
        [string]$skipResourceCheck = 'False'
    )

    if ($allTfResources -contains $resourceName) 
    {
        Write-Host (-join($resourceName, ' already exists in terraform state. Skipping import.'));
    }
    else 
    {
        if ((Test-AzResourceExists -resourceId $resourceId) -or ($skipResourceCheck -eq 'True'))
        {
            terraform import -var-file $varFile $resourceName $resourceId;
        }
    }
    Write-Host '---------------------------------------------------';
}


Then, in the same inline script, below the functions shown above, you can call the state import as shown below. The resource IDs can easily be copied from previous terraform plan execution logs by searching for the resource name.
Import-State -resourceName "azurerm_resource_group.instancerg" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName";
Import-State -resourceName "azurerm_log_analytics_workspace.log_analytics_workspace" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.OperationalInsights/workspaces/ch-demo-$envName-log";
Import-State -resourceName "azurerm_monitor_workspace.instance_amw" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Monitor/accounts/ch-demo-$envName-amw";
Import-State -resourceName "azurerm_monitor_data_collection_endpoint.dce" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Insights/dataCollectionEndpoints/ch-demo-$envName-dce";
Import-State -resourceName "azurerm_monitor_data_collection_rule.prometheus_dcr" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Insights/dataCollectionRules/ch-demo-$envName-prometheus-dcr";
Import-State -resourceName "azurerm_monitor_data_collection_rule.ci_dcr" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Insights/dataCollectionRules/ch-demo-$envName-ci-dcr";
Import-State -resourceName "azurerm_user_assigned_identity.aks" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.ManagedIdentity/userAssignedIdentities/ch-demo-$envName-aks-uai";
Import-State -resourceName "azurerm_key_vault.instancekeyvault" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.KeyVault/vaults/ch-demo-$envName-kv";
Import-State -resourceName "azurerm_network_security_group.nsg" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Network/networkSecurityGroups/ch-demo-$envName-nsg";

Some resources, such as the diagnostic setting below, cannot be retrieved via the Azure CLI for an existence check even though they have a resource ID in Terraform. In such cases we can pass -skipResourceCheck "True" to skip the check.
Import-State -resourceName "azurerm_monitor_diagnostic_setting.diag_nsg" -skipResourceCheck "True" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Network/networkSecurityGroups/ch-demo-$envName-nsg|ch-demo-$envName-diag-nsg";

Some resources, such as role assignments, have a dynamic GUID in each environment (dev, QA, production). In this case, use Azure pipeline variable groups and set up a variable for each environment so that unique values can be supplied. Such a usage case is shown below.
Import-State -resourceName "azurerm_role_assignment.datareaderrole" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Monitor/accounts/ch-demo-$envName-amw/providers/Microsoft.Authorization/roleAssignments/$(ztemp_sh_amw_datareader_roleid)";
Import-State -resourceName "azurerm_role_assignment.appconf_dataowner" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.AppConfiguration/configurationStores/ch-demo-$envName-appconfig-ac/providers/Microsoft.Authorization/roleAssignments/$(ztemp_sh_appconf_dataowner_roleid)";
Import-State -resourceName "azurerm_role_assignment.appconf_datareader_aks" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.AppConfiguration/configurationStores/ch-demo-$envName-appconfig-ac/providers/Microsoft.Authorization/roleAssignments/$(ztemp_sh_appconf_datareader_aks_roleid)";
Import-State -resourceName "azurerm_role_assignment.ops" -resourceId "/subscriptions/$subscriptionId/resourceGroups/$rgName/providers/Microsoft.Authorization/roleAssignments/$(ztemp_sh_rg_ops_roleid)";
Import-State -resourceName "azurerm_role_assignment.aks_dns_zone" -resourceId "/subscriptions/$subscriptionId/resourceGroups/ch-core-$env-network-rg/providers/Microsoft.Network/privateDnsZones/ch-demo-$envName.net/providers/Microsoft.Authorization/roleAssignments/$(ztemp_sh_aks_dns_zone_roleid)";

Variables can be defined in variable groups as shown below.
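As a sketch, each environment can have its own variable group holding the role assignment GUIDs (the group name and GUID values below are illustrative placeholders):

```yaml
variables:
  # one group per environment, selected by the stage or pipeline
  - group: 'ch-demo-dev-variables'    # assumed group name; create one per environment

# Variable group contents (Library > Variable groups), e.g. for dev:
#   ztemp_sh_amw_datareader_roleid:    11111111-1111-1111-1111-111111111111
#   ztemp_sh_appconf_dataowner_roleid: 22222222-2222-2222-2222-222222222222
#   ztemp_sh_rg_ops_roleid:            33333333-3333-3333-3333-333333333333
```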


With the above setup, Terraform states will be imported via the Azure pipeline task as shown below.



