Securely Upload (Backup) Files With PowerShell and Task Scheduler To An Azure Container
Problem
You have files that you regularly need to upload to an Azure Storage Account
New files are being generated regularly, thus they are probably backup files
You would like to automate this process with PowerShell
You are too cheap to pay for that Veeam Azure appliance to use the built-in tools in your backup solution
You cleverly don’t want to keep any passwords in plain text in your script for accessing the Storage Account
Solution
Below you will find a PowerShell script that you can configure to run as a scheduled task on your Windows Server
But there is some configuration you need to do in Azure before getting everything set up, which I will walk you through
Namely you need an App Registration, which will create a Service Principal who will have “Storage Blob Data Contributor” IAM/RBAC rights to the target container
This App Registration will be authorized for the container via a certificate, which can be self-signed and be installed on your server; no purchases required
And this prevents the not-so-great practice of leaving plaintext SAS keys or Client Secrets in your PowerShell scripts
I will also provide separate scripts for the creation of the certificate and the creation of the scheduled task
Step-By-Step
First you need to create your App Registration in Microsoft Entra ID
Just provide a name for the App Registration and leave everything else default
No special Tenant stuff and no redirect URLs
But do copy the name for future pasting, as App Registrations do not appear in the Role Assignment search results like normal users do; you have to type the name in fully
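If you prefer scripting this step over clicking through the portal, a minimal sketch with the Az module might look like the following; the display name is just an example, and you would run this from an account allowed to create App Registrations:

```powershell
#Requires -Modules Az.Resources
#Create the App Registration (defaults are fine: no special tenant settings, no redirect URIs)
$app = New-AzADApplication -DisplayName "BackupUploaderApp"
#Create the matching Service Principal so the app can be granted IAM/RBAC roles
$sp = New-AzADServicePrincipal -ApplicationId $app.AppId
#Note the Application (client) ID; the backup script needs it later
$app.AppId
```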
Now navigate over to your Storage Account and then find your container that you would like to use for this.
If you are creating all of this from scratch, just know that there are no special requirements for the Storage Account or container
I used the defaults when testing all this, including importantly leaving soft-delete ON
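If you are building the Storage Account with PowerShell rather than the portal, blob soft delete can be switched on with a single cmdlet; the resource group and account names below are placeholders:

```powershell
#Requires -Modules Az.Storage
#Enable blob soft delete with a 7-day retention window; adjust names and retention to taste
Enable-AzStorageBlobDeleteRetentionPolicy -ResourceGroupName "<resourceGroup>" `
    -StorageAccountName "<storageAccount>" -RetentionDays 7
```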
Open your container and navigate to Access Control (IAM) > Add
The App Registration needs “Storage Blob Data Contributor” rights for the below PowerShell script to work
Note that I also have “Storage Blob Delegator” rights assigned to this App Registration as well
Further the same Delegator rights are assigned to the App Registration at the Storage Account level
This is the secret sauce if you want to automate the creation of Shared Access Signatures via an App Registration; it took me a while to figure out that these rights needed to be assigned at the Storage Account level as well
When you are adding the Role Assignment, no conditions are needed but you will have to explicitly type in the name of the App Registration when adding it as a Member; no helpful auto-fill like with real user accounts
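If you would rather script these role assignments, a sketch with the Az module might look like this; every GUID, scope segment, and name below is a placeholder you would swap for your own:

```powershell
#Requires -Modules Az.Resources
$appId          = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   #Application (client) ID of the App Registration
$containerScope = "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Storage" +
                  "/storageAccounts/<account>/blobServices/default/containers/<container>"
$accountScope   = "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

#"Storage Blob Data Contributor" on the container lets the script upload and delete blobs
New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Storage Blob Data Contributor" -Scope $containerScope
#"Storage Blob Delegator" at BOTH the container and the Storage Account level is what
#enables automated user-delegation SAS creation, as described above
New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Storage Blob Delegator" -Scope $containerScope
New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Storage Blob Delegator" -Scope $accountScope
```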
The next step is to generate (or buy) a certificate and install it on your Windows device, which is probably a server
Again, self-signed certs work fine here, but if you want to automate multiple devices, purchasing a cheap one might not be a bad idea, or you could do something fancy with Let’s Encrypt
Note that I am opting for a User cert rather than a system one, as this was for testing, but if you don’t stay logged into your server, then a “Local Computer” cert under Personal would be required
This also depends on how you want your scheduled task to run: under a user’s context or the SYSTEM account; the latter is not best practice, as SYSTEM has too many rights on the local system
Microsoft wants you to do this, I believe, or maybe this
The PowerShell scripts below do not require admin rights, so I stuck with a local service account user context for this project
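To double-check which store a certificate actually landed in (and whether the account running the scheduled task can reach it), a quick read-only look at both stores helps:

```powershell
#Certificates created by the script below land in the current user's "Personal" (My) store
Get-ChildItem -Path "Cert:\CurrentUser\My" | Select-Object Subject, Thumbprint, NotAfter
#A "Local Computer" cert would show up here instead (needed if nobody stays logged in)
Get-ChildItem -Path "Cert:\LocalMachine\My" | Select-Object Subject, Thumbprint, NotAfter
```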
<#
.SYNOPSIS
Creates, installs, and exports a self-signed certificate for use as authorization method for automated PowerShell scripts using App Registrations / Service Principals
Note that the certificate will not be exportable, as no .pfx file will be generated; the cert is intended for local installs on workstations and servers
that call PowerShell scripts as Scheduled Tasks to access Azure resources, particularly Storage Accounts
.EXAMPLE
Create-CertificateForAppRegistration -nameOfCertificate "testCert3" -validityLengthInMonths 24 -folderPath "C:\Scripts"
.INPUTS
[String]$nameOfCertificate - The name of the certificate (aka Subject)
[Int]$validityLengthInMonths - How long the certificate will be valid, in months
[String]$folderPath - The path where certificate will be exported to as a .cer file
.OUTPUTS
A .cer certificate file; the certificate is also automatically installed in the current user's "Personal" certificate store
#>
Function Create-CertificateForAppRegistration
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory = $true)]
[String]$nameOfCertificate,
[Parameter(Mandatory = $true)]
[Int]$validityLengthInMonths,
[Parameter(Mandatory = $true)]
[String]$folderPath
)
$certificate = New-SelfSignedCertificate -KeyExportPolicy NonExportable -Subject "CN=$nameOfCertificate" `
-CertStoreLocation "Cert:\CurrentUser\My" `
-NotAfter (Get-Date).AddMonths($validityLengthInMonths) -KeySpec "Signature"
#New-SelfSignedCertificate already installs the certificate in the current user's "Personal" (My) store
#Export the public portion as a .cer file for upload to the App Registration
$filePath = Join-Path $folderPath "$nameOfCertificate.cer"
Export-Certificate -Cert $certificate -FilePath $filePath
}
Create-CertificateForAppRegistration -nameOfCertificate "testCert3" -validityLengthInMonths 24 -folderPath "C:\Scripts"
Now that you have your .cer certificate file, it needs to be uploaded to the App Registration under Certificates & secrets
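The portal upload works fine, but for the record, the same .cer file can be attached to the App Registration with the Az module; the file path and Application ID below are placeholders:

```powershell
#Requires -Modules Az.Resources
#Load the exported public certificate and base64-encode it for upload
$cer = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Scripts\testCert3.cer")
$certValue = [System.Convert]::ToBase64String($cer.GetRawData())
#Attach it as a client certificate on the App Registration (placeholder Application ID)
New-AzADAppCredential -ApplicationId "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" `
    -CertValue $certValue -StartDate $cer.NotBefore -EndDate $cer.NotAfter
```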
Now you are ready to test the big boy PowerShell script that does the actual uploading to the container
I won’t go into too much detail here as I tried to comment the script as well as I could, but just note that it uses the Set-AzStorageBlobContent and Remove-AzStorageBlob cmdlets to upload and delete (a soft delete, depending on your Storage Account settings)
It also can clean up (aka delete) older copies of your files (probably backups right?), but those bits are commented out for safety
I like error catching 🙂
Here you go as either a TXT file or the raw stuff:
Below it you will find the script to create the scheduled task
<#
.SYNOPSIS
Creates a certificate based connection to an Azure Storage Account container using an authorized App Registration as a Service Principal, then uploads any files (usually backup files)
newer than an inputted number of hours to the container while deleting older backup files both locally and from the container that are older than an inputted number of days
.EXAMPLE
Backup-ToAzureBlobContainer -certificateThumbprint "346a3702c8488c49eb14858cf8be8414002cc5dc" -tenantID "074437f3-ab5a-497a-a903-d5cee636e412" `
-applicationId "781e2021-c070-497b-8442-094f796ab4d0" -subscriptionID "b85263b5-ff7b-4b5a-ab91-16da2cee1863" -storageAccountName "somestorageaccount" `
-backupPath "C:\Backups" -containerName "somecontainer" -storageTier "Cool" -ageOfBackupFileInHours 6 -ageOfToBeDeletedBackupsInDays 30
.INPUTS
[String]$certificateThumbprint - The thumbprint of the certificate which has been uploaded as a client certificate to the App Registration / Service Principal. It can be a self-signed cert
hosted on the backup server or a purchased signed cert
[String]$tenantID - The ID of the tenant which can be found in the App Registration's overview page in Azure
[String]$applicationId - The ID of the App Registration, again found in the App Registration's overview page in Azure
[String]$subscriptionID - The ID of the Subscription of the Storage Account
[String]$storageAccountName - The name of the Storage Account
[String]$backupPath - The file path to where the backup files are stored
[String]$loggingPath - The file path (including file name) where the transcript log will be written
[Int]$ageOfBackupFileInHours - NEGATIVE INTEGER: How old in hours an upload candidate may be; it is used to upload only the most recent backup file(s)
[Int]$ageOfToBeDeletedBackupsInDays - NEGATIVE INTEGER: Backup files older than this many days are deleted both from local storage and from the Azure Storage Account container
[String]$containerName - The name of the container in the Azure Storage account (aka the destination for the uploaded files)
The App Registration / Service Principal needs to be assigned "Storage Blob Data Contributor" to this container
[String]$storageTier - In what storage tier should the files be saved (Hot, Cool, or Archive)
.OUTPUTS
None
#>
Function Backup-ToAzureBlobContainer
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory = $true)]
[String]$certificateThumbprint,
[Parameter(Mandatory = $true)]
[String]$tenantID,
[Parameter(Mandatory = $true)]
[String]$applicationId,
[Parameter(Mandatory = $false)]
[String]$subscriptionID,
[Parameter(Mandatory = $true)]
[String]$storageAccountName,
[Parameter(Mandatory = $true)]
[String]$backupPath,
[Parameter(Mandatory = $true)]
[String]$loggingPath,
[Parameter(Mandatory = $true)]
[Int]$ageOfBackupFileInHours,
[Parameter(Mandatory = $false)]
[Int]$ageOfToBeDeletedBackupsInDays,
[Parameter(Mandatory = $true)]
[String]$containerName,
[Parameter(Mandatory = $true)]
[ValidateSet('Hot','Cool','Archive')]
[String]$storageTier
)
#Log the uploads for follow-up and troubleshooting
Start-Transcript -Path $loggingPath -Append
Write-Host "Script has started at $(Get-Date)"
Import-Module Az.Storage
#Connect to Azure using the App Registration / Service Principal and locally installed certificate
Try
{
Connect-AzAccount -CertificateThumbprint $certificateThumbprint -ApplicationId $applicationId -Tenant $tenantID -ServicePrincipal -Subscription $subscriptionID -Verbose
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Check that the certificate with thumbprint $certificateThumbprint is installed properly in the local certificate store"
Write-Host "And that the certificate is registered as a client certificate in Azure on the service principal with the id of $applicationId"
}
Try
{
#Create a storage context using the "Storage Blob Data Contributor" role assigned to the Storage Account container
$context = New-AzStorageContext -StorageAccountName $storageAccountName -UseConnectedAccount -Verbose
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Unable to create a storage context to the Azure Storage account of $storageAccountName"
Write-Host "Check that the service principal with ID of $applicationId has Storage Blob Data Contributor rights to the Container being accessed"
}
Try
{
#Get the locally or network stored backup files
#A Where-Object clause could be added here depending on how many backup files there are
# | Where-Object {$_.LastWriteTime -le (Get-Date).AddDays(-30)} for instance
$backupFiles = Get-ChildItem -Path $backupPath -Verbose
If($null -eq $backupFiles)
{
Throw "No backup files were found in $backupPath"
}
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Unable to access backup files; is the file path of $backupPath correct?"
Write-Host "Or is the folder empty?"
Write-Host "Or does the script's account have access to the folder $backupPath; whether permissions or network problems?"
}
#Loop through the backup files, look for files older than the inputted number of days, then remove them
Foreach($backupFile in $backupFiles)
{
If(($ageOfToBeDeletedBackupsInDays -lt 0) -and (Test-Path $backupFile.FullName -OlderThan (Get-Date).AddDays($ageOfToBeDeletedBackupsInDays)))
{
Try
{
#COMMENTED OUT FOR SAFETY ↓
#Remove-Item -Path $backupFile.fullName -Force -Verbose
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Unable to delete backup file $($backupFile.FullName) which is older than the inputted number of days"
Write-Host "Does the script or account running the script have the necessary rights to delete the backup file?"
}
}
#If the file is newer than an inputted negative number of hours, upload it to the Storage Account container
ElseIf(Test-Path $backupFile.fullName -NewerThan (Get-Date).AddHours($ageOfBackupFileInHours))
{
Try
{
Set-AzStorageBlobContent -File $backupFile.FullName -Container $containerName -Blob $backupFile.Name -Context $context -StandardBlobTier $storageTier -Verbose
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Unable to upload the backup file $($backupFile.FullName) to the Azure storage account container: $containerName"
}
}
}
Try
{
#Get the backup files already uploaded to the Storage Account container and delete them if they are older than an inputted negative number of days
$cloudBackupFiles = Get-AzStorageBlob -Container $containerName -Blob * -Context $context
If($null -eq $cloudBackupFiles)
{
Throw "No backup files were found in the container $containerName"
}
Foreach($cloudBackupFile in $cloudBackupFiles)
{
If($cloudBackupFile.LastModified.DateTime -lt (Get-Date).AddDays($ageOfToBeDeletedBackupsInDays))
{
#COMMENTED OUT FOR SAFETY ↓
#Remove-AzStorageBlob -Container $containerName -Blob $cloudBackupFile.Name -Context $context -Verbose
}
}
}
Catch
{
Write-Host "Encountered Error:"$_.Exception.Message
Write-Host "Unable to access the backup files in the container; is the container name of $containerName correct?"
Write-Host "Check that the service principal with ID of $applicationId has Storage Blob Data Contributor rights to the Container being accessed"
}
#Stop logging
Write-Host "Script has finished at $(Get-Date)"
Stop-Transcript
}
Backup-ToAzureBlobContainer -certificateThumbprint "xxxxxxxc8488c49eb14858cf8be8414002cc5dc" -tenantID "xxxxxxx-ab5a-497a-a903-d5cee636e412" `
-applicationId "xxxxxxx-c070-497b-8442-094f796ab4d0" -subscriptionID "xxxxxxxx-ff7b-4b5a-ab91-16da2cee1863" -storageAccountName "somestorageaccountname" `
-backupPath "C:\Backups" -loggingPath "C:\Scripts\backupToAzureLog.txt" -containerName "somecontainername" -storageTier "Cool" -ageOfBackupFileInHours -6 -ageOfToBeDeletedBackupsInDays -30
Finally we need to create a scheduled task to run the above script and of course the task should run after the backups are completed or whatever is generating the files you want to send to the container
You have to run this task with administrator rights due to it being an unattended task
<#
.SYNOPSIS
Create a daily scheduled task in Windows Task Scheduler for running PowerShell scripts in a user's context (hopefully a service account)
.EXAMPLE
Create-ScheduledTaskForPowerShell -timeOfDay 15:00 -pathToScript "C:\Backups\Backup-ToAzureBlob.ps1" -directoryOfTheScript "C:\Backups" `
-userName "Lappy\TestUser" -nameOfScheduledTask "BackupToAzure"
.INPUTS
[DateTime]$timeOfDay - A time of the day when the scheduled task should run; it accepts AM/PM times or 24-hour clock times (use the 24 hour clock you savages!)
[String]$pathToScript - the full path to the script including the script's file name
[String]$directoryOfTheScript - The working directory for the scheduled task; no trailing backslash needed e.g. ("C:\Backups")
[String]$userName - The name of the user who will run the script (hopefully a service account, whether local or domain);
include the domain or computer prefix e.g. domainName\userName or serverName\userName
[String]$nameOfScheduledTask - The name of the scheduled task
.OUTPUTS
A scheduled task in Windows Task Scheduler
#>
Function Create-ScheduledTaskForPowerShell
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory = $true)]
[DateTime]$timeOfDay,
[Parameter(Mandatory = $true)]
[String]$pathToScript,
[Parameter(Mandatory = $true)]
[String]$directoryOfTheScript,
[Parameter(Mandatory = $true)]
[String]$userName,
[Parameter(Mandatory = $true)]
[String]$nameOfScheduledTask
)
$taskTrigger = New-ScheduledTaskTrigger -Daily -At $timeOfDay #Use the 24 hour clock here you savages!
$taskAction = New-ScheduledTaskAction -Execute "PowerShell" -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$pathToScript`"" -WorkingDirectory $directoryOfTheScript
$Settings = New-ScheduledTaskSettingsSet -DontStopOnIdleEnd -RestartInterval (New-TimeSpan -Minutes 1) -RestartCount 10 -StartWhenAvailable
$Settings.ExecutionTimeLimit = "PT0S"
$SecurePassword = Read-Host -Prompt "Enter the password for $userName" -AsSecureString
$Credentials = New-Object System.Management.Automation.PSCredential -ArgumentList $userName, $SecurePassword
$Password = $Credentials.GetNetworkCredential().Password
Register-ScheduledTask $nameOfScheduledTask -Settings $Settings -Action $taskAction -Trigger $taskTrigger -User $UserName -Password $Password -RunLevel Highest
}
Create-ScheduledTaskForPowerShell -timeOfDay 15:00 -pathToScript "C:\Backups\Backup-ToAzureBlob.ps1" -directoryOfTheScript "C:\Backups" -userName "Lappy\TestUser" -nameOfScheduledTask "BackupToAzure"