I recently worked on a project where I was required to write an Azure Function that would watch for newly stored files in Azure Blob Storage, copy the new files over to a separate Azure Storage Account, and then delete the original file. Sure, you can use azcopy from the command-line to copy blobs between Azure Storage Accounts, but for this project it needed to be an Azure Function written in C#.

The scenario of the project was that an external entity needed to upload files to a Storage Account that our data processing solution would then read from. However, the main Azure Storage Account for the solution had its Firewall locked down to not allow any access from the Internet. With security in mind, we decided to create a source Azure Storage Account with its firewall configured to allow the external entity to upload files to it. Then we would use this Azure Function to securely copy the newly uploaded Blobs over to our internal-only accessible Azure Storage Account, where the data processing solution would access the Blob files as necessary.

Using Azure Functions, C# Script, and the Azure.Storage.Blobs NuGet package, I was able to write a new function that performed the following:

  1. Trigger the Function App to execute on new blobs in a source Azure Storage Account.
  2. Copy the new blob from the source Azure Storage Account to a destination Azure Storage Account.
  3. Delete the blob from the source Azure Storage Account once the copy operation completed.

Triggering the Function App on New Blobs

There are a few methods to configure an Azure Function to be triggered when new blobs are created in an Azure Storage Account, such as the Blob Trigger, Event Grid, and others. In this case, I chose the simpler approach of using the Blob Trigger for the Function App. This utilizes the long-tested and verified ability of Azure Functions to watch an Azure Storage Account Container for new blobs and automatically trigger execution of the Function App when new blobs are created within the Container.

Below is the JSON for the Azure Functions Binding definition within the function.json file that sets up the Function to get triggered on new blobs being saved in the source Azure Storage Account:

    {
        "bindings": [
            {
                "name": "inputBlob",
                "type": "blobTrigger",
                "direction": "in",
                "path": "containerName/{blobName}",
                "connection": "source_storage_connectionstring"
            }
        ]
    }

The source_storage_connectionstring is the Application Setting on the Function App that contains the Azure Storage Account Connection String used to connect to and authenticate with the Azure Storage Account. The path property defines the format for the Container (named containerName in this case) and the file name of the Blob (represented by {blobName}).
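
For local development, both connection strings can be provided as app settings in a local.settings.json file (when deployed, they go in the Function App's Application Settings in the Azure Portal). Below is a sketch of what that file might look like; the setting names match the ones used in this article, while the account names and keys are placeholders you would replace with your own:

    {
        "IsEncrypted": false,
        "Values": {
            "AzureWebJobsStorage": "UseDevelopmentStorage=true",
            "FUNCTIONS_WORKER_RUNTIME": "dotnet",
            "source_storage_connectionstring": "DefaultEndpointsProtocol=https;AccountName=<source-account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
            "dest_storage_connectionstring": "DefaultEndpointsProtocol=https;AccountName=<dest-account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
        }
    }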

Copy Source Blob to Destination Azure Storage Account

For this case, I decided to write a C# Script (.csx) based Azure Function App. The reason was to make the Function App code easier to maintain over time, since C# Script behaves more like an “interpreted script” stored in a Git repository, rather than requiring Visual Studio or .NET compilation to deploy. You could write essentially the same C# code in a compiled Azure Function if that is what your project requires.

To get started, I pulled in the Azure.Storage.Blobs NuGet package. Below is what the function.proj file for the Azure Function ends up looking like to include this NuGet package reference.

<Project Sdk="Microsoft.NET.Sdk">
    <ItemGroup>
        <PackageReference Include="Azure.Storage.Blobs" Version="12.11.0" />
    </ItemGroup>
</Project>

Then the run.csx C# script file for the Azure Function was written with C# code that uses the Azure.Storage.Blobs namespace to copy the blob that triggered the Azure Function from the source Azure Storage Account to the destination Azure Storage Account.

At the top of the C# code, these using statements need to be added to include the necessary namespaces that will be referenced:

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

The signature of the Run method of the Function will look similar to the following with the Blob Trigger parameters getting passed in:

public static async Task Run(Stream inputBlob, string blobName, ILogger log)

To copy the source blob to the destination, the following steps must be completed in code:

  1. Generate a SAS Token for the source blob that will be copied from.
  2. Create a BlobClient for the destination blob that will be copied to.
  3. Call the BlobClient.StartCopyFromUriAsync method to initiate copying the source blob to the destination blob.

NOTE: The BlobClient.StartCopyFromUriAsync method within the Azure.Storage.Blobs namespace performs the Blob copy operation by instructing the Azure Storage service to perform the copy operation. This means the Azure Function does not download and then upload the Blob contents. The copy operation is performed entirely within the Azure Storage service.

Below is the minimum necessary code (with some comments) for performing the mentioned steps for copying the source blob to the destination blob.

// Retrieve the SOURCE and DESTINATION Storage Account Connection Strings
var sourceConnString = Environment.GetEnvironmentVariable("source_storage_connectionstring");
var destConnString = Environment.GetEnvironmentVariable("dest_storage_connectionstring");

// Create SOURCE Blob Client
var sourceBlobClient = new BlobClient(sourceConnString, "containerName", blobName);

// Generate a SAS URI for reading the SOURCE Blob with a 2 hour expiration
var sourceBlobSasUri = sourceBlobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.Now.AddHours(2));

// Create DESTINATION Blob Client
var destBlobClient = new BlobClient(destConnString, "containerName", blobName);

// Initiate Blob Copy from SOURCE to DESTINATION
await destBlobClient.StartCopyFromUriAsync(sourceBlobSasUri);

After the BlobClient.StartCopyFromUriAsync call is made, the source-to-destination blob copy operation will be completed by the Azure Storage service. The Function could end execution and the copy would still progress to completion. However, there are cases where it may error, and you’ll want to log that error. To do this you’ll need to monitor the copy operation progress until it either completes or fails.

Below is a code block that loops on a short timer to monitor the blob copy operation progress, log the status as it progresses, and then throw an exception if a failure occurs.

var destProps = (await destBlobClient.GetPropertiesAsync()).Value;
while (destProps.CopyStatus == CopyStatus.Pending) {
  // Log copy operation status
  log.LogInformation($"Copy operation status: {destProps.CopyProgress}");
  // Pause for 30 seconds before checking again
  await Task.Delay(30 * 1000);
  // Check copy properties again for updated status
  destProps = (await destBlobClient.GetPropertiesAsync()).Value;
}

// Throw exception if a failure occurred
if (destProps.CopyStatus == CopyStatus.Failed) {
  throw new Exception($"Copy operation failed: {destProps.CopyStatusDescription}");
}

Delete Source Blob Once Copy is Complete

Once the source blob has been copied to the destination blob, I had a requirement to delete the source blob. So I needed to add a block of code after the copy operation check to delete it. However, I only wanted to delete the source blob if the copy operation succeeded, just in case some other status was returned that was not CopyStatus.Success or CopyStatus.Failed, such as CopyStatus.Pending if the copy operation was somehow still running.

Below is a simple code block that checks that the blob copy operation status is CopyStatus.Success and then deletes the source blob if it is.

// Check that the copy operation was successful
if (destProps.CopyStatus == CopyStatus.Success) {
  // Delete the SOURCE blob
  await sourceBlobClient.DeleteAsync();
}

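Putting the pieces together, the complete run.csx might look like the following sketch. It assumes the blob Container is named containerName in both Storage Accounts and uses the same Application Setting names shown earlier; adjust these names for your own solution:

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;
    using Azure.Storage.Sas;
    using Microsoft.Extensions.Logging;

    public static async Task Run(Stream inputBlob, string blobName, ILogger log)
    {
        // Retrieve the SOURCE and DESTINATION Storage Account Connection Strings
        var sourceConnString = Environment.GetEnvironmentVariable("source_storage_connectionstring");
        var destConnString = Environment.GetEnvironmentVariable("dest_storage_connectionstring");

        // Create SOURCE Blob Client and generate a read-only SAS URI (2 hour expiration)
        var sourceBlobClient = new BlobClient(sourceConnString, "containerName", blobName);
        var sourceBlobSasUri = sourceBlobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.Now.AddHours(2));

        // Create DESTINATION Blob Client and initiate the server-side copy
        var destBlobClient = new BlobClient(destConnString, "containerName", blobName);
        await destBlobClient.StartCopyFromUriAsync(sourceBlobSasUri);

        // Poll the copy operation until it completes or fails
        var destProps = (await destBlobClient.GetPropertiesAsync()).Value;
        while (destProps.CopyStatus == CopyStatus.Pending)
        {
            log.LogInformation($"Copy operation status: {destProps.CopyProgress}");
            await Task.Delay(30 * 1000);
            destProps = (await destBlobClient.GetPropertiesAsync()).Value;
        }

        // Throw so the failure is logged and the source blob is preserved
        if (destProps.CopyStatus == CopyStatus.Failed)
        {
            throw new Exception($"Copy operation failed: {destProps.CopyStatusDescription}");
        }

        // Only delete the SOURCE blob if the copy succeeded
        if (destProps.CopyStatus == CopyStatus.Success)
        {
            await sourceBlobClient.DeleteAsync();
            log.LogInformation($"Copied and deleted source blob: {blobName}");
        }
    }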
If you’re finding yourself in a scenario where you are required to write an Azure Function to copy a Blob from one Azure Storage Account to another, I hope you find the code examples in this article useful.

Happy serverless coding!


Chris Pietschmann is a Microsoft MVP, HashiCorp Ambassador, and Microsoft Certified Trainer (MCT) with 20+ years of experience designing and building Cloud & Enterprise systems. He has worked with companies of all sizes from startups to large enterprises. He has a passion for technology and sharing what he learns with others to help enable them to learn faster and be more productive.