
Abstract and Learning Objectives

In this hands-on lab, you will construct an end-to-end proof of concept (POC) for ticket ordering, built on a microservices architecture that uses Service Fabric and Azure Functions, alongside other supporting Azure services such as API Management, Web Apps, Azure Active Directory, and Cosmos DB.

At the end of this hands-on lab, you will be better able to build solutions that leverage these Azure services, in addition to gaining experience with deploying, scaling, upgrading, and rolling back Service Fabric applications.

Microservices Architecture Hands-on Lab: Developer Edition - Microsoft Cloud Workshop

Overview

Contoso Events is an online ticket sales service for concerts, sporting events, and other events. They are redesigning their solution for scale with a microservices strategy and want to implement a POC for the path that receives the most traffic: ticket ordering.

In this hands-on lab, you will construct an end-to-end POC for ticket ordering. You will leverage Service Fabric, API Management, Function Apps, Web Apps, and Cosmos DB.

Microservices Architecture MCW Overview Diagram

Solution Architecture

The following figures are intended to help you keep track of all the technologies and endpoints you are working with in this hands-on lab. The first figure is the overall architecture, indicating the Azure resources to be employed. The second figure is a more detailed picture of the key items you will want to remember about those resources as you move through the exercises.

MCW Microservices Architecture – Solution Architecture – Diagram 1
MCW Microservices Architecture – Solution Architecture – Diagram 2

Before the Hands-on Lab Setup Guide

Before you work through the full Microservices Architecture hands-on lab, please complete these initial setup steps. They will set up the environment that you need to run through the hands-on lab. This setup process should take approximately 50 minutes.

Requirements

  1. Microsoft Azure subscription must be pay-as-you-go or MSDN

    • Trial subscriptions will not work.
  2. A virtual machine configured with:

    • Visual Studio 2019 Community edition, or later

    • Azure Development workload enabled in Visual Studio 2019 (enabled by default on the VM)

    • Service Fabric SDK 3.3 or later for Visual Studio

    • Google Chrome browser (Swagger commands do not work in IE)

    • Docker for Windows

    • PowerShell 3.0 or higher (v5.1 already installed on VM)



Task 1: Provision Service Fabric Cluster

In this task, you will provision the Service Fabric Cluster in Azure.

  1. In the Azure portal, select +Create a Resource, then type "Service Fabric" into the Search the Marketplace box. Select Service Fabric Cluster from the results.

  2. On the Service Fabric Cluster blade, select Create.

  3. On the Basics blade of the Create Service Fabric cluster screen, enter the following:

  • Cluster name: Enter contosoeventssf-SUFFIX, replacing SUFFIX with your alias, initials, or another value to make the name unique (indicated by a green check in the text box).

  • Operating system: Set to UbuntuServer 16.04 LTS

  • Username: Enter holuser.

  • Password: Enter Password.1!!

  • Subscription: Select the subscription you are using for this lab.

  • Resource Group: Select Create new, and enter hands-on-lab for the resource group name. You can add -SUFFIX, if needed, to make the resource group name unique. This is the resource group you will use for all resources you create for this hands-on lab.

  • Location: Select the region to use. Select the closest region to your current location.

  • Select OK.

  4. On the Cluster configuration blade, set the following:
  • Node type count: Select 1.

  • Node type 1 (Primary): Select to configure required settings. On the Node type configuration blade enter:

    • Node type name: Enter Web.

    • Durability tier: Leave Bronze selected.

    • Virtual machine size: Select a VM size of D1_V2 Standard and select Select on the Choose a size blade.

    • Single node cluster: Leave unchecked.

    • Initial VM scale set capacity: Leave set to 5.

    • Custom endpoints: Enter 8082, 8083. This will allow the microservices to be accessible through the cluster.

    • Configure advanced settings: Leave unchecked.

    • Select OK on the Node type configuration blade.

    • Select OK on the Cluster configuration blade.

  5. On the Security blade, you can provide security settings for your cluster. This configuration is completed up front and cannot be changed later. Set the following:
  • Configuration Type: Leave "Basic" selected.

  • Key vault: Select to configure required settings. On the Key vault configuration blade select "Create a new vault".

  • On the "Create key vault" configuration blade enter:

    • Name: hands-on-lab-SUFFIX

    • Resource Group: Select "Create new" and set the name as hands-on-lab

    • Location: Use the same location as the first resource group you created.

  • Select "Create" on the Create key vault configuration blade. Wait for the key vault deployment to complete.

  • When the key vault deployment completes you will return to the Security configuration blade. You will see a warning that the key vault is not enabled for deployment. Follow these steps to resolve the warning:

    • Choose "Edit access policies for hands-on-lab-SUFFIX".

    • In the Access policies configuration blade, choose the link "Click to show advanced access policies".

    • Check the "Enable access to Azure Virtual Machines for deployment" checkbox.

    • Choose "Save". When the key vault update completes, close the Access policies blade.

    • Enter hands-on-lab-SUFFIX as the certificate name. Then choose OK on the Security configuration blade.

  6. On the Summary blade, review the summary, and select Create to begin provisioning the new cluster.

  7. It can take 30 minutes or more to provision your Service Fabric Cluster. You can move on to the next task while you wait.

Note: If you experience errors related to lack of available cores, you may have to delete some other compute resources, or request additional cores to be added to your subscription, and then try this again.
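For teams that prefer scripting, the portal steps above can also be expressed with the Azure CLI. The sketch below is illustrative only: it composes an equivalent az sf cluster create command with placeholder values (SUFFIX, region, certificate subject) and prints it for review rather than executing it.

```shell
#!/bin/sh
# Placeholder values -- replace SUFFIX and the region to match your lab.
RG=hands-on-lab
CLUSTER=contosoeventssf-SUFFIX
LOCATION=eastus

# Compose the command and print it for review instead of running it here.
CMD="az sf cluster create \
  --resource-group $RG --cluster-name $CLUSTER --location $LOCATION \
  --cluster-size 5 --vm-user-name holuser --vm-password 'Password.1!!' \
  --os UbuntuServer1604 \
  --certificate-subject-name $CLUSTER.$LOCATION.cloudapp.azure.com"
echo "$CMD"
```

Running the printed command requires the az CLI and an authenticated session; the portal flow above remains the supported path for this lab.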

Task 2: Provision a lab virtual machine (VM)

In this task, you will provision a virtual machine (VM) in Azure. The VM image used will have Visual Studio Community 2019 installed.

  1. Launch a web browser and navigate to the Azure portal.

  2. Select +Create a Resource, then type "Visual Studio 2019" into the search bar. Select Visual Studio 2019 Latest from the dropdown list.

  3. On the Visual Studio 2019 Latest blade, select Visual Studio 2019 Community (latest release) on Windows 10 Enterprise N (x64) as the software plan and select Create.

  4. Set the following configuration on the Basics tab:

    • Subscription: Select the subscription you are using for this lab.

    • Resource group: Select Use existing, and select the hands-on-lab resource group created previously.

    • Virtual machine name: Enter LabVM

    • Region: Select the region you are using for resources in this lab.

    • Availability options: Leave No infrastructure redundancy required selected.

    • Image: Leave Visual Studio 2019 Community (latest release) on Windows 10 Enterprise N (x64) selected.

    • Username: Enter holuser

    • Password: Enter Password.1!!

  5. Select Change size.

  6. On the Select a VM Size blade, enter d4 into the search text field, choose a size from the filtered results, and then click the Select button to return to the Create a virtual machine blade.

  7. Within the INBOUND PORT RULES section, select the Allow selected ports option, then choose RDP (3389) from the Select inbound ports dropdown list.

  8. Accept all the remaining default values on the Basic blade and select Review + create.

  9. Select Create on the Create a virtual machine blade to provision the virtual machine.

Note: It may take 10+ minutes for the virtual machine to complete provisioning.

Task 3: Connect to your lab VM

In this step, you will open an RDP connection to your Lab VM and disable Internet Explorer Enhanced Security Configuration.

  1. Connect to the Lab VM (if you are already connected to your Lab VM, skip this task).

  2. From the side menu in the Azure portal, select Virtual machines.

  3. In the Virtual machines blade, select LabVM from the list.

  4. In the LabVM blade, select Connect from the top menu.

  5. Download and open the RDP file.

  6. Select Connect on the Remote Desktop Connection dialog.

  7. Enter the following credentials (or the non-default credentials if you changed them):

    a. Username: Enter holuser

    b. Password: Enter Password.1!!

  8. Select Yes to connect, if prompted that the identity of the remote computer cannot be verified.

Task 4: Install Chrome on LabVM

In this task, you will install the Google Chrome browser on your Lab VM.

  1. On your Lab VM, open a web browser, and navigate to https://www.google.com/chrome/browser/desktop/index.html, and select Download Chrome.

  2. Select Accept and Install on the terms of service screen.

  3. Select Run on the Application Run — Security Warning dialog.

  4. Select Run again, on the Open File — Security Warning dialog.

  5. Once the Chrome installation completes, a Chrome browser window should open. For ease, you can use the instructions in that window to make Chrome your default browser.

Task 5: Install Docker for Windows

In this task, you will install Docker for Windows on your Lab VM.

  1. On your Lab VM, open a browser and navigate to: https://download.docker.com/win/stable/Docker%20for%20Windows%20Installer.exe.

  2. If prompted, select Save File to download the installer on the Lab VM.

  3. When finished, open the folder where the file was downloaded.

  4. Double-click the Docker for Windows Installer.exe file in order to run the installer.

  5. Follow the instructions to install the application.

  6. Once the Docker for Windows installation completes, select the Close and log out button. This action will log out the current session.

  7. Reconnect to the LabVM virtual machine by repeating step 5 in Task 3.

  8. When prompted, select Ok on the Docker Desktop dialog box that asks you if you want to enable Hyper-V and Containers features. This action will restart the virtual machine.

  9. Reconnect to the LabVM virtual machine by repeating step 5 in Task 3.

  10. Wait for Docker for Windows to start. You can see its status on the icon in the tray bar. When Docker starts successfully, it will display the Welcome window.

Task 6: Install Service Fabric SDK for Visual Studio

In this task, you will install the latest Service Fabric SDK for Visual Studio on your Lab VM.

  1. On your Lab VM, open a browser, and navigate to: https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started.

  2. Scroll down on the page to the Install the SDK and tools section and select Install the Microsoft Azure Service Fabric SDK under the To use Visual Studio 2017 heading. Despite the heading, the SDK can also be installed for Visual Studio 2019.

  3. Run the downloaded executable and select Install in the Web Platform Installer screen.

  4. On the Prerequisites screen, select I Accept.

  5. Select Finish when the install completes.

  6. Select Exit on the Web Platform installer to close it.

  7. Restart the VM to complete the installation and start the local Service Fabric cluster service.

Task 7: Setup Service Fabric certificate

When you create a new Service Fabric Cluster using the portal, a secure cluster is deployed. To be able to use it later, you must set up its certificate.

In this task, you will download the required certificate and install it on your Lab VM.

  1. In the Azure portal, navigate to the Resource Group you created previously and where you created the Key vault that supports the cluster.

  2. Select the key vault from the list of resources in the resource group.

  3. Under the Settings category in the menu, select Certificates and then select the existing certificate.

  4. Select the Current Version of the existing certificate.

  5. In the certificate information blade, select Download in PFX/PEM format and save the certificate.

  6. Copy the downloaded certificate into the Lab VM.

  7. On the Lab VM, double-click the copied certificate to initiate its installation. Select Local Machine as the Store Location and select Next.

  8. Select Next.

  9. Select Next.

  10. Select Next.

  11. Select Finish.

  12. When the import finishes successfully, select OK.

  13. On the Lab VM, double-click the copied certificate once again to initiate its installation. Select Current User as the Store Location and select Next.

  14. Select Next.

  15. Select Next.

  16. Select Next.

  17. Select Finish.

  18. When the import finishes successfully, select OK.

Task 8: Validate Service Fabric ports

Occasionally, when you create a new Service Fabric Cluster using the portal, the ports that you requested are not created. This will become evident when you try to deploy and run the Web App, because the required ports will not be accessible through the cluster.

In this task, you will validate that the ports are open and if not, fix the issue.

  1. In the Azure portal, navigate to the Resource Group you created previously, and where you created the cluster. If your Service Fabric cluster is still deploying, do not proceed to the next step until the deployment is completed.

  2. Select the load balancer from the list of resources in the resource group.

  3. Under the Settings category in the menu, select Health probes.

  4. Verify that a probe exists for each of ports 8082 and 8083, and that each probe is "Used By" a load balancing rule. If both of these are true, you can skip the remainder of this task. Otherwise, proceed to the next step to create the missing probe and load-balancing rule.

  5. Select +Add on the Health probes blade.

  6. On the Add health probe blade, enter the following:

    • Name: Enter WebApiPortProbe.

    • Protocol: Select TCP.

    • Port: Enter 8082.

    • Interval: Leave the default value.

    • Unhealthy threshold: Leave the default value.

    • Select OK to create the probe.

  7. Once the Health probe is added (this can take a few minutes to update), you will create a rule associated with this probe. Under the Settings block in the menu, select Load balancing rules.

  8. Select +Add on the Load balancing rules blade.

  9. On the Add Load balancing rules blade, enter the following:

    • Name: Enter LBWebApiPortRule.

    • IP Version: Leave IPv4 selected.

    • Frontend IP address: Leave the default value selected.

    • Protocol: Leave as TCP.

    • Port: Set to 8082.

    • Backend port: Set to 8082.

    • Backend pool: Leave the default value selected.

    • Health probe: Select the WebApiPortProbe you created previously.

    • Leave the default values for the remaining fields, and Select OK.

  10. If you get an error notification such as "Failure to create probe", check whether the probe was nevertheless created; it usually is, and the error can be ignored. You now have a cluster ready to deploy to, with 8082 exposed as the Web API endpoint/port.

  11. If necessary, repeat the same steps to create the probe for port 8083.
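If you script your environment, the probe and rule can also be created with the Azure CLI. This is a sketch only: the load balancer name is a guess at the portal's naming convention (LB-cluster-nodetype), so verify it with az network lb list first. The commands are composed and printed rather than executed.

```shell
#!/bin/sh
# Illustrative names; verify the real load balancer name with:
#   az network lb list --resource-group hands-on-lab --output table
RG=hands-on-lab
LB=LB-contosoeventssf-SUFFIX-Web

# Compose one probe + one rule per port, printing the commands for review.
CMDS=""
for PORT in 8082 8083; do
  CMDS="$CMDS
az network lb probe create --resource-group $RG --lb-name $LB \
  --name WebApiPortProbe$PORT --protocol tcp --port $PORT
az network lb rule create --resource-group $RG --lb-name $LB \
  --name LBWebApiPortRule$PORT --protocol tcp \
  --frontend-port $PORT --backend-port $PORT --probe-name WebApiPortProbe$PORT"
done
echo "$CMDS"
```

Creating the probe before the rule mirrors the portal order above, since the rule references the probe by name.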

You should follow all steps provided before performing the Hands-on lab.

Developer Edition Hands-on Lab Guide

Exercise 1: Environment setup

Duration: 30 minutes

Contoso Events has provided a starter solution for you. They have asked you to use this as the starting point for creating the Ticket Order POC solution with Service Fabric.

Because this is a "born in Azure" solution, it depends on many Azure resources. You will be guided through creating those resources before you work with the solution in earnest. The following figure illustrates the resource groups and resources you will create in this exercise.

MCW Microservices Architecture – Environment Setup Diagram

Task 1: Download and open the ContosoEventsPoC starter solution

  1. On your Lab VM, open a browser to https://github.com/Microsoft/MCW-Microservices-architecture. Click the Clone or download link, and then select Download ZIP.

  2. Unzip the contents to the folder C:\.

  3. Locate the solution file (C:\MCW-Microservices-architecture-master\Source\ContosoEventsPoC-DeveloperEdition\Src\ContosoEventsPOC.sln), and double-click it to open it with Visual Studio 2019.

  4. If prompted about how you want to open the file, select Visual Studio 2019, and select OK.

  5. Log into Visual Studio or set up an account, when prompted.

  6. If presented with a security warning, uncheck Ask me for every project in this solution, and select OK.

  7. If you are missing any prerequisites (listed under Requirements above), you may be prompted to install these at this point.

  8. Verify your Visual Studio version is 16.1.0 or higher.

    a. Click Help in the menu, then select About Microsoft Visual Studio.

    b. If the version is lower than 16.1.0, you will need to update it. Click OK, then click View in the menu. Select Notifications; you should see an entry for Visual Studio Update is available. Select it, and then click Update to update your instance.

  9. Before you attempt to compile the solution, set the configuration to x64 by selecting it from the Solution Platforms drop down in the Visual Studio toolbar.

  10. Build the solution, by selecting Build from the Visual Studio menu, then selecting Build Solution.

Task 2: API Management

In this task, you will provision an API Management Service in the Azure portal.

  1. In the Azure portal, select +Create a resource, enter API Management into the Search the Marketplace box, then select API management from the results.

  2. In the API Management blade, select Create.

  3. In the API Management service blade, enter the following:

    a. Name: Enter a unique name, such as contosoevents-SUFFIX.

    b. Subscription: Choose your subscription.

    c. Resource group: Select Use existing, and select the hands-on-lab resource group you created previously.

    d. Location: Select the same region used for the hands-on-lab resource group.

    e. Organization name: Enter Contoso Events.

    f. Administrator email: Enter your email address.

    g. Pricing tier: Select Developer (No SLA).

    h. Enable Application Insights: Leave unchecked.

    i. Select Create.

  4. After the API Management service is provisioned, the service will be listed in the Resource Group. This may take 10-15 minutes, so move to Task 3 and return later to verify.

Task 3: Web App

In these steps, you will provision a Web App in a new App Service Plan.

  1. Select +Create a resource in the Azure Portal, select Web, then select Web App.

  2. On the Create Web App blade, enter the following:

    a. Subscription: Select your subscription.

    b. Resource group: Select the hands-on-lab resource group created previously.

    c. Name: Enter a unique name, such as contosoeventsweb-SUFFIX

    d. Publish: Select Code

    e. Runtime stack: Select ASP.NET V4.7

    f. Operating System: Select Windows.

    g. Region: Select the same location you have been using for other resources in this lab.

    h. Windows Plan / Sku and size: Leave the default values.

    i. Select Review and create, and then select Create to provision the Web App.

  3. You will receive a notification in the Azure portal when the Web App deployment completes. From this, select Go to resource.

  4. On the Web App Overview blade, you can see the URL used to access your Web App. If you select this, it will open an empty site, indicating your App Service app is up and running.

Task 4: Function App

In this task, you will provision a Function App using a Consumption Plan. By using a Consumption Plan, you enable dynamic scaling of your Functions.

  1. Select +Create a resource in the Azure Portal, and enter "Function App" in the Search the Marketplace box, then select Function App from the results.

  2. Select Create on the Function App blade.

  3. On the Create Function App blade, enter the following:

    a. App name: Enter a unique name, such as contosoeventsfn-SUFFIX.

    b. Subscription: Select your subscription.

    c. Resource group: Select Use existing, and select the hands-on-lab resource group created previously.

    d. OS: Select Windows.

    e. Hosting Plan: Select Consumption Plan.

    f. Location: Select the same location as the hands-on-lab resource group.

    g. Runtime Stack: .NET Core

    h. Storage: Leave Create new selected, and accept the default name.

    i. Application Insights: Click to open the Application Insights blade. Select Disable and then select Apply.

    j. Select Create to provision the new Function App.

Task 5: Storage account

In this section, you will create a Storage account for the application to create and use queues required by the solution.

  1. In the Azure portal, select +Create a resource to open the New blade. Select the Storage option from the menu, then select Storage account under Featured.

  2. In the Create Storage account blade Basics Tab, enter the following:

    a. Subscription: Select your subscription.

    b. Resource group: Select the existing hands-on-lab resource group previously created.

    c. Name: Enter a unique name, such as contosoeventsSUFFIX. Please note that the field can contain only lowercase letters and numbers.

    d. Location: Select the same location as the hands-on-lab resource group.

    e. Performance: Select Standard.

    f. Account kind: Select Storage (general purpose v1).

    g. Replication: Select Locally-redundant storage (LRS).

    h. Select Next : Advanced >.

  3. Within the Advanced Tab:

    a. Set Secure transfer required to Disabled, and then select Review + create tab.

    b. Select Create in the Review + create tab.

Task 6: Cosmos DB

In this section, you will provision a Cosmos DB account, a Cosmos DB Database and a Cosmos DB collection that will be used to collect ticket orders.

  1. In the Azure portal, select +Create a resource to open the New blade. Select the Databases option from the menu, then select Azure Cosmos DB.

  2. On the Create Azure Cosmos DB Account blade, enter the following:

    a. Subscription: Select your subscription.

    b. Resource group: Select the hands-on-lab resource group previously created.

    c. Account Name: Enter a unique value, such as contosoeventsdb-SUFFIX.

    d. API: Select Core (SQL).

    e. Location: Select the location used for the hands-on-lab resource group. If this location is not available, select one close to that location that is available.

    f. Geo-Redundancy: Select Disable.

    g. Multi-region Writes: Select Disable.

    h. Select Review + create.

    i. Select Create to provision the Cosmos DB after the validation has passed.

  3. When the Cosmos DB account is ready, select Azure Cosmos DB from the portal menu, and select your Cosmos DB account from the list.

  4. On the Cosmos DB account blade, under Containers in the side menu, select Browse.

  5. On the Browse blade, select +Add Container.

  6. On the Add Container dialog, enter the following:

    a. Database id: Select Create new and enter TicketManager.

    b. Provision database throughput: Leave unchecked.

    c. Container id: Enter Orders.

    d. Partition key: /EventId

    e. My partition key is larger than 100 bytes: Leave unchecked.

    f. Throughput: Enter 2500.

    g. Select OK to create the new container.

  7. Select New Container from the screen that appears.

  8. In the Add Container dialog, enter the following:

    a. Database id: Select Use existing and then select TicketManager.

    b. Container id: Enter Events.

    c. Partition key: /id

    d. Throughput: Enter 2500.

    e. Select OK to create the new container.

  9. You will be able to see that the two containers exist in the new database.
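For reference, the same database and containers can be created from a script. This sketch assumes the Azure CLI cosmosdb command group; the account name is a placeholder, and the commands are printed for review rather than executed.

```shell
#!/bin/sh
# Placeholder account name -- replace SUFFIX with your value.
ACCOUNT=contosoeventsdb-SUFFIX
RG=hands-on-lab

# Database plus the two containers, matching the portal steps above.
CMDS="az cosmosdb sql database create --account-name $ACCOUNT \
  --resource-group $RG --name TicketManager
az cosmosdb sql container create --account-name $ACCOUNT --resource-group $RG \
  --database-name TicketManager --name Orders \
  --partition-key-path /EventId --throughput 2500
az cosmosdb sql container create --account-name $ACCOUNT --resource-group $RG \
  --database-name TicketManager --name Events \
  --partition-key-path /id --throughput 2500"
echo "$CMDS"
```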

Task 7: Container Registry

In this section, you will provision an Azure Container Registry resource that will be used to store the microservices images.

  1. In the Azure portal, select +Create a resource to open the New blade. Select the Containers option from the menu, then select Container Registry.

  2. In the Create container registry blade, set the following property values

    a. Registry name: Enter a unique name, such as contosoeventsSUFFIX.

    b. Subscription: Select your subscription.

    c. Resource group: Select the existing hands-on-lab resource group previously created.

    d. Location: Select the same location as the hands-on-lab resource group.

    e. Admin user: Select Enable.

    f. SKU: Select Standard

    g. Select Create.
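The equivalent Azure CLI command for this registry is sketched below (the name is a placeholder; --admin-enabled true mirrors the Admin user setting, which the docker login step in Exercise 3 relies on). The command is printed for review rather than executed.

```shell
#!/bin/sh
# Placeholder name -- replace SUFFIX (lowercase letters and numbers only).
CMD="az acr create --name contosoeventsSUFFIX --resource-group hands-on-lab \
  --sku Standard --admin-enabled true"
echo "$CMD"
```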

Exercise 2: Microservices containerization

Duration: 30 minutes

In this exercise, you will create the Docker images for each of the microservices in the solution.

Task 1: Inspect the Dockerfile files

  1. In Visual Studio, open the Dockerfile located in the ContosoEvents.Api.Events project.

  2. The Dockerfile contains all the steps required to build the Docker image for the Events microservice. As the first line of code shows, the final image is based on the Linux-based mcr.microsoft.com/dotnet/core/aspnet:2.2-stretch-slim Docker image, while the mcr.microsoft.com/dotnet/core/sdk:2.2-stretch image is used to compile and publish the Events microservice project.

  3. Open the Dockerfile located in the ContosoEvents.Api.Orders project.

  4. Likewise, this Dockerfile is based on the mcr.microsoft.com/dotnet/core/aspnet:2.2-stretch-slim Linux-based Docker image. It also uses the mcr.microsoft.com/dotnet/core/sdk:2.2-stretch image to compile and publish the Orders microservice project.
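Both files follow the standard multi-stage pattern for ASP.NET Core 2.2. The sketch below illustrates the shape of such a Dockerfile for the Events microservice; it is representative of the pattern, not a copy of the file in the repository.

```dockerfile
# Build stage: the SDK image compiles and publishes the project.
FROM mcr.microsoft.com/dotnet/core/sdk:2.2-stretch AS build
WORKDIR /src
COPY . .
RUN dotnet publish ContosoEvents.Api.Events -c Release -o /app

# Runtime stage: the slim ASP.NET Core image hosts only the published output.
FROM mcr.microsoft.com/dotnet/core/aspnet:2.2-stretch-slim AS final
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "ContosoEvents.Api.Events.dll"]
```

The multi-stage split keeps the SDK (and source) out of the final image, so only the runtime and the published binaries are shipped.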

Task 2: Build the Docker images

  1. Open a Command Prompt by clicking the Start menu then typing cmd.

  2. Change the directory to C:\MCW-Microservices-architecture-master\Source\ContosoEventsPoC-DeveloperEdition\Src.

  3. In the Src folder as the root, type the following command in the Command Prompt, replacing [YOUR CONTAINER REGISTRY] with the container registry name that you selected in Exercise 1, Task 7, Step 2: docker build --tag [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-events:latest --file ContosoEvents.Api.Events\Dockerfile . Note that the trailing period is part of the command; it sets the build context to the current folder. Press Enter to execute the command and start the Docker container image creation process for the Events microservice.

  4. Still in the Src folder as the root, type the following command in the Command Prompt, again replacing [YOUR CONTAINER REGISTRY] with the container registry name that you selected in Exercise 1, Task 7, Step 2: docker build --tag [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-orders:latest --file ContosoEvents.Api.Orders\Dockerfile . Press Enter to execute the command and start the Docker container image creation process for the Orders microservice.
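The two build commands can be captured in a small script. REGISTRY is a placeholder for the name you chose in Exercise 1, Task 7; the script composes and prints both commands so you can review them before running them in the Src folder. Forward slashes in the --file paths work for Docker on Windows as well.

```shell
#!/bin/sh
# REGISTRY is a placeholder -- use the registry name from Exercise 1, Task 7.
REGISTRY=contosoeventssuffix

# The trailing " ." in each command is the build context (the Src folder).
EVENTS_CMD="docker build --tag $REGISTRY.azurecr.io/contosoevents-events:latest --file ContosoEvents.Api.Events/Dockerfile ."
ORDERS_CMD="docker build --tag $REGISTRY.azurecr.io/contosoevents-orders:latest --file ContosoEvents.Api.Orders/Dockerfile ."

# Printed for review; paste into the Command Prompt to execute.
echo "$EVENTS_CMD"
echo "$ORDERS_CMD"
```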

Task 3: Inspect the Docker images

  1. In the Command Prompt, type the following command: docker images and press Enter to execute.

  2. The docker images command lists all the images stored in your local repository. At this point, you should have both the [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-events:latest and [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-orders:latest images at a minimum.

Exercise 3: Push the container images to Azure Container Registry

Duration: 15 minutes

Task 1: Log in to the Container Registry

  1. In the Azure Portal, select the Resource groups option in the side menu, then select the hands-on-lab resource group from the list.

  2. In the hands-on-lab blade, select the contosoeventsSUFFIX container registry that you created in Exercise 1, Task 7. For easy access, you can use the filter dropdown list and select only the Container registries option in the list.

  3. In the Container registry blade, select the Access keys from the menu.

  4. In the Access keys blade, copy both the username and password to Notepad. You will use these credential values to log in to the Azure Container Registry from Docker.

  5. Open a Command Prompt window.

  6. Type the following command: docker login [YOUR CONTAINER REGISTRY].azurecr.io. Replace [YOUR CONTAINER REGISTRY] with the name that you selected in Exercise 1, Task 7. Press Enter to execute the command.

  7. When prompted, paste the username and password to login.

Task 2: Push the images to the Container Registry

  1. Type the following command docker push [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-events:latest. Replace [YOUR CONTAINER REGISTRY] with the name that you selected in Exercise 1, Task 7. Press Enter to execute the command.

  2. Repeat this same process for the orders image, by typing the following command and then pressing Enter: docker push [YOUR CONTAINER REGISTRY].azurecr.io/contosoevents-orders:latest
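Together, the login and push steps look like this (REGISTRY is a placeholder; the commands are printed for review, and docker login will prompt for the username and password copied from the Access keys blade).

```shell
#!/bin/sh
# REGISTRY is a placeholder -- use the registry name from Exercise 1, Task 7.
REGISTRY=contosoeventssuffix

CMDS="docker login $REGISTRY.azurecr.io
docker push $REGISTRY.azurecr.io/contosoevents-events:latest
docker push $REGISTRY.azurecr.io/contosoevents-orders:latest"

# Printed for review; run each line in the Command Prompt to execute.
echo "$CMDS"
```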

Task 3: Inspect Container Registry in the Azure Portal

  1. In the Azure Portal, select the Resource groups option in the side menu, then select the hands-on-lab resource group from the list.

  2. In the hands-on-lab blade, select the contosoeventsSUFFIX container registry that you created in Exercise 1, Task 7. For easy access, you can use the filter dropdown list and select only the Container registries option in the list.

  3. In the Container registry blade, select the Repositories option from the side menu to display the container images uploaded in the Azure Container Registry account.

Exercise 4: Publish the Service Fabric Application

Duration: 15 minutes

In this exercise, you will publish the Service Fabric Application to the Azure cluster you created previously.

Task 1: Inspect the Cosmos DB account properties

  1. In Azure Portal, navigate to the hands-on-lab resource group by using the Resource groups menu item in the side menu.

  2. In the hands-on-lab blade, locate and select the Azure Cosmos DB account that you created previously with the name contosoeventsdb-SUFFIX.

  3. In the Cosmos DB account blade, select the Keys option from the menu. In the Keys blade, copy the URI and PRIMARY KEY values to Notepad.

Task 2: Inspect the Storage account properties

  1. In Azure Portal, navigate to the hands-on-lab resource group by using the Resource groups menu item in the side menu.

  2. In the hands-on-lab blade, locate and select the Storage account that you created previously with the name contosoeventsSUFFIX.

  3. In the Storage account blade, select the Access keys option from the menu. Copy the Connection string value from the key1 section to Notepad.

Task 3: Set the environment variables to the Events microservice

  1. Open the ServiceManifest.xml file located in ContosoEvents\ApplicationPackageRoot\EventsPkg.

  2. Locate the <ImageName> element (line 18) and replace [YOUR CONTAINER REGISTRY] with the name of your Container Registry account. This element specifies the fully qualified name of the container image that is going to be used by the Service Fabric service.

  3. Locate the accountEndpoint and accountKey environment variables inside the <EnvironmentVariables> element. Set their values to the URI and PRIMARY KEY, respectively, that you copied from the Cosmos DB account properties.

Task 4: Set the environment variables to the Orders microservice

  1. Open the ServiceManifest.xml file located in ContosoEvents\ApplicationPackageRoot\OrdersPkg.

  2. Locate the <ImageName> element (line 18) and replace [YOUR CONTAINER REGISTRY] with the name of your Container Registry account. This element specifies the fully qualified name of the container image that is going to be used by the Service Fabric service.

  3. Locate the accountEndpoint and accountKey environment variables inside the <EnvironmentVariables> element. Set their values, respectively, to the URI and PRIMARY KEY that you copied from the Cosmos DB account properties.

  4. Locate the storageConnectionString environment variable inside the <EnvironmentVariables> element. Set its value to the Connection string that you copied previously from the Storage account properties.

Task 5: Set the Container Registry credentials in the ApplicationManifest.xml file

  1. Open the ApplicationManifest.xml file located in the ApplicationPackageRoot folder of the ContosoEvents project.

  2. Locate the <RepositoryCredentials> elements and replace the AccountName and Password attribute values with the username and password respectively that you copied previously from the Container Registry account properties.
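
After this step, the credentials element might look like the following sketch. The attribute values are placeholders; the PasswordEncrypted attribute is shown as it typically appears when the password is stored in plain text:

```xml
<!-- Sketch only: placeholder values -->
<RepositoryCredentials AccountName="contosoeventsSUFFIX"
                       Password="[YOUR CONTAINER REGISTRY PASSWORD]"
                       PasswordEncrypted="false" />
```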

Task 6: Publish the Service Fabric Application

  1. From Solution Explorer, right-click the ContosoEvents project and select Publish….

  2. In the Publish Service Fabric Application dialog, set the Target profile to Cloud.xml, and select your Service Fabric Cluster endpoint from the Connection Endpoint drop down, then select Publish.

    Note: Ensure that StoreLocation is CurrentUser and StoreName is My.

  3. Publishing to the hosted Service Fabric Cluster takes about 5 minutes. It follows the same steps as a local publish, but with an alternate configuration. The Visual Studio output window keeps you updated on progress.

  4. From the Visual Studio output window, validate that the deployment has completed successfully before moving on to the next task.

Exercise 5: Set up the Function

Task 1: Set up the function

In this task, you will create a function that will be triggered by the externalization queue created for the app. Each order that is deposited to the queue by the Orders microservice will trigger the ProcessOrderExternalizations function. The function then persists the order to the Orders container of the Cosmos DB instance.

  1. There appears to be an issue with Azure Functions detecting Storage accounts, so before creating your function, you will manually add your Storage account connection string to the Application Settings for your Function App.

  2. In the Azure portal, browse to the Storage Account you created in Exercise 1, Step 5, then select Access keys under Settings on the side menu, and copy the key1 Connection String value, as you did previously.

  3. Now, browse to the Function App you created in Exercise 1, Step 4.

  4. Select your Function App in the side menu, then select Configuration under Configured features.

  5. On the Application Settings tab, select +New application setting, then enter contosoeventsstore in the name textbox, and paste the key1 Connection String value you copied from your Storage account into the value textbox.

  6. Scroll back to the top of the Application Settings tab, and select Save to apply the change.

  7. From the menu, hover over Functions, then select the + that appears next to Functions.

  8. Select In-portal as the development environment in the getting started blade, and select the Continue button.

  9. Select More templates and then select the Finish and view templates button.

  10. Select Azure Queue Storage trigger and then select Install when the warning that extensions are required is displayed.

    a. Continue to install any additional extensions required.

  11. In the New Function blade, enter the following:

    a. Name: ProcessOrderExternalizations

    b. Queue name: contosoevents-externalization-requests

    c. Storage account connection: Select contosoeventsstore

    d. Select Create

  12. Under the ProcessOrderExternalizations function, select Integrate.

    a. On the Integrate screen, set Message parameter name to orderItem.

    b. Select Save.

  13. While still on the Integrate screen, select +New Output.

  14. In the outputs box, locate and select Azure Cosmos DB, then choose Select.

Note: If prompted to install extensions, Select Install and wait for the extensions to finish installing.

  15. On the Azure Cosmos DB output screen, enter the following:

    a. Document parameter name: Enter orderDocument.

    b. Use function return value: Leave unchecked.

    c. Collection Name: Enter Orders.

    d. Partition key: Leave empty.

    e. Database name: Enter TicketManager.

    f. Azure Cosmos DB account connection: Select new next to the text box, and select the Cosmos DB you created in Exercise 1, Task 6.

    g. Collection throughput: Leave empty.

    h. Select Save. You should now see an Azure Queue Storage trigger and an Azure Cosmos DB output on the Integrate screen.

  16. Next, select your function from the side menu.

  17. Now, you will retrieve the code for the function from a file in Visual Studio.

  18. In Visual Studio, go to Solution Explorer, and locate ProcessTicketOrderExternalizationEvent.cs in the Azure Functions folder.

  19. Select all the code in that file (CTRL+A) and copy (CTRL+C) it.

  20. Return to your function’s page in the Azure portal, and replace the code in the run.csx block with the code you just copied from Visual Studio. Note: The ProcessOrderExternalizations function will enable you to process another order, and see that it is saved to the Orders collection of the Cosmos DB database.

  21. Select Save.

Exercise 6: Placing ticket orders

Task 1: Test the Events microservice by using Swagger

In this task, you will test the events retrieval from the application deployed in the hosted Service Fabric Cluster.

  1. In a Chrome browser on your Lab VM, navigate to the Swagger endpoint for the Events microservice exposed by the hosted Service Fabric cluster. The URL is composed of the cluster’s DNS name, the Events service port (8082), and the /swagger path.

    For example:

    http://contosoeventssf-SUFFIX.southcentralus.cloudapp.azure.com:8082/swagger

  2. Expand the Events API and expand the GET method of the /api/Events endpoint.

  3. Select the Try it out button and then select the Execute button.

Task 2: Test the Orders microservice by using Swagger

In this task, you will test the orders creation from the application deployed in the hosted Service Fabric Cluster.

  1. In a Chrome browser on your Lab VM, navigate to the Swagger endpoint for the Orders microservice exposed by the hosted Service Fabric cluster. The URL is composed of the cluster’s DNS name, the Orders service port (8083), and the /swagger path.

    For example:

    http://contosoeventssf-SUFFIX.southcentralus.cloudapp.azure.com:8083/swagger

  2. In the Orders API section, expand the POST method of the /api/Orders endpoint.

  3. Select the Try it out button.

  4. Copy the JSON below, and paste it into the order field, then select Execute.

        {
        "UserName": "johnsmith",
        "Email": "john.smith@gmail.com",
        "Tag": "Manual",
        "EventId": "EVENT1-ID-00001",
        "PaymentProcessorTokenId": "YYYTT6565661652612516125",
        "Tickets": 3
        }
    
  5. This should return an HTTP 200 response code. The Response Body includes a unique order id that can be used to track the order. Copy the Response Body value; it will be used to verify the order was persisted in Cosmos DB.

  6. Verify that the order was persisted to the Orders collection.

  7. In the Azure portal, navigate to your Cosmos DB account.

  8. Perform a query against the Orders collection, as you did previously, to verify the order exists in the collection. Replace the id in the query with the order id you copied from the Response Body above.

    Note: In order to write to the Cosmos DB database, the Azure Function must be running.
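
If you prefer to script this test instead of using the Swagger UI, the following sketch sends the same POST with only the Python standard library. The URL is a placeholder; substitute your cluster’s DNS name and SUFFIX:

```python
# Sketch: scripting the same POST the Swagger UI sends to the Orders
# microservice. The URL below is a placeholder -- substitute your
# cluster's DNS name and SUFFIX before running against a live cluster.
import json
import urllib.request

ORDERS_URL = "http://contosoeventssf-SUFFIX.southcentralus.cloudapp.azure.com:8083/api/Orders"

# The same sample order used in step 4 above.
order = {
    "UserName": "johnsmith",
    "Email": "john.smith@gmail.com",
    "Tag": "Manual",
    "EventId": "EVENT1-ID-00001",
    "PaymentProcessorTokenId": "YYYTT6565661652612516125",
    "Tickets": 3,
}

def build_order_request(url: str, payload: dict) -> urllib.request.Request:
    """Build the POST request that places an order."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_order_request(ORDERS_URL, order)
# To actually place the order (the cluster must be reachable), uncomment:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())  # expect 200 and an order id
```

As with the Swagger test, the response body is the order id you can then look up in the Orders collection.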

Exercise 7: API Management

Duration: 15 minutes

In this exercise, you will configure the API Management service in the Azure portal.

Task 1: Import Events API

In this task, you will import the Events API description to your API Management service to create an endpoint.

  1. In the Azure portal, navigate to the hands-on-lab resource group, and select your API Management Service from the list of resources.

  2. In the API Management blade, select APIs under API Management.

  3. In the APIs blade, select OpenAPI.

  4. In a browser window, open the Contoso Events API – Events Swagger page by navigating to http://contosoeventssf-SUFFIX.southcentralus.cloudapp.azure.com:8082/swagger. Copy the Swagger document location by right-clicking the /swagger/v1/swagger.json link and selecting Copy link address.

  5. Return to the Create from OpenAPI specification window, click the Full link and do the following:

    a. Paste the URL copied from Swagger into the OpenAPI specification textbox.

    b. Select HTTPS as the URL scheme.

    c. Enter events in the API URL suffix textbox.

    d. Tags: Leave empty.

    e. Select Unlimited in the Products.

    f. Select Create.

    Note: You would typically create a new product for each environment in a scenario like this one. For example, Development, Testing, Acceptance and Production (DTAP) and issue a key for your internal application usage for each environment, managed accordingly.

  6. Select Settings in the Contoso Events API – Events toolbar, update Web Service URL to point to your published API endpoint. Ensure that it uses HTTP instead of HTTPS, and select Save.

    Note: After saving, notice the URL under "Base URL". You will use this URL in your website configuration in the next exercise.

Task 2: Import Orders API

In this task, you will import the Orders API description to your API Management service to create an endpoint.

  1. In the Azure portal, navigate to the hands-on-lab resource group, and select your API Management Service from the list of resources.

  2. In the API Management blade, select APIs under API Management.

  3. In the APIs blade, select OpenAPI.

  4. In a browser window, open the Contoso Events API – Orders Swagger page by navigating to http://contosoeventssf-SUFFIX.southcentralus.cloudapp.azure.com:8083/swagger. Copy the Swagger document location by right-clicking the /swagger/v1/swagger.json link and selecting Copy link address.

  5. Return to the Create from OpenAPI specification window, click the Full link and do the following:

    a. Paste the URL copied from Swagger into the OpenAPI specification textbox.

    b. Select HTTPS as the URL scheme.

    c. Enter orders in the API URL suffix textbox.

    d. Tags: Leave empty.

    e. Select Unlimited in the Products.

    f. Select Create.

    Note: You would typically create a new product for each environment in a scenario like this one. For example, Development, Testing, Acceptance and Production (DTAP) and issue a key for your internal application usage for each environment, managed accordingly.

  6. Select Settings in the Contoso Events API – Orders toolbar, update Web Service URL to point to your published API endpoint. Ensure that it uses HTTP instead of HTTPS, and select Save.

    Note: After saving, notice the URL under "Base URL". You will use this URL in your website configuration in the next exercise.

Task 3: Retrieve the user subscription key

In this task, you will retrieve the subscription key for the client applications to call the new API Management endpoint.

  1. In the Azure portal, navigate to your API Management service, and from the Overview blade, select Developer portal from the toolbar. This will open a new browser tab, and log you into the Developer portal as an administrator, giving you the rights you need to complete the following steps.

  2. In the Developer portal, expand the Administrator menu, and then select Profile.

  3. Select Show for the Primary Key of the Unlimited subscription to reveal it.

  4. Save this key for next steps.
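
To see how a client uses this key, the following sketch builds a request to the Events API through API Management. The gateway URL and key are placeholders for the values you noted in this exercise; Ocp-Apim-Subscription-Key is the standard header API Management uses to identify the subscription:

```python
# Sketch: calling the Events API through API Management. The gateway URL
# and subscription key are placeholders -- substitute the Base URL from
# Task 1 and the Primary Key revealed in Task 3.
import urllib.request

APIM_BASE_URL = "https://contosoevents-SUFFIX.azure-api.net/"  # placeholder
SUBSCRIPTION_KEY = "<your-primary-key>"  # placeholder

def build_events_request(base_url: str, key: str) -> urllib.request.Request:
    """Build a GET request for the Events endpoint behind API Management."""
    return urllib.request.Request(
        base_url + "events/api/Events",
        headers={"Ocp-Apim-Subscription-Key": key},
    )

req = build_events_request(APIM_BASE_URL, SUBSCRIPTION_KEY)
# To execute (requires the APIs imported in Tasks 1 and 2), uncomment:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

The web application you configure in the next exercise sends this same header with every call.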

Exercise 8: Configure and publish the web application

Duration: 15 minutes

In this exercise, you will configure the website to communicate with the API Management service, deploy the application, and create an order.

Task 1: Configure the web app settings

In this task, you will update configuration settings to communicate with the API Management service. You will be guided through the instructions to find the information necessary to populate the configuration settings.

  1. Within Visual Studio Solution Explorer on your Lab VM, expand the Web folder, then expand the ContosoEvents.Web project, and open Web.config. You will update these appSettings in this file:

    <add key="apimng:BaseUrl" value="" />
    <add key="apimng:SubscriptionKey" value="" />
    
  2. For the apimng:BaseUrl key, enter the base URL of the API you created in the API Management Publisher Portal (Exercise 7, Task 1, Step 6), such as https://contosoevents-SUFFIX.azure-api.net/

    Note: Make sure to use only the domain name, and include a trailing "/" or the exercise will not work.

  3. For the apimng:SubscriptionKey key, enter the subscription key you revealed in API Management developer portal (Exercise 7, Task 3, Step 3).

  4. Save Web.config. You should have values for two of the API Management app settings.
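
With both values populated, the appSettings entries might look like the following. SUFFIX and the key value are placeholders for your own values; note the required trailing slash on the base URL:

```xml
<add key="apimng:BaseUrl" value="https://contosoevents-SUFFIX.azure-api.net/" />
<add key="apimng:SubscriptionKey" value="[YOUR SUBSCRIPTION PRIMARY KEY]" />
```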

Task 2: Running the web app and creating an order

In this task, you will test the web application calls to API Management by creating an order through the UI.

  1. Using Solution Explorer in Visual Studio, expand the Web folder, then right-click the ContosoEvents.Web project, select Debug, and then Start new instance.

    Note: If you receive an error regarding roslyn\csc.exe path, clean the Solution and rebuild.

  2. If prompted about whether you would like to trust the IIS Express SSL certificate, select Yes, then select Yes again at the Security Warning prompt.

  3. If you receive a warning in the browser that "Your connection is not private," select Advanced.

  4. Under Advanced, select Proceed to localhost (unsafe).

  5. When the application launches, you will see the website home page.

  6. Note the event presented on the home page has an Order tickets now button. Select that to place an order.

  7. Choose the number of tickets for the order, and then scroll down to see the billing fields.

  8. Enter values into the empty fields for your email, first name, last name, and Cardholder name.

  9. Select Place Order.

  10. Once the order is queued for processing, you will be redirected to a results page. It should indicate Success and show you the order id that was queued as confirmation.

  11. Close the web browser to stop debugging the application.

Task 3: Publish the web app

In this task, you will publish the web application to Azure.

  1. From the Visual Studio Solution Explorer, right-click ContosoEvents.Web, and select Publish.

  2. Select the App Service option, choose Select Existing, then select Publish.

  3. You may be prompted to log in to your Microsoft Account with access to the subscription where you created the resources for this hands-on lab. After logging in, you can select the subscription in the App Service screen.

  4. From the list below, expand the Resource Group you created previously (hands-on-lab), and select the web app contosoeventsweb-SUFFIX. Select OK.

  5. If the Publish does not start automatically, select Publish next to the Web Deploy publishing profile.

  6. When publishing is complete, your browser will launch, and navigate to the deployed web app home page. You can optionally submit another order to validate functionality works as in Task 2.

After the hands-on lab

Duration: 10 minutes

In this exercise, attendees will deprovision any Azure resources that were created in support of the lab.

Task 1: Delete the resource group

  1. Using the Azure portal, navigate to the Resource group you used throughout this hands-on lab by selecting Resource groups in the side menu.

  2. Search for the name of your resource group, and select it from the list.

  3. Select Delete in the command bar, and confirm the deletion by re-typing the Resource group name, and selecting Delete.

You should follow all steps provided in this section after attending the hands-on lab.

Attribution & License

This Hands-on Lab was originally published as the “Microservices Architecture – Developer Edition Microsoft Cloud Workshop” by Microsoft and is licensed under the MIT License.

The latest update for this Microsoft Cloud Workshop was made June 2019.


Microsoft MVP

Chris is a Microsoft MVP and has 20 years of experience designing and building Cloud & Enterprise systems. He is also a Microsoft Certified: Azure Solutions Architect, developer, Microsoft Certified Trainer (MCT), and Cloud Advocate. He has a passion for technology and sharing what he learns with others to help enable them to learn faster and be more productive.
