
Azure Functions in the Portal – ALM

One of the advantages of Azure is that, for some use cases, you can develop solutions directly in the Azure Portal.  The benefit is that you can focus on writing code without worrying about versions of Visual Studio, extensions, and all of the other overheads that turn a few simple lines of code, which anyone could write, into something requiring an additional level of developer skill. Let’s face it: ALM processes have been around for years, but there are still large portions of the developer community who don’t follow them.

The ability to just get the job done in the portal is compelling, and I expect we will see more of it in the future, but it does give you a challenge when it comes to ALM activities such as keeping a safe version of the code and being able to move between environments reliably.

In this article, I want to explore how to develop an Azure Function in the portal while still applying some basic ALM-type activities. These add only a minor overhead but give me some good practices, so that developing in the portal would be acceptable in the real world.

I am going to start with the most basic example, and then in a future article we will look at something more complex.

My Process

The process I am going to follow is as follows:

  • I will have a development resource group which will contain my Azure Function and the code for it
  • The resource group will also contain other assets for the function, such as Application Insights and Storage
  • I will create a second resource group called Test. In my build process, I will refresh the Test resource group with the latest version so I can do some testing
  • Once I am happy with the Test resource group, I will execute a release pipeline which copies the latest version of the function to other environments, such as UAT, which I assume are used by other testers

To summarise the pipeline usage see below:

  • Dev -> Test = Build Pipeline
  • Test -> UAT and beyond = Release Pipeline.

Assumptions

I am going to make a few assumptions:

  • The function apps and Azure resources will be created by hand in advance
  • Any config settings will be added to the function apps by hand.

In this first example, we assume everything is straightforward and we can just update the code between environments.  In a future article, we will look at some more complex scenarios.

Walk-through

To begin the walk-through, let’s have a look at the code for our function below:

Azure Function in the portal

You can see this is a very simple function which just reads some config settings and returns them.
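As a rough illustration of that kind of function (a sketch only – the original may well be written in C# script, and the setting names below are assumptions, not from the original), a PowerShell HTTP-triggered function reading and returning config settings could look like this:

```powershell
# run.ps1 for an HTTP-triggered PowerShell function (illustrative sketch;
# AppSetting1 / AppSetting2 are placeholder setting names)
using namespace System.Net

param($Request, $TriggerMetadata)

# App settings configured on the function app surface as environment variables
$body = @{
    AppSetting1 = $env:AppSetting1
    AppSetting2 = $env:AppSetting2
} | ConvertTo-Json

# Return the settings to the caller
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $body
})
```

The point is simply that the function has no build step: everything it needs lives in the portal editor and the function app's configuration.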

Build Process

From here we need to go to our Build process in Azure DevOps.  The build process looks like the following:

Build process in Azure DevOps

I have defined a build process which I could use for any function app in the portal.  I would simply need to change the variables and subscription references, and it could easily be reused via the Clone function.

The build executes the following steps:

  • Show all build variables = I use this for troubleshooting as it shows the values for all build variables
  • Export code = This uses the App Service Kudu API features to download the source code for the function app as a zip file
  • Publish Artifact = This attaches the zip file to the build so I can use it in Release pipelines later
  • Azure Function App Deploy = This will deploy the zip file to the Test function app so that I can do some manual testing if I want.
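The original pipeline was built in the classic (visual) editor, but the same four steps could be sketched as a YAML pipeline roughly as follows (the service connection, app names, and script path are placeholders, not values from the original):

```yaml
# Sketch only – names below are assumptions for illustration
variables:
  testFunctionApp: 'my-func-test'   # placeholder

steps:
  # 1. Show all build variables (troubleshooting)
  - powershell: 'Get-ChildItem Env: | Sort-Object Name'
    displayName: 'Show all build variables'

  # 2. Export code from the dev function app via the Kudu zip API
  - powershell: ./export-function-code.ps1   # placeholder script path
    displayName: 'Export code'

  # 3. Attach the zip to the build for use in Release pipelines
  - publish: '$(Build.ArtifactStagingDirectory)/function.zip'
    artifact: drop
    displayName: 'Publish Artifact'

  # 4. Deploy the zip to the Test function app
  - task: AzureFunctionApp@1
    displayName: 'Azure Function App Deploy'
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder
      appType: 'functionApp'
      appName: '$(testFunctionApp)'
      package: '$(Build.ArtifactStagingDirectory)/function.zip'
      deploymentMethod: 'zipDeploy'
```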

A closer look at Export

I think the Export Function Code step in the build process warrants a closer look.  This step uses PowerShell to execute a web request that downloads the code as a zip file.  I have used the publish profile of the function app to get the publishing credentials, which I save as build variables and then use as a basic authentication header for the web request that does the download. See the piece of code below.
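The screenshot of the script is not reproduced here, but a sketch of what that step might look like is below, assuming the publishing user name and password have been stored in build variables called PublishUserName and PublishPassword, and that the dev function app is called my-func-dev (all placeholder names):

```powershell
# Sketch – $(PublishUserName), $(PublishPassword) and the app name
# are placeholders fed in from build variables, not real values
$userName = "$(PublishUserName)"
$password = "$(PublishPassword)"
$appName  = "my-func-dev"   # placeholder

# Build a basic authentication header from the publishing credentials
$pair    = "{0}:{1}" -f $userName, $password
$token   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
$headers = @{ Authorization = "Basic $token" }

# Kudu's zip API returns the contents of a folder as a zip file;
# site/wwwroot holds the function app's code
$url = "https://$appName.scm.azurewebsites.net/api/zip/site/wwwroot/"
Invoke-WebRequest -Uri $url -Headers $headers `
    -OutFile "$(Build.ArtifactStagingDirectory)/function.zip"
```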

A closer look at Azure Function App Deploy

The Azure Function App Deploy step simply uses the out-of-the-box task.  I point it at the zip file downloaded in the earlier step, and it deploys it for me automatically.  I have set the deployment type on this task to Zip Deploy.

Function App Deploy


Release Process

We now have a repeatable build process which takes the latest version of the code from my development function app, pushes it to the test instance, and packages the zip file so that I can release this version of the code at some future point.

To do the release to other environments I have an Azure DevOps Release pipeline. You can see this below:

Azure devops release pipeline

The Release pipeline contains a reference to the build output it should use, and then a set of tasks for each environment we want to deploy to.  In this case, it’s just UAT.  The UAT release process looks like the following:

UAT release

You can see that in this case the Release process is very simple; really, it’s a cut-down version of the Build process.  I download the artifact we saved in the build, and then use the Azure Function App Deploy task to copy the function to the UAT function app.  It is the same out-of-the-box configuration as in the Build process above, but this time pointing at the UAT function app.
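Sketched in YAML (the original uses a classic Release pipeline; the stage, service connection, and app names here are placeholders for illustration), the UAT deployment amounts to:

```yaml
# Sketch only – names are placeholders
stages:
  - stage: UAT
    jobs:
      - deployment: DeployFunction
        environment: 'UAT'
        strategy:
          runOnce:
            deploy:
              steps:
                # The 'drop' artifact saved by the build is downloaded
                # automatically for deployment jobs
                - task: AzureFunctionApp@1
                  inputs:
                    azureSubscription: 'my-service-connection'  # placeholder
                    appType: 'functionApp'
                    appName: 'my-func-uat'                      # placeholder
                    package: '$(Pipeline.Workspace)/drop/function.zip'
                    deploymentMethod: 'zipDeploy'
```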

I now just need to run the Release process to deploy the function to other environments.

Limitations

  • I am not using Visual Studio, so I am unlikely to be doing much automated testing of my functions. I could potentially look at doing something in this area, but it is out of the scope of this article
  • I am not keeping the code in source control in this article. I am happy that the zip file attached to the build is sufficient.  I could look at saving the zip to source control, or unpacking it and committing the files, if I wanted
  • I am not using any continuous integration here. You could perhaps monitor Azure events with Logic Apps and develop your own trigger.

Summary

Hopefully, you can see that it is very simple to implement the most basic of ALM processes for your develop-in-the-portal work, which will add some maturity to it.  In future articles, we will look at some more mature options.

Author: Michael Stephenson

Mike is a highly experienced leader when it comes to delivering real-world cloud solutions, having worked on more than 40 projects with many large customers. He has worked with Microsoft Azure technologies since they first came out, has led the technical side of cloud adoption at a number of companies, and has helped those companies become recognized by Microsoft as the benchmark for cloud adoption in their industry sectors.