
Serverless Use Cases and Best Practices

Mohan Nagaraj

Last modified on October 4th, 2023

This blog covers the session "Serverless use cases and best practices" presented by Eduardo Laureano (Microsoft), Thiago Almeida (Microsoft), and Nick Lizotte (Dun & Bradstreet) at Microsoft Ignite 2018.

Scenario 1: Event Stream Processing – IoT

The GLAS Smart Thermostat by Johnson Controls uses Azure Serverless to connect the room thermostat with customers' handheld devices. The GLAS Smart Thermostat monitors and reports indoor and outdoor air quality so that you can manage the air you breathe. Initially, Johnson Controls used VMs to connect the thermostat to the customers' devices and process the data between them. Given their rapid growth in the expanding consumer market, they made the shift to Serverless.

The team tested this scenario with a simulation of 20,000 IoT devices in addition to the real devices.


In this architecture, you can see four different activities, each handled by Azure Serverless components such as Azure Event Hubs, Functions, Event Grid, Cosmos DB, and API Management. All communication between the thermostat, Azure, and the customer's device goes through the IoT Hub. As a first step, when the customer installs a thermostat, the JCI GLAS device sends a message to the IoT Hub to register the device. This event triggers a Function app to save the data in Cosmos DB. The Function app also raises an event into Event Grid to trigger follow-up actions, such as sending a registration success message to the customer's mobile device.


Image credits – Microsoft
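The registration step above can be sketched in plain Python. This is a simplified, illustrative stand-in, not the JCI code: an in-memory dictionary plays the role of Cosmos DB, and a list of subscriber callbacks plays the role of Event Grid subscriptions.

```python
from datetime import datetime, timezone

device_store = {}        # stands in for the Cosmos DB container
event_subscribers = []   # stands in for Event Grid subscriptions
notifications = []       # stands in for the customer's mobile device

def on_event(handler):
    """Register a handler, the way an Event Grid subscription delivers events."""
    event_subscribers.append(handler)
    return handler

def register_device(message: dict) -> None:
    """Function app logic: persist the device record, then raise an event."""
    device_id = message["device_id"]
    device_store[device_id] = {
        "device_id": device_id,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    event = {"type": "DeviceRegistered", "device_id": device_id}
    for handler in event_subscribers:
        handler(event)

@on_event
def notify_mobile(event: dict) -> None:
    """Downstream subscriber: push a success message to the customer's phone."""
    if event["type"] == "DeviceRegistered":
        notifications.append(f"Device {event['device_id']} registered successfully")

register_device({"device_id": "glas-001"})
```

The key design point survives the simplification: the function that persists the record does not know who consumes the event, so new subscribers can be added without touching the registration code.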

The second step is to pair the JCI GLAS device with the customer's mobile device. Here, the IoT Hub routes the request through a dedicated pairing Event Hub and a Functions app, which raises a pairing event into Event Grid.

Finally, the user sets the temperature range within which the device should trigger a message; this setting is forwarded into Azure through the IoT Hub. The Event Hub not only sends the settings to Cosmos DB through Functions but also raises an event into Event Grid and integrates with the SignalR Service to update the user's mobile device.

The next flow sends a message from Azure to the GLAS device, typically a local weather update from the internet. In this case, the weather update is raised as an event into Event Grid, which triggers an Azure Function to check the cache, verify that the settings are up to date, and send the message to the GLAS device through the IoT Hub.

Here you can see that all the messages flow through one IoT Hub into three Event Hubs based on the message type. An alternative is to have messages from the IoT Hub trigger Azure Functions directly and segregate them with if/else logic in code. But given the routing and throttling mechanisms that Event Hubs provide, it makes sense to use Event Hubs.
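The difference between the two approaches can be sketched in a few lines. This hedged, stdlib-only sketch (hub names and message types are invented for illustration) shows routing as a declarative table, in place of a single function full of if/else branches:

```python
from collections import defaultdict

# Three in-memory "Event Hubs", one per message type, echoing the JCI design.
hubs = defaultdict(list)

# Declarative routing table: adding a message type means adding one entry,
# not another if/else branch inside a processing function.
ROUTES = {
    "registration": "registration-hub",
    "pairing": "pairing-hub",
    "settings": "settings-hub",
}

def route(message: dict) -> str:
    """Route a device message to the hub that handles its type."""
    hub = ROUTES[message["type"]]
    hubs[hub].append(message)
    return hub

route({"type": "pairing", "device_id": "glas-001"})
route({"type": "settings", "device_id": "glas-001", "range": (18, 24)})
```

In the real architecture, each hub also gets its own throttling and consumer scaling, which the if/else-in-one-function approach cannot provide.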

Lessons Learnt

  1. Serverless apps are composed of multiple parts: Compute (Functions), Data (Storage, Cosmos DB, Azure SQL), and Events (Event Grid, Event Hubs).
  2. Event processing can be done by a series of functions or by combining functions with other services such as Kafka or Stream Analytics.
  3. Implement logging and telemetry to follow the message throughout the whole workflow with timestamps and a correlation ID; this allows you to trace a message all the way from the device.
  4. Use poison queues/dead-lettering to deal with messages that fail in the processing phase.
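Points 3 and 4 can be combined into one small sketch. This is an illustrative, stdlib-only simulation (the retry limit and queue structures are assumptions, not from the session): an in-memory queue stands in for a storage queue, and messages that fail repeatedly are moved to a poison queue instead of blocking the pipeline.

```python
import logging
import uuid
from collections import deque

MAX_ATTEMPTS = 3                 # illustrative dead-letter threshold
queue, poison_queue = deque(), deque()

def process(message: dict) -> None:
    """Stand-in for the real processing step; rejects malformed payloads."""
    if message.get("malformed"):
        raise ValueError("cannot parse payload")

def pump() -> None:
    """Drain the queue; after MAX_ATTEMPTS failures a message is dead-lettered."""
    while queue:
        msg = queue.popleft()
        # Correlation ID lets telemetry follow one message end to end.
        cid = msg.setdefault("correlation_id", str(uuid.uuid4()))
        try:
            process(msg)
            logging.info("processed message %s", cid)
        except ValueError:
            msg["attempts"] = msg.get("attempts", 0) + 1
            if msg["attempts"] >= MAX_ATTEMPTS:
                poison_queue.append(msg)   # park it for later inspection
            else:
                queue.append(msg)          # retry

queue.append({"device_id": "glas-001"})
queue.append({"device_id": "glas-002", "malformed": True})
pump()
```

Azure Storage queues and Service Bus implement this pattern natively (poison queues and dead-letter sub-queues respectively); the sketch only shows why the pattern keeps one bad message from stalling the whole stream.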

Scenario 2: Dun & Bradstreet

The Dun & Bradstreet Corporation is a company that provides commercial data, analytics, and insights for businesses. The Data Universal Numbering System (DUNS) is their proprietary system that assigns a unique numeric identifier, referred to as a "DUNS number", to a single business entity. The DUNS system holds about 300 million business records from about 30,000 sources, updated 5 million times a day. These numbers alone show the importance of this data to the finance industry. To manage these records, they run a product called D&B Optimizer on Microsoft platforms. It is essentially a data management system that provides real-time enrichment of records in Dynamics 365.


The data handled here is very sensitive, as it includes their customers' financial records, so it is important to understand the technologies they use in this project. The first is the App Service Environment, an isolated deployment that allows Function apps to run inside a subnet. It uses the security features of virtual networks to control inbound and outbound communication. With the Functions hosted in a subnet of a private network, public internet access to the SQL database is denied; access is allowed only from specific subnets of the virtual network using Service Endpoints.


Image credits – Dun & Bradstreet

These applications require several sets of credentials to authenticate against services such as Redis Cache, Azure SQL Server, and internal data services. D&B uses Key Vault to secure and store these credentials. Every Function app in this architecture has a system-assigned Managed Identity that is used to authenticate with Key Vault through Azure AD. This keeps the credentials out of the code.

When a user creates or updates a record in Dynamics 365, the data moves to Azure Service Bus. There, a series of Queues and Topics orchestrates a chain of Azure Functions that identify the company and enrich the data back into Dynamics 365.



Image credits – Dun & Bradstreet

There are two use cases to consider here. The first is Transactional, where the customer actively creates or updates records in Dynamics 365. The second is Batch, where the customer wants to enrich data in bulk that is already in Dynamics 365. This design raised three challenges:

  1. Transactional operations delayed by batch operations – Both transactional and batch operations use the same queuing mechanism. The problem is this doesn’t scale well because the transactional operations can be delayed by a large number of batch operations.
  2. Extending enrichment workflow is cumbersome – Each function in this architecture is important in orchestrating the next step in the workflow. Therefore, adding new functions in the workflow is difficult.
  3. Reporting progress of bulk operations is hard – With plain queues and functions, there is no straightforward way to tell how far a bulk enrichment operation has progressed.
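The first challenge above is a classic head-of-line blocking problem. D&B's actual fix was Durable Functions (next section), but the underlying idea can be sketched with a stdlib priority queue, where transactional work always dequeues ahead of batch work. All names here are illustrative:

```python
import heapq
import itertools

_counter = itertools.count()  # tie-breaker preserving FIFO within a priority
_pq = []

def enqueue(record: str, batch: bool = False) -> None:
    """Transactional work gets priority 0, batch work priority 1."""
    heapq.heappush(_pq, (1 if batch else 0, next(_counter), record))

def dequeue() -> str:
    return heapq.heappop(_pq)[2]

for i in range(3):
    enqueue(f"bulk-{i}", batch=True)   # a large batch job arrives first
enqueue("live-update")                  # then a single transactional update

order = [dequeue() for _ in range(4)]
# The transactional record is processed before the earlier batch records.
```

With a single shared FIFO queue, `live-update` would have waited behind all three batch records; separating priorities (or, in Service Bus terms, using separate queues for each workload) removes that delay.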

Orchestrations with Durable Functions

They addressed these challenges by using Durable Functions in their architecture.


Image credits – Dun & Bradstreet
  1. Orchestrations are defined in code and are easily extended – In the above diagram, you can see they have created a Matching Orchestrator. This function is responsible for the order in which the activity functions execute. This means all the activity functions can follow the single responsibility principle, making it a lot easier to extend the workflow. For example, they added a normalization function solely to improve the batch match rates.
  2. Enrichment workflow is dynamic – The orchestrator can determine which activity to run based on its input or on the results of other activity functions, making the workflow dynamic.
  3. The Bulk orchestration calls Matching as a sub-orchestration to make batch operations faster.
  4. They used the fan-out/fan-in pattern, executing several orchestrations in parallel and then aggregating the results in the backend. This makes it possible to report progress of the bulk operations.

Lessons Learnt

  1. The App Service Environment and Service Endpoints allow you to secure Azure service resources to a virtual network.
  2. You can use Managed Identities to authenticate to any service that supports Azure AD authentication, without any credentials in code.
  3. Use durable functions to solve orchestration problems.

Scenario 3: Long Running and Stateful Serverless Functions

The third scenario is from a Brazilian customer communications organization that sends out millions of billing statements and communications each month on behalf of its customers. Essentially, they escalate billing and payment messages daily to remind the end customer. This organization augmented its Microsoft Azure-based infrastructure with Azure Functions to make its processes faster and more cost-effective. As a result, it handles 5 times more transactions for less than a quarter of its previous costs, and it can develop and modify processes and features 40% faster. This is a huge competitive advantage in a business where speed and volume are everything.

To cut down the compute costs, they moved their web platform from co-location data centers to Microsoft Azure a few years ago. They used Azure IaaS and PaaS offerings to gain on-demand compute scalability, web hosting, and messaging services. However, they wanted to push compute costs even lower, which they did by using serverless computing and Azure Functions. Azure Functions makes it possible to run small pieces of business logic, or functions, in the cloud.

Escalation workflow:

Usually, payment collection works as an escalation workflow: they send one reminder/escalation message every day. They did not want dedicated software and infrastructure for a process that runs for only a couple of seconds a day.

Therefore, they used a simple orchestrator Function (a Durable Function) which sends a reminder message on day 1. The next day, the function checks whether the customer has paid the bill; if not, it sends a more serious message. Once the customer has paid the bill, the loop stops by sending a thank-you email.


Image credits – Microsoft
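The loop above can be sketched as plain Python. This is a deliberately simplified simulation (message wording and the day limit are invented): in the real Durable Function, each iteration would end with a durable timer that sleeps until the next day, while here the days simply tick in a loop.

```python
def escalation_workflow(has_paid, max_days: int = 5) -> list:
    """Send one message per simulated day, escalating in severity,
    and stop with a thank-you email once the bill is paid."""
    messages = []
    for day in range(1, max_days + 1):
        if has_paid(day):                       # check payment status each morning
            messages.append("thank-you email")
            break
        if day == 1:
            messages.append("friendly reminder")
        else:
            messages.append(f"escalation level {day - 1}")
    return messages

# Simulate a customer who settles the bill on day 3.
sent = escalation_workflow(lambda day: day >= 3)
```

The appeal of Durable Functions here is that this loop, including the day-long sleeps, runs as ordinary code while costing almost nothing between iterations.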

You can achieve this workflow in other ways too, but Durable Functions is much easier and scales faster.

  1. Logic Apps – Some designers prefer working visually and choose Logic Apps. With 200+ connectors available, you often need not write API integrations yourself.
  2. Functions + Messaging

Durable Functions Components

Durable Functions has three components: the Starter Function, the Orchestrator Function, and the Activity Functions. The starter function triggers the orchestrator to begin execution; once the day's message is sent, the orchestrator goes to sleep for the rest of the day. This reduces the cost significantly.


Image credits – Microsoft
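The three components and their relationship can be mimicked with a Python generator. This is an illustrative simulation, not the Azure SDK: the orchestrator yields requests for activity work, and a tiny "runtime" inside the starter executes each activity and feeds the result back, the way the Durable Task framework replays an orchestration.

```python
def send_reminder(day: int) -> str:
    """Activity function: does the actual work."""
    return f"reminder sent on day {day}"

def orchestrator():
    """Orchestrator function: defines the order of activities. Between
    yields it holds no compute, which is what keeps orchestrations cheap."""
    log = []
    for day in (1, 2):
        result = yield ("send_reminder", day)   # request an activity run
        log.append(result)
    return log

def starter():
    """Starter function: kicks off the orchestration and drives it."""
    activities = {"send_reminder": send_reminder}
    gen = orchestrator()
    request = next(gen)
    while True:
        name, arg = request
        try:
            request = gen.send(activities[name](arg))  # run activity, resume
        except StopIteration as done:
            return done.value

history = starter()
```

In real Durable Functions, the framework additionally checkpoints each yield to storage, so the orchestrator can sleep for a day (or a month) and resume exactly where it left off.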

Lessons Learnt

  1. Different orchestration methods fit different needs: code-first (Durable Functions) vs design-first (Logic Apps). Developers who are comfortable with coding tend to choose Durable Functions.
  2. Serverless/FaaS was a barrier for many scenarios due to the lack of state management. But with concepts like Durable Functions, even scenarios and architectures that look complex can be made easy with Serverless.
  3. Even if you are new to Functions, the durable concepts are easy to start with. The Azure portal helps you create Durable Functions much as it does regular Functions.

Takeaways from the session

Certainly, the serverless ecosystem's evolution has enabled a wide range of rich scenarios. Serverless is growing very fast, and there are plenty of resources to help you stay up to date. Above all, you can contribute to the growth of serverless in forums such as GitHub, MSDN, and Stack Overflow.


This article was originally published on Oct 17, 2018. It was most recently updated on October 4th, 2023.
