
Real World Azure Serverless Use Case to Implement a B2B API




Yesterday, I wrote an article about how we refactored some problematic custom functionality in Dynamics to run on Azure instead and discussed the benefits of doing so. In this article, I want to show how we can extend the same solution again using Serverless components on Azure, this time in a different way: to implement an API.


First, we need to consider the wider solution. In the first article, we focused on the invoice/order processing requirements and the architecture we implemented, which the diagram below represents.

Serverless Use Case

For this article, we need to zoom out to the wider solution. The diagram below shows how we receive invoices/orders from B2B partners. These come in through a traditional approach, supported by many of our partners, in which batch files are received. We have implemented a BizTalk solution that receives the batches via SFTP, then debatches and processes the files and loads the data into Dynamics.

Debatching

We then have the invoice/order processing that we discussed previously and finally, we have a BizTalk solution which takes completed orders/invoices and sends them to our line of business systems.

While this works well, some of our partners are now moving forward with their technology capabilities, and we also have new partners who already run on newer technology platforms. The traditional batch-based approach doesn't work well for everyone: for those used to batches it is great, but for others we need to do something different. The nature of the business means we tend to offer multi-channel approaches for partners to integrate with us, which makes us easy to work with.

Therefore, we decided to explore the options for implementing a modern API to give our partners an alternative and to gain wider adoption of our business solution.

Also, we felt that the Azure platform would have a lot of services to help us with this and we wanted to explore Serverless options again. We envisioned our API would look like the below diagram.

Invoice API

We would proxy our API with Azure API Management, which we already used anyway, and which would give our partners an excellent developer experience and make it easy for them to integrate with us. A well-documented and easy-to-use API is key to success in this area. The API we would implement would offer two operations. The first would allow the partner to submit an invoice/order. The second would allow the partner to check the status of that submission.

At this stage, we knew that the processing of the invoice/order would be asynchronous: if we were to get spikes in load where we receive a lot of invoices/orders, we would load level by queuing up requests and processing them at a rate that protects the downstream system. The partner will submit a message and get back an Id, and they can then come back later to check the status of their submission. We would also support a callback option if the partner wanted it.
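To illustrate the contract from the partner's point of view, here is a minimal sketch of the submit-then-poll interaction. The base URL, paths, header name and JSON field names are illustrative assumptions, not the final API design.

```python
# Minimal sketch of the partner-side submit-then-poll interaction.
# The base URL, paths, header name and JSON fields are illustrative assumptions.
import time

import requests

BASE_URL = "https://api.example.com/invoices"               # hypothetical APIM endpoint
HEADERS = {"Ocp-Apim-Subscription-Key": "<partner-key>"}    # typical APIM subscription key header

# 1. Submit the invoice/order and immediately receive a tracking Id.
submit = requests.post(BASE_URL, json={"invoiceNumber": "INV-1001", "lines": []}, headers=HEADERS)
submission_id = submit.json()["id"]

# 2. Come back later and poll the status endpoint; processing happens asynchronously.
while True:
    status = requests.get(f"{BASE_URL}/{submission_id}/status", headers=HEADERS).json()["status"]
    if status in ("Completed", "ValidationFailed"):
        break
    time.sleep(30)  # back off between polls

print(f"Submission {submission_id} finished with status: {status}")
```

The key point is the immediate response carrying an Id, with the actual processing happening later behind the queue.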

Order/Invoice Submission

If we take a look at how the API is built to support invoice submission, we can see that in the diagram below.

order invoice submission

  1. The partner submits a message to Azure API Management
    1. APIM forwards the message to an Azure Function, which acts as the API backend
  2. The Function records receipt of the message in an Azure Table, where the status will be maintained
    1. The Function also writes the message payload to a blob in Azure Storage, which allows us to support large messages
  3. The Function then writes a message to an Azure Service Bus queue so the received message can be processed asynchronously later. At this point, the call returns to the partner (a rough code sketch of steps 1 to 3 follows this list)
  4. Behind the scenes, a Logic App picks up the message from Service Bus
  5. The Logic App gets the payload from Blob Storage
  6. The Logic App calls a Function to insert the message into Dynamics and then updates the table to indicate that the message is now in Dynamics
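To make the first few steps more concrete, the following is a minimal sketch of what the backend Function could look like, assuming the Azure SDK clients are used directly. The table, container, queue and app setting names and the response shape are all illustrative assumptions rather than the actual implementation, which could just as easily use Function bindings.

```python
# Hypothetical sketch of the HTTP-triggered backend Function covering steps 1 to 3.
# Table, container, queue and app setting names are illustrative assumptions.
import json
import os
import uuid

import azure.functions as func
from azure.data.tables import TableClient
from azure.servicebus import ServiceBusClient, ServiceBusMessage
from azure.storage.blob import BlobClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    submission_id = str(uuid.uuid4())
    payload = req.get_body()

    # Step 2: record receipt in an Azure Table (assumes the table already exists);
    # this row is where the status is maintained from now on.
    table = TableClient.from_connection_string(
        os.environ["StorageConnection"], table_name="InvoiceStatus")
    table.create_entity({"PartitionKey": "invoice", "RowKey": submission_id, "Status": "Received"})

    # Step 2a: write the full payload to blob storage so large messages are supported.
    blob = BlobClient.from_connection_string(
        os.environ["StorageConnection"], container_name="invoices", blob_name=f"{submission_id}.json")
    blob.upload_blob(payload)

    # Step 3: queue a lightweight pointer message for asynchronous processing by the Logic App.
    with ServiceBusClient.from_connection_string(os.environ["ServiceBusConnection"]) as sb_client:
        with sb_client.get_queue_sender(queue_name="invoice-submissions") as sender:
            sender.send_messages(ServiceBusMessage(json.dumps({"id": submission_id})))

    # Return straight away with the Id the partner uses to check the status later.
    return func.HttpResponse(
        json.dumps({"id": submission_id}), status_code=202, mimetype="application/json")
```

Returning 202 Accepted with the Id reflects the asynchronous contract described above: the heavy processing happens later, behind the queue.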

Processing the Order/Invoice

Once the message is loaded into Dynamics, the existing invoice processing engine on Azure processes the received data just like in the previous article. The difference is in how the notifications are published to Service Bus: at various points in the processing, the engine publishes to a topic, which allows us to route the message to multiple queues using pub/sub. From there we can implement Logic Apps that process the messages and update the Azure Table to record the current status of the invoice/order.

This process updates the table to indicate whether validation failed or the processing was successful. The picture below shows how this works.

Processing order invoice

  1. The invoice processing engine publishes status update events to Service Bus topics, and we route these to queues for our new API solution
  2. Logic Apps pick up those messages and update the table with the status changes (illustrated in the sketch below)
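In our solution these updates are done in Logic Apps using the Service Bus and Table Storage connectors, so there is no code to show. Purely as an illustration of the logic involved, a rough code equivalent might look like the sketch below; the event shape, queue name and table name are assumptions.

```python
# Illustrative code equivalent of what the status-update Logic Apps do.
# The event shape, queue name and table name are assumptions for the sketch.
import json
import os

from azure.data.tables import TableClient, UpdateMode
from azure.servicebus import ServiceBusClient

table = TableClient.from_connection_string(
    os.environ["StorageConnection"], table_name="InvoiceStatus")

with ServiceBusClient.from_connection_string(os.environ["ServiceBusConnection"]) as sb_client:
    with sb_client.get_queue_receiver(queue_name="invoice-status-updates") as receiver:
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            event = json.loads(str(message))  # e.g. {"id": "...", "status": "ValidationFailed"}

            # Merge the new status onto the row created when the submission was received.
            table.update_entity(
                {"PartitionKey": "invoice", "RowKey": event["id"], "Status": event["status"]},
                mode=UpdateMode.MERGE)
            receiver.complete_message(message)
```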

Partner gets status

The next part of the implementation allows the partner to check the status of their submitted message. This part was easy: the partner submits the Id they received when originally submitting the invoice/order, the request goes through APIM to an Azure Function, and the Function looks up the status in the Azure Table and returns it.

Partner Status

Implementing this particular piece in an Azure Function was really simple and performs well. Table Storage offers very low latency, the Function adds little on top, and there is very little overhead in this query.
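As a rough illustration, the status lookup could be sketched like this; the route parameter, table name and response shape are assumptions for the example.

```python
# Hypothetical sketch of the status-check Function: look up the row by Id and return it.
# The route parameter, table name and response shape are assumptions.
import json
import os

import azure.functions as func
from azure.core.exceptions import ResourceNotFoundError
from azure.data.tables import TableClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    submission_id = req.route_params.get("id")  # assumes a route like invoices/{id}/status

    table = TableClient.from_connection_string(
        os.environ["StorageConnection"], table_name="InvoiceStatus")
    try:
        entity = table.get_entity(partition_key="invoice", row_key=submission_id)
    except ResourceNotFoundError:
        return func.HttpResponse("Unknown submission id", status_code=404)

    body = json.dumps({"id": submission_id, "status": entity["Status"]})
    return func.HttpResponse(body, mimetype="application/json")
```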

Partner Callback

In the previous section, we discussed the partner checking the status of their submission. The requirement exists because we have an asynchronous process in the background, which allows us to do several good things architecturally and makes the solution work well. The tradeoff is that the partner submitting to the API does not get an immediate result. Sometimes this is difficult for businesses to understand; the transition from an RPC style to an asynchronous architecture is not an insignificant leap in thinking for the non-technical people involved in your project. The tradeoff, however, has many benefits and can save a significant amount of money by not having to scale for peak load.

While the long polling approach can be a bit of a pain, we can also easily support a callback mechanism. The diagram below shows how we implemented this.

Partner Callback

  1. The invoice processing engine publishes events like we discussed earlier
  2. When the Logic App processes the event, if the partner has a callback URL registered with us, we send them a message to tell them their submission is done

The good thing in this case is the durability of the Logic App: we get replay support and other resilience features, which makes the callback quite reliable.

From the partner's perspective, implementing and offering up a secure HTTP endpoint is a lot easier today than it used to be. For example, they could simply give us an Azure Function URL to call back to, then pop the message on a queue and process it in the background.
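As an illustration of how little the partner would need to build, here is a hypothetical sketch of such a callback receiver: an HTTP-triggered Function that accepts the callback and drops it on a storage queue for background processing. The queue name and app setting are assumptions.

```python
# Hypothetical sketch of a partner-side callback receiver: accept the callback over
# HTTP and drop it on a storage queue for background processing. The queue name and
# app setting are assumptions.
import os

import azure.functions as func
from azure.storage.queue import QueueClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    callback_body = req.get_body().decode("utf-8")

    # Queue the callback payload so the partner's own systems can process it later.
    queue = QueueClient.from_connection_string(
        os.environ["StorageConnection"], queue_name="invoice-callbacks")
    queue.send_message(callback_body)

    # Acknowledge quickly; any heavy lifting happens in the background.
    return func.HttpResponse(status_code=202)
```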

Where’s the value

I hope you can see from an architecture perspective how easy this solution is to implement. There are only a couple of Serverless services involved, mainly Logic Apps and Functions. The solution is simple, yet because of the asynchronous approach we can accept a very high volume of inbound messages from partners: the Function, APIM and the queue/storage tier will handle significant load, and we then process in the background at a rate that is acceptable for Dynamics and our solution. This is a big deal as our solution scales and adoption grows.

The result is a high-performing, reliable API which gives partners a good experience.

In terms of productivity, I can give a good example of the difference Serverless offers. In the real world, I was able to go from zero to having this API solution built, functional, and able to process 10,000 messages in our system test environment in approximately 2 days. That includes good things like automated build and automated deployment for all test environments. Now that is a statement of how Serverless lets you focus on the solution rather than getting sidetracked by non-functional and technical problems.

Wrap Up

Hopefully, this will inspire others to consider the Serverless technologies on Azure when looking to take advantage of new opportunities. In addition to all of the benefits, the fail-fast approach we take means that at any point, if the solution did not work out, we could just bin it all and would have hardly spent any money.


This article was originally published on Oct 24, 2018. It was most recently updated on Feb 28, 2024.
