Creating a Processing Pipeline with Azure Functions and AIS

|  Posted: May 13, 2019  |  Categories: Azure

This is a short introduction of a session to be presented by Wagner Silveira at the upcoming INTEGRATE 2019

A year or so ago I was asked by a client to help design a solution that would process a large volume of EDI messages, enriching each one with data from a SQL Server database in the cloud and pushing it into a big data repository for reporting and data mining. They had some hard requirements, but beyond those the solution was pretty much up for grabs:

  • The message would need to be received via a secure HTTP interface.
  • A copy of the message should be stored for auditability.
  • After the message was successfully received, the process should guarantee the delivery of the message.
  • An invalid message should be rejected at the beginning of the process.
  • The solution should scale without requiring much intervention.
  • The operations team should be able to trace the process and understand where a single message was in the process and be notified in case of any failures.
  • The process should allow for extended validation and extra processing steps. It should also allow the big data repository, or the processing of the data, to be replaced without a huge impact on the system.

My first reaction was that this would be a typical Azure Integration Services solution:

  • API Management would provide the external, secure HTTP interface
  • The guaranteed delivery of the message would be provided by Service Bus
  • The processing of each message could be implemented using Logic Apps.

Matching this with some other Azure components, such as Azure Storage to provide the auditability and Azure SQL to combine the incoming data with the external database, would complete the design. So, easy-peasy, right? Well, it wouldn’t be much of a session if it was.
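The shape of that initial design can be sketched with in-memory stand-ins for the Azure services. All names below (`receive_message`, `process_next`, the sample customer data) are illustrative, not part of the actual solution:

```python
import json
import queue
import uuid

# In-memory stand-ins for the Azure building blocks: in the real design,
# API Management fronts the HTTP endpoint, Azure Storage keeps the audit
# copy, Service Bus guarantees delivery, and a Logic App does the work.
audit_store = {}                 # stand-in for Azure Storage (auditability)
delivery_queue = queue.Queue()   # stand-in for Service Bus (guaranteed delivery)
reference_db = {"CUST-001": {"name": "Contoso"}}  # stand-in for Azure SQL
big_data_repo = []               # stand-in for the big data repository

def receive_message(raw: str) -> str:
    """HTTP ingress: reject invalid messages up front, audit, then enqueue."""
    try:
        message = json.loads(raw)  # minimal validation at the edge
    except json.JSONDecodeError:
        raise ValueError("invalid message rejected at the start of the process")
    message_id = str(uuid.uuid4())
    audit_store[message_id] = raw  # copy stored for auditability
    delivery_queue.put((message_id, message))
    return message_id

def process_next() -> None:
    """Workflow step: enrich from the reference database and push onward."""
    message_id, message = delivery_queue.get()
    enrichment = reference_db.get(message["customer"], {})
    big_data_repo.append({**message, **enrichment, "id": message_id})

msg_id = receive_message('{"customer": "CUST-001", "total": 42}')
process_next()
```

Note how each requirement maps to one seam in the sketch: validation happens before anything is stored, the audit copy is written before the queue, and only the queue connects ingress to processing, so either side can be swapped out independently.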

As soon as we started to delve into the details, it became apparent that a couple of constraints would break my easy-peasy, straightforward initial solution:

  • The message sizes were too big to use Service Bus as the messaging repository.
  • The volume of messages and the number of steps required to process each one would imply a sizeable bill for Logic Apps, which could be hard to swallow.

Suddenly I was without a workflow engine and a message repository. But nothing that a bit of lateral thinking and some well-known integration techniques wouldn’t fix.
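One well-known technique for the message-size constraint (not necessarily the one in the final design, and that's the session) is the claim-check pattern: park the oversized payload in blob storage and pass only a small reference through the messaging layer, so the broker's size limit stops mattering. A minimal sketch, again with a dictionary and a queue standing in for Blob Storage and Service Bus (all names are illustrative):

```python
import queue
import uuid

blob_storage = {}              # stand-in for Azure Blob Storage
message_queue = queue.Queue()  # stand-in for Service Bus

def send_with_claim_check(payload: bytes) -> None:
    """Producer: store the large payload, enqueue only a small claim ticket."""
    ticket = str(uuid.uuid4())
    blob_storage[ticket] = payload
    message_queue.put({"claim_check": ticket, "size": len(payload)})

def receive_with_claim_check() -> bytes:
    """Consumer: redeem the ticket to fetch the real payload from storage."""
    envelope = message_queue.get()
    return blob_storage.pop(envelope["claim_check"])

large_edi = b"ISA*00*" + b"X" * 10_000_000  # far beyond Service Bus size limits
send_with_claim_check(large_edi)
received = receive_with_claim_check()
```

The queue now only ever carries tiny envelopes, so you keep the guaranteed-delivery semantics of the broker while the heavy bytes live in storage built for them.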

Want to know the rest of the story? Come to my session at INTEGRATE 2019 in London. I promise to tell you how I solved this problem, and how I would tackle it differently with the latest technologies available.

Author: Wagner Silveira

Wagner Silveira is the Principal Integration Architect at Theta and a Microsoft Azure MVP. He also holds Microsoft Certified Solutions Associate (MCSA): Cloud Platform and Microsoft Certified Technology Specialist (MCTS): BizTalk Server 2010 certifications. Wagner is an active member of the Auckland Connected Systems User Group (ACSUG) and the MSDN Forums, one of the organizers of New Zealand community events like Integration Saturday and Global Integration Bootcamp - Auckland, and a presenter at the Global Azure Bootcamp - Auckland.