Integrate 2019 – Day 3
After an exciting Day 1 and Day 2 at INTEGRATE 2019, the stage was perfectly set for the last day of the event. Before you proceed, we recommend reading the recaps of Day 1 and Day 2.


Scripting a BizTalk Server installation – Samuel Kastberg

After two days dominated by cloud topics at INTEGRATE 2019, Samuel Kastberg, Senior Premier Field Engineer at Microsoft, kick-started the last day with the first session, Scripting a BizTalk Server Installation.

Installing and configuring BizTalk Server is not straightforward and can take a lot of time. Scripted installations cause fewer errors than manual ones; you can run the same scripts in different environments and get everything in place, which lowers both time and cost.

What you should script:

  1. Things you can control
  2. Things you want to be the same over and over again
  3. Good candidates
    • Windows features
    • Provisioning VMs in Azure (if the environment needs to be created in Azure)
    • BizTalk features and group configuration
    • MSDTC settings, hosts, host instances
    • Visual Studio, SQL Server
    • Your favorite monitoring tool
  4. Bad candidates
    • Things that are going to change over time

Samuel advised considering the following points before you start:

  1. Decide what your main drivers are – repetition and control are the drivers to automate, along with standardized developer machines, disaster recovery preparation, and test environments
  2. Decide what your baseline is and document it – think about what could change in 6 months or a year
  3. Document the execution process – scripting is not a replacement for documentation
  4. Set a timeframe for your work

Good Practices

  1. The main code should orchestrate the process – create functions for tasks (see the sketch after this list)
  2. Name scripts to show the order in which you are going to run them
  3. Write a module for the common functions you run over and over (logging, prompts, checks, etc.)
  4. Use a common timestamp when generating files
  5. Debugging is a good friend (step through before you run)
  6. Be moderate with error handling
  7. Add reminders to information sources
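
To make these practices concrete, here is a minimal PowerShell sketch of such a main script; the module and function names (BizTalkSetup.psm1, Install-Prerequisites, and so on) are hypothetical placeholders, not Samuel's actual code:

```powershell
# Minimal sketch: the main script orchestrates, functions do the work.
# BizTalkSetup.psm1 and the task functions are hypothetical placeholders.
Import-Module "$PSScriptRoot\BizTalkSetup.psm1"      # common logging/prompt/check functions

$timestamp = Get-Date -Format 'yyyyMMdd-HHmmss'      # one timestamp for all generated files
$logFile   = Join-Path $PSScriptRoot "logs\install-$timestamp.log"

try {
    Write-Log -Path $logFile -Message 'Starting BizTalk installation'
    Install-Prerequisites     -LogPath $logFile      # 01 - Windows features
    Install-BizTalkFeatures   -LogPath $logFile      # 02 - setup /addlocal
    Invoke-GroupConfiguration -LogPath $logFile      # 03 - configuration.exe
    Write-Log -Path $logFile -Message 'Done'
}
catch {
    Write-Log -Path $logFile -Message "Failed: $_"   # moderate error handling: log and stop
    throw
}
```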

Windows feature installation

  1. Choose whichever tool suits your taste
  2. Server Manager or the PowerShell ServerManager module (see the sketch below)
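
For the scripted route, a sketch along these lines would cover the Windows features; the exact feature list depends on your BizTalk version, so verify it against the installation guide:

```powershell
# Sketch: install a typical subset of BizTalk prerequisites using the
# ServerManager module on Windows Server. The feature list is an example only.
$features = 'Web-Server', 'Web-Mgmt-Console', 'NET-Framework-45-Features', 'MSMQ'
Install-WindowsFeature -Name $features -IncludeManagementTools |
    Format-Table Success, RestartNeeded, ExitCode
```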

BizTalk Server Feature Installation

Setting up the BizTalk Server product consists of installation and configuration. Installation adds the binaries to the system; configuration creates or joins a BizTalk group and enables and configures other features such as the Rules Engine, BAM, and EDI.

Setup /AddLocal – export the configuration .xml file (select the required features).


Installing a CU or the Adapter Pack is more complex.


Configuration file – used for providing the required parameters

  1. Consists of feature elements – each one maps to a specific section in the configuration tool
  2. Each feature contains one or more questions – the chosen answer carries the attribute Selected="True"
  3. Export it from Configuration.exe
  4. Edit the export to be a template – replace the real values at configuration time; you need one such file when creating a group (see the sketch below)
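
A minimal sketch of that templating step, assuming hypothetical placeholder tokens and paths; the /s and /l silent-configuration switches are documented for recent BizTalk versions, but verify them for yours:

```powershell
# Sketch: turn the exported configuration file into a template, substitute the
# environment-specific values, and run the configuration silently.
# Token names, paths, and account names are hypothetical examples.
$template = Get-Content 'C:\Install\ConfigTemplate.xml' -Raw
$config   = $template -replace '__SQL_SERVER__',      'SQLPROD01' `
                      -replace '__SERVICE_ACCOUNT__', 'CONTOSO\svc-bts'
Set-Content -Path 'C:\Install\Config.xml' -Value $config

& 'C:\Program Files (x86)\Microsoft BizTalk Server\Configuration.exe' `
    /s 'C:\Install\Config.xml' /l 'C:\Install\Configure.log'
```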

Configure hosts and host instances – use WMI or the PowerShell provider (see the sketch below)

Create handlers – use WMI and PowerShell
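
A minimal sketch of creating a host through the WMI provider, following the documented MSBTS_HostSetting schema; host and group names are environment-specific examples:

```powershell
# Sketch: create an in-process host through the BizTalk WMI provider.
$btsHost = ([wmiclass]'root\MicrosoftBizTalkServer:MSBTS_HostSetting').CreateInstance()
$btsHost.Name        = 'BizTalkProcessingHost'             # hypothetical host name
$btsHost.HostType    = 1                                   # 1 = in-process, 2 = isolated
$btsHost.NTGroupName = 'CONTOSO\BizTalk Application Users' # example Windows group
$btsHost.AuthTrusted = $false
[void]$btsHost.Put()                                       # commits the new host
```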

Handling secrets – use KeePass or Azure Key Vault; KeePass is the option when no internet access is available.
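
For the Key Vault route, retrieving a secret at configuration time can be as small as this sketch (Az.KeyVault module; vault and secret names are hypothetical):

```powershell
# Sketch: pull a service-account password from Azure Key Vault at configuration time.
$secret = Get-AzKeyVaultSecret -VaultName 'kv-bts-install' -Name 'svc-bts-password'
$plain  = $secret.SecretValue | ConvertFrom-SecureString -AsPlainText   # PowerShell 7+
```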

Finally: run the scripts in your environment!

BizTalk Server Fast & Loud Part II: Optimizing BizTalk – Sandro Pereira

Sandro Pereira, as usual, set the theme of the session by comparing BizTalk Server performance to different types of cars: Formula 1 cars, BMWs, and trucks.

 Performance with BizTalk Server  

He explained the different parameters that affect BizTalk Server performance, comparing each with the functioning of a car:

  1. Memory 
  2. Network  
  3. Disk Space 
  4. BizTalk Server 
  5. SQL Server 


He then shared different tips to optimize performance:

  1. What is your requirement?
    To improve the performance of a BizTalk environment, choose the right infrastructure for your business requirement.
  2. Message processing (slow-down)
    BizTalk message processing can slow down under very high message volumes. In that case, use queues to process the messages.
  3. Techniques to optimize performance
    Sandro explained the cycle used to fix a performance issue:

    • Observation
    • Analysis
    • Apply fixes
  4. Redesign the system
    If an existing BizTalk solution is the performance bottleneck, you can suggest redesigning it to meet the business requirement and leave room to scale the solution for future challenges.
  5. Move to historic data (tracking)
    Sandro explained how to manage historical data. Storing important data matters, but use the minimum tracking necessary to avoid database and disk performance issues.
  6. SQL Server memory optimization
    Use SQL Server's memory configuration to optimize message-processing performance (see the sketch below).
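
As an illustration of that last point, capping SQL Server's memory comes down to sp_configure; this sketch uses the SqlServer PowerShell module, and the 28 GB value is an arbitrary example:

```powershell
# Sketch: cap SQL Server's memory so the OS keeps headroom. Size the value
# for your own hardware; 28672 MB is only an example.
$query = @"
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 28672; RECONFIGURE;
"@
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Query $query
```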

Sandro then explained two real-world solutions and how performance improvements were achieved in each.

Cork sector world leader

He explained how the BizTalk team scaled the solution when performance suffered in one of the manufacturing sections:

  1. Followed the sequence based on the number of messages (5 million messages)
  2. Divided the process into parallel processing with queues
  3. Scaled up the SQL Server tier
  4. Moved data to historic storage (enabled minimum tracking)

 Banking System 

Sandro explained how they delivered the solution for a banking system, using the scenario of opening a bank account as an example. He described the techniques used in the solution:

  1. Azure 
  2. API Management 
  3. Composite Orchestrations 

Processing the data to open a bank account took more than a minute. He explained the factors that affected performance:

  1. Network/LDAP
  2. Composite orchestration Expression shapes
  3. Resource unloading (slow first hit)

They fine-tuned the bank-account creation process with:

  1. Improved warm-up by recycling BizTalk and IIS
  2. Memory
  3. IIS application pool settings
  4. MQ agent recycling

He then explained various ways to optimize performance:

  1. BizTalk polling intervals in the MQ agent (be careful when setting the polling interval)
  2. Tune the performance with configuration settings
  3. Orchestration dehydration (composite orchestrations)
  4. SQL affinity – maximum memory settings
  5. Priority in BizTalk – set the send port priority (10 = lowest, 1 = fastest; see the sketch below)
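
For illustration, setting a send port's priority can be scripted against the ExplorerOM API; a minimal sketch, with a hypothetical send port name:

```powershell
# Sketch: set a send port's priority through ExplorerOM (1 = fastest, 10 = lowest).
Add-Type -AssemblyName ('Microsoft.BizTalk.ExplorerOM, Version=3.0.1.0, ' +
    'Culture=neutral, PublicKeyToken=31bf3856ad364e35')
$catalog = New-Object Microsoft.BizTalk.ExplorerOM.BtsCatalogExplorer
$catalog.ConnectionString = 'Server=.;Database=BizTalkMgmtDb;Integrated Security=SSPI'
$catalog.SendPorts['SendOrders'].Priority = 1   # 'SendOrders' is a hypothetical port
$catalog.SaveChanges()
```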

Changing the game with Serverless solutions – Michael Stephenson

Michael Stephenson introduced himself as a cloud and integration freelancer and a Microsoft MVP for more than 10 years.

The main theme of the talk was the various components involved in building a real-world application using serverless solutions.

Mike provided the synopsis at the beginning of the talk: the idea was to discuss the serverless components and solutions used to build a shopping cart on Shopify.

He discussed how the cart uses Application Insights to capture various statistics about the client side; using that information, the issues customers face can be fixed.

 Principles 

He went on to discuss the principles behind how he builds such solutions:

  1. What are the common business problems? 
  2. Can I do some cool demos for talks? 
  3. Can I make a story comparable to the real world? 

To compare with a real-world example, Mike explained the components of Shopify. 

Serverless components 


The solution has been built with various components like API Management, SQL Database, Power BI, Power Apps, etc.

He explained the flow of data from the user endpoint through the various storage, processing, and notification services involved.

The solution deals with management tasks, managing supplies in the background, handling payments through PayPal or Stripe, supply chain scenarios, and social media engagement.


Demo of webhooks 

Mike showed a demo of how webhooks are used in the application.

Webhooks are triggered when Service Bus receives a message; only a few lines of code are needed to set up the message-handling scenario.
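
Mike's actual code was not shown in detail, but a PowerShell Azure Function with a Service Bus trigger gives a feel for how little is needed; the binding name and the Teams webhook setting below are assumptions:

```powershell
# Sketch of run.ps1 for a PowerShell Azure Function. Assumes a function.json
# with a "serviceBusTrigger" binding named sbMsg; TEAMS_WEBHOOK_URL is a
# hypothetical app setting.
param($sbMsg, $TriggerMetadata)

$card = @{ text = "New message received: $($TriggerMetadata.MessageId)" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $env:TEAMS_WEBHOOK_URL `
    -ContentType 'application/json' -Body $card
```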

The received data can then be analyzed with the help of Power BI.

Most Popular products 

The most popular products can be identified from, for example, the products added to carts or searched for in the portal; this data has been captured using Logic Apps.

This information can be processed into analytics such as the most popular product of the last two weeks.


Cognitive Services help to find patterns and to show related products based on user input, matching the user query with offsets.

Shipping data 

Tracking the order is a key feature of any shopping portal. AfterShip can be integrated into the ecosystem to keep customers up to date with the shipping status.

Once shipping has started, the customer can use a web application to track the order.

For example, if an order takes a long time to deliver, predicting that and delivering the information to the customer earlier helps keep them engaged with the product.


Using Logic App triggers and webhooks, notifications can easily be sent to various communities in Microsoft Teams.

Thoughts 

At the end of the talk, Mike shared his thoughts on building a business solution with serverless.


Adventures of building a multi-tenant PaaS on Microsoft Azure – Tom Kerkhove

Tom Kerkhove, Azure Architect at Codit, kicked off his session on scale. Scaling comes in two flavors: scale up/scale down and scale out/scale in. The right compute infrastructure ranges from less management (Functions) to more management (bare metal and VMs), which reminds us that the more control we have, the more the complexity rises.


Tom advised on how to design for scale with:

  1. Serverless  
  2. PaaS 
  3. CPaaS

He then walked through the good, the bad, and the ugly of scaling design on Serverless, PaaS, and CPaaS.

Next, Tom shared his analysis of auto-scaling on serverless services, along with some tips.

Tenancy:

He suggested some ideas for choosing the right tenancy model, which is very important for scaling:

  1. Choose a sharding strategy – spread the data across smaller databases (e.g. an Elastic Pool); with shard managers, sharding is a cost-efficient approach (a minimal lookup sketch follows this list)
  2. Determine tenants – identify which tenant is consuming your services
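
To illustrate the idea, here is a minimal hash-based shard lookup; a production system would use a proper shard-map store (for example the Elastic Database tools), and all names here are hypothetical:

```powershell
# Minimal sketch: each tenant maps consistently to one of a fixed set of databases.
$shards = 'sqldb-tenants-0', 'sqldb-tenants-1', 'sqldb-tenants-2'

function Get-TenantShard([string]$TenantId) {
    $bytes = [Text.Encoding]::UTF8.GetBytes($TenantId)
    $hash  = [BitConverter]::ToUInt32(
        [Security.Cryptography.SHA256]::Create().ComputeHash($bytes), 0)
    $shards[$hash % $shards.Count]   # stable assignment per tenant
}

Get-TenantShard 'contoso'            # always resolves to the same shard
```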

Monitoring: 

On monitoring, he recommended training your developers to use their own toolchain and automated tests, since monitoring should be a shared responsibility.

Here are some good practices he suggested for monitoring:

  1. Enrich your telemetry 
  2. Health checks 
  3. Handling alerts 
  4. Write root cause analyses (RCAs)

Webhooks: 

For webhooks, he suggested several points to follow:

  1. Provide good DNS names for your services
  2. Do not reduce your API security because of a 3rd party
  3. Always route webhooks through an API gateway
  4. Provide user-friendly webhooks

Embrace Change: 

Finally, he showed how they ship their product with DevOps and concluded with the sentence: “We live in a world of constant change, do change.”

Lowering the total cost of ownership of your Serverless solution with Turbo360 – Michael Stephenson

Michael Stephenson, freelance cloud specialist and architect, started his session on how Turbo360 reduces the total cost of ownership of your serverless solution. Let's see what it takes to reduce that total cost.

  1. Microsoft provides a lot of services like Function Apps, Service Bus, and Logic Apps, and ensures that the core platforms do their job effectively
  2. The DevOps team should ensure that they divide roles based on resources in different regions
  3. This can be done easily in Turbo360 through the powerful concept of Composite Applications; the support team can be assigned based on the entities associated with a respective composite application
  4. Through democratization, you can make functions available to less experienced or less skilled people; however, there is a chance that a less skilled support person could break your existing business solution


Service Map/Topology in Turbo360

The Topology feature is very useful for creating a topology diagram that represents the business orchestration of your composite application. The diagram can be built from the entities that are part of a composite application.


BAM in Turbo360 

Atomic Scope is an end-to-end monitoring tool for Azure Integration Services with rich business context, and it supports three types of solutions:

  1. BizTalk only solution 
  2. Azure only solution 
  3. Hybrid Integration solution 

He then showed the current architecture of Atomic Scope.


This is where Turbo360 becomes even more powerful: we had the idea of integrating Atomic Scope with Turbo360, so we decided to bring the tracking capability into Turbo360.

  1. You are now able to configure business processes in the Turbo360 portal
  2. You also get simplified SaaS provisioning (we set up and manage the Atomic Scope infrastructure behind the scenes)
  3. Many customers were asking for a combination of these two products, and we think this is the right way to do it

Explore more about BAM in this respective link. 

Key Features to democratize support in Turbo360 

Some of the key features that democratize support are:

  1. Composite Applications – group your entities effectively
  2. Service Map / Topology – represent the business orchestration of a respective composite application
  3. Activities – schedule activities to send messages to queues/topics, purge blobs, and more
  4. Monitoring – make use of the different types of monitors available in Turbo360 to monitor your Azure entities
  5. BAM – get end-to-end visibility of your business process
  6. Governance and Auditing – maintains logs of user activities in the system

Microsoft Integration, the Good the Bad and the Chicken Way – Nino Crudele

Nino Crudele is back at INTEGRATE with great charm, energy, and his signature presentation style. This year he is a Certified Ethical Hacker as well – congratulations to him!

He started the session by noting his connection with Paolo Salvatori, Principal Program Manager at Microsoft and a good friend of his, remembering their days working together on BizTalk.

He shared his experiences leading cloud teams adopting Azure. He still suggests BizTalk Server as a top option for complex on-premises and hybrid mediation.

Adopting Azure comes with its own challenges, and he discussed how to avoid side effects when moving to Microsoft Azure.

The real challenge is not the technology but embracing Azure governance, which is a massive and complex topic. Governance is everything – without Azure governance, you cannot even think of using Microsoft Azure.

Avoid the chicken way – don't struggle, be brave.

He described the transition in the Azure scaffold – the set of principles behind Microsoft Azure that now includes security, cost management, identity and access, automation, templates and DevOps, and monitoring and alerting.

What exactly is a Management Group?

Once an agreement is established with Microsoft, an account is created with a management group. Within a management group, you can create subscriptions, then resource groups, and then tags.

Policies in Management Group 

  1. Do not try to manage resources directly from the management group
  2. Organize subscriptions into management groups, then configure security and cost
  3. Policies can be disabled and enabled whenever required (a sample assignment follows this list)
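
As an example of working with policies from script, this sketch assigns the built-in “Allowed locations” policy at management-group scope (Az.Resources module; the management-group ID is hypothetical):

```powershell
# Sketch: assign a built-in policy at management-group scope.
$definition = Get-AzPolicyDefinition |
    Where-Object { $_.Properties.DisplayName -eq 'Allowed locations' }
$scope = '/providers/Microsoft.Management/managementGroups/contoso-mg'

New-AzPolicyAssignment -Name 'allowed-locations' -Scope $scope `
    -PolicyDefinition $definition `
    -PolicyParameterObject @{ listOfAllowedLocations = @('westeurope', 'northeurope') }
```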

How to organize the Roles

Managing roles is very important. There are multiple roles like Owner, Reader, and Contributor; Owner should be used for delegation only.

Global Admin is the god of Azure governance and can see everything that happens. Do not give this access to anybody for a long time; use Privileged Identity Management to grant it temporarily.

APIs are more powerful than the Azure portal, and a massive framework is built to make Azure operable through APIs. Cost is the number one problem in Azure. You can have a lot of fancy, tech-savvy tools like Power BI with analytics, but the finance team loves Excel and wants to keep things simple.

Nino suggests organizing costs at the department and project level.

What the finance department will care about:

  1. Totals by department
  2. Totals by location

Departments or projects may be distributed across multiple regions (a minimal export sketch follows).
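
To illustrate, a month of usage can be exported to CSV for the finance team with the Az.Billing module; consistent department and project tags make the Excel pivot trivial:

```powershell
# Sketch: export a month of Azure usage to CSV for analysis in Excel.
Get-AzConsumptionUsageDetail -StartDate '2019-05-01' -EndDate '2019-05-31' |
    Select-Object InstanceName, ConsumedService, PretaxCost, Tags |
    Export-Csv -Path '.\azure-costs-2019-05.csv' -NoTypeInformation
```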

Avail discounts from Microsoft

Use the price sheet from the Azure portal. Map the cost against usage and try to negotiate a discount for specific services, not for all of them.

Security 

Security is important, and there are a lot of tools: Burp Suite, Nmap, Snort, Metasploit, Wireshark, and Logstalgia.

He suggests having a dedicated team and resources to manage security. It is probably good to outsource, but it is best to use different companies each time for pen testing and security maintenance.

Managing Networking is the Core

Define IP schemas and centralize security. Peering of vNets may be required; a good practice is to use a centralized firewall like FortiGate.

He then touched on another problem: DDoS attacks. DDoS protection can be enabled in Security Center.

Logstalgia helps analyze how packets travel around a network – hitting a web page, an endpoint, and so on. A DDoS attack is well visualized with it.

Use Naming Standards 

Naming standards are a must in Azure. He suggested a tip of using an Excel agent to create VMs in Azure with the desired naming standards.

Lack of Consolidation 

Managing all of this may need more knowledge and support. If your company has an Enterprise Agreement, support is free – create a support ticket in the portal, for not only technical support but also advisory from Microsoft.

There are a lot of options in Azure, each with pros and cons.

One such possibility is continuous deployment through automation, ARM templates, and PowerShell. An ARM template may have security issues in terms of script injection. He suggests not trusting anybody – there is always a possibility that a support engineer could inject scripts into PowerShell. The advice: hashing is integrity. Any change to a PowerShell script should raise an alert so that execution can be stopped (see the sketch below).
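
A minimal sketch of that hash check with Get-FileHash; the known-good value here is just an example:

```powershell
# Sketch: verify a deployment script against a known-good SHA256 before running it.
$knownGood = 'E3B0C44298FC1C149AFBF4C8996FB92427AE41E4649B934CA495991B7852B855'  # example value
$actual    = (Get-FileHash -Path '.\deploy.ps1' -Algorithm SHA256).Hash
if ($actual -ne $knownGood) { throw 'deploy.ps1 has been modified - aborting.' }
& '.\deploy.ps1'
```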

Advanced Troubleshooting

You need to check how the firewall is configured and who is using which ports; you cannot use the Azure portal for all of this.

He has collected some 2,000+ APIs for this purpose, through which troubleshooting can be done. He can provide subscription details as a parameter and get all the details.

Creating a heuristic view

Nino emphasized the necessity of visualizing this data and the correlations between data. He shared a tip on displaying the data collected from all these APIs in a sensible form in Excel for further analysis.

Documentation is very important 

He also built a tool, Aziverso (aziverso.com), which helps bring cost management into Microsoft Office. Another tool, Cloudockit, provides better documentation for the cloud architecture.

He presented this valuable session in a fun-filled manner, keeping the audience well engaged throughout.

Creating a Processing Pipeline with Azure Function and AIS – Wagner Silveira

The session started with a case study explaining how a solution was built a year earlier and what the options would be for the same solution now. One of his clients had a requirement to design a solution that would process a large volume of EDI messages, enrich them with data coming from a SQL Server database in the cloud, and push them into a big data repository for reporting and data mining. The solution should allow control of retries and notification of failed messages, including visibility of where a single message was in the entire process.

Here are some of the requirements in detail:

  1. The message would need to be received via a secure HTTP interface.
  2. A copy of the message should be stored for auditability.
  3. After the message was successfully received, the process should guarantee delivery of the message.
  4. An invalid message should be rejected at the beginning of the process.
  5. The solution should scale without requiring much intervention.
  6. The operations team should be able to trace the process, understand where a single message was in it, and be notified of any failures.
  7. The process should allow for extended validation and extra processing steps. It should also allow the big data repository, or the processing of the data, to be replaced without a huge impact on the system.

The logical design of the requirement had components like an inbound API, inbound processing pipelines (Logic Apps, Service Bus, Integration Accounts), an outbound processing pipeline (Logic Apps, Service Bus), a staging repository, an external repository, an outbound API, the big data repository, and finally Application Insights.

Reality checks from the initial logical design

  1. The message sizes were too big to use Service Bus as the messaging repository
  2. The volume of messages and the number of steps required to process them would imply a sizeable bill for Logic Apps, which could make it hard to swallow
  3. An EDIFACT schema was not available
  4. Higher operational cost
  5. Lack of end-to-end monitoring

Given these reality checks, the solution needed updating. It was reworked with Azure Functions and .NET components in the inbound and outbound processing pipelines, and with a storage account and Service Bus implementing the claim-check pattern (a minimal sketch follows).
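
Here is a minimal sketch of the claim-check idea: park the payload in blob storage and push only a pointer through Service Bus. Account, container, and namespace names are hypothetical, and the Service Bus SAS token is assumed to be generated elsewhere:

```powershell
# Sketch: claim-check pattern - store the large EDI payload in blob storage,
# then send only a reference to it through the Service Bus REST endpoint.
$ctx  = New-AzStorageContext -StorageAccountName 'stediclaims' -UseConnectedAccount
$blob = Set-AzStorageBlobContent -Context $ctx -Container 'payloads' `
            -File '.\order-12345.edi' -Blob 'order-12345.edi'

$claim = @{ blobUri = $blob.ICloudBlob.Uri.AbsoluteUri } | ConvertTo-Json
Invoke-RestMethod -Method Post `
    -Uri 'https://sb-edi.servicebus.windows.net/inbound/messages' `
    -Headers @{ Authorization = $sasToken } `
    -ContentType 'application/json' -Body $claim   # $sasToken: pre-generated SAS token
```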

Difference between the two solutions

  1. Azure Functions
    • EDIFACT support via a .NET package
    • Claim-check pattern over Service Bus
    • Ability to cut costs with dedicated instances
  2. Azure Storage
    • The payload stays under control for the claim-check pattern
  3. Application Insights
    • End-to-end traceability using a correlation ID
    • Track a message across the entire process
    • Exceptions are captured the same way

So, the key components are the claim-check pattern and Application Insights. He then showed a quick demo of the updated solution.

Exception management and retries on Functions

  1. Catch blocks using a notify-and-throw pattern (see the sketch after this list)
  2. Leverage Functions with the Service Bus peek-lock pattern
  3. Adjusted the max delivery count on Service Bus
  4. Messages that failed were delivered to dead-letter queues
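
A minimal sketch of the notify-and-throw pattern; the processing step and alert webhook are hypothetical placeholders:

```powershell
# Sketch: notify-and-throw - report the failure, then rethrow so the
# peek-locked Service Bus message is abandoned and its delivery count increments.
try {
    Invoke-MessageEnrichment -Message $msg              # hypothetical processing step
}
catch {
    $alert = @{ error = $_.Exception.Message } | ConvertTo-Json
    Invoke-RestMethod -Method Post -Uri $env:ALERT_WEBHOOK_URL `
        -ContentType 'application/json' -Body $alert    # notify...
    throw                                               # ...and throw
}
```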

Dead-letter queue management

  1. Logic Apps polling of the subscriptions' dead-letter queues was reduced to every 6 hours
  2. Each subscription's DLQ could have its own logic, such as:
    • Reprocessing count
    • Resubmission logic
    • Notification logic
  3. All DLQ messages go to different blob storages, with email notifications
  4. Notifications are sent from the error blob storage

1 year later

Now, if the same solution were created today, i.e. one year later, let's have a quick look at the options available. With new technologies available, here are some suiting this requirement:

  1. Integration Service Environment – with options for on-premises connectivity
  2. Azure Durable Functions
  3. Event Grid

In this case, they still thought that Functions would be a good fit, as they allow more flexibility.

New feature releases now have more impact. Some of the important new features released in the last year are:

  1. Azure Functions Premium
  2. Integrated support for Key Vault
  3. Integrated support for MSI (Managed Service Identity)
  4. Virtual network support + service endpoints

Lessons Learned

  1. Review the fine print – know the limitations and the project volume, and match them with the available technologies
  2. Operational cost is a design consideration
  3. Make the best of each technology – understand the price of each technology versus how well they fit together
  4. Think about the big picture – don't focus on short-term needs; spend more time on design

Summary

  1. Weigh the technology options available
  2. Think about the integration's operational cost
  3. Understand how your solution fits into the big picture
  4. Understand the roadmap of the components

Integrate 2019 Day 1 and 2 Highlights

Integrate 2019 Day 1 Highlights

Integrate 2019 Day 2 Highlights

 


This blog was prepared by
Arunkumar Kumaresan, Pandiyan Murugan, Mohan Nagaraj, Nishanth Prabhakaran, Saranya Ramakrishnan, Senthil Palanisamy, Suhas Parameshwara

This article was originally published on Jun 5, 2019. It was most recently updated on Feb 28, 2024.
