The Azure Functions team joined us once again for our monthly live webcast, staying remote and safe.
In this live webcast, Jeff Hollan, Anirudh Garg, and Sonia from the engineering team joined us to give a picture of the latest happenings in the Azure Functions space.
Without further delay, let's jump in, as tons of updates are awaiting.
- Kafka Trigger scaling support in Premium plan – the team has rolled out the Kafka trigger in beta, which can be used to scale out instances based on queue or topic length
- Serverless library support for Python
- New Regions – a new dedicated page is available where you can check which regions support each Azure Functions plan
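The Kafka trigger mentioned above is configured as a binding. As a rough, illustrative sketch of what a Kafka-triggered function's function.json might look like (the property names follow the Kafka extension's documentation, but since the trigger is in beta they should be checked against the current release):

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "kafkaEvent",
      "brokerList": "%KafkaBrokerList%",
      "topic": "orders",
      "consumerGroup": "my-function-group"
    }
  ]
}
```

With a binding like this in place, the Premium plan can scale instances out and back in based on how far that consumer group has fallen behind the topic.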
PyTorch and Python Functions – PyTorch is a popular deep learning library used in machine learning, and it is now supported in Azure Functions
Now, let's jump into a quick demo Anirudh showcased on how PyTorch can be used in Azure Functions to handle machine learning workloads.
In particular, the team thought that machine learning model inference could be a good use case for a serverless platform because, at the end of the day, inference is all about calling a function to get the work done.
So, after receiving customer requests, the team analyzed and addressed most of the current serverless challenges that had kept the ML use case out of reach.
He presented a quick demo of an ML inference model that predicts what an image contains and shows a trust factor (confidence score) as well. In this demo, there were four functions with the same code but different trained models.
Each function returned the same prediction for the image but with a different trust factor, because each used a different trained model.
He also explained how Azure Files is integrated with Functions to eliminate the overhead of packaging model files into the function app itself. Each model takes roughly 50 to 500 MB, which is stored in Azure File storage and retrieved at run time. Together with the other capabilities Functions offers, this makes it an ideal platform for ML model inference use cases. Mounting Azure Files is available in all SKUs, including Consumption. Azure Functions is also integrated with Azure ML; that integration is currently available only in the Premium plan, and you can start trying it out with the article here.
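The load-at-run-time pattern described above can be sketched in a few lines of Python. This is a minimal, dependency-free illustration rather than the demo's actual code: the path and the `predict` helper are hypothetical, and in the real demo the loader would be something like `torch.load` on a model file sitting on the mounted share.

```python
import functools
import os

@functools.lru_cache(maxsize=None)
def load_model(path: str) -> bytes:
    """Load a model file from the mounted Azure file share exactly once.

    The lru_cache keeps the (potentially 50-500 MB) model in memory,
    so warm invocations skip the file read entirely. In the demo this
    would be torch.load(path) instead of reading raw bytes.
    """
    with open(path, "rb") as f:
        return f.read()

def predict(model_path: str, image_bytes: bytes) -> str:
    model = load_model(model_path)  # cached after the first invocation
    # A real function would run inference on image_bytes here;
    # this sketch just reports what was loaded.
    return f"{os.path.basename(model_path)}: {len(model)} model bytes"
```

Because the worker process is reused across invocations, only the first request after a cold start pays the model-load cost; every later request hits the in-memory cache.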
- Automatically mount Azure file shares on Linux SKUs
- Gradle plugin and Azure toolkit for IntelliJ Idea
- PowerShell cmdlets – with the latest PowerShell cmdlets you can now deploy Functions with containers and more; go ahead and grab the latest from the PowerShell Gallery today
- Default to new portal UX
- Checkpoint controls for Event Hubs and Cosmos DB
- Distributed Tracing for Java Apps – with the new agent the team has enabled, configuring a couple of app settings gives users a more enriched monitoring experience.
With those two app settings configured, you get a much richer monitoring experience: the Application Map shows more detail on the particular instance being called, and the configured dashboard shows outgoing requests and more. This feature is currently in Private Preview.
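As an illustration of the kind of configuration involved, the app settings look roughly like the following. Treat these names as assumptions to verify against the preview documentation rather than a definitive list:

```
APPINSIGHTS_INSTRUMENTATIONKEY = <your Application Insights instrumentation key>
APPLICATIONINSIGHTS_ENABLE_AGENT = true
```

The first points the app at an Application Insights resource; the second enables the Java agent that produces the distributed traces.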
- Java 11
- Durable Functions for Python and PowerShell
- IConfiguration improvement in .NET
- Scale-out limits per app
- Python portal log streaming
- OpenID Connect Integration with App Service Authentication
- Easy Auth for Linux Consumption
- Fabian Williams – A Human Capital COVID-19 survey dashboard
- Jonathan George – Integration Testing Azure Functions with SpecFlow and C#
- Luis Quintanilla – Serverless Racket Applications using Custom Handlers
- Jon Blankenship – Razor-Powered email in Azure Functions
- Michał Żołnieruk – Generating TypeScript typings for your C# Azure Functions
- Nick Korte – Deploying Azure Functions with Visual Studio Code
- Frank Boucher – Building a budget-friendly URL shortener
- Microsoft Build – May 19-21
- INTEGRATE 2020 Remote – June 1-3
Question and Answer Session
- Does dependency injection work in Azure Functions v3?
Yes, it works pretty well once you have the right setup in place.
- Are there any capacity issues in Functions these days?
The team has been able to manage capacity for Functions. It is also advisable to deploy your functions to a region that is not experiencing issues.
- Can I automatically load my trained models from AML's model registry?
It should be possible if you are using Azure authentication, possibly through the managed identity (MSI) feature that's available in Azure Functions.
- Any updates on IConfiguration that enable you to do custom configuration?
Yes. When you create the startup method, it's now possible to have services passed in through DI, which allows you to modify the configuration.
- Any plans for enabling you to register your own middleware implementations for the HTTP trigger?
No immediate plans right now
- When can we get an update on outstanding v3 bugs, such as no longer getting the automatic input JSON we had in v2 for HTTP triggers?
The team was not specifically aware of this one. You can reach out on Twitter to ask your queries.
- Do Functions on Linux have any limits compared to those on Windows?
One significant limit is the scale-out limit (currently 50 instances).