Azure Functions 2.0 – Real World Use Case for Serverless Architecture
Let's see just how much has changed for Azure Functions 2.0.
At the end of September, Microsoft announced the general availability of Azure Functions 2.0. When we read Eduardo Laureano's blog post, we were very excited about the improvements and new features. Eduardo wrote:
Azure Functions 2.0 is production ready and capable of handling your most demanding workloads, backed by our 99.95 percent SLA.
At Altkom Software & Consulting, we decided it was a good time to test it more deeply than a "Hello World" example.
Why Serverless Matters
For many years, software engineers have been taught to design systems from loosely coupled components with high cohesion. Yet later, all of these components were deployed to one big machine, losing much of their power to scale and their independence, as all of them had to share the resources of the same machine. With serverless, it's time to be rewarded for good design. Now we can construct a system from small independent components that scale independently, and each component can have its own performance and scalability requirements satisfied. Even better, good design is also economically justified: with serverless, we only pay for the CPU and RAM that our components actually use.
Another advantage of this approach is that we do not have to manage infrastructure – no need to provision VMs or install and update operating systems.
The serverless cloud takes care of all of this, while also providing monitoring and auto-scaling features.
A Quick Tour Of Azure Functions Features
Main features of Azure Functions (a minimal function sketch follows this list):
- Choice of language – write functions using C#, F#, Node.js, Java, PHP, batch, bash, or any executable.
- Pay-per-use pricing model – pay only for the time spent running your code. See the Consumption hosting plan option in the pricing section.
- Bring your own dependencies – Functions supports NuGet and NPM, so you can use your favorite libraries.
- Integrated security – protect HTTP-triggered functions with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter, and Microsoft Account.
- Simplified integration – easily leverage Azure services and software-as-a-service (SaaS) offerings such as SendGrid and Twilio.
- Flexible development – code your functions right in the portal or set up continuous integration and deploy your code through GitHub, local Git, Visual Studio Team Services, and other supported development tools.
- Open-source – the function’s runtime is open-source and available on GitHub.
- Possibility to deploy on-premises.
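To give a feel for the programming model before we dive into the use case, here is a minimal sketch of an HTTP-triggered C# function on the v2 runtime. The function name, route, and query parameter are ours for illustration only; they are not part of the billing solution.

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloFunc
{
    // Minimal HTTP-triggered function: the trigger, authorization level and
    // supported verbs are all declared with attributes - no hosting code is needed.
    [FunctionName("HelloFunc")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        string name = req.Query["name"];
        log.LogInformation("HelloFunc called with name={Name}", name);
        return new OkObjectResult($"Hello, {name ?? "world"}");
    }
}
```

The same attribute-driven model is used for all the triggers and bindings (blobs, queues, tables, CosmosDB, SendGrid, Twilio) in the functions described below.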
Business Use Case
We had the idea of implementing a simple billing system in a serverless architecture for a customer who sells its services in a subscription-based model. The customer sends a list of its employees who can use the offered services. Based on the contract prices for each type of subscription, the system calculates fees for each employee. Then it aggregates them to create an invoice. Finally, a PDF printout is generated and sent to the customer, together with an SMS/email notification.
The diagram below shows in detail what the flow between the functions looks like. The source code and a tutorial on running the solution locally are available on GitHub at this link.
- A user uploads a CSV file with Beneficiaries to a specific data storage – an Azure Blob container.
- This triggers the GenerateBillingItemsFunc function, which is responsible for:
  - generating billing items, using prices from an external database (CosmosDB), and saving them in an Azure Table;
  - sending a message about the need to create a new invoice to an Azure Queue.
- When a new message appears on the queue, the next function (GenerateInvoiceFunc) is triggered. It creates the domain object Invoice and saves it in the database (CosmosDB). After a successful save, it sends a message to two Azure Queues.
- When a new message appears in the first queue, the PrintInvoiceFunc function is triggered. It uses an external engine for PDF generation (JsReport) and saves the PDF file in Azure Blob Storage.
- When a new message appears in the second queue, the NotifyInvoiceFunc function is triggered. It uses two external systems: SendGrid for sending emails and Twilio for sending SMS.
We tried to create the functions following best practices: keeping them small, simple, and able to work independently.
We tested two approaches to create functions:
- one application/project = one function
- all functions in one application/project
AllInOneProject vs. separated functions
The chosen approach influences the way the functions are grouped into a Function App. The Azure docs say the following about a Function App:
A function app provides an execution context in Azure in which your functions run. A function app consists of one or more individual functions that are managed together by Azure App Service. All of the functions in a function app share the same pricing plan, continuous deployment and runtime version. Think of a function app as a way to organize and collectively manage your functions.
If all of the functions are deployed in one Function App, they will share the same pricing plan, continuous deployment, and runtime version. Keeping each function in its own project and Function App allows you to separate all of this. If you are interested in more details, read Marc Duiker's article.
In the following sections we describe in more detail how each of these functions is built.
Billing Items Generation
The main responsibility of this function is to parse the uploaded CSV file and generate billing items using prices from an external database.
[FunctionName("GenerateBillingItemsFunc")]
public static void Run(
[BlobTrigger("active-lists/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name,
[Table("billingItems")] out ICollector billingItems,
[Queue("invoice-generation-request")] out InvoiceGenerationRequest queueRequest,
ILogger log)
{
log.LogInformation($"C# Blob Trigger function Processed blob: {name} Bytes");
var activeList = ActiveListParser.Parse(name, myBlob);
var generator = new BillingItemGenerator();
var priceList = GetPriceList(activeList.CustomerCode);
foreach (var bi in generator.Generate(activeList, priceList))
{
billingItems.Add(bi);
}
queueRequest = InvoiceGenerationRequest.ForActiveList(activeList);
}
Thanks to the `BlobTrigger` attribute, the function is triggered when a user uploads a CSV file to the Blob Storage container named `active-lists`; the storage account connection is configured by the `Connection` parameter.
The name of the uploaded file must match the pattern `[CLIENT_CODE]_[YEAR]_[MONTH]_*`, for example `ASC_2018_11_activeList.txt`.
Example file content:
```
99050555745;Annaliese Verena;A
29120458762;Josepha Gusti;A
39091666028;Deborah Wenzi;B
77050929111;John Smith;A
76091166752;Bob Martin;A
97031653569;Alice Smith;B
35060205229;Patricia Glide;A
38112669875;Mike Kowalski;B
13102408939;Kali Mali;A
```
Each line in the file contains a national identification number (PESEL in Poland), a name and surname, and a product code.
Based on the first part of the file name (the client code, ASC in this example), the system knows which prices should be used to generate billing items. In other words, prices for specific products are defined per customer, and the customer code is a unique identifier.
If you are interested in the details, look at ActiveListParser.cs and later PriceRepository.cs.
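We will not reproduce the whole parser here, but a simplified sketch of the idea looks roughly like this. The `ActiveList` and `Beneficiary` shapes and their property names (apart from `CustomerCode`, which appears in the function above) are our assumptions; the real ActiveListParser.cs in the repository may differ in naming and validation.

```csharp
using System.Collections.Generic;
using System.IO;

// Simplified sketch only; the real ActiveListParser.cs in the repo may differ.
public static class ActiveListParser
{
    public static ActiveList Parse(string blobName, Stream content)
    {
        // File name pattern: [CLIENT_CODE]_[YEAR]_[MONTH]_*, e.g. ASC_2018_11_activeList.txt
        var parts = blobName.Split('_');
        var activeList = new ActiveList
        {
            CustomerCode = parts[0],
            Year = int.Parse(parts[1]),
            Month = int.Parse(parts[2])
        };

        // Each line: PESEL;Name Surname;ProductCode
        using (var reader = new StreamReader(content))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                var fields = line.Split(';');
                activeList.Beneficiaries.Add(new Beneficiary
                {
                    Pesel = fields[0],
                    Name = fields[1],
                    ProductCode = fields[2]
                });
            }
        }
        return activeList;
    }
}

public class ActiveList
{
    public string CustomerCode { get; set; }
    public int Year { get; set; }
    public int Month { get; set; }
    public List<Beneficiary> Beneficiaries { get; } = new List<Beneficiary>();
}

public class Beneficiary
{
    public string Pesel { get; set; }
    public string Name { get; set; }
    public string ProductCode { get; set; }
}
```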
Thanks to the `Table` attribute, the function can save information in Azure Table Storage, which stores structured NoSQL data and provides a key/attribute store with a schemaless design.
To add a new record to the table, just call `billingItems.Add()`.
Thanks to the `Queue` attribute, the function can save a message in an Azure Queue. Assigning a value to the output parameter `queueRequest` sends the message to the queue.
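The message itself is a plain POCO that the runtime serializes to JSON when it is put on the queue. Below is a sketch of what `InvoiceGenerationRequest` might look like; the `CustomerCode`, `Year`, and `Month` properties are read by the next function, and the factory method mirrors the `InvoiceGenerationRequest.ForActiveList(activeList)` call above (it reuses the `ActiveList` sketch from the previous snippet, so treat the exact shape as an assumption).

```csharp
// Sketch of the queue message; the runtime serializes this POCO to JSON.
public class InvoiceGenerationRequest
{
    public string CustomerCode { get; set; }
    public int Year { get; set; }
    public int Month { get; set; }

    // Mirrors the InvoiceGenerationRequest.ForActiveList(activeList) call used in the function.
    public static InvoiceGenerationRequest ForActiveList(ActiveList activeList) =>
        new InvoiceGenerationRequest
        {
            CustomerCode = activeList.CustomerCode,
            Year = activeList.Year,
            Month = activeList.Month
        };
}
```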
Invoice Generation
After generating billing items, we need to generate invoices for clients. `GenerateInvoiceFunc` prepares the domain object `Invoice`, saves it in the database, and sends messages about it to the queues.
[FunctionName("GenerateInvoiceFunc")]
public static void Run(
[QueueTrigger("invoice-generation-request")] InvoiceGenerationRequest request,
[Table("billingItems")] CloudTable billingItems,
[CosmosDB("crm", "invoices", ConnectionStringSetting = "cosmosDb")] out dynamic generatedInvoice,
[Queue("invoice-print-request")] out InvoicePrintRequest printRequest,
[Queue("invoice-notification-request")] out InvoiceNotificationRequest notificationRequest,
ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {request.CustomerCode} {request.Year} {request.Month}");
var generator = new InvoiceGenerator();
var items = GetBillingItemsFromTable(billingItems, request);
var invoice = generator.Generate(request, items);
generatedInvoice = invoice;
printRequest = new InvoicePrintRequest { InvoiceToPrint = invoice };
notificationRequest = new InvoiceNotificationRequest { InvoiceForNotification = invoice };
}
Thanks to the `QueueTrigger` attribute, the function is triggered when a message appears on the `invoice-generation-request` queue.
Thanks to the `Table` attribute in this function, we have access to the table in which we saved billing items in the previous step. We used the `CloudTable` class to read the table, because the popular `IQueryable` is not supported in the Functions v2 runtime. Using our own method `GetBillingItemsFromTable`, based on data from the request, we can download the table segment we are interested in (a sketch of the `BillingItem` entity follows the code):
```csharp
static List<BillingItem> GetBillingItemsFromTable(CloudTable billingItems, InvoiceGenerationRequest request)
{
    TableQuery<BillingItem> query = new TableQuery<BillingItem>()
        .Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal,
                $"{request.CustomerCode}-{request.Year}-{request.Month}")
        );
    var querySegment = billingItems.ExecuteQuerySegmentedAsync(query, null);
    var items = new List<BillingItem>();
    foreach (BillingItem item in querySegment.Result)
    {
        items.Add(item);
    }
    return items;
}
```
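For the Table output and input bindings to work, the entity has to expose a `PartitionKey` and `RowKey`, for example by deriving from `TableEntity`. Below is a sketch of what `BillingItem` could look like; the partition key matches the filter used above, while the remaining property names are our assumptions, not taken from the repository.

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Sketch only; the real BillingItem in the repo may have different properties.
public class BillingItem : TableEntity
{
    public BillingItem() { }

    public BillingItem(string customerCode, int year, int month, string pesel)
    {
        // Matches the filter used in GetBillingItemsFromTable: "{CustomerCode}-{Year}-{Month}".
        PartitionKey = $"{customerCode}-{year}-{month}";
        RowKey = pesel; // must be unique within the partition
    }

    public string BeneficiaryName { get; set; }
    public string ProductCode { get; set; }
    public double Price { get; set; } // Table Storage does not support decimal
}
```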
Thanks to the `CosmosDB` attribute, we bind to the database where the invoice object should be saved.
The last two `Queue` bindings are used to inform the next two functions, which are responsible for creating the PDF invoice and sending the notification (email and SMS) to the user.
Invoice Printing
This function is triggered when we want to create a PDF invoice for the client:
[FunctionName("PrintInvoiceFunc")]
public static void Run(
[QueueTrigger("invoice-print-request")]InvoicePrintRequest printRequest,
Binder binder,
ILogger log)
{
var jsReportUrl = Environment.GetEnvironmentVariable("JsReportUrl");
var pdf = new InvoicePrinter(jsReportUrl).Print(printRequest.InvoiceToPrint);
StoreResultInBlobAsync(
binder,
$"Invoice_{printRequest.InvoiceToPrint.InvoiceNumber.Replace("/","_")}",
pdf);
}
The `QueueTrigger` attribute is already familiar; it starts the function when a message appears on the selected queue (in this example, the queue named `invoice-print-request`).
The PDF creation process has been delegated to an external system – JsReport. We created our own instance from a Docker image on Azure and connect to it using the URL taken from an environment variable.
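The `InvoicePrinter` class is essentially a thin HTTP client around jsreport's rendering endpoint. Below is a rough sketch of how such a call might look, assuming a template named "invoice" already exists on the jsreport server; the template name and payload shape are our assumptions, and the real InvoicePrinter in the repository may differ.

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

// Rough sketch of calling a jsreport instance over HTTP; the real InvoicePrinter may differ.
public class InvoicePrinter
{
    private static readonly HttpClient Client = new HttpClient();
    private readonly string _jsReportUrl;

    public InvoicePrinter(string jsReportUrl) => _jsReportUrl = jsReportUrl;

    public byte[] Print(object invoice) => PrintAsync(invoice).GetAwaiter().GetResult();

    private async Task<byte[]> PrintAsync(object invoice)
    {
        // jsreport renders a named template with the supplied data and returns the PDF bytes.
        var payload = JsonConvert.SerializeObject(new
        {
            template = new { name = "invoice" }, // assumed template name
            data = invoice
        });

        var response = await Client.PostAsync(
            $"{_jsReportUrl}/api/report",
            new StringContent(payload, Encoding.UTF8, "application/json"));

        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsByteArrayAsync();
    }
}
```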
Thanks to the `Binder` method parameter, we can asynchronously save the created PDF in blob storage. The following method is used for this:
```csharp
private static async Task StoreResultInBlobAsync(Binder binder, string title, byte[] doc)
{
    using (var stream = await binder.BindAsync<Stream>(new BlobAttribute($"printouts/{title}.pdf", FileAccess.Write)))
    {
        using (var writer = new BinaryWriter(stream))
        {
            writer.Write(doc);
        }
    }
}
```
Notifications Sending
In parallel to the PDF creation process, the user notification process is started.
[FunctionName("NotifyInvoiceFunc")]
public static void Run(
[QueueTrigger("invoice-notification-request")] InvoiceNotificationRequest notificationRequest,
[SendGrid(ApiKey = "SendGridApiKey")] out SendGridMessage email,
[TwilioSms(AccountSidSetting = "TwilioAccountSid", AuthTokenSetting = "TwilioAuthToken", From = "+15005550006")] out CreateMessageOptions sms,
ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {notificationRequest}");
email = CreateEmail(notificationRequest);
sms = CreateSMS(notificationRequest);
}
Thanks to the built-in integration with systems such as Twilio and SendGrid, we can send SMS text messages and emails without any problems or excessive configuration.
To send an email with SendGrid, we need to use the `SendGrid` attribute with the `ApiKey` property defined and assign the created object to the method parameter. To create the `SendGridMessage` object, we used the method below:
```csharp
private static SendGridMessage CreateEmail(InvoiceNotificationRequest request)
{
    var email = new SendGridMessage();
    email.AddTo("CUSTOMER_EMAIL@example.com");
    email.AddContent("text/html", $"You have new invoice {request.InvoiceForNotification.InvoiceNumber} for {request.InvoiceForNotification.TotalCost.ToString()}.");
    email.SetFrom(new EmailAddress("YOUR_EMAIL@example.com"));
    email.SetSubject($"New Invoice - {request.InvoiceForNotification.InvoiceNumber}");
    return email;
}
```
Sending an SMS is done in the same way. Thanks to the `TwilioSms` attribute, we integrate with our Twilio account. The `From` property is filled with a magic test number, based on the Twilio docs.
```csharp
private static CreateMessageOptions CreateSMS(InvoiceNotificationRequest request)
{
    return new CreateMessageOptions(new PhoneNumber("+15005550006"))
    {
        Body = $"You have new invoice {request.InvoiceForNotification.InvoiceNumber} for {request.InvoiceForNotification.TotalCost.ToString()}."
    };
}
```
Pricing
Pricing depends mainly on the execution time and memory consumption.
Based on approximately 100 test calls to each function, we calculated the average execution time:
| Function Name | Average Execution Time (seconds) |
|---|---|
| GenerateBillingItemsFunc | 5.61 |
| GenerateInvoiceFunc | 3.27 |
| PrintInvoiceFunc | 3.00 |
| NotifyInvoiceFunc | 2.00 |
Unfortunately, in Azure Functions there is no way to check what the memory consumption per request was. Version 2.0 did not bring any changes in this area; the issue on GitHub is still open. Unofficial ways to measure this (comparing to the local environment, checking how much memory the Function App allocates) did not seem precise enough, so we skipped them. For this reason, we assumed 512 MB for each function for the purposes of our estimation (which greatly exceeds the actual memory usage).
Next, based on the documentation, we prepared an Excel file with the pricing calculation:
According to these calculations, we would pay 79.3 euros for a million executions of the process. Limiting resource consumption to 256 MB decreases this amount to 28 euros.
Approximately the first 130,000 calls are free.
It should be noted that storage rates and network rates have been omitted from the calculations.
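For readers who want to reproduce the spreadsheet, the consumption-plan math boils down to: resource consumption in GB-seconds (memory in GB × execution time × number of executions) multiplied by the GB-s rate, plus a per-execution charge, minus the monthly free grants. The snippet below sketches this calculation with the measured times from the table; the rates and free grants are placeholders, not official figures, so treat the result only as a rough approximation and check the current Azure pricing page for real numbers.

```csharp
using System;

// Back-of-the-envelope consumption-plan estimate; rates below are placeholders, not official.
class PricingEstimate
{
    static void Main()
    {
        double memoryGb = 512 / 1024.0;                   // assumed 512 MB per execution
        double[] avgSeconds = { 5.61, 3.27, 3.00, 2.00 }; // measured averages from the table above
        double processes = 1_000_000;                     // one process = all four functions once

        double gbSecondsPerProcess = 0;
        foreach (var s in avgSeconds) gbSecondsPerProcess += memoryGb * s;

        double totalGbSeconds = gbSecondsPerProcess * processes;
        double totalExecutions = avgSeconds.Length * processes;

        // Placeholder rates (EUR) and monthly free grants - assumptions, verify against current pricing.
        double pricePerGbSecond = 0.000014;
        double pricePerMillionExecutions = 0.169;
        double freeGbSeconds = 400_000;
        double freeExecutions = 1_000_000;

        double cost =
            Math.Max(0, totalGbSeconds - freeGbSeconds) * pricePerGbSecond +
            Math.Max(0, totalExecutions - freeExecutions) / 1_000_000 * pricePerMillionExecutions;

        Console.WriteLine($"Estimated monthly cost: {cost:F2} EUR");
    }
}
```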
Monitoring
Thanks to the integration with Azure Application Insights, we can monitor the functions very well and easily observe the results.
In Functions 2.0, the team expanded the integration with Application Insights to give more visibility into distributed tracing. Thanks to the visualisation in the Application Map, we can understand how components interact and drill into individual executions to diagnose issues.
Application Map – all functions in one application (without separation)
Application Map for functions in separate projects
Thanks to the end-to-end transaction details view, we can monitor each execution and find performance issues.
Summary
This example, in our opinion, shows that Azure Functions 2.0 is a production-ready technology. In the points below, we list the main pros and cons of serverless architecture and this solution.
The developer experience is great. We can easily build the whole solution on our machines, as well as deploy it to Azure.
The platform provides declarative bindings for access to all resources: blobs, tables, databases, queues, HTTP request/response, and external services like SendGrid or Twilio. It removes the burden of managing connections manually and freeing resources, which greatly simplifies the code. The monitoring capabilities are of high quality and help diagnose problems quickly.
Pros
- Great developer experience
- Promotes better design practices
- Allows developers to focus on writing small, autonomous components that adhere to SOLID/OOP practices
- Autoscaling and monitoring out of the box
- Pay only for the resources you actually use
- Removes the whole burden of server/VM/container management
- Integration with many technologies (queues, databases, blobs, external systems like Twilio and SendGrid) is very simple thanks to the built-in bindings
- Monitoring tools available out of the box
Cons
- Magical infrastructure increases the risk of integration problems
- “Cold start” problem still exists
- The integration possibilities are limited to what Azure offers (but you can always try to get around this via an HTTP trigger)
- Lack of control over server apps requires rethinking sessions and authorization
- Configuration becomes a lot more complex
- Cost control is not perfect, but it improves from version to version
Published at DZone with permission of Robert Witkowski. See the original article here.