Azure Runbook vs Log Analytics: Head to Head
If you’ve been asking yourself what your best options are for monitoring and alerting on blobs within containers on Azure, you’ve come to the right place. We’re going to take a deep dive into the differences between using the Log Analytics service and Azure Runbooks (Automation Accounts), explaining the process for each option and seeing which comes out strongest.
Let’s look at a single example. Every day at 00:01, a new blob is created whose name is “blob” followed by the current date. For instance, on Tuesday, 22 December 2020, at 00:01, a new blob named “blob22.12.2020” was created. Your goal is to monitor this blob and to alert the relevant stakeholder if the blob was not created. Let’s look at the two options, and then compare them on relevant metrics, to see how they measure up, head to head.
Understanding Azure Runbook
The first option is using the Automation Accounts service. You write a script that sends an alert if the blob wasn’t created, insert that code into a PowerShell runbook, and finally schedule the runbook to run once a day.
The first step is creating the Automation Account. In the setup process, you will be asked whether you want to create ‘AzureRunAsConnection’. A Run As Account is a service principal that is principally used to provide authentication for managing resources in Azure with the Azure cmdlets. If you want to authenticate via this service principal, you need to allow its creation here.
The second step is to create a new runbook within the automation account. In our scenario, the scripting language is PowerShell and therefore the runbook type should be PowerShell, too.
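If you prefer to script this setup rather than click through the portal, a minimal sketch using the Az.Automation module might look like the following. The resource group, account, and runbook names are placeholders, not values from the original article:

```powershell
# Create the Automation Account and an empty PowerShell runbook.
# Resource group, account name, runbook name, and location are placeholder values.
New-AzAutomationAccount -ResourceGroupName "rg-monitoring" `
    -Name "aa-blob-monitor" `
    -Location "West Europe"

New-AzAutomationRunbook -ResourceGroupName "rg-monitoring" `
    -AutomationAccountName "aa-blob-monitor" `
    -Name "Check-DailyBlob" `
    -Type PowerShell
```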
Understanding the Code Process:
- The first part of the code authenticates to our Azure environment. You can see an example in the first sketch after this list.
- The second part of the code creates a function that sends the email (‘SendMail’) with the help of SendGrid. In the second sketch after this list, the email is sent from ‘[email protected]’ to ‘[email protected]’, and the sender name is BlobAutoMail. Note: be aware that in the first line you will need to specify the variable this part of the script reads.
- The third part of the code is the most crucial one. It checks whether a specific blob exists and, if it is missing, invokes the ‘SendMail’ function. You can see an example in the third sketch after this list.
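First, the authentication step. This is a minimal sketch that assumes the default ‘AzureRunAsConnection’ created with the Automation Account:

```powershell
# Authenticate to Azure using the Automation Account's Run As connection.
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"

Connect-AzAccount -ServicePrincipal `
    -Tenant $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint | Out-Null
```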
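Second, the ‘SendMail’ function. The sketch below uses the SendGrid v3 API; the Automation variable name and the subject line are placeholders of my own, and the addresses are the (redacted) ones quoted above, so replace them with your own values:

```powershell
# Read the SendGrid API key from an Automation variable (specify your variable name here).
$sendGridApiKey = Get-AutomationVariable -Name "SendGridApiKey"

function SendMail {
    param([string]$Subject, [string]$Body)

    # Replace the placeholder addresses with your real sender and recipient.
    $mail = @{
        personalizations = @(@{ to = @(@{ email = "[email protected]" }) })
        from             = @{ email = "[email protected]"; name = "BlobAutoMail" }
        subject          = $Subject
        content          = @(@{ type = "text/plain"; value = $Body })
    }

    Invoke-RestMethod -Uri "https://api.sendgrid.com/v3/mail/send" `
        -Method Post `
        -Headers @{ Authorization = "Bearer $sendGridApiKey" } `
        -ContentType "application/json" `
        -Body ($mail | ConvertTo-Json -Depth 5)
}
```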
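Third, the blob check itself. The storage account, resource group, and container names below are placeholders:

```powershell
# Build today's expected blob name, e.g. "blob22.12.2020".
$blobName = "blob" + (Get-Date -Format "dd.MM.yyyy")

# Placeholder storage account, resource group, and container names.
$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-monitoring" -Name "mystorageaccount")[0].Value
$context = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key

$blob = Get-AzStorageBlob -Container "daily-blobs" -Blob $blobName -Context $context -ErrorAction SilentlyContinue

if (-not $blob) {
    SendMail -Subject "Missing blob: $blobName" -Body "The blob '$blobName' was not created today."
}
```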
These three steps will allow you to monitor the creation of this blob using Azure Runbook. Now, let’s look at Log Analytics, to see how the process measures up.
Understanding Log Analytics
The second option starts by enabling storage diagnostic logs and exporting them to a Log Analytics workspace.
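Enabling the diagnostic setting can be done in the portal or scripted. A rough sketch with the Az.Monitor module might look like this; the resource IDs and names are placeholders, so check them against your own environment and module version:

```powershell
# Send blob-service diagnostic logs from the storage account to a Log Analytics workspace.
# Both resource IDs below are placeholders.
$storageBlobServiceId = "/subscriptions/<sub-id>/resourceGroups/rg-monitoring/providers/Microsoft.Storage/storageAccounts/mystorageaccount/blobServices/default"
$workspaceId = "/subscriptions/<sub-id>/resourceGroups/rg-monitoring/providers/Microsoft.OperationalInsights/workspaces/law-monitoring"

Set-AzDiagnosticSetting -ResourceId $storageBlobServiceId `
    -WorkspaceId $workspaceId `
    -Category StorageWrite `
    -Enabled $true `
    -Name "blob-logs-to-law"
```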
Then, you write a query in the Log Analytics workspace. If the query returns no results, an Azure Monitor alert is triggered that automatically sends an email. Here is an example of a simulated query, so you can see how it works.
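This sketch assumes today’s blob write shows up in the StorageBlobLogs table as a ‘PutBlob’ operation against a URI containing the blob name; the exact filters are illustrative rather than the article’s original query:

```kusto
// Look for a write of today's blob, e.g. "blob22.12.2020", within the last day.
StorageBlobLogs
| where TimeGenerated > ago(1d)
| where OperationName == "PutBlob"
| where Uri has strcat("blob", format_datetime(now(), "dd.MM.yyyy"))
| summarize count()
```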
It is simple to preconfigure the alert condition logic according to your own business needs, as you can see below.
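If you wanted to script the alert instead of configuring it in the portal, a sketch using the legacy scheduled-query-rule cmdlets in Az.Monitor could look roughly like this. The resource IDs, names, severity, and the 24-hour frequency are placeholders and assumptions, not the article’s configuration:

```powershell
# Placeholder resource IDs for the workspace and an existing action group that sends email.
$workspaceId   = "/subscriptions/<sub-id>/resourceGroups/rg-monitoring/providers/Microsoft.OperationalInsights/workspaces/law-monitoring"
$actionGroupId = "/subscriptions/<sub-id>/resourceGroups/rg-monitoring/providers/microsoft.insights/actionGroups/ag-email"

$query = 'StorageBlobLogs | where TimeGenerated > ago(1d) | where OperationName == "PutBlob" | where Uri has strcat("blob", format_datetime(now(), "dd.MM.yyyy"))'

# Fire when the query returns fewer than 1 result, evaluated once every 24 hours.
$source    = New-AzScheduledQueryRuleSource -Query $query -DataSourceId $workspaceId
$schedule  = New-AzScheduledQueryRuleSchedule -FrequencyInMinutes 1440 -TimeWindowInMinutes 1440
$trigger   = New-AzScheduledQueryRuleTriggerCondition -ThresholdOperator "LessThan" -Threshold 1
$aznsGroup = New-AzScheduledQueryRuleAznsActionGroup -ActionGroup $actionGroupId -EmailSubject "Daily blob is missing"
$action    = New-AzScheduledQueryRuleAlertingAction -AznsAction $aznsGroup -Severity "2" -Trigger $trigger

New-AzScheduledQueryRule -ResourceGroupName "rg-monitoring" -Location "westeurope" `
    -Name "blob-missing-alert" -Enabled $true -Description "Alert when the daily blob is not created" `
    -Schedule $schedule -Source $source -Action $action
```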
Gloves On
Round One: Pricing
Azure Runbook:
Our code does not take long to execute; the runbook should complete within 1–2 minutes, depending on how many blobs you want to monitor. Azure offers 500 minutes of job runtime for free each month, so if this is the only Azure Automation job in your environment, the monitoring will be free: at roughly 2 minutes per day, you use about 60 minutes per month, well within the free allowance. If you exceed the 500-minute limit, you will pay $0.002 per minute of job runtime.
Log Analytics:
To monitor using Log Analytics, you must stream all logs from the specified storage account to a Log Analytics workspace; you cannot stream logs for a specific container only. So, first, you need to take into account that all of your storage account logs will be streamed to the workspace. Azure offers 5 GB of free data ingestion into a Log Analytics workspace per month. If you exceed the limit, you will pay $2.76 per GB.
Remember, you will also need to create an alert rule that runs once a day. This costs $0.50 per log alert rule, per month.
Winner: Azure Runbook
Round Two: Reliability
Azure Runbook:
Because Azure Runbook depends on a script, the runbook may not complete if there are any errors or exceptions. In that case, you will not know whether the blob was created. You can verify that the runbook completed by monitoring Azure Runbook jobs, but this adds another layer of monitoring and overhead to the solution, increasing the complexity and resources required.
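For instance, that extra layer could itself be a script that checks recent job outcomes; a short sketch assuming the Az.Automation module and the placeholder names used earlier:

```powershell
# List failed or suspended runs of the monitoring runbook over the last day (placeholder names).
Get-AzAutomationJob -ResourceGroupName "rg-monitoring" `
    -AutomationAccountName "aa-blob-monitor" `
    -RunbookName "Check-DailyBlob" `
    -StartTime (Get-Date).AddDays(-1) |
    Where-Object { $_.Status -in "Failed", "Suspended" }
```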
Log Analytics:
Log Analytics is a reliable, managed service, so there is little to worry about with this option. As long as you correctly configure the log streaming, alert conditions, and alert actions up front, you can be sure that you will know if the blob was not created.
Winner: Log Analytics
Round Three: Simplicity
Azure Runbook:
To monitor blob existence via Azure Runbook, you must execute the following steps:
- Write the PowerShell script that determines whether the blob exists and triggers an email if it does not.
- Create a runbook.
- Schedule the runbook to run once a day (a scheduling sketch follows below).
While the process is not long to complete, there can be some delays with writing and testing the code itself.
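For reference, linking the runbook to a daily schedule could look roughly like this, again assuming the Az.Automation module; the names and the start time are placeholders:

```powershell
# The runbook must be published before it can be scheduled.
Publish-AzAutomationRunbook -ResourceGroupName "rg-monitoring" `
    -AutomationAccountName "aa-blob-monitor" `
    -Name "Check-DailyBlob"

# Create a daily schedule (starting tomorrow at 00:30, a placeholder time) and link it to the runbook.
New-AzAutomationSchedule -ResourceGroupName "rg-monitoring" `
    -AutomationAccountName "aa-blob-monitor" `
    -Name "daily-blob-check" `
    -StartTime (Get-Date "00:30").AddDays(1) `
    -DayInterval 1

Register-AzAutomationScheduledRunbook -ResourceGroupName "rg-monitoring" `
    -AutomationAccountName "aa-blob-monitor" `
    -RunbookName "Check-DailyBlob" `
    -ScheduleName "daily-blob-check"
```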
Log Analytics:
To monitor blob existence via Log Analytics, the process is as follows:
- Enable the diagnostic setting on the destination storage account and send the logs to your workspace.
- Query the StorageBlobLogs table.
- Create an alert that notifies you if the blob does not exist.
The process is just as easy, and you don’t have to account for delays in writing and testing code, making Log Analytics the winner by a slight margin here.
And the Winner is….
While Azure Runbooks may be the more cost-effective way to monitor the creation of blobs, depending on how much you already use these services, the ease of setup and the reliability of the monitoring itself make Log Analytics the clear winner here.
Have more questions? We’ve got the answers! You can always send us an email with Azure-related questions to [email protected]