
6 Best Practices for Blob Storage on Azure

How much do you know about using blobs on Azure? Here’s the lowdown on how they work, and how to start seeing cost savings immediately.

When should I be using blobs?

Storing data in blobs lets you reliably keep vast amounts of unstructured data on the Azure cloud. Think about use cases such as media files served to users’ browsers, large quantities of log files, or large files for disaster recovery and backup. Best of all, if you do it right, blobs can provide huge cost savings. Here are our 6 top tips for using blobs effectively.

  1.   Start as You Mean to Go On

Who doesn’t like keeping their options open? If you’ve decided to store blobs on the cloud, use a General Purpose v2 (GPv2) storage account or a Blob storage account. If you don’t, you won’t be able to take advantage of advanced features, such as lifecycle management and access tiering, which are available only with these types of storage account.
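
If you prefer to script this step, here’s a minimal sketch using the Python management SDK (azure-mgmt-storage with azure-identity); the subscription ID, resource group, account name, and region below are placeholders you’d replace with your own.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

    # Placeholders: substitute your own subscription, resource group, and account name.
    client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

    poller = client.storage_accounts.begin_create(
        resource_group_name="my-resource-group",
        account_name="myblobaccount",      # must be globally unique
        parameters=StorageAccountCreateParameters(
            sku=Sku(name="Standard_LRS"),
            kind="StorageV2",              # GPv2: enables access tiering and lifecycle management
            location="westeurope",
            access_tier="Hot",             # the default access tier (see tip 3)
        ),
    )
    account = poller.result()
    print(account.name, account.kind)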

  2.   Think about the Right Access Tiers

There are 3 access tiers available: hot, cool, and archive. The cooler the tier, the less you pay for storage, with archive being the cheapest option. However, the cost of accessing the blobs works the other way around: it gets higher as the tier gets cooler. Smart usage of the tiers works something like this:

  • Hot Tier: Use for storing blobs that are accessed the most, for example media files served to end customers that are read or modified very often.
  • Cool Tier: Use for storing blobs that are accessed less frequently, such as log files or reports that are not read regularly. You will be charged for at least 30 days of storage, even if you delete the files sooner.
  • Archive Tier: Use to store blobs that are rarely, if ever, accessed, such as information you need to keep for compliance. You will not be able to access these blobs immediately, so only use this tier if you’re happy to wait several hours for your data to be rehydrated. In addition, you will be charged for a minimum of 180 days of storage, so think carefully about whether this is cost-effective for your business.

Remember, blobs in all three access tiers can coexist in the same account, as the sketch below shows.

[Image: Blob Storage access tiers]
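
To make the tier choices concrete, here’s a minimal sketch using the Python Storage SDK (azure-storage-blob); the connection string, container, and blob names are placeholders.

    from azure.storage.blob import BlobServiceClient, StandardBlobTier

    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("media")  # placeholder container

    # Frequently accessed media goes straight to the hot tier.
    container.upload_blob("video.mp4", data=b"...", overwrite=True,
                          standard_blob_tier=StandardBlobTier.Hot)

    # Rarely read logs go to cool (remember the 30-day minimum storage charge).
    container.upload_blob("app.log", data=b"...", overwrite=True,
                          standard_blob_tier=StandardBlobTier.Cool)

    # Compliance data can be pushed down to archive (180-day minimum, hours to rehydrate).
    container.get_blob_client("audit-2022.csv").set_standard_blob_tier(StandardBlobTier.Archive)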

  3.   Choose Your Default Access Tier Wisely

When you set up a new GPv2 or Blob storage account, you will be asked to define its default access tier. Any blob you upload without specifying a tier is automatically assigned to this default. If you would like to configure the access tier separately for each blob, make sure to set it explicitly under the blob’s Access Tier property.

It’s good practice to set the hot tier as the default, to avoid incurring unexpected charges for accessing your storage; you can always change the default access tier afterwards. Remember, changing the default access tier also changes the tier of every blob that inherits it (i.e. blobs whose tier is inferred rather than set explicitly), and may incur some additional costs.
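
As a rough illustration, here’s how you might check whether a blob is still inheriting the account default and then detach it by setting its tier explicitly (Python, azure-storage-blob; all names are placeholders).

    from azure.storage.blob import BlobServiceClient, StandardBlobTier

    service = BlobServiceClient.from_connection_string("<connection-string>")
    blob = service.get_blob_client(container="media", blob="video.mp4")

    props = blob.get_blob_properties()
    # blob_tier_inferred is True while the blob simply inherits the account default.
    print(props.blob_tier, props.blob_tier_inferred)

    # Setting the Access Tier property explicitly detaches the blob from the default.
    blob.set_standard_blob_tier(StandardBlobTier.Cool)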

  4.   Enable Logging for Your Blobs

Enabling logs will allow you to spot access patterns, which in turn you can use to create smart policies that make your storage more cost-effective. Here’s how to set up logging: you can use the Azure portal, PowerShell, or the Storage SDKs. Note that the logs themselves are billed at the same blob storage prices, so make sure to set up automatic log deletion in the settings so that you’re only paying for what you need.
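
For example, here’s a minimal sketch of enabling classic Storage Analytics logging with a retention policy through the Python Storage SDK (azure-storage-blob); the 7-day retention is just an example value.

    from azure.storage.blob import BlobServiceClient, BlobAnalyticsLogging, RetentionPolicy

    service = BlobServiceClient.from_connection_string("<connection-string>")

    # Log reads, writes, and deletes, and auto-delete the logs after 7 days.
    service.set_service_properties(
        analytics_logging=BlobAnalyticsLogging(
            read=True, write=True, delete=True,
            retention_policy=RetentionPolicy(enabled=True, days=7),
        )
    )
    # The resulting logs land in the $logs system container and are billed like other blobs.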

  5.   Don’t Forget to Set Up Lifecycle Management

These logs can then be used to establish intelligent lifecycle management policies. The lifecycle management tool lets you automatically move blobs between access tiers and/or delete them. For your convenience, you may define several lifecycle management policies; the number of days since the blob’s last modification must be explicitly specified in each one. Strive to define lifecycle management policies as early as possible, based on the access patterns derived from the logs.

Inside a lifecycle management policy, you can add optional filter sets to any rule. For example, ‘blobs in the fileLogs container whose names start with myappLogs should be removed 1 week after the last modification.’ Use the prefix in the filter set to include any subfolders that you want covered by the rule, and remember that you can specify several filters in a single rule. If several filters are defined on a rule, Azure applies a logical AND to all of them.

Remember that you can’t apply rules to system containers such as $logs or $web.
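
Here’s a minimal sketch of the example rule above expressed as a lifecycle policy and applied with the Python management SDK (azure-mgmt-storage); the resource group and account names are placeholders, and the same rule body can be applied through the portal or CLI instead.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient

    client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

    policy = {
        "policy": {
            "rules": [{
                "enabled": True,
                "name": "delete-old-app-logs",
                "type": "Lifecycle",
                "definition": {
                    "filters": {
                        "blobTypes": ["blockBlob"],
                        # Prefix = container plus any subfolder path you want covered.
                        "prefixMatch": ["fileLogs/myappLogs"],
                    },
                    "actions": {
                        # Delete one week after the blob was last modified.
                        "baseBlob": {"delete": {"daysAfterModificationGreaterThan": 7}}
                    },
                },
            }]
        }
    }

    client.management_policies.create_or_update(
        "my-resource-group", "myblobaccount", "default", policy
    )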

  6.   Recognize When You Need to Use Premium Storage Accounts on Azure

Still finding the costs of your hot tier more than you can manage? If your data is accessed very frequently, you might benefit from creating a separate Premium storage account and moving the most frequently accessed blobs there. Storage costs there are higher than for hot tier blobs, but operations will cost you less. It’s important to note that access tiering isn’t available in Premium storage accounts; however, you can still use lifecycle management with its advanced filters. In general, after analyzing the workload, you can use the Azure Pricing Calculator to compare the different scenarios and work out which is best for your organizational needs.
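
If you do go this route, moving blobs means copying them into the new account. Here’s a minimal sketch using azure-storage-blob, assuming the source blob URL is readable by the destination (for example via a SAS token, omitted here); all connection strings and names are placeholders.

    from azure.storage.blob import BlobServiceClient

    standard = BlobServiceClient.from_connection_string("<standard-account-connection-string>")
    premium = BlobServiceClient.from_connection_string("<premium-account-connection-string>")

    source = standard.get_blob_client(container="media", blob="video.mp4")
    destination = premium.get_blob_client(container="media", blob="video.mp4")

    # Server-side copy into the premium (BlockBlobStorage) account.
    destination.start_copy_from_url(source.url)
    print(destination.get_blob_properties().copy.status)  # poll until 'success'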

Have more questions? We’ve got the answers! You can always email your Azure-related questions to [email protected]