Monthly Archives: November 2017

Azure Blob Storage – Part 2 (WHAT and WHERE) 

This is the second part of Azure Blob Storage series.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE)
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

In Part 1 we discussed why we should reconsider our storage strategies to be ready for ever-growing data and fickle-minded users with very short attention spans. You want your storage to be secure, reliable, fast, scalable, and easy to manage.

Let's start with the prerequisite steps we need before we can expose a blob container as a service to clients.

 Create a storage account in the Azure portal. 

  1. Active Azure subscription:  You will need an active Azure subscription with enough credit, or you can start with a free trial.

2. Create a storage account:  Refer to the link to learn how to create a storage account. Remember to make informed choices for the parameters below while creating it.

  • Deployment model: Either use Resource Manager templates or the classic deployment model. Choose Resource Manager for this example.
  • Account Kind: 
    • General Purpose – Can host files, blobs, tables, and queues. 
    • Blob Storage – Offers features like hot/cold access tiers to optimize storage cost according to each item's usage pattern.
  • Performance: 
    • Standard – Backed by hard disk drives (HDD). Stick to Standard for lower environments.
    • Premium – Backed by solid-state drives (SSD) for faster access and better performance. Use this for production.
  • Replication: Replication determines how your data is copied for durability and availability. Analyze the cost-effectiveness of your choice. This MSDN link is good for making an informed choice.
  • Subscription: If you have multiple Azure subscriptions, choose the one to use for this account.
  • Resource Group: Think of these as lifecycle boundaries. When multiple Azure resources share a group, their lifecycle (create, update, etc.) is managed in an integrated way. For demo purposes you can create a new one or use any existing one.

[Screenshot: storage account overview]

  • After creating your storage account, it will look like the screenshot above. The overview is at the account level, which hosts all the storage services: Blobs, Files, Tables, and Queues.
  • Note the "Access Keys" link on the left, which is important. This section contains a ConnectionString, made up of the account name and a key (primary and secondary). Because this key is maintained at the account level, it is not a very effective access policy when a user should access blobs ONLY. We will come back to this problem in Part 4 on access control policies.
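To illustrate, the ConnectionString shown under Access Keys is a set of semicolon-separated key/value pairs. Below is a minimal sketch of pulling the account name out of one; the connection string value is a made-up placeholder, not a real key.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class ConnectionStringDemo
{
    // Split "Key=Value" pairs on ';', treating only the first '=' in each pair
    // as the separator, because the AccountKey itself may end with '=' padding.
    public static Dictionary<string, string> Parse(string connectionString) =>
        connectionString
            .Split(';')
            .Select(pair => pair.Split(new[] { '=' }, 2))
            .ToDictionary(kv => kv[0], kv => kv[1]);

    static void Main()
    {
        // Placeholder value; a real one comes from the portal's Access Keys blade.
        var cs = "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=abc123==;EndpointSuffix=core.windows.net";

        var parts = Parse(cs);
        Console.WriteLine(parts["AccountName"]); // prints "mystorageaccount"
    }
}
```

This is only to show the shape of the string; in real code you would hand the whole connection string to the storage client library rather than parse it yourself.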

3. Click the Blobs link. You should see the screen below.

[Screenshot: Blob container]

You can create a new container if you wish to. I got a container by default from my subscription.

You are all set to start uploading files to this blob store. You can also use the portal to upload files manually. Once a few files are uploaded, you should see a listing like the one below.

[Screenshot: Blob store file listing]



So far we have covered:

  • WHY to use Blob as storage option
  • WHAT we can do with it.
  • WHERE to procure the necessary infrastructure. [Azure]

We have not written any code yet. In Part 3 we will walk through the code to achieve the following.

  1. Upload a file to blob storage.
  2. Download a file from blob storage.

Access control policies will be discussed in part 4.

Azure Blob Storage – Part 3 (HOW)    —  Coming shortly



Azure Blob Storage – Part 1 (WHY should you use it)

Azure has come a long way in offering reliable storage options as cloud services. In this series of articles we will go into the details in the following order.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE) 
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

Even though I am focusing mainly on Blob storage in particular, we will first analyze the need for Azure storage and then cover Blob in depth.

Azure offers the following storage options as a service.

  1. Blob
  2. File
  3. Disk
  4. Queue
  5. Table

[Conceptual diagram of Azure storage services]


WHY:  Let's see why we should revisit the conventional storage methods that served us well earlier.

  • Data growth is exponential in ever-growing social media platforms and streaming services.
  • Scalability. You may not want it today, but you want to be ready for the future.
  • Simplifying storage infrastructure management, such as backup and site recovery.
  • Simplifying user access management for stored resources, and audit compliance.
  • Faster access to large data over the web, using redundancy and SSDs.

Traditional file shares were useful, but they came with many vulnerabilities. Maintenance and user-access audits were time-consuming tasks. Once a user or a service account had access to a file share, there was no way to control the duration of that access.

Azure, on the other hand, takes care of most of the maintenance and other critical tasks, letting us focus on the business problem rather than the operational aspects of infrastructure.

Here is a quick comparison snapshot of storage options as per the Microsoft documentation.

[Comparison table of Azure storage options]

Even though the comparison chart above is self-explanatory, in my personal experience I have observed the use cases below.

  • Upload/download large documents of different formats to a server. 

    • If this is a new project and you have analyzed the cost aspects of storage, Blob is the easiest to implement and provides benefits like redundancy, security, and time-controlled access.


  • Migrate an existing file share to Azure storage with minimal or no code change (lift and shift). 
    • In such cases Azure File Share fits better. You can even map a directory on a middle-tier VM (on IaaS) and access it as a regular network file share with no code changes.


  • Process large files in a distributed system, where durability and decoupling are more important.
    • Choose Queues.


  • Store unstructured (NoSQL) data that can be quickly accessed across regions.

    • Use Tables.


Azure provides an extensive range of tooling for different audiences, such as PowerShell, the Azure CLI, and the Import/Export service.

Now that the basic context has been set, we can dive deeper into Blob storage.

The first thing you need is an Azure subscription, under which these storage services will be provisioned. Refer to the link below for the steps to create a storage account.

Create a Storage account with Azure subscription. 

Blob Storage:

A blob can be thought of as any file (assume it is binary), in any format such as .xls, .ppt, etc. Blobs reside inside containers. We will learn more about containers in Part 2.
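As an illustration of the account → container → blob hierarchy, each blob is addressable by a URL that follows a well-known pattern. The account, container, and file names below are hypothetical:

```csharp
using System;

static class BlobUriDemo
{
    // Builds the public URI of a blob from its account, container, and name.
    // Pattern: https://<account>.blob.core.windows.net/<container>/<blob>
    public static string BlobUri(string account, string container, string blobName) =>
        $"https://{account}.blob.core.windows.net/{container}/{blobName}";

    static void Main()
    {
        // Hypothetical names for illustration only.
        Console.WriteLine(BlobUri("mystorageaccount", "documents", "report.xlsx"));
        // prints "https://mystorageaccount.blob.core.windows.net/documents/report.xlsx"
    }
}
```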

Blob Storage Types

  • Block blobs :  The most commonly used type, for general files. Up to ~4.7 TB.

  • Page blobs:    Used for random access to large files, like VHDs. This is how Azure manages disk files for itself.

  • Append blobs:  Made up of blocks like block blobs, but optimized for appending information to an existing blob, such as logging information.

I hope this gives you a good start; we will go deeper in Part 2.

Azure Blob Storage – Part 2 (WHAT and WHERE)


Access configuration with .Net Core

.Net Core is an open-source framework developed on GitHub. For those who are new to it, .Net Core is a paradigm shift from the previous releases (in a good way). Its web stack embraces the ideas of OWIN (Open Web Interface for .NET).

In the previous frameworks, we had an app.config file (e.g., for a console app) where we configured various key/value pairs apart from some project-specific settings. In a .Net Core (2.0) console app, we don't get a .config file by default.

The steps below will help you read config values from config files in .Net Core.

Reading AppSettings using the previous System.Configuration.ConfigurationManager class
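The classic pattern looked roughly like this; the XML fragment and the key name are illustrative, not from a real project:

```csharp
// app.config (classic .NET Framework):
// <configuration>
//   <appSettings>
//     <add key="EndpointUrl" value="https://localhost/443" />
//   </appSettings>
// </configuration>
using System;
using System.Configuration; // requires a reference to System.Configuration.dll

class ClassicConfigDemo
{
    static void Main()
    {
        // Returns the value from app.config, or null when the key is absent.
        string endpoint = ConfigurationManager.AppSettings["EndpointUrl"];
        Console.WriteLine(endpoint ?? "(not configured)");
    }
}
```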


This won't work in .Net Core, as System.Configuration is not available for Core yet (as of Core 2.0). Instead, do the following.

  • Create a console app with .Net Core 2.0.
  • Create an appsettings.json file and add it to the root of the project.
  • Add the desired key/value settings in this file.

{
  "docDbSettings": {
    "EndpointUrl": "https://localhost/443",
    "AuthKey": "134KJKJSHDKFJL###LKJSLKFGK;E 12934903459-@@@0983459+==",
    "DatabaseId": "ToDoList",
    "CollectionId": "Items"
  }
}

  • Right-click this json file and set its "Copy to Output Directory" property to "Copy always". This ensures the file is present in the output under 'bin\Debug\netcoreapp2.0'.
  • Install the below nuget packages
    • Microsoft.Extensions.Configuration
    • Microsoft.Extensions.Configuration.FileExtensions
    • Microsoft.Extensions.Configuration.Json
  • Below is the code to set up ConfigurationBuilder, which is the equivalent of the previous ConfigurationManager class, to read the keys.

private static IConfiguration Configuration;

static void Main(string[] args)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())   // requires using System.IO;
        .AddJsonFile("appsettings.json");

    Configuration = builder.Build();

    // read the keys (the indexer already returns a string)
    var Endpoint = Configuration["docDbSettings:EndpointUrl"];
    var Key = Configuration["docDbSettings:AuthKey"];
    var DatabaseId = Configuration["docDbSettings:DatabaseId"];
    var CollectionId = Configuration["docDbSettings:CollectionId"];
}
  • Notice the traversal of nested keys supported within the json structure, using ':' as the separator. It even supports accessing array elements by index, as in Configuration["superadmins:0:Name"], if the value is an array like the one below.


"superadmins": [
  {
    "Name": "XYZ",
    "email": ""
  },
  {
    "Name": "XYZ",
    "email": ""
  }
]
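A quick sketch of reading those indexed keys end to end, assuming the same three configuration packages listed above are installed; the file name and values are illustrative:

```csharp
using System;
using System.IO;
using Microsoft.Extensions.Configuration;

class ArrayConfigDemo
{
    // Builds an IConfiguration from a single json file on disk.
    public static IConfiguration Load(string path)
    {
        return new ConfigurationBuilder()
            .SetBasePath(Path.GetDirectoryName(Path.GetFullPath(path)))
            .AddJsonFile(Path.GetFileName(path))
            .Build();
    }

    static void Main()
    {
        // Write a sample file so the snippet is self-contained.
        File.WriteAllText("superadmins.json",
            "{ \"superadmins\": [ { \"Name\": \"XYZ\", \"email\": \"\" }, { \"Name\": \"ABC\", \"email\": \"\" } ] }");

        var config = Load("superadmins.json");

        // Array elements are addressed by numeric index segments.
        Console.WriteLine(config["superadmins:0:Name"]); // prints "XYZ"
        Console.WriteLine(config["superadmins:1:Name"]); // prints "ABC"
    }
}
```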



You can get more details from the Microsoft documentation here.

You can also read an excellent article from Scott Hanselman on how to better secure config files containing connection strings and other keys in a cloud and CI/CD-ready development world.


Hope it helps!