Azure Blob Storage – Part 2 (WHAT and WHERE) 

This is the second part of the Azure Blob Storage series.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE)
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

In part 1 we discussed why we should reconsider our storage strategy to be ready for ever-growing data and fickle-minded users with very short attention spans. You want your storage to be secure, reliable, fast, scalable and easy to manage.

Let's start with the prerequisite steps we need before we can expose a blob container as a service to clients.

 Create a storage account in the Azure portal. 

  1. Active Azure subscription: You will need an active Azure subscription with enough credit, or you can start with a free trial.

2. Create a storage account: Please refer to the link to learn how to create a storage account. Remember to make informed choices for the parameters below while creating it.

  • Deployment model: Either use Resource Manager templates or the classic deployment model. Choose Resource Manager for this example.
  • Account kind: 
    • General purpose – Can host files, blobs, tables and queues. 
    • Blob storage – Offers access tiers (hot/cool) to optimize cost according to each item's usage patterns.
  • Performance: 
    • Standard – Backed by hard disk drives (HDD). Stick to standard for lower environments.
    • Premium – Backed by solid-state drives (SSD) for faster access and hence better performance. Use this for production.
  • Replication: Replication determines how many copies of your data are kept and where, which drives your durability and availability targets (LRS, ZRS, GRS, RA-GRS). Analyze the cost-effectiveness of your choice. This MSDN link is good for making an informed choice.
  • Subscription: If you have multiple Azure subscriptions, choose the one this account should be billed against.
  • Resource group: Think of these as lifecycle boundaries. When multiple Azure resources share a group, their lifecycle (create, update, delete etc.) is managed in an integrated way. For this demo you can create a new one or use any existing one.

storage account

  • After creation, your storage account will look like the screenshot above. Note that the overview is at the account level, which hosts all the storage services: Blobs, Files, Tables and Queues.
  • Also note the important "Access keys" link on the left. This section contains a connection string, made up of the account name and a key (primary and secondary). Since these keys are maintained at the account level, they are not very effective as an access policy when a user should access blobs ONLY. We will come back to this problem in part 4 on access control policies.
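To give a taste of how that connection string is consumed from code, here is a minimal sketch using the classic WindowsAzure.Storage client library; the account name, key and container name below are placeholders, not real values:

```csharp
using Microsoft.WindowsAzure.Storage;       // NuGet package: WindowsAzure.Storage
using Microsoft.WindowsAzure.Storage.Blob;

class ConnectSample
{
    static void Main()
    {
        // Account-level connection string as shown under "Access keys"
        // (placeholder credentials - substitute your own).
        var connectionString =
            "DefaultEndpointsProtocol=https;AccountName=mydemoaccount;AccountKey=<your-key>";

        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);

        // This client can reach every service in the account (blobs, files,
        // tables, queues) - the coarse-grained access problem noted above.
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
    }
}
```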

3. Click the Blobs link. You should see the screen below.

Blob Container

You can create a new container if you wish. I got a container by default with my subscription.

You are all set to start uploading files into this blob store. You can also use the portal to upload files manually. Once you have a few files uploaded, you should see a listing like the one below.

Blob store

 

 

So far we have covered:

  • WHY to use Blob as storage option
  • WHAT we can do with it.
  • WHERE to procure the necessary infrastructure. [Azure]

We have not written any code yet. In part 3 we will walk through the code to achieve the following:

  1. Upload a file to blob storage.
  2. Download a file from blob storage.

Access control policies will be discussed in part 4.

Azure Blob Storage – Part 3 (HOW)    —  Coming shortly

 


Azure Blob Storage – Part 1 (WHY should you use it)

Azure has come a long way in offering reliable storage options as cloud services. In this series of articles we will go into the details in the order below.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE) 
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

Even though I am focusing on Blob storage in particular, we will first analyze the need to choose Azure storage and then cover Blob in depth.

Azure offers the storage options below as services.

  1. Blob
  2. File
  3. Disk
  4. Queue
  5. Table

Conceptual Diagram

AzureStorage

WHY:  Let's see why we should revisit the conventional storage methods that served us well earlier.

  • Data growth is exponential with ever-growing social media platforms and streaming services.
  • Scalability. You may not need it today, but you want to be ready for the future.
  • Simplified storage infrastructure management, such as backup and site recovery.
  • Simplified user access management for stored resources, and audit compliance.
  • Faster access to large data over the web, using redundancy and SSDs.

Traditional file shares were useful, but they came with many vulnerabilities. Maintenance and user access audits were time-consuming tasks, and once a user or a service account had access to a file share, there was no way to control the duration of that access.

Azure, on the other hand, takes care of most of the maintenance and other critical tasks, leaving us to focus on the business problem rather than the operational aspects of the infrastructure.

Here is a quick comparison snapshot of the storage options, as per the MS documentation.

StorageComparison.png

 

Even though the comparison chart above is self-explanatory, in my personal experience I have observed the use cases below.

  • Upload/download large documents of different formats to a server. 

    • If this is a new project and you have analyzed the cost aspects of storage, Blob is the easiest option to implement and provides benefits like redundancy, security and time-controlled access.

 

  • Migrate an existing file share to Azure storage with minimal or no code change (lift and shift). 
    • In such cases Azure File Share fits better. You can even mount it as a directory on a mid-tier VM (on IaaS) and access it as a regular network file share with no code changes.

 

  • Process large files in a distributed system, where durability and decoupling are most important.
    • Choose Queues (typically the queue message carries a reference to the file, since queue messages themselves are small).

 

  • Store unstructured (NoSQL) data that can be quickly accessed across regions.

    • Use Tables.

 

Azure provides an extensive range of tooling for different audiences, such as PowerShell, the CLI and the Import/Export service.

Now that the basic context has been set, we can drill into Blob storage.

The first thing you need is an Azure subscription, under which these storage services will be provisioned. Refer to the link below for the steps to create a storage account.

Create a Storage account with Azure subscription. 

Blob Storage:

A blob can store any file as binary content, in any format (.xls, .ppt, etc.). Blobs reside inside containers. We will learn more about containers in Part 2.

Blob Storage Types

  • Block blobs:  The most common type, used for ordinary files. Up to ~4.7 TB.

  • Page blobs:   Used for random access to large files, like VHDs. This is how Azure manages files for itself.

  • Append blobs: Made up of blocks like block blobs, but optimized for appending information to an existing blob, such as logging.
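To make the three types concrete, here is a small sketch showing how each surfaces in the classic WindowsAzure.Storage client library; the container parameter and blob names are placeholders for illustration:

```csharp
using Microsoft.WindowsAzure.Storage.Blob;  // NuGet package: WindowsAzure.Storage

static class BlobTypesSample
{
    // Given a container reference (covered in Part 2),
    // each blob type has its own reference method:
    static void ShowBlobTypes(CloudBlobContainer container)
    {
        // Block blob: ordinary files such as documents and images.
        CloudBlockBlob block = container.GetBlockBlobReference("report.xls");

        // Page blob: random-access files such as VHDs.
        CloudPageBlob page = container.GetPageBlobReference("disk.vhd");

        // Append blob: block-based, optimized for append-only data like logs.
        CloudAppendBlob append = container.GetAppendBlobReference("app.log");
    }
}
```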

I hope this gives you a good start; we will go deeper in Part 2.

Azure Blob Storage – Part 2 (WHAT and WHERE)

 

Access configuration with .Net Core

.Net Core is an open source framework developed on GitHub. For those who are new to it, .Net Core is a paradigm shift from previous releases (in a good way), embracing OWIN (the Open Web Interface for .NET).

In previous frameworks we had an app.config/web.config file (e.g. for a console app) where we configured various key-value pairs apart from some project-specific settings. In a .Net Core (2.0) console app, we don't get a .config file by default.

The steps below will help you read config values from config files in .Net Core.

Reading AppSettings used to be done with the System.Configuration.ConfigurationManager class:

ConfigurationManager.AppSettings["key"]

This won't work in .Net Core, as System.Configuration is not available for Core yet (as of Core 2.0). You need to do the following.

  • Create a Console App with .Net Core 2.0
  • Create an appsettings.json file and add it to the root of the project.
  • Add desired key value settings in this file.

{
  "docDbSettings": {
    "EndpointUrl": "https://localhost/443",
    "AuthKey": "134KJKJSHDKFJL###LKJSLKFGK;E 12934903459-@@@0983459+==",
    "DatabaseId": "ToDoList",
    "CollectionId": "Items"
  }
}

  • Right-click on this JSON file and set "Copy to Output Directory" to "Copy always". This ensures the file is present under 'bin\Debug\netcoreapp2.0'.
  • Install the below nuget packages
    • Microsoft.Extensions.Configuration
    • Microsoft.Extensions.Configuration.FileExtensions
    • Microsoft.Extensions.Configuration.Json
  • Below is the code to set up ConfigurationBuilder, the equivalent of the previous ConfigurationManager class, to read the keys.
using System.IO;
using Microsoft.Extensions.Configuration;

private static IConfiguration Configuration;

static void Main(string[] args)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json");

    Configuration = builder.Build();

    // Read the keys; the indexer already returns a string,
    // and colons traverse the JSON hierarchy.
    var endpoint     = Configuration["docDbSettings:EndpointUrl"];
    var key          = Configuration["docDbSettings:AuthKey"];
    var databaseId   = Configuration["docDbSettings:DatabaseId"];
    var collectionId = Configuration["docDbSettings:CollectionId"];
}
  • Notice the hierarchy traversal supported within the JSON structure via the colon separator. It even supports accessing array elements by index, like ["superadmins:0:Name"], if the value is an array like the one below.

{
  "superadmins": [
    {
      "Name": "XYZ",
      "email": "abc@text.com"
    },
    {
      "Name": "XYZ",
      "email": "abc@text.com"
    }
  ]
}
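A short sketch of reading those nested and indexed keys, reusing the Configuration object built earlier (key names taken from the sample JSON above):

```csharp
// Array elements are addressed with numeric index segments.
var firstAdminName  = Configuration["superadmins:0:Name"];
var secondAdminMail = Configuration["superadmins:1:email"];

// Sections can also be enumerated.
foreach (var admin in Configuration.GetSection("superadmins").GetChildren())
{
    Console.WriteLine($"{admin["Name"]} <{admin["email"]}>");
}
```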

You can get more details from the Microsoft documentation here.

You can also read an excellent article from Scott Hanselman on how to better secure config files containing connection strings and other keys in a cloud- and CI/CD-ready development world.

 

Hope it helps!

 

Execute a SQL query as a string using sp_executesql and store the result in a variable

A small but useful tip for SQL Server 2012.

Here is a simple example of how to execute a SQL query held in a string and store the result in a variable for further use.

declare @query nvarchar(1000), @result datetime

-- the query assigns into an OUTPUT parameter
set @query = N'select @result = GETDATE()'

-- the second argument declares the parameters used inside @query
EXEC sp_executesql @query,
     N'@result datetime OUTPUT',
     @result OUTPUT

print CAST(@result as nvarchar(100))

How to open partial view as modal dialog in asp.net mvc

Asp.Net MVC is picking up fast, and yet there is not much expertise around it. The main reason, I feel, is that the developer now has complete control over the markup, so everyone approaches it in different ways. It is difficult for a beginner to implement any design principles.

Anyway, today I am going to show how to open a partial view dynamically as a popup using jQuery. I am hoping you have a basic idea of how MVC works and that you know the jQuery syntax.

View code: main page ProductView.cshtml. Click on the image to see the .cshtml code.
wordpress

Partial view HTML for the list.
code2

The classic view of the ProductList page (it is also a partial view, which renders as a grid) has edit/delete links in each row. When you click an edit link in the grid, the updateDialog div opens as a modal popup and loads a partial view into that div dynamically using the code below.

The controller action method that returns the partial view to be loaded into the updateDialog div:

public ActionResult EditPartialGet(int id)
{
    var productVM = new ProductViewModel();
    Product p = productVM.GetById(id);
    return PartialView("ProductUpdatePartial", p);
}

Finally, the jQuery code to open the popup and load the partial view via an AJAX call:

$(function () {
    // On click of the edit link in the grid.
    // Note: .live() was removed in jQuery 1.9; use delegated .on() instead.
    $(document).on('click', '#editLink', function () {

        var hrefValue = $(this).attr("href");
        var index = hrefValue.lastIndexOf("/") + 1;
        var productId = hrefValue.substring(index, hrefValue.length);

        // This calls the EditPartialGet action method in the Products
        // controller, which returns the partial view HTML.
        $.ajax({
            type: "GET",
            url: "Products/EditPartialGet?id=" + productId,
            dataType: "html",
            success: function (data) {
                // Returned data is an HTML view rendered inside the div.
                $("#updateDialog").html(data);
            },
            error: function (a, b, c) { alert(b); }
        });

        $("#updateDialog").dialog({
            modal: true,
            buttons: {
                // A Save button is created in the modal dialog popup;
                // write what you want to do here.
                Save: function () {
                    var ctx = $("#updateDialog");

                    // Parse the DOM within the context of the div.
                    var product = {
                        ProductID: $("#ProductID", ctx).val(),
                        ProductName: $("#ProductName", ctx).val(),
                        SupplierID: $("#SupplierID", ctx).val(),
                        CategoryID: $("#CategoryID", ctx).val(),
                        QuantityPerUnit: $("#QuantityPerUnit", ctx).val(),
                        UnitPrice: $("#UnitPrice", ctx).val(),
                        UnitsInStock: $("#UnitsInStock", ctx).val(),
                        UnitsOnOrder: $("#UnitsOnOrder", ctx).val(),
                        ReorderLevel: $("#ReorderLevel", ctx).val(),
                        Discontinued: $("#Discontinued", ctx).prop("checked")
                    };

                    // Submit the data.
                    $.ajax({
                        type: "POST",
                        url: "Products/Update",
                        data: JSON.stringify(product),
                        dataType: "json",
                        contentType: "application/json",
                        success: function (data) {
                            alert('done');
                        },
                        error: function (a, b, c) { alert(b); }
                    });

                    $("#updateDialog").dialog('close');
                    return false;
                },
                Cancel: function () { $("#updateDialog").dialog('close'); }
            }
        });
        return false;
    });
});

I think the code above is all you need to implement dynamic loading of a partial view into a div.

I hope it helps.