Feature Management – Azure App Configuration Service

Recently I was working on a project where we needed to release new features to a specific set of users based on custom settings. While looking for alternatives, we found that Azure offers a fully managed service for exactly this:

A universal storage for all your Azure apps. Manage configurations effectively and reliably, in real time, without affecting customers by avoiding time-consuming redeployments. Azure App Configuration Preview is built for speed, scalability and security.

 

In this article I will explain WHY, WHAT and HOWs of using “Azure App Configuration” as a service.

Scenarios

If you are working on enterprise applications, your team may regularly need to pilot/flag new features dynamically without deploying code, primarily to isolate configuration deployments from code deployments. A few examples:

  • Use Case 1: Enable a feature only during the Black Friday sale [start time and end time are fixed]
  • Use Case 2: Enable a feature only for N% of incoming requests
  • Use Case 3: Enable a feature only for configurable user attributes like their “email/country” etc.
  • Use Case 4: Enable a feature only if ”somefeaturekey” value is “true”
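For context, App Configuration stores each feature flag as a key-value whose key is prefixed with `.appconfig.featureflag/` and whose value is a small JSON document. A flag for Use Case 1 might look roughly like the sketch below; the filter name `Microsoft.TimeWindow` and its `Start`/`End` parameters are the built-in time-window filter, but treat the exact shape here as illustrative rather than authoritative.

```
{
  "id": "BlackFridayBanner",
  "enabled": true,
  "conditions": {
    "client_filters": [
      {
        "name": "Microsoft.TimeWindow",
        "parameters": {
          "Start": "Fri, 29 Nov 2019 00:00:00 GMT",
          "End": "Sat, 30 Nov 2019 00:00:00 GMT"
        }
      }
    ]
  }
}
```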

Why Azure App Configuration Service?

  1. It is a fully managed Azure service.
  2. It’s in preview as of today and currently free. GA should be out by mid-October’19.
  3. Unified feature-management configuration as a service.
  4. Unified app-settings configuration as a service.
  5. The Azure portal has a UI with RBAC support by default.
  6. Disaster recovery is enabled.
  7. Maintains a history of key-value changes.
  8. Managed Service Identity (MSI) support is enabled.
  9. Supports Event Grid triggers/webhooks when any setting is changed. Amazing, right?
  10. Supports the Secret Manager tool to work efficiently in development mode.
  11. Works with clients like ASP.NET Core, .NET Framework, Function Apps and Java Spring as of today.

Get Started with App Configuration: Feature Manager

  1. Go to the Azure portal and search for “App Configuration”.
  2. Create a new App Configuration store. Follow the simple steps and wait for it to spin up.
  3. You should see the below blade of options. If you are familiar with Azure offerings, each section is self-explanatory.

blog-image-1

4. Note down its connection string (prefer the read-only connection strings) from the “Access Keys” section. You will need it to connect from any service.

5. Go to the “Feature Manager” section. It looks like below.

blog-image-2.png

6. Click on Add.

7. Add a new feature key “ShowNewNotifications” and set it to true.

8. Ignore the “Add Filter” for now. It’s used to add more custom conditions. We will come back to it later.

9. Click on Apply. You should see

blog-image-3

Feature Key is all set for your client to start consuming it.

Asp.Net Core Client – Simple Feature Toggle

While it works with clients like Function Apps and Java Spring, I am demonstrating with an ASP.NET Web Application (WebApi) project targeting .NET Framework 4.6.2, using ASP.NET Core 2.2 NuGet packages for web capabilities.

 

  1. Set up the required NuGet packages
    • Microsoft.Extensions.Configuration.AzureAppConfiguration 1.0.0-preview-008920001-990
    • Microsoft.Extensions.Configuration 2.2.0
    • Microsoft.FeatureManagement.AspNetCore 1.0.0-preview-009000001-1251
    • Microsoft.FeatureManagement 1.0.0-preview-009000001-1251
  2. Initialize the App Configuration service by wiring it up in Startup like below.

public class Startup
{
    public IConfiguration Configuration { get; }

    public Startup(IHostingEnvironment env)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

        // Add the App Configuration provider BEFORE calling Build(),
        // otherwise its values (and feature flags) won't be part of the configuration.
        builder.AddAzureAppConfiguration(options =>
            options.Connect("AppConfigurationConnectionString_YouCopiedFromPortal")
                   .UseFeatureFlags());

        Configuration = builder.Build();
    }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddFeatureManagement()
            .AddFeatureFilter<PercentageFilter>()   // built-in filter: enable a feature for only N% of incoming requests
            .AddFeatureFilter<TimeWindowFilter>()   // built-in filter: enable a feature only during a time window
            .AddFeatureFilter<PremierEmailRuleFeatureFilter>(); // custom filter: enable a feature only when your specific logic returns true; discussed later in this article
    }
}

That’s all the wiring you need to start reading your feature flag settings from App Configuration service.

3. Create an API controller which reads the feature flag “ShowNewNotifications” we set up above.

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.FeatureManagement;

namespace TestCoreWebApp.Controllers
{
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        private readonly IFeatureManager _featureManager;
        private readonly IConfiguration _config;

        public ValuesController(IFeatureManager featureManager, IConfiguration config)
        {
            _featureManager = featureManager;
            _config = config;
        }

        public string Get()
        {
            // FeatureFlagConstants is a simple constants class holding the key name "ShowNewNotifications"
            var isEnabled = _featureManager.IsEnabled(FeatureFlagConstants.ShowNewNotifications);
            return $"Feature key ShowNewNotifications is set to {isEnabled}";
        }
    }
}

4. Run the project in debug mode (hit F5).

5. You will see that the _featureManager.IsEnabled call returns true, reading the value from the App Configuration portal.

That’s how easy it is: inject IFeatureManager into your service code and you can enable features dynamically.

 

Add Conditional Feature Filter

While it’s really simple to access on/off feature keys using _featureManager.IsEnabled("keyname"), more often we need custom logic to execute before a feature is enabled.

We will write a custom filter that turns a feature ON only when a certain email id is listed in a filter list. This feature flag decides whether we need to send notifications to that email or not.

  1. Go to azure portal -> App Configuration -> Feature Manager section
  2. Click on Add new feature.
  3. You should see the screen below if you click the radio button “On/Conditional”. It means: if the condition is true, the feature is enabled; otherwise it is always disabled.
  4. blog-image-4
  5. Click on Add Filter. Type filter name “PremierEmailRule”.
  6. Click on the ellipsis next to the filter name to add parameters to this filter.
  7. Add Parameter name “AllowedEmailList”.
  8. Add Value to this parameter like test1@contoso.com,test2@contoso.com
  9. Please note that the value is just a string of comma-separated email ids.
  10. Click Apply to Parameters.
  11. Click Apply to Feature blade as well.
  12. You will see this new custom filter listed like this
  13. blog-image-5
  14. You can verify by clicking Edit if you want.

Your custom feature is all set up. Let’s go to the API controller to consume it.

Write your own Custom Feature Filter

  1. Create a class which implements IFeatureFilter.
  2. Please read the comments in the code below.
  3. using System;
    using System.Linq;
    using Microsoft.AspNetCore.Http;
    using Microsoft.Extensions.Primitives;
    using Microsoft.FeatureManagement;

    // This alias must match the filter name in the portal.
    [FilterAlias("PremierEmailRule")]
    public class PremierEmailRuleFeatureFilter : IFeatureFilter
    {
        // Used to access HttpContext attributes like user, claims, request params etc.
        private readonly IHttpContextAccessor _httpContextAccessor;

        public PremierEmailRuleFeatureFilter(IHttpContextAccessor httpContextAccessor)
        {
            _httpContextAccessor = httpContextAccessor;
        }

        /// <summary>
        /// Executes when _featureManager.IsEnabled(featureName) is called
        /// and Azure App Configuration has a matching filter configured.
        /// </summary>
        public bool Evaluate(FeatureFilterEvaluationContext context)
        {
            // context carries all parameters we set up in the portal (names and values).
            // PremierEmailRuleFilterSettings is the POCO defined below.
            var featureFilterParams = context.Parameters.Get<PremierEmailRuleFilterSettings>();

            // Retrieve the current user's email to find out whether they are listed for notifications.
            _httpContextAccessor.HttpContext.Request.Query.TryGetValue("emailIdFromQueryString", out StringValues queryVal);

            // Enable the feature only if the email appears in the allowed list.
            var isEnabled = featureFilterParams.AllowedEmailList
                .Split(',')
                .Contains(queryVal.FirstOrDefault(), StringComparer.OrdinalIgnoreCase);

            return isEnabled;
        }
    }

    // A simple POCO; name it anything you like.
    public class PremierEmailRuleFilterSettings
    {
        // This property name must match the parameter name set in the portal. Refer to the screenshots above.
        public string AllowedEmailList { get; set; }
    }

  4. Register all such custom filters in the Startup ConfigureServices method like this:
  5. services.AddFeatureManagement()
        .AddFeatureFilter<PremierEmailRuleFeatureFilter>()
        .AddFeatureFilter<PercentageFilter>();
  6. As you can see, _featureManager.IsEnabled(“keyname”) will now invoke the Evaluate method from IFeatureFilter. If you are debugging the below controller code, it will hit the Evaluate method.
  7. [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        private readonly IFeatureManager _featureManager;
        private readonly IConfiguration _config;

        public ValuesController(IFeatureManager featureManager, IConfiguration config)
        {
            _featureManager = featureManager;
            _config = config;
        }

        public string Get()
        {
            // pass the key name you have set up in the portal
            var isEnabled = _featureManager.IsEnabled("SendNotificationsToPremierPartners");
            return $"Feature key SendNotificationsToPremierPartners is set to {isEnabled}";
        }
    }

  8. The above code will hit the Evaluate method implemented by the custom feature filter class PremierEmailRuleFeatureFilter, explained above.

That’s all there is to implementing feature flagging/flighting in the Azure ecosystem.

Configuration Explorer

If you are wondering whether we can also use this to replace the appsettings.json file completely: the answer is YES, you can.

blog-image-7

Using Configuration Explorer, you can maintain all your key-value settings in a single location which can be consumed by WebApis, services on VMs, Function Apps etc. Read further here for more information.
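To sketch what that looks like in code, the same provider also surfaces regular key-values through IConfiguration. Assuming a key named TestApp:Settings:BackgroundColor exists in your store (a hypothetical name for illustration), reading it is just:

```csharp
using Microsoft.Extensions.Configuration;

class ConfigDemo
{
    static void Main()
    {
        var builder = new ConfigurationBuilder();

        // Same connection string copied from "Access Keys" in the portal.
        builder.AddAzureAppConfiguration("AppConfigurationConnectionString_YouCopiedFromPortal");

        IConfiguration configuration = builder.Build();

        // Key names use ':' as the hierarchy separator, exactly like appsettings.json.
        var backgroundColor = configuration["TestApp:Settings:BackgroundColor"];
    }
}
```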

References

You may not find many articles on this as of today (June’19), but the MS documentation is good enough for you to start and implement it right away.


Azure Blob Storage – Part 2 (WHAT and WHERE) 

This is the second part of Azure Blob Storage series.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE)
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

In part 1 we discussed why we should reconsider storage strategies to be ready for ever-growing data and fickle-minded users with very short attention spans. You want your storage to be secure, reliable, fast, scalable and easy to manage.

Let’s start with the necessary prerequisite steps we need before we can expose the blob container as a service to clients.

 Create a storage account in Azure portal. 

  1. Active Azure subscription: You will need an active Azure subscription with enough credit, or you can start with a free trial.

  2. Create storage account: Please refer to the link to learn how to create a storage account. Please remember to make informed choices for the below parameters while creating it.

  • Deployment model: Either use Resource Manager templates or classic (manual/scripted) deployment. Choose Resource Manager for this example.
  • Account kind:
    • General purpose – can host files, blobs, tables and queues.
    • Blob storage – uses advanced features like cool/hot access tiers to optimize storage per item’s usage patterns.
  • Performance:
    • Standard – backed by hard disk drives. Stick to Standard for lower environments.
    • Premium – backed by solid state devices (SSD) for faster access, hence better performance. Use this for production.
  • Replication: Replication helps you decide what targets you have set for scalability. Analyze the cost-effectiveness of your choice. This MSDN link is good for making an informed choice.
  • Subscription: If you have many Azure subscriptions, choose one.
  • Resource group: Think of these as lifecycle boundaries. When multiple Azure resources share a group, their lifecycle (create, update etc.) is managed in an integrated way. For demo purposes you can create a new one or use any existing one.
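If you prefer scripting over the portal, the same account can be created with the Azure CLI. A minimal sketch follows; the resource group name, account name and region are placeholders you would change, and the Standard_LRS/StorageV2 values reflect the informed choices discussed above:

```shell
# Create a resource group to hold the storage account
az group create --name demo-rg --location westus2

# Standard performance, general-purpose v2 account on the Resource Manager model
az storage account create \
  --name mydemostorage123 \
  --resource-group demo-rg \
  --location westus2 \
  --sku Standard_LRS \
  --kind StorageV2
```

Note that `az storage account create` requires a globally unique, lowercase account name.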

storage account

  • After creation, your storage account will look like the above screenshot. You can see the overview is at account level, which hosts all the storage services: Blob, Files, Tables and Queues.
  • You can also see the “Access Keys” link on the left, which is important. This section contains a connection string, consisting of an account name and a key (primary and secondary). Note that this key is maintained at account level; so if a user should access blobs ONLY, this access policy is not very effective. We will come back to this problem in part 4, about access control policies.

3. Click on the link to Blobs. You should see the screen below.

Blob Container

You can create a new container if you wish to. I got a container by default from my subscription.

You are all set to start uploading files to this blob store. You can also use the portal to upload files manually. You should see the below listing once you have a few files uploaded.

Blob store

 

 

Till now we have covered

  • WHY to use Blob as storage option
  • WHAT we can do with it.
  • WHERE to procure the necessary infrastructure. [Azure]

We have not written any code as yet. In part 3 we will walk through the code to achieve the following.

  1. Upload a file to blob storage.
  2. Download a file from blob storage.

Access control policies will be discussed in part 4.

Azure Blob Storage – Part 3 (HOW)    —  Coming shortly

 

Azure Blob Storage – Part 1 (WHY should you use it)

Azure has come a long way to come up with reliable storage options offered as cloud services. In this series of articles we will go into details in the below order.

  1. Azure Blob Storage – Part 1 (WHY)
  2. Azure Blob Storage – Part 2 (WHAT and WHERE) 
  3. Azure Blob Storage – Part 3 (HOW)     —  Coming shortly
  4. Access Control policies (HOW LONG) — Coming shortly
  5. Triggering Azure function to process Blobs —  Coming shortly

Even though I am focusing mainly on Blob storage in particular, we will first analyze the need to choose Azure storage and then continue to cover Blob in depth.

Azure offers the below storage options as a service.

  1. Blob
  2. File
  3. Disk
  4. Queue
  5. Table

Conceptual Diagram

AzureStorage

WHY: Let’s see why we must revisit the conventional storage methods which were serving us well earlier.

  • Data growth is exponential in ever-growing social media platforms and streaming services.
  • Scalability: you may not want it today, but you want to be ready for the future.
  • Simplifying storage infrastructure management, like backup and site recovery.
  • Simplifying user access management for stored resources; audit compliance.
  • Faster access to large data over the web, using redundancy and SSDs.

Traditional file shares were useful, though they came with many vulnerabilities, and maintenance and user access audits were time-consuming tasks. Once a user or a service account had access to a file share, there was no way to control the duration of that access.

Azure, on the other hand, takes care of most of the maintenance and other critical tasks, letting us focus on the business problem rather than the operational aspects of infrastructure.

Here is a quick comparison snapshot of storage options as per MS documentation.

StorageComparison.png

 

Even though the comparison chart above is self-explanatory, in my personal experience I have observed the below use cases.

  • Upload/download large documents of different formats to a server.

    • If this is a new project and you have analyzed the cost aspects of storage, Blob is the easiest to implement and provides all the benefits like redundancy, security and time-controlled access.

 

  • Migrate an existing file share to Azure storage with minimal or no code change (lift and shift).
    • In such cases Azure File Share fits better. You can even map a directory on a middle-tier VM (on IaaS) and access it as a regular network file share with no code changes.

 

  • Process large files in a distributed system, where durability and decoupling are more important.
    • Choose Queues.

 

  • Store unstructured (NoSQL) data which can be quickly accessed across regions.
    • Use Tables.

 

Azure provides an extensive range of tooling to support different audiences: PowerShell, CLI, Export/Import services.

Now that the basic context has been set, we can double-click on Blob storage.

The first thing you need is an Azure subscription, under which we will subscribe to these storage services. Refer to the below link for the steps to create a storage account.

Create a Storage account with Azure subscription. 

Blob Storage:

A blob can store any file (think of it as a binary) in any format, like .xls, .ppt etc. Blobs reside inside containers. We will learn more about containers in Part 2.

Blob Storage Types

  • Block blobs: the most commonly used type, for regular files. Up to ~4.7 TB.

  • Page blobs: used for random-access large files like VHDs. This is how Azure manages files for itself.

  • Append blobs: made up of blocks like block blobs, but optimized for appending information to an existing blob, such as logging.
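As a small taste of what Part 3 will cover, uploading a block blob with the classic WindowsAzure.Storage SDK looks roughly like the sketch below; the connection string, container name and file paths are placeholders, not values from this series:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobUploadSketch
{
    static void Main()
    {
        // Parse the account-level connection string copied from "Access Keys".
        var account = CloudStorageAccount.Parse("your-connection-string");
        var client = account.CreateCloudBlobClient();

        // Containers group blobs; create it if this is the first run.
        var container = client.GetContainerReference("documents");
        container.CreateIfNotExistsAsync().Wait();

        // A block blob is the right type for regular files like .xls or .ppt.
        var blob = container.GetBlockBlobReference("report.xls");
        blob.UploadFromFileAsync(@"C:\temp\report.xls").Wait();
    }
}
```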

I hope it gives you a good start and we shall go in deeper in Part 2.

Azure Blob Storage – Part 2 (WHAT and WHERE)

 

Access configuration with .Net Core

.NET Core is an open-source framework developed on GitHub. For those who are new to it, .NET Core is a paradigm shift from previous releases (in a good way), embracing ideas like OWIN (Open Web Interface for .NET).

In previous frameworks, we had an app.config/web.config file where we configured various key-value pairs apart from project-specific settings. In a .NET Core (2.0) console app, we don’t get a .config file by default.

Below steps will help you reading the config values from config files in .Net Core.

Reading AppSettings using the previous System.Configuration.ConfigurationManager class:

ConfigurationManager.AppSettings["key"]

This won’t work in .NET Core, as System.Configuration is not available for Core yet (as of Core 2.0). You need to do the following.

  • Create a Console App with .Net Core 2.0
  • Create an appsettings.json file and add it to the root of the project.
  • Add desired key value settings in this file.

{
  "docDbSettings": {
    "EndpointUrl": "https://localhost/443",
    "AuthKey": "134KJKJSHDKFJL###LKJSLKFGK;E12934903459-@@@0983459+==",
    "DatabaseId": "ToDoList",
    "CollectionId": "Items"
  }
}

  • Right click this JSON file and set “Copy to Output Directory” to “Copy always”. This ensures the file is present under ‘bin\Debug\netcoreapp2.0’.
  • Install the below nuget packages
    • Microsoft.Extensions.Configuration
    • Microsoft.Extensions.Configuration.FileExtensions
    • Microsoft.Extensions.Configuration.Json
  • Below is the code to set up ConfigurationBuilder, the equivalent of the previous ConfigurationManager class, to read the keys.
private static IConfiguration Configuration;

static void Main(string[] args)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json");

    Configuration = builder.Build();

    // read the keys; the indexer already returns a string
    var Endpoint = Configuration["docDbSettings:EndpointUrl"];
    var Key = Configuration["docDbSettings:AuthKey"];
    var DatabaseId = Configuration["docDbSettings:DatabaseId"];
    var CollectionId = Configuration["docDbSettings:CollectionId"];
}
  • Notice the section traversal supported within the JSON structure using ‘:’. It even supports accessing array elements by index, like ["superadmins:0:Name"], if the value is an array of objects like the below.

{
  "superadmins": [
    {
      "Name": "XYZ",
      "email": "abc@text.com"
    },
    {
      "Name": "XYZ",
      "email": "abc@text.com"
    }
  ]
}
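A quick sketch of index access, assuming the JSON above is saved as appsettings.json next to the executable:

```csharp
using System.IO;
using Microsoft.Extensions.Configuration;

class IndexAccessDemo
{
    static void Main()
    {
        var config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json")
            .Build();

        // Array elements are addressed by a numeric index between ':' separators.
        var firstAdminName = config["superadmins:0:Name"];    // "XYZ"
        var secondAdminEmail = config["superadmins:1:email"]; // "abc@text.com"
    }
}
```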

You can get more details from the Microsoft documentation here.

You can also read an excellent article from Scott Hanselman on how to better secure config files containing connection strings and other keys in a cloud and CI/CD-ready development world.

 

Hope it helps!

 

Execute an sql query as string using sp_executesql and store the result in a variable

Small but useful tip for Sql Server 2012.

Here is a simple example of how we can build an SQL query as a string, execute it, and store the result in a variable for further use.

declare @query nvarchar(1000), @result datetime

set @query = N'select @result = GETDATE()'

EXEC sp_executesql @query,
     N'@result datetime OUTPUT',
     @result OUTPUT

print CAST(@result as nvarchar(100))