Thursday, July 2, 2020

Soft Delete for Azure Storage using ARM Template, Azure CLI & PowerShell

Soft delete is an Azure Storage feature that helps protect data in Azure Blobs and Azure Files from accidental deletion, whether by you or by someone else. It is part of Azure Storage's data-protection capabilities.

Use case: Assume you work on an enterprise application, and somehow a user has gained access to an Azure storage account and accidentally deleted some blobs. How will you recover them?

Azure Storage soft delete enables you to achieve this.

Prerequisites: You should already have an Azure storage account, or you can create one on the fly using an ARM template.

If you already have an Azure storage account, just go to the storage account -> Data protection -> Enable blob soft delete.

Refer to the image below for reference:

Azure Infrastructure as a Code (ARM Template)

An ARM template is Infrastructure as Code (IaC) used to provision resources in Azure. In this section I'll create an ARM template for a storage account with the soft delete feature enabled. A storage account is a resource type under the Azure Storage provider Microsoft.Storage. The blob service is a child resource with a single instance named default; you can address it as Microsoft.Storage/storageAccounts/piperstorage/blobServices/default.

The following ARM template creates an Azure storage account and enables soft delete for it. We will deploy it using the Azure CLI.


         "name":"[concat(parameters('storageAccountName'), '/default')]",
            "[concat('Microsoft.Storage/storageAccounts/', parameters('storageAccountName'))]"

Once everything is set up, use the az login command to sign in to your Azure subscription, then create a resource group of your choice. Follow the commands below to log in, create the resource group, and run the Azure CLI deployment that creates the storage account and enables blob soft delete.

az login

az group create --name cloudPipersRG --location "East US"

Deploy the template using the Azure CLI (note: in newer CLI versions, az group deployment create has been replaced by az deployment group create):

az group deployment create --name StorageDeployment --resource-group cloudPipersRG --template-file "C:\Learning\Docs\ARM Templates\azureDeploy.Storage.json" --parameters storageAccountName=cloudpiperstorage location=eastus

Deploy the template using Azure PowerShell:

$resourceGroupName = "cloudPipersRG"

New-AzResourceGroup -Name $resourceGroupName -Location "centralus"

New-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -TemplateFile "C:\Learning\Docs\ARM Templates\azureDeploy.Storage.json" -storageAccountName "cloudpiperstorage" -location "westus"

Once the deployment has succeeded, go to the Azure portal to verify it and the related resources.

Jump to the portal -> cloudPipersRG -> Deployments

Verify that the storage account has been created; it should look like the screenshot below (cloudpiperrg-deployment).

Deployment using Azure CLI

Now go to the recently created storage account, find Data protection, and click on it. A new window opens, as shown below (enable-softdelete.png), showing that soft delete is enabled with a 30-day retention period.

Enabled soft delete with retention days

You can watch a video demonstration HERE:

For a deeper dive into disaster recovery and account failover, follow this link:

Tuesday, June 23, 2020


Step-by-Step: Deploy ARM Templates Using Azure DevOps with a CI-CD Pipeline

In this video demo we deploy Azure storage using a YAML CI pipeline, then create a release pipeline to deploy it using an ARM deployment.

For More Video Click Here : CloudPipers Youtube Channel

Sunday, June 7, 2020

Create Build Pipeline using YAML in Azure DevOps

This video covers the following:

1. Explanation of Dot Net Core and Cosmos DB integration
2. Explanation of YAML file
3. Create build pipeline in Azure DevOps

#Azure #AzureDevOps #YAML #CosmosDB #NetCore

For more videos, click here:

Wednesday, May 27, 2020

AZURE Service Principal using Azure CLI & Portal


A service principal (SP) lets you register a user application so it can authenticate itself to Azure with a specific role, such as Contributor.
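As a sketch, a service principal with the Contributor role can be created from the Azure CLI like this. The app name and subscription ID below are placeholder values, not from the original post:

```shell
# Sign in first (opens a browser prompt)
az login

# Create a service principal with the Contributor role.
# "cloud-pipers-app" and <your-subscription-id> are placeholders.
az ad sp create-for-rbac \
  --name "cloud-pipers-app" \
  --role Contributor \
  --scopes "/subscriptions/<your-subscription-id>"
```

The command prints the appId, tenant, and password that the application uses to authenticate.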

Wednesday, May 20, 2020

Resource provider registration using Azure Portal


Every piece of functionality in Azure has a resource provider, such as Microsoft.DataFactory. By default, your Azure subscription is not registered with all resource providers, and if it isn't registered with the Microsoft.DataFactory provider, you'll get an error when deploying Data Factory resources.

From the portal, select All services.

Select Subscriptions.

From the list of subscriptions, select the subscription you want to use for registering the resource provider (refer subscription.png).

For your subscription, select Resource providers. Refer to the image below for reference.

Look at the list of resource providers and, if necessary, select the Register link to register the resource provider of the type you're trying to deploy. In my example I've registered Microsoft.DataFactory.
Kindly refer to the image Configure-resource-provider.png for reference.
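The same registration can also be done from the Azure CLI; as a sketch (these commands act on your currently selected subscription):

```shell
# Check the current registration state of the provider
az provider show --namespace Microsoft.DataFactory --query registrationState

# Register the provider for the current subscription
az provider register --namespace Microsoft.DataFactory
```

Registration runs asynchronously, so re-run the first command until the state shows Registered.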

Hope this helps you register the Microsoft.DataFactory namespace.

Wednesday, April 1, 2020

Frequently used Docker commands


Docker is a well-known containerization platform. In this article I'll share some frequently used Docker commands that will help you in your work. The network and volume sections elaborate further with pictorial representations.

Here we go...

Docker Commands 

docker container ls        # list running containers

docker container ls -a     # list all containers, including stopped ones

docker container run -d nginx    # run an nginx container in detached mode

docker container inspect 182 | less    # to get an IP of running container

docker container pause 182

After pausing, you won't be able to access the container.

docker container unpause 182

To stop and kill a container

docker container stop 182

To go inside the container

docker container exec -it nginxContainer bash

Network in docker

docker network ls

docker network inspect add7666agsg

If you want to create your own network:

docker network create -d bridge my-bridge-network

Image network.jpg

docker network inspect ba858ba01ad3

Image network-inspect.jpg

docker container run -d --name nginx3 -p 8081:80  --network my-bridge-network nginx
After this, if you run the following command, it shows that the newly created container is connected to the defined network:

docker network inspect ba858ba01ad3

Image network-linking-container.jpg

Docker Volumes -> A Docker volume is used to persist data, so if a container goes down or is deleted, the data is preserved.
To create a Docker volume, use the command below:

docker volume create volume1

docker volume ls

To attach a volume to a container, with a mount point created inside the container:

docker container run -d --name nginx4 -v volume1:/volume1 -p 8085:80 nginx

In the command above, the left-hand volume1 is the named volume stored on the local host (local machine), while the right-hand "volume1" is the mount point inside the container.

Refer an image below:  volume.jpg
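To convince yourself the data really persists, here is a quick sketch (the container name nginx4 is assumed from the command above; this needs a running Docker daemon):

```shell
# Write a file through the container's mount point
docker container exec nginx4 sh -c 'echo hello > /volume1/test.txt'

# Remove the container entirely
docker container rm -f nginx4

# Start a fresh container with the same volume: the file is still there
docker container run --rm -v volume1:/volume1 nginx cat /volume1/test.txt
```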

To go inside a volume, follow the path below:

cd var -> cd lib -> cd docker -> cd volumes -> ls

image volume-list.jpg

To inspect a volume, use the following command:

docker volume inspect volume1

Refer to the image below: volume-inspect.jpg

If you notice in the image above, the mount point shows the exact path of the volume along with its name.

docker image ls

docker image rm logstash

docker image history logstash

docker image save logstash > logstash.tar

To load image from standard input

docker image load < logstash.tar

Thursday, March 26, 2020

Enable performance counters for Log Analytics and execute a KUSTO query


This article shows you how to execute a Log Analytics query, or KUSTO query, over a Log Analytics workspace. KQL (Kusto Query Language) combines ideas from SQL, PowerShell, and Bash.
Before this, please go through this post about how to create a Log Analytics workspace.


The following must be present for this solution:
1. A Log Analytics workspace already configured in your Azure subscription.

Log analytics workspace

Once you are done creating the LAW, open it and find the Logs option on the left side of the panel, as you can see in the image LA.jpg below, marked in RED.

There I've executed a query that lists the number of computers sending logs each hour. For that purpose I selected the Heartbeat table, which contains information about the virtual machines connected to the Log Analytics workspace, aka LAW.
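As a sketch, the hourly computer count described above can be written in KQL like this:

```
Heartbeat
// Count distinct computers that sent a heartbeat, bucketed per hour
| summarize ComputerCount = dcount(Computer) by bin(TimeGenerated, 1h)
| render timechart
```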

Go to Data -> Windows Performance Counters -> Add the selected performance counters, and click Add. As soon as you click, those counters will be enabled and will start sending telemetry to the Log Analytics workspace, which you can query to get virtual machine performance information. Refer enable-counter.jpg.

Once the performance counters are enabled, they start sending that information to the Log Analytics workspace.

NOTE: If you don't get any records when you query the Perf table, restart the MMA agent on the virtual machine, or use the Disconnect/Connect option for the virtual machine visible in the Log Analytics workspace.

You can easily see the performance of a virtual machine connected to the Log Analytics workspace, aka LAW. For that, select the Perf table.

Another query is for the Usage table, rendered as a pie chart. You can also render results as a table, a scatter chart, and a few more options.

// Usage by data types
// Chart the amount of logs reported for each data type, today
Usage
| summarize count_per_type = count() by DataType
| sort by count_per_type desc
| render piechart

The image below should help you understand how this works in a real use case.

KUSTO keywords in use

Refer to the image kusto-query-piechart.jpg below for the output as a pie chart.

NOTE: The following query fetches information about "% Committed Bytes In Use" (the counter name for Windows OS); for Linux-based machines the counter name is "% Used Memory".
After executing the query below I get 3 rows as a result because, if you remember, I enabled performance counters for a Windows computer. Refer to the screenshot enable-counter.jpg.

Perf
| where TimeGenerated > ago(30m)
| where CounterName == "% Committed Bytes In Use"
| project TimeGenerated, CounterName, CounterValue, Computer
| summarize UsedMemory = avg(CounterValue) by CounterName, bin(TimeGenerated, 10m), Computer
| where UsedMemory > 0
| render timechart

Refer to the image result.jpg below.


I hope this helps you a bit in understanding how to run KUSTO queries on a Log Analytics workspace.