Friday, October 9, 2020

Azure Data Explorer - Kusto Query - Get Categorical Count

It’s been a while since I started working on the data analysis side of things. When it comes to data analysis, it’s all about how efficiently one can filter and fetch a small set of useful data from a humongous collection.

I used Kusto Query Language (KQL) for writing advanced queries for Azure Log Analytics. When you first start writing queries, it can be very daunting; keeping that in mind, I thought I should share a few of those queries, which could save a beginner a lot of time.

Hence, my next few posts will mostly be about how to achieve an expected output using KQL. So, let’s get started with a simple scenario first.

Below is the sample data we are going to query:

| GenerationDate | FeedKey | DescriptionTitle | DescriptionDetail |
| --- | --- | --- | --- |
| 2020-05-21 00:00:00.0000000 | 2020-05-25 02:00:00.0000000 | Schedule Task | Read feed from server 1 |
| 2020-05-21 00:00:00.0000000 | 2020-05-25 03:00:00.3000000 | Schedule Task | Read feed from server 1 |
| 2020-05-21 00:00:00.0000000 | 2020-05-25 03:00:00.3000000 | Monitoring Task | Monitoring failed for LOC |
| 2020-05-22 00:00:00.0000000 | 2020-05-26 02:00:00.0000000 | Schedule Task | Data missing for palto |
| 2020-05-22 00:00:00.0000000 | 2020-05-26 00:09:00.0000000 | Schedule Task | Read feed from server 1 |
| 2020-05-22 00:00:00.0000000 | 2020-05-27 00:04:00.0000000 | Failover Handling | Disk fault occurred in region R |

Query description:

How to get the count of distinct descriptions for each FeedKey.


```kusto
DemoData
| where GenerationDate >= datetime(2020-05-20) and GenerationDate <= datetime(2020-05-23)
| extend Descriptions = strcat(DescriptionTitle, " : ", DescriptionDetail)
| summarize dcount(FeedKey) by Descriptions, FeedKey
| summarize DescriptionCount = count() by FeedKey
| sort by DescriptionCount desc;
```
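For readers newer to KQL, the query's logic can be sketched in plain Python over the sample rows, assuming the second timestamp column in the sample data is FeedKey (a rough, illustrative equivalent, not KQL itself):

```python
# Count distinct "Title : Detail" descriptions per FeedKey,
# mirroring the two summarize steps of the KQL query above.
rows = [
    ("2020-05-25 02:00:00.0000000", "Schedule Task", "Read feed from server 1"),
    ("2020-05-25 03:00:00.3000000", "Schedule Task", "Read feed from server 1"),
    ("2020-05-25 03:00:00.3000000", "Monitoring Task", "Monitoring failed for LOC"),
    ("2020-05-26 02:00:00.0000000", "Schedule Task", "Data missing for palto"),
    ("2020-05-26 00:09:00.0000000", "Schedule Task", "Read feed from server 1"),
    ("2020-05-27 00:04:00.0000000", "Failover Handling", "Disk fault occurred in region R"),
]

distinct = {}  # FeedKey -> set of distinct descriptions
for feed_key, title, detail in rows:
    distinct.setdefault(feed_key, set()).add(f"{title} : {detail}")

# DescriptionCount per FeedKey, sorted descending.
counts = sorted(((k, len(v)) for k, v in distinct.items()),
                key=lambda kv: kv[1], reverse=True)
```

The two identical "Read feed from server 1" rows sharing a FeedKey collapse into one description, which is exactly what the inner `summarize ... by Descriptions, FeedKey` achieves before counting.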

Expected output: the number of distinct descriptions (DescriptionCount) for each FeedKey, sorted in descending order.
 Happy kustoing!

Thursday, July 23, 2020

Excluding Teams from Office Deployment

Recently I received a request from one of my readers on how to tailor Office deployments and, to be specific, how to exclude Teams from the deployment configuration file.

Let’s open the deployment portal in the browser. This is the place from where administrators can manage and deploy Office products and subscriptions.

You can see that there are two options on the page: 
  • Create a new configuration 
  • Import your configuration

Here I am going to create a new configuration, but if you have an existing configuration, you can import that too and update it based on your business needs.

Once you have entered the Deployment Settings page, there are many options which need to be configured.

Architecture Selection

Select the architecture for which we are creating a deployment:

Office Suite Selection

Next is to select the Office Suite:

Version Selection

Next is to select the version which we want to deploy:

App Exclusion

Now comes the most important part, wherein we exclude the apps which we do not want as part of our deployment script:

Language Selection

The next mandatory parameter is to select the primary language:

File Format Selection

We are almost done. The final step is to export this newly created configuration, and that can be done by clicking on the Export button at the top of the page.

As soon as the Export button is clicked, a dialog will pop up asking for the file format:

Accept Licensing Terms

Next is to accept the license agreement and provide a name for the configuration file:

Export Configuration

Click on Export and the deployment file will be downloaded to your machine. Let’s open the file and have a look at the configuration settings:
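The exported file follows the standard Office Deployment Tool configuration schema. As a minimal sketch (edition, channel, product, and language below are placeholder choices), a configuration with Teams excluded looks like this:

```xml
<Configuration>
  <Add OfficeClientEdition="64" Channel="Current">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
      <!-- Teams will not be installed with the rest of the suite -->
      <ExcludeApp ID="Teams" />
    </Product>
  </Add>
</Configuration>
```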

In the above image, you can see that Teams is excluded and will no longer be part of our Office deployment.

Happy Deployment.

Monday, July 20, 2020

Accessing Azure DevOps By Using PAT

You may have come across a requirement wherein you needed to update Azure DevOps objects programmatically, and it is obvious that there must be some authentication mechanism in place.

Now, there are many methods one can use to authenticate, but for this post I’ve specifically chosen the personal access token. A PAT (Personal Access Token) is a way to provide an alternate password to authenticate to Azure DevOps.

To understand more about how to generate this token and how to utilize it, let’s follow a few steps and make a successful REST API call.
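As a preview of where those steps lead, here is a minimal Python sketch of how a PAT is attached to a REST call. The organization, project, and token values are placeholders; the key point is that the PAT goes into an HTTP Basic authorization header with an empty username:

```python
import base64
import urllib.request

# Placeholder PAT; generate a real one from your Azure DevOps user settings.
pat = "my-personal-access-token"

# Basic auth with empty username, the PAT as the password.
auth = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")

# Hypothetical organization name, for illustration only.
url = "https://dev.azure.com/myorg/_apis/projects?api-version=6.0"
request = urllib.request.Request(url, headers={"Authorization": f"Basic {auth}"})

# urllib.request.urlopen(request) would perform the call; omitted in this sketch.
```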

Wednesday, July 15, 2020

Made debut as an International Speaker

Today, I have something good to share. Recently I was given an opportunity to speak at a technical conference named 'Lightup Virtual Conference', a fund-raising event organized by C# Corner and The Tech Platform to support UNICEF and take a stand against COVID-19.

I spoke on the topic 'Azure Bot Services Utilizing LUIS Capabilities', which was a wonderful experience. I hope a recording of the session will be available soon.

Sunday, June 7, 2020

Generating Client Code from OData Metadata

Sometimes, when we need to call APIs, we are given only information about the entities in the form of OData metadata. When we are not aware of the possible methods and their parameters which are exposed for use, it becomes very cumbersome and time-consuming to call even a single method.

To some extent, we can still manage for GET methods, as they are easy to guess using entity names, but when it comes to PUT/POST/PATCH, things are not straightforward.

And here comes the need to generate classes from the given metadata. Let’s have a look at how we can generate these client-side proxy classes.

Install the required Visual Studio Extension

I’m using Visual Studio 2019, but equivalent features are available in other versions too. Navigate to the Manage Extensions dialog, search for OData Connected Service as shown below, and install it.

Using this tool, one can easily generate classes for all the entities and complex types which are mentioned in the metadata.

Generating Proxy Classes
Next is to open the project in Visual Studio, inside which the proxy classes have to be generated. Once the OData Connected Service extension is installed successfully, right-click on the project and select Add Connected Service as shown below:

Next is to select OData Connected Service as shown below:

Next is to configure the endpoints, but before that, get ready with the metadata in the form of an XML file. Here is the gist of what the metadata looks like:
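If you have not seen such a file before, here is a minimal, hypothetical sketch of the EDMX shape OData metadata takes (the namespace, entity, and property names below are made up for illustration):

```xml
<edmx:Edmx Version="4.0" xmlns:edmx="http://docs.oasis-open.org/odata/ns/edmx">
  <edmx:DataServices>
    <Schema Namespace="Retail" xmlns="http://docs.oasis-open.org/odata/ns/edm">
      <EntityType Name="Customer">
        <Key><PropertyRef Name="Id" /></Key>
        <Property Name="Id" Type="Edm.Int64" Nullable="false" />
        <Property Name="Name" Type="Edm.String" />
      </EntityType>
      <EntityContainer Name="Container">
        <EntitySet Name="Customers" EntityType="Retail.Customer" />
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>
```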

Let’s browse the metadata file as shown below:

Click on Next and select all the required entities for which schema is to be generated as shown:

Click on Next and select all the required Function/Action which needs to be imported as shown below:

Clicking on Next will take you to the next screen, wherein you can specify the class file name in which all the generated code will be saved. Here I am taking the class name as RetailReference, as shown below:

Now, if you wish to place the generated code in a separate file per entity, rather than pushing everything into a single file, this can be configured by clicking on the Advanced Settings link shown in the above screenshot, which will open up the options below:

There are a few more options under Advanced Settings, which can be utilized based on our coding guidelines.

Click on Finish and you will notice that all the entities are added to solution explorer as shown below:

We are all set to utilize our classes. Happy learning!

Friday, June 5, 2020

Making a call to Retail Server APIs

This article will talk about how to make a call to Retail Server APIs (non-anonymous) and what information is required to get a response.

I started by generating the access token using the username-password flow and, of course, the client id, as shown in the image below:
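For reference, the username-password (resource owner password credentials) token request can be sketched as follows; the tenant, client id, credentials, and resource are placeholder values, not real ones:

```python
from urllib.parse import urlencode

# Resource owner password credentials (username-password) flow against the
# Azure AD v1 token endpoint. All values below are placeholders.
token_url = "https://login.microsoftonline.com/contoso.onmicrosoft.com/oauth2/token"
form = {
    "grant_type": "password",
    "client_id": "00000000-0000-0000-0000-000000000000",
    "username": "user@contoso.com",
    "password": "secret",
    "resource": "https://retailserver.contoso.com",  # Retail Server app ID URI
}
payload = urlencode(form)

# POSTing `payload` (form-encoded) to `token_url` returns JSON containing
# `access_token`; the network call itself is omitted in this sketch.
```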

Then I tried to make a call to an API using Postman as shown below:

And here is the 401 Unauthorized error ☹, and the reason is Microsoft_Dynamics_Commerce_Runtime_DeviceTokenNotPresent.

After spending hours, I got to know that Retail Server APIs can’t be called just by passing the access token. To make the API call successful, one additional piece of information, ‘devicetoken’, needs to be sent.

Now where to pass this information?

Well, fortunately, I was able to figure it out. This devicetoken has to be passed as a header while making the API call, as shown below:
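In code form, attaching both tokens might look like this rough Python sketch; the URL and token values are placeholders for illustration:

```python
import urllib.request

# Placeholder values; a real call needs a valid Azure AD access token and the
# token of an activated device.
access_token = "eyJ0eXAiOiJKV1QifQ.example"
device_token = "my-device-token"

url = "https://retailserver.contoso.com/Commerce/Customers"
request = urllib.request.Request(
    url,
    headers={
        "Authorization": f"Bearer {access_token}",  # OAuth bearer token
        "devicetoken": device_token,                # extra header Retail Server expects
    },
)
# urllib.request.urlopen(request) would perform the GET; omitted here.
```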

Once the device token was passed, I received the expected response from the API.

Hope I saved you a few hours. Enjoy troubleshooting!