Sunday, June 7, 2020

Generating Client Code from OData Metadata

Sometimes, when we need to call APIs, we are given only the information about the entities in the form of OData metadata. When we are not aware of the methods and parameters that are exposed for use, it becomes very cumbersome and time consuming to call even a single method.

To some extent, we can still manage for GET methods, as they are easy to guess from the entity names, but when it comes to PUT/POST/PATCH, things are not straightforward.

And here comes the need to generate classes from the given metadata. Let's have a look at how we can generate these client-side proxy classes.

Install the required Visual Studio Extension

I'm using Visual Studio 2019, but the equivalent features are available in other versions too. Navigate to the Manage Extensions dialog, search for OData Connected Service as shown below, and install it.

Using this tool, one can easily generate classes for all the entities and complex types which are mentioned in the metadata.

Generating Proxy Classes
Next is to open the project in Visual Studio, inside which the proxy classes have to be generated. Once the OData Connected Service extension is installed successfully, right-click on the project and select Add Connected Service as shown below:

Next is to select OData Connected Service as shown below:

Next is to configure the endpoints, but before that, get ready with the metadata in the form of an XML file. Here is a gist of what the metadata looks like:
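The snippet below is a minimal, illustrative example of the overall shape of an OData v4 metadata (EDMX) document; your actual metadata will have a different namespace and many more entity types, complex types and entity sets:

<?xml version="1.0" encoding="utf-8"?>
<edmx:Edmx Version="4.0" xmlns:edmx="http://docs.oasis-open.org/odata/ns/edmx">
  <edmx:DataServices>
    <Schema Namespace="Contoso.Retail" xmlns="http://docs.oasis-open.org/odata/ns/edm">
      <EntityType Name="Customer">
        <Key>
          <PropertyRef Name="AccountNumber" />
        </Key>
        <Property Name="AccountNumber" Type="Edm.String" Nullable="false" />
        <Property Name="Name" Type="Edm.String" />
      </EntityType>
      <EntityContainer Name="RetailContainer">
        <EntitySet Name="Customers" EntityType="Contoso.Retail.Customer" />
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>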

Let’s browse the metadata file as shown below:

Click on Next and select all the required entities for which the schema is to be generated, as shown:

Click on Next and select all the required Functions/Actions which need to be imported, as shown below:

Clicking on Next will take you to the next screen, wherein you can mention the class file name in which all the generated code will be saved. Here I am taking the class name as RetailReference, as shown below:

Now, if you wish to place the generated code in separate files, one per type, rather than pushing everything into a single file, this can be done by clicking on the Advanced Settings link shown in the above screenshot, which will open up the options below:

There are a few more options under Advanced Settings, which can be utilized based on our coding guidelines.

Click on Finish and you will notice that all the entities are added to Solution Explorer as shown below:

We are all set to utilize our classes.
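To see the generated proxy in action, here is a minimal usage sketch. The namespace, container class, entity set and property names below (RetailReference, RetailContainer, Customers, Name) are purely illustrative; the real names come from your own metadata, and the service root URL is a placeholder.

// Minimal usage sketch for the generated OData client (illustrative names only).
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // The generated container class derives from DataServiceContext and is
        // named after the EntityContainer in your metadata.
        var context = new RetailReference.RetailContainer(
            new Uri("https://<your-service-root>/"));

        // Entity sets are exposed as DataServiceQuery<T> and can be queried with LINQ.
        var customers = context.Customers.Take(5).ToList();
        foreach (var customer in customers)
        {
            Console.WriteLine(customer.Name);
        }
    }
}

Happy learning!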

Friday, June 5, 2020

Making a call to Retail Server APIs

This article will talk about how to make a call to the Retail Server APIs (non-anonymous) and what information is required to get a response.

I started by generating the access token using the username-password flow and, of course, the client id, as shown in the image below:
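For reference, here is a minimal sketch in C# of what that username-password (ROPC) token request boils down to. The helper shape and the scope value are assumptions; the exact scope or resource depends on how the Retail Server application is registered in Azure AD.

// Minimal sketch of a username-password (ROPC) token request against Azure AD.
// tenantId, clientId, username, password and scope are your own values;
// the scope shown in the comment is only an example.
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class TokenClient
{
    public static async Task<string> GetTokenJsonAsync(
        string tenantId, string clientId, string username, string password, string scope)
    {
        using (var client = new HttpClient())
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "password",
                ["client_id"] = clientId,
                ["username"] = username,
                ["password"] = password,
                ["scope"] = scope // e.g. "<retail-server-app-id-uri>/.default"
            });

            HttpResponseMessage response = await client.PostAsync(
                $"https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token", form);
            response.EnsureSuccessStatusCode();

            // The JSON response contains the access_token used in the next call
            return await response.Content.ReadAsStringAsync();
        }
    }
}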

Then I tried to make a call to an API using Postman as shown below:

And here is the 401 Unauthorized error ☹, and the reason is: Microsoft_Dynamics_Commerce_Runtime_DeviceTokenNotPresent

After spending hours, I got to know that the Retail APIs can't be called just by passing the access token. In order to make the API call successful, one additional piece of information, the 'devicetoken', needs to be sent.

Now where to pass this information?

Well, fortunately I was able to figure it out. This devicetoken has to be passed as a header while making the API call, as shown below:
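For illustration, here is a minimal sketch of such a call using HttpClient. The header name ("devicetoken"), the base address and the route are assumptions made for the sketch; use whatever key and route your environment expects.

// Minimal sketch: calling a Retail Server API with both the access token and the device token.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class RetailApiCaller
{
    public static async Task<string> GetAsync(string accessToken, string deviceToken)
    {
        using (var client = new HttpClient())
        {
            client.BaseAddress = new Uri("https://<your-retail-server>/Commerce/");
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // The additional piece of information Retail Server complained about
            client.DefaultRequestHeaders.Add("devicetoken", deviceToken);

            HttpResponseMessage response = await client.GetAsync("Customers");
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}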

Once the device token was passed, I received the expected response from the API.

Hope I saved you some hours. Enjoy troubleshooting!

Friday, May 29, 2020

Providing Admin Consent to Azure Registered Application

It has been a while since I drafted anything. All this time, I was busy learning new things, including Dynamics 365, Graph API, MS Teams, some of the Azure services and much more.

Basically, this entire tenure was full of ups and downs, where some things went very smoothly and some took many hours to sort out.

But now I am back with a lot on my plate, and of course my next few posts will mostly be about troubleshooting and how-tos. There may well be similar solutions already on the internet, but I would still love to record them on my own blog, with my own findings, for my own future reference.

With all this, let us get started with our first troubleshooting tip.

Background
The requirement was about adding a new Task in Planner, which is part of Office 365. To add a Task, we have to traverse through Groups, then a Bucket, and then inside the chosen Bucket we can create a Task.

In order to perform all of this, the authentication and token generation part has to be in place, because it is only with JWT tokens that we can interact with our application, which is registered via App Registrations in the Azure portal. Here is a snapshot of how the API permissions look after registering the app:
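For completeness, here is a minimal token-acquisition sketch using MSAL.NET (Microsoft.Identity.Client). It assumes a client-credentials flow with a client secret; a delegated flow would use a different Acquire* call, but the consent requirement discussed below is the same.

// Minimal sketch (assumption: client-credentials flow with MSAL.NET).
using System.Threading.Tasks;
using Microsoft.Identity.Client;

class GraphTokenProvider
{
    public static async Task<string> GetTokenAsync(
        string tenantId, string clientId, string clientSecret)
    {
        IConfidentialClientApplication app = ConfidentialClientApplicationBuilder
            .Create(clientId)
            .WithClientSecret(clientSecret)
            .WithAuthority($"https://login.microsoftonline.com/{tenantId}")
            .Build();

        // ".default" requests whatever Graph permissions were granted (and consented)
        // on the app registration shown in the screenshot above.
        AuthenticationResult result = await app
            .AcquireTokenForClient(new[] { "https://graph.microsoft.com/.default" })
            .ExecuteAsync();

        return result.AccessToken;
    }
}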

Now with everything in place, when I tried to launch my application from Visual Studio, I got the below error:

This means that in order to access the registered application, a few consents are required. Now, how do we provide this consent, and what is it required for?

Analysis
If you look closely at the first screenshot showing the API permissions, you will notice that for performing the Read and WriteAll operations under Microsoft Graph, admin consent is required but has not been granted. [Just to brief you on Read/WriteAll: these are required in order to perform any operations under Planner.]

So, how do we provide this consent, and who provides it?

Solution
Well, this consent is provided by the admin of the application by simply hitting a URL in the browser. This URL contains the TenantId as well as the ClientId, as follows:

https://login.microsoftonline.com/<TenantId>/adminconsent?client_id=<ClientId>

So, as soon as the proper URL is entered into the browser, the dialog below will pop up:

As soon as the admin clicks the Accept button, all the consents are granted, and this can be verified by going back to the API permissions page on the Azure portal, as shown below:

Happy troubleshooting.  

Thursday, April 16, 2020

Jupyter error - No module named ‘selenium’

Recently I installed Anaconda to learn more about it, and the first thing I wanted to try was opening a web page automatically using Selenium.

So, to do this, I used Jupyter and tried to import the Selenium webdriver. Till here, everything went well, but when I ran my code in the Jupyter Notebook, I got an error: No module named 'selenium'.

The strange thing is that I got this error even though I had Selenium installed on my machine using pip, with the command below:

pip install selenium

Now what could be the reason?

So, to analyze it further, I wrote the same Python code in Visual Studio and ran it. It worked perfectly fine.

So, I thought I would check the version of Selenium, and first I tried with pip as shown below:

As the above message says, it is already installed and pip didn't complain about anything. So, next I thought to try the Anaconda command prompt, as shown below:

Did you notice the rectangle marked in orange?

Yes, that was the culprit that was not allowing my code to work. There was a difference in versions, and as Jupyter was launched from Anaconda, it was not picking up the correct installation.
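If you run into the same thing, the fix is to install (or update) Selenium inside the Anaconda environment itself, for example from the Anaconda Prompt. Either of the commands below should do; the exact channel and version are up to you:

conda install -c conda-forge selenium
# or, using the pip that ships with the Anaconda environment
pip install --upgrade selenium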

Once the above command ran, I was successfully able to run the code below:

Hope this trick will save you hours.

Tuesday, February 25, 2020

Utilizing Azure Blob and WebJob to Convert Excel Files to Flat File Format

I believe there are many articles and blogs already available that talk about how to convert an Excel file to a comma-separated file using C#, and in all the cases I referred to, the Excel file is read from the hard drive of a local machine and the CSV file is saved back to the same hard drive. But in spite of knowing this, I'm going to draft another post.

Wondering, why?

Well, this post is going to be slightly different in the way the files are read and saved back. Below are the major offerings of this post:
  • What if we have many Excel files to convert, but the disk does not have enough space to hold all of them? The same is true for the conversion output.
  • What if we don't have permission to save the converted files onto the local machine?
  • How can we run this conversion utility using WebJobs?
In order to address the above challenges, we can utilize Azure capabilities and do everything on the fly, without using disk space as storage for our files. Let's see everything in action, step by step.

Problem Statement
Read Excel files from Azure Blob Storage, convert them to CSV format and upload them back to Azure Blob Storage. This entire process has to run via a triggered WebJob, so that it can be repeated whenever an Excel-to-CSV conversion is required.

Setting up environment
I'm using Visual Studio 2019 v16.4.0 and have an active Azure subscription.

High level steps
  • Creating containers in Azure storage
  • Reading Excel from Azure storage
  • Converting Excel to CSV format
  • Uploading CSV to Azure storage
  • Creating Azure WebJob
  • Triggering Azure WebJob
Creating containers in Azure storage
A container must be created under the Blob service to store all the Excel files which need to be converted to CSV format. There are two ways to create a container: through the Azure portal, or by using C#. As both are easily available on MSDN, I'm not going to repeat the entire procedure.

For detailed steps on how to create a container, please refer to the References section placed at the end of this article.
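That said, here is a minimal sketch of the C# route using the Azure.Storage.Blobs SDK. The connection string is a placeholder, and the same BlobServiceClient instance is what the _blobServiceClient field in the later snippets refers to:

// Minimal sketch: creating the two containers with Azure.Storage.Blobs.
// "<storage-connection-string>" is a placeholder for your storage account's connection string.
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class ContainerSetup
{
    public static async Task<BlobServiceClient> CreateContainersAsync(string connectionString)
    {
        var blobServiceClient = new BlobServiceClient(connectionString);

        // CreateIfNotExistsAsync is idempotent, so re-running the setup is safe
        await blobServiceClient.GetBlobContainerClient("excelcontainer").CreateIfNotExistsAsync();
        await blobServiceClient.GetBlobContainerClient("csvcontainer").CreateIfNotExistsAsync();

        return blobServiceClient;
    }
}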
     
For our exercise, I've created two containers named excelcontainer and csvcontainer under one storage account, where:

excelcontainer – holds excel files which are to be converted to csv
csvcontainer – holds the converted csv files

Below is a screenshot of my excelcontainer, which holds 3 Excel workbooks:

Reading Excel from Azure storage
Now that we have excelcontainer ready with the uploaded files, it's time to read data from all of them. Here is the code to do that:


public async Task<List<BlobOutput>> Download(string containerName)
{
    var downloadedData = new List<BlobOutput>();
    try
    {
        // Create the container client for the given container
        BlobContainerClient blobContainerClient =
            _blobServiceClient.GetBlobContainerClient(containerName);

        // List all blobs in the container
        await foreach (BlobItem item in blobContainerClient.GetBlobsAsync())
        {
            // Download each blob's contents as a stream
            BlobClient blobClient = blobContainerClient.GetBlobClient(item.Name);
            BlobDownloadInfo downloadedInfo = await blobClient.DownloadAsync();
            downloadedData.Add(new BlobOutput
            {
                BlobName = item.Name,
                BlobContent = downloadedInfo.Content
            });
        }
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace
        throw;
    }
    return downloadedData;
}

Where BlobOutput is a DTO with the members below:
public class BlobOutput
{
    public string BlobName { get; set; }
    public Stream BlobContent { get; set; }
}

Converting Excel to CSV format
In the above step, we collected the data from each blob into a stream. So, in this step, we will convert the streamed data into CSV format, and here is the code for that:


public static List<BlobInput> Convert(List<BlobOutput> inputs)
{
    var dataForBlobInput = new List<BlobInput>();
    try
    {
        foreach (BlobOutput item in inputs)
        {
            using (SpreadsheetDocument document = SpreadsheetDocument.Open(item.BlobContent, false))
            {
                foreach (Sheet _Sheet in document.WorkbookPart.Workbook.Descendants<Sheet>())
                {
                    // Resolve the worksheet and the shared string table used by its cells
                    WorksheetPart _WorksheetPart = (WorksheetPart)document.WorkbookPart.GetPartById(_Sheet.Id);
                    Worksheet _Worksheet = _WorksheetPart.Worksheet;
                    SharedStringTablePart _SharedStringTablePart =
                        document.WorkbookPart.GetPartsOfType<SharedStringTablePart>().First();
                    SharedStringItem[] _SharedStringItem =
                        _SharedStringTablePart.SharedStringTable.Elements<SharedStringItem>().ToArray();

                    // Build the csv content row by row
                    StringBuilder stringBuilder = new StringBuilder();
                    foreach (Row row in _Worksheet.Descendants<Row>())
                    {
                        foreach (Cell _Cell in row)
                        {
                            string Value = string.Empty;
                            if (_Cell.CellValue != null)
                            {
                                // Shared strings are stored as an index into the shared string table
                                if (_Cell.DataType != null && _Cell.DataType.Value == CellValues.SharedString)
                                    Value = _SharedStringItem[int.Parse(_Cell.CellValue.Text)].InnerText;
                                else
                                    Value = _Cell.CellValue.Text;
                            }
                            stringBuilder.Append(string.Format("{0},", Value.Trim()));
                        }
                        stringBuilder.Append("\n");
                    }

                    // One csv per sheet, named <ExcelFileName>_<SheetName>.csv
                    byte[] data = Encoding.UTF8.GetBytes(stringBuilder.ToString().Trim());
                    string fileNameWithoutExtn = item.BlobName.Substring(0, item.BlobName.IndexOf("."));
                    string newFilename = $"{fileNameWithoutExtn}_{_Sheet.Name}.csv";
                    dataForBlobInput.Add(new BlobInput { BlobName = newFilename, BlobContent = data });
                }
            }
        }
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace
        throw;
    }
    return dataForBlobInput;
}

where BlobInput is a DTO with the members below:
public class BlobInput
{
    public string BlobName { get; set; }
    public byte[] BlobContent { get; set; }
}

If a workbook contains multiple sheets, then a separate CSV will be created for each sheet, with the file name format <ExcelFileName>_<SheetName>.csv.

Uploading CSV to Azure storage
Once the data is converted to CSV, we are good to go with uploading the CSV files back to a container, and here is the code to perform this:


public async Task Upload(string containerName, List<BlobInput> inputs)
{
    try
    {
        // Create the container client for the given container
        BlobContainerClient blobContainerClient =
            _blobServiceClient.GetBlobContainerClient(containerName);

        foreach (BlobInput item in inputs)
        {
            // Get a reference to a blob and upload the csv content
            BlobClient blobClient = blobContainerClient.GetBlobClient(item.BlobName);
            using (var ms = new MemoryStream(item.BlobContent))
            {
                await blobClient.UploadAsync(ms, overwrite: true);
            }
        }
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace
        throw;
    }
}

So far, we have read the Excel files from a container, converted them to CSV format and uploaded them back to another container. All good. The next task is to automate this using a triggered WebJob.

Creating Azure WebJob
A WebJob can be created using Visual Studio by right-clicking on the project and selecting Publish…

Apart from this, there are many ways to create a triggered WebJob and all are mentioned over here on MSDN.
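For reference, here is a minimal sketch of how the pieces above could be wired together in the WebJob's entry point (a console app). The BlobService and ExcelConverter class names and the connection-string handling are assumptions made for the sketch; Download, Convert and Upload are the methods shown earlier.

// Minimal sketch of the triggered WebJob entry point.
// Assumptions: a BlobService wrapper exposing the Download/Upload methods shown above,
// and an ExcelConverter class exposing Convert; names and wiring are illustrative.
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        var blobService = new BlobService("<storage-connection-string>");

        // Read all Excel blobs, convert each sheet to csv, and upload the results
        var excelBlobs = await blobService.Download("excelcontainer");
        var csvBlobs = ExcelConverter.Convert(excelBlobs);
        await blobService.Upload("csvcontainer", csvBlobs);
    }
}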

Triggering Azure WebJob
If everything is set up correctly, you will be able to see the screen below in your Azure portal.

As this is a triggered WebJob, clicking the Run button will trigger the job and produce output as shown below:

Takeaway
Using Azure Storage and a WebJob, we converted files from one format to another without using local disk space to save files during the entire conversion process.

References: