Setup Power BI Enterprise Gateway on Azure VM

In the previous blog post, I went through steps 1 – 6 for building a SQL Server 2016 BI development VM on Microsoft Azure:

1. Provision the Azure VM using SQL Server 2016 developer image
2. Configure the SSAS/SSRS services
3. Install development software
4. Install sample databases
5. Integrate SSRS and Power BI Service
6. Install and configure Personal Power BI Gateway
7. Install and configure Enterprise Power BI Gateway and setup SSAS connection

This blog post details the last step: installing and configuring the Power BI Enterprise Gateway and setting up live connections from the online Power BI service to the SSAS instance hosted on the Azure VM.

In the previous blog post, the Personal Gateway was set up on the Azure VM, which allows the Power BI service to refresh/load data from the data sources hosted on the Azure VM. However, there is one important feature missing from the Personal Gateway: support for live connections to SSAS, which is only available in the Enterprise edition of the Power BI Gateway.


Considering the importance of the live SSAS connection feature, which empowers Power BI to be the visualization front end of an enterprise-class BI solution, the Azure-based development environment we are building needs to be able to support and test this feature.

Therefore I downloaded the Enterprise Power BI Gateway, installed it on the Azure VM, and configured the data source connection and service account/credentials on the Power BI service.


Everything seemed simple and went pretty smoothly. However, when I tried to access the dataset or report in the Power BI service, I got this error.


After a little more research on how the SSAS live connection works, I learnt that the stored credentials I configured in the Data Source Settings (snapshot above) only set up the initial connection to the SSAS service; the actual queries are executed based on each individual user's permissions. For the SSAS service to know which user the current query is from, the EffectiveUserName property (in the format of a UPN) of the Power BI user who started the query through UI interaction is passed through the Enterprise Gateway to SSAS along with the query itself. The UPN passed from Power BI needs to match a Windows account in the local Active Directory (the Power BI user's mapping account in the on-premises Active Directory). SSAS then checks the Analysis Services roles of the matched Windows account and decides whether or not to execute the query.
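For illustration, this is roughly what an explicit SSAS connection carrying the end user's identity looks like; the server, database and UPN below are hypothetical examples, not the actual values the gateway uses internally:

```text
Provider=MSOLAP;Data Source=localhost;Initial Catalog=AdventureWorksDW;EffectiveUserName=user@linxiaodev.local
```

The gateway authenticates with the stored credentials and attaches the EffectiveUserName property, so SSAS evaluates role membership and permissions as that user rather than as the service account.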

After understanding how the SSAS live connection works, we can now list the requirements we need to meet in order to get the SSAS live connection working in our Azure VM environment:

  1. The Azure VM needs to be in an Active Directory domain.
  2. For each Power BI user, a corresponding Windows account needs to be created in the Active Directory.
  3. The UPN of the Power BI user account needs to be mapped to the UPN of the Windows account.
  4. The Windows account needs to be added to the appropriate Analysis Services role.

Based on the requirements above, I set up the SSAS live connection in the following steps:

Firstly, I promoted the Azure VM to be a Domain Controller so that the Azure VM itself is in the domain.


Then, I created a Windows account (linxiaodev\, with a matching UPN), which is the mapping account for the Power BI user,


and added the account into the “Users” role in SSAS.


Next, I went to the “Map user names” panel on the Power BI portal and added a rule to map the UPN of the Power BI user account to the UPN of the Windows account we created.
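The “Map user names” rules are simple string replacements applied to the Power BI user's UPN before it is passed to SSAS. The logic is roughly the sketch below; the domain names are hypothetical examples, not values from the actual environment:

```python
def map_user_name(upn, rules):
    """Apply 'Map user names'-style rules: each rule replaces an 'original
    name' substring with a 'new name'; the first matching rule wins."""
    for original, replacement in rules:
        if original in upn:
            return upn.replace(original, replacement)
    return upn  # no rule matched; the UPN is passed through unchanged

# Hypothetical example: map the Power BI account's cloud domain suffix to
# the UPN suffix of the Windows account created in the local Active Directory.
rules = [("@contoso.onmicrosoft.com", "@linxiaodev.local")]
```

With a rule like this, every user on the cloud domain is rewritten to the matching on-premises UPN without needing one rule per user.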


Now, all the settings are done, and the datasets and reports on the Power BI service get data from the live SSAS data source.



Setup a SQL Server 2016 BI Development VM on Azure

I like to do development on VMs, whether it is SharePoint development, BI development, or something else. So, in earlier years (not that far back from now), I invested in a quite powerful (in other words, expensive and heavy) laptop, installed VirtualBox, and hosted my development VMs inside it. However, I have been wondering whether, nowadays, I really need a mobile workstation just for running VMs locally. I am pretty tired of carrying something this heavy everywhere, or letting it sit on top of and burn my lap. All I really need nowadays is something with a monitor, a keyboard, and the capability to connect to the cloud where my VMs are hosted. Most importantly, it needs to be at least four times thinner than the laptop I currently have!

Now, back to the topic of this blog post, I will setup a SQL Server 2016 BI development VM on Microsoft Azure in the following steps:

1. Provision the Azure VM using SQL Server 2016 developer image
2. Configure the SSAS/SSRS services
3. Install development software
4. Install sample databases
5. Integrate SSRS and Power BI Service
6. Install and configure Personal Power BI Gateway
7. Install and configure Enterprise Power BI Gateway and setup SSAS connection

This blog post will cover steps 1 – 6. For step 7, as the VM needs to be promoted to a Domain Controller and some special settings are required to connect the Power BI service to the SSAS instance hosted on the Azure VM, I will go through the details in a separate blog post.

1. Provision the Azure VM using SQL Server 2016 developer image
Firstly, we need to provision the Azure VM using the SQL Server 2016 RTM Developer on Windows Server 2012 R2 image.


Assuming you already have an Azure account, you can log on to the Azure portal, select to create a new virtual machine, choose the SQL Server 2016 Developer image, and open the “Create Virtual Machine” wizard.

On the Basics tab, you need to provide the name of the VM you want to create, as well as the username and password of the VM's local administrator.


Then you need to select the size of the virtual machine. There is a wide array of options to fit your needs. Microsoft recommends DS3 or above for a production environment, but as we are going to use this VM for development, we can choose something smaller. I personally chose DS11_V2 for my environment.


Then you need to configure the settings for storage, network, etc. You can use the default settings if you have no specific requirements.


Lastly, you need to configure the SQL Server settings. I set the SQL connectivity to “Public (internet)” and enabled SQL Authentication in order to access the SQL Server from client computers.


After we have reviewed the summary of the settings, we can click “OK” to start the VM provisioning. The VM will be created in a short while.


After the VM is provisioned, we can get the connection details on the VM panel and connect to the VM using RDP.


When we log in to the VM, we can see the SQL Server 2016 database engine, along with the SSIS, SSAS and SSRS components, all installed and running on this VM.


2. Configure the SSAS/SSRS shipped with the image

2.1 (Optional) Change SSAS from Multidimensional model to Tabular model
In our VM provisioned from the SQL Server 2016 Developer image, SSAS is deployed in Multidimensional mode by default. If you need to work with the SSAS Tabular model, you can change the deployment mode in the SSAS configuration file, which is located at C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\Config\msmdsrv.ini.


Open the file and change the DeploymentMode from 0 to 2. For the change to take effect, you need to restart the SSAS service in SQL Server Configuration Manager.
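For reference, msmdsrv.ini is an XML file, and after the change the relevant element should look like the snippet below (other elements omitted):

```xml
<ConfigurationSettings>
  <!-- 0 = Multidimensional, 1 = SharePoint (Power Pivot), 2 = Tabular -->
  <DeploymentMode>2</DeploymentMode>
</ConfigurationSettings>
```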


After the SSAS service restarts, we can see SSAS running in Tabular mode.


2.2 Configure Reporting Services
Next, we need to configure the reporting services through the Reporting Services Configuration Manager wizard.


The steps to configure SQL Server 2016 Reporting Services are pretty similar to those for previous versions, so I will not go into the details of every step; you can find plenty of guides online if you need them. The one difference in the SQL Server 2016 configuration is the Power BI integration, which will be covered later in this blog post. After Reporting Services is configured, we should be able to access the reports web portal to manage SSRS reports.


3. Install development tools
Initially, I will install SSDT, Power BI Desktop and DAX Studio on this development VM. You can install other development software and tools based on your requirements.
The link for SSDT:
The link for Power BI Desktop:
The link for DAX Studio:

4. Install Sample Databases
A number of sample databases can be deployed in the VM as test data sources to support BI development:

5. Integrate SSRS and Power BI Service
One of the best features shipped with SQL Server 2016 SSRS is the integration with Power BI, which allows SSRS developers to pin an on-premises SSRS chart to a Power BI dashboard. Here, we are going to integrate the Azure VM hosted SSRS with the online Power BI service.

The steps to integrate SSRS with the Power BI service are actually very simple. Firstly, you need to enable the SQL Server Agent.


Then, you need to go to the Reporting Services Configuration Manager and register the report server with the Power BI service.


After the report server is registered with the Power BI service, you need to go to Report Manager, locate the report you want to pin to a Power BI dashboard, and check whether stored credentials are used in the report; if not, you need to switch the report to use stored credentials.



Now, you are ready to pin the SSRS charts to Power BI dashboard.


6. Install and configure Power BI Personal Gateway
As we have our SQL Server engine and sample databases up and running on the Azure VM, we can now connect to the databases, start to build Power BI reports, and publish them to the online Power BI service. However, to allow the online Power BI service to refresh data from the data sources hosted on the Azure VM, we need to set up the Personal Gateway.

Firstly, we can download the Personal Gateway from the Power BI web site and install the app on the Azure VM. The installation process is very simple and you shouldn't run into any problems here. After the Personal Gateway is installed, you will be asked to sign in to the Power BI online service with your Power BI account, which sets up the connection between the gateway and the Power BI service.


After the Personal Gateway is installed on the Azure VM, you can create Power BI reports connecting to the data sources hosted on the Azure VM, publish them to the Power BI service, and schedule refreshes for the reports.


We can then run a little test of the refreshing. As an example, I built a report using the Adventure Works product tables which shows the number of products in each product category.


Then I went to the database on the Azure VM and added a new product.


After refreshing the dataset on the Power BI online portal,


the report on the online Power BI service was updated to reflect the change.


Microsoft Flow – Automatic Language Translations in SharePoint-based solutions

I had a quick play with the preview version of Microsoft Flow, one of the newest toys offered by Microsoft for creating automated workflows between a wide range of online apps and services. One feature that interested me is the integration of the Microsoft Translator service, which automates translation between languages. I was thinking about what kinds of business requirements we could use this feature to fulfill.

My immediate idea was the auto-generation (and possibly distribution) of translated versions of an Office document (Word, Excel or PowerPoint) when it is dropped into OneDrive. I did actually give it a try; unfortunately, the translator service cannot parse the Office document formats, and I could only get plain txt files translated and generated successfully. As I believe there are really not many people using Notepad to author their business documents, this attempt failed.

As a SharePoint guy, another idea that came up was to automate language translations in SharePoint-based solutions. One possible business requirement I imagined is to auto-translate service/support requests and replies between the languages used in a multinational company (assuming not every employee speaks English). In the rest of this blog post, I prototype a simple service request app to test Microsoft Flow and the language translation feature.

The service request app is expected to work in this way:

  1. An employee from a non-English-speaking country raises a service request (using his or her local language) in a SharePoint list, namely “Service Requests”.
  2. Once the request is added to the SharePoint list, a Microsoft Flow workflow (“Service Request Submission”) is triggered to translate the request “title” and “description” into English, which is the language used in the enterprise services/support department. The translated “title” and “description” fields, along with other information about the service requestor, are added to a separate SharePoint list, “Service tracker” (as the current version of Microsoft Flow does not yet support updating a SharePoint list item, we have to use a separate list to store the translated fields).
  3. The enterprise services/support department then solves the request, adds a reply in the “Service tracker” list in English, and marks the status of the request as “Completed”.
  4. Another Microsoft Flow workflow, “Service Request Reply”, monitors updates to the “Service tracker” list items; if the status of a request changes to “Completed”, the workflow translates the “reply” field into the language used by the requestor and then sends an email with the reply message to the requestor.
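The logic of the two workflows above can be sketched in code. Everything below is a hypothetical stand-in: `detect_language` and `translate` are stubs mimicking the Microsoft Translator connector, and plain Python lists stand in for the two SharePoint lists:

```python
def detect_language(text):
    # Stub: pretend anything containing French accented characters is French.
    return "fr" if any(c in "éèêàç" for c in text) else "en"

def translate(text, source, target):
    # Stub: a real implementation would call the Translator service.
    return f"[{source}->{target}] {text}"

service_tracker = []  # stands in for the "Service tracker" SharePoint list

def on_request_created(item):
    """'Service Request Submission': detect the request language and add a
    translated copy, plus the requestor's details, to the tracker list."""
    lang = detect_language(item["title"])
    service_tracker.append({
        "title": translate(item["title"], lang, "en"),
        "description": translate(item["description"], lang, "en"),
        "requestor": item["requestor"],
        "email": item["email"],
        "language": lang,
        "status": "Open",
        "reply": None,
    })

def on_tracker_updated(item, outbox):
    """'Service Request Reply': once a request is completed, translate the
    reply back to the requestor's language and queue an email."""
    if item["status"] == "Completed":
        reply = translate(item["reply"], "en", item["language"])
        outbox.append({"to": item["email"], "body": reply})
```

The stored `language` field is what lets the reply workflow translate back without re-detecting anything, which is why it is copied into the tracker list along with the requestor's name and email.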

In this example, two Microsoft Flow workflows, “Service Request Submission” and “Service Request Reply”, are created. The detailed steps to create these workflows are listed below:

“Service Request Submission” Workflow

Firstly, we add a “SharePoint online – When a new item is created” step and configure the SharePoint site URL and the name of the “Service Requests” list.

Then, we add the “detect language” action to detect which language the request is using.


After that, we add the “translate text” action to translate the “title” and “description” fields into English.



Finally, we create an item in the “Service tracker” list with the translated “title” and “description” fields, along with the other information about the requestor (name, email and language) that will be used later in the “Service Request Reply” workflow.


After the “Service Request Submission” workflow is created, we can raise a request to the “Service Requests” SharePoint list, something like:


After a short while, an entry is added to the “Service Tracker” list with the translated fields and the other information about the requestor.


“Service Request Reply” Workflow


The “Service Request Reply” workflow starts when the enterprise services/support department adds the reply to the “Service tracker” list and marks the “status” field as “Completed”.


Firstly, we add the “SharePoint online – When a new item is created” step and configure the SharePoint site URL and the name of the “Service Tracker” list.


We then add a condition step to check if the “status” field is marked as “Completed”.


If so, we translate the “Title” and “Reply” fields from English to the language used by the requestor.



At the end, we send an Office 365 email to the requestor with the translated reply.


After the “Service Request Reply” workflow is triggered and executed, the requestor will receive the email with the translated reply.


ABC Classification Analysis with Tableau

In one of my previous posts, I conducted an ABC classification analysis using MDX against an SSAS cube. In this post, I will conduct an ABC classification analysis using Tableau calculations.

In this example, we want to classify the products based on their sales amounts into three categories: ‘A’ (products making up the top 5% of total sales), ‘B’ (the next 5%–20%), and ‘C’ (the others).

Firstly, we add the Product Name attribute and the Sales Amount measure to a Tableau table and sort the table by Sales Amount in descending order.


We then create three calculations, “Sales Running Total”, “% Sales Running Total”, and “ABC Classification Category”:

1. [Sales Running Total]


This calculation sums up the sales amount from the top-ranked product down to the product in the current row.

2. [% Sales Running Total]


When [Sales Running Total] is divided by the total sales of all products, we get the relative position of the current product against all products.

3. [ABC Classification Category]


This calculation checks the position of the current product in the sales amount range and places the product into the corresponding category. The Tableau LOOKUP function is used to evaluate whether the total sales of all the products ranked above the current product is less than 5% of the total sales of all products. If so, the current product falls into category ‘A’; if that running total is greater than or equal to 5%, the current product is not part of the top 5% of product sales and falls into category ‘B’ or ‘C’. We then use the same approach with the 20% threshold to find which products fall into category ‘B’, and the remaining ones are in category ‘C’.
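The same classification logic can be expressed outside Tableau. Here is a minimal Python sketch that mirrors the LOOKUP-based check: each product's category depends on the running total held by the products ranked above it, which is exactly what the table calculation reads via LOOKUP with an offset of -1. The 5%/20% thresholds follow the definition above:

```python
def abc_classify(sales):
    """Classify products into A/B/C by the cumulative sales share of all
    higher-ranked products (the value LOOKUP(..., -1) sees in Tableau)."""
    total = sum(sales.values())
    running = 0.0
    categories = {}
    # Walk products from highest to lowest sales amount.
    for name, amount in sorted(sales.items(), key=lambda kv: kv[1], reverse=True):
        prior_share = running / total  # share held by products ranked above
        if prior_share < 0.05:
            categories[name] = "A"
        elif prior_share < 0.20:
            categories[name] = "B"
        else:
            categories[name] = "C"
        running += amount
    return categories
```

Note that, as in the Tableau version, the top-ranked product always lands in ‘A’ (its prior running total is zero), and a product straddling a threshold is classified by where the products above it end, not by its own cumulative total.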

We can then add the calculations to the table and have the products mapped to the categories.


or as a nice little chart:


Power BI Fun – Euro Football Clubs Spending & Performances

For fun, I’ve just quickly put together a Power BI report using data on European major football league clubs’ spending and performances. If you are interested, you can download the pbix file here.