Hide measures using Row Level Security – #PowerBI

In some cases, you might not want to give all users access to all measures in your model – for instance, you might not want to show profit to certain users.

In Power BI Desktop and the Service we can’t hide measures depending on the active user, but by combining a dynamic measure and row level security we can work our way around this.

In my example I use data from AdventureWorksDW2014 and have created a data model around FactResellerSales.

So, we have 5 measures but Sales Profit, Product Cost and Profit pct should be hidden for some users.

First up – creating a dynamic fact

I created a table by using the Enter Data option

The column Secret will be used to filter by user, and we will use the FactKey column in a SWITCH statement to create a dynamic fact.
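As a sketch, the entered table could also be expressed as a DAX calculated table – the FactKey mapping and the Secret flags shown here are my own assumptions for illustration:

```
Dynamic Fact =
DATATABLE(
    "Fact", STRING,
    "FactKey", INTEGER,
    "Secret", BOOLEAN,
    {
        { "Sales Amount", 1, FALSE },
        { "Sales Profit", 2, TRUE },
        { "Sales Units", 3, FALSE },
        { "Product Cost", 4, TRUE },
        { "Profit pct", 5, TRUE }
    }
)
```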

The dynamic fact

The fact will be created like this

Selected Fact =
SWITCH(
    SELECTEDVALUE('Dynamic Fact'[FactKey]),
    1, [Sales Amount],
    2, [Sales Profit],
    3, [Sales Units],
    4, [Product Cost],
    5, [Profit pct],
    BLANK()
)

In the model I hide all the columns and only show the fact

Hide the table FactResellerSales

This prevents the user from selecting any of the measures created in FactResellerSales directly.

Use the dynamic fact

So, in order to use the fact we have to tell the visual which fact to use.

In this case I have selected the fact in a card visual.

You can also use a matrix to show more facts at once

Create the Row Level Security

Now we need to add Row Level Security

NonSecretMeasures is now set to filter out the Facts where [Secret] is set to True.
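The table filter DAX expression on 'Dynamic Fact' for the NonSecretMeasures role would be along these lines (a sketch based on the Secret column described above):

```
[Secret] = FALSE()
```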

Test the RLS

We can now test the Row Level Security in Power BI Desktop designer

So when viewing as NonSecretMeasures the user sees this

But when viewing as AllMeasures we see

Scaling it – consider moving it to Azure Analysis Services/Tabular model

This method doesn’t really scale very well but can be used in small models.

If your model is bigger and more complicated, you should look at building the model using Azure Analysis Services or an on-premises tabular model, where you can implement object level security.

Q & A can help

Using Q & A in the report – it makes it a bit easier to create the visuals

Let me know what you think

Link to demo file – here

 

#PowerQuery – Filter a table based on another table column or list – and some Filter aha’s

One of my favourite features in Excel and Power BI is Power Query / M – and I just wanted to share a small trick when you want to filter a table on specific items.

Let’s imagine you want to filter a list of customers based on different CustomerCategoryID’s –

Using the interface, you would select the different categories via the filter menu

If you select 3, 4 and 5 – you will get this filter

= Table.SelectRows(Sales_Customers, each ([CustomerCategoryID] <> 6 and [CustomerCategoryID] <> 7))

Notice that M creates an expression that excludes 6 and 7 rather than specifically selecting 3, 4 and 5 – this means that when new customer categories are created they will be included in your query as well – perhaps not what you intended!

If you only select 3 and 4 the expression built will be

= Table.SelectRows(Sales_Customers, each ([CustomerCategoryID] = 3 or [CustomerCategoryID] = 4))

So, it seems that if you pick more than half of the values, it builds an expression with <> statements instead of equality statements.

To make sure you only get the categories you want, you can use a list to specify the keys to be included or excluded.

To create a list you can use an expression like

= {3..5} – will give you values from 3 to 5

Or

= {3,6,5} – will give you 3, 6 and 5

To filter your table, you now need to modify the Table.SelectRows expression

= Table.SelectRows(Sales_Customers, each ([CustomerCategoryID] = 3 or [CustomerCategoryID] = 4))

To

= Table.SelectRows(Sales_Customers, each List.Contains(Query1, [CustomerCategoryID]))

List.Contains checks whether each row's CustomerCategoryID exists in the list and returns true if it does – your table will then only contain the rows where true is returned.
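If you prefer not to keep the keys in a separate query, the same filter can be written with an inline list – a small variation of the expression above:

```
= Table.SelectRows(Sales_Customers, each List.Contains({3..5}, [CustomerCategoryID]))
```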

If you wanted to exclude the values that you have in your list you can change the expression to

= Table.SelectRows(Sales_Customers, each List.Contains(Query1, [CustomerCategoryID]) = false)

 

 

Happy Querying

I am now an MVP :-)

Last week I received an e-mail from Microsoft that I had been awarded the MVP Award.

This made me glad and proud.

My colleagues, friends and family all know me as a very passionate and loyal Microsoft fan – and now I can show that it has been noticed by Microsoft as well.

I want to say thanks to all the people who have attended my talks over the years, the people who have given me the chance to speak, and the great communities of SQL Saturday, MS BIP Denmark and finally the Power BI Community.

I will continue my work, especially for the Power BI community, and have set a few goals for the coming year – one of them is of course to get the MVP Award for 2019-2020.

A special thanks to Ruth Pozuelo for nominating me.

How to build a location aware #PowerApp – ‘Guide book Copenhagen’ – #opendatadk and #powerquery

In one of my latest projects we have used PowerApps to create a location aware selection of stores – and I wanted to share my experience about how to do this.

So, I found an open data set about attractions, restaurants, hotels and much more in Copenhagen.

https://portal.opendata.dk/dataset/guidedanmark-oplevelser-overnantning-aktiviteter-i-hele-danmark

In order to get the data into PowerApps – I created an Excel workbook and used PowerQuery to import the data in a Table.

The Query to create the table is quite simple and contains a little renaming and removal of unwanted columns, and I only imported the rows that have a Latitude and Longitude.

let
    Source = Json.Document(Web.Contents("https://portal.opendata.dk/dataset/44ecd686-5cb5-40f2-8e3f-b5e3607a55ef/resource/23425a7f-cc94-4e7e-8c73-acae88bf1333/download/guidedenmarkcphenjson.json")),
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Id", "Created", "CreatedBy", "Modified", "ModifiedBy", "Serialized", "Online", "Language", "Name", "CanonicalUrl", "Owner", "Category", "MainCategory", "Address", "ContactInformation", "Descriptions", "Files", "SocialMediaLinks", "BookingLinks", "ExternalLinks", "MetaTags", "RelatedProducts", "Places", "MediaChannels", "Distances", "Priority", "Periods", "PeriodsLink", "PriceGroups", "PriceGroupsLink", "Routes", "Rooms", "Capacity"}, {"Id", "Created", "CreatedBy", "Modified", "ModifiedBy", "Serialized", "Online", "Language", "Name", "CanonicalUrl", "Owner", "Category", "MainCategory", "Address", "ContactInformation", "Descriptions", "Files", "SocialMediaLinks", "BookingLinks", "ExternalLinks", "MetaTags", "RelatedProducts", "Places", "MediaChannels", "Distances", "Priority", "Periods", "PeriodsLink", "PriceGroups", "PriceGroupsLink", "Routes", "Rooms", "Capacity"}),
    #"Expanded Address" = Table.ExpandRecordColumn(#"Expanded Column1", "Address", {"AddressLine1", "AddressLine2", "PostalCode", "City", "Municipality", "Region", "GeoCoordinate"}, {"AddressLine1", "AddressLine2", "PostalCode", "City", "Municipality", "Region", "GeoCoordinate"}),
    #"Expanded GeoCoordinate" = Table.ExpandRecordColumn(#"Expanded Address", "GeoCoordinate", {"Latitude", "Longitude"}, {"Latitude", "Longitude"}),
    #"Filtered Rows" = Table.SelectRows(#"Expanded GeoCoordinate", each ([Latitude] <> null and [Latitude] <> 0)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"Municipality", "Region", "ContactInformation", "Descriptions", "Files", "SocialMediaLinks", "BookingLinks", "ExternalLinks", "MetaTags", "RelatedProducts", "Places", "MediaChannels", "Distances", "Priority", "Periods", "PeriodsLink", "PriceGroups", "PriceGroupsLink", "Routes", "Rooms", "Capacity"}),
    #"Expanded Category" = Table.ExpandRecordColumn(#"Removed Columns", "Category", {"Name"}, {"Name.1"}),
    #"Removed Columns1" = Table.RemoveColumns(#"Expanded Category", {"Owner"}),
    #"Expanded MainCategory" = Table.ExpandRecordColumn(#"Removed Columns1", "MainCategory", {"Name"}, {"Name.2"}),
    #"Renamed Columns" = Table.RenameColumns(#"Expanded MainCategory", {{"Name.2", "MainCategory"}}),
    #"Removed Columns2" = Table.RemoveColumns(#"Renamed Columns", {"AddressLine2"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns2", {{"Name.1", "Category"}}),
    #"Removed Columns3" = Table.RemoveColumns(#"Renamed Columns1", {"Created", "CreatedBy", "Modified", "ModifiedBy", "Serialized", "Online", "Language"})
in
    #"Removed Columns3"

The Excel file is then saved in Onedrive for business.

 

Let's build the app

 

I use the Web studio.

 

And select the Blank app with a Phone layout

On the canvas I click the Connect to data and create a new connection that connects to Onedrive for Business and pick the Excel file

 

So now we have a connection to data in our App

 

And I insert the following controls

 

The first two labels show my location as latitude and longitude, and then I inserted a slider with a min of 0 and max of 2000 as the radius in meters around my location. The label above the slider just shows the selected radius.

Now we can insert a drop down, set its Items to the Name column of the data connection, and see that it works.

 

Now we must filter the items based on our current location. This can be done using the Filter function.

The formula uses the slider to modify the radius around our location:

Filter(
    CopenhagenGuide,
    Value(Latitude, "en-US") >= Location.Latitude - Degrees(Slider1/1000/6371) &&
    Value(Latitude, "en-US") <= Location.Latitude + Degrees(Slider1/1000/6371) &&
    Value(Longitude, "en-US") >= Location.Longitude - Degrees(Slider1/1000/6371/Cos(Radians(Location.Latitude))) &&
    Value(Longitude, "en-US") <= Location.Longitude + Degrees(Slider1/1000/6371/Cos(Radians(Location.Latitude)))
)
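The math here converts the slider's radius in meters to degrees using the Earth's mean radius of 6371 km, widening the longitude band by 1/cos(latitude) because meridians converge away from the equator. A small Python sketch of the same bounding-box calculation (the sample coordinates are just an example):

```python
import math

EARTH_RADIUS_KM = 6371  # mean Earth radius, same constant as in the PowerApps formula

def bounding_box(lat, lon, radius_m):
    """Return (lat_min, lat_max, lon_min, lon_max) for a radius given in meters."""
    d_lat = math.degrees(radius_m / 1000 / EARTH_RADIUS_KM)
    # longitude degrees shrink with latitude, so divide by cos(latitude)
    d_lon = math.degrees(radius_m / 1000 / EARTH_RADIUS_KM / math.cos(math.radians(lat)))
    return (lat - d_lat, lat + d_lat, lon - d_lon, lon + d_lon)

# Copenhagen city centre with the slider at its 2000 m maximum
box = bounding_box(55.6761, 12.5683, 2000)
```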

And if I now limit the radius to 173 meters you can see I have 4 places nearby

If you want to add a map as well highlighting the selected Attraction you can do that as well

 

You can find the information to do that here – https://powerapps.microsoft.com/en-us/blog/image-control-static-maps-api/

 

If you want a copy of the PowerApp file you are welcome to add a comment or ping me on twitter @donsvensen and I will send it to you.

 

Hope you can use this – Power ON!


How to change the #PowerBI desktop file connection from data model to a Power BI Service dataset or #AzureAS

In April 2017 we got the ability to connect our Power BI Reports to datasets in the service (link) and that is really cool.

Today I got a question from a colleague on how to change a report's dataset in order to separate the reports from the data model – having one pbix file with the data model and then designing reports that connect to the dataset in the Power BI Service – and I came up with this workaround.

This technique can also be used if you want to change a report's connection to an Azure Analysis Services model, or to copy a report to another workspace and point it to a dataset there.

Let’s see how we can do this

In this example I have designed a data model and report that is connected to data in a SQL database

And deployed the pbix file to the service

This gives us a dataset and a report.

Now we open the report and choose to export it as pbix

Now I named the report – Demo PBI Exported.pbix

This will include the data model including all the queries etc.

If we deploy this to the service again we will end up with two datasets – not a good idea, as we will have two data models 😦

The best thing would be that this report was tied to the dataset in the service – but the “Get Data” doesn’t give us the option to change the connection in a file with a data model

 

So, I decided to create another pbix file with a connection to the Power BI service dataset

This gives me the same fields but is connected to the service

Notice that you can’t see the data and relationship in the panel to the left

 

So now I have 2 pbix files – one connected to the service and one with the report and the datamodel (and the original model)

First thing is to change the two files' extension to zip – as a pbix file is just a zip file with different files within.

Now the exported pbix files listed in the picture to the right has a “large” datamodel as it includes the data and queries.

I then extracted both zip files to separate folders

I then copied the 2 files

Connections

Datamashup

From the pbix file connected to the service

And pasted them into the folder containing the extracted pbix file Exported from the service

I chose to overwrite the existing files.

Then I deleted the file called DataModel and zipped all the files in the folder to a new zip file – in this case called magic

Then I changed the extension back to pbix and opened the file.
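The whole file shuffle can be sketched programmatically as well – here in Python, with placeholder contents standing in for the real pbix internals (the real files are binary and the entry names may differ slightly from these assumptions):

```python
import os
import tempfile
import zipfile

def make_pbix(path, files):
    """Write a zip archive with the given {name: content} entries."""
    with zipfile.ZipFile(path, "w") as z:
        for name, data in files.items():
            z.writestr(name, data)

workdir = tempfile.mkdtemp()
exported = os.path.join(workdir, "exported.pbix")  # report + embedded data model
live = os.path.join(workdir, "live.pbix")          # thin file connected to the service
magic = os.path.join(workdir, "magic.pbix")        # result of the swap

# Dummy stand-ins for the real archives
make_pbix(exported, {"Connections": "local", "DataMashup": "queries",
                     "DataModel": "big-model", "Report/Layout": "visuals"})
make_pbix(live, {"Connections": "service", "DataMashup": "service-mashup"})

# Rebuild the exported file: take Connections/DataMashup from the live file,
# drop DataModel, and keep everything else (the report layout)
with zipfile.ZipFile(exported) as src, \
     zipfile.ZipFile(live) as svc, \
     zipfile.ZipFile(magic, "w") as dst:
    svc_names = set(svc.namelist())
    for name in src.namelist():
        if name == "DataModel":
            continue  # delete the embedded model
        data = svc.read(name) if name in svc_names else src.read(name)
        dst.writestr(name, data)
```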

This will give us this look of our reports

Not exactly magic, you might say – but wait – it actually is – now go to Get Data and connect to the Power BI service

 

And voila your report is now connected to the Power BI service

 

When publishing this to the service you won’t get another dataset but a new report connected to the dataset.

 

And even better, when/if users download the report as a pbix file it's connected to the service and not the data model.

Using this method also enables you to copy reports between workspaces and just point to the correct power bi service dataset.

 

Can it be done if we want to change connection to Azure Analysis Services – It sure can 🙂

 

So, I uploaded the pbix file to Azure Analysis Services to create a copy of the data model in Azure using the web-designer.

 

I then repeated the steps of overwriting the Connections and Mashup files in the extracted zip folder with the files from the extracted service folder.

And created a new zip file – Magic Azure AS.zip – then changed the extension to pbix and opened the file.

Opened the Get Data experience, switched to the Azure group and chose Azure Analysis Services database

 

And entered the server and database information

And the report is now connected to an Azure Analysis Services 😀

This file can now be published to any workspace, as it isn't tied to a Power BI service dataset.

This will of course give us a new dataset in the service as it is pointing to the Azure Analysis Services

You should also check out the Power BI REST API if you are interested in automating the creation of reports and changing dataset connections – but you can't rebind a report to a Power BI dataset using the APIs (it would be nice if we could).

Link to documentation about the Power BI Rest API’s – find them here

Please Please

 

Please let me know if you find this useful by adding a comment or a like on this post.


Installing the #PowerBI Desktop from the Windows Store – with a few surprises

At Ignite it was just announced that Power BI Desktop can now be downloaded via the Windows Store, meaning that the desktop will update automatically – and that is great news.

Announcement in the middle of this post – https://powerbi.microsoft.com/en-us/blog/power-bi-at-ignite-2017-announcing-enterprise-ready-capabilities-for-establishing-data-culture-at-scale/

But here is what I noticed when installing the app from the store (link)

My desktop before installing – I had updated as soon as the September release was ready, and not since.

So my version was 2.50.489.502

Then I went to the store and found the Power BI Desktop app

This gave me a new app, but it didn't replace my existing app – and as my language was set to Danish I got a Danish version – we will fix that later – read on

Notice the version is now 2.50.4859.781 – ah, a new version number.

Then I decided to check the “old” place to download the desktop – https://www.microsoft.com/en-us/download/details.aspx?id=45331&751be11f-ede8-5a0c-058c-2ee190a24fa6=True

And found that the latest version here is

2.50.4859.782

So, a build later than the store App …. Hmmm…

Old version to the left and Store App to the right.

And now I have two apps in my start menu – the folder contains the "old" manually installed Power BI Desktop

 

When opening files, if you haven't uninstalled the "old" version you will be prompted to choose which application to open the file with

 

The one with the smaller Power BI Desktop icon is the Windows Store app.

I think I will continue to have both applications installed and see what happens the next couple of weeks – especially until the October release

PS – by installing English (United Kingdom) language package I got my Power BI Desktop “App” to run in English instead of Danish 🙂

 

Updating a #PowerBI dataset using Microsoft #Flow, #Azure Functions and a #Flic button

I am working on a project where a customer would like to update a Power BI dataset when specific events occur – in their case when an email with a specific subject arrives in a specific mailbox – and we succeeded by using Microsoft Flow, Azure Functions and the Power BI REST API.

But I recently bought Flic buttons (https://flic.io/), so I wanted to change the setup a bit so the dataset is updated when I click a button – sort of my first IoT project.

The ingredients

  • 1 dataset in your PowerBI tenant that can be scheduled for refresh
  • 1 Power BI Pro license (or premium if you want more than 8 refreshes a day)
  • 1 Registration of an Application for Power BI ( link – https://dev.powerbi.com/apps )
  • 1 Azure subscription
  • 1 Power Shell script
  • 1 Azure Function/Logic App
  • 1 Microsoft Flow license in any tenant
  • 1 Flic button
  • 10-15 minutes

First up – the information about the dataset and the group

We need the dataset ID and group ID where the dataset is placed.

The easiest way to find them is to navigate to the settings, open the dataset tab and click the dataset – the group ID is in the blue area and the dataset ID is in the red box.

Notice that the Scheduled refresh is set to Off.
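For reference, these two IDs end up in the Power BI REST API call that triggers the refresh – the endpoint has this shape:

```
POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
Authorization: Bearer {access token from Azure AD}
```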

Get a client ID for your Power BI Application

You will need to register an application for the update, and you can do this via https://dev.powerbi.com/apps

Sign in and fill out the information.

The App Type has to be Native app and the Redirect URL must be – urn:ietf:wg:oauth:2.0:oob

Select both of the Dataset APIs, and in step 4 click to register the app and save the Client ID.

Powershell script to update the dataset

You can find the Powershell script here – https://github.com/Azure-Samples/powerbi-powershell/blob/master/manageRefresh.ps1

Fill in the $ClientID, $groupID and $datasetID in the script.

In order to avoid popup windows with authentication in the Azure Function I had to modify the script and hardcode the username and password in my script.

$userName = "username"
$password = "password"
$creds = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential" -ArgumentList $userName, $password
$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
$authResult = $authContext.AcquireToken($resourceAppIdURI, $clientId, $creds)

The username, password and $creds lines are added, and the $authResult line is modified as well.

Azure Function

Log on to the Azure portal – https://portal.azure.com/ and create a Function App

Wait for the deployment to finish – it takes about 1-2 minutes.

Choose to create your own custom function

Select language Powershell

And the HttpTrigger

In my example, I will keep authorization level to anonymous – but you can add more security if needed.

Now replace the script with the PowerShell script from the previous section.

Now you can save and click run to see if the script works

If the script succeeds – you can check whether the dataset is updated via the Power BI dataset refresh history

By clicking Get Function URL

We will get the POST Url that we will use later in our Flow.

Now – Microsoft Flow

Open your Microsoft Flow in your Office 365 portal and create a new blank flow

Now we can select between a lot of different triggers that can activate our data refresh in Power BI. It could be a certain tweet, a mail arriving in an inbox, a file in OneDrive and many more – we currently have 160 triggers to choose from.

In my example, I want to integrate with a flic button

And by connecting to my Flic account I can choose the button I want to use and which event on the button (it has click, double click and hold) the flow should be attached to

To activate the refresh, I add a new step; as the action I choose HTTP, where I can POST to the function's URL.

After the HTTP action, I add a new step that sends an email to me – with information about the refresh, which button was pressed and where it was located when pressed.

Then save the flow and we should be ready to update the dataset via the button

Trying the flow

So by clicking the button we can see the Run History

As you can see, the flow ran at 23:25

And when checking the refresh history in Power BI we can see it's refreshed

And I get an e-mail as well

Note – the refresh can "only" be done 8 times a day with a Power BI Pro license, but 48 times if you should be so lucky as to have access to a Premium edition.

Wrapping up

It has been so much fun to create this and it was surprisingly easy – with, of course, some challenges along the way.

But the toolset is amazing and combining Flow and Power BI opens a lot of possibilities and ideas to activate refreshes and do stuff with the Power BI REST API triggered by different events.

I will at least have an extra look at the API and see what can be done – link to documentation

Let me know if you find this interesting.