#powerquery – How to handle different decimal separator when importing csv files

Recently I have been working on a project where the solution should import a csv file exported from a SQL Server. For some reason the data sometimes arrives with a , (comma) as the decimal separator and other times with a . (dot) as the decimal separator.

This meant that when importing the different files, I had to find a way to dynamically change the culture code setting when importing the file.

Let’s try and open the SalesWithDot.csv file in Power BI Desktop

As my Power BI Desktop uses Danish settings, and Danes use , as the decimal separator, Power BI Desktop will think the value is 30.000 instead of 300 etc. – so we have to tell Power Query that the source is from another culture – so I click Edit

As we can see from the documentation Table.TransformColumnTypes – the function has an optional parameter called Culture –

And by adding "en-US" as the culture in the formula bar we get the right values in the SalesValue column

But when we use the same culture value ("en-US") to import a file where the sales value uses a , as the decimal separator, we have the opposite problem.

And by changing the culture to da-DK it shows the right values

So how can we make Power Query dynamically determine which locale to use?

In this case I know that the column SalesValue will contain either a comma or a dot – so by checking the first value of the imported file I can select which culture to use. This can be done like this:

The step before we “Changed Type” is called “Promoted Headers”

To check whether the Sales Value contains a comma – we can use the Text.Contains function

And we can refer to the first value in the SalesValue column like this

#"Promoted Headers"[SalesValue]{0}


Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",")

will give us true if the cell contains a comma before it is changed to a decimal value.

When we know this, we can change the "Changed Type" step from

= Table.TransformColumnTypes(#"Promoted Headers",{{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}}, "da-DK")

to

= Table.TransformColumnTypes(#"Promoted Headers",{{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}}, if Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",") then "da-DK" else "en-US")

The if statement will then use the da-DK culture if the SalesValue contains a comma.
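Putting it all together, the full query could look like this – a sketch, where the file path, delimiter and encoding are assumptions based on the example above:

```m
let
    // load the csv file - path and delimiter are assumed for illustration
    Source = Csv.Document(File.Contents("C:\Data\SalesWithDot.csv"), [Delimiter = ";", Encoding = 65001]),
    #"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // pick the culture based on whether the first SalesValue contains a comma
    Culture = if Text.Contains(#"Promoted Headers"[SalesValue]{0}, ",") then "da-DK" else "en-US",
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers", {{"Product", type text}, {"SalesValue", type number}, {"SalesUnits", Int64.Type}}, Culture)
in
    #"Changed Type"
```

Storing the detected culture in its own step also makes the logic easier to read than embedding the if expression directly in Table.TransformColumnTypes.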

And in the example file with the dot

You can download an example file here.

Hope you find this useful – Power On!

Designing an R chart in #Powerbi just got a lot easier for an R novice

I have long wished to use the power of R to create some fancy visuals in Power BI, but I simply haven’t had the time to read the manual 🙂

Then today I saw a retweet about a tool that makes building ggplot2 charts easy –

And thought that might be able to help me get started – AND IT SURE DID

The tools are still under development so expect a few bumps

First – RStudio

You need to install RStudio on your computer to support the Addin. So if you haven’t done so already – go to https://www.rstudio.com/

To install the addin you can follow the instructions on the github site for the addin – https://github.com/dreamRs/esquisse

Simply insert this in the code window and run it
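At the time of writing, the install code on the GitHub page looked roughly like this (check the repo’s README for the current instructions, as the package is still under development):

```r
# install the esquisse addin from GitHub (per the repo's README)
install.packages("devtools")
devtools::install_github("dreamRs/esquisse")
```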

R-Studio will then install the addin and all its help files etc.

Now we can run the addin by running this line
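The line in question is the one that launches the addin:

```r
esquisse::esquisser()
```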


and the design window will pop-up

But the magic happens when we do it from PowerBI Desktop

So I created a small data model based on AdventureWorksDW2014 database – for the ResellerSales

And I plot all the resellers by Sales_Amount and Sales_Profit and use the business type as the legend.

Then I insert the same fields in an R visual

And in order for the R visual to show a chart, we have to open the R script editor at the bottom of the screen

And now I have to paste or type my R script code here… and this is typically where I previously started searching Google for what I needed to do.

I would quickly find that the library ggplot2 was needed to create a scatterplot, but then I needed to learn a lot about formatting, legends, axes etc. – and normally I would give up within a few hours.

The addin to the rescue

It is actually possible to activate the AddIn from the R Script editor 😀

If we add this line to the editor – and click RUN – magic will happen
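The line is the same one we used in RStudio earlier:

```r
esquisse::esquisser()
```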


It loads the designer in the browser !!!

Our dataset is automatically available in the dialog for choosing the dataset

635 records and 4 fields !!!

Choosing the dataset will give us a validation of the fields (OBS – don’t use spaces in the field names you add!)

We can now drag the fields to the different areas of axis, color and size and start to format the chart further

It will automatically switch between different types of visualizations depending on what you choose, and via the menus at the bottom you can do a lot of formatting. When you are done, open the Export & code window and copy the code to the clipboard.

Then close the window and switch back to Power BI Desktop and paste in the code

Remember to comment out the line esquisse::esquisser() by adding a # in front

Click Run Script, minimize the R script editor, and the R chart is now in your Power BI Desktop – awesome!!!

The chart types supported in ggplot2 are these

And designing and using them just got a lot easier.

So I recommend that you follow and support them on GitHub and help them fix bugs by reporting any issues.

Power ON!

Time your Power Queries – #powerbi #powerquery

Sometimes your Power Query queries can get quite heavy, and you might need to optimize the steps in your query – but how can you measure the time it takes for your query to load?

Wouldn’t it be nice if you could have Power Query do it for you?

Use Power Query to time Power Query

Well – the Query Dependencies window gave me the idea – what if I had a query with the start time of the refresh, then made the Sales table dependent on that, and then a duration table that was dependent on the Sales table?

Steps needed

First, a query that records when the refresh is started:

let
    Source = DateTime.LocalNow()
in
    Source

This uses DateTime.LocalNow() to set the start time.

Now, in this example I am loading an Excel file from my local hard drive – 22.3 MB with approx. 365,000 rows.

After a navigation step and a promoted headers step, I add a Custom Column where I refer to the query "Start".

This will add the start date to all rows in a separate column and will make the sales table dependent on the Query “Start”.
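The added step could look like this in M – a sketch, assuming the preceding step is called "Promoted Headers":

```m
#"Added Custom" = Table.AddColumn(#"Promoted Headers", "Start", each Start)
```

Because the column refers to the "Start" query, Power Query has to evaluate "Start" before it can load the Sales table, which is what creates the dependency.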

Next, we need to calculate the duration in a query that is dependent on the “Sales” table.

Step 1

Create a calculation of Time now.

Step 2

Convert it into a table

Step 3

To make it dependent on Sales, I add a calculated column that retrieves the MIN value of the "Start" column in the "Sales" table

Step 4

Rename the columns

Step 5

Now we can calculate the duration in seconds by using the function Duration.Seconds() on the result of subtracting [Start] from [End]

Step 6

And finally convert it to a decimal value

The full query is now


let
    Source = DateTime.LocalNow(),
    #"Converted to Table" = #table(1, {{Source}}),
    #"Added Custom1" = Table.AddColumn(#"Converted to Table", "Custom.1", each List.Min(Sales[Start])),
    #"Renamed Columns" = Table.RenameColumns(#"Added Custom1",{{"Column1", "End"}, {"Custom.1", "Start"}}),
    #"Added Custom2" = Table.AddColumn(#"Renamed Columns", "Query Duration", each Duration.Seconds([End]-[Start])),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"Query Duration", type number}})
in
    #"Changed Type"

Then I disable the load of the Sales table in order not to have Power Query read the file several times. (OBS – be careful if you have already created measures on the table, as disabling the load will remove these measures!)

To time the refresh I click the Refresh button

And the card I have inserted in my report will show the number of seconds the query took.

Now let’s see what a conditional column costs

So, in the Sales query I add a conditional column that evaluates every row using the contains operator

And click refresh again.

Depending on your scenario you should probably run the refresh several times in order to see the effect on your query.


Please let me know if you have comments or have solved how to time your power queries in another way.

Happy querying

Change the order of reports in the #powerbi service

Just wanted to share a hack for the problem that you can’t determine the sort order of your reports in the service – they will always be sorted alphabetically.

But by using a little trick you can overcome this – and not by putting a number or a label like A., B. etc. in front of the name.

As you can see, my “Demo” report is actually listed before the “Buildings Con….” report.

You can do it like this

Choose the rename option for the report in the navigation pane.

Then simply add spaces before the name of the report – so if you have 4 reports, the report you want listed first should have 4 spaces added, the next one 3 spaces, and so on.

The interface will not show the spaces so it will still look nice – you might have to switch to another workspace and back again in order for the cache of the webpage to be refreshed.


Hope this can help you as well.


Move or Resize #PowerBI visuals with the arrowkeys

You might already know that you can move a selected visual with the arrow keys – one point at a time – and that if you hold down the SHIFT key it will move 8 or 9 points each time you press an arrow key.

See this example – move the visual with the arrow keys

But can you also resize multiple visuals?

Until today I didn’t think it was possible to resize visuals using the arrow keys – but it can be done, even when you select more than one visual of the same type.

So, if you want to make all your cards or bar charts the same size – you can simply select them and then switch to the Format tab of the visual – under General you will find the width and height of the selected visuals.

You can enter new values OR use the magic of the arrow keys!!!!!!! – if you use arrow up or down you can actually change the number 1 point at a time

Check out this video

This will naturally also make your visuals exactly the same size.

It will save me and hopefully also you a lot of mouse clicks in alignment and resizing.

Hide measures using Row Level Security – #PowerBI

In some cases, you might not want to give all users access to all measures in your model – for instance, you might not want to show the profit to certain users.

In the Power BI Desktop designer/service we can’t hide measures depending on the active user, but by combining dynamic measures and row level security we can work around this.

In my example I use data from the AdventureWorksDW2014 database and created a data model around FactResellerSales.

So, we have 5 measures, but Sales Profit, Product Cost and Profit pct should be hidden from some users.

First up – creating a dynamic fact

I created a table by entering data directly into a table

The column Secret will be used to filter by user, and we will use the FactKey in a SWITCH statement to create a dynamic fact.

The dynamic fact

The fact will be created like this

Selected Fact =
SWITCH (
    SELECTEDVALUE ( 'Dynamic Fact'[FactKey] ),
    1, [Sales Amount],
    2, [Sales Profit],
    3, [Sales Units],
    4, [Product Cost],
    5, [Profit pct]
)

In the model I hide all the columns and only show the fact

Hide the table FactResellerSales

This prevents the user from selecting any of the measures created in the FactResellerSales table directly.

Use the dynamic fact

So, in order to use the fact we have to tell the visual which fact to use.

In this case I have selected the fact in a card visual.

You can also use the matrix to show more facts at once

Create the Row Level Security

Now we need to add Row Level Security

NonSecretMeasures is now set to filter out the Facts where [Secret] is set to True.
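The table filter DAX expression on the ‘Dynamic Fact’ table for the NonSecretMeasures role could be as simple as this (a sketch, assuming Secret is a True/False column as entered above):

```dax
'Dynamic Fact'[Secret] = FALSE()
```

With this filter in place, the rows flagged as secret never reach the SWITCH in the dynamic measure, so SELECTEDVALUE can only ever pick up a non-secret FactKey.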

Test the RLS

We can now test the Row Level Security in Power BI Desktop designer

So when viewing as NonSecretMeasures the user sees this

But when viewing as AllMeasures we see

Scaling it – consider moving it to Azure Analysis Services/Tabular model

This method doesn’t really scale very well but can be used in small models.

If your model is bigger and more complicated, you should look at building the model using Azure Analysis Services or an on-prem tabular model, where you can implement object level security.

Q & A can help

Using Q & A in the report makes it a bit easier to create the visuals

Let me know what you think

Link to demo file – here


Updating a #PowerBI dataset using Microsoft #Flow, #Azure Functions and a #Flic button

I am working on a project where the customer would like to update a Power BI dataset when specific events occur – in their case when an email with a specific subject arrives in a specific mailbox – and we succeeded by using Microsoft Flow, Azure Functions and the Power BI REST API.

But as I recently bought Flic buttons (https://flic.io/), I wanted to change the setup a bit so the dataset is updated when I click the button – sort of my first IoT project.

The ingredients

  • 1 dataset in your PowerBI tenant that can be scheduled for refresh
  • 1 Power BI Pro license (or premium if you want more than 8 refreshes a day)
  • 1 Registration of an Application for Power BI ( link – https://dev.powerbi.com/apps )
  • 1 Azure subscription
  • 1 Power Shell script
  • 1 Azure Function/Logic App
  • 1 Microsoft Flow license in any tenant
  • 1 Flic button
  • 10-15 minutes

First up – the information about the dataset and the group

We need the dataset ID and group ID where the dataset is placed.

The easiest way to find them is to navigate to the settings and the Datasets tab and click the dataset – the group ID is in the blue area and the dataset ID is in the red box.

Notice that the Scheduled refresh is set to Off.

Get a client ID for your Power BI Application

You will need to register an application for the update, and you can do this via the link above (https://dev.powerbi.com/apps).

Sign in and fill out the information.

The App Type has to be Native app and the Redirect URL must be – urn:ietf:wg:oauth:2.0:oob

Select both of the Dataset APIs, and in step 4 click to register the app and save the Client ID.

Powershell script to update the dataset

You can find the Powershell script here – https://github.com/Azure-Samples/powerbi-powershell/blob/master/manageRefresh.ps1

Fill in the $ClientID, $groupID and $datasetID in the script.

In order to avoid popup windows with authentication in the Azure Function I had to modify the script and hardcode the username and password in my script.

$userName = "username"

$password = "password"

$creds = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential" -ArgumentList $userName, $password

$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority

$authResult = $authContext.AcquireToken($resourceAppIdURI, $clientId, $creds)

The $userName, $password and $creds lines are added, and the $authResult line is modified to use the credentials as well.

Azure Function

Log on to the Azure portal – https://portal.azure.com/ and create a Function App

Wait for the deployment to finish – it takes about 1-2 minutes.

Choose to create your own custom function

Select language Powershell

And the HttpTrigger

In my example, I will keep authorization level to anonymous – but you can add more security if needed.

Now replace the script with the PowerShell script from the previous section.

Now you can save and click run to see if the script works

If the script succeeds – you can check whether the dataset is updated via the Power BI dataset refresh history

By clicking Get Function URL

We will get the POST Url that we will use later in our Flow.
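If you want to test the function outside Flow, you can POST to that same URL from PowerShell – a sketch, where the URL below is a placeholder for the one you copied via Get Function URL:

```powershell
# trigger the Azure Function (and thereby the dataset refresh) manually
$functionUrl = "https://<yourfunctionapp>.azurewebsites.net/api/<yourfunction>?code=<function-key>"
Invoke-RestMethod -Method Post -Uri $functionUrl
```

This is essentially what the HTTP action in Flow will do for us in the next section.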

Now – Microsoft Flow

Open your Microsoft Flow in your Office 365 portal and create a new blank flow

Now we can select between a lot of different triggers that can activate our data refresh in Power BI. It could be a certain tweet, a mail in an inbox, a file in OneDrive and many more – there are currently around 160 triggers to choose from.

In my example, I want to integrate with a flic button

And by connecting to my Flic account I can choose the button I want to use and which event on the button (it has click, double click and hold) the flow should be attached to.

To activate the refresh, I add a new step, and as the action I choose HTTP, where I can POST to the function URL.

After the HTTP action, I add a new step that sends an email to me – with information about the refresh, which button was pressed, and where it was located when pressed.

Then save the flow and we should be ready to update the dataset via the button

Trying the flow

So by clicking the button we can see the Run History

As you can see, the flow ran at 23:25.

And when checking the refresh history in Power BI we can see it has been refreshed –

And I get an e-mail as well

OBS – The refresh can “only” be done 8 times a day with a Power BI Pro license, but 48 times if you should be so lucky as to have access to a Premium edition.

Wrapping up

It has been great fun to create this, and it was surprisingly easy – with of course some challenges along the way.

But the toolset is amazing and combining Flow and Power BI opens a lot of possibilities and ideas to activate refreshes and do stuff with the Power BI REST API triggered by different events.

I will at least have an extra look at the API and see what can be done – link to documentation

Let me know if you find this interesting.