#PowerQuery – Add Year, Month and Day to your date table with Date.ToRecord – #PowerBI

When you build a date table in Power Query you might use the functions under Date to add year, month and day

And this will give you three steps in your Query

But we can do this a bit faster, and you will save a few clicks with your mouse

If you add a custom column using this formula

= Table.AddColumn(#"Changed Type", "Custom", each Date.ToRecord([Date]))

You will get a column containing a record

Then next step is to expand the record

And Voila – Year, month and day in 2 steps instead of 3.
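Put together – a minimal sketch of the two steps, assuming your query has a #"Changed Type" step and a [Date] column as above:

```powerquery
let
    // step 1 – add a record column holding Year, Month and Day
    AddedRecord = Table.AddColumn(#"Changed Type", "Custom", each Date.ToRecord([Date])),
    // step 2 – expand the record into three columns
    Expanded = Table.ExpandRecordColumn(AddedRecord, "Custom",
        {"Year", "Month", "Day"}, {"Year", "Month", "Day"})
in
    Expanded
```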

Update – 25/10-2021

If you want to specify the data type for each of the Year, Month and Day columns, you can modify the Table.AddColumn formula to

= Table.AddColumn(#"Changed Type", "Custom", each Date.ToRecord([Date])
, type [Year = number, Month = number, Day = number]
)

This will make the columns numbers

Happy Querying

How to connect to CDS from #PowerBI – Or where the h.. can I find the server URL

Over the last few months I have used CDS a few times in my solutions – and connected the data to Power BI.

But one of the things I always search for is the Server URL

Above is a screenshot of the Common Data Service connector

And even the beta connector requires me to specify the URL –

And even though the dialog says “Environment domain” – it is in fact the URL the connector wants – BUT without the https://

Note – If you want to test the Beta connector – remember to enable the TDS Endpoint under Environments – Settings – Features

So where can I find the address?

If you have access to the Admin center – Power Platform admin center (microsoft.com) – you can go into the environment and see the URL.

If you do not have access to it – then open the model-driven app and the URL is available in the address bar.

Hope this can help you.

Power On !

Tips to limit the number of steps in #PowerQuery – #PowerBI

When returning to some of my first queries built in Power Query, I always get a bit sad.

My old queries contain some unnecessary extra steps that make it harder to evaluate what is happening in the query, and in most cases I haven’t even bothered to rename the steps, which makes the queries very hard to read – for me and for others.

So, in order to improve, here are some tips that can make my queries – and yours – shorter and easier to read.

Always Rename your Steps

Always spend time renaming your steps and avoid spaces in the step names.

It makes the query easier to read and serves as documentation.

This is especially important because the step name does not necessarily describe what is happening.

In this example I used the Remove Other columns function, and the step name refers to that function

– but what is actually happening is that I am selecting specific columns – notice the formula bar shows Table.SelectColumns

So, rename the step.

Check if you can avoid a column rename step

Using the interface in the Query Editor makes it very easy to end up with a “Renamed Columns” step, but it can sometimes be avoided – for instance when expanding a record column – in this example the DimGeography column.

Double-clicking the returned headers to rename them will lead to 2 steps – but notice the last argument in the formula bar.

That is the list of column names that will be assigned, and by modifying it we can do the expansion and the renaming in one step.
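A sketch of the one-step version – the step name #"Merged Queries", the expanded fields and the new names are just examples:

```powerquery
// The last argument is the list of output column names – edit it here
// instead of adding a separate Renamed Columns step
= Table.ExpandRecordColumn(#"Merged Queries", "DimGeography",
    {"City", "EnglishCountryRegionName"},
    {"City", "Country"})
```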

Avoid empty filter steps, i.e. do not filter by (Select All)

When testing my queries, I sometimes filter the result to check the data

And then to clear it I sometimes use the (Select All) to remove the filter

This however leaves a filter step in your query where the Table.SelectRows function is just “each true”.

If you use the clear filter instead

The Filter step will be deleted

Limit the number of Rename Columns

Doing column renaming during the query steps can lead to several column renaming steps.

Try to do only 1 renaming step.

Limit the number of steps with Remove columns

Try to consolidate the column selection or deletion of columns in one step.

And remember the optional parameter in Table.SelectColumns and Table.RemoveColumns where you can specify what should happen if a column in the list is missing.
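For example – column and step names here are illustrative – MissingField.UseNull returns null for a missing column instead of raising an error, and MissingField.Ignore simply skips it:

```powerquery
// keeps the query running even if a listed column disappears from the source
= Table.SelectColumns(#"Previous Step",
    {"SalesOrderNumber", "OrderDateKey", "SalesAmount"},
    MissingField.UseNull)
```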

Don’t reorder your columns

You can change the column order in the Query editor but there is no need to do it

The columns will appear in alphabetical order in your data model anyway.

If you have a large table and can’t find a column, use Go to Column to locate it

A good bad example

Yes, this is one of my queries

This is the query after a clean-up – it does exactly the same but in fewer steps and with a proper explanation in the step names.

Do you have any tips?

This was a few tips to limit the number of steps in your query.

Let me know if you have any in the comments.

Stay queryious.

#PowerQuery – Dynamically reference a table via a Parameter

The other day I had a fun challenge from my co-worker – Søren Faurum – who wanted to change the reference to a column from one table to another table with the same column name.

Note – The solution is not supported in the Power BI Service.

In his case it was

let
    x = List.Distinct(tableName1[StoreKey])
in
    x

And he wanted tableName1 to be the value of a parameter.

Here is an example (data is from AdventureWorks) – in my model I have two tables – DimReseller and DimCustomer – both have a GeographyKey

A parameter called TableName should now be used in a query to retrieve either the unique list of values of the key from either DimReseller or DimCustomer.

If we just use

List.Distinct(TableName[GeographyKey])

We do get an expression error as the parameter is a text value and not a table that has fields.

Then I thought we could use the function Expression.Evaluate() – link to documentation

But no – apparently, we can’t reference a Table name like this.

#shared to the rescue 🆘

Then I thought why not use #shared – as I knew that #shared is a record listing all the functions, tables, parameters etc. that are available in the current pbix file – including all tables, loaded or not.

Then we can turn the record into a table and filter it based on a parameter
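As a sketch, the query so far looks like this:

```powerquery
let
    // #shared is a record of everything defined in the current file
    Source = #shared,
    // turn the record into a table with Name and Value columns
    #"Converted to Table" = Record.ToTable(Source),
    // keep only the row matching the TableName parameter
    Filtered = Table.SelectRows(#"Converted to Table", each [Name] = TableName)
in
    Filtered
```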

This will give us a table with one record.

Where the Value column will contain the table we want to be dynamic

Then by adding the following to the formula

  • {0} to get the first row/record
  • [Value] to retrieve the table stored in the record
  • And [GeographyKey] to get the key column

We can get all the Geography Keys in the table

And by putting all of this into the argument of the List.Distinct function

= List.Distinct(Table.SelectRows(#"Converted to Table", each [Name] = TableName){0}[Value][GeographyKey])

Which returns 510 rows.

And now I can change the parameter to DimCustomer

And see it returns 336 rows

However if we publish the model to the Power BI Service it won’t work if you want to schedule refresh as #shared is not supported in the service.

So we can only use it in Power BI Desktop.

Stay Querious

Converting xls files to xlsx file using #powerautomate and avoid the pitfalls in #powerquery using xls

As described in this post – https://www.ehansalytics.com/blog/2020/2/15/avoid-using-excel-xls-files-as-data-source
there are issues to be aware of when you use xls files instead of xlsx in Power Query. See also this thread
https://social.technet.microsoft.com/Forums/en-US/41f2c8ec-1f2c-4591-ac6a-54764b2a90a7/bug-in-excelworkbookwebcontents-powerquery?forum=powerquery
.

A Twitter thread started by Imke Feldmann (https://twitter.com/TheBIccountant) and continued by Ruth Pozuelo (go follow her excellent YouTube channel – link) encouraged me to write this post – as I claimed we can convert the xls files to xlsx using Power Automate.

So here is a guide on how to do it

Convert xls files to xlsx

In the scenario I will use a trigger when an e-mail is received and use a rest API provided by https://cloudconvert.com/.

Note – This is a paid service where you pay by the minute the conversion takes – priced from $0.01 to $0.02 per minute.

First we start by selecting to build an automated flow and select the trigger “When a new email arrives (V3)”

Set the advanced options to only trigger when Attachments is included and include the attachments in the following steps in our flow.

As the next step I use an Azure Blob storage to store the file in the e-mail. When selecting the output from the previous step – power automate will automatically create an Apply to each container in which we can refer to each attachment in the mail.

In the Create Blob Action I connect to a blob storage and load the attachment into a preload folder

Now add a step where we create an HTTP request within the Apply to each container

In order to use the cloud convert REST API we first need to create a Process and then use the process ID to create the conversion – documentation here

In the body property you specify your own APIKey and tell the process you want to do is a conversion from xls format to xlsx.
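As an illustration only – based on my reading of the cloudconvert v1 API, the process-creation body is roughly of this shape (check the linked documentation for the exact fields):

```json
{
  "apikey": "<YOURAPIKEY>",
  "inputformat": "xls",
  "outputformat": "xlsx"
}
```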

Next – add another HTTP request

We use a POST request again

And in the URI we use the result from the previous step as it returns a unique address to the process id from cloud convert

In the Body property we specify the input and output blob locations for the conversion

{
  "input": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "preload"
    }
  },
  "file": @{body('Create_blob')?['Name']},
  "timeout": 10,
  "output": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "converted"
    }
  },
  "outputformat": "xlsx"
}

Here is the final overview of the steps needed.

Now send an email to the inbox you have connected your trigger to run.

And in our preload folder we can see the files

And in the converted folder we have the converted xlsx files

Hope this can help you converting your xls files to xlsx.

This will also make it much easier if you want to set up an automated refresh.

Power On!

#PowerQuery – Using SQL scalar value functions in PowerQuery

Currently I am working with a project where we extract data from a SQL server – some of the business logic is built into scalar value functions (documentation).

Now the magic of Power Query enables us to reuse these functions within Power Query – and query folding is supported – more about this at the end of this post.

Let’s step through how to do this

I will use the famous AdventureWorksDW2014 database.

In my database I have added a function called IsoweekwithYear

This function takes a date as a parameter and returns an integer representing the year and the week. In this example it returns 202018, as 28 April 2020 is in ISO week 18 of 2020.

In the Power BI Desktop we can now connect to the SQL server

And in this example I pick the DimDate table from the database

And if we scroll down you will also see the functions we have in the database

Notice that the data preview pane will show the function and the parameters it can take.

Then I click Transform Data and the Power Query Editor will appear

By default it will invoke the function and in this case it will return null

Notice the Applied Steps in the Query Settings pane

If you remove the last step where the function is invoked

You will see it as a function that has a parameter called @DATE which is optional.

Now let’s use the function

In the DimDate query we can then call the function – as we need to call it with the column FullDateAlternateKey as the parameter, I select this column first and then click Invoke Custom Function in the Add Column section of the ribbon.

We can then choose the Function Query IsoweekWithYear and use the column name as the parameter value

This will add a new column at the end of my query
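The generated step looks roughly like this – the step and query names may differ in your file:

```powerquery
// invoke the scalar value function once per row, passing the date column
= Table.AddColumn(DimDate, "IsoweekwithYear",
    each IsoweekwithYear([FullDateAlternateKey]))
```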

Containing the ISOWeek – yuppieee 🙂

Is query folding supported?

It sure is

As we can see when we select “View Native Query” – the function call is included in the SQL statement.

This means that the server does the work instead of the M-engine.

What’s next

I must admit that I wasn’t aware of this possibility before – and I feel like it opens up a lot of scenarios to reuse already built and tested functions that your database developers have built.

Let me know if you find this useful as well by leaving a comment or giving the post a like

Happy querying !!!

#PowerQuery – Easily Copy all queries from a PBIX to Excel and vice versa

Today I discovered a way to easily copy all queries (including the groups) from a PBIX file to Excel without entering the power query editor in Excel.

In my case I needed to copy the data returned by a query, and I find it so much easier to get the data using Excel instead of generating a csv file from a visual in Power BI Desktop and importing that into Excel.

My PBIX contains some queries to extract weather data from different stations in Sweden.

These are grouped to create a logic structure of the queries.

Select all queries and copy them.

Now open a blank workbook and click Queries & Connections on the Data tab of the ribbon

And the Queries & Connections panel opens

Now right click in the panel

And select Paste – the queries, parameters and load settings for the queries are all included.

For the tables you want to load, you now need to specify whether to load the table to the data model or to a worksheet – the default is worksheet.

If you collapse all your groups in the panel and select all of them you can also copy them

And paste them into the Power Query editor in the Power BI Desktop

And after pasting

Hope this can help you too and stay Queryious

#PowerQuery – Avoid an extra step by not clicking OK

Just wanted to share a small tip in PowerQuery where an extra step is added to your queries even though you might not want it.

In this example I have imported FactInternetSales from the AdventureWorks database and have filtered it on OrderDateKey after a specific date.

Now I just want to check the different product keys and I click the dropdown next to the field name.

If you do not specify a filter but just click OK (the default button) you will get an extra step inserted in the Applied Steps

= Table.SelectRows(#"Filtered Rows", each true)

Which just means that all rows should be returned from the previous step.

If you press Cancel instead the step won’t be added.

So, keep your queries nice and clean and click Cancel instead of OK and stay querious 🙂

PS – and remember to rename the steps to keep them as explanatory as possible.

Quick #PowerQuery trick – Get duration days between two dates

Just a quick tip that you might not be aware of in the Power Query Editor.

If you select two columns in the view and, on the Add Column tab, click the Date button – you can select Subtract Days

This will give you the number of days between the dates in the selected columns
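The generated step looks roughly like this – the two date column names are examples only:

```powerquery
// number of days between the two selected date columns
= Table.AddColumn(Source, "Subtraction",
    each Duration.Days([ShipDate] - [OrderDate]), Int64.Type)
```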

Use the formula bar to rename the new column by modifying the step

 

Hope this can help you too.

#PowerQuery – Calculate the ISO date from year and a week number

Just wanted to share an M function that I just created to calculate the date from a year and an ISO week number.

The example data is the following

Now the function I created can be called with the Year and Week number as parameters to get the following result

The function has the following code and will return the date of the Monday of the week number.

(TheYear as number, TheWeek as number) as date =>
let
    // 28 December of the previous year is always in the last ISO week of that year
    offsetToISO = Date.AddDays(#date(TheYear, 1, 1), -4),
    // Day.Monday makes the result independent of the current culture's first day of week
    dayOfWeek = Date.DayOfWeek(offsetToISO, Day.Monday),
    // step back to the Monday of that week, then forward TheWeek weeks
    offset = -dayOfWeek + (TheWeek * 7),
    isoWeekDate = Date.AddDays(offsetToISO, offset)
in
    isoWeekDate

Hope this can help you too.

Here is a link to an example pbix file – link