#PowerBI – External Tool – Open Power BI Report Builder – part 1

As you may have noticed, I have blogged twice about the awesome new feature in Power BI Desktop where we can build our own external tool buttons.

Here is a link to the previous posts.

  • Analyze in Excel (link)
  • Open in Tableau (link)

One perhaps forgotten member of the Power BI family is Power BI Report Builder – aka paginated reports. Even though it requires a Premium capacity or a Power BI Embedded A-SKU to publish/share reports, I thought it might be interesting to see if we could link the local pbix file to Report Builder.

I will write a part 2 where the external tool will also support desktop files that are connected to Azure Analysis Services or a Power BI dataset.

You can download the free Power BI Report Builder from here.

How to build the external tool

The tool has two files

  • A PowerShell file – ConnectToPowerBIReportBuilder.ps1
    Must be stored in C:\temp
  • An external tool file – OpenInPowerBIReportBuilder.pbitool.json
    Must be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

OBS – in order to run a PowerShell script on your pc you need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170
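
One way to do that is to run something like this in PowerShell (RemoteSigned is a common choice – pick the policy that matches your own security requirements):

        # Allow locally created scripts to run for the current user;
        # downloaded scripts must be signed
        Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser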

You can download the files from here – https://github.com/donsvensen/openinpowerbireportbuilder

The PowerShell script

Report Builder uses the rdl file format, which is an XML file.

So the PowerShell script simply creates an rdl file and makes the connection dynamic using the server and database name that the external tool provides as arguments when the button is clicked in Power BI Desktop.

This is a small part of the xml that creates the rdl file and it could be expanded to create datasets and insert visualizations.

        $tdsXml = "<?xml version=""1.0""?>
<Report MustUnderstand=""df"" xmlns=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition"" xmlns:rd=""http://schemas.microsoft.com/SQLServer/reporting/reportdesigner"" xmlns:df=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition/defaultfontfamily"">
  <rd:ReportUnitType>Mm</rd:ReportUnitType>
  <rd:ReportID>cd1b0079-99d9-4e99-9f5e-6fda21f9f09d</rd:ReportID>
  <df:DefaultFontFamily>Segoe UI</df:DefaultFontFamily>
  <AutoRefresh>0</AutoRefresh>
  <DataSources>
    <DataSource Name=""PowerBI"">
      <rd:SecurityType>Integrated</rd:SecurityType>
      <ConnectionProperties>
        <DataProvider>OLEDB-MD</DataProvider>
        <ConnectString>Data Source=$port;Initial Catalog=$database</ConnectString>
        <IntegratedSecurity>true</IntegratedSecurity>
      </ConnectionProperties>
      <rd:DataSourceID>8c230b7c-799c-4007-af55-6bc26eca797c</rd:DataSourceID>
    </DataSource>
  </DataSources>
  <ReportSections>
    <ReportSection>
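
The remainder of the script simply writes the xml to a file and opens it – roughly like this (a minimal sketch; the full script is in the GitHub repository):

        #write the rdl file and open it with the default application (Power BI Report Builder)
        $rdlFile = "C:\temp\desktoptobuilder.rdl"
        $tdsXml | Out-File $rdlFile -Force
        Invoke-Item $rdlFile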

What happens

After you have placed the OpenInPowerBIReportBuilder.pbitool.json in the correct folder and restarted Power BI Desktop, you will see a new button in the ribbon.

When you have a pbix file open that contains a data model and click the button, a desktoptobuilder.rdl file will be generated in C:\temp.

And Power BI Report Builder will open – in the xml I have also inserted text in the report header showing the localhost port.

In the Data Sources you will find a data source called PowerBI, which is connected to your running instance of the tabular model.

And then we can create a dataset that uses the data source

And use the designer to create the DAX query

And use this in a chart

OBS – remember to save the file with another name – and to update the connection information after you close and reopen the desktop file.

In my next post I will demonstrate how we can do the same if the pbix is connected to Azure Analysis Services or a Power BI Dataset.

Stay tuned 🙂

PS – This one is for #paginatedreportbear

Paginated Report Bear (@PaginatedBear) | Twitter

#PowerQuery – Dynamically reference a table via a Parameter

The other day I had a fun challenge from my co-worker – Søren Faurum – who wanted to change the reference to a column name from one table to another table with the same column name.

OBS – The solution is not supported in the Power BI Service.

In his case it was

let
    x = List.Distinct(TableName1[StoreKey])
in
    x

And he wanted TableName1 to be the value of a parameter.

Here is an example (data is from AdventureWorks) – in my model I have two tables – DimReseller and DimCustomer – both have a GeographyKey

A parameter called TableName should now be used in a query to retrieve either the unique list of values of the key from either DimReseller or DimCustomer.

If we just use

List.Distinct(TableName[GeographyKey])

We get an expression error, as the parameter is a text value and not a table that has fields.

Then I thought we could use the function Expression.Evaluate() – link to documentation
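
My attempt looked roughly like this (a sketch):

// this throws an error – the evaluation environment
// doesn't know the table names used in the expression
= Expression.Evaluate("List.Distinct(" & TableName & "[GeographyKey])")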

But no – apparently, we can’t reference a Table name like this.

#shared to the rescue 🆘

Then I thought why not use #shared – as I knew that #shared returns a record with all the functions, tables, parameters etc. that are available in the current pbix file – including all tables, loaded or not.

Then we can turn the record into a table and filter it based on a parameter
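
In M the step looks roughly like this (Record.ToTable turns the #shared record into a Name/Value table – the "Converted to Table" step – which we then filter on the parameter):

= Table.SelectRows(Record.ToTable(#shared), each [Name] = TableName)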

This will give us a table with one record.

Where the Value column will contain the table we want to be dynamic

Then by adding the following to the formula

  • {0} to get the first row/record
  • [Value] to retrieve the Table in the record
  • And [GeographyKey] to reference the column

We can get all the Geography Keys in the table

And by putting all of this into the argument of the List.Distinct function

= List.Distinct(Table.SelectRows(#"Converted to Table", each [Name] = TableName){0}[Value][GeographyKey])

Which returns 510 rows.

And now I can change the parameter to DimCustomer

And see it returns 336 rows

However, if we publish the model to the Power BI Service, it won't work with scheduled refresh, as #shared is not supported in the service.

So we can only use it in Power BI Desktop.

Stay Querious

Connect your #PowerBI desktop model to #Tableau Desktop via External Tools in PowerBI

I recently created an external tool for Power BI Desktop that connects your Power BI desktop model to Excel (https://eriksvensen.wordpress.com/2020/07/27/powerbi-external-tool-to-connect-excel-to-the-current-pbix-file/) – and then I thought: could there also be a need for an external tool that opens the desktop model in Tableau Desktop?

So, I downloaded a trial version of Tableau Desktop to see what is possible.

And sure enough, Tableau can connect to Microsoft Analysis Services and therefore also to the localhost port that Power BI Desktop uses.

We can also save a data source as a local data source file in Tableau

Which gives us a file with a tds extension (Tableau Data Source)

When opening the file in Notepad we can see the connection string and some extra data about metadata-records.

It turns out that the tds file does not need all the metadata-record information – so I cleaned the tds file down to the xml you can see in the PowerShell script below.

Opening this file from the explorer will open a new Tableau Desktop file with the connection to the specified model/database/server.

The external tool

Knowing this I could create an external tool the same way as my Excel connector.

First, create a PowerShell script.

OBS – in order to run a PowerShell script on your pc you need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170

The PowerShell script

Function ET-TableauDesktopODCConnection
{  

	[CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]        
		[string]
        $port,
        [Parameter(Mandatory = $false)]        
		[string]
        $database,
        [Parameter(Mandatory = $false)]        
		[string]
        $path	
    )
    
        $tdsXml = "<?xml version='1.0' encoding='utf-8' ?>
<datasource formatted-name='LocalPowerBIDesktopFile' inline='true' source-platform='win' version='18.1' xmlns:user='http://www.tableausoftware.com/xml/user'>
  <document-format-change-manifest>
    <_.fcp.SchemaViewerObjectModel.true...SchemaViewerObjectModel />
  </document-format-change-manifest>
  <connection authentication='sspi' class='msolap' convert-to-extract-prompted='no' dbname='$database' filename='' server='$port' tablename='Model'>
</connection>
</datasource>"   
                
        #the location of the tds file to be opened
        $tdsFile = "$path\tableauconnector.tds"

        $tdsXml | Out-File $tdsFile -Force	

        Invoke-Item $tdsFile

}

ET-TableauDesktopODCConnection -port $args[0] -database $args[1] -path "C:\temp"

The script simply creates a tableauconnector.tds file and stores it in C:\temp – the server and database in the xml content are filled in dynamically from $args[0] and $args[1] when the external tool is called from Power BI Desktop.

Save the script in C:\temp and call it ConnectToTableau.ps1.

The OpenInTableau.pbitool.json file

Next step was to create a pbitool.json file and store it in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

{
  "version": "1.0",
  "name": "Open In Tableau",
  "description": "Open connection to desktop model in Tableau ",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToTableau.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJAAAACQCAYAAADnRuK4AAAABmJLR0QA/wD/AP+gvaeTAAADRklEQVR4nO3dv27TUBiH4WPEitSRS+iCurO0GzdRiS5sXRhAXZhYEAxd2LoUiZtgaxb2iqWXwFiJCzgsqPRPrMb5Jc1x/TxbqgSi5O2xE3+uSwGAUeo2/QRac3R8cla6bvfqB7XOPr19s7e5Z9S2J5t+AoybgIgIiIiAiAiIiICICIiIgIgIiIiAiEziUMbR8cnZovetXbfTlbJ1dbuUy67W80UfP7XDHk83/QQexPVjW/fd9e7trSGPnxqbMCICItLEJqyeljrv593BivbRap0tfNdwH2hVDj58mfuanH5819R+axMBrduQHdvb80BdredT2zEewiaMiICICIiIgIgIiIiAiAiIiICICIiIgIhM4lDGEA5bDGMFIiIgIgIiIiAiAiISTbf1TRK2ZmWTjQvomyRszaomG61ARAREREBEBEREQESaOMdo7eeFjdBYzguzAhEREBHjHP/8fv/i3i8An3/+1dTmowVWICICIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiSx8Lc3Xjcdk/nJ2VWv+/X103+/51dy/9d61ARAREpIlxjilPHvZpbfKwjxWIiICICIiIgIgIiEgTn8KGWmQAfiz/79gH9a1ARG7UP5arG29qBVqHZAXaP5ydDbj7Tqn16v0qXXdZSln4/eo77HFzE+bqxuNy/djW8MdulVLi98smjIiAiNzchI3w6saT1nULv18l3AfqfQrLPnCT80B2ooczD0STRvlF4jp+a/11juVYgYgIiIiAiAiIiICINPEp7Of29txPQC8vLib7qefZq29zX5M/P1439ZpYgYgIiMjSmzCnMY/LKg5bzGMFIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiAiLSxDhHCwzML8cKRERARKJlu2+SsDUPOdnYN0nYmlVNNlqBiAiIiICICIiIgIg08eWZ88Lucl4YkyAgIgIiIiAiAiJinOOWdf0108fKCkREQEQERERARAREREBEBEREQEQERERARCZxKGPw1Y1v3R7y+Kkd9mgioLVPHjZwdeOhWps87GMTRkRARJrYhK1dA1c3fqxGsZ19SOaBhrEJIyIgIgIiIiAiAiIiICICIiIgIgIiIiAAAAAYjb8VJdQbiRXyOAAAAABJRU5ErkJggg=="
}

Test it

Now restart your Power BI desktop and the external tool should be visible in the ribbon

Then open a pbix file with a model and hit the button.

A PowerShell window will briefly be visible, and then Tableau opens the tds file – and now we have a new Tableau workbook with a connection to the active Power BI Desktop data model.

And we can start to build visualizations that are not yet supported in Power BI.

How you can try it

You can download the files needed from my github repository – link

Feedback

Let me know what you think and if possible share some of the viz that you make.

#PowerBI – External tool to connect Excel to the current PBIX file

In the July update of Power BI Desktop we can now add external tools to the ribbon.

If you install the latest versions of Tabular Editor, DAX Studio and the ALM Toolkit these will be added as tools in the ribbon.

But you can also build and add your own tools.

David Eldersveld (link) has written an excellent series of blogposts about using Python as an external tool – link to part one – and this inspired me to give it a go as well.

The official documentation can be found here.

Short description of what an external tool really is

An external tool will point to an exe file, and you can supply the call to the exe file with arguments, including references to %server% and %database%.

The information about the external tool needs to be stored in

C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

And the file must be named "<tool name>.pbitool.json".

This will give me these buttons in my Power BI Desktop

My idea for an external tool

When I build models – I use Excel pivot tables to test and validate my measures and typically I would use DAX Studio to find the localhost port to setup a connection to the currently open PBIX file.

So, I thought it would be nice just to click a button in Power BI Desktop to open a new Excel workbook with a connection to the current model. That would save me a couple of clicks.

If I could create an ODC file when clicking on the button in Power BI and then open the ODC file (Excel is the default application to open these) my idea would work.

I have previously used Rui Romano's (link) excellent PowerBI PowerShell tools – link to GitHub and a link to his blogpost about Analyze in Excel – so why not use PowerShell to do this.

Here is a guide to build your own version

Step 1 – Create a PowerShell script

I created a powershell file called ConnectToExcel.ps1 and saved the file in local folder C:\Temp – you can save this where you want it stored. (Link to sample files last in this post)

The script is a modified version of Rui's function Export-PBIDesktopODCConnection – thank you so much for these.

Function ET-PBIDesktopODCConnection
{
    # modified from https://github.com/DevScope/powerbi-powershell-modules/blob/master/Modules/PowerBIPS.Tools/PowerBIPS.Tools.psm1
    # - the function Export-PBIDesktopODCConnection

    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]
        [string]
        $port,
        [Parameter(Mandatory = $false)]
        [string]
        $path
    )

    $odcXml = "<html xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""><head><meta http-equiv=Content-Type content=""text/x-ms-odc; charset=utf-8""><meta name=ProgId content=ODC.Cube><meta name=SourceType content=OLEDB><meta name=Catalog content=164af183-2454-4f45-964a-c200f51bcd59><meta name=Table content=Model><title>PBIDesktop Model</title><xml id=docprops><o:DocumentProperties xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""> <o:Name>PBIDesktop Model</o:Name> </o:DocumentProperties></xml><xml id=msodc><odc:OfficeDataConnection xmlns:odc=""urn:schemas-microsoft-com:office:odc"" xmlns=""http://www.w3.org/TR/REC-html40""> <odc:Connection odc:Type=""OLEDB"">
<odc:ConnectionString>Provider=MSOLAP;Integrated Security=ClaimsToken;Data Source=$port;MDX Compatibility= 1; MDX Missing Member Mode= Error; Safety Options= 2; Update Isolation Level= 2; Locale Identifier= 1033</odc:ConnectionString>
<odc:CommandType>Cube</odc:CommandType> <odc:CommandText>Model</odc:CommandText> </odc:Connection> </odc:OfficeDataConnection></xml></head></html>"

    #the location of the odc file to be opened
    $odcFile = "$path\excelconnector.odc"

    $odcXml | Out-File $odcFile -Force

    # Create an Excel.Application object via the COM interface
    $objExcel = New-Object -ComObject Excel.Application

    # Make Excel visible
    $objExcel.Visible = $true

    # Open the odc file - Excel creates a workbook with the live connection
    $WorkBook = $objExcel.Workbooks.Open($odcFile)
}

write $args[0]

ET-PBIDesktopODCConnection -port $args[0] -path "C:\Temp"

The script contains a function that creates an ODC file, where the data source and the path of the ODC file are determined by the two arguments to the function – port and path. The script also opens Excel, which then opens the file.

The script contains a reference to

$args[0]

This will in the end be the value localhost:xxxxx that is provided when we click the external tool button in Power BI Desktop – and it will make more sense after step 2.

Notice that I have hardcoded the path where the ODC file will be stored to C:\Temp.

Step 2 – Create a .pbitool.json file

The pbitool.json file is relatively simple.

Name is the text that will appear in the ribbon.

Description is the tooltip that appears in Power BI Desktop according to the documentation – but it doesn’t work at the moment.

Path is the reference to the exe file you want to activate – and only the exe file.

Arguments are the arguments that you want to pass to the exe file – and here we have the two built-in references %server% and %database%. Arguments are optional, so we could just start Excel or any other program if we wanted.

IconData is the icon that you want to appear in the ribbon – I found an icon via Google and then used https://www.base64-image.de/ to convert it to the string.

In this tool we use the powershell.exe file, which is called with arguments specifying the script file we want to execute, and we pass the extra arguments server and database as well – in my script I only use the %server% reference, which gives me the server name and port number of the local instance.
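
Put together, my OpenInExcel.pbitool.json looks roughly like this (a sketch – the name and description texts are my own choice, and the iconData value is truncated here; insert your own base64 string):

{
  "version": "1.0",
  "name": "Open In Excel",
  "description": "Open connection to desktop model in Excel",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToExcel.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,..."
}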

It means that when the button is clicked in PowerBI Desktop it will execute

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe C:\temp\ConnectToExcel.ps1 localhost:xxxxx databasename

The localhost:xxxxx is the first argument provided, and the value can then be referred to by using $args[0].

The file must then be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools and in my case I called it OpenInExcel.pbitool.json.

Depending on your privileges on your computer you might be warned that you need administrative rights to save files in that location.

And if you save the script file elsewhere you need to modify the pbitool.json file.

Step 3 – Test it

Now we are ready to restart Power BI Desktop – and the button does appear.

Next – open a pbix file

This will open a Windows PowerShell window and write the server information

And in the background it opens Excel and the ODC file – which results in a pivot table connected to the local instance.

With a connection to the localhost:52510

The files

You can download the files needed from here – https://github.com/donsvensen/erikspbiexcelconnector

Feedback

I think the use of PowerShell opens a lot of interesting scenarios for external tools, and I look forward to seeing what other external tools appear in the community.

Please let me know what you think and if you find it useful.

Converting xls files to xlsx file using #powerautomate and avoid the pitfalls in #powerquery using xls

As described in this post – https://www.ehansalytics.com/blog/2020/2/15/avoid-using-excel-xls-files-as-data-source – there are issues to be aware of when you use xls files instead of xlsx in Power Query. See also this thread: https://social.technet.microsoft.com/Forums/en-US/41f2c8ec-1f2c-4591-ac6a-54764b2a90a7/bug-in-excelworkbookwebcontents-powerquery?forum=powerquery

Answering the Twitter thread started by Imke Feldmann (https://twitter.com/TheBIccountant) and continued by Ruth Pozuelo (go follow her excellent YouTube channel – link) encouraged me to write this post – as I claimed we can convert the xls files to xlsx using Power Automate.

So here is a guide on how to do it

Convert xls files to xlsx

In the scenario I will use a trigger when an e-mail is received and use a rest API provided by https://cloudconvert.com/.

OBS – This is a paid service where you pay by the minute the conversion takes – price from $0.02 down to $0.01 per minute.

First we start by selecting to build an automated flow and select the trigger “When a new email arrives (V3)”

Set the advanced options to only trigger when Attachments is included and include the attachments in the following steps in our flow.

As the next step I use an Azure Blob storage to store the files from the e-mail. When selecting the output from the previous step, Power Automate will automatically create an Apply to each container in which we can refer to each attachment in the mail.

In the Create Blob Action I connect to a blob storage and load the attachment into a preload folder

Now add a step where we create an HTTP request within the Apply to each container

In order to use the cloud convert REST API we need first to create a Process and use the process ID to create the conversion – documentation here

In the body property you specify your own API key and tell the process that what you want is a conversion from xls format to xlsx.
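
In my case the body looked roughly like this (a sketch based on the v1 API I used – check the cloudconvert documentation for the exact fields):

{
  "apikey": "<YOURAPIKEY>",
  "inputformat": "xls",
  "outputformat": "xlsx"
}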

Next – add another HTTP request

We use a POST request again

And in the URI we use the result from the previous step as it returns a unique address to the process id from cloud convert

In the Body property we specify where the input file is located and where the converted file should be stored:

{
  "input": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "preload"
    }
  },
  "file": @{body('Create_blob')?['Name']},
  "timeout": 10,
  "output": {
    "azureblob": {
      "storageaccount": "<NAMEOFBLOBACCOUNT>",
      "storageaccesskey": "<AccountKey>",
      "container": "converted"
    }
  },
  "outputformat": "xlsx"
}

Here is the final overview of the steps needed.

Now send an email to the inbox your trigger is connected to.

And in our preload folder we can see the files

And in the converted folder we have the converted xlsx files

Hope this can help you converting your xls files to xlsx.

This will also make it much easier if you want to be able to setup

Power On!

Incremental refresh in #PowerBI with data from BigQuery

Back in February 2020 the Incremental refresh in Power BI became GA (link) and supported in Power BI Pro as well.

I have not had a chance to use this a lot yet, but in a recent project for a client we were working with a large dataset – 500+ million rows of historical search data stored in BigQuery (yes, Google BigQuery 🙂).

One of the prerequisites for incremental refresh to work is that your Power Query can be query folded – i.e. sent back to the source system as a native query, so everything can be folded into a query understood by the source.

You will find the documentation here.

But let me take you through the steps if you want to do this with a query against Google BigQuery.

The data

I am using the flight dataset that you are guided through creating when you create a new project in BigQuery.

It contains 7,95 GB of data and 70.588.485 rows with 10 years of flight data from January 2002 until December 2012. (To work with incremental refresh I will add 8 years to all dates.)

Setting up incremental refresh for this dataset

Open a new pbix file and select Get data in the ribbon – filter the connectors by big to find the Google BigQuery connector.

You will then be prompted to sign in to BigQuery and then presented with the projects you have access to.

When you select the tables or views you want to import – hit “Transform data”

As the BigQuery connector also supports DirectQuery, you will be prompted whether you want to use Import or DirectQuery mode – in this case we want to import the data, so I select Import.

This will open the Power Query editor and display all the columns and the data for the first 1000 records

Native Query greyed out – why?

If you have worked with query folding before you probably know that you can view the native query constructed via the shortcut menu item "View Native Query".

But when we use the BigQuery connector it is greyed out – but let’s check whether it does fold or not.

You can see the query constructed in Query History pane in Google Console for BigQuery

And the query folds nicely – at least until we do Power Query transformations.

Let’s setup DateRanges for incremental refresh

In order to do an incremental refresh we need to create two parameters in our pbix – called RangeStart and RangeEnd.

Remember the parameters must be Date/Time Type.

In our flights query – we then add a filter to the date column.

But we can’t really select the parameters

This is because the date column is of the data type Date and not Date/Time.

So, let us change the date column's data type to Date/Time.

The rows in the table should be filtered like this

REMEMBER that only one of the filters must have the "equal to" – otherwise you risk that two partitions include the same period.
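
In M the filter step ends up looking roughly like this (#"Changed Type" being the previous step in my query – note the >= on RangeStart and the < on RangeEnd, so the partition boundaries don't overlap):

= Table.SelectRows(#"Changed Type", each [date] >= RangeStart and [date] < RangeEnd)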

The "View Native Query" option is still greyed out.

But when we check the query history in BigQuery we can see that it is nicely folded back to the engine

Let us build the simple data model and set up the incremental refresh

Clicking Close & Apply in the editor starts the import of the filtered data.

After importing the data, I create a simple measure that calculates the number of records in the flights table.
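
Something like this (the measure name is my own choice; flights is the imported table):

# Records = COUNTROWS ( flights )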

And we can start visualizing the data even more using the other fields in the dataset

If we publish the dataset now, the only data we will ever get in the model is for January 2003 (2011 after the 8 years are added) – but we want more data, so we have to set up incremental refresh so that when the dataset is refreshed in the service it will contain all the data for the period we want in the model.

!!!! – As the refresh uses the date/time value of the current date, I have modified the data by adding 8 years to all the records so the data reaches the current date. (And even that transformation is folded back.)

I then select Incremental refresh for the flights table by selecting Incremental refresh in the shortcut menu.

This opens the following dialog.

Don't worry about the warning at the top – this is due to "native query" being greyed out, as the connector apparently doesn't return that information to Power Query – but as you have seen previously, we do know that the query is folded.

I set the model to store 8 years of data based on the column "date", and to only refresh the rows within the last 1 month.

When you click Apply All you cannot really see that anything happened – but if you open the dialog again you will see it contains the settings you have set.

Publishing the model to the service

When the model is published, the dataset will – before it is refreshed the first time – only contain the data we have imported in the pbix.

We need to set the credentials in the dataset settings.

And set a time of day where you want to refresh the dataset.

We can activate the first refresh by clicking the refresh button next to the dataset name

Or you can republish the pbix file to the service – this will also activate a refresh. This refresh didn’t happen with the first publish as you need to specify the credentials to BigQuery.

Successful refresh

After the refresh has finished, we can see the total number of rows has increased to 56.007.003.

And the period included is eight years plus the current year, and as we selected 1 month of refresh the dataset will include all the dates in May 2020. In your case you might only have data until Today(), and that will then be included in your dataset.

The first refresh took almost 20 minutes.

But when we hit refresh again –

The refresh only took 1 min and 47 seconds as it only had to update the data for May 2020.

No premium

All of this is done without a Premium capacity – 👍

With a Premium capacity you get an XMLA endpoint with which you can do even more – this is perhaps worthy of another blogpost – so leave a comment if you want me to write a post about this.

Power On !

Update – check out this follow-up post from Daniel Lennartsson – https://www.linkedin.com/pulse/partitioned-tables-bigquery-power-bi-incremental-daniel-lennartsson/?trackingId=maXDCJvRSgaan6uAZHlBqg%3D%3D – describing how to use partitioned tables to limit the cost.

#PowerQuery – Using SQL scalar value functions in PowerQuery

Currently I am working with a project where we extract data from a SQL server – some of the business logic is built into scalar value functions (documentation).

Now the magic of PowerQuery enables us to reuse these functions within PowerQuery and Query Folding is supported – more about this at the end of this post.

Let’s step through how to do this

I will use the famous AdventureWorksDW2014 database.

In my database I have added a function called IsoweekwithYear

This function takes a date as a parameter and returns an integer representing the year and the week. In this example it returns 202018, as 28 April 2020 is in ISO week 18 of 2020.
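
I haven't shown the function definition, but a scalar value function like this could look roughly as follows (a sketch of my own, not the actual implementation – note that a robust version should use the ISO year, which can differ from the calendar year in the first/last days of a year):

-- example: 2020-04-28 -> 2020 * 100 + 18 = 202018
CREATE FUNCTION dbo.IsoweekwithYear (@DATE date)
RETURNS int
AS
BEGIN
    RETURN YEAR(@DATE) * 100 + DATEPART(ISO_WEEK, @DATE)
END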

In the Power BI Desktop we can now connect to the SQL server

And in this example I pick the DimDate table from the database

And if we scroll down you will also see the functions we have in the database.

Notice that the data preview pane will show the function and the parameters it can take.

Then I click Transform Data and the Power Query Editor will appear

By default it will invoke the function, and in this case it will return null.

Notice the Applied Steps in the Query Settings pane

If you remove the last step where the function is invoked

You will see it as a function that has a parameter called @DATE which is optional.

Now let’s use the function

In the DimDate query we can then call the function – as we need to call the function using the column FullDateAlternateKey as the parameter, I select this column first and then click Invoke Custom Function via the Add Column section in the ribbon.

We can then choose the Function Query IsoweekWithYear and use the column name as the parameter value

This will add a new column at the end of my query

Containing the ISOWeek – yuppieee 🙂
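
The generated step looks roughly like this (the name of the previous step depends on your query):

= Table.AddColumn(dbo_DimDate, "IsoweekwithYear", each IsoweekwithYear([FullDateAlternateKey]))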

Is Queryfolding supported?

It sure is

As we can see when we select "View Native Query" – the function call is included in the SQL statement.

This means that the server does the work instead of the M-engine.

What’s next

I must admit that I wasn't aware of this possibility before – and I feel like it opens up a lot of scenarios to reuse functions that your database developers have already built and tested.

Let me know if you find this useful as well by leaving a comment or giving the post a like.

Happy querying !!!

#PowerQuery – Easily Copy all queries from a PBIX to Excel and vice versa

Today I discovered a way to easily copy all queries (including the groups) from a PBIX file to Excel without entering the Power Query editor in Excel.

In my case I needed to copy the data returned by a query, and I find it so much easier to get the data using Excel instead of generating a csv file from a visual in Power BI Desktop and importing that into Excel.

My PBIX contains some queries to extract weather data from different stations in Sweden.

These are grouped to create a logical structure of the queries.

Select all queries and copy them.

Now open a blank workbook and click Queries & Connections in the Data tab in the ribbon

And the Queries & Connections panel opens

Now right click in the panel

And select Paste – and the queries, parameters and load settings for the queries are also included.

For the tables you want to load you now need to specify whether you want to load the table to the data model or to a worksheet – the default will be worksheet.

If you collapse all your groups in the panel and select all of them, you can also copy them

And paste them into the Power Query editor in Power BI Desktop

And after the paste they appear with the same group structure.

Hope this can help you too and stay Queryious

#PowerQuery – Avoid an extra step by not clicking OK

Just wanted to share a small tip in PowerQuery where an extra step is added to your queries even though you might not want it.

In this example I have imported the FactInternetSales table from the AdventureWorks database and have filtered it on OrderDateKey after a specific date.

Now I just want to check the different product keys and I click the dropdown next to the field name.

If you do not specify a filter but just click OK (the default button) you will get an extra step inserted in the Applied Steps

= Table.SelectRows(#"Filtered Rows", each true)

Which just means that all rows should be returned from the previous step.

If you press Cancel instead the step won’t be added.

So, keep your queries nice and clean and click Cancel instead of OK and stay querious 🙂

PS – and remember to rename the steps to keep them as explanatory as possible.