#PowerBI – External Tool – Open Power BI Report Builder – part 1

As you may have noticed, I have blogged twice about the awesome new feature in Power BI Desktop where we can build our own external tool buttons.

Here are links to the previous posts.

  • Analyze in Excel (link)
  • Open in Tableau (link)

One perhaps forgotten member of the Power BI family is Power BI Report Builder – aka paginated reports – and even though it requires a Premium capacity or a Power BI Embedded A-SKU to publish/share reports, I thought it might be interesting to see if we could link the local pbix file to Report Builder.

I will write a part 2 where the external tool also supports desktop files that are connected to Azure Analysis Services or a Power BI dataset.

You can download the free Power BI Report Builder from here.

How to build the external tool

The tool consists of two files:

  • A PowerShell file – ConnectToPowerBIReportBuilder.ps1
    Must be stored in C:\temp
  • An external tool file – OpenInPowerBIReportBuilder.pbitool.json
    Must be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

OBS – in order to run a PowerShell script on your PC you may need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170
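For example, this sets the policy for the current user only (RemoteSigned is a common choice – adjust to your organisation's requirements):

        # Allow locally created scripts to run for the current user only
        Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser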

You can download the files from here – https://github.com/donsvensen/openinpowerbireportbuilder

The PowerShell

Report Builder uses the rdl file format, which is an XML file.

So the PowerShell script simply creates an rdl file and makes the connection dynamic, using the server and database name parameters that the external tool provides as arguments when the button is clicked in Power BI Desktop.

This is a small part of the XML that creates the rdl file, and it could be expanded to create datasets and insert visualizations.

        $tdsXml = "<?xml version=""1.0""?>
<Report MustUnderstand=""df"" xmlns=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition"" xmlns:rd=""http://schemas.microsoft.com/SQLServer/reporting/reportdesigner"" xmlns:df=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition/defaultfontfamily"">
  <rd:ReportUnitType>Mm</rd:ReportUnitType>
  <rd:ReportID>cd1b0079-99d9-4e99-9f5e-6fda21f9f09d</rd:ReportID>
  <df:DefaultFontFamily>Segoe UI</df:DefaultFontFamily>
  <AutoRefresh>0</AutoRefresh>
  <DataSources>
    <DataSource Name=""PowerBI"">
      <rd:SecurityType>Integrated</rd:SecurityType>
      <ConnectionProperties>
        <DataProvider>OLEDB-MD</DataProvider>
        <ConnectString>Data Source=$port;Initial Catalog=$database</ConnectString>
        <IntegratedSecurity>true</IntegratedSecurity>
      </ConnectionProperties>
      <rd:DataSourceID>8c230b7c-799c-4007-af55-6bc26eca797c</rd:DataSourceID>
    </DataSource>
  </DataSources>
  <ReportSections>
    <ReportSection>
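The here-string continues in the full script on GitHub. Once the XML is complete, the remaining steps just write the file and open it – a minimal sketch, assuming the C:\temp location and the desktoptobuilder.rdl file name mentioned below:

        # Write the generated XML to an rdl file
        $rdlFile = "C:\temp\desktoptobuilder.rdl"
        $tdsXml | Out-File $rdlFile -Force

        # Invoke-Item opens the file with its default application,
        # which is Power BI Report Builder when it is installed
        Invoke-Item $rdlFile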

What happens

After you have placed the OpenInPowerBIReportBuilder.pbitool.json file in the correct folder and restarted Power BI Desktop, you will see a new button in the ribbon.

When you have a pbix file open containing a datamodel and click the button, a desktoptobuilder.rdl file will be generated in C:\temp

And Power BI Report Builder will open – in the XML I have also inserted text in the report header showing the localhost port.

In the Data Sources you will find a data source called PowerBI, which is connected to your running instance of the tabular model.

And then we can create a dataset that uses the data source

And use the designer to create the DAX query

And use this in a chart
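For illustration, the designer produces queries of this shape (the table, column and measure names here are hypothetical):

EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Calendar Year],
    "Sales Amount", [Sales Amount]
)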

OBS – remember to save the file under another name – and to update the connection information after you close and reopen the desktop file, as the localhost port changes every time Power BI Desktop starts.

In my next post I will demonstrate how we can do the same if the pbix is connected to Azure Analysis Services or a Power BI Dataset.

Stay tuned 🙂

PS – This one is for #paginatedreportbear

Paginated Report Bear (@PaginatedBear) | Twitter

Connect your #PowerBI desktop model to #Tableau Desktop via External Tools in PowerBI

I recently created an external tool for Power BI Desktop that connects your Power BI desktop model to Excel (https://eriksvensen.wordpress.com/2020/07/27/powerbi-external-tool-to-connect-excel-to-the-current-pbix-file/) and then I thought – could we also have a need for an external tool that could open the desktop model in Tableau Desktop?

So, I downloaded a trial version of Tableau Desktop to see what was possible.

And sure enough, Tableau can connect to Microsoft Analysis Services and therefore also to the localhost port that Power BI Desktop uses.

We can also save a data source as a local data source file in Tableau

Which gives us a file with a tds extension (Tableau Data Source)

When opening the file in Notepad we can see the connection string and some extra data about metadata-records.

It turns out that the tds file does not need all the metadata-record information – so I cleaned the tds file down to the content shown in the PowerShell script below.

Opening this file from the explorer will open a new Tableau Desktop file with the connection to the specified model/database/server.

The external tool

Knowing this I could create an external tool the same way as my Excel connector.

First, create a PowerShell script.

OBS – in order to run a PowerShell script on your PC you may need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170

The PowerShell script

Function ET-TableauDesktopODCConnection
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]
        [string]
        $port,
        [Parameter(Mandatory = $false)]
        [string]
        $database,
        [Parameter(Mandatory = $false)]
        [string]
        $path
    )

    # Build the tds (Tableau Data Source) XML with the server and database injected
    $tdsXml = "<?xml version='1.0' encoding='utf-8' ?>
<datasource formatted-name='LocalPowerBIDesktopFile' inline='true' source-platform='win' version='18.1' xmlns:user='http://www.tableausoftware.com/xml/user'>
  <document-format-change-manifest>
    <_.fcp.SchemaViewerObjectModel.true...SchemaViewerObjectModel />
  </document-format-change-manifest>
  <connection authentication='sspi' class='msolap' convert-to-extract-prompted='no' dbname='$database' filename='' server='$port' tablename='Model'>
</connection>
</datasource>"

    # The location of the tds file to be opened
    $tdsFile = "$path\tableauconnector.tds"

    $tdsXml | Out-File $tdsFile -Force

    # Open the tds file with its default application - Tableau Desktop
    Invoke-Item $tdsFile
}

ET-TableauDesktopODCConnection -port $args[0] -database $args[1] -path "C:\temp"

The script simply creates a tableauconnector.tds file and stores it in C:\temp – the XML content in the file is made dynamic via $args[0] and $args[1], which are passed when the external tool is called from Power BI Desktop.

Save the script in C:\temp and call it ConnectToTableau.ps1.

The OpenInTableau.pbitool.json file

Next step was to create a pbitool.json file and store it in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

{
  "version": "1.0",
  "name": "Open In Tableau",
  "description": "Open connection to desktop model in Tableau ",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToTableau.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJAAAACQCAYAAADnRuK4AAAABmJLR0QA/wD/AP+gvaeTAAADRklEQVR4nO3dv27TUBiH4WPEitSRS+iCurO0GzdRiS5sXRhAXZhYEAxd2LoUiZtgaxb2iqWXwFiJCzgsqPRPrMb5Jc1x/TxbqgSi5O2xE3+uSwGAUeo2/QRac3R8cla6bvfqB7XOPr19s7e5Z9S2J5t+AoybgIgIiIiAiAiIiICICIiIgIgIiIiAiEziUMbR8cnZovetXbfTlbJ1dbuUy67W80UfP7XDHk83/QQexPVjW/fd9e7trSGPnxqbMCICItLEJqyeljrv593BivbRap0tfNdwH2hVDj58mfuanH5819R+axMBrduQHdvb80BdredT2zEewiaMiICICIiIgIgIiIiAiAiIiICICIiIgIhM4lDGEA5bDGMFIiIgIgIiIiAiAiISTbf1TRK2ZmWTjQvomyRszaomG61ARAREREBEBEREQESaOMdo7eeFjdBYzguzAhEREBHjHP/8fv/i3i8An3/+1dTmowVWICICIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiSx8Lc3Xjcdk/nJ2VWv+/X103+/51dy/9d61ARAREpIlxjilPHvZpbfKwjxWIiICICIiIgIgIiEgTn8KGWmQAfiz/79gH9a1ARG7UP5arG29qBVqHZAXaP5ydDbj7Tqn16v0qXXdZSln4/eo77HFzE+bqxuNy/djW8MdulVLi98smjIiAiNzchI3w6saT1nULv18l3AfqfQrLPnCT80B2ooczD0STRvlF4jp+a/11juVYgYgIiIiAiAiIiICINPEp7Of29txPQC8vLib7qefZq29zX5M/P1439ZpYgYgIiMjSmzCnMY/LKg5bzGMFIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiAiLSxDhHCwzML8cKRERARKJlu2+SsDUPOdnYN0nYmlVNNlqBiAiIiICICIiIgIg08eWZ88Lucl4YkyAgIgIiIiAiAiJinOOWdf0108fKCkREQEQERERARAREREBEBEREQEQERERARCZxKGPw1Y1v3R7y+Kkd9mgioLVPHjZwdeOhWps87GMTRkRARJrYhK1dA1c3fqxGsZ19SOaBhrEJIyIgIgIiIiAiAiIiICICIiIgIgIiIiAAAAAYjb8VJdQbiRXyOAAAAABJRU5ErkJggg=="
}

Test it

Now restart your Power BI desktop and the external tool should be visible in the ribbon

Then open a pbix file with a model and hit the button.

A PowerShell window will briefly be visible, and then Tableau opens the tds file – and we now have a new Tableau workbook with a connection to the active Power BI Desktop data model.

And we can start building visualizations that are not yet supported in Power BI –

How can you try it

You can download the files needed from my github repository – link

Feedback

Let me know what you think and if possible share some of the viz that you make.

#PowerBI – External tool to connect Excel to the current PBIX file

In the July update of Power BI Desktop we can now add external tools to the ribbon.

If you install the latest versions of Tabular Editor, DAX Studio and the ALM Toolkit, these will be added as tools in the ribbon.

But you can also build and add your own tools.

David Eldersveld (link) has written an excellent series of blogposts about using Python as an external tool – link to part one – and this inspired me to give it a go as well.

The official documentation can be found here.

Short description of what an external tool really is

An external tool will point to an exe file, and you can supply the call to the exe file with arguments, including references to %server% and %database%.

The information about the external tool needs to be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools, in a file named “<tool name>.pbitool.json”.

This will give me these buttons in my Power BI Desktop

My idea for an external tool

When I build models, I use Excel pivot tables to test and validate my measures, and typically I would use DAX Studio to find the localhost port to set up a connection to the currently open PBIX file.

So, I thought it would be nice just to click a button in Power BI Desktop to open a new Excel workbook with a connection to the current model. That would save me a couple of clicks.

If I could create an ODC file when clicking on the button in Power BI and then open the ODC file (Excel is the default application to open these) my idea would work.

I have previously used Rui Romano’s (link) excellent PowerBI PowerShell tools – link to GitHub and a link to his blogpost about Analyze in Excel – so why not use PowerShell to do this.

Here is a guide to build your own version

Step 1 – Create a PowerShell script

I created a powershell file called ConnectToExcel.ps1 and saved the file in local folder C:\Temp – you can save this where you want it stored. (Link to sample files last in this post)

The script is a modified version of Rui’s function Export-PBIDesktopODCConnection – thank you so much for these.

Function ET-PBIDesktopODCConnection
{
    # Modified from https://github.com/DevScope/powerbi-powershell-modules/blob/master/Modules/PowerBIPS.Tools/PowerBIPS.Tools.psm1
    # - the function Export-PBIDesktopODCConnection

    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]
        [string]
        $port,
        [Parameter(Mandatory = $false)]
        [string]
        $path
    )

    # Build the ODC (Office Data Connection) document with the local server port injected
    $odcXml = "<html xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""><head><meta http-equiv=Content-Type content=""text/x-ms-odc; charset=utf-8""><meta name=ProgId content=ODC.Cube><meta name=SourceType content=OLEDB><meta name=Catalog content=164af183-2454-4f45-964a-c200f51bcd59><meta name=Table content=Model><title>PBIDesktop Model</title><xml id=docprops><o:DocumentProperties xmlns:o=""urn:schemas-microsoft-com:office:office"" xmlns=""http://www.w3.org/TR/REC-html40""> <o:Name>PBIDesktop Model</o:Name> </o:DocumentProperties></xml><xml id=msodc><odc:OfficeDataConnection xmlns:odc=""urn:schemas-microsoft-com:office:odc"" xmlns=""http://www.w3.org/TR/REC-html40""> <odc:Connection odc:Type=""OLEDB"">
<odc:ConnectionString>Provider=MSOLAP;Integrated Security=ClaimsToken;Data Source=$port;MDX Compatibility= 1; MDX Missing Member Mode= Error; Safety Options= 2; Update Isolation Level= 2; Locale Identifier= 1033</odc:ConnectionString>
<odc:CommandType>Cube</odc:CommandType> <odc:CommandText>Model</odc:CommandText> </odc:Connection> </odc:OfficeDataConnection></xml></head></html>"

    # The location of the odc file to be opened
    $odcFile = "$path\excelconnector.odc"

    $odcXml | Out-File $odcFile -Force

    # Create an Excel.Application object using the COM interface
    $objExcel = New-Object -ComObject Excel.Application

    # Make Excel visible
    $objExcel.Visible = $true

    # Open the odc file, which creates a workbook connected to the model
    $WorkBook = $objExcel.Workbooks.Open($odcFile)
}

# Echo the server argument passed from Power BI Desktop
write $args[0]

ET-PBIDesktopODCConnection -port $args[0] -path "C:\Temp"

The script contains a function that creates an ODC file, where the data source and the path of the ODC file are determined by two arguments to the function – port and path. The script also opens Excel and then opens the file.

The script contains a reference to

$args[0]

This will in the end be the value localhost:xxxxx that is provided when we click the external tool button in Power BI Desktop – and it will make more sense after step 2.

Notice that I have hardcoded the path where the ODC file will be stored to C:\Temp.

Step 2 – Create a .pbitool.json file

The pbitool.json file is relatively simple.

Name is the text that will appear in the ribbon.

Description is the tooltip that appears in Power BI Desktop according to the documentation – but it doesn’t work at the moment.

Path is the reference to the exe file you want to activate – and only the exe file.

Arguments is the arguments that you want to pass to the exe file – and here we have the two built-in references %server% and %database%. Arguments are optional, so we could just start Excel or any other program if we wanted.

IconData is the icon that you want to appear in the ribbon – I found an icon via Google and then used https://www.base64-image.de/ to convert it to a base64 string.
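Putting the fields together, my OpenInExcel.pbitool.json looks something like this (a sketch – the iconData value is shortened to a placeholder here, so insert your own base64 string):

{
  "version": "1.0",
  "name": "Open In Excel",
  "description": "Open connection to desktop model in Excel",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToExcel.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,<your icon here>"
}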

In this tool we use the powershell.exe file, which can be called with arguments where we specify the script file we want executed, and we pass the extra arguments server and database as well – in my script I only use the %server% reference, which gives me the server name and port number of the local instance.

It means that when the button is clicked in PowerBI Desktop it will execute

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe C:\temp\ConnectToExcel.ps1 localhost:xxxxx databasename

The localhost:xxxxx is the first argument provided, and the value can then be referred to by using $args[0].

The file must then be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools and in my case I called it OpenInExcel.pbitool.json.

Depending on your privileges on your computer you might be warned that you need administrative rights to save files in that location.

And if you save the script file elsewhere you need to modify the pbitool.json file.

Step 3 – Test it

Now we are ready to restart Power BI Desktop – and the button does appear.

Next – open a pbix file

This will open a Windows PowerShell window and write the server information

And in the background it opens Excel and the ODC file – which results in a PivotTable connected to the local instance.

With a connection to the localhost:52510

The files

You can download the files needed from here – https://github.com/donsvensen/erikspbiexcelconnector

Feedback

I think the use of PowerShell opens a lot of interesting scenarios for external tools, and I look forward to seeing what other external tools appear in the community.

Please let me know what you think and if you find it useful.

Incremental refresh in #PowerBI with data from BigQuery

Back in February 2020 incremental refresh in Power BI became GA (link) and is supported in Power BI Pro as well.

I have not had a chance to use this a lot yet, but in a recent project for a client we were working with a large dataset – 500+ million rows of historical search data stored in BigQuery (yes, Google BigQuery 🙂 ).

One of the prerequisites for incremental refresh to work is that your Power Query can be query folded – i.e. sent back to the source system as a native query, so the filtering can be folded into a query understood by the source.

You will find the documentation here.

But let me take you through the steps if you want to do this with a query against Google BigQuery.

The data

I am using the flight dataset that you are guided through creating when you create a new project in BigQuery.

It contains 7.95 GB of data and 70,588,485 rows with 10 years of flight data from January 2002 until December 2012. (To work with incremental refresh I will add 8 years to all dates, as shown below.)
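One way to do that shift in Power Query is a transform step like this (a sketch – I assume the column is called date, matching the column used for the refresh filters later):

// Add 8 years to every value in the date column - even this step folds back to BigQuery
#"Shifted Dates" = Table.TransformColumns(Source, {{"date", each Date.AddYears(_, 8), type date}})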

Setting up incremental refresh for this dataset

Open a new pbix file and select Get data in the ribbon – filter the connectors by “big” to find the Google BigQuery connector.

You will then be prompted to sign in to BigQuery and then presented with the projects you have access to.

When you select the tables or views you want to import – hit “Transform data”

As the BigQuery connector also supports DirectQuery, you will be prompted whether you want to use Import or DirectQuery mode – in this case we want to import the data, so I select Import.

This will open the Power Query editor and display all the columns and the data for the first 1000 records

Native Query greyed out – why?

If you have worked with query folding before, you probably know that you can view the native query constructed via the shortcut menu “View Native Query”.

But when we use the BigQuery connector it is greyed out – so let’s check whether it does fold or not.

You can see the constructed query in the Query History pane in the Google Console for BigQuery

And the query folds nicely – at least until we do Power Query transformations.

Let’s set up date ranges for incremental refresh

In order to do an incremental refresh we need to create two parameters in our PBIX – called RangeStart and RangeEnd.

Remember the parameters must be of type Date/Time.
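In M, the generated parameters look roughly like this (a sketch of RangeStart – RangeEnd is identical apart from the value):

// RangeStart must be of type datetime for incremental refresh to work
RangeStart = #datetime(2011, 1, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = type datetime, IsParameterQueryRequired = true]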

In our flights query – we then add a filter to the date column.

But we can’t really select the parameters

This is because the date column is of datatype Date and not Date/Time.

So, let us change the date column’s datatype to Date/Time.

The rows in the table should be filtered like this

REMEMBER that only one of the filters must have the “equal to” – otherwise you risk that two partitions include the same period.
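The resulting step in M looks like this (a sketch, assuming the column name date and the step name from the datatype change above):

// RangeStart is inclusive and RangeEnd exclusive, so each period lands in exactly one partition
#"Filtered Rows" = Table.SelectRows(#"Changed Type", each [date] >= RangeStart and [date] < RangeEnd)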

The “View Native Query” option is still greyed out.

But when we check the query history in BigQuery we can see that it is nicely folded back to the engine

Let us build the simple datamodel and set up the incremental refresh

Clicking Close & Apply in the editor starts the import of the filtered data.

After importing the data, I create a simple measure that calculates the number of records in the flights table.
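Something like this (the measure name is hypothetical; flights is the imported table):

Number of Records = COUNTROWS ( flights )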

And we can start visualizing the data even more using the other fields in the dataset

If we publish the dataset now, the only data we will ever get in the model is for January 2003 (2011 after the 8-year shift) – but we want more data, so we have to set up incremental refresh so that when the dataset is refreshed in the service it will contain all the data for the period we want in the model.

!!!! – As the refresh uses the date/time value of the current date, I have modified the data by adding 8 years to all the records so the data appears current. (And even that transformation is folded back.)

I then set up incremental refresh for the flights table by selecting Incremental refresh in the shortcut menu.

This opens the following dialog.

Don’t worry about the warning at the top – this is due to “native query” being greyed out, as the connector apparently doesn’t return the folding information to Power Query – but as you have seen previously, we do know that the query is folded.

I set the model to store 8 years of data based on the column “date”, and to only refresh rows within the last month.

When you click Apply All, you cannot really see that anything happened – but if you open the dialog again you will see it contains the settings you have set.

Publishing the model to the service

When the model is published, the dataset will – before the first refresh – only contain the data we imported in the pbix.

We need to set the credentials for the data source

And set a time of day where you want to refresh the dataset.

We can activate the first refresh by clicking the refresh button next to the dataset name

Or you can republish the pbix file to the service – this will also activate a refresh. This refresh didn’t happen with the first publish as you need to specify the credentials to BigQuery.

Successful refresh

After the refresh has finished, we can see the total number of rows has increased to 56,007,003.

And the period included is eight years plus the current year, and as we selected 1 month of refresh, the dataset will include all the dates in May 2020. In your case you might only have data until Today(), and this will then be included in your dataset.

The first refresh took almost 20 minutes.

But when we hit refresh again –

The refresh only took 1 min and 47 seconds as it only had to update the data for May 2020.

No premium

All of this is done without a Premium capacity – 👍

With a Premium capacity you get an XMLA endpoint with which you can do even more – that is perhaps worthy of another blogpost, so leave a comment if you want me to write a post about it.

Power On !

Update – check out this follow-up post from Daniel Lennartson – https://www.linkedin.com/pulse/partitioned-tables-bigquery-power-bi-incremental-daniel-lennartsson/?trackingId=maXDCJvRSgaan6uAZHlBqg%3D%3D – describing how to use partitioned tables to limit the cost.

DAX Time intelligence and the 29th of February – #PowerBI

Yesterday I visited a client and was asked: how do the time intelligence functions handle the fact that February has 29 days in 2020?

Well – in fact there were a few surprises depending on what you select from your date table.

Let’s look at some examples – I will use the following Internet Sales Amount from the years 2011-2013 from the Adventure Works database, where February 2012 has 29 days.

As you can see, we have the year 2012 where February has 29 days.

SAMEPERIODLASTYEAR()

In order to calculate Internet Sales Amount LY – I use the following

Internet Sales Amount LY = CALCULATE([Internet Sales Amount];SAMEPERIODLASTYEAR(DimDate[Date]))

Which works fine

But notice the behavior if we put dates or day numbers on the rows

SURPRISE – Internet Sales Amount LY will show the value for the 28th of February 2011 instead of a blank value as you perhaps would expect

If you select year 2013 we will see this

The 29th of February 2012 will “disappear” – but the total for February will include the number.

DATEADD() – last year

If we use the function DATEADD instead – it will work exactly the same way.

IAS LY = CALCULATE([Internet Sales Amount];DATEADD(DimDate[Date];-1;YEAR))

DATEADD() – same day last year

If you want to compare with the same Saturday last year (the 29th of February 2020 is a Saturday, and the same Saturday the year before is the 2nd of March 2019), we can do this by using the same DATEADD function but with different parameters

IAS LY same weekday = CALCULATE([Internet Sales Amount];DATEADD(DimDate[Date];-364;DAY))

This will compare with the same day 52 weeks ago (52 * 7 = 364), thereby giving us the value from the 29th of February 2012 on the 27th of February 2013.

DATESMTD()

Now what about the function DATESMTD()

ISA MTD = CALCULATE([Internet Sales Amount];DATESMTD(DimDate[Date]))

ISA MTD LY = CALCULATE([Internet Sales Amount LY];DATESMTD(DimDate[Date]))

These functions will calculate the running total for the month for the given day number

Notice that the ISA MTD works fine in 2012 for the 29th, and the LY measure will show the same result for the 28th and the 29th in 2012 – and in 2013 it will actually, for the 28th, show the sum of both the 28th and the 29th.

Conclusion

You might find that some users find it difficult to understand how the calculations work when they look at dates instead of month totals, especially in the case where they will get the same value for LY on both the 28th and the 29th in 2012/2020.

If you compare across years on calendar dates, I find that the result SAMEPERIODLASTYEAR() returns makes better sense than leaving it empty/blank – but what do you or your users think? Let me know in the comments.

Hope you find this little walkthrough useful.

And remember to ALWAYS use a date table in your model if you do time intelligence calculations in DAX.

#PowerBI – A quicker way to set Summarize to none for all your fields in your date dimension (or others)

When designing your model you will in most cases need a date dimension, and one of the things you should always do is set the summarization of the number columns in your table to none instead of the default Sum – to avoid the columns being summed when you insert, for instance, Calendar Year in your visuals.

So set the summarization to Don’t summarize to avoid this.

But you have to do this for all the number columns in your date table –

but if you use the Model view instead –

We can multi-select columns and set it for all of them at once

In the model view we have to select None to avoid summarization of number fields – and voilà, all fields in my date table are set to Don’t summarize

Hope this can help you as well.

#PowerQuery – Control the expand columns so it includes new columns

Imagine a scenario where a column in your Power Query contains a table with a set of columns, and you know that at some point more columns will be added.

In the case above I know that we at some point will add more columns in the merged ProductAttributes table.

How can we make this dynamic using Power Query?

When we click the icon for expanding the table, we might just select this and move on

But notice the formula created in the formula bar.

It says

= Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes", {"Brand"}, {"Brand"})

This means that even though we might add new columns to the ProductAttributes table, it will still only be Brand that is expanded – and only that column.

The bolded arguments are two lists that contain the column names to expand and the new names of the columns – the last argument is optional, so we can actually skip it if we want the original names – https://docs.microsoft.com/en-us/powerquery-m/table-expandtablecolumn

Now by changing the formula to this

= Table.ExpandTableColumn(
    #"Merged Queries",
    "ProductAttributes",
    List.RemoveItems(Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0}), {"ProductKey"})
)

We can make the table expand dynamically when new columns are added to the ProductAttributes table

We get the new column included as well

The magic formula does this

Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0})

Will return a list of column names from the step before the expansion (note I use the step name and the column name) – and I use the {0} to extract the column names from the first row only, otherwise the formula will fail.

But as we cannot have the same column name twice (i.e. ProductKey needs to go away), we need to use the List.RemoveItems function

List.RemoveItems(Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0}), {"ProductKey"})

Thereby removing the ProductKey item from the list

And this means that when we get more columns in the ProductAttributes table, they will automatically be included in the expanded columns

Hope this can help make your Power Query queries even more dynamic.

Here is an example file – Link

Power Query On !

Adding keyboard shortcuts to your favourite buttons in #powerbi

If you find yourself clicking the same button in the Power BI Desktop you might find this little tip useful.

When you right-click a button in the ribbon, you can add it to the Quick Access Toolbar.

This will add the button at the top (unless you have positioned it below the ribbon)

The button in the quick access toolbar can now be activated using a shortcut key combination.

When you press ALT the buttons will be labelled with numbers

So, pressing ALT + 7 will in my case open the Query Editor –

You can use the same technique in the Power Query Editor – so pressing ALT + 2 in my Query Editor will open the Advanced Editor

Hope this can help you too, so you can do things even faster in Power BI with fewer clicks.

#powerbi Report to browse and watch sessions from #mbas 2019 using the #powerapp visual

Unfortunately I wasn’t able to participate in the Microsoft Business Application Summit this year – but luckily we can watch all the sessions via https://community.powerbi.com/t5/MBAS-Gallery/bd-p/MBAS_Gallery

But that would mean I had to leave Power BI Desktop in order to search and watch the videos – and the website also made it hard to see the distribution of sessions between categories.

So, I created this –

And to watch the video from within Power BI, I created a drill through page where I used the PowerApp visual to be able to show the video.

As none of the Microsoft standard visuals can play videos from within a report – I created a Power App to show the video based on the selected video link.

If you want to embed this huge resource of learning material in your own tenant you can download the elements from here

The Desktop file – link

The Power App – link

If you are interested in learning how I scraped the web page for all the relevant data – check out these functions, which extract data from pages using the CSS query capabilities in the Power Query function Html.Table

Highly inspired by this blog post by Chris Webb – https://blog.crossjoin.co.uk/2018/08/30/power-bi-extract-urls-web-page/
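A minimal sketch of the approach (the CSS selectors here are hypothetical – the real page structure determines them):

let
    // Download the rendered gallery page (Web.BrowserContents executes the page's scripts)
    Source = Web.BrowserContents("https://community.powerbi.com/t5/MBAS-Gallery/bd-p/MBAS_Gallery"),
    // Pick out session titles and video links with CSS selectors - the selectors are assumptions
    Sessions = Html.Table(Source, {
        {"Title", "h3 a"},
        {"Link", "h3 a", each [Attributes][href]}
    })
in
    Sessions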

Move your pbix datamodel to Azure Analysis services – #powerbi

After the Azure Analysis Services web designer was discontinued per March 1 2019 – link – there is no official tool to move a PBIX datamodel to Azure Analysis Services. But by using a few different tools we do have ways of doing it anyway.

Step 1 – DAX Studio

Open your PBIX file in Power BI Desktop, then open DAX Studio (link to download) and connect DAX Studio to the PBI model

In the status bar you will see the local port of the Analysis model running on your machine

Step 2 – SQL Server Management Studio

Connect to Analysis Services using the server address from Step 1

This will give you a connection to the model

Now right-click the database and choose to script the database to a new Query Editor window (or to the clipboard)

Step 3 – Connect to Azure Analysis Services

Use the SQL Server Management Studio to connect to Azure Analysis Services

Select to run a New XMLA Query on the server

And paste the script created in Step 2 into the new query window – you can specify a model name in the highlighted area
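The script generated in Step 2 is a TMSL create command – stripped to its skeleton it has this shape (the database name MyMovedModel is hypothetical, and your script will contain the full model definition with tables, partitions and measures):

{
  "create": {
    "database": {
      "name": "MyMovedModel",
      "compatibilityLevel": 1400,
      "model": {
        "culture": "en-US"
      }
    }
  }
}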

Run the Query – and after a few seconds you should get this result – that the query has completed successfully.

And after a refresh of the server object you should see the newly scripted data model

Step 4 – Finish the move

Now depending on the data sources in your model you need to setup the necessary connections and gateway.

And use your favourite Analysis Services tool to modify the model – for instance Tabular Editor ( link )

You will have to go through all your data sources and Tables to make sure the connections and M-Scripts are functioning.

Tip

Before you script your PBIX data model – turn off the Time intelligence option (Auto date/time) in your current file – otherwise you could get a lot of extra date/time tables in your model