#PowerBI – Change the data source in your composite model with DirectQuery to AS/Power BI Dataset

I have been playing around with the awesome new (preview) feature in the December Power BI Desktop release where we can use DirectQuery for Power BI datasets and Azure Analysis Services (link to blog post).

In my case I combined data from a Power BI dataset, Azure Analysis Services, and a local Excel sheet. The DirectQuery sources were in a test environment.

I then wanted to try this on the actual production datasets and needed to change the data sources – I was a bit lost on how to do that but luckily found a way that I want to share with you.

Change the source

First you click on Data source settings under Transform data

This will open the dialog for Data source settings and show you the list of Data sources in the current file.

Now you can either right-click the data source you want to change

Or click the button “Change Source…”

Depending on your data source, different dialogs will appear

This one for my Azure Analysis Services Connection

And this one for Power BI Dataset

And this one for the Local Excel workbook

Hope this can help you too.

Happy new year to you all.

You must know about this shortcut key in #PowerBI Desktop

Working with the field list on a large model in Power BI Desktop can quickly leave you with a lot of expanded tables that you then collapse one by one.

Don’t do that

Even though all that clicking might improve your chances of beating your kids in Fortnite – it probably won't – so instead do one of the following

If you want to use your mouse

Click the show/hide pane button in the header of the Fields panel

This will collapse all expanded tables in the field list at once – plus if you have used the search field – it will clear that as well.

But if you want to do it using the keyboard, use

ALT + SHIFT + 1

This will collapse all the expanded tables as well.

Here is a link to the documentation about shortcut keys in Power BI Desktop – run through them – there might be some that can save you a click or two

Keyboard shortcuts in Power BI Desktop – Power BI | Microsoft Docs

Use hidden measures and members from #PowerBI dataset in an Excel Pivot table

When you connect to a Power BI dataset from Power BI Desktop, you might have noticed that you can see and use hidden measures and columns in the dataset.

But the hidden fields cannot be seen if you browse the dataset in Excel.

But that does not mean that you cannot use the fields in Excel – and here is how you can do it.

Using VBA

You can use VBA by creating a macro

The code will add the field AddressLine1 from the DimReseller dimension as a row field if the active cell contains a PivotTable.

Sub AddField()
    ' Grab the PivotTable under the active cell
    Dim pv As PivotTable
    Set pv = ActiveCell.PivotTable
    ' Add the hidden AddressLine1 field from DimReseller as a row field
    pv.CubeFields("[DimReseller].[AddressLine1]").Orientation = xlRowField
End Sub

If you want to add a measure/value to the PivotTable, you need to change the Orientation property to xlDataFields instead.

This means that we now have added two hidden fields from the dataset

Add hidden measures using OLAP Tools

You can also add hidden measures using the OLAP Tools and MDX Calculated Measure

Simply create a new calculated measure by referencing the hidden measure in the MDX

This will add a calculated measure to the measure group you selected

And you can add that to your PivotTable

Referencing hidden items using CUBE functions

Notice that you can also reference the hidden measures using CUBE functions

Simply specify the name of the measure as the member expression – in this case "[Measures].[Sales Profit]"

You can also refer to members from hidden fields using the CUBEMEMBER function

Hope this can help you too.

Power On!

Spot the difference between Power BI Desktop and Power BI Desktop (Store Version) #PowerBI

On my computer I have 2 versions of Power BI Desktop installed – one from the Microsoft Store, which is updated automatically, and the downloaded version from the Download Center – and typically I have last month's edition as my downloaded version.

But in my taskbar it's impossible to tell the difference between the two.

Well, we can solve that by changing the icon for the downloaded version – it's not possible for the Store version.

If you right-click the icon in the taskbar and then right-click Power BI Desktop

You can select the properties for this app.

Now click the Change Icon

This will show you the current icon and now you can change this by clicking Browse – in my case I will select the icon for the PBIDocument

And click Open – the icon will now be set to this

And when clicking OK

We will see the icon has changed for the Shortcut.

Notice that it will not change immediately

But after a restart it will appear

Hope this can make your choice of Power BI Desktop versions easier for you as well.

How to connect to CDS from #PowerBI – Or where the h.. can I find the server URL

The last few months I have used CDS a few times in my solutions – and connected the data to Power BI.

But one of the things I always search for is the Server URL

Above is the screenshot using the Common Data Service connector

And even the beta connector requires me to specify a URL –

And even though the dialog says “Environment domain” – it is in fact the URL the connector wants – BUT without the https://

OBS – If you want to test the Beta connector – remember to enable TDS Endpoint under the Environments – Settings – Features

So where can I find the address?

If you have access to the Admin center – Power Platform admin center (microsoft.com) – you can go into the environment and see the URL.

If you do not have access to it – then open the model-driven app and the URL is available in the address bar.
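If you prefer to start from a blank query, the same connection can also be made directly in M – here is a minimal sketch using the beta (TDS endpoint) connector; the environment domain yourorg.crm4.dynamics.com is just a hypothetical placeholder:

let
    // Environment domain without the https:// prefix (hypothetical placeholder)
    Source = CommonDataService.Database("yourorg.crm4.dynamics.com")
in
    Source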

Hope this can help you.

Power On !

Tips to limit the number of steps in #PowerQuery – #PowerBI

When returning to some of my first queries built in Power Query, I always get a bit sad.

My old queries contain some unnecessary extra steps that make it harder to evaluate what is happening in the query, and in most cases I haven't even bothered to rename the steps, which makes the queries very hard to read for me and for others.

So, in order to improve, here are some tips that can make my queries – and yours – shorter and easier to read.

Always Rename your Steps

Always spend time to rename your steps and avoid spaces in the step name.

It makes the query easier to read and serves as documentation.

Especially important because the step name does not necessarily describe what is happening.

In this example I used the Remove Other columns function, and the step name refers to that function

– but what is actually happening is that I am selecting specific columns – notice Table.SelectColumns in the formula bar

So, rename the step.
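In the Advanced Editor that could look something like this – a minimal sketch with hypothetical sample data, where the default step name generated by the UI is replaced with one that describes the intent:

let
    Source = #table({"ResellerKey", "ResellerName", "Phone"}, {{1, "Bike World", "555-0100"}}),
    // The UI default would be #"Removed Other Columns" – renamed to say what actually happens
    SelectedResellerColumns = Table.SelectColumns(Source, {"ResellerKey", "ResellerName"})
in
    SelectedResellerColumns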

Check if you can avoid a column rename step

Using the interface in the Query Editor makes it very easy to end up with a "Renamed Columns" step, but it can sometimes be avoided – for instance when expanding a record column – in this example the DimGeography column.

Expanding the column and then double-clicking the returned headers to rename them will lead to 2 steps – but notice the last argument in the formula bar.

That is the list of column names that will be assigned, and by modifying it we can do the expansion and renaming in one step.
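A minimal sketch of that single combined step, with hypothetical sample data – the last argument of Table.ExpandRecordColumn sets the new column names, so no separate rename step is needed:

let
    Source = #table({"ResellerKey", "DimGeography"}, {{1, [City = "Copenhagen", EnglishCountryRegionName = "Denmark"]}}),
    // Expand and rename in one step – the last list holds the new column names
    ExpandedGeography = Table.ExpandRecordColumn(Source, "DimGeography", {"City", "EnglishCountryRegionName"}, {"City", "Country"})
in
    ExpandedGeography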

Avoid empty filter steps – i.e. do not clear a filter by selecting (Select All)

When testing my queries, I sometimes filter the result to check the data

And then, to clear it, I sometimes use (Select All) to remove the filter

This however leaves a filter step in your query where the Table.SelectRows condition is just "each true".
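In the Advanced Editor such a leftover step looks like this – a sketch with hypothetical sample data; the condition keeps every row, so the step does nothing:

let
    Source = #table({"SalesAmount"}, {{100}, {200}}),
    // Left behind after clearing a filter with (Select All) – filters nothing and can be deleted
    #"Filtered Rows" = Table.SelectRows(Source, each true)
in
    #"Filtered Rows"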

If you use the clear filter instead

The Filter step will be deleted

Limit the number of Rename Columns

Doing column renaming during the query steps can lead to several column renaming steps.

Try to do only 1 renaming step.
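Table.RenameColumns takes a list of {old, new} pairs, so several renames can be consolidated into a single step – a minimal sketch with hypothetical column names:

let
    Source = #table({"EnglishCountryRegionName", "SalesAmount"}, {{"Denmark", 100}}),
    // One consolidated rename step with a list of {old, new} pairs
    RenamedColumns = Table.RenameColumns(Source, {{"EnglishCountryRegionName", "Country"}, {"SalesAmount", "Sales"}})
in
    RenamedColumns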

Limit the number of steps with Remove columns

Try to consolidate the column selection or deletion of columns in one step.

And remember the optional parameter in Table.SelectColumns and Table.RemoveColumns where you can specify what should happen if a column in the list is missing.
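That optional third argument can make the step robust when the source changes – a sketch with hypothetical columns, where a missing column is simply ignored instead of raising an error:

let
    Source = #table({"ResellerKey", "Phone"}, {{1, "555-0100"}}),
    // AddressLine1 does not exist in the source – MissingField.Ignore prevents an error
    SelectedColumns = Table.SelectColumns(Source, {"ResellerKey", "AddressLine1"}, MissingField.Ignore)
in
    SelectedColumns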

Don’t reorder your columns

You can change the column order in the Query Editor, but there is no need to do it

The columns will appear in alphabetical order in your data model anyway.

If you have a large table and can't find the column, use Go to Column to locate it
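If you spot a step like the one below in a query, it can simply be deleted – a sketch with hypothetical columns; the reordering is purely cosmetic since the fields list in the data model is sorted alphabetically anyway:

let
    Source = #table({"SalesAmount", "Country"}, {{100, "Denmark"}}),
    // Cosmetic only – this step can be deleted without changing the result in the model
    #"Reordered Columns" = Table.ReorderColumns(Source, {"Country", "SalesAmount"})
in
    #"Reordered Columns"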

A good bad example

Yes, this is one of my queries

This is the query after a clean-up – it does exactly the same but in fewer steps and with proper explanations in the step names.

Do you have any tips?

These were a few tips to limit the number of steps in your query.

Let me know if you have any in the comments.

Stay queryious.

Setup data driven report subscriptions for #PowerBI paginated reports with #PowerAutomate

This weekend I had the pleasure of presenting at SQL Saturday Copenhagen #963 with a session titled "Power BI Paginated Reports – How to and why".

I was asked if it was possible to create data-driven report subscriptions for paginated reports, and I said "Not yet" – it turns out that is not exactly true.

With the new action “Export to File for Paginated Reports (preview)” – in Power Automate – link

It is possible !!!

And yes, for now we need the reports to be stored in a workspace backed by Premium capacity – but soon it will be available here

How to set it up

In my workspace I have a simple report with 2 parameters – Year and Country

In order to get a report delivered to multiple recipients with different parameters, I created an Excel workbook in OneDrive for Business with 3 different report rows I want sent, each with different parameter values for Country and Year.

You can store the data in other sources that you can connect Power Automate to – but for this example I just used an Excel file.

Next is to design the flow in Power Automate

In the example I chose the trigger "Manually trigger a flow" – but you can of course choose other triggers to activate this flow – for instance a Recurrence trigger, so you can execute the reports at a specified frequency.

The next step is to get the rows from the Excel file – so I use the "List rows present in a table" action

This action will return the three rows, and next we use the "Apply to each" control to loop through each row

For the output we choose the value from the “List Row…” step

Then we add the action "Export to File for Paginated Reports"

This action will run a selected report from a specific workspace

We can choose between the different file formats that are possible for paginated reports – in this example I chose PDF.

At the end of the action window, we can specify the values for each of the parameters.

OBS – You have to type in the Parameter name manually and be aware that the name of the parameter doesn’t necessarily match the label shown

As in this case – the Year parameter is called “DateCalendarYear” – and that is the name you must use for “ParametersValues name”

We bind the value of the parameter to the value from the Excel file using the Dynamic content selector.

The next step is to add an action to send the report: bind the To field, add a subject, and in the attachments section give the attachment a name and use the result of the "Export to File for Paginated Reports" action as the attachment content

The flow is now ready for testing – save your flow and hit the Test button.

Accept the connections used and hit Run Flow – then we can go to the flow run overview and follow the steps.

Hopefully you will see that all steps complete successfully.

And we can check our mailbox to see the report delivered

The e-mail with the report can be sent to any e-mail address.

Can we pay by the hour?

In my case I used a Power BI Embedded A4 SKU as capacity – this can be turned on and off, so you can expand the flow with actions that start the capacity, run the subscriptions, and then turn the capacity off again.

You only pay for the time the capacity is on – and the list price per hour is $8.

Hope this can help you – I am really looking forward to all the exciting stuff that is coming for Power BI paginated reports.

#PowerBI – External Tool – Open Power BI Report Builder – part 1

As you may have noticed, I have blogged twice about the awesome new feature where we can build our own external tool buttons in the Power BI Desktop ribbon.

Here are links to the previous posts.

  • Analyze in Excel (link)
  • Open in Tableau (link)

One perhaps forgotten member of the Power BI family is Power BI Report Builder – aka paginated reports – and even though it requires a Premium capacity or a Power BI Embedded A SKU to publish/share reports, I thought it might be interesting to see if we could link the local pbix file to the Report Builder.

I will write a part 2 where the external tool also supports desktop files that are connected to Azure Analysis Services or a Power BI dataset.

You can download the free Power BI Report Builder from here.

How to build the external tool

The tool has two files

  • A PowerShell file – ConnectToPowerBIReportBuilder.ps1
    Must be stored in C:\temp
  • An external tool file – OpenInPowerBIReportBuilder.pbitool.json
    Must be stored in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

OBS – in order to run a PowerShell script on your PC, you need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170

You can download the files from here – https://github.com/donsvensen/openinpowerbireportbuilder

The PowerShell script

The Report Builder uses the rdl file format, which is an XML file.

So the PowerShell script simply creates an rdl file and makes the connection dynamic, using the server and database name that the external tool provides as arguments when the button is clicked in Power BI Desktop.

This is a small part of the XML that creates the rdl file – it could be expanded to create datasets and insert visualizations.

        $tdsXml = "<?xml version=""1.0""?>
<Report MustUnderstand=""df"" xmlns=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition"" xmlns:rd=""http://schemas.microsoft.com/SQLServer/reporting/reportdesigner"" xmlns:df=""http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition/defaultfontfamily"">
  <rd:ReportUnitType>Mm</rd:ReportUnitType>
  <rd:ReportID>cd1b0079-99d9-4e99-9f5e-6fda21f9f09d</rd:ReportID>
  <df:DefaultFontFamily>Segoe UI</df:DefaultFontFamily>
  <AutoRefresh>0</AutoRefresh>
  <DataSources>
    <DataSource Name=""PowerBI"">
      <rd:SecurityType>Integrated</rd:SecurityType>
      <ConnectionProperties>
        <DataProvider>OLEDB-MD</DataProvider>
        <ConnectString>Data Source=$port;Initial Catalog=$database</ConnectString>
        <IntegratedSecurity>true</IntegratedSecurity>
      </ConnectionProperties>
      <rd:DataSourceID>8c230b7c-799c-4007-af55-6bc26eca797c</rd:DataSourceID>
    </DataSource>
  </DataSources>
  <ReportSections>
    <ReportSection>

What happens

After you have placed the OpenInPowerBIReportBuilder.pbitool.json file in the correct folder and restarted Power BI Desktop, you will see a new button in the ribbon.

When you have a pbix file containing a data model open and click the button, a desktoptobuilder.rdl file will be generated in C:\temp

And Power BI Report Builder will open – in the XML I have also inserted text in the report header showing the localhost port.

In the Data Sources you will find

A data source called PowerBI, which is connected to your running instance of the tabular model

And then we can create a dataset that uses the data source

And use the designer to create the DAX query

And use this in a chart

OBS – remember to save the file with another name – and to update the connection information after you close and reopen the desktop file.

In my next post I will demonstrate how we can do the same if the pbix is connected to Azure Analysis Services or a Power BI Dataset.

Stay tuned 🙂

PS – This one is for #paginatedreportbear

Paginated Report Bear (@PaginatedBear) | Twitter

#PowerQuery – Dynamically reference a table via a Parameter

The other day I had a fun challenge from my co-worker – Søren Faurum – who wanted to change the reference to a column from one table to another table with the same column name.

OBS – The solution is not supported in the Power BI Service.

In his case it was

let
    x = List.Distinct(TableName1[StoreKey])
in
    x

And he wanted TableName1 to be the value of a parameter.

Here is an example (data is from AdventureWorks) – in my model I have two tables – DimReseller and DimCustomer – both have a GeographyKey

A parameter called TableName should now be used in a query to retrieve the unique list of key values from either DimReseller or DimCustomer.

If we just use

List.Distinct(TableName[GeographyKey])

We do get an expression error as the parameter is a text value and not a table that has fields.

Then I thought we could use the function Expression.Evaluate() – link to documentation

But no – apparently, we can’t reference a Table name like this.

#shared to the rescue 🆘

Then I thought why not use #shared – as I knew that #shared lists all the functions, tables, parameters etc. that are available in the current pbix file – including tables whether they are loaded or not.

Then we can turn the record into a table and filter it based on a parameter
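In M that could look something like this – a minimal sketch assuming the parameter is called TableName; the step names match the ones used in the formula further down:

let
    // #shared is a record of everything defined in the current file
    Source = #shared,
    #"Converted to Table" = Record.ToTable(Source),
    // Keep only the entry whose name matches the TableName parameter
    #"Filtered Rows" = Table.SelectRows(#"Converted to Table", each [Name] = TableName)
in
    #"Filtered Rows"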

This will give us a table with one record.

Where the Value column will contain the table we want to be dynamic

Then by adding the following to the formula

  • {0} to get the first row/record
  • [Value] to retrieve the table in the record
  • [GeographyKey] to retrieve the column

We can get all the Geography Keys in the table

And by putting all of this into the argument of the List.Distinct function

= List.Distinct(Table.SelectRows(#"Converted to Table", each [Name] = TableName){0}[Value][GeographyKey])

Which returns 510 rows.

And now I can change the parameter to DimCustomer

And see it returns 336 rows

However, if we publish the model to the Power BI Service, scheduled refresh won't work, as #shared is not supported in the Service.

So we can only use it in the desktop

Stay Querious

Connect your #PowerBI desktop model to #Tableau Desktop via External Tools in PowerBI

I recently created an external tool for Power BI Desktop that connects your Power BI Desktop model to Excel (https://eriksvensen.wordpress.com/2020/07/27/powerbi-external-tool-to-connect-excel-to-the-current-pbix-file/) and then I thought – could there also be a need for an external tool that opens the desktop model in Tableau Desktop?

So, I downloaded a trial version of Tableau Desktop to see what is possible.

And sure enough, Tableau can connect to Microsoft Analysis Services and therefore also to the localhost port that Power BI Desktop uses.

We can also save a data source as a local data source file in Tableau

Which gives us a file with a tds extension (Tableau Data Source)

When opening the file in Notepad, we can see the connection string and some extra metadata records.

It turns out that the tds file does not need all the metadata record information – so I cleaned the tds file down to the minimal content (it is the $tdsXml value in the PowerShell script below).

Opening this file from the explorer will open a new Tableau Desktop file with the connection to the specified model/database/server.

The external tool

Knowing this I could create an external tool the same way as my Excel connector.

First, create a PowerShell script.

OBS – in order to run a PowerShell script on your PC, you need to set the execution policy – https://go.microsoft.com/fwlink/?linkid=135170

The PowerShell script

Function ET-TableauDesktopODCConnection
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $false)]
        [string]
        $port,
        [Parameter(Mandatory = $false)]
        [string]
        $database,
        [Parameter(Mandatory = $false)]
        [string]
        $path
    )

    # Build the tds (Tableau Data Source) XML with the localhost port and database name
    # that Power BI Desktop passes to the external tool as arguments
    $tdsXml = "<?xml version='1.0' encoding='utf-8' ?>
<datasource formatted-name='LocalPowerBIDesktopFile' inline='true' source-platform='win' version='18.1' xmlns:user='http://www.tableausoftware.com/xml/user'>
  <document-format-change-manifest>
    <_.fcp.SchemaViewerObjectModel.true...SchemaViewerObjectModel />
  </document-format-change-manifest>
  <connection authentication='sspi' class='msolap' convert-to-extract-prompted='no' dbname='$database' filename='' server='$port' tablename='Model'>
</connection>
</datasource>"

    # The location of the tds file to be created and opened
    $tdsFile = "$path\tableauconnector.tds"

    $tdsXml | Out-File $tdsFile -Force

    # Open the tds file with its default application (Tableau Desktop)
    Invoke-Item $tdsFile
}

ET-TableauDesktopODCConnection -port $args[0] -database $args[1] -path "C:\temp"

The script simply creates a tableauconnector.tds file and stores it in C:\temp – and the server and database in the XML content are passed in dynamically as $args[0] and $args[1] when the external tool is called from Power BI Desktop.

Save the script in C:\temp and call it ConnectToTableau.ps1.

The OpenInTableau.pbitool.json file

Next step was to create a pbitool.json file and store it in C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

{
  "version": "1.0",
  "name": "Open In Tableau",
  "description": "Open connection to desktop model in Tableau ",
  "path": "C:/Windows/System32/WindowsPowerShell/v1.0/powershell.exe",
  "arguments": "C:/temp/ConnectToTableau.ps1 \"%server%\" \"%database%\"",
  "iconData": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJAAAACQCAYAAADnRuK4AAAABmJLR0QA/wD/AP+gvaeTAAADRklEQVR4nO3dv27TUBiH4WPEitSRS+iCurO0GzdRiS5sXRhAXZhYEAxd2LoUiZtgaxb2iqWXwFiJCzgsqPRPrMb5Jc1x/TxbqgSi5O2xE3+uSwGAUeo2/QRac3R8cla6bvfqB7XOPr19s7e5Z9S2J5t+AoybgIgIiIiAiAiIiICICIiIgIgIiIiAiEziUMbR8cnZovetXbfTlbJ1dbuUy67W80UfP7XDHk83/QQexPVjW/fd9e7trSGPnxqbMCICItLEJqyeljrv593BivbRap0tfNdwH2hVDj58mfuanH5819R+axMBrduQHdvb80BdredT2zEewiaMiICICIiIgIgIiIiAiAiIiICICIiIgIhM4lDGEA5bDGMFIiIgIgIiIiAiAiISTbf1TRK2ZmWTjQvomyRszaomG61ARAREREBEBEREQESaOMdo7eeFjdBYzguzAhEREBHjHP/8fv/i3i8An3/+1dTmowVWICICIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiSx8Lc3Xjcdk/nJ2VWv+/X103+/51dy/9d61ARAREpIlxjilPHvZpbfKwjxWIiICICIiIgIgIiEgTn8KGWmQAfiz/79gH9a1ARG7UP5arG29qBVqHZAXaP5ydDbj7Tqn16v0qXXdZSln4/eo77HFzE+bqxuNy/djW8MdulVLi98smjIiAiNzchI3w6saT1nULv18l3AfqfQrLPnCT80B2ooczD0STRvlF4jp+a/11juVYgYgIiIiAiAiIiICINPEp7Of29txPQC8vLib7qefZq29zX5M/P1439ZpYgYgIiMjSmzCnMY/LKg5bzGMFIiIgIgIiIiAiAiIiICICIiIgIgIiIiAiAiLSxDhHCwzML8cKRERARKJlu2+SsDUPOdnYN0nYmlVNNlqBiAiIiICICIiIgIg08eWZ88Lucl4YkyAgIgIiIiAiAiJinOOWdf0108fKCkREQEQERERARAREREBEBEREQEQERERARCZxKGPw1Y1v3R7y+Kkd9mgioLVPHjZwdeOhWps87GMTRkRARJrYhK1dA1c3fqxGsZ19SOaBhrEJIyIgIgIiIiAiAiIiICICIiIgIgIiIiAAAAAYjb8VJdQbiRXyOAAAAABJRU5ErkJggg=="
}

Test it

Now restart Power BI Desktop and the external tool should be visible in the ribbon

Then open a pbix file with a model and hit the button.

A PowerShell window will briefly be visible, and then Tableau opens the tds file – and now we have a new Tableau workbook with a connection to the active Power BI Desktop data model.

And we can start to do visualizations that are not yet supported in Power BI –

How can you try it

You can download the files needed from my GitHub repository – link

Feedback

Let me know what you think and, if possible, share some of the viz that you make.