#PowerQuery – Control the expand columns so it includes new columns

Imagine a scenario where a column in your Power Query query contains a table with a set of columns, and you know that more columns will be added to that table at some point.

In the case above, I know that at some point we will add more columns to the merged ProductAttributes table.

How can we make this dynamic using Power Query?

When we click the icon for expanding the table, we might just select the columns and move on.

But notice the formula created in the formula bar.

It says

= Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes", {"Brand"}, {"Brand"})

This means that even though we might add new columns to the ProductAttributes table, it will still only be Brand that is expanded – and only that column.

The two bolded arguments are lists that contain the column names to expand and the new names of the columns. The last argument is optional, so we can actually skip it if we want to keep the original names – https://docs.microsoft.com/en-us/powerquery-m/table-expandtablecolumn
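As a quick illustration of those two list arguments (the "Color" column here is a hypothetical second attribute, not from the example above):

```m
// Expand two columns and rename them to avoid name clashes
= Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes",
    {"Brand", "Color"},
    {"Attributes.Brand", "Attributes.Color"})

// Omitting the fourth argument keeps the original column names
= Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes",
    {"Brand", "Color"})
```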

Now by changing the formula to this

= Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes",
    List.RemoveItems(
        Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0}),
        {"ProductKey"}
    )
)

We can make the table expand dynamically when new columns are added to the ProductAttributes table.

We get the new column included as well

The magic formula works like this:

Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0})

This returns a list of column names from the step before the expansion (note that I use the step name and the column name) – and I use the {0} to extract the column names from the nested table in the first row only, otherwise the formula will fail.

But as we cannot have the same column name twice (i.e. ProductKey needs to go away), we need to use the List.RemoveItems function

List.RemoveItems(Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0}), {"ProductKey"})

Thereby removing the ProductKey item from the list.
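To make the evaluation concrete – a sketch assuming the nested table holds the join key, Brand and one newly added Color column:

```m
// Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0})
// might return {"ProductKey", "Brand", "Color"}

// List.RemoveItems then strips the duplicate join key:
List.RemoveItems({"ProductKey", "Brand", "Color"}, {"ProductKey"})
// returns {"Brand", "Color"}
```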

And this means that when we get more columns in the ProductAttributes table, they will automatically be included in the expanded columns.
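Put together, a complete query could look like this – a sketch where the query names Products and ProductAttributes and the ProductKey join column are assumptions based on the example above:

```m
let
    // Merge brings ProductAttributes in as a table-valued column
    #"Merged Queries" = Table.NestedJoin(Products, {"ProductKey"},
        ProductAttributes, {"ProductKey"}, "ProductAttributes", JoinKind.LeftOuter),
    // Expand every column except the join key – newly added columns are picked up automatically
    #"Expanded ProductAttributes" = Table.ExpandTableColumn(#"Merged Queries", "ProductAttributes",
        List.RemoveItems(
            Table.ColumnNames(#"Merged Queries"[ProductAttributes]{0}),
            {"ProductKey"}))
in
    #"Expanded ProductAttributes"
```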

Hope this can help you make your Power Query queries even more dynamic.

Here is an example file – Link

Power Query On !

23 thoughts on “#PowerQuery – Control the expand columns so it includes new columns”

  1. And if you are expanding a list of records this line does the trick:

    = Table.ExpandRecordColumn(#"Expanded Documents", "Record",
        List.RemoveItems(
            Record.FieldNames(#"Expanded Documents"[Record]{0}), {"id"})
    )

  2. Hi, I have a similar thing going here, but cannot figure out how retrieve new entries. It looks like this:

    And I’d like to have the last couple of dates included. (I can see that they are there, but I need this dynamic approach…)

    = Table.ExpandRecordColumn(#"Removed Columns", "Value", {"2020-02-02", "2020-01-31", "2020-01-30", "2020-01-29", "2020-01-28", "2020-01-27", "2020-01-24", "2020-01-23", "2020-01-22", "2020-01-21", "2020-01-20", "2020-01-17", "2020-01-16", "2020-01-15", "2020-01-14", "2020-01-13", "2020-01-10", "2020-01-09", "2020-01-08", "2020-01-07", "2020-01-03", "2020-01-02"})

    1. Hi Peter,

      You could try this

      = Table.ExpandRecordColumn(#"Removed Columns", "Value", Record.FieldNames(#"Removed Columns"[Value]{0}), Record.FieldNames(#"Removed Columns"[Value]{0}))

      The Record.FieldNames(#"Removed Columns"[Value]{0}) will read the field names from the first row of the table returned by the step #"Removed Columns"

      Hope this helps you

      /Erik

  3. Hello, this is a great article and the function works fine, but it only takes the list of columns from the first row {0}. In my case I have multiple tables and only some of them may have new columns added, but I don’t know which table it is… do you know how I could check this, or take columns from all rows and then remove duplicates?

      1. Erik, Just imagine you have 4 Excel files in a SharePoint folder. Each file has the same table, let’s call it tblForecast. You need to upload all files from the folder into Power BI and append all tables into a single table. Now, initially each table has 10 columns P1 – P10, then a month later somebody added the next column P11 in one of the files. Let’s say in the 3rd file you are loading into Power BI. With your function I can only load columns from a single file. So, if it’s not the one with the new column, it will not get the data. The problem is I don’t know which file had the columns added.
        I found a few various solutions on the internet, for example with List.Union, which works… but for some reason it takes 5-10 min to complete the query with 4 small files and around 130 rows. Your function does the same in around 4 seconds. I don’t really understand why the version with List.Union works so slowly, as it actually gives me the list of columns very fast, but then when I use it under Table.ExpandTableColumn it takes forever…

      2. Hi Piotr – sorry for the late answer – the reason why M is slow in your case might be because M will evaluate your full/all queries when reading the files, meaning it can end up reading the files many times – you can use the new Query Diagnostics to check this.

        Do you use Table.Combine or Table.ExpandTableColumn to get the full table – and is P1-P10 referring to periods?

        /Erik
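        A side note (not part of the original reply): Table.Combine itself unions differing column sets across the tables it appends, filling the gaps with null, which is why it can sometimes replace the dynamic expand entirely. A minimal sketch:

        ```m
        // Each table may have a different set of columns (e.g. one file gained P11);
        // Table.Combine appends them and unions the column names automatically
        Table.Combine({
            #table({"P1", "P2"}, {{1, 2}}),
            #table({"P1", "P11"}, {{3, 4}})
        })
        // the result has columns P1, P2 and P11, with null where a table lacked the column
        ```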

  4. Hi Erik,
    I have an issue regarding a table reload from OData – SharePoint. When I am refreshing with only the source, it takes roughly 15 minutes to refresh, but once I add the expand column step it takes forever to refresh – can’t understand the issue.

    The query in the Advanced Editor is

    let
    Source = OData.Feed("https://rkwonder.sharepoint.com/sites/Helpdesk/_api/web/lists/GetByTitle('ServiceIncidentHistory')/items?$select=RequestID,TicketIDId,Created,Technician/Title,Technician/ID,Author/Title&$expand=Technician/Title,Technician/ID, Author/Title&$orderby=Created desc", null, [Implementation="2.0"]),
    #"Expanded Technician" = Table.ExpandRecordColumn(Source, "Technician", {"Title"}, {"Title"})
    in
    #"Expanded Technician"

    1. Hi Hardik,

      I haven’t tried that scenario – I can see that you use an orderby criterion in your OData query – I would try to remove that; there is no need to sort the data.

      And have you tried the SharePoint connector?

      /Erik

  5. Hello dear Erik, thanks for all your help and tips and teachings, I’ve learned so many things from you 🙂
    I have a problem in PowerBI, hope you can help.
    I have 2 different Tables, the 1st table has a HFM account and SAP and month columns, like 01.11.2019, 01.12.2019, ….
    In the 2nd one I have HFM account and SAP and I should calculate the sum of specific elements and change the Header of the only column (with dynamic codes) to the exact month column names of the 1st table.
    Then I need to merge these 2 tables with choosing HFM account and SAP as matching columns, then copy the month column from 2nd table in the 1st table.
    Any idea how it should be done? Because in the expand table step, after I choose the column from the 2nd table, it is treated as a string and any further changes will cause errors;
    Unfortunately your solution does not work here! Is it because I have to choose 2 keys (HFM account & SAP)?

    I have tried your solution
    I will appreciate any help, thanks in advance

  6. (Table2 has only these 3 Columns and the column date changes dynamically every month, in feb it will be 01.02.2020, March 01.03.2020)
    Table2:
    HFM | SAP | 01.01.2020
    1 | 1a | 1.2
    1 | 1b | -3.2
    1 | 1c | 0.3
    2 | 2a | -4.0
    2 | 2b | 4.6
    3 | 3a | 3.9

    Table1:
    HFM | SAP | 01.11.2019 | 01.12.2019
    1 | 1a | 1.6 | 1.7
    1 | 1b | -3.9 | -4.0
    1 | 1c | 4.0 | 0.9
    2 | 2a | 8.0 | -3.9
    3 | 2b | 3.9 | null
    3 | 2c | 3.9 | 3.4
    3 | 3a | 3.9 | null
    I merge these 2 tables choosing HFM and SAP as matching columns (Right Outer). With your solution, I get errors like:
    SAP exists in 1st Table
    Expression.SyntaxError: Invalid identifier
    etc.
    What I need is just to merge Table1 with the date column from Table2, which has a dynamic name every month.
    thank you so much

  7. Hi Erik – This is a wonderfully elegant solution. Thank you for sharing it.

    My use case is very similar to your example. I need to expand a dynamic list of columns from a table. The glitch seems to be the width of the table being expanded. My list of columns to be expanded is very long, approx 30. This dynamic option only expands the first 10 columns and omits the remaining columns in the table. In the static syntax that hard-codes the column names, I am able to “Load More Columns” to select all 30 columns for expansion. How can I force Table.ColumnNames to return ALL column names?

    I greatly appreciate your advice. Thanks!

    1. Hi Leah, Thank you for the kind words.

      I have just tried to do it with a table with 21 columns and it works

      // Table21Col
      let
      Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("HYrBEQAhCMR64X0PAQGvFsf+25D4yWQzu7dUySc+G2qNxBZNHVuYNv7ReJ0UpASVDXvfoHEzfhME00vOuQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Col1 = _t, Col2 = _t, Col3 = _t, Col4 = _t, Col5 = _t, Col6 = _t, Col7 = _t, Col8 = _t, Col9 = _t, Col10 = _t, Col11 = _t, Col12 = _t, Col13 = _t, Col14 = _t, Col15 = _t, Col16 = _t, Col17 = _t, Col18 = _t, Col19 = _t, Col20 = _t, Col21 = _t]),
      #"Changed Type" = Table.TransformColumnTypes(Source,{{"Col1", Int64.Type}, {"Col2", Int64.Type}, {"Col3", Int64.Type}, {"Col4", Int64.Type}, {"Col5", Int64.Type}, {"Col6", Int64.Type}, {"Col7", Int64.Type}, {"Col8", Int64.Type}, {"Col9", Int64.Type}, {"Col10", Int64.Type}, {"Col11", Int64.Type}, {"Col12", Int64.Type}, {"Col13", Int64.Type}, {"Col14", Int64.Type}, {"Col15", Int64.Type}, {"Col16", Int64.Type}, {"Col17", Int64.Type}, {"Col18", Int64.Type}, {"Col19", Int64.Type}, {"Col20", Int64.Type}, {"Col21", Int64.Type}})
      in
      #"Changed Type"

      // Query1
      let
      Source = 1,
      #"Converted to Table" = #table(1, {{Source}}),
      #"Added Custom" = Table.AddColumn(#"Converted to Table", "Custom", each Table21Col),
      #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", Table.ColumnNames(Table21Col))
      in
      #"Expanded Custom"

      Can you share your M queries so I can see if I can find the problem in your query?

      Best regards
      Erik

      1. Erik, thank you so much for the response.

        Upon further research, I realize it isn’t the size of the table, as you correctly demonstrate. I’m working with a list of appended tables, each with different columns. I don’t just need the columns from the table in the first row {0}, I need the columns from the tables in the first 9 rows {0-8}. The tables share some, but not all, columns in common.

        My current code, with static column names, is:

        Source = Table.Combine({Table1, Table2, Table3, Table4, Table5, Table6, Table7, Table8, Table9}),
        #"Expanded Transform File" = Table.ExpandTableColumn(Source, "Transform File", {"Column 1", "Column 2", … "Column 30"}),

        Thanks again for your advice!

        Leah

      2. Hi Leah,

        Does your Table1, Table2, etc. contain plain columns, or is it a column containing a table in each of the tables?

        If so, you need to use ExpandTableColumn on each of the tables in the Combine function for it to work

        /Erik

  8. Thanks Erik,
    I have a similar question:
    I need to expand all the columns, not only from the first record {0} but from all the records.
    I am working with a JSON file, and each member can have different properties that Power Query parses to columns.
    So I need it first to scan all the possible column names and then to expand.
    Sort of what clicking the “expand icon” does – it populates all possible column names.
    Otherwise, if I use only the {0} record, it will miss some of the following records’ columns.
    Please advise.
    Thank you!
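    A common pattern for this (a sketch, not from the original thread – the step name #"Previous Step" and the record column name "Value" are assumptions) is to collect the field names from every record before expanding:

    ```m
    // Gather the distinct field names across ALL records in the "Value" column,
    // then expand using that complete list (assumes no null records)
    let
        AllFieldNames = List.Distinct(
            List.Combine(
                List.Transform(
                    Table.Column(#"Previous Step", "Value"),
                    Record.FieldNames))),
        #"Expanded Value" = Table.ExpandRecordColumn(#"Previous Step", "Value", AllFieldNames)
    in
        #"Expanded Value"
    ```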

  9. No master record unfortunately.
    Column names vary from record to record.
    So it has to “scan” all the records for all possible column names, then expand.
    Thanks a lot!
