Power BI Exchange


Power BI bulk Export using R-based function

By Michal Haviar posted Jan 15, 2019 07:44 PM


This guide is an extension of the original article by Soheil Bakhshi. It shows how to add parameters, use them in an R-based function, and add some global environment references for more comfortable usage.

Use this guide if you:

  1. Have a relatively small amount of data to export, such as a list of values or bridge tables.
  2. Want to utilize Power BI's awesome ability to digest all kinds of data and use it as an ETL tool.
  3. Have already applied some advanced M transformations (in my case it was Power Query data anonymization) and want to save the result back to your source database.
  4. Need to export your transformed data frequently, but don't need to load it incrementally (I haven't needed this yet, but it can definitely be adjusted for incremental loads).
  5. Don't really want to bother with data types or table names (R will take care of them).
  6. Want to be flexible with server, database and table names (parameters).

If you don’t have any previous experience with R scripts and R libraries, I suggest checking Soheil’s article first and then coming back to this one. He does an awesome job of describing how to install the needed prerequisites.


Prerequisites:

- Power BI Desktop

- R for Windows or SQL Server 2016 R Services

- R library called RODBC

- a local instance and database of SQL Server (in the example below I created a database “PowerBI_export” on a local server called “.\SQLExpress”)

Solution 1

This solution is identical to what Soheil published here. If you are missing any of the prerequisites - check his article first.

Simply create a duplicate of your query, press the Run R script button in the Transform ribbon of the Power Query Editor, and add the code below:


library(RODBC)

conn <- odbcDriverConnect("driver=SQL Server; server=.\\SQLExpress; Database=PowerBI_export")

sqlSave(conn, dataset, tablename="TableExport_Single", rownames=FALSE, safer=FALSE, append=FALSE)


This R script is pretty straightforward and you only need to edit the following parts:

- server is the instance of the server to which you want to export your data

- database is the name of the database where you want your data to be exported

- tablename is the name of the new table, which will be created if it doesn’t exist already

In case you’d like to know more, check the RODBC documentation.
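To make the three editable parts concrete, here is an illustrative Python sketch (RODBC itself is R, so this is only a mock-up of the string assembly) showing how the server and database values combine into the ODBC connection string that odbcDriverConnect receives:

```python
def build_conn_string(server: str, database: str) -> str:
    """Assemble the ODBC connection string passed to odbcDriverConnect()."""
    return f"driver=SQL Server; server={server}; Database={database}"

# Example values from this article; any local instance/database works the same way.
print(build_conn_string(".\\SQLExpress", "PowerBI_export"))
# → driver=SQL Server; server=.\SQLExpress; Database=PowerBI_export
```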

Successful export would look something like this in the Power Query Editor:
Successful export solution 1

… and in the SQL Server Management Studio:

So let’s summarize some pros and cons of this approach.


Pros:

- you can actually EXPORT data from Power Query Editor

- you don’t have to create a table in advance and define column types

- R components installation is just a one-time job


Cons:

- one must duplicate each query and add this R script step manually. There must be a way around it…

- server, database and table names are hardcoded. This is fine for, say, 5 tables, but if you are exporting 30 or more, you have a lot of typing ahead of you… and once you have typed all 30 server, database and table names, you might find out that someone has changed the server. There must be a way around it…


Solution 2

This solution gets rid of some of the cons and extra coding from Solution 1 by creating an R script function. First, we need to create parametrized queries for the server and database, plus the function itself. Copy/paste the following code into 3 blank queries, named ServerToUpload, DatabaseToUpload and Rscript_Export:


ServerToUpload:

".\\SQLExpress" meta [IsParameterQuery=true, List={".\\SQLExpress", "SQL"}, DefaultValue=..., Type="Text", IsParameterQueryRequired=true]



DatabaseToUpload:

"PowerBI_Export" meta [IsParameterQuery=true, List={"PowerBI_Export", "master"}, DefaultValue=..., Type="Text", IsParameterQueryRequired=true]




Rscript_Export:

let
Source = (Dataset as table, Tablename as text) => let
Rscript = R.Execute("library(RODBC)#(lf)conn <- odbcDriverConnect(""driver=SQL Server; server=" & ServerToUpload & "; Database=" & DatabaseToUpload & """)#(lf)sqlSave(conn, dataset, tablename=""" & Tablename & """, rownames=FALSE, safer=FALSE, append=FALSE)", [dataset=Dataset])
in
Rscript
in
Source








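The doubled quotes in the M code above are not a typo: M escapes a literal quote inside a string by doubling it, so every `"` that must reach the R script is written as `""`. A small illustrative Python sketch of that rule:

```python
def to_m_string_literal(text: str) -> str:
    """Render text as a Power Query (M) string literal:
    wrap it in quotes and double any embedded quote character."""
    return '"' + text.replace('"', '""') + '"'

# The R fragment  tablename="TableExport_Single"  embedded in M therefore
# doubles each quote around the table name:
print(to_m_string_literal('tablename="TableExport_Single"'))
# → "tablename=""TableExport_Single"""
```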
Dataset and Tablename are the function parameters for the table content and the table name you want to export.

ServerToUpload and DatabaseToUpload are references to the parametrized queries for the server and database you’ve previously created from blank queries. These shouldn’t change often, but if they do there is no need to edit the code: simply change their current values either through the UI or the Advanced Editor:
Note: R needs local instance to be separated with "\\" instead of "\"
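The note above can be mimicked with a tiny helper (illustrative Python only): doubling each backslash lets the instance name survive inside an R string literal:

```python
def escape_for_r(server: str) -> str:
    """Double each backslash so the server name parses correctly
    inside an R string literal."""
    return server.replace("\\", "\\\\")

print(escape_for_r(r".\SQLExpress"))  # → .\\SQLExpress
```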

To export a table via the function, simply click on the Rscript_Export function and Invoke it. Dataset is the table you want to export and Tablename is the name it should get in SQL Server.

Result on the SQL Server:

So much for not editing code every time. But can we do this in bulk?

In the demo file at the end, we have 3 manually created tables to export (of course, these can be M queries with an Excel or webpage source and dozens of transformation steps).

Now let’s create a manual table with records of table names:

To get the actual data without creating duplicates of each query, you can utilize the Power BI global environment by using #shared in the Expression.Evaluate function. Try adding this formula to the Custom Column: Expression.Evaluate( [Table] , #shared)

The formula returns a Dataset column with nested tables.

It works just fine if the tables keep some naming conventions, like not having any special characters and using “_” instead of “ ”. Otherwise you get an Expression.Error. There is a way to fix this in Solution 3, so stay tuned.
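Conceptually, #shared behaves like a record mapping every query name in the file to its value, and evaluating a bare name is just a lookup — but the name must first parse as a plain identifier, which is why a space breaks it. A toy Python sketch of that behavior (table names and values are made up):

```python
import re

# Made-up miniature of the #shared record: query name -> query value.
shared = {"TableExport_1": [1, 2, 3], "Table Export 3": [7, 8, 9]}

def evaluate(expression: str, environment: dict):
    """Toy stand-in for Expression.Evaluate with a bare (unquoted) name:
    the name must parse as a plain identifier before it can be looked up."""
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_.]*", expression):
        raise SyntaxError(f"Expression.Error: '{expression}' is not a valid bare name")
    return environment[expression]

print(evaluate("TableExport_1", shared))   # → [1, 2, 3]
# evaluate("Table Export 3", shared)       # raises: the space breaks the bare name
```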

To invoke the function:

  1. select Invoke Custom Function
  2. type the new column name, like Exported or Rscript_Export
  3. select the Rscript_Export function
  4. the Dataset parameter input will be of table type by default - change it in the drop-down option to column type
  5. in the Dataset input select the Dataset column
  6. in the Tablename input select the Table column

A successful export with this function looks like this in the Power Query Editor:

And in SQL Server Management Studio:

Let’s summarize the pros and cons – again.


Pros:

- you can still EXPORT data from Power Query Editor

- you still don’t have to create the table in advance and define column types

- R components installation is just a one-time job

- thanks to using the R script via a function instead of a separate transformation step, you don’t need to duplicate each exported table and apply the R script separately. Exporting 30 or 300 tables (even repeatedly, in case you need to update them) is not a problem anymore.

- server and database are built-in parameters of the R script function, so if you need to change them, no extra coding is needed



Cons:

- you still need to type all the table names manually. There must be a way around it…

- you have to create an extra data column for the function – or do you? There must be a way around it…

- this doesn’t work on tables with “bad” naming conventions. There must be a way around it…


Solution 3

This solution gets rid of all the cons and optimizes for bulk export. We can start with probably the most pressing one: naming errors. If you can’t refer to tables with spaces, how does Power BI do it? Let’s try to Reference some tables:

Power BI usually refers to other queries by just their name, but if special characters are used it wraps the name into #” “ – this looks like a solution.
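That quoting rule can be reproduced with a small Python helper (a sketch only; the identifier rule here is a simplification of the real M grammar):

```python
import re

def quote_m_name(name: str) -> str:
    """Wrap a query name in #"..." when it is not a plain identifier,
    mirroring how Power BI references queries with spaces or special characters."""
    if re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name):
        return name                                  # bare reference is fine
    return '#"' + name.replace('"', '""') + '"'      # quoted-identifier form

print(quote_m_name("TableExport_1"))   # → TableExport_1
print(quote_m_name("Table Export 3"))  # → #"Table Export 3"
```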

The previous solution used an extra column to display the data that was afterwards exported, using the following formula: Expression.Evaluate( [Table] , #shared).

So it also uses the table name column as an input, but do we need to display the data? Not really. It can be bypassed by referring to the table data directly, using the same Expression.Evaluate in the function.

To upgrade the R script, change the following parts:


let
Source = (Tablename as text) => let
Dataset = Expression.Evaluate("#"""&Tablename&"""", #shared),
Rscript = R.Execute("library(RODBC)#(lf)conn <- odbcDriverConnect(""driver=SQL Server; server=" & ServerToUpload & "; Database=" & DatabaseToUpload & """)#(lf)sqlSave(conn, dataset, tablename=""" & Tablename & """, rownames=FALSE, safer=FALSE, append=FALSE)", [dataset=Dataset])
in
Rscript
in
Source








You can see that there is now just one input parameter – Tablename. Dataset is derived from Tablename, and it is also wrapped into #“ “ to handle problematic names.

Now, when using this new R script, we don’t need the extra Dataset column, and the TableExport 3 table error is fixed.

But typing each table name is still, shall we say, an imperfection. The previously created Dataset column uses the global environment to invoke the table data, so there’s definitely also a way to get all the table names in the current file using #shared.

Use this script in a new blank query and call it AllTables.


let
    Source = #shared,

    #"Converted to Table" = Record.ToTable(Source),

    #"Added Custom" = Table.AddColumn(#"Converted to Table", "Type",
each Type.TableSchema(Value.Type([Value]))),

    #"Removed Errors" = Table.RemoveRowsWithErrors(#"Added Custom", {"Type"}),

    #"Removed Columns" = Table.RemoveColumns(#"Removed Errors", {"Type", "Value"})
in
    #"Removed Columns"


This script results in a table with all the table names in the current pbix file, where you can then filter the tables you are interested in exporting (faster than writing them down, right?).
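Conceptually, the AllTables query does in M what this Python sketch does with an ordinary dictionary (all names and values here are invented stand-ins): turn the environment record into rows, keep only entries whose value is table-like, and return their names:

```python
# Invented miniature of #shared: every query/function/parameter in the file.
shared = {
    "TableExport_1": [{"a": 1}],        # a table (modeled as a list of row dicts)
    "TableExport_2": [{"a": 2}],
    "Table Export 3": [{"a": 3}],
    "ServerToUpload": ".\\SQLExpress",  # a text parameter, not a table
    "Rscript_Export": lambda t: t,      # a function, not a table
}

def all_table_names(environment: dict) -> list:
    """Keep only the entries whose value looks like a table, much as
    Type.TableSchema + Table.RemoveRowsWithErrors filter non-tables in M."""
    return [name for name, value in environment.items()
            if isinstance(value, list) and all(isinstance(r, dict) for r in value)]

print(all_table_names(shared))
# → ['TableExport_1', 'TableExport_2', 'Table Export 3']
```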


Now all you have to do is invoke the custom function column in the AllTables table with the only parameter you need – Name.

And voilà… an errorless bulk R-script export from Power BI 😉

Note: For the record, if you haven’t deleted the “Value” column in the 4th step of AllTables, you can use it with the Rscript_Export function from Solution 2 as the Dataset parameter, and it fixes the evaluation errors as well.
