Changing power query data source across tables #627
-
Situation: the original model uses a different SQL database and AAS instance than the new one. I've read some of the discussion around Legacy vs. Power Query data sources, but being new to TE3 I may need clarification on the options available. All tables use a Power Query expression to load from the data source, i.e. `let Source = SQL/server;database`. Although I changed the primary data source to the new server/database, I see that all tables still have the prior server/database defined in their partition's Power Query source statement.

- Do I have to manually edit every table to adjust, or is there some global method (replace, etc.)?
- During every PROD deploy I'm going to have to change the values from DEV to PROD database values. Can this be done in a parameterization/replacement script during Release, and if so, how?
- Is there an issue importing new tables with Power Query loading, or editing them later?
- Although I'm looking at TE3 workflows, I see mention of TE2 users still using VS Pro to import tables?

I'm new to TE. Thanks, Tony
Replies: 2 comments
-
I think I see part of the root of my issue: I've done more Power Query work in Power BI than in AAS Tabular models. The same name is used in the Power Query `Source = xxxx` expression, so the partitions reference the data source by name. I changed the address and database name in the properties of the Data Source (Data Sources folder) and am correcting now. So the question becomes:
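For anyone landing here with the same question, the "global method" can be sketched as a Tabular Editor C# script run in TE's script window (where `Model` is predefined). This is a sketch, not the poster's exact approach, and the server/database values are placeholders:

```csharp
// Update every structured (Power Query) data source in one pass.
// Partitions whose M expressions reference the data source by name
// pick up the new server/database automatically.
foreach (var ds in Model.DataSources.OfType<StructuredDataSource>())
{
    ds.Server   = "newserver.database.windows.net";  // placeholder value
    ds.Database = "NewDatabase";                     // placeholder value
}
```

Because the partitions point at the data source object rather than embedding the connection, editing the data source (or scripting it as above) avoids touching each table individually.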
Tony
-
Solved. For our (single) Structured Data Source, deployed to Azure Analysis Services (not an SSAS or Power BI Premium deploy): as I'm using the Analysis Services API to do our data refreshes (ProcessFull called from an Azure Function), I also had to extend the Release script to redundantly set the database, server, and credential (username, password) directly, as well as the same values in the ConnectionString. Hence the additional items added to the script run at the Dev or Prod deploy task in Release.

In the following script the line breaks were added for readability; remove them so the whole script is written by a single echo statement:

```
echo (Model.DataSources[0] as StructuredDataSource).ConnectionString = Environment.GetEnvironmentVariable("SQLDWConnectionString");
(Model.DataSources[0] as StructuredDataSource).Password = Environment.GetEnvironmentVariable("SQLDWPassword");
(Model.DataSources[0] as StructuredDataSource).Username = "$(SQLDWUsername)";
(Model.DataSources[0] as StructuredDataSource).Server = "$(SQLDWServer)";
(Model.DataSources[0] as StructuredDataSource).Database = "$(SQLDWDatabase)";
> SetConnectionStringFromEnv.cs
```

This requires some extra variables; I put Password and ConnectionString in environment variables for security. The others (username, server, and database name) I did not secure further. BTW, I also apparently have SchemaCheck working with the (single) Structured DataSource.

Tony
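The echo-to-file step of the release task can be sketched like this. The `$(...)` tokens are Azure DevOps pipeline variables (kept literal here by single quotes), and the Tabular Editor CLI invocation in the final comment is an assumption about how the generated script would be applied, not taken from the post:

```shell
# Recreate the one-line script file from the release task. In an Azure DevOps
# task, $(SQLDWServer) and $(SQLDWDatabase) are substituted before this runs;
# the single quotes keep them literal in this standalone sketch.
echo '(Model.DataSources[0] as StructuredDataSource).Server = "$(SQLDWServer)"; (Model.DataSources[0] as StructuredDataSource).Database = "$(SQLDWDatabase)";' > SetConnectionStringFromEnv.cs

# Hypothetical follow-up step: apply the script to the deployed model with the
# Tabular Editor command line (exact flags depend on your TE version):
# TabularEditor.exe "asazure://<region>.asazure.windows.net/<server>" "<database>" -S SetConnectionStringFromEnv.cs
```

Keeping the generated file to one line avoids quoting and line-continuation pitfalls in the pipeline task, which is why the post removes the line breaks before execution.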