We are building dashboards and datasets in QuickSight. We have multiple environments like dev, qa, stage, etc. How do we promote the datasets and dashboards from one environment to another?
Hi @vinayg ,
this is now possible through APIs.
You can build your development pipelines and manage the export/import of assets between different environments, even across different accounts.
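For example, with boto3 (the AWS SDK for Python), kicking off an asset-bundle export could look roughly like this. This is just a sketch: the account ID, job ID, and dashboard ARN below are placeholders, not values from this thread.

```python
def export_job_request(account_id, job_id, resource_arns):
    """Build kwargs for quicksight.start_asset_bundle_export_job (placeholder IDs)."""
    return {
        "AwsAccountId": account_id,
        "AssetBundleExportJobId": job_id,
        "ResourceArns": resource_arns,
        "IncludeAllDependencies": True,   # also export the datasets and data sources
        "ExportFormat": "QUICKSIGHT_JSON",
    }

if __name__ == "__main__":
    import boto3  # AWS SDK for Python

    qs = boto3.client("quicksight", region_name="us-east-1")
    qs.start_asset_bundle_export_job(**export_job_request(
        "111122223333",            # source (dev) account -- placeholder
        "promote-dev-to-qa-001",   # any unique job ID you choose
        ["arn:aws:quicksight:us-east-1:111122223333:dashboard/my-dashboard-id"],
    ))
    # Then poll describe_asset_bundle_export_job until it succeeds, download the
    # bundle from the returned DownloadUrl, and feed it to
    # start_asset_bundle_import_job in the target account.
```
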
You can find more information in our AWS Solutions and blog posts.
These are some references:
- Guidance for Multi-Account Environments on Amazon QuickSight
- Support multi-tenant applications for SaaS environments using Amazon QuickSight | AWS Business Intelligence Blog
- Automate your Amazon QuickSight assets deployment using the new Amazon EventBridge integration | AWS Business Intelligence Blog
- CI/CD and QuickSight multi-account best practices
Hope this helps!
Andrea
Hi @andrepgn ,
I tried using the import-bundle AWS CLI for QuickSight. However, even after passing the override parameters, I am not able to import datasets: they still refer to the data source of my source account. How can I make sure the import uses the data source ARN of my target account?
Hi @vinayg , all that the export produces is just an archive of JSON files. So, for example, you can replace all data source IDs with the correct ones before importing your assets into the selected environment. By editing these JSONs you can do much more than with the override parameters.
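For instance, a tiny Python sketch of that replacement; the two datasource IDs here are made up, so substitute your real source and target ones:

```python
# Hypothetical datasource IDs -- substitute your real source/target ones.
SOURCE_ID = "123ab456-a1bc-12a3-abc1-1234a567b890"
TARGET_ID = "999ef999-e9ef-99e9-efe9-9999e999f999"

def retarget_datasource(raw_json: str, source_id: str, target_id: str) -> str:
    """Swap every occurrence of the source datasource ID, including the ones
    embedded in dataSourceArn strings, before re-importing the bundle."""
    return raw_json.replace(source_id, target_id)

dataset_json = (
    '{"dataSourceArn": "arn:aws:quicksight:us-east-1:111122223333:'
    f'datasource/{SOURCE_ID}"}}'
)
fixed = retarget_datasource(dataset_json, SOURCE_ID, TARGET_ID)
print(SOURCE_ID not in fixed and TARGET_ID in fixed)  # → True
```
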
Hi @vinayg ,
as @Hrolol said, you can export asset definitions in two formats:
- QuickSight JSON
- CloudFormation
In both cases, you can search the exported files for the ARNs of the assets and change the IDs according to what you need.
As a best practice, if you create all the assets with APIs, starting from the data sources, the same IDs can be reused across accounts, so you don't even need to replace them.
If you created the data sources or datasets manually in the new environment, then you just need to open the definition and change it accordingly.
You can, e.g., write a script to automate it:
- a script that just replaces strings
- a dictionary that maps IDs between the source and target environments
And that’s it!
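A minimal sketch of that mapping-dictionary idea, applied to every JSON file in an extracted bundle (the directory layout and IDs in the demo are made up):

```python
import pathlib
import tempfile

def remap_bundle(bundle_dir, id_map):
    """Rewrite every .json file in an extracted asset bundle, replacing each
    source-environment ID/ARN with its target-environment counterpart."""
    for path in pathlib.Path(bundle_dir).rglob("*.json"):
        text = path.read_text()
        for src, dst in id_map.items():
            text = text.replace(src, dst)
        path.write_text(text)

# Demo on a throwaway directory (IDs are made up):
with tempfile.TemporaryDirectory() as d:
    sample = pathlib.Path(d) / "dataset.json"
    sample.write_text(
        '{"dataSourceArn": "arn:aws:quicksight:eu-west-1:111122223333:datasource/dev-id-111"}'
    )
    remap_bundle(d, {"dev-id-111": "qa-id-222"})
    print("qa-id-222" in sample.read_text())  # → True
```
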
This is an example of a JSON export of a data source backed by an Athena connection:
DataSource:
{
  "resourceType": "datasource",
  "dataSourceId": "123ab456-a1bc-12a3-abc1-1234a567b890",
  "name": "DataSource-Name",
  "type": "ATHENA",
  "dataSourceParameters": {
    "athenaParameters": {
      "workGroup": "primary"
    }
  },
  "sslProperties": {
    "disableSsl": false
  }
}
You have:
- dataSourceId: you can leave it as it is. This is the asset you’re creating now, and it can have the same ID it has in another account.
- name: the name you want this data source to have in QuickSight.
- the rest depends on the type of data source. In this case it’s an Athena one, so you have workGroup.
Note that there is no reference to the AWS account ID or Region. This information is taken from the account and the AWS Region from which you call the import API.
Then you will have a Data Set created from that Data Source.
DataSet:
{
  "resourceType": "dataset",
  "dataSetId": "abcdefghi-1234-0987-abcd-01234abcdef987",
  "name": "DataSet-Name",
  "physicalTableMap": {
    "another-not-useful-id": {
      "relationalTable": {
        "dataSourceArn": "arn:aws:quicksight:my-aws-region:my-aws-account-id:datasource/123ab456-a1bc-12a3-abc1-1234a567b890",
        "catalog": "AwsDataCatalog",
        "schema": "my-athena-database",
        "name": "my-athena-table",
        "inputColumns": [
          ... (the rest is omitted; it stays the same if the columns are the same, etc.)
Here you have this information:
- dataSetId: you can leave it as it is. This is the asset you’re creating now, and it can have the same ID it has in another account.
- name: same as before, it’s just the QuickSight name of the dataset.
- another-not-useful-id: you can leave it as it is.
- dataSourceArn: this is the important part.
  - if you created the data source with APIs (the previous code), and so haven’t changed the data source ID, you can leave everything as it is. The AWS account ID and Region will be changed according to the account and Region from which you call the import API.
  - if you created a new data source manually, then you have to retrieve that data source ARN and change it here accordingly!
- catalog, schema and name are Athena-specific parameters, so they will differ for another data source type. In any case, they also have to be changed if your data don’t follow the same structure as in the source.
That being said, I suggest using APIs as much as possible; it will make managing the IDs easier.
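For example, creating the data source through the API with a pinned ID could look like this in Python with boto3. This is a sketch: the account ID and names are placeholders, and pinning DataSourceId is what lets the exported ARNs line up across accounts on import.

```python
def athena_data_source_request(account_id, data_source_id, name, workgroup="primary"):
    """Build kwargs for quicksight.create_data_source with a pinned DataSourceId,
    so the same ID exists in dev, qa, stage, etc. (placeholder values)."""
    return {
        "AwsAccountId": account_id,
        "DataSourceId": data_source_id,   # same fixed ID in every environment
        "Name": name,
        "Type": "ATHENA",
        "DataSourceParameters": {"AthenaParameters": {"WorkGroup": workgroup}},
        "SslProperties": {"DisableSsl": False},
    }

if __name__ == "__main__":
    import boto3  # AWS SDK for Python

    qs = boto3.client("quicksight", region_name="us-east-1")
    qs.create_data_source(**athena_data_source_request(
        "111122223333",                           # target account -- placeholder
        "123ab456-a1bc-12a3-abc1-1234a567b890",   # same ID as the source account
        "DataSource-Name",
    ))
```
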
Let me know if this helps!
Andrea
@andrepgn When I exported the bundle using the AWS CLI, the JSON contains the account and Region, and it is not getting overridden even after passing --override-parameters. Are you using the AWS CLI or some SDKs to export the assets? See below the exported data source asset:
{
  "resourceType": "datasource",
  "dataSourceId": "eb0ce097-629a-4d58-8220-e4fc5eb5c76c",
  "name": "Infrastructure",
  "type": "SQLSERVER",
  "dataSourceParameters": {
    "sqlServerParameters": {
      "host": "",
      "port": 8443,
      "database": ""
    }
  },
  "vpcConnectionProperties": {
    "vpcConnectionArn": "arn:aws:quicksight:us-east-1:54****:vpcConnection/ecd430e1-3b6c-406a-a0a3-f7b69ab6efa8"
  },
  "sslProperties": {
    "disableSsl": false
  }
}