Deploy dataset creation from GitHub

I would like to store my custom SQL queries in GitHub and have QuickSight use them to create datasets from a PostgreSQL source. Is this something others are interested in too, or that the QuickSight product team has planned?

This is something my CTO and dev team have been requesting for over a year, because we want to centralise all our code in GitHub to:

  • Maintain versioning for our main dashboards that doesn’t rely on QuickSight
  • Have a backup
  • Improve transparency across our company (non-authors sometimes need to view the query code, and it’s tedious to share)

Current workaround
When I update a dashboard query in QuickSight, I manually copy the change over to a GitHub repo that contains all the queries. Nothing ensures the query is up to date or free of mistakes, since there is no connection between the two, and I could easily forget to do the update.

Thanks!


Hi @rachel - Thanks for posting this question. This requirement needs a dedicated solution discussion, as you would have to set up a CI/CD pipeline. At a very high level, you can store the scripts in GitHub → integrate with S3 to store them → execute a Lambda function that updates the dataset via boto3. That said, a detailed POC is required.

Tagging @Max @AnwarAli for their expert advice.

Regards - Sanjeeb


Hi Rachel,

Yes, as Sanjeeb says, you can use the QuickSight APIs to create both data sources and datasets via the AWS APIs. See the Python SDK docs for examples - create_data_source and create_data_set.

You can also create data sources and datasets using CloudFormation - see details of the QuickSight CloudFormation support here. This blog post shows an example using Redshift as the data source.
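For the CloudFormation route, a dataset with a custom SQL query looks roughly like the fragment below. The ARN, IDs, query, and columns are placeholders; this is a sketch of the `AWS::QuickSight::DataSet` resource, not a complete template.

```yaml
Resources:
  SalesDataSet:
    Type: AWS::QuickSight::DataSet
    Properties:
      AwsAccountId: !Ref AWS::AccountId
      DataSetId: sales
      Name: sales
      ImportMode: DIRECT_QUERY
      PhysicalTableMap:
        main:
          CustomSql:
            # Placeholder ARN of an existing PostgreSQL data source.
            DataSourceArn: arn:aws:quicksight:eu-west-1:123456789012:datasource/my-postgres
            Name: sales
            SqlQuery: SELECT id, amount FROM sales
            Columns:
              - Name: id
                Type: INTEGER
              - Name: amount
                Type: DECIMAL
```

Keeping templates like this in the same GitHub repo as the SQL would let a standard CloudFormation deploy step handle the sync instead of a custom Lambda.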

Let us know if that helps.

Many thanks


Thanks, @robkc and @Sanjeeb2022 for offering your solutions!

@rachel Please help the community out by marking the reply that resolved your question as the Solution (click the check box under the reply).
