Team, could you please share some insights on how long it takes to load a dataset of about a billion rows, and how to optimize the process?

We are going to work with close to 1 billion rows. Incremental updates of the dataset, keyed on a timestamp column, can handle the in-between updates.

Considering that we need, at any point in time, the data from the last 12 months: would QuickSight delete older rows (based on the timestamp column) once the dataset exceeds its 1-billion-row quota? Or do we always have to run full dataset refreshes to keep only 12 months of data (which we know is close to, but always less than, 1 billion rows)?




Hi @Techatro! For additional details on how incremental refresh works, please refer to the video below.
In addition to incremental refresh, you may need to run a full refresh periodically, e.g. once a week or once every 3 days (depending on how much data you add per incremental refresh), so that the dataset does not grow beyond the last 12 months of data.
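To make the mechanics concrete, here is a minimal Python sketch of that setup using the QuickSight API via boto3. The idea: incremental refresh only re-ingests rows inside a lookback window, so it never deletes data older than 12 months; a scheduled full refresh (whose SQL/query keeps only the last 12 months) is what actually trims the dataset. The account ID, dataset ID, and column name below are hypothetical placeholders, and the live AWS calls are shown as comments since they require credentials.

```python
# Sketch of a rolling-window refresh setup for a QuickSight SPICE dataset.
# Incremental refresh re-ingests only rows inside the lookback window;
# periodic full refreshes are still needed to drop rows older than 12 months.

def build_refresh_properties(timestamp_column: str) -> dict:
    """Payload for quicksight put_data_set_refresh_properties.

    The lookback window tells QuickSight which slice of recent rows
    (by the given timestamp column) to re-ingest on each incremental run.
    """
    return {
        "RefreshConfiguration": {
            "IncrementalRefresh": {
                "LookbackWindow": {
                    "ColumnName": timestamp_column,
                    "Size": 24,          # re-ingest the last 24 hours each run
                    "SizeUnit": "HOUR",  # HOUR | DAY | WEEK
                }
            }
        }
    }

props = build_refresh_properties("event_timestamp")
window = props["RefreshConfiguration"]["IncrementalRefresh"]["LookbackWindow"]
print(window["ColumnName"], window["Size"], window["SizeUnit"])

# The actual API calls (not executed here; IDs are placeholders) would be roughly:
# import boto3
# qs = boto3.client("quicksight")
# qs.put_data_set_refresh_properties(
#     AwsAccountId="123456789012",
#     DataSetId="my-dataset-id",
#     DataSetRefreshProperties=props,
# )
# # Weekly full refresh to enforce the 12-month window:
# qs.create_ingestion(
#     AwsAccountId="123456789012",
#     DataSetId="my-dataset-id",
#     IngestionId="weekly-full-refresh",
#     IngestionType="FULL_REFRESH",
# )
```

The key design point: the lookback window handles late-arriving updates to recent rows, while the 12-month cutoff lives in the dataset's source query and only takes effect on a full refresh.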

