I have a few datasets in my QuickSight account (ranging from 10 MB to 10 GB) that are used for WBR reports. Since these datasets are incremental, their size increases every week. This has caused the data to exceed SPICE capacity and refreshes to fail multiple times (which was fixed by purchasing additional SPICE capacity).
Could you please suggest how I can manage the SPICE capacity of my account as the datasets keep growing? Additional details for your reference -
Whenever the datasets are refreshed, the additional data is stored in the underlying SPICE capacity, so when the available SPICE capacity is not sufficient you run into refresh failures. To avoid this, you can turn on auto-purchase of SPICE capacity, and the system will then allocate more capacity according to your configured settings.
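If it helps to keep track of where the capacity is going, the consumed SPICE size is exposed per dataset through the QuickSight API. Below is a rough Python/boto3 sketch (not an official tool) that lists the SPICE usage of each dataset; the account ID and region are placeholders, and some dataset types (for example file uploads) cannot be described through the API, so they are skipped.

```python
import boto3

ACCOUNT_ID = "111122223333"  # placeholder AWS account ID
qs = boto3.client("quicksight", region_name="us-east-1")  # placeholder region


def spice_usage_by_dataset(account_id):
    """Return {dataset name: consumed SPICE bytes} for SPICE datasets."""
    usage = {}
    paginator = qs.get_paginator("list_data_sets")
    for page in paginator.paginate(AwsAccountId=account_id):
        for summary in page["DataSetSummaries"]:
            if summary.get("ImportMode") != "SPICE":
                continue  # direct-query datasets don't consume SPICE
            try:
                detail = qs.describe_data_set(
                    AwsAccountId=account_id, DataSetId=summary["DataSetId"]
                )["DataSet"]
            except qs.exceptions.InvalidParameterValueException:
                # Some dataset types (e.g. file uploads) can't be described via the API
                continue
            usage[detail["Name"]] = detail.get("ConsumedSpiceCapacityInBytes", 0)
    return usage


if __name__ == "__main__":
    for name, consumed in sorted(
        spice_usage_by_dataset(ACCOUNT_ID).items(), key=lambda kv: kv[1], reverse=True
    ):
        print(f"{name}: {consumed / 1024 ** 3:.2f} GiB")
```

Watching these numbers week over week should tell you which WBR datasets are driving the growth and when you are getting close to your purchased capacity.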
Additionally, I recommend that you do not load all of the table data into SPICE. First, in terms of columns, load only the data that is actually used for visualization in your analyses and dashboards.
Second, apply filters while loading data into SPICE, for instance only the last one year of data… and keep a secondary pre-aggregated dataset for the historical data, which reduces the volume of data. When the user interacts with the data, you can always bring in the filtered, drilled-down data to show on a dashboard using direct query mode.
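To make both suggestions concrete, here is a hedged sketch of creating a slimmed-down SPICE dataset through the API with custom SQL that keeps only the columns the dashboard uses and only the last year of rows. The data source ARN, dataset ID, table and column names, and the SQL dialect are all assumptions for illustration; you can just as well paste the same query into the "Use custom SQL" option in the dataset editor.

```python
import boto3

qs = boto3.client("quicksight", region_name="us-east-1")  # placeholder region
ACCOUNT_ID = "111122223333"  # placeholder AWS account ID
DATA_SOURCE_ARN = (
    "arn:aws:quicksight:us-east-1:111122223333:datasource/wbr-source"  # placeholder
)

# Only the columns the dashboard actually uses, restricted to the last year.
# The date arithmetic syntax depends on your source database.
TRIMMED_SQL = """
    SELECT order_date, region, product_line, revenue
    FROM   wbr_orders
    WHERE  order_date >= CURRENT_DATE - INTERVAL '1' YEAR
"""

qs.create_data_set(
    AwsAccountId=ACCOUNT_ID,
    DataSetId="wbr-orders-trimmed",          # placeholder dataset ID
    Name="WBR orders (last 12 months)",
    ImportMode="SPICE",                      # keep only the trimmed data in SPICE
    PhysicalTableMap={
        "trimmed": {
            "CustomSql": {
                "DataSourceArn": DATA_SOURCE_ARN,
                "Name": "wbr_orders_trimmed",
                "SqlQuery": TRIMMED_SQL,
                "Columns": [                 # declare the projected columns and types
                    {"Name": "order_date", "Type": "DATETIME"},
                    {"Name": "region", "Type": "STRING"},
                    {"Name": "product_line", "Type": "STRING"},
                    {"Name": "revenue", "Type": "DECIMAL"},
                ],
            }
        }
    },
)
```

The pre-aggregated historical dataset can be defined the same way (for example with a GROUP BY down to week level), and the drill-down dataset can use ImportMode="DIRECT_QUERY" so that it does not consume any SPICE capacity at all.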
Please let us know if there are any other concerns or questions and we will be glad to assist you further.
Hi @Srikanth_Baheti, I need your advice on refresh schedules for datasets as well. I currently have these SPICE datasets on a weekly full-refresh schedule. When I refresh a dataset, is the old snapshot of the dataset deleted, or is the new data just appended to it?
If old snapshots take up SPICE capacity, can they be deleted as well?
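For context, the weekly full-refresh schedule I am describing would look roughly like this if expressed through the boto3 API (purely illustrative; the account ID, dataset ID, schedule ID, and timing are placeholders):

```python
import boto3

qs = boto3.client("quicksight", region_name="us-east-1")  # placeholder region

qs.create_refresh_schedule(
    AwsAccountId="111122223333",            # placeholder account ID
    DataSetId="wbr-orders-trimmed",         # placeholder dataset ID
    Schedule={
        "ScheduleId": "weekly-full-refresh",
        "RefreshType": "FULL_REFRESH",      # full reload of the SPICE dataset
        "ScheduleFrequency": {
            "Interval": "WEEKLY",
            "RefreshOnDay": {"DayOfWeek": "MONDAY"},
            "TimeOfTheDay": "06:00",
            "Timezone": "America/Los_Angeles",
        },
    },
)
```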