Dataset size in SPICE is larger than the size of the dataset file on disk

Hi Team,

I have a CSV with around 20 records in it; the size of the file is 69 KB. When I upload this file to SPICE, the size shown is 230 KB.
Similarly, I have a CSV file of size 3.1 MB; when uploaded to SPICE it shows a size of 8.4 MB, nearly 3 times the actual file size on disk. Could someone explain how and why the size increases, and what can be done to address this difference?

The size in SPICE will differ from the CSV because the storage format is different.
This page shows how SPICE calculates data size: amazon-quicksight-user-guide/managing-spice-capacity.md at main · awsdocs/amazon-quicksight-user-guide · GitHub

2 Likes

Thank you @royyung for sharing the link and letting us know how the SPICE calculation differs.

1 Like

About the size allocated in SPICE, I have a question.

How does the following formula work when fields aren't always populated? Will SPICE allocate the full calculated size even when fields are empty or only partially filled (e.g., for a specific record, a Text Field 255 contains 'hello', which is only 5 characters long)?

Total logical row size in bytes =
(Number of Numeric Fields * 8 bytes per field)
+ (Number of Date Fields * 8 bytes per field)
+ (Number of Text Fields * (8 bytes + UTF-8 encoded character length per field))

Total bytes of data = Number of rows * Total logical row size in bytes

GB of SPICE Capacity Needed = Total bytes of data / 1,073,741,824
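To make the formula concrete, here is a small sketch that computes the estimated SPICE capacity from the quoted formula. The field counts and text lengths in the example are hypothetical, not taken from the CSVs discussed above:

```python
# Estimate SPICE capacity from the logical-row-size formula quoted in this
# thread: 8 bytes per numeric field, 8 bytes per date field, and
# (8 bytes + UTF-8 character length) per text field.

def spice_capacity_gb(num_rows, numeric_fields, date_fields, text_field_char_lengths):
    """text_field_char_lengths: one UTF-8 encoded character length per text field."""
    row_bytes = (
        numeric_fields * 8
        + date_fields * 8
        + sum(8 + length for length in text_field_char_lengths)
    )
    total_bytes = num_rows * row_bytes
    return total_bytes / 1_073_741_824  # bytes per GB (2**30)

# Hypothetical example: 1M rows, 3 numeric fields, 1 date field,
# and two text fields of 20 and 50 characters each.
# Row size = 3*8 + 1*8 + (8+20) + (8+50) = 118 bytes -> ~0.11 GB total.
print(spice_capacity_gb(1_000_000, 3, 1, [20, 50]))
```

Note this estimates the logical size per row as stored, which is why a compact CSV on disk can grow several times larger once imported into SPICE.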