NUMBER_BITWIDTH_TOO_LARGE for column

Hello,

When ingesting data into SPICE, some rows are skipped, and when I download the error file I see the error NUMBER_BITWIDTH_TOO_LARGE. The column for which this error occurs has the value -4644009008843610. The data type of this column is Decimal in the SPICE dataset.

How can I solve this?

Please advise,

Thanks.

Hi @harpar1808 ,

The NUMBER_BITWIDTH_TOO_LARGE error occurs because your Decimal column can't accommodate the 16-digit value. From some research, I believe the fixed Decimal format in SPICE uses Decimal(18,4), which reserves 4 digits for decimal places, leaving only 14 digits for the integer portion.
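To see why a 16-digit value overflows that format, here is a quick sanity check (a Python sketch; the Decimal(18,4) layout and the 14-digit integer budget are the assumptions stated above, not something I pulled from SPICE internals):

```python
# The failing value from the error file.
value = -4644009008843610

# Decimal(18,4): 18 significant digits total, 4 reserved for the
# fractional part, leaving 14 for the integer portion.
TOTAL_DIGITS = 18
SCALE = 4
INTEGER_DIGITS = TOTAL_DIGITS - SCALE  # 14

digits_needed = len(str(abs(value)))  # 16
print(f"integer digits needed: {digits_needed}, available: {INTEGER_DIGITS}")
print("fits:", digits_needed <= INTEGER_DIGITS)  # False -> row gets skipped
```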

A potential solution could be to change the data type of your field. Converting the column to the String data type in your dataset settings could be a way around the digit limit.

If you need to perform numeric operations on this column, try decimal-float instead, which handles larger numeric ranges more flexibly. Alternatively, if you go the String route, you could create a calculated field at the analysis layer that converts the field back to an integer or decimal data type.

hello @JacobR ,

Per the page below, decimal-fixed supports 18 digits:

Also, if I change the type to decimal-float, will the ingestion succeed but the values just be approximations?
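For what it's worth, if decimal-float is backed by a 64-bit IEEE 754 double (an assumption on my part; I haven't seen SPICE's internal representation documented), then every integer with magnitude up to 2^53 is stored exactly, and this particular 16-digit value sits below that threshold. A quick Python check:

```python
value = -4644009008843610

# Assumption: decimal-float behaves like an IEEE 754 double, which
# represents every integer with magnitude up to 2**53 exactly.
EXACT_LIMIT = 2 ** 53  # 9007199254740992

as_float = float(value)
print("within exact range:", abs(value) <= EXACT_LIMIT)  # True
print("round-trips exactly:", int(as_float) == value)    # True

# Beyond 2**53, adjacent integers collapse to the same double,
# which is where approximation actually starts:
print(float(2 ** 53) == float(2 ** 53 + 1))  # True
```

So under that assumption, values larger than 2^53 (or with long fractional parts) would be approximated, but this one would survive intact.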

Please let me know.

Thanks.

Hello @JacobR ,

I have 200+ fields in the SPICE dataset. Do I need to manually change their data types from decimal-fixed to decimal-float, or can I do it in bulk somehow?

Please advise.

Thanks.

Hi @harpar1808 ,

Real quick: which data prep experience are you currently using for this dataset? From personal experience, the new data prep experience can sometimes behave unexpectedly, as the Quick team is still working on it. If you are currently using the new data prep experience, I would try switching to the old one to see if that fixes it.

If that doesn't work, your other options are to see whether switching to decimal-float gives you some extra wiggle room in the number of digits allowed, or to submit a support ticket. I suggest a support ticket because the documentation states that decimal-fixed should support up to 18 digits, yet you are getting this error with only 16. To submit a support ticket, please refer to this resource (Case management - AWS Support).

And to answer your latest question: there is currently no bulk way to change data types within Quick.

I am using the old data prep. As for the new data prep, I'm not even sure why it was pushed out, as it's half-baked (e.g., incremental refreshes are not supported). If software is released, it should at least be feature-wise backward compatible. Thanks, I will submit a support ticket for the same.