QuickSight (SPICE run timeout) - Importing data from Snowflake

Hi Team,

I’m encountering a timeout error while importing data from Snowflake into Amazon QuickSight (SPICE). Here’s the context:

  • I’m working with a table in Snowflake that has around 30-40 columns and approximately 5 million records.
  • I’m using a “custom SQL query” in QuickSight: SELECT * FROM db.schema.tablename.
  • Despite my dataset being within SPICE limitations, the import process times out.

Has anyone else experienced this issue? Any suggestions on what might be going wrong and how I can overcome this would be greatly appreciated.
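In case it helps with debugging, this is roughly how I pull the ingestion status and the underlying error via the QuickSight API (a minimal sketch using boto3; the account and dataset IDs below are placeholders):

```python
import boto3

# Minimal sketch: list recent SPICE ingestions for the dataset and print
# status plus error details. Account and dataset IDs are placeholders.
qs = boto3.client("quicksight", region_name="us-east-1")

resp = qs.list_ingestions(
    AwsAccountId="123456789012",   # placeholder AWS account ID
    DataSetId="my-dataset-id",     # placeholder QuickSight dataset ID
    MaxResults=5,
)
for ingestion in resp["Ingestions"]:
    print(ingestion["IngestionId"], ingestion["IngestionStatus"])
    # Failed ingestions carry an ErrorInfo block with a Type
    # (e.g. SQL_EXCEPTION) and a Message describing the failure.
    if "ErrorInfo" in ingestion:
        err = ingestion["ErrorInfo"]
        print("  ", err.get("Type"), "-", err.get("Message"))
```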

Hello @jatin, welcome to the QuickSight community!

There are a few things that could be causing this issue in QuickSight. First of all, due to the data breaches Snowflake experienced recently, they started requiring MFA for users. The ingestion could be failing because MFA is not set up correctly for the QuickSight connection.

Also, when you are viewing the dataset in the edit dataset menu and hit Apply after setting up your custom SQL, does any of the data from Snowflake load in that menu, or does it time out?

Lastly, if your Snowflake account is provisioned with too few credits and compute limits that are too low, the query may simply be taking too long. That could cause QuickSight to fail the ingestion once it exceeds the internal time limit. You may need to consider increasing your credits within Snowflake if that is the case.
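To rule that out, you can check how long QuickSight's queries are actually taking on the Snowflake side. A rough sketch using snowflake-connector-python (all connection parameters below are placeholders):

```python
import snowflake.connector

# Sketch: check how long recent queries took on the warehouse QuickSight
# uses. All connection parameters below are placeholders.
conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="YOUR_WAREHOUSE",
    database="YOUR_DATABASE",
)
cur = conn.cursor()
cur.execute("""
    SELECT query_text, execution_status, total_elapsed_time / 1000 AS elapsed_s
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 100))
    ORDER BY start_time DESC
""")
for query_text, status, elapsed_s in cur.fetchall():
    print(f"{status:>10} {elapsed_s:>8.1f}s  {query_text[:80]}")
```

If the queries themselves finish quickly, warehouse sizing is probably not the bottleneck.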

These are my initial thoughts as to why you are experiencing this issue. Please let me know if this helps!

Hey @DylanM ,

No, actually I don’t think it is a low-credits issue, because we have plenty of credits available in Snowflake.

I tried importing 100 records into SPICE and it worked, but when I do the same for larger data, anything more than 1k records, it fails with the error shown in the attached screenshot.

What are your views on that?
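For reference, here is the kind of loop I can use to time the same query at increasing row limits directly against Snowflake, to separate a slow query from a SPICE-side failure (a sketch; credentials and the table name are placeholders):

```python
import time
import snowflake.connector

# Sketch: time SELECT ... LIMIT n directly against Snowflake for growing n,
# to check whether the query itself slows down or only the SPICE import
# fails. Connection parameters and the table name are placeholders.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT", warehouse="YOUR_WAREHOUSE",
    database="DB_NAME", schema="SCHEMA_NAME",
)
cur = conn.cursor()

for limit in (100, 1_000, 10_000, 100_000, 1_000_000):
    start = time.time()
    cur.execute(f"SELECT * FROM db_name.schema_name.table_name LIMIT {limit}")
    rows = cur.fetchall()
    print(f"LIMIT {limit:>9}: {len(rows)} rows in {time.time() - start:.1f}s")
```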

Hey @DylanM ,

One more thing: the process works fine using Direct Query, but not with SPICE, and I need the data in SPICE.

@jatin ,

This blog has the details for when you want to store the data in SPICE (Establish private connectivity between Amazon QuickSight and Snowflake using AWS PrivateLink | AWS Big Data Blog). QuickSight will also need access to the S3 bucket that gets configured under Manage QuickSight > Security & Permissions > QuickSight access to AWS services > Amazon S3.
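If you want to sanity-check that role's S3 access programmatically, IAM's policy simulator is one option. A sketch, assuming the default service role name aws-quicksight-service-role-v0 and a placeholder bucket name:

```python
import boto3

# Sketch: simulate whether the QuickSight service role can read/write the
# staging bucket. The role name below is the default created by QuickSight
# (yours may differ), and the bucket name is a placeholder.
iam = boto3.client("iam")
role_arn = iam.get_role(RoleName="aws-quicksight-service-role-v0")["Role"]["Arn"]

resp = iam.simulate_principal_policy(
    PolicySourceArn=role_arn,
    ActionNames=["s3:GetObject", "s3:PutObject"],
    ResourceArns=["arn:aws:s3:::your-staging-bucket/*"],
)
for result in resp["EvaluationResults"]:
    print(result["EvalActionName"], "->", result["EvalDecision"])
```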

Kind regards,
Koushik

Hey @Koushik_Muthanna and @DylanM ,

I have checked, and we have all the required connectivity between Snowflake and QuickSight, but the problem still persists. Could it be due to a small cluster size, i.e. the Extra Small (XS) single cluster in Snowflake?
Or what do you think the possible reasons for this “SQL_EXCEPTION” error could be?

@jatin ,

I cannot say definitively; check your Snowflake logs to see why the queries are failing.
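For example, something along these lines pulls recent failures from ACCOUNT_USAGE (a sketch; it needs privileges on the SNOWFLAKE database, the view can lag behind real time by up to ~45 minutes, and connection details are placeholders):

```python
import snowflake.connector

# Sketch: look for failed queries in ACCOUNT_USAGE.QUERY_HISTORY.
# Connection parameters are placeholders.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
)
cur = conn.cursor()
cur.execute("""
    SELECT start_time, error_code, error_message, query_text
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE execution_status ILIKE 'fail%'
      AND start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY start_time DESC
    LIMIT 20
""")
for row in cur.fetchall():
    print(row)
```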

@Koushik_Muthanna ,

I have checked the logs. Queries are succeeding in the Snowflake logs and returning results, but in QuickSight (SPICE) the data is not loading.

Here are the process and verification steps I have taken so far, for more clarity:

  • Table in Snowflake with 40 columns and 5 million records.
  • I am creating a new dataset in QuickSight (SPICE) using a custom SQL query: SELECT * FROM db_name.schema_name.table_name.
  • The custom SQL query executes successfully in Snowflake, as confirmed by the logs.
  • QuickSight SPICE fails to load the data and displays an error (see the earlier attached error screenshot).
  • When I run the custom SQL query with a limit of 100 rows: SELECT * FROM db_name.schema_name.table_name LIMIT 100, it executes perfectly, and the data loads successfully in SPICE.
  • But when I tried loading the data for a specific month, or limiting the query to 1,000 records, it failed.
  • Verified that internal staging in Snowflake is enabled.
  • Ensured QuickSight has a connection to S3 storage and the necessary permissions.
  • Confirmed that the SPICE quota for storage and data limits is not exceeded (see the sketch after this list for an API check).
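Here is the sketch mentioned above, confirming the dataset's import mode and current SPICE consumption via the API (the account and dataset IDs are placeholders):

```python
import boto3

# Sketch: confirm the dataset's import mode and current SPICE consumption.
# Account and dataset IDs are placeholders.
qs = boto3.client("quicksight", region_name="us-east-1")

resp = qs.describe_data_set(
    AwsAccountId="123456789012", DataSetId="my-dataset-id"
)
dataset = resp["DataSet"]
print("ImportMode:", dataset["ImportMode"])  # should be SPICE
print("Consumed SPICE capacity (GB):",
      dataset["ConsumedSpiceCapacityInBytes"] / 1024**3)
```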

Is there anything I am missing here, or anything I should double-check?

@jatin ,

You could test by changing the cluster size. If that does not resolve the issue, I suggest you open a support case to debug this further. Here are the steps to open a support case so that we can help you further: Creating support cases and case management - AWS Support. If your company has an internal IT team, you might not have direct access to AWS Support and will need to raise an internal ticket with your IT team. They’ll open an AWS Support case on your behalf.
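If you want to test the cluster-size theory quickly, resizing the warehouse is a one-line change (a sketch; the warehouse name, target size, and credentials are placeholders, and larger sizes consume more credits):

```python
import snowflake.connector

# Sketch: temporarily resize the warehouse QuickSight connects through.
# Warehouse name, target size, and credentials are placeholders; remember
# to size back down afterwards, since larger warehouses burn more credits.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT",
)
conn.cursor().execute(
    "ALTER WAREHOUSE your_warehouse SET WAREHOUSE_SIZE = 'MEDIUM'"
)
```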

Are you using Snowflake or Athena as the connector in QuickSight?
If using Athena, did you grant QuickSight access to the S3 spill bucket?
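If it is the Athena federated connector, the configured spill bucket can be read off the connector's Lambda function (a sketch; the function name is a placeholder, and spill_bucket is the conventional environment variable for connectors deployed from the Serverless Application Repository):

```python
import boto3

# Sketch: read the spill bucket configured on the Athena federated
# connector's Lambda function. The function name is a placeholder --
# use the name of your own Snowflake connector Lambda.
lam = boto3.client("lambda")
cfg = lam.get_function_configuration(FunctionName="your-snowflake-connector")
env = cfg.get("Environment", {}).get("Variables", {})
print("Spill bucket:", env.get("spill_bucket"))
```

Whatever bucket that prints is the one QuickSight needs access to under Manage QuickSight > Security & Permissions.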

Hello @jatin, I do believe @Koushik_Muthanna’s suggestion to submit a support ticket will be the best course of action for resolving this issue. Since we have not heard back from you with any remaining questions, I will archive this topic.

If you still need assistance with this issue, please post a new topic and include a link to this question for relevant background. That will ensure you are at the top of the priority list for a response from one of our QuickSight experts. Thank you!