Stored Procedures - Feature Request

What are the chances of adding the ability to execute stored procedures in custom SQL? Oftentimes regular custom SQL can't be optimized for MS SQL Server and results in really slow refreshes.

Thanks!

Hi @jbuford04 - Thanks for the question. Can you please give some details on your use case? Is there any specific reason you want to run stored procedures via QuickSight? One option is to create an ETL job that populates a target table used for your reporting; using SPICE can also help in terms of improving performance.

We should not use QuickSight as an ETL or ELT tool but for analytics. Let's also take feedback from other community experts.

Hi @David_Wong @Naveed @thomask @Max @SD_QS @Ramon_Lopez - Kindly provide your expert advice on this.

Regards - Sanjeeb

Not trying to solve an ETL problem with this, but there are really three things I want to do with it:

  1. Other analytics tools (Power BI, Tableau, etc.) allow for the execution of a sproc to bring data into their equivalent of a dataset. My request is just to bring QuickSight into parity with those tools.

  2. We're currently executing sprocs to fill tables and then running select * from table in many of our datasets today. The problem is that the sproc generally runs asynchronously from the dataset refresh, so if a sproc fails the QS dataset refresh subsequently fails as well, and there are (at least) two places to check to figure out whether the problem is the sproc or the refresh.

  3. Sprocs can be optimized much better than queries and subqueries when we're pulling from multiple tables or databases within a connection (indexing, etc.), and SQL Server is much better designed to process complex SQL than the way QuickSight sends a query back to our server. I would much prefer to send just "EXEC sproc(@param)" rather than a very complicated and often sub-optimal query with sub-queries (see the sketch below). We'd also get better source control by keeping our custom SQL in SQL Server this way.

In short, I think sprocs are just a much more efficient way of bringing a dataset into QuickSight than the current custom SQL.
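For illustration, here's roughly what I'd like to be able to paste into the custom SQL editor instead of a big nested query (the procedure and parameter names below are made up):

```sql
-- Hypothetical example (names are illustrative only): what a dataset's
-- custom SQL could look like if stored procedure execution were supported.
EXEC dbo.usp_SalesSummaryForReporting
    @RegionId = 42,
    @AsOfDate = '2023-05-01';
```

Today, getting the same result means either a multi-join custom SQL statement or the fill-a-table workaround from point 2.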

Thank you @jbuford04 for the details. Tomorrow we have a Monthly Q&A Session. If possible please join, and this is something we can discuss with the AWS QuickSight team.

To register for the Q&A session, please find the link -

On the event page, go to Thu May 25: Monthly Live Q&A Series for Amazon QuickSight, with Expert Solution Architects | Fourth Thursday of the month | 12 Noon PST

Regards - Sanjeeb

I can’t attend tomorrow’s, but I’ll register for an upcoming session

No problem @jbuford04. I will request this feature on your behalf :slight_smile:

Regards - Sanjeeb

+1 :slight_smile: I think this would complement the recent feature of Dataset Parameters really well.

@jbuford04 I don't know if you've tried this or if it works in your case, but if you are using MSSQL you could try OPENQUERY with 'EXEC YourStoredProcedure' as the query argument. Failing that, you could also try putting all of that in a view on the server side and then querying the view from QuickSight (but you won't be able to pass parameters that way).
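A rough sketch of both ideas, assuming a linked server is already configured (here a made-up name, LOCALSRV, e.g. a loopback to the same instance) and the sproc returns a single result set; all other names are placeholders too:

```sql
-- Idea 1: call the sproc through OPENQUERY (needs a linked server, LOCALSRV here).
SELECT *
FROM OPENQUERY(LOCALSRV, 'EXEC dbo.YourStoredProcedure @SomeParam = 1');
GO

-- Idea 2: wrap the same call in a view so QuickSight only has to select from it
-- (no way to pass parameters at query time with this approach).
CREATE VIEW dbo.vw_YourStoredProcedureResults
AS
SELECT *
FROM OPENQUERY(LOCALSRV, 'EXEC dbo.YourStoredProcedure @SomeParam = 1');
```

With the view in place, the custom SQL in QuickSight stays as simple as SELECT * FROM dbo.vw_YourStoredProcedureResults.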

Hadn't tried OPENQUERY yet, but I'll do some investigation; we've done some work with views to pull in a dataset, just finding wildly variable execution times for a view compared to just selecting from a table. Good input though!

hi @jbuford04

In lieu of the ability to run a stored procedure in custom SQL, what about running a script that triggers the SP to load a table and then, once done, triggers QS to import via a select *?
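Something along these lines (all names are placeholders; the "trigger QS" step would be whatever orchestration you use plus a QuickSight dataset ingestion call, e.g. the CreateIngestion API, once the load succeeds):

```sql
-- Placeholder names throughout. The orchestration layer (SQL Agent job, Lambda,
-- etc.) would run this first and start the QuickSight ingestion only on success.
TRUNCATE TABLE dbo.rpt_SalesSummary;

EXEC dbo.usp_LoadSalesSummary @AsOfDate = '2023-05-01';

-- The QuickSight dataset's custom SQL then stays trivial:
-- SELECT * FROM dbo.rpt_SalesSummary;
```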

Would this work?

thanks!

That’s what we’re doing today, and it works most of the time; would just prefer to be able to exec natively like other tools
