I am seeing a consistent error on these visuals: they keep timing out because the calculation takes too long to load, both on the front end and in the backend analysis. Unfortunately, due to the nature of the data, I need parameterized calculations applied to the fields in the visuals, so everything must be calculated in real time. Does anyone have experience working around this issue?
Hi @grazhang -
I’m sure the community can help give you some ideas to tune your visuals. To help you, we’ll need to understand a bit more about your configuration.
1. What is your data source type, DirectQuery or SPICE? If DirectQuery, what is the underlying data source?
2. What visual types (pie chart, etc.), calculated fields, filters, and conditional statements do you have in those visuals?
3. How large is your data, i.e. how many rows?
Answering these questions would be a great start.
Hi @robdhondt, thanks for your reply.
- I am using SPICE as the data source
- These visuals are pivot tables that calculate the top solutions/countries/partners etc. based on a sum of revenue. The first column in the table is controlled by what the user selects in a parameter dropdown (solutions vs. countries vs. partners vs. customers), and the second column calculates revenue based on a date parameter as well as a date-type parameter (launch date, create date, qualified date, etc.). In addition, the revenue calculation is deduped to avoid double counting, which involves a countOver calculation: {opp_annualized_amt}/countOver({opp_id},[{opp_id}],PRE_AGG). My guess is that all of these parameter-dependent calculated fields are slowing down the visuals and making them time out. Unfortunately, the parameters are necessary so that users can customize their view.
- My data is quite large at 130,803,855 rows
Hope this helps - thanks!
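To make the dedup concrete, here is a minimal Python sketch (with hypothetical sample data) of what that countOver expression computes at the PRE_AGG level: each row's amount is divided by the number of rows sharing its opp_id, so a fanned-out opportunity sums back to its original amount.

```python
from collections import Counter

# Hypothetical sample: opp_id 2 fans out to two rows after a join,
# so naively summing opp_annualized_amt would double-count it.
rows = [
    {"opp_id": 1, "opp_annualized_amt": 100.0},
    {"opp_id": 2, "opp_annualized_amt": 50.0},
    {"opp_id": 2, "opp_annualized_amt": 50.0},  # duplicate from the join
]

# countOver({opp_id}, [{opp_id}], PRE_AGG): row count per opp_id partition
counts = Counter(r["opp_id"] for r in rows)

# {opp_annualized_amt} / countOver(...): each duplicate carries its share,
# so summing across rows counts each opportunity exactly once
deduped_total = sum(r["opp_annualized_amt"] / counts[r["opp_id"]] for r in rows)

print(deduped_total)  # 150.0: opp 2 contributes 50, not 100
```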
I am currently exploring solutions myself for almost exactly the same situation. Any guidance would be appreciated.
Hi @Gardner,
I’m not sure this LAC-W calculation is correct and believe it could be causing excessive query costs:
{opp_annualized_amt} / countOver({opp_id}, [{opp_id}], PRE_AGG)
This is equivalent to the SQL expression:
{opp_annualized_amt} / COUNT(opp_id) OVER (PARTITION BY opp_id)
Since the PARTITION BY opp_id results in a constant value of 1 for each opp_id, the formula essentially divides by 1. While functional, this window calculation (LAC-W) introduces unnecessary computational overhead, as it processes each row within its partition, something that could easily be avoided.
Instead, you might consider switching to a LAC-A function for better efficiency. For example:
SUM({opp_annualized_amt}) / COUNT({opp_id}, [your_group_key])
SQL equivalent:
SELECT SUM(opp_annualized_amt) / COUNT(opp_id)
FROM your_table
GROUP BY your_group_key
This approach uses a group-level aggregation (LAC-A), which is computationally more efficient as it avoids row-by-row window processing.
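A rough Python sketch of the difference in work (with hypothetical data and a made-up group key): the LAC-A form produces one aggregate per group, rather than one window value per row.

```python
from collections import defaultdict

# Hypothetical sample with a group key (e.g. country)
rows = [
    {"group": "US", "opp_annualized_amt": 100.0},
    {"group": "US", "opp_annualized_amt": 50.0},
    {"group": "DE", "opp_annualized_amt": 80.0},
]

# SUM(opp_annualized_amt) / COUNT(opp_id) per group: a single pass
# accumulates one sum and one count per group key
sums, counts = defaultdict(float), defaultdict(int)
for r in rows:
    sums[r["group"]] += r["opp_annualized_amt"]
    counts[r["group"]] += 1

per_group = {g: sums[g] / counts[g] for g in sums}
print(per_group)  # {'US': 75.0, 'DE': 80.0}
```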
That said, the group by/partition by logic doesn’t seem to align with this calculation either. It’s not clear what you’re trying to achieve with this formula. Can you clarify the intent behind it? That would help determine the best calculation to use.
For reference, here’s the documentation on Level-Aware Calculations.
Hi @grazhang,
It’s been a while since we last heard from you on this thread. Were you able to find a workaround, or do you have any additional questions?
If we do not hear back within the next 3 business days, I’ll close out this topic.
Thank you!
Hi @grazhang,
Since we have not heard back, I’ll go ahead and close out this topic. However, if you have any additional questions, feel free to create a new post in the community and link this discussion for relevant information if needed.
Thank you!