So, we have two datasets: one is category, which holds all the category-level data, and the other is transaction, which we join with category to get all the transaction values. To calculate a metric that needs transactions for every single date, business model, and store, we use the calculation below:
```
sumOver(
    ifelse(
        {category} = " "
        AND {business_date} >= ${StartDate}
        AND {business_date} <= ${ToDate},
        tnx, 0
    ),
    [{business_date}, {business_model_id}, {store name}],
    PRE_FILTER
)
```
This calculation works, but we have 100 GB of data, and with all this pre-filtering the metric takes noticeably longer to load than the others. A single filter change takes about 30-45 seconds because of these pre-aggregations.
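One idea we are considering, just a sketch and not something we have validated: if the ${StartDate}/${ToDate} range can be applied as a regular analysis filter on {business_date} instead of inside the ifelse, the level could drop from PRE_FILTER to PRE_AGG. PRE_AGG evaluates after analysis filters but before aggregation, so the engine should not have to scan the full pre-filter data on every filter change the way PRE_FILTER does:

```
sumOver(
    ifelse({category} = " ", tnx, 0),
    [{business_date}, {business_model_id}, {store name}],
    PRE_AGG
)
```

We are not sure this still guarantees a value for every single date, business model, and store once the filters apply, which is part of what we are asking.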
Beyond that, is there a way to optimize this calculation? Attached are sample datasets of the two tables.
Transaction Data:
| business_date | store_code | business_model_id | invoices | gmal_invoice |
|---|---|---|---|---|
| 05-06-2023 | 275 | 4 | 154 | 0 |
| 06-06-2023 | 50 | 4 | 317 | 0 |
| 01-06-2023 | 369 | 4 | 194 | 0 |
| 01-06-2023 | 92 | 4 | 128 | 0 |
| 01-06-2023 | 344 | 4 | 351 | 0 |
| 01-06-2023 | 345 | 4 | 96 | 0 |
| 01-06-2023 | 166 | 4 | 202 | 0 |
| 01-06-2023 | 411 | 4 | 73 | 0 |
| 01-06-2023 | 222 | 4 | 268 | 0 |
| 01-06-2023 | 368 | 4 | 173 | 0 |