I’d like to calculate some kind of processing time for each item. The process has some statuses which indicate that the item is “in processing”, like status 1 or 2. All other statuses are end statuses. If the item is in processing, the processing time equals the interval between the start date of the process and “now”. In contrast, if the item has reached an end status, the processing time equals the interval between the start date and the end date of the process. I created a calculated field:
Hello @dbb1 - I believe you can calculate it simply for each row of your dataset based on the current_status_id, and then use that derived calculated field in another calculation where you apply min and max based on your need. Hope this helps!
It would be really helpful if you could share a sample of your dataset and the expected values so that I can try to replicate it at my end.
The dataset has the format of an event log. Each activity for an item (process step) gets logged as a single record with a timestamp and further information. Please have a look at the dummy data, which has a similar format to my original data.
Thank you @dbb1 for your response. Based on your explanation, I believe you are looking for an output at the case_id level. So for your given example, we will end up with 6 records, each having a case_id and a timespan. Secondly, based on the sample that you provided, all of the cases will fall under the second category, i.e. where we measure between the minimum and maximum timestamp of a given ID, as every case contains one of the 2 end activities that you mentioned, i.e. “pay compensation” or “reject request”. Please confirm if this is the right understanding. This will help me to replicate it at my end. Thank you!
Yes, you are right. The processing time is a case level calculation and we will have 6 processing times, one for each case.
For example, let’s look at case_id 3. The process is finished, so the processing time would be the timespan between the first timestamp, 2010-12-30 14:32:00+01:00, and the last, 2011-01-15 10:45:00+01:00, which is roughly 16 days.
Now suppose case_id 2 is not finished (ignore the last “pay compensation” record of case 2). In this situation the processing time would be the timespan between 2010-12-30 11:32:00+01:00, the first timestamp of case 2, and today, since the process never finished…
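To make the case-level logic concrete, here is a minimal Python sketch of the same rule, outside QuickSight. The activity names and the case 3 timestamps come from this thread; the record layout and function name are hypothetical:

```python
from datetime import datetime, timedelta, timezone

# Activities that end a case (names taken from the thread).
END_ACTIVITIES = {"pay compensation", "reject request"}

def processing_times(events, now=None):
    """Map each case_id to its processing time as a timedelta.

    `events` is an event log: one record per process step, each with
    a case_id, an activity name, and a timestamp.
    """
    if now is None:
        now = datetime.now(timezone.utc)
    by_case = {}
    for e in events:
        by_case.setdefault(e["case_id"], []).append(e)

    result = {}
    for case_id, rows in by_case.items():
        stamps = [r["timestamp"] for r in rows]
        finished = any(r["activity"] in END_ACTIVITIES for r in rows)
        # Finished case: first event -> last event. Open case: first event -> now.
        result[case_id] = (max(stamps) - min(stamps)) if finished else (now - min(stamps))
    return result

# Case 3 from the thread (finished): 2010-12-30 14:32 -> 2011-01-15 10:45 (+01:00)
tz = timezone(timedelta(hours=1))
log = [
    {"case_id": 3, "activity": "register request",
     "timestamp": datetime(2010, 12, 30, 14, 32, tzinfo=tz)},
    {"case_id": 3, "activity": "pay compensation",
     "timestamp": datetime(2011, 1, 15, 10, 45, tzinfo=tz)},
]
print(processing_times(log)[3])  # 15 days, 20:13:00 -- roughly the 16 days above
```

An unfinished case simply falls through to the `now - min(stamps)` branch, matching the case 2 scenario.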
Thank you @dbb1 for the details. I think this can be done by creating 2 calculated fields. First, create an Activity Indicator field that assigns 0 to rows with the activities “pay compensation” or “reject request” and 1 to all remaining activities.
Once that is done, please leverage that field to conditionally calculate the timespan. I tried it on a smaller, simpler dataset and was able to achieve it. Please find the snapshots below. Hope this helps!
Calculated Field Details:
// Instead of the below, you can use yours like this: ifelse(in(activity, ["pay compensation", "reject request"]), 0, 1)
ifelse(activity = 'X', 0, 1)
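For intuition, the indicator trick can be mirrored outside QuickSight. A minimal Python sketch (the activity names come from the thread; the function name and sample rows are hypothetical):

```python
# 0 marks an end activity, 1 marks an in-processing activity --
# the same mapping as the ifelse/in() calculated field above.
END_ACTIVITIES = ("pay compensation", "reject request")

def activity_indicator(activity):
    return 0 if activity in END_ACTIVITIES else 1

# Row-level values for one case:
case_rows = ["register request", "examine casually", "pay compensation"]
indicators = [activity_indicator(a) for a in case_rows]
print(indicators)       # [1, 1, 0]
print(min(indicators))  # 0 -> the case contains an end activity, i.e. it is finished
```

Taking the minimum of this indicator per case is what makes the field useful for the second calculation: a case minimum of 0 means the case reached an end status, so its timespan should run to the last timestamp rather than to “now”.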
One question that came to my mind concerns the PRE_AGG parameter. I tried to see how the calculation changes when I remove PRE_AGG, so that it defaults to POST_AGG_FILTER, but in this case the formula returns an error:
Mismatched aggregation. Custom aggregations can't contain both aggregated and nonaggregated fields, in any combination.
Could you shed some light on why this is the case?
PRE_AGG and POST_AGG_FILTER calculations work differently. As the name suggests, with PRE_AGG QuickSight allows you to use un-aggregated measures or dimensions. For POST_AGG_FILTER, however, it is mandatory to apply some aggregation function, as the essence of the calculation is that everything happens after aggregation. Hence the syntax becomes invalid as soon as you change the parameter to POST_AGG_FILTER, because we are not using any aggregation function over the “Date” column. Please see the below reference on Level-Aware Calculations for the details. Hope this helps!
In case my suggestion helped you resolve your query, I would request you to mark the post as “Solution”. This will help the community find guidance and answers to similar questions. Thank you!