Rolling Average calc for use in line chart

I would like to create a rolling average calculation to use in a line chart. The functions runningAverage and windowAverage do not fit my use case.

I want to create a calculation that averages {field} for rows with {OM Date} <= {date} and {OM Date} >= {date}-30days.

In my dataset, not all days appear in {OM Date}, which is why runningAverage and windowAverage do not fit my use case. The closest I have gotten to a solution is creating a date parameter and plugging it into a calculation and the x-axis, but that gives me only one date/day, not a series of dates to plot on the x-axis.
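To make the gap problem concrete, here is a small Python sketch (synthetic dates and made-up values) showing that a row-based running average is not the same as a true 30-day calendar-window average once days are missing:

```python
from datetime import date, timedelta

# Synthetic (day, value) rows with gaps; values are made up.
rows = [
    (date(2023, 8, 1), 10.0),
    (date(2023, 8, 2), 20.0),
    # Aug 3 - Aug 14: no rows at all
    (date(2023, 8, 15), 30.0),
    (date(2023, 8, 31), 40.0),
]

def calendar_window_avg(target, days=30):
    """Average of values whose date falls in [target - days, target]."""
    lo = target - timedelta(days=days)
    vals = [v for d, v in rows if lo <= d <= target]
    return sum(vals) / len(vals) if vals else None

# A row-based running average through the last row weighs every row equally:
row_based = sum(v for _, v in rows) / len(rows)
print(row_based)                                  # 25.0
# The calendar window ending Sep 5 only reaches back to Aug 6,
# so it drops the two oldest rows:
print(calendar_window_avg(date(2023, 9, 5)))      # 35.0
```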


Can you add a filter to your line chart so you’re only calculating averages for the date field you are using?

To do that, make a date parameter: ${OMDate}

Then make a field to filter out data you don’t want.

filter_field = ifelse(dateDiff(${OMDate}, {date}, 'DD') <= 0 AND dateDiff(${OMDate}, {date}, 'DD') >= -30, 'Range', 'Out Of')

Then make your running average field based on your aggregated field (I am using the count of sessions for my example)

running_avg = runningAvg(count({session_id}),[{date} DESC])

Then make your visual and filter the filter field to always equal 'Range'.

Let me know if that works
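If it helps to see the mechanics, here is a rough Python mock of the filter-field logic (dateDiff and ifelse emulated; the parameter value and row dates are made up for illustration):

```python
from datetime import date

om_date_param = date(2023, 8, 31)  # stand-in for the ${OMDate} parameter

def date_diff_days(d1, d2):
    """Emulates dateDiff(d1, d2, 'DD'): number of days from d1 to d2."""
    return (d2 - d1).days

def filter_field(row_date):
    """Emulates the ifelse: keep rows within the 30 days ending at the parameter."""
    diff = date_diff_days(om_date_param, row_date)
    return 'Range' if -30 <= diff <= 0 else 'Out Of'

rows = [date(2023, 7, 15), date(2023, 8, 10), date(2023, 8, 31)]
print([filter_field(d) for d in rows])  # ['Out Of', 'Range', 'Range']
```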


I think this is not exactly what I'm looking for.

For each day on the x-axis, I want to average a field for rows on that day and those within 30 days before that day.
On August 31, average rows with dates between August 31 and August 1.
On August 30, average rows with dates between August 30 and July 31.
etc …
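A minimal Python sketch of that target calculation, with toy data, producing one point per calendar day even for days absent from the data:

```python
from datetime import date, timedelta

# Toy data: day -> field value; most days are missing.
rows = {date(2023, 8, 1): 10.0, date(2023, 8, 20): 30.0, date(2023, 8, 31): 50.0}

def rolling_avg(day, window=30):
    """Average rows whose date is in [day - window, day]."""
    lo = day - timedelta(days=window)
    vals = [v for d, v in rows.items() if lo <= d <= day]
    return sum(vals) / len(vals) if vals else None

# One x-axis point per calendar day, including days with no rows:
start, end = min(rows), max(rows)
series = [(start + timedelta(days=i), rolling_avg(start + timedelta(days=i)))
          for i in range((end - start).days + 1)]

print(rolling_avg(date(2023, 8, 10)))  # 10.0 (only the Aug 1 row is in range)
print(series[-1])                      # Aug 31 averages all three rows: 30.0
```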

Not all days appear in the dataset, so I think I need access to a date series/range. The closest I got to a solution was creating a date parameter, but that gives me only a single day, not a range.

Not all days appear in the dataset…
You may have to force the dataset to have every day in it by forcing the missing days to have a value of zero (0). Check out this topic where we used a left outer join with the calendar as the primary table and the facts (data) as the secondary table: How to show Zeros in the pivot table?
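A quick Python sketch of that idea (stdlib only; the table and values here are toy stand-ins, not the real dataset): build a full calendar, then left-join the facts onto it so missing days get zero.

```python
from datetime import date, timedelta

facts = {date(2023, 8, 1): 3, date(2023, 8, 4): 5}  # day -> record count

# Calendar table spanning the data, one row per day:
start, end = min(facts), max(facts)
calendar = [start + timedelta(days=i) for i in range((end - start).days + 1)]

# Left outer join: calendar is primary, facts secondary; missing days get 0.
joined = [(day, facts.get(day, 0)) for day in calendar]
print([c for _, c in joined])  # [3, 0, 0, 5]
```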


That's interesting, thanks for sharing. It got me closer, but I am not able to execute a rolling calc. Here is where I am.

I started to work on this by creating a query and joining it to my dataset like this:

with calendar as (
    SELECT generate_series(
        MIN(DATE_TRUNC('day', "OM Date")::DATE),
        MAX(DATE_TRUNC('day', "OM Date")::DATE),
        '1 day'::interval
    )::date AS day
    FROM "nrtp_report"."nrtp_property_report"
),

metros as (
    SELECT DISTINCT "MSA"
    FROM "nrtp_report"."nrtp_property_report"
)

SELECT * FROM calendar CROSS JOIN metros

I did a cross join because I want to look at a rolling average by MSA too. I could not pass a multiselect parameter for MSA into a calculation to view stats by metro.

This calculation works to get a count for each day.

ifelse(count({OM Date}) >= 1, count({OM Date}), 0)

I am getting errors about combining non-aggregated and aggregated calculations when I try to create a rolling calculation. Any ideas?

Hi @ccowen, Thanks for your patience on this!
Based on the discussion we had earlier today, below is a summary of the solution that worked.

  • Ingested missing dates from a date table to ensure there are no gaps in the dates when using window functions
  • Computed a window count of non-null values to get the actual count of records in the defined window
  • Computed a window sum of the measure over the defined window
  • Finally, calculated the average using the values computed in the prior steps
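Those steps can be emulated in Python to see why dividing a window sum by a window count of non-null values gives the right average over a zero-filled calendar (toy data; the names here are illustrative, not from the real dataset):

```python
from datetime import date

# Zero-filled calendar rows: (day, measure_sum, non_null_count).
# The ingested gap day contributes 0 to both the sum and the count.
series = [
    (date(2023, 8, 1), 100.0, 2),
    (date(2023, 8, 2), 0.0, 0),   # day from the date table, no real rows
    (date(2023, 8, 3), 50.0, 1),
]

def window_sum(values, end_idx, lookback):
    """Sum of values in the window ending at end_idx, looking back `lookback` rows."""
    lo = max(0, end_idx - lookback)
    return sum(values[lo:end_idx + 1])

sums = [s for _, s, _ in series]
counts = [c for _, _, c in series]

i = 2  # window ending Aug 3 with a 30-row lookback
avg = window_sum(sums, i, 30) / window_sum(counts, i, 30)
print(avg)  # 50.0 -> (100 + 0 + 50) / (2 + 0 + 1); the gap day doesn't skew it
```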

I will mark this as the solution; however, if you run into any issues, please feel free to reopen this.


Thank you for your help Karthik.

For the record on this issue, these are the formulas I used:

count_not_null_om_date:
ifelse(count({OM Date}) >= 1, count({OM Date}), 0)

rolling 30 day count:
windowSum({count_not_null_om_date}, [day DESC], 0, 30, [{MSA[calendar_om_dates]}])

rolling 30 day average:
windowSum(sum({Price Action 15day or 30day Reduction}), [day DESC], 0, 30, [{MSA[calendar_om_dates]}])/windowSum({count_not_null_om_date}, [day DESC], 0, 30, [{MSA[calendar_om_dates]}])