I often need to take transactional/event data, roll it up to some less granular timeframe (say, a daily level), and compute an average of the results for some set of dimensions and filters.
For example, take a table of orders across retail stores. I might want to know the average sales for Wednesdays over the past 8 weeks. To do that, I need to sum sales by day, filter to Wednesdays, and average.
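In plain SQL, that computation might look something like this (a sketch assuming a hypothetical `orders` table with `created_at` and `sale_price` columns, using Postgres date functions):

```sql
-- Roll orders up to the daily grain, keep only Wednesdays
-- in the last 8 weeks, then average the daily totals.
SELECT AVG(daily_sales) AS avg_wednesday_sales
FROM (
  SELECT
    DATE(created_at) AS order_date,
    SUM(sale_price)  AS daily_sales
  FROM orders
  WHERE created_at >= CURRENT_DATE - INTERVAL '8 weeks'
    AND EXTRACT(DOW FROM created_at) = 3  -- 3 = Wednesday in Postgres
  GROUP BY 1
) daily
```

The inner query sets the grain (one row per day); the outer query averages over it.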
Since my source data is at the order level, I can’t just slap an average measure on it (that would compute the average sales per order, not per day). In the past, I’ve used a derived table to sum sales to the daily level, then created an average measure in the derived table’s view, and joined the derived table back to the original view on the date dimension.
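For concreteness, here is roughly what that pattern looks like in LookML — a sketch, with hypothetical view, table, and field names:

```lookml
# Hypothetical derived table that rolls orders up to one row per day
view: daily_sales {
  derived_table: {
    sql:
      SELECT
        DATE(created_at) AS order_date,
        SUM(sale_price)  AS total_sales
      FROM orders
      GROUP BY 1 ;;
  }

  dimension: order_date {
    type: date
    sql: ${TABLE}.order_date ;;
    primary_key: yes
  }

  # Averaging daily totals, not individual orders
  measure: average_daily_sales {
    type: average
    sql: ${TABLE}.total_sales ;;
  }
}

# Join back to the original orders view on the date dimension
explore: orders {
  join: daily_sales {
    type: left_outer
    relationship: many_to_one
    sql_on: ${orders.created_date} = ${daily_sales.order_date} ;;
  }
}
```

Filtering the explore to Wednesdays then averages only those days’ totals.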
The problem with this approach is that I can’t anticipate every way a user might want to slice this derived-table average measure. What if someone wanted to see average daily sales for a specific product on Wednesdays? For that to work, the product name field has to be a dimension in the derived table and has to appear in the join from the derived table back to the original view.
Is there a best practice for creating measures that can be dynamically aggregated to a specified granularity? Curious to know if I can do something clever with Liquid or parameters to accomplish this.