@ryan.dunlavy, I noticed the prefetch API is now marked deprecated in the API docs. What is the suggested replacement for prefetching if you want to cache a dashboard with a few commonly chosen filter values?
To cache a dashboard with certain dashboard filters, we recommend using datagroups in conjunction with schedules. The persist_with parameter can be used at the model or explore level to tie that model's or explore's caching policy to a datagroup.
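As a rough sketch, the model-level setup might look like this (the datagroup name and trigger SQL below are made up for illustration):

```lookml
# Hypothetical datagroup: bust the cache whenever a new ETL batch lands
datagroup: nightly_etl {
  sql_trigger: SELECT MAX(completed_at) FROM etl_log ;;
  max_cache_age: "24 hours"
}

# Tie this model's caching policy to that datagroup
persist_with: nightly_etl
```

Any schedule set to trigger on nightly_etl then runs under the same caching policy as the explores that persist with it.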
Because a scheduled query follows the same caching policies as a user-run query, if you set up a schedule with the filter values you would like to cache, triggered by the same datagroup as the model or explore, then that query's results will be cached.
Hope this helps!
@ryan.dunlavy Just to clarify what you are saying: to pre-cache a dashboard, you are suggesting scheduling a dashboard export on a datagroup. When the datagroup triggers, the dashboard runs, which caches all of the Looks inside the dashboard. To pre-cache multiple different filter combinations on that dashboard, just schedule multiple exports, each with a different combination of filters.
@grantnicholas Yes, that is a great summation!
If the schedule runs on the same datagroup that the explores in the different tiles use, then the dashboard will always pull from cache when viewed in the front end: as soon as the previous cache is busted, the schedule runs a new query and caches the result.
Thanks! That makes sense.
I saw the above comment that pre-fetch api is being deprecated and that scheduling with data groups is the suggested solution.
Is it possible to trigger a schedule with the API, so that we get the same side effects as prefetching did, but without actually generating a PDF/report or sending it anywhere?
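For what it's worth, the Looker API exposes a "run once" endpoint for scheduled plans, so a sketch like the following should fire an existing schedule on demand (the base URL, plan ID, and token here are placeholders, and this is untested against a live instance):

```python
# Sketch: trigger an existing Looker scheduled plan via the API's
# run-once endpoint (POST /scheduled_plans/{id}/run_once).
import urllib.request


def run_once_url(base_url: str, plan_id: int) -> str:
    """Build the run-once endpoint URL for a scheduled plan."""
    return f"{base_url}/api/4.0/scheduled_plans/{plan_id}/run_once"


def trigger_schedule(base_url: str, plan_id: int, token: str) -> int:
    """POST to the run-once endpoint; returns the HTTP status code."""
    req = urllib.request.Request(
        run_once_url(base_url, plan_id),
        method="POST",
        headers={"Authorization": f"token {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The schedule still needs a destination, which is where the webhook trick below the thread comes in.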
By the way, the API demo is super nifty to play around with; glad y'all made it and hope it remains maintained.
So we ended up solving the no-email, no-SFTP constraint by setting up a webhook destination that just listens and does nothing with the information it receives.