I have found a daily ingestion method into BigQuery using data accessible within the Looker Explores, which has worked for us over the last few months, with no scripting necessary.
Each day I schedule CSV data from the System Activity Explores (History, Dashboard, Query, Look, Users) to a GCS bucket; the queries are saved as Looks and scheduled directly in Looker.
When the files arrive in the GCS bucket, they overwrite the previous day’s files of the same name. In BigQuery I set up a daily import job (BigQuery Data Transfer Service) that appends the History file in GCS to a partitioned History table. The other tables are dimensional, so I set them up as simple external tables over the GCS files.
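To make the two BigQuery setups concrete, here is a rough sketch with the `bq` CLI. The bucket, dataset, and table names are hypothetical, and `bq load` stands in for the scheduled transfer (the Data Transfer Service does the equivalent append on a schedule, configured in the console):

```shell
# Hypothetical bucket/dataset names throughout.

# Append the daily History CSV into a day-partitioned table
# (the scheduled GCS transfer does this same append daily):
bq load \
  --source_format=CSV \
  --skip_leading_rows=1 \
  --autodetect \
  --time_partitioning_type=DAY \
  looker_usage.history \
  gs://my-looker-exports/history.csv

# Dimensional files (Users, Dashboards, ...) as external tables
# that always read the latest overwritten file straight from GCS:
bq mkdef --autodetect --source_format=CSV \
  "gs://my-looker-exports/users.csv" > users_def.json
bq mk --external_table_definition=users_def.json looker_usage.users
```

Because the dimensional files are overwritten in place each day, the external tables need no refresh step at all.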
Any additional transformation or joining prior to loading into Looker is done in BigQuery with pipeline tools like dbt (some scripting there, naturally). But depending on the content of the scheduled Looker files, and if you are happy with them out-of-the-box, you could alternatively do the BigQuery table joins in a LookML model, of course.
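As an illustration of the BigQuery-side join, here is a hedged sketch of a query you could run ad hoc, schedule, or wrap in a dbt model. The table and column names (`user_id`, `u.id`, `u.name`) are assumptions and would need to match your exported Explore fields:

```shell
# Hypothetical join of the partitioned history table to the
# users external table; column names are assumptions.
bq query --use_legacy_sql=false '
CREATE OR REPLACE TABLE looker_usage.history_enriched AS
SELECT
  h.*,
  u.name AS user_name
FROM looker_usage.history AS h
LEFT JOIN looker_usage.users AS u
  ON h.user_id = u.id
'
```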
The ability to do this is already within the standard features of BigQuery and GCP. There may be other ways to do it (e.g. with Apps Script or Cloud Functions), but this method has been the most seamless for me. It has not failed even once, and I now have Looker usage data going back beyond 90 days.
I hope this helps - your comment “we have built further functionality than what Looker offers” caught my attention - what have you been cooking up there?