Run Looks from Google Spreadsheets using Looker API

google
api
sheets

(Brecht Vermeire) #1

If you occasionally use Google Sheets for some on-the-go calculations based on data in Looker, you’ve probably run into this issue: for the data to be available, the Look needs to be public. While that is still a great way to bring data from Looker into Google Sheets (read more here), you can also use the Looker API. The cool part: it works for any Look you have access to, without having to make anything public.

Google Apps Script has the built-in URL Fetch Service, which allows us to make HTTP requests straight from our spreadsheet.

In this example, we’ll define a custom function in Sheets that allows us to bring in the results from any Look.

You can find the latest version of the script on GitHub!

Step by step

  1. Open up your spreadsheet and navigate to Tools > Script editor. Paste in the script below, replacing the dummy function myFunction().

  2. At the top, change the base URL to your company’s Looker domain, and add in your API credentials (client ID & secret).

    Note: it’s important to limit the user tied to these API credentials appropriately, since the credentials will be accessible to whoever can access the sheet.

  3. Hit save and give the project a name.

  4. Navigate back to your spreadsheet and start typing LOOKER_RUN_LOOK().

=LOOKER_RUN_LOOK(look_id, format, query_limit)

  • look_id (number) - The ID of the Look, e.g. 345
  • format (number, optional) - The requested format. 1 returns the data, 2 the underlying SQL. Defaults to 1 (data).
  • query_limit (number, optional) - The query row limit. Defaults to 5000 if left empty; pass -1 for no limit.

You should now have the custom function available in your spreadsheet:

The second parameter takes either 1 or 2 as input: in this function, 1 returns the results and 2 the underlying SQL query.
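That conversion from the numeric parameter to the format string the Looker API expects can be sketched as a standalone helper (a simplified extract of the logic the full script inlines):

```javascript
// Map the numeric format parameter to the format string used in the
// /looks/{id}/run/{format} endpoint:
// 1 -> "csv" (results), 2 -> "sql" (underlying query);
// anything else falls back to "csv", matching the script's default.
function formatForApi(optFormat) {
  switch (optFormat) {
    case 1:
      return "csv";
    case 2:
      return "sql";
    default:
      return "csv";
  }
}
```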

Now in this spreadsheet, you’ll be able to query the results of every Look you have access to in Looker!

The full script

// Replace this with your base domain e.g. https://mycompany.looker.com:19999/api/3.0
var BASE_URL = 'https://mycompany.looker.com:19999/api/3.0';
// Replace this with your API credentials
var CLIENT_ID = 'XXX';
var CLIENT_SECRET = 'XXX';

/**
 * Returns the results or the SQL of a Look
 *
 * @param {number} id The unique ID of the Look
 * @param {number} opt_format 1 for csv, 2 for raw sql - defaults to csv (optional)
 * @param {number} opt_limit The query limit - defaults to 5000 if empty (optional)
 * @return The Look results data
 * @customfunction
 */
function LOOKER_RUN_LOOK(id, opt_format, opt_limit) {
    try {
        var options = {
            "method": "get",
            "headers": {
                "Authorization": "token " + login()
            }
        };

        // Sheets can only render text, so request either csv results or the raw sql query
        var formatting = (opt_format == 2) ? "sql" : "csv";

        // use the custom limit if one was passed (-1 means no limit), else the 5k default
        var limit = opt_limit || 5000;

        // GET request to run the Look
        var response = UrlFetchApp.fetch(BASE_URL + "/looks/" + id + "/run/" + formatting + "?limit=" + limit, options);

        // csv is parsed into a range of cells; sql goes into a single cell
        if (formatting === "sql") {
            return response.getContentText();
        }
        return Utilities.parseCsv(response.getContentText());
    } catch (err) {
        return "Uh oh! Something went wrong. Check your API credentials, the parameters you're passing, and that your Look exists!";
    }
}

function login() {
  try {
    var post = {
        'method': 'post'
    };
    var response = UrlFetchApp.fetch(BASE_URL + "/login?client_id=" + CLIENT_ID + "&client_secret=" + CLIENT_SECRET, post);
    return JSON.parse(response.getContentText()).access_token;
  } catch (err) {
    // throw rather than return a string, so the caller's catch reports the failure
    throw new Error("Could not login to Looker. " + err);
  }
}

Other potential uses

Since we’re using the Looker API, there are many more possible use cases. The GitHub repository here contains a few more functions that all follow the same pattern. You can also add custom functions yourself; a basic HTTP GET request could be written as follows:

function LOOKER_OTHER_ENDPOINT(input) {
  var options = {
    'method': 'get',
    'headers': {
      'Authorization': 'token ' + login()
    }
  };

  var response = UrlFetchApp.fetch(BASE_URL + "/this/that" + input, options);

  return Utilities.parseCsv(response.getContentText());
}

Reference material


Is IMPORTXML frustrating you? Here is a better method to refresh public looks in Google Sheets
Sharing Public Links (URLs) with limits over 5000
Iqy Import to Excel row limit
Unlimited results webquery to excel?
Anyone sending data from Looker to external visualization tools like Geckoboard?
Creating new users through Google Sheets using Looker API
#2

Does this feature allow appending the data every time the Look runs? Say my Look always has one week of data, but the Google Sheet should maintain multiple weeks of data, with one new week added by Looker via the API each time.


(Brecht Vermeire) #3

It does not @kshah7, since it’s getting the results of the saved Look in its current version. You could, however, copy and paste ‘values only’ (CTRL/CMD + Shift + V), then use the same formula again at the bottom of the results to fetch the Look again.
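That copy-paste workaround could also be scripted. A minimal sketch of the append step (appendLookRows is a hypothetical helper, assuming results arrive as the 2-D array that Utilities.parseCsv returns, with a header row first):

```javascript
// Append a new batch of Look results below existing rows, keeping only
// one header row. Both arguments are 2-D arrays of cell values, as
// returned by Utilities.parseCsv; the first row of each is the header.
function appendLookRows(existing, incoming) {
  if (existing.length === 0) {
    return incoming;
  }
  // drop the duplicate header row from the incoming batch
  return existing.concat(incoming.slice(1));
}
```

In Apps Script this could be driven by a time-based trigger that runs the Look weekly and writes the merged rows back with Range.setValues().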


(Danny Thorpe) #4

URL link to Looker docs in step 2 is broken. Looks like it went doubledouble.


(Daniel Nelson) #5

An alternative way of hiding API credentials is to use Google Apps Script “Properties”. In the script editor, go to File -> Project Properties. This allows you to define user-specific or global variables that can be referenced from the script, and are only available to view/edit by the script owner.
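A sketch of that approach, assuming the client ID and secret have been saved as script properties under the (arbitrarily chosen) names CLIENT_ID and CLIENT_SECRET:

```javascript
// Read API credentials from script properties instead of hardcoding them.
// PropertiesService is an Apps Script built-in; the property names here
// are whatever you chose when saving them in Project Properties.
function getCredentials() {
  var props = PropertiesService.getScriptProperties();
  return {
    clientId: props.getProperty('CLIENT_ID'),
    clientSecret: props.getProperty('CLIENT_SECRET')
  };
}
```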

https://developers.google.com/apps-script/guides/properties


(Kiran Tp) #6

Hey Brecht, your code worked like a charm. But recently I was trying to import some data using the LOOKER_RUN_LOOK(look_id) custom function, and Google Sheets throws “Internal error executing the custom function”.
Is this error related to exceeding a quota of API calls on a given day, since I was importing a lot of Look IDs? It would be great if you could give some insight into this error.


(Brecht Vermeire) #7

Looker itself doesn’t enforce quotas on API calls, since the API isn’t publicly accessible without keys generated by an admin. The error Internal error executing the custom function definitely comes from Google Apps Script, but it doesn’t provide any context. My first guess was a huge result set that UrlFetchApp cannot process, but in that case it’s usually vocal about it and returns result set too large. Quota might be the reason, but then the error message should say so too.

Is the failure persistent or is it flaky? Do you use this functionality a lot or just a few times? Is it one particular Look?


(Kiran Tp) #8

The failure is consistent for specific Looks whose queries run over 1 TB, whereas other Looks that run over a couple of GBs work pretty neatly. Yes, we would be using the functionality a lot, say 30 times a week.


(Izzy) #9

That link that @brecht posted seems to indicate that the hard cap for a custom function runtime is 30s; I wonder if a 1 TB result set is just taking longer than that and failing out. I found some Stack Overflow cases (totally different scripts, but the same error) where reducing the query size or optimizing the script to run faster made the error go away.

If you run the same look that usually errors but add a limit using the opt_limit param, does it work?