Looker Data Actions to Update a Record in a Snowflake Table

Use Looker Data Actions to Update a record in a Snowflake Table using PipeGears!

In this tutorial, I will walk through setting up a Data Action in LookML, selecting a state in a Look/Explore, submitting the Data Action with a new state code, processing the Data Action JSON with a web service built in PipeGears, updating the state code in the Snowflake database table, and refreshing the Look/Explore.

Interacting with Data Actions in a Look or Explore
In the following Explore, you can see that Connecticut has an incorrect State Code.

I select the ellipsis (…) next to Connecticut and select the Data Action: Update State Code.

I enter the New State Code: CT and click Submit.

The Data Actions JSON Message
Looker then assembles information about the selected value, the state ID, and the new state code into a JSON message (below) that it POSTs to the web service URL in PipeGears.
JSON (simplified):

{
  "type": "cell",
  "data": {
    "auth_code": "abc123456",
    "user_id": 1234,
    "state_name": "Connecticut",
    "state_id": 7
  },
  "form_params": {
    "new_state_code": "CT"
  }
}
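
If it helps to see those fields pulled apart, here is a small Python sketch (illustrative only, not part of the PipeGears workflow) that parses this payload; the field paths match the ${request.body.json...} expressions used in the PipeGears steps later on.

import json

# The simplified Data Action payload from above
payload = json.loads("""
{
  "type": "cell",
  "data": {
    "auth_code": "abc123456",
    "user_id": 1234,
    "state_name": "Connecticut",
    "state_id": 7
  },
  "form_params": {
    "new_state_code": "CT"
  }
}
""")

# These correspond to ${request.body.json.data.*} and
# ${request.body.json.form_params.*} in the PipeGears steps below
auth_code = payload["data"]["auth_code"]                    # "abc123456"
state_id = payload["data"]["state_id"]                      # 7
new_state_code = payload["form_params"]["new_state_code"]   # "CT"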

LookML: Behind the Scenes
The LookML for this Data Action uses Liquid syntax to capture the selected value and the state_id from the record, and also passes the new_state_code entered by the user, the user_id from user attributes, and an auth_code for authentication. The LookML code is shown below:

dimension: state_name {
  type: string
  sql: ${TABLE}."STATE_NAME" ;;
  action: {
    label: "Update State Code"
    url: "https://[yourlink].pipegears.io/states"  # Replace [yourlink]
    param: {
      name: "auth_code"
      value: "abc123456"
    }
    user_attribute_param: {
      user_attribute: id
      name: "user_id"
    }
    param: {
      name: "state_name"
      value: "{{ value }}"
    }
    param: {
      name: "state_id"
      value: "{{ state_lkp.state_id._value }}"
    }
    form_param: {
      label: "New State Code"
      name: "new_state_code"
      type: string
      required: no
    }
  }
}

A Web Service Workflow in PipeGears
Over in PipeGears, I created the following workflow application with seven (7) Gears or steps.

List of Gears:

  1. Snowflake (SQL DataSource)
  2. HTTP Post (Web Service)
  3. Looker IPs (Filter)
  4. auth_code (Filter)
  5. Update State (SQL Query)
  6. Success Reply (Web Reply)
  7. Error Reply (Web Reply)

1. Snowflake (SQL DataSource)


JDBC URL: jdbc:snowflake://youraccount.snowflakecomputing.com/?user=theuser&warehouse=thewh&db=thedb&schema=public

This is a disconnected step that defines the connection properties for the Snowflake database. It is referenced by the Update State (SQL Query) step.

2. HTTP Post (Web Service)


This step creates a web service webhook endpoint that listens for JSON data to be POSTed. The JSON data is received and parsed, and its parameters are made available for use in later steps.

Path: /states
Provide an endpoint path for your API Call.

Service URL: PipeGears provides a unique web service URL for your account, and the /states path above makes it a complete web service endpoint. This full endpoint URL goes in the Data Action's url parameter in the LookML above, where it says [yourlink].
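
Before wiring up the Data Action in Looker, you can test the endpoint by POSTing a sample payload yourself. Here is a minimal sketch using Python's requests library; the URL and values are placeholders, and note that the Looker IPs filter in Step 3 will reject a request from your own machine, so bypass that step while testing.

import requests

# Replace [yourlink] with the web service URL PipeGears assigns to your account
url = "https://[yourlink].pipegears.io/states"

# A sample payload shaped like the Data Action JSON shown earlier
payload = {
    "type": "cell",
    "data": {
        "auth_code": "abc123456",
        "user_id": 1234,
        "state_name": "Connecticut",
        "state_id": 7,
    },
    "form_params": {"new_state_code": "CT"},
}

resp = requests.post(url, json=payload, timeout=10)
print(resp.status_code, resp.text)  # expect 200 and the Looker success body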

3. Looker IPs (Filter)


This step ensures that only Looker's whitelisted IP addresses are allowed to connect to this web service. The source IP of the request (from Looker) is available as a parameter in the message. I used a MATCH_REGEX filter to limit the allowed IPs to the Looker US IP addresses.

Select the input: ${request.remoteIp}
All parameters are available by typing $.

String: MATCH_REGEX, ^(54\.208\.10\.167|54\.209\.116\.191|52\.1\.5\.228|52\.1\.157\.156|54\.83\.113\.5)$
This is an alternation (OR) regex that allows any one of the five Looker US IP addresses; the dots are escaped so they match literal periods.
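
If you want to sanity-check the pattern outside PipeGears, the same allow-list check looks like this in Python (a sketch using the same five IPs):

import re

# Allow-list of the five Looker US IPs; the dots are escaped so they match
# literal periods rather than any character
LOOKER_IPS = re.compile(
    r"^(54\.208\.10\.167|54\.209\.116\.191|52\.1\.5\.228|52\.1\.157\.156|54\.83\.113\.5)$"
)

def is_allowed(remote_ip: str) -> bool:
    return LOOKER_IPS.match(remote_ip) is not None

print(is_allowed("54.208.10.167"))  # True
print(is_allowed("203.0.113.10"))   # False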

4. auth_code (Filter)


For additional security, I pass a parameter called auth_code from the Looker Data Action. The PipeGears workflow checks that this parameter is included in the posted JSON message with the correct value.

Select the input: ${request.body.json.data.auth_code}

String: EQUALS, abc123456
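
In Python terms, this is just an equality check on the auth_code field of the posted JSON (a sketch; the expected value is the one set in the LookML param above):

def auth_code_ok(payload: dict, expected: str = "abc123456") -> bool:
    # Mirrors the EQUALS filter on ${request.body.json.data.auth_code}
    return payload.get("data", {}).get("auth_code") == expected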

5. Update State (SQL Query)


The SQL Query updates the State record in the Snowflake database (defined in Step 1).

SQL DataSource: Snowflake (from Step 1)

Query: A simple update query with parameters injected from the Data Action JSON.

UPDATE LOOKER_SCRATCH.STATE_LKP
SET STATE_CODE = '${request.body.json.form_params.new_state_code}'
WHERE STATE_ID = ${request.body.json.data.state_id};
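
The ${...} placeholders above are substituted directly into the SQL text from the (filtered) request. For comparison, here is a hedged sketch of the same update done outside PipeGears with the snowflake-connector-python package and bound parameters; the connection values mirror the JDBC URL from Step 1 and the password is a placeholder.

import snowflake.connector

# Connection details correspond to the JDBC URL in Step 1 (placeholder values)
conn = snowflake.connector.connect(
    account="youraccount",
    user="theuser",
    password="yourpassword",
    warehouse="thewh",
    database="thedb",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Bound parameters keep the request values out of the SQL text itself
    cur.execute(
        "UPDATE LOOKER_SCRATCH.STATE_LKP SET STATE_CODE = %(code)s "
        "WHERE STATE_ID = %(id)s",
        {"code": "CT", "id": 7},
    )
    cur.close()
finally:
    conn.close()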

6. Success Reply (Web Reply)


The Success Reply step responds to Looker to tell it that the update was successful and to refresh the query and visualization.

Status: 200 (This means successful in HTTP parlance)

Header: Content-Type: application/json

Body: The body of the JSON message is shown below.

{
  "looker": {
    "success": true,
    "refresh_query": true
  }
}

7. Error Reply (Web Reply)


The Error Reply step responds to Looker to tell it that the request failed (because of an incorrect source IP, an incorrect auth_code, or a failed SQL update).

Status: 400 (This means Bad Request in HTTP parlance)

Header: Content-Type: application/json

Body: The body of the JSON message is shown below.

{
  "looker": {
    "success": false,
    "refresh_query": false
  }
}
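
If you were building the same endpoint outside PipeGears, the replies are simply these JSON bodies returned with the matching HTTP status codes. A small sketch of a helper (hypothetical, just to show the shape Looker expects):

import json

def looker_reply(success: bool):
    """Return the HTTP status and JSON body Looker expects from a Data Action."""
    body = {"looker": {"success": success, "refresh_query": success}}
    return (200 if success else 400), json.dumps(body)

print(looker_reply(True))   # (200, '{"looker": {"success": true, "refresh_query": true}}')
print(looker_reply(False))  # (400, '{"looker": {"success": false, "refresh_query": false}}')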

PipeGears Runtime Console


Once your workflow is built and all of the steps are connected, it is time to test your Data Actions and the workflow to update Snowflake. In the upper right corner of the workflow, click the power button to start the workflow.

Once the workflow is started, click the >_ icon in the upper right of each of the steps. This will open a separate browser tab with the console for each step. The console will be relatively empty initially.

Then go back into Looker (on another tab) to your Explore/Look. Select a record and run through the Data Action: Update State Code, enter a value and Submit. Hopefully, the Data Action succeeds and the query/visualization is refreshed with the updated data. If not, you will need to troubleshoot the workflow and the consoles can help you out.

Next, return to your consoles (one browser tab for each step), review what each step received, and check for any errors. You can see the JSON message with all of the headers and information sent by Looker. The console can also be helpful in finding the names of parameters that you may want to use in your other workflow steps.

Summary
Putting it all together, we now have a LookML Data Action and a PipeGears workflow that allow a user to update the Snowflake database from within a Looker Explore or Look. As you can see, it is a fairly easy process to set up, and it lets users interact with and update their data directly from their Looker BI and analytics Explores.
I hope you find this useful. Please comment and let me know your questions and how you are using Data Actions.

References:
Data Actions (Discourse Article)
Using Zapier With Data Actions (Discourse Article)
Looker: Updating User Attributes from a Dashboard with Data Actions, Looker API, Integromat
Data Actions: Take Action On What You Learn (Video)


This is awesome! We hope to push our data to Snowflake later this year and a write-back strategy is one of the reasons why. Having the “how” so beautifully laid out is a major help, thanks so much, Jeff!


Please vote for this Feature Request:
Looker Actions: Database Write-back
This would be so much better and easier if a Data Action could write-back to the database using the existing database connection.

Jeff, conceptually I very much agree with you; however, I have concerns about the potential for table locks, etc. I could see the impact of a table lock on a regularly used data warehouse table being quite substantial.