# [Retired] Calculating the difference between dates

#1

The content of this article has been moved to Looker’s Help Center. We have articles on calculating the difference between dates in LookML or in table calculations.

12 Likes

(Gil Margolin) #2

I would like to be able to group the timediff into timeframes: {minutes, days, months}. Is this possible?

0 Likes

#4

This isn’t currently possible, but we are thinking about ways to do custom grouping of dimensions (similar to how timeframes are grouped) in the field picker. So this will fall under that in the future!

0 Likes

(Lawrence) #5

Is it possible to use DATEDIFF or a date operator in a table calculation?

1 Like

#6

Date operations aren’t currently possible in table calculations, but I’ll pass this on to our product team! Table calcs are still in beta, so we’re looking for great additions.

0 Likes

#7

As of Looker 3.30, table calculations now have date operations! Read more about these here.

0 Likes

(AJ) #8

Is there any way to do something like this but between rows? I.e., the time between the 1st arrival date and the 2nd arrival date in the scenario above.

0 Likes

(lloyd tabb) #9

Yes for sure. This article talks about generating sequence numbers for transactions:

Instead of a sequence number, you can compute the previous order date as an attribute and use the technique above. Note that this example uses MySQL syntax.

```
- dimension_group: previous_order
  type: time
  timeframes: [date, week, month, year]
  sql: |
    (
      SELECT o.created_at
      FROM orders o
      WHERE o.id < ${TABLE}.id
        AND o.user_id = ${TABLE}.user_id
      ORDER BY o.created_at DESC
      LIMIT 1
    )
```
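
The correlated-subquery technique can be exercised outside LookML as well. A minimal Python/sqlite3 sketch (the `orders` table and its data are illustrative, not from the thread) that finds each order's previous order date per user:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, created_at TEXT);
INSERT INTO orders VALUES
  (1, 1, '2015-02-01'),
  (2, 1, '2015-02-04'),
  (3, 2, '2015-02-02'),
  (4, 1, '2015-02-10');
""")

# Correlated subquery: for each order, the created_at of the same user's
# most recent earlier order (NULL for a user's first order).
rows = conn.execute("""
SELECT id,
       (SELECT o.created_at
        FROM orders o
        WHERE o.id < orders.id
          AND o.user_id = orders.user_id
        ORDER BY o.created_at DESC
        LIMIT 1) AS previous_order
FROM orders
ORDER BY id
""").fetchall()
```

Note the strict `<` comparison: with `<=` the subquery would match the current row itself and return its own `created_at`.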

If you are in Postgres, or a dialect that supports window functions, you can build a derived table like the one described in this post:

```
- view: order_user_sequence
  derived_table:
    sql: |
      SELECT
        id AS order_id
        , ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY created_at) AS user_order_sequence_number
        , LAG(created_at) OVER (PARTITION BY user_id ORDER BY created_at) AS previous_order
      FROM orders

  fields:
  - dimension: order_id
    primary_key: true
    sql: ${TABLE}.order_id

  - dimension: user_order_sequence_number
    type: int
    sql: ${TABLE}.user_order_sequence_number

  - dimension_group: previous_order
    type: time
    timeframes: [date, week, month, year]
    sql: ${TABLE}.previous_order
```
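
The window-function version can be sketched the same way; SQLite 3.25+ (bundled with recent Pythons) supports `ROW_NUMBER` and `LAG`, so this illustrative example (same made-up `orders` table) shows what the derived table produces:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, created_at TEXT);
INSERT INTO orders VALUES
  (1, 1, '2015-02-01'),
  (2, 1, '2015-02-04'),
  (3, 2, '2015-02-02');
""")

# ROW_NUMBER gives each user's order sequence; LAG pulls the previous
# order's created_at within the same user partition (NULL for the first).
rows = conn.execute("""
SELECT id AS order_id,
       ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY created_at)
         AS user_order_sequence_number,
       LAG(created_at) OVER (PARTITION BY user_id ORDER BY created_at)
         AS previous_order
FROM orders
ORDER BY id
""").fetchall()
```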

Again, you can use the techniques above to do the date arithmetic.

0 Likes

(Ryan Tuck) #10

The `date_part` function suggested for Postgres doesn’t work on fields that are already dates.

For instance,

```sql
select date_part('day', date('2015-02-02') - date('2015-02-01'))
```

returns an error.

Instead, to get the difference in days between two dates in Postgres:

```sql
select date('2015-02-02') - date('2015-02-01')
```
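
This works because subtracting two Postgres `DATE` values already yields an integer number of days, so no `date_part` extraction is needed. Python's date arithmetic mirrors the same behavior:

```python
from datetime import date

# Subtracting two dates yields a timedelta; .days is the integer day
# difference, matching what Postgres returns for DATE - DATE.
diff = date(2015, 2, 2) - date(2015, 2, 1)
```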
1 Like

(Morgan Imel) #11

Hey Gil!
In 3.44 we released a feature called group_label that allows you to group fields in the field picker of an explore, similar to dimension groups. More info about group_label can be found here.

0 Likes

(Steve Litras) #12

I’m having problems trying to do a date calculation using an AWS Athena table. Is there specific syntax for that?

0 Likes

(rufus) #13

Hi Steve,

AWS Athena uses the Presto functions which are documented here.

So `date_diff(unit, timestamp1, timestamp2)` should work for you.
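
For readers who want to check the semantics, here is a rough Python analogue of Presto's `date_diff` (an illustration only, not the Presto implementation), which returns `timestamp2 - timestamp1` in the requested unit:

```python
from datetime import datetime

def date_diff(unit, ts1, ts2):
    # Mimics Presto's date_diff(unit, timestamp1, timestamp2):
    # the difference ts2 - ts1, truncated toward zero in the given unit.
    seconds_per = {"second": 1, "minute": 60, "hour": 3600, "day": 86400}
    return int((ts2 - ts1).total_seconds() / seconds_per[unit])

hours = date_diff("hour", datetime(2015, 2, 1, 0, 0), datetime(2015, 2, 2, 6, 0))
```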

If you’re running into issues implementing those feel free to email support@looker.com with the details and we’ll be happy to look into it further.

0 Likes

(Steve Litras) #14

Thanks! That solved it for me…

0 Likes

(Jimmy Alexander González Jumbo) #15

UPDATE

```sql
DATE_PART('day', orders.arrival_time - orders.ship_time) * 24 +
DATE_PART('hour', orders.arrival_time - orders.ship_time) * 60 +
DATE_PART('minute', orders.arrival_time - orders.ship_time) * 60 +
DATE_PART('minute', orders.arrival_time - orders.ship_time)
```

change to

```sql
DATE_PART('day', orders.arrival_time - orders.ship_time) * 24 +
DATE_PART('hour', orders.arrival_time - orders.ship_time) * 60 +
DATE_PART('minute', orders.arrival_time - orders.ship_time) * 60 +
DATE_PART('second', orders.arrival_time - orders.ship_time)
```

0 Likes

(Morgan Imel) #16

Thanks for catching that @jagonzalezx! I’ll fix it right now.

0 Likes

(Camille Vernon | Data Operations Analyst) #17

Thanks for posting this, @Ryan_Tuck ! This is the exact solution for which I’ve just spent an hour looking.

0 Likes