Snowflake SQL query for JSON data unix time conversion

Goals: 1) convert the unix timestamp column referenced as `PAYMENT.json_data:date_payment` into a human-readable format, e.g. 01/02/20, and 2) filter it to the past 2 years. Unfortunately, every variation of the code I have tried results in an error such as "SQL compilation error: syntax error line 7 at position 0 unexpected 'PAYMENT'". There is too much data to export and analyze offline, and exports are capped at 100 MB anyway.


SELECT
    PAYMENT.json_data:total,
    PAYMENT.json_data:date_payment,
    CUSTOMER.json_data:name
FROM PAYMENT
RIGHT JOIN CUSTOMER ON customer.json_data:jnid = payment.json_data:customer
LIMIT 2
[![Output Sample][1]][1]

//All Payments in the system
//BROKEN    PAYMENT.json_data:DATE_FORMAT(DATE_ADD(FROM_date_payment(0), interval -315619200 second),'%Y-%m-%d');

with data as (
    select * 
    from values 
        (5671, 1399003200),
        (4500,1540580400) 
        v(total,date_payment)
)
select total, 
     date_payment, 
     to_timestamp(date_payment::number, 0) as datetime_type 
from data
--where datetime_type > '2018-01-01'
;

gives:

TOTAL   DATE_PAYMENT    DATETIME_TYPE
5671    1399003200  2014-05-02 04:00:00.000
4500    1540580400  2018-10-26 19:00:00.000

and when filtered:

TOTAL   DATE_PAYMENT    DATETIME_TYPE
4500    1540580400  2018-10-26 19:00:00.000

So you should use `to_timestamp(<column>, 0)` for epoch values in seconds, and a scale of 3 for millisecond values.
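For example, if `date_payment` had been stored as epoch milliseconds instead, the same instants would convert with scale 3 (a hypothetical variant of the sample above):

```
with data as (
    select *
    from values
        (5671, 1399003200000),  -- same instants as above, in epoch milliseconds
        (4500, 1540580400000)
        v(total, date_payment_ms)
)
select total,
       to_timestamp(date_payment_ms::number, 3) as datetime_type  -- scale 3 = milliseconds
from data;
```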

Then the other gotcha: when pulling the value out of JSON, make sure to cast it to a number, otherwise the variant form of the value may not convert the way you expect..

Then apply the filter.
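To make the filter "the past 2 years" relative to the current date rather than a hardcoded boundary, Snowflake's `DATEADD` can be used (a sketch, reusing the `data` CTE from the sample above):

```
select total,
       to_timestamp(date_payment::number, 0) as datetime_type
from data
where to_timestamp(date_payment::number, 0) >= dateadd(year, -2, current_timestamp());
```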

[Edit] Added explicit example columns.. and properly fake JSON data

WITH payment AS (  
  SELECT parse_json(a) as json_data
  FROM VALUES 
    ('{"customer":1234, "total":5671, "date_payment": 1399003200 }' ),
    ('{"customer":"1234", "total":4500, "date_payment": "1540580400"}' ) 
    v(a)
), customer as (  
SELECT parse_json(b) as json_data
  FROM VALUES
    ('{"jnid":"1234", "name": "Company AAA"}'),
    ('{"jnid":1234, "name": "Company AAA"}')
    c(b)
)
SELECT
    p.json_data:total::number AS total
    ,p.json_data:date_payment::number AS date_payment
    ,c.json_data:name AS customer_name
    --,to_timestamp(p.json_data:date_payment::number, 0) as datetime_type 
FROM PAYMENT AS p
JOIN CUSTOMER AS c
  ON c.json_data:jnid = p.json_data:customer  
WHERE to_timestamp(p.json_data:date_payment::number, 0) >= '2018-07-21'
ORDER BY 3,2;
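For the `01/02/20`-style display asked for in the question, `TO_CHAR` with a format string can render the converted timestamp (format elements are an assumption based on standard Snowflake date formats):

```
select to_char(to_timestamp(1399003200, 0), 'MM/DD/YY');  -- the first sample row, 2014-05-02
```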