How to use a filter in Spark SQL

I wrote the following SQL:

select count(value) as total, name, window from `event`
where count(value) > 1
group by window(event_time, '2 minutes'), name

Spark throws the following error:

Aggregate/Window/Generate expressions are not valid in where clause of the query.
Expression in where clause: [(count(event.`value`) > CAST(1 AS BIGINT))]
Invalid expressions: [count(event.`value`)]

What is the correct syntax?

You need to use HAVING instead (see the documentation), and it must be placed after the GROUP BY clause:
select count(value) as total, name, window from `event`
group by window(event_time, '2 minutes'), name
having total > 1
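
For context, here is a minimal Scala sketch that runs the corrected query through spark.sql. The SparkSession setup, the events.json source file, and the (event_time, name, value) schema are assumptions for illustration; only the query itself comes from the answer above.

import org.apache.spark.sql.SparkSession

object EventWindowCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EventWindowCount")
      .master("local[*]")
      .getOrCreate()

    // Hypothetical source file; any DataFrame with event_time, name and value columns works.
    val events = spark.read.json("events.json")
    events.createOrReplaceTempView("event")

    // HAVING is evaluated after GROUP BY, so the aggregated count(value)
    // (aliased as total) can be filtered here, unlike in WHERE.
    val result = spark.sql(
      """select count(value) as total, name, window from `event`
        |group by window(event_time, '2 minutes'), name
        |having total > 1""".stripMargin)

    result.show(truncate = false)
    spark.stop()
  }
}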