How to use a Spark SQL query to filter on a Chinese column name?

I am running the Spark SQL below, and it returns all the data:

scala> spark.sql("select * from t1").show()
+------+----+-------+
|  名稱|年齡|address|
+------+----+-------+
|jeremy|  33| Taipei|
|  Mary|  18| Taipei|
|  John|  28|    XXX|
|  大明|  29|    YYY|
|  小黃|  19|    ZZZ|
+------+----+-------+

But when I add a filter on the column named 名稱, Spark SQL fails to parse it:

scala> spark.sql("select * from t1 where 名稱=='jeremy'").show()
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '名' expecting <EOF>(line 1, pos 23)

== SQL ==
select * from t1 where 名稱=='jeremy'
-----------------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:643)
  ... 49 elided


Wrapping the name in single quotes parses, but returns no rows:

scala> spark.sql("select * from t1 where '名稱'=='jeremy'").show()
+----+----+-------+
|名稱|年齡|address|
+----+----+-------+
+----+----+-------+

Does anyone know how to do this?

Thanks.

You need to wrap the column name in backticks (`), which is how Spark SQL quotes identifiers containing non-ASCII or special characters. Note that '名稱' in single quotes is a string literal, not a column reference, so your second query compared the constant string '名稱' to 'jeremy' and matched nothing.

spark.sql("select * from t1 where `名稱`=='jeremy'").show()
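Alternatively, as a sketch, you can sidestep the SQL parser entirely with the DataFrame API, where `col` takes the column name as a plain string with no quoting needed (this assumes the same `spark` session and table `t1` from the question):

```scala
import org.apache.spark.sql.functions.col

// Load the table from the question and filter on the Chinese column name.
// col("名稱") accepts the raw name; no backticks are required here.
val df = spark.table("t1")
df.filter(col("名稱") === "jeremy").show()
```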