How to escape a single quote in Spark SQL

I am new to PySpark and SQL. I am working on the following query:

sqlContext.sql("Select Crime_type, substring(Location,11,100) as Location_where_crime_happened, count(*) as Count\
                            From street_SQLTB\
                            where LSOA_name = 'City of London 001F' and \
                            group by Location_where_crime_happened, Crime_type\
                            having Location_where_crime_happened = 'Alderman'S Walk'")

I am struggling with the single quote. I need to apply a filter on Alderman'S Walk. This is probably simple, but I can't figure it out. Any help is much appreciated.

Try this:

import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('SparkByExamples.com').getOrCreate()
simpleData = [("James","Sales","NY",90000,34,10000), \
    ("Michael","Sales","NY",86000,56,20000), \
    ("Robert","Sales","CA",81000,30,23000), \
    ("Maria","Alderman'S Walk","CA",90000,24,23000) \
  ]
columns= ["employee_name","department","state","salary","age","bonus"]
df1 = spark.createDataFrame(data = simpleData, schema = columns)
df1.createOrReplaceTempView('temp') 

# Option 1: wrap the literal in double quotes, so the embedded single quote needs no escaping
df = spark.sql("""select * from temp where department = "Alderman'S Walk" """)
display(df)

# Option 2: escape the single quote with a backslash
# (written \\' in Python so that Spark SQL actually receives \')
df = spark.sql("select * from temp where department = 'Alderman\\'S Walk'")
display(df)

Filtered output: