pyspark: StopWordsRemover parameter locale given invalid value

I have loaded several text files into a DataFrame using pyspark and split them into words; now I want to use StopWordsRemover to filter out the stop words.
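For reference, here is a minimal sketch of what I am doing up to that point (the file path and column names are illustrative):

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import StopWordsRemover, Tokenizer

spark = SparkSession.builder.getOrCreate()

# Load the text files; spark.read.text gives one row per line,
# in a column named "value"
df = spark.read.text("data/*.txt")

# Split each line into a list of words
tokenizer = Tokenizer(inputCol="value", outputCol="words")
words_df = tokenizer.transform(df)

# This is the line that fails:
remover = StopWordsRemover(inputCol="words", outputCol="filtered")
```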

However, when I try to instantiate the StopWordsRemover class, it fails with this error:

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
/usr/local/Cellar/apache-spark/2.4.0/libexec/python/pyspark/sql/utils.py in deco(*a, **kw)
     62         try:
---> 63             return f(*a, **kw)
     64         except py4j.protocol.Py4JJavaError as e:

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    327                     "An error occurred while calling {0}{1}{2}.\n".
--> 328                     format(target_id, ".", name), value)
    329             else:

Py4JJavaError: An error occurred while calling None.org.apache.spark.ml.feature.StopWordsRemover.
: java.lang.IllegalArgumentException: StopWordsRemover_daf8924a73f7 parameter locale given invalid value pl_US.
    at org.apache.spark.ml.param.Param.validate(params.scala:77)
    at org.apache.spark.ml.param.ParamPair.<init>(params.scala:656)
    at org.apache.spark.ml.param.Param.$minus$greater(params.scala:87)
    at org.apache.spark.ml.feature.StopWordsRemover.<init>(StopWordsRemover.scala:109)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)


During handling of the above exception, another exception occurred:

IllegalArgumentException                  Traceback (most recent call last)
<ipython-input-17-3dbcf7d12cb6> in <module>
----> 1 remover = StopWordsRemover(inputCol="words", outputCol="filtered")

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/pyspark/__init__.py in wrapper(self, *args, **kwargs)
    108             raise TypeError("Method %s forces keyword arguments." % func.__name__)
    109         self._input_kwargs = kwargs
--> 110         return func(self, **kwargs)
    111     return wrapper
    112 

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/pyspark/ml/feature.py in __init__(self, inputCol, outputCol, stopWords, caseSensitive, locale)
   2595         super(StopWordsRemover, self).__init__()
   2596         self._java_obj = self._new_java_obj("org.apache.spark.ml.feature.StopWordsRemover",
-> 2597                                             self.uid)
   2598         self._setDefault(stopWords=StopWordsRemover.loadDefaultStopWords("english"),
   2599                          caseSensitive=False, locale=self._java_obj.getLocale())

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/pyspark/ml/wrapper.py in _new_java_obj(java_class, *args)
     65             java_obj = getattr(java_obj, name)
     66         java_args = [_py2java(sc, arg) for arg in args]
---> 67         return java_obj(*java_args)
     68 
     69     @staticmethod

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1523         answer = self._gateway_client.send_command(command)
   1524         return_value = get_return_value(
-> 1525             answer, self._gateway_client, None, self._fqn)
   1526 
   1527         for temp_arg in temp_args:

/usr/local/Cellar/apache-spark/2.4.0/libexec/python/pyspark/sql/utils.py in deco(*a, **kw)
     77                 raise QueryExecutionException(s.split(': ', 1)[1], stackTrace)
     78             if s.startswith('java.lang.IllegalArgumentException: '):
---> 79                 raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
     80             raise
     81     return deco

IllegalArgumentException: 'StopWordsRemover_daf8924a73f7 parameter locale given invalid value pl_US.'

I have tried setting the locale parameter to "en_US", and also passing a list of stop words as done here - pyspark : how to configure StopWordsRemover with french language on spark 1.6.3

I am running Spark v2.4.0.

For me, setting the JVM arguments to the correct country and language solved the problem:

-Duser.country=US -Duser.language=en
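If you launch Spark from a plain Python script, one way to pass these flags (a sketch; the key point is that driver JVM options only take effect if they are set before the JVM starts) is via the PYSPARK_SUBMIT_ARGS environment variable:

```python
import os

# The driver JVM reads these options only at launch, so set them before
# the first SparkSession/SparkContext is created (in a notebook, restart
# the kernel first so no JVM is already running).
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--driver-java-options '-Duser.country=US -Duser.language=en' pyspark-shell"
)
```

After this, create the SparkSession as usual and StopWordsRemover should pick up a valid en_US default locale.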

Adding the following code before using StopWordsRemover solved the problem for me:

locale = sc._jvm.java.util.Locale
locale.setDefault(locale.forLanguageTag("en-US"))
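In context, the full fix looks roughly like this (a sketch assuming `sc` is the active SparkContext and a `words_df` DataFrame with a `words` column, as in the question):

```python
from pyspark.ml.feature import StopWordsRemover

# Force a valid JVM default locale before the constructor reads it
# (in pyspark 2.4.0, __init__ calls getLocale() on the Java object)
locale = sc._jvm.java.util.Locale
locale.setDefault(locale.forLanguageTag("en-US"))

remover = StopWordsRemover(inputCol="words", outputCol="filtered")
filtered_df = remover.transform(words_df)
```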

By the way, my pyspark version is 2.4.0.