json4s - overloaded method value parse with alternatives
I have a Spark project that uses json4s. It works fine when submitted normally, but I get an error when I try to parse JSON from the spark shell. The simplest example from the json4s readme (which is exactly how it is used in the project) throws an exception:
spark2-shell [options] --jars my-assembled.jar
scala> import org.json4s._
scala> import org.json4s.native.JsonMethods._
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """)
<console>:30: error: overloaded method value parse with alternatives:
(in: org.json4s.JsonInput,useBigDecimalForDouble: Boolean,useBigIntForLong: Boolean)org.json4s.JValue <and>
(in: org.json4s.JsonInput,useBigDecimalForDouble: Boolean)org.json4s.JValue
cannot be applied to (String)
Strangely, supplying explicit arguments for the defaults makes this work:
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, false, true)
res2: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, true, true)
res3: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
but this does not:
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, false, false)
java.lang.NoSuchMethodError: org.json4s.package$.JLong()Lorg/json4s/JsonAST$JLong$;
at org.json4s.native.JsonParser$$anonfun.apply(JsonParser.scala:194)
at org.json4s.native.JsonParser$$anonfun.apply(JsonParser.scala:145)
at org.json4s.native.JsonParser$.parse(JsonParser.scala:133)
at org.json4s.native.JsonParser$.parse(JsonParser.scala:71)
at org.json4s.native.JsonMethods$class.parse(JsonMethods.scala:10)
at org.json4s.native.JsonMethods$.parse(JsonMethods.scala:63)
... 53 elided
I also checked it with the Ammonite REPL, without Spark:
@ import $ivy.`org.json4s:json4s-native_2.12:3.6.10`
@ import org.json4s._
@ import org.json4s.native.JsonMethods._
@ parse(""" { "numbers" : [1, 2, 3, 4] } """)
res3: JValue = JObject(List(("numbers", JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
Could this be a Scala version issue (Spark 2.3 runs on Scala 2.11.2, while the Ammonite example runs on 2.12.8)? I checked several json4s versions between 3.3.0 and 3.6.10.
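A quick way to see which json4s binary the shell actually loaded is to ask the classloader where the JValue class came from; a small diagnostic sketch (plain JVM API, nothing json4s-specific):
// Prints the URL of the jar that the JValue class was loaded from,
// which shows whether Spark's own json4s or the one from --jars won.
classOf[org.json4s.JValue].getProtectionDomain.getCodeSource.getLocation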
This is because of binary incompatibility.
https://github.com/json4s/json4s/issues/316
Spark 2.3.0 depends on json4s-jackson_2.11-3.2.11, but you are trying to use an incompatible version of json4s-native. So remove json4s from --jars, import org.json4s.jackson.JsonMethods._ instead of org.json4s.native.JsonMethods._, and drop the third argument to parse (in json4s 3.2.11 there is no useBigIntForLong parameter).
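In project code the compatible call would look roughly like this (a minimal sketch against the json4s 3.2.11 bundled with Spark 2.3.0, reusing the readme example):
import org.json4s._
import org.json4s.jackson.JsonMethods._

// With the json4s 3.2.11 shipped by Spark, parse takes the input and, optionally,
// a flag for parsing doubles as BigDecimal; there is no useBigIntForLong parameter.
val json: JValue = parse(""" { "numbers" : [1, 2, 3, 4] } """)
val jsonBigDecimal: JValue = parse(""" { "numbers" : [1, 2, 3, 4] } """, true)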
Then
~/spark-2.3.0-bin-hadoop2.7/bin$ ./spark-shell --jars json4s-native_2.11-3.6.10.jar,json4s-ast_2.11-3.6.10.jar,json4s-core_2.11-3.6.10.jar,json4s-scalap_2.11-3.6.10.jar,paranamer-2.8.jar
2020-11-30 05:44:37 WARN Utils:66 - Your hostname, dmitin-HP-Pavilion-Laptop resolves to a loopback address: 127.0.1.1; using 192.168.0.103 instead (on interface wlo1)
2020-11-30 05:44:37 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2020-11-30 05:44:37 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.103:4040
Spark context available as 'sc' (master = local[*], app id = local-1606707882568).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.3.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.json4s._
import org.json4s._
scala> import org.json4s.native.JsonMethods._
import org.json4s.native.JsonMethods._
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """)
<console>:30: error: overloaded method value parse with alternatives:
(in: org.json4s.JsonInput,useBigDecimalForDouble: Boolean,useBigIntForLong: Boolean)org.json4s.JValue <and>
(in: org.json4s.JsonInput,useBigDecimalForDouble: Boolean)org.json4s.JValue
cannot be applied to (String)
parse(""" { "numbers" : [1, 2, 3, 4] } """)
^
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, false, true)
res1: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, true, true)
res2: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, false, false)
java.lang.NoSuchMethodError: org.json4s.package$.JLong()Lorg/json4s/JsonAST$JLong$;
at org.json4s.native.JsonParser$$anonfun.apply(JsonParser.scala:194)
at org.json4s.native.JsonParser$$anonfun.apply(JsonParser.scala:145)
at org.json4s.native.JsonParser$.parse(JsonParser.scala:133)
at org.json4s.native.JsonParser$.parse(JsonParser.scala:71)
at org.json4s.native.JsonMethods$class.parse(JsonMethods.scala:10)
at org.json4s.native.JsonMethods$.parse(JsonMethods.scala:63)
... 53 elided
will change to
~/spark-2.3.0-bin-hadoop2.7/bin$ ./spark-shell
2020-11-30 06:27:59 WARN Utils:66 - Your hostname, dmitin-HP-Pavilion-Laptop resolves to a loopback address: 127.0.1.1; using 192.168.0.103 instead (on interface wlo1)
2020-11-30 06:27:59 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2020-11-30 06:27:59 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.103:4040
Spark context available as 'sc' (master = local[*], app id = local-1606710484369).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.3.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.json4s._
import org.json4s._
scala> import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.JsonMethods._
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """)
res0: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, true)
res1: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
scala> parse(""" { "numbers" : [1, 2, 3, 4] } """, false)
res2: org.json4s.JValue = JObject(List((numbers,JArray(List(JInt(1), JInt(2), JInt(3), JInt(4))))))
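Once parse resolves against the bundled json4s, extracting typed values from the result works as usual; a minimal sketch, assuming the standard DefaultFormats:
import org.json4s._
import org.json4s.jackson.JsonMethods._

implicit val formats: Formats = DefaultFormats

// Navigate to the "numbers" field and extract it as a typed Scala list.
val numbers: List[Int] = (parse(""" { "numbers" : [1, 2, 3, 4] } """) \ "numbers").extract[List[Int]]
// expected: List(1, 2, 3, 4)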