Google Cloud Dataflow JavaScript UDF error when parsing JSON
I have been using the Pub/Sub to BigQuery template to stream JSON data that is sent to a Pub/Sub topic. With Dataflow, I want to flatten the data to match the BigQuery schema and stream it into the table.
This is the JavaScript UDF for the Dataflow job:
function transform(inJson) {
  var obj = JSON.parse(inJson);
  // variable declarations
  // ...
  data['domain'] = obj['data']['domain']; // line 18
  // ...
  return JSON.stringify(data);
}
I have also tried:
data.domain = obj.data.domain;
I simply copied the example from here and extended it to flatten the JSON data.
The error message is as follows:
TypeError: Cannot read property "domain" from undefined in <eval> at line number 18
And the stack trace:
javax.script.ScriptException: TypeError: Cannot read property "domain" from undefined in <eval> at line number 18
at jdk.nashorn.api.scripting.NashornScriptEngine.throwAsScriptException(NashornScriptEngine.java:470)
at jdk.nashorn.api.scripting.NashornScriptEngine.invokeImpl(NashornScriptEngine.java:392)
at jdk.nashorn.api.scripting.NashornScriptEngine.invokeFunction(NashornScriptEngine.java:190)
at com.google.cloud.teleport.templates.common.JavascriptTextTransformer$JavascriptRuntime.invoke(JavascriptTextTransformer.java:156)
at com.google.cloud.teleport.templates.common.JavascriptTextTransformer$FailsafeJavascriptUdf.processElement(JavascriptTextTransformer.java:315)
at com.google.cloud.teleport.templates.common.JavascriptTextTransformer$FailsafeJavascriptUdf$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:325)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access0(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:609)
at com.google.cloud.teleport.templates.PubSubToBigQuery$PubsubMessageToFailsafeElementFn.processElement(PubSubToBigQuery.java:412)
at com.google.cloud.teleport.templates.PubSubToBigQuery$PubsubMessageToFailsafeElementFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:325)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access0(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:71)
at org.apache.beam.sdk.transforms.MapElements.processElement(MapElements.java:122)
at org.apache.beam.sdk.transforms.MapElements$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:325)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:76)
at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1233)
at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access00(StreamingDataflowWorker.java:144)
at org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.run(StreamingDataflowWorker.java:972)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: <eval>:18 TypeError: Cannot read property "domain" from undefined
at jdk.nashorn.internal.runtime.ECMAErrors.error(ECMAErrors.java:57)
at jdk.nashorn.internal.runtime.ECMAErrors.typeError(ECMAErrors.java:213)
at jdk.nashorn.internal.runtime.ECMAErrors.typeError(ECMAErrors.java:185)
at jdk.nashorn.internal.runtime.ECMAErrors.typeError(ECMAErrors.java:172)
at jdk.nashorn.internal.runtime.Undefined.get(Undefined.java:157)
at jdk.nashorn.internal.scripts.Script$Recompilation67A$\^eval\_.transform(<eval>:18)
at jdk.nashorn.internal.runtime.ScriptFunctionData.invoke(ScriptFunctionData.java:639)
at jdk.nashorn.internal.runtime.ScriptFunction.invoke(ScriptFunction.java:494)
at jdk.nashorn.internal.runtime.ScriptRuntime.apply(ScriptRuntime.java:393)
at jdk.nashorn.api.scripting.ScriptObjectMirror.callMember(ScriptObjectMirror.java:199)
at jdk.nashorn.api.scripting.NashornScriptEngine.invokeImpl(NashornScriptEngine.java:386)
... 42 more
When I tried the JavaScript locally by passing in some sample data, it worked as expected without any errors.
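Since the function works locally with sample data, the payload inside the pipeline is presumably not what the function expects. A minimal diagnostic sketch (my own debugging idea, not the template's code) that surfaces the raw payload whenever obj.data is missing:

function transform(inJson) {
  var obj = JSON.parse(inJson);
  if (!obj || typeof obj.data === 'undefined') {
    // Include the raw payload in the error message so it is visible wherever
    // the pipeline reports UDF failures.
    throw new Error('Unexpected payload: ' + inJson);
  }
  var data = {};
  data['domain'] = obj['data']['domain'];
  // ... rest of the flattening as in the original UDF ...
  return JSON.stringify(data);
}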
Update
It turned out that Pub/Sub delivered the data wrapped in " characters, so I had to strip them from the beginning and end of the string. In addition, every " inside the JSON was escaped with \, so I had to remove those as well before it would run without errors.
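For reference, the kind of cleanup described above looks roughly like the sketch below; the exact stripping depends on how the publisher encodes the message, so treat it as an illustration rather than the exact code I run:

function transform(inJson) {
  var raw = inJson;
  // If the message arrives as a JSON-encoded string, e.g. "{\"data\":{...}}",
  // drop the surrounding quotes and unescape the inner quotes before parsing.
  if (raw.charAt(0) === '"' && raw.charAt(raw.length - 1) === '"') {
    raw = raw.substring(1, raw.length - 1).replace(/\\"/g, '"');
  }
  var obj = JSON.parse(raw);
  var data = {};
  data['domain'] = obj['data']['domain'];
  // ... rest of the flattening ...
  return JSON.stringify(data);
}

An equivalent approach would be to call JSON.parse once and, if the result is still a string, parse it a second time.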
It looks like there is an error in your JavaScript code.
data['domain'] = obj['data']['domain']; // line 18
throws the following error:
TypeError: Cannot read property "domain" from undefined in <eval> at line number 18
If the variable is not being reinitialized somewhere, you could try putting the JSON on a single line when publishing to Pub/Sub, i.e.
{"name":"Sam","age":21}