Issue deserializing protobuf events in Apache Flink
I'm reading events from Kinesis in my Flink application. The events are in protobuf format. If I use 'com.google.protobuf:protobuf-java:3.7.1' in the Flink application, I have no problems. However, if I change it to 'com.google.protobuf:protobuf-java:3.10.0', I get the following stack trace:
java.lang.IncompatibleClassChangeError: class com.google.protobuf.Descriptors$OneofDescriptor has interface com.google.protobuf.Descriptors$GenericDescriptor as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access0(URLClassLoader.java:74)
at java.net.URLClassLoader.run(URLClassLoader.java:369)
at java.net.URLClassLoader.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetPublicMethods(Class.java:2902)
at java.lang.Class.privateGetPublicMethods(Class.java:2917)
at java.lang.Class.getMethods(Class.java:1615)
at org.apache.flink.api.java.typeutils.TypeExtractor.isValidPojoField(TypeExtractor.java:1786)
at org.apache.flink.api.java.typeutils.TypeExtractor.analyzePojo(TypeExtractor.java:1856)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1746)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1643)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:921)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:781)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:735)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:731)
at org.apache.flink.api.common.typeinfo.TypeInformation.of(TypeInformation.java:211)
at org.apache.flink.api.java.typeutils.ListTypeInfo.<init>(ListTypeInfo.java:45)
at com.bagi.streaming.serialization.ProtoSchema.getProducedType(ProtoSchema.java:40)
at org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper.getProducedType(KinesisDeserializationSchemaWrapper.java:57)
at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.getProducedType(FlinkKinesisConsumer.java:363)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1456)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1414)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1396)
at com.bagi.streaming.StreamProcessor.getKinesisTrackingStream(StreamProcessor.java:101)
at com.bagi.streaming.StreamProcessor.getKinesisTrackingStream(StreamProcessor.java:110)
at com.bagi.streaming.StreamProcessor.consumeKinesis(StreamProcessor.java:117)
at com.bagi.streaming.StreamProcessor.main(StreamProcessor.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:423)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
at org.apache.flink.client.cli.CliFrontend.lambda$main(CliFrontend.java:1126)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
I'm using flink@1.8.0 and 'com.twitter:chill-protobuf:0.9.3'. I'm building the Flink application jar locally on a Mac. I have also tried running protoc at both 3.10.0 and 3.7.1 against protobuf-java 3.10.0, just in case.
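For completeness, a rough sketch of what that setup implies as Gradle dependency declarations (the dependency scopes and the Scala suffix on the Flink artifact are assumptions, and how the Kinesis connector is pulled in is left out here):

// build.gradle -- rough sketch of the dependencies described above
dependencies {
    compileOnly 'org.apache.flink:flink-streaming-java_2.11:1.8.0'
    implementation 'com.google.protobuf:protobuf-java:3.10.0'  // switching this back to 3.7.1 avoids the error
    implementation 'com.twitter:chill-protobuf:0.9.3'          // Kryo serializer support for protobuf messages
}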
Here is my deserializer:
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.LinkedList;
import java.util.List;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.ListTypeInfo;

public class ProtoSchema implements DeserializationSchema<List<Event>> {

    @Override
    public List<Event> deserialize(byte[] message) throws IOException {
        List<Event> events = new LinkedList<>();
        InputStream inputStream = new ByteArrayInputStream(message);
        // Read length-delimited Event messages until the stream is exhausted
        // (parseDelimitedFrom returns null at end of input).
        while (true) {
            Event event = Event.parseDelimitedFrom(inputStream);
            if (event != null) {
                events.add(event);
            } else {
                break;
            }
        }
        return events;
    }

    @Override
    public boolean isEndOfStream(List<Event> nextElement) {
        return false;
    }

    @Override
    public TypeInformation<List<Event>> getProducedType() {
        return new ListTypeInfo<>(Event.class);
    }
}
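As a sanity check outside Flink, the length-delimited framing can be round-tripped through the schema. This is a minimal sketch, assuming Event is the protobuf-generated class and that an Event with no fields set can be built:

import java.io.ByteArrayOutputStream;
import java.util.List;

public class ProtoSchemaSmokeTest {
    public static void main(String[] args) throws Exception {
        // Write two length-delimited Event messages -- the same framing that
        // ProtoSchema.deserialize reads back with parseDelimitedFrom.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Event.newBuilder().build().writeDelimitedTo(out);
        Event.newBuilder().build().writeDelimitedTo(out);

        List<Event> events = new ProtoSchema().deserialize(out.toByteArray());
        System.out.println("parsed " + events.size() + " events"); // expect 2
    }
}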
I'm plugging it in via:
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties consumerConfig = new Properties();
consumerConfig.put(AWSConfigConstants.AWS_CREDENTIALS_PROVIDER, "AUTO");
consumerConfig.put(AWSConfigConstants.AWS_REGION, region);
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_INTERVAL_MILLIS, "300");
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_RETRIES, "10");
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_MAX, "5000");
consumerConfig.put(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");
env.addSource(new FlinkKinesisConsumer<>(name, new ProtoSchema(), consumerConfig)).name("KinesisSource");
env.getConfig().registerTypeWithKryoSerializer(Event.class, ProtobufSerializer.class);
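For context, a minimal sketch of how the source is then consumed downstream (the print sink and job name are placeholders). Note that, per the stack trace above, the failure is already thrown inside addSource when Flink asks ProtoSchema for its produced type:

// Capture the source as a DataStream<List<Event>>; the error surfaces before
// any records are read, while getProducedType() builds the ListTypeInfo.
DataStream<List<Event>> events = env.addSource(
        new FlinkKinesisConsumer<>(name, new ProtoSchema(), consumerConfig))
    .name("KinesisSource");

events.print();                       // placeholder sink
env.execute("kinesis-protobuf-job");  // placeholder job name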
Event.class is compiled from the protobuf schema with protoc@3.10.0 and protobuf-java@3.10.0.
As mentioned in the comments, protobuf-java:3.9.0 introduced a binary-incompatible change relative to lower versions (3.8 and below): the class Descriptors.OneofDescriptor gained the super class Descriptors.GenericDescriptor, which means that static fields from a client class's super-interfaces may hide fields of the same name inherited from the new super class and cause an IncompatibleClassChangeError.
So if you have protobuf-java:3.9.0+ on the classpath and some lower version (3.8 or below) also loads this class, you will get this error. (In my case it came from Hadoop, which ships protobuf-java 2.5, while my fat jar had 3.10.)
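To see which jar actually wins at runtime, a small diagnostic (plain JDK calls, nothing library-specific) can print where the conflicting class is loaded from:

public class ProtobufOrigin {
    public static void main(String[] args) throws Exception {
        // Prints the jar that provides the class named in the IncompatibleClassChangeError,
        // which shows whether Hadoop's old protobuf-java or the fat jar's 3.10 is being used.
        Class<?> c = Class.forName("com.google.protobuf.Descriptors$GenericDescriptor");
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}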
Solutions:
- Shade one of the incompatible protobuf-java dependencies (see the Gradle sketch below; more: how to shade a dependency with Gradle).
- Or stay on version 3.8 or lower as a temporary, short-sighted fix.
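A minimal sketch of the shading approach with the Gradle Shadow plugin (the plugin version and the relocated package prefix are assumptions; adjust them to your build):

// build.gradle -- relocate protobuf inside the fat jar so it cannot clash with the
// protobuf-java 2.5 that Hadoop already puts on the cluster classpath
plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '5.2.0'
}

shadowJar {
    relocate 'com.google.protobuf', 'com.bagi.shaded.com.google.protobuf'
}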