DSEGraphFrame unable to launch the graph
I have a DSE graph in production.
I have enabled a node with search and analytics through OpsCenter.
I can successfully launch the Gremlin console and run analytics queries with the following command:
:remote config alias g graphName.a
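For example, a simple OLAP traversal such as the one below (an illustrative sketch, not taken from the original post) completes without error once the alias above is in place:
// Assumes the analytics alias configured above; counts all vertices through the OLAP traversal source.
g.V().count()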
The problem appears when I try to launch a DseGraphFrame query from the Scala Spark console.
I get the same error every time I type the initial command in the Spark shell:
val g = spark.dseGraph("graphName")
com.datastax.driver.core.exceptions.ServerError: An unexpected error occurred server side on /x.x.x.x:9042: Failed to execute method DseGraphRpc.getSchemaBlob
at com.datastax.driver.core.exceptions.ServerError.copy(ServerError.java:54)
at com.datastax.driver.core.exceptions.ServerError.copy(ServerError.java:16)
at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42)
at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232)
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40)
at com.sun.proxy.$Proxy6.execute(Unknown Source)
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40)
at com.sun.proxy.$Proxy7.execute(Unknown Source)
at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42)
at com.datastax.bdp.graph.spark.DseGraphRpc.callGetSchema(DseGraphRpc.java:47)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer.apply(DseGraphFrame.scala:504)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer.apply(DseGraphFrame.scala:504)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo.apply(CassandraConnector.scala:112)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo.apply(CassandraConnector.scala:111)
at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$.getSchemaFromServer(DseGraphFrame.scala:504)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrameBuilder$.apply(DseGraphFrameBuilder.scala:241)
at com.datastax.bdp.graph.spark.graphframe.SparkSessionFunctions.dseGraph(SparkSessionFunctions.scala:20)
... 57 elided
Caused by: com.datastax.driver.core.exceptions.ServerError: An unexpected error occurred server side on /x.x.x.x:9042: Failed to execute method DseGraphRpc.getSchemaBlob
at com.datastax.driver.core.Responses$Error.asException(Responses.java:114)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:498)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1074)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1069)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:902)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:411)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:248)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:934)
at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:405)
at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:310)
at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:140)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:748)
In the Cassandra log file I get this stack trace:
INFO [Native-Transport-Requests-14] 2018-04-11 17:54:47,248 RpcMethod.java:177 - Failed to execute method DseGraphRpc.getSchemaBlob
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
at com.datastax.bdp.util.rpc.RpcMethod.execute(RpcMethod.java:159) ~[dse-core-5.1.1.jar:5.1.1]
at com.datastax.bdp.cassandra.cql3.RpcCallStatement.execute(RpcCallStatement.java:92) [dse-core-5.1.1.jar:5.1.1]
at org.apache.cassandra.cql3.QueryProcessor.processStatement(QueryProcessor.java:218) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at com.datastax.bdp.cassandra.cql3.DseQueryHandler$StatementExecution.execute(DseQueryHandler.java:457) [dse-core-5.1.1.jar:5.1.1]
at com.datastax.bdp.cassandra.cql3.DseQueryHandler$Operation.executeWithTiming(DseQueryHandler.java:369) [dse-core-5.1.1.jar:5.1.1]
at com.datastax.bdp.cassandra.cql3.DseQueryHandler$Operation.executeAndMaybeWriteToAuditLog(DseQueryHandler.java:420) [dse-core-5.1.1.jar:5.1.1]
at com.datastax.bdp.cassandra.cql3.DseQueryHandler.process(DseQueryHandler.java:157) [dse-core-5.1.1.jar:5.1.1]
at com.datastax.bdp.cassandra.cql3.DseQueryHandler.process(DseQueryHandler.java:109) [dse-core-5.1.1.jar:5.1.1]
at org.apache.cassandra.transport.messages.QueryMessage.execute(QueryMessage.java:112) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:546) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:440) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [netty-all-4.0.42.Final.jar:4.0.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) [netty-all-4.0.42.Final.jar:4.0.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.access0(AbstractChannelHandlerContext.java:36) [netty-all-4.0.42.Final.jar:4.0.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.run(AbstractChannelHandlerContext.java:358) [netty-all-4.0.42.Final.jar:4.0.42.Final]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
at org.apache.cassandra.concurrent.AbstractLocalAwareExecutorService$FutureTask.run(AbstractLocalAwareExecutorService.java:162) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at org.apache.cassandra.concurrent.SEPWorker.run(SEPWorker.java:109) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Caused by: java.lang.StackOverflowError: null
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:281) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new5(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
at java.util.stream.ReferencePipeline.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
...
I don't have any more information to provide. The graph name is correct and my configuration appears to be correct.
My schema is quite large (unfortunately I cannot share it publicly), which may explain the error, but the log is not very clear.
What can I do to get DSEGraphFrame running properly?
This is an issue in DseGraphFrame. The StackOverflowError occurs when a property is a meta-property of itself, or when meta-properties form a cycle. It will be fixed in the next release. As a workaround, you can remove that recursion from the schema.
Minimal example to reproduce:
system.graph('rec').create()
:remote config alias g rec.g
schema.propertyKey("name").Text().create();
schema.propertyKey("name").properties("name").add();