Create a Hive data source in SpagoBI for FIWARE Cosmos

We are trying to deploy a report to our SpagoBI server that displays data from the HDFS of the global FIWARE Cosmos instance through Hive.
The report is generated successfully locally in SpagoBI Studio and shows the HDFS data in the BIRT report viewer (once the Hive driver JARs were added), which means the connection to the global FIWARE Lab Cosmos instance is configured correctly.

The problem is that we cannot deploy the same configuration on the SpagoBI server. We configured the data source in the same way as in SpagoBI Studio and added the Hive driver JARs to /opt/spagobi/All-in-One-SpagoBI-5.1-1feb2d97af/lib, but when we try to test the data source through the web interface we get the following exception:

it.eng.spagobi.tools.datasource.service.rest.TestConnection.testDataSource: Error testing datasources
java.sql.SQLException: Could not open connection to jdbc:hive2://cosmos.lab.fiware.org:10000: java.net.SocketException: Connection reset
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at it.eng.spagobi.tools.datasource.service.rest.TestConnection.testDataSource(TestConnection.java:92)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:167)
    at org.jboss.resteasy.core.ResourceMethod.invokeOnTarget(ResourceMethod.java:257)
    at org.jboss.resteasy.core.ResourceMethod.invoke(ResourceMethod.java:222)
    at org.jboss.resteasy.core.ResourceMethod.invoke(ResourceMethod.java:211)
    at org.jboss.resteasy.core.SynchronousDispatcher.getResponse(SynchronousDispatcher.java:542)
    at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:524)
    at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:126)
    at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.service(ServletContainerDispatcher.java:208)
    at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:55)
    at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:50)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:100)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:953)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1041)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:603)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:312)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
    ... 39 more
Caused by: java.net.SocketException: Connection reset
    at java.net.SocketInputStream.read(SocketInputStream.java:196)
    at java.net.SocketInputStream.read(SocketInputStream.java:122)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
    ... 44 more
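
To narrow down whether the problem is SpagoBI itself or the connection from that host, the same connection can be exercised with a minimal standalone JDBC client such as the sketch below; the URL and driver class are the ones from the stack trace, while the Cosmos username/password placeholders are assumptions and must be replaced with real credentials:

    // Minimal standalone test of the same Hive connection SpagoBI tries to open.
    // URL and driver class are taken from the stack trace; the credentials are
    // placeholders (assumptions) and must be replaced with real Cosmos ones.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CosmosHiveConnectionTest {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://cosmos.lab.fiware.org:10000";
            try (Connection conn = DriverManager.getConnection(url, "<cosmos-user>", "<cosmos-password>");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }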

The data source is configured with the same parameters as in SpagoBI Studio, and the driver JARs used are equivalent to those obtained through the following Maven dependencies:

 <dependencies>
     <dependency>
         <groupId>org.apache.hive</groupId>
         <artifactId>hive-jdbc</artifactId>
         <version>0.13.0</version>
     </dependency>
     <dependency>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-client</artifactId>
         <version>0.20.2-cdh3u6</version>
     </dependency>
 </dependencies>
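
As a sanity check that the server JVM actually loads the driver from the JARs copied into the lib folder (and not from some other copy on the classpath), something like this sketch can be run with the same classpath; the class name is the one that appears in the stack trace:

    // Prints which JAR file the Hive JDBC driver class is actually loaded from,
    // so it can be compared with the JAR copied into the SpagoBI lib directory.
    public class WhichHiveDriver {
        public static void main(String[] args) throws Exception {
            Class<?> driver = Class.forName("org.apache.hive.jdbc.HiveDriver");
            System.out.println(driver.getProtectionDomain().getCodeSource().getLocation());
        }
    }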

Does anyone know why the connection is failing?

Answer:

The stack trace shows a "java.net.SocketException: Connection reset", so it seems something goes wrong when connecting to Cosmos. Could you verify that the machine hosting the SpagoBI server is able to connect to Cosmos at FIWARE Lab?
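
For instance, a plain TCP check run from that machine would already tell whether port 10000 is reachable at all (a minimal sketch; it does not verify that HiveServer2 itself is healthy):

    // Plain TCP reachability check towards the Cosmos HiveServer2 endpoint.
    // Succeeding here only means the port is open, not that Hive is working.
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class CosmosReachabilityCheck {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress("cosmos.lab.fiware.org", 10000), 5000);
                System.out.println("TCP connection to cosmos.lab.fiware.org:10000 succeeded");
            }
        }
    }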

Hope this helps.

Best regards

It seems it was a problem with the HiveServer2 instance running in Cosmos, which crashes from time to time. It is working again now.