Connection refused while connecting from PolyBase to Hadoop
While trying to create an external table from SQL Server 2017 on Ubuntu 16.04 to Hadoop, the following error is thrown:
Msg 105019, Level 16, State 1, Line 1
EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_IsDirExist. Java exception message:
Call From DESKTOP-VE8KNAG/xxx.xxx.x.xxx to xxx.xxx.x.x:54310 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused: Error [Call From DESKTOP-VE8KNAG/1xxx.xxx.x.xxx to xxx.xxx.x.x:54310 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused] occurred while accessing external file.'
- The external data source location is taken from core-site.xml
- The tmp folder has been created, permissions were granted to the user, and it was added to core-site.xml
- All Hadoop daemons are running:
10625 DataNode
10869 SecondaryNameNode
17113 ResourceManager
17434 NodeManager
10490 NameNode
21566 Jps
SQL query:
CREATE EXTERNAL DATA SOURCE [HDP2]
WITH (TYPE = HADOOP,
LOCATION = N'hdfs://xxx.xxx.x.x:54310',
CREDENTIAL = [HDPUser])
GO
CREATE EXTERNAL TABLE [dbo].CLASS_DIM_EXP (
[CLASS_ID] [varchar](8) NOT NULL,
[CLASS_DESC] [varchar](100) NULL,
[INSERT_DATE] [datetime2](7) NOT NULL,
[LAST_UPDATE_DATETIME] [datetime2](7) NOT NULL)
WITH (LOCATION='/user/pdw_user',
DATA_SOURCE = HDP2,
FILE_FORMAT = TSV,
REJECT_TYPE = VALUE,
REJECT_VALUE = 0);
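Note that the table definition references an external file format named TSV that is not shown in the question. Assuming it has not been created yet, a definition along the following lines would be needed first (the tab field terminator is an assumption based on the name TSV):

-- Hypothetical definition of the TSV format referenced above;
-- the tab terminator is assumed from the name "TSV".
CREATE EXTERNAL FILE FORMAT TSV
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '\t'));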
core-site.xml
<property>
<name>hadoop.tmp.dir</name>
<value>/app/hadoop/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The name of the default file system. A URI whose
scheme and authority determine the FileSystem implementation. The
uri's scheme determines the config property (fs.SCHEME.impl) naming
the FileSystem implementation class. The uri's authority is used to
determine the host, port, etc. for a filesystem.</description>
</property>
Does something need to be changed?
The problem here is that core-site.xml contains hdfs://localhost:54310. This needs to be replaced with the corresponding IP address, hdfs://xxx.xxx.x.x:54310, so that the NameNode listens on an address reachable from the PolyBase host instead of only on the loopback interface.
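For reference, the relevant property after the change would look roughly like this (keeping the masked IP placeholder used in the question); the HDFS daemons have to be restarted for the new address to take effect, and newer Hadoop releases prefer the key fs.defaultFS over the deprecated fs.default.name:

<property>
  <!-- Bind the default filesystem to the NameNode's real IP (masked here, as in the question)
       so that remote clients such as PolyBase can connect, not only local processes. -->
  <name>fs.default.name</name>
  <value>hdfs://xxx.xxx.x.x:54310</value>
</property>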