Write to s3 from Hive fails
I am trying to set up a Hadoop cluster to write Hive tables to s3.
- Set up the s3 connector
- Creating external tables on s3 works fine
- Updated the keys in core-site.xml (see the config sketch after this list)
- Updated the encryption to AES256
- Writing locally to HDFS works fine.
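For context, this is roughly what the S3A-related properties in /etc/hadoop/conf/core-site.xml look like. A minimal sketch, not my exact config: the bucket and key values are placeholders, and the server-side encryption property name can differ between Hadoop versions.

```xml
<!-- /etc/hadoop/conf/core-site.xml (sketch; key values are placeholders) -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_AWS_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
</property>
<property>
  <!-- SSE-S3 (AES256); newer Hadoop releases also accept fs.s3a.encryption.algorithm -->
  <name>fs.s3a.server-side-encryption-algorithm</name>
  <value>AES256</value>
</property>
```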
I get the following error from s3 (a single-line error, broken up here for readability):
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException:
Unable to determine if s3a://<MyBucket>/hive/warehouse/<My>.db/<MyTable> is encrypted:
java.io.InterruptedIOException: doesBucketExist on <MyBucket>:
com.amazonaws.AmazonClientException:
No AWS Credentials provided by
BasicAWSCredentialsProvider
EnvironmentVariableCredentialsProvider
SharedInstanceProfileCredentialsProvider :
com.amazonaws.SdkClientException:
Unable to load credentials from service endpoint
This turned out to be a good exercise; reading the documentation carefully was the way to the solution.
The Hadoop S3A documentation covers fs.s3a.aws.credentials.provider:
If unspecified, then the default list of credential provider classes,
queried in sequence, is:
1. org.apache.hadoop.fs.s3a.BasicAWSCredentialsProvider: supports
static configuration of AWS access key ID and secret access key.
See also fs.s3a.access.key and fs.s3a.secret.key.
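If you want to pin the provider rather than rely on the default chain, it can be set alongside the keys. A minimal sketch, using the class name quoted above (newer Hadoop releases ship org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider in the same role):

```xml
<!-- Optional: pin the credential provider instead of relying on the default chain -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.BasicAWSCredentialsProvider</value>
</property>
```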
The problem was that I had specified the keys in the Hadoop conf (/etc/hadoop/conf) but not in the Hive conf (/etc/hive/conf). Moving fs.s3a.access.key and fs.s3a.secret.key there solved the problem.
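In other words, the same two properties have to be visible to Hive's own configuration. A minimal sketch, assuming they go into /etc/hive/conf/hive-site.xml (the exact file under /etc/hive/conf may vary by distribution):

```xml
<!-- /etc/hive/conf/hive-site.xml (sketch; file name may vary by distribution) -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_AWS_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
</property>
```

Hive services (HiveServer2 / the metastore / the CLI session) generally need to be restarted to pick up the new properties.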