No data being written to S3 using Hadoop FileSystem and BouncyCastle

I am using the following code to write encrypted data to Amazon S3:

byte[] bytes = compressFile(instr, CompressionAlgorithmTags.ZIP);

PGPEncryptedDataGenerator encGen = new PGPEncryptedDataGenerator(
        new JcePGPDataEncryptorBuilder(PGPEncryptedData.CAST5)
                .setWithIntegrityPacket(withIntegrityCheck)
                .setSecureRandom(new SecureRandom())
                .setProvider("BC"));

encGen.addMethod(new JcePublicKeyKeyEncryptionMethodGenerator(pubKey).setProvider("BC"));

OutputStream cOut = encGen.open(out, bytes.length);

cOut.write(bytes);
cOut.close();

If I set "out" to:

final OutputStream fsOutStr = new FileOutputStream(new File("/home/hadoop/encrypted.gpg"));

the file is written just fine.

However, when I try to write it to S3, it gives me no errors and appears to work, but when I check, there is no data on S3:

final FileSystem fileSys  = FileSystem.get(new URI(GenericUtils.getAsEncodedStringIfEmbeddedSpaces(s3OutputDir)), new Configuration());
final OutputStream fsOutStr = fileSys.create(new Path(s3OutputDir)); // outputPath on S3

Any idea why it writes the data perfectly to local disk but does not write the file to S3?

Closing fsOutStr solved the problem.
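The stream returned by FileSystem.create buffers writes and only commits the object when the stream is closed, so data written but never closed silently disappears. The same effect can be reproduced with any buffered stream; here is a minimal sketch using a local temp file in place of the S3 path (the file name is illustrative):

```java
import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CloseDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("encrypted", ".gpg");

        // Stand-in for the Hadoop FileSystem stream: bytes sit in the
        // 8 KB buffer until close() (or flush()) pushes them out.
        OutputStream fsOutStr =
                new BufferedOutputStream(Files.newOutputStream(tmp), 8192);
        fsOutStr.write("pgp payload".getBytes());

        System.out.println("before close: " + Files.size(tmp)); // nothing on disk yet
        fsOutStr.close(); // flushes the buffer and commits the data
        System.out.println("after close: " + Files.size(tmp));
    }
}
```

A try-with-resources block (`try (OutputStream fsOutStr = fileSys.create(...)) { ... }`) guarantees the close happens even if the encryption step throws, which avoids this class of bug entirely.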