Java transfer from HDFS to S3

I want to transfer files from HDFS to S3 in Java. Some of the files can be large, so I don't want to download them to the local disk before uploading them to S3. Is there a way to do this in Java?

Here is what I have right now (a piece of code that uploads a local file to S3). I can't really use it, because using a File object means having the file on my hard drive first.

File f = new File("/home/myuser/test");

TransferManager transferManager = new TransferManager(credentials);
MultipleFileUpload upload = transferManager.uploadDirectory("mybucket", "test_folder", f, true);

Thanks

I figured out the upload part.

AWSCredentials credentials = new BasicAWSCredentials(
        "whatever",
        "whatever");

TransferManager transferManager = new TransferManager(credentials);

//+upload from HDFS to S3
Configuration conf = new Configuration();
// point Hadoop at the cluster's config files
conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

Path path = new Path("hdfs://my_ip_address/user/ubuntu/test/test.txt");
FileSystem fs = path.getFileSystem(conf);
// FSDataInputStream is a java.io.InputStream, so it can be handed straight to TransferManager
FSDataInputStream inputStream = fs.open(path);
ObjectMetadata objectMetadata = new ObjectMetadata();
Upload upload = transferManager.upload("xpatterns-deployment-ubuntu", "test_cu_jmen3", inputStream, objectMetadata);
//-upload from HDFS to S3

try {
    upload.waitForCompletion();
} catch (InterruptedException e) {
    e.printStackTrace();
}
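
One caveat worth noting (this is my understanding of how TransferManager handles a raw InputStream, not something from the code above): if the ObjectMetadata carries no content length, the SDK may buffer the stream to work out the object size, which is exactly what I'm trying to avoid for large files. Since HDFS already knows the file size, it can be set up front, just before the transferManager.upload(...) call. A minimal sketch, reusing fs, path and objectMetadata from the snippet above:

// Assumption: setting the content length up front lets TransferManager stream the
// upload instead of buffering the data to determine the object size.
objectMetadata.setContentLength(fs.getFileStatus(path).getLen());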

Any ideas on how to do something similar for the download? I haven't found a download() method in TransferManager that works with a stream the way the code above does.
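
For what it's worth, here is the direction I would try rather than a confirmed answer: TransferManager's download() writes to a File, but the plain AmazonS3 client returns the object content as an InputStream, which can be copied straight into HDFS without touching the local disk. A rough sketch, reusing credentials, fs and conf from above (the bucket, key and target path are placeholders):

AmazonS3 s3 = new AmazonS3Client(credentials);

// Open the S3 object as a stream; nothing is written to the local disk
S3Object s3Object = s3.getObject("mybucket", "test_folder/test.txt");
InputStream in = s3Object.getObjectContent();

// Create (or overwrite) the target file on HDFS
Path target = new Path("hdfs://my_ip_address/user/ubuntu/test/test_from_s3.txt");
FSDataOutputStream out = fs.create(target, true);

// org.apache.hadoop.io.IOUtils; the final 'true' closes both streams when the copy finishes
IOUtils.copyBytes(in, out, conf, true);

As far as I can tell, if the parallel multipart behaviour of TransferManager is needed for very large downloads, downloading to a temporary File is the only option in this SDK version.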