Apache Beam wildcard recursive search for files

I am writing an Apache Beam pipeline in Scala using Spotify's Scio library. I want to recursively search for files under a directory on a file system that may be HDFS, Alluxio, or GCS. A pattern like `*.jar` should find all matching files in the given directory and all of its subdirectories.

The Apache Beam SDK provides the `org.apache.beam.sdk.io.FileIO` class for this purpose; with `pipeline.apply(FileIO.match().filepattern(filesPattern))` I can find files at a single directory level.
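The limitation comes from glob semantics: a single `*` matches within one path segment and does not cross a `/`. A minimal stdlib illustration of that behavior (plain `java.nio` glob matching, not Beam itself; the paths are hypothetical):

```scala
import java.nio.file.{FileSystems, Paths}

object SingleLevelGlob {
  def main(args: Array[String]): Unit = {
    // In glob syntax, "*" matches characters within a single path
    // segment only, so "input/*.jar" sees one directory level.
    val matcher = FileSystems.getDefault.getPathMatcher("glob:input/*.jar")

    println(matcher.matches(Paths.get("input/a.jar")))     // true: directly under input/
    println(matcher.matches(Paths.get("input/sub/b.jar"))) // false: "*" stops at "/"
  }
}
```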

How can I recursively search for all files matching the provided pattern?

For now I am trying another approach: I create a `ResourceId` from the provided pattern, take its current directory, and then try to resolve all subdirectories of that directory using the `resolve` method. But it throws an exception.

    val currentDir = FileSystems.matchNewResource(filesPattern, false).getCurrentDirectory
    val childDir = currentDir.resolve("{@literal *}", StandardResolveOptions.RESOLVE_DIRECTORY)

For `currentDir.resolve` I get the following exception:

    ------------------------------------------------------------
     The program finished with the following exception:

    org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
            at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
            at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
            at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
            at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
            at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
            at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
            at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
            at org.apache.flink.client.cli.CliFrontend.lambda$main(CliFrontend.java:1126)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
            at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
            at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
    Caused by: java.lang.IllegalArgumentException: Illegal character in path at index 0: {@literal *}/
            at java.net.URI.create(URI.java:852)
            at java.net.URI.resolve(URI.java:1036)
            at org.apache.beam.sdk.io.hdfs.HadoopResourceId.resolve(HadoopResourceId.java:46)
            at com.sparkcognition.foundation.ingest.jobs.copyjob.FileOperations$.findFiles(BinaryFilesSink.scala:110)
            at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink$.main(BinaryFilesSink.scala:39)
            at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink.main(BinaryFilesSink.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
            ... 12 more
    Caused by: java.net.URISyntaxException: Illegal character in path at index 0: {@literal *}/
            at java.net.URI$Parser.fail(URI.java:2848)
            at java.net.URI$Parser.checkChars(URI.java:3021)
            at java.net.URI$Parser.parseHierarchical(URI.java:3105)
            at java.net.URI$Parser.parse(URI.java:3063)
            at java.net.URI.<init>(URI.java:588)
            at java.net.URI.create(URI.java:850)
            ... 22 more

What is the correct way to recursively search for files with Apache Beam?

Reference: https://beam.apache.org/releases/javadoc/2.11.0/index.html?org/apache/beam/sdk/io/fs/ResourceId.html

It looks like you copied code from a faulty javadoc. Some older published versions of the example code had errors around the asterisks: the `{@literal *}` markup was meant to render as a plain `*`, but it leaked into the sample verbatim, which is why `resolve` fails with "Illegal character in path at index 0".

To find all files under `currentDir`:

val childDir = currentDir.resolve("**", StandardResolveOptions.RESOLVE_FILES)
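The `**` wildcard works here because, in glob terms, it matches across directory separators and therefore descends into subdirectories at any depth. A quick stdlib check of that semantics (plain `java.nio` glob matching rather than Beam's `ResourceId`, with hypothetical paths):

```scala
import java.nio.file.{FileSystems, Paths}

object RecursiveGlob {
  def main(args: Array[String]): Unit = {
    // Unlike "*", the "**" wildcard crosses "/" boundaries,
    // so "input/**.jar" matches .jar files at any depth.
    val matcher = FileSystems.getDefault.getPathMatcher("glob:input/**.jar")

    println(matcher.matches(Paths.get("input/a.jar")))          // true
    println(matcher.matches(Paths.get("input/sub/deep/b.jar"))) // true
  }
}
```

By the same reasoning, a filepattern built with `**` (e.g. something like `s"$dir/**.jar"`) should match recursively when passed to `FileIO.match()`, subject to what the underlying file system connector supports.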