Pass inline arguments to a shell script saved in HDFS

I have a shell script on HDFS that takes 8-9 arguments. Normally I can run it like this:

sh sample.sh -mode FULL -status DELETE -id 1456 -region AP -path </path/to/filepath>

I tried:

hadoop fs -cat /dev/test/sample.sh | exec bash -mode FULL -status DELETE -id 1456 -region AP -path /dev/resultsFolder

Even though I pass these arguments, they are not read and the script executes without them, throwing the error No such file or directory. What is the best way to handle this?

With bash -mode FULL, the -mode FULL arguments are passed to bash itself, not to your script. From the bash man page, describing the options:

-- A -- signals the end of options and disables further option processing. Any arguments after the -- are treated as filenames and arguments. An argument of - is equivalent to --.

And:

-s If the -s option is present, or if no arguments remain after option processing, then commands are read from the standard input. This option allows the positional parameters to be set when invoking an interactive shell.

Therefore I would call it as

.... | bash - -mode FULL ....
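
Note that without -s this still reproduces the error from the question, because bash treats the first word after - or -- as a script filename. A quick local check (no HDFS involved):

echo 'echo test' | bash - -mode FULL
# bash: -mode: No such file or directory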

Per the comment from Gordon Davisson, -s is needed here:

.... | bash -s -- -mode FULL ....
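
As a quick local check that the flags now land in the positional parameters (a here-doc stands in for the piped script, so no HDFS is needed):

bash -s -- -mode FULL -status DELETE <<'EOF'
echo "got $# arguments: $*"
echo "first flag: $1, value: $2"
EOF
# got 4 arguments: -mode FULL -status DELETE
# first flag: -mode, value: FULL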

bash -c "$(hadoop fs -cat /dev/test/sample.sh)" bash -mode FULL -status DELETE -id 1456 -region AP -path /dev/resultsFolder

also works.

-c If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, the first argument is assigned to $0 and any remaining arguments are assigned to the positional parameters. The assignment to $0 sets the name of the shell, which is used in warning and error messages.
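
A minimal illustration of why the extra word bash appears after the command string (it becomes $0, so the real flags start at $1):

bash -c 'echo "name=$0 first=$1 second=$2"' bash -mode FULL
# prints: name=bash first=-mode second=FULL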

Possibly more readable (though bash -s is quite elegant):

code=$(hadoop fs -cat /dev/test/sample.sh)
bash -c "$code" bash -mode FULL -status DELETE -id 1456 -region AP -path /dev/resultsFolder
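
All of the above assumes sample.sh reads the flags itself from its positional parameters. The script is not shown in the question, but its argument handling is presumably something along these lines (a hypothetical sketch, not the actual script):

# Hypothetical flag-parsing loop; the real sample.sh may differ.
while [ $# -gt 0 ]; do
  case "$1" in
    -mode)   mode="$2";   shift 2 ;;
    -status) status="$2"; shift 2 ;;
    -id)     id="$2";     shift 2 ;;
    -region) region="$2"; shift 2 ;;
    -path)   path="$2";   shift 2 ;;
    *) echo "unknown option: $1" >&2; exit 1 ;;
  esac
done
echo "mode=$mode status=$status id=$id region=$region path=$path"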