How to implement backpressure manually

I have a child process, and I'm piping a stream from it to the parent process.

In child.js:

  let stream = readdirp(pathname);
  stream.pipe(process.stdout);

In parent.js:

let file = child => {
  let estream = es.map((data, next) => {
    _this.s3MultiUpload(JSON.parse(data), data, next);
    // I'm uploading these files to S3.
  });
  child.on("end", (code, signal) => {
    console.log("stream ended"); // `here is my problem`
    child.kill();
  });
  child.on("exit", (code, signal) => {
    console.log(code);
    console.log(signal);
    child.kill();
  });
  return estream;
};
child = fork(filePath, { silent: true });
child.stdout.pipe(this.file(child));

My problem is that the stream ends before I have uploaded all the files to S3. I looked into backpressure, but I don't understand how to implement it here.

I think I need to add a callback or something to handle the stdout pipe, but I don't know how.

Can you help me?

That approach is unnecessarily complicated. Since the IO operations are not CPU-bound, we are better off using Promises with JavaScript's async/await syntax to run the file uploads in parallel. Building our own synchronization mechanism is complex, and a number of overlapping language-level and library-level concepts come into play.¹

Based on the readdirp documentation, and with the caveat that I'm not familiar with the specific upload API, I would suggest something along these lines:

const readdirp = require('readdirp');
const util = require('util');
const fs = require('fs');

const readFile = util.promisify(fs.readFile);

(async function () {
  // Use streams to achieve small RAM & CPU footprint.
  // 1) Streams example with for-await. Node.js 10+ only.
  const paths = [];
  for await (const {path} of readdirp('pending-uploads')) {
    paths.push(path);
  }

  // Read, parse and upload every file; the uploads run in parallel.
  const uploadPromises = paths.map(async (path) => {
    const data = await readFile(path);
    return s3MultiUpload(JSON.parse(data));
  });

  await Promise.all(uploadPromises);
}());
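
If you do want something closer to manual backpressure instead of collecting all paths up front, you can await each upload inside the `for await` loop itself: async iteration only pulls the next entry from the readdirp stream once the loop body has resolved, so the directory stream is effectively paused while an upload is in flight. A minimal sketch, assuming `s3MultiUpload` returns a Promise:

const readdirp = require('readdirp');
const util = require('util');
const fs = require('fs');

const readFile = util.promisify(fs.readFile);

(async function () {
  // Sequential variant: the next directory entry is only read after the
  // current upload finishes, which applies backpressure implicitly.
  for await (const { path } of readdirp('pending-uploads')) {
    const data = await readFile(path);
    await s3MultiUpload(JSON.parse(data)); // assumed to return a Promise
  }
}());

The trade-off is throughput: uploads run one at a time instead of in parallel, so this only makes sense when memory or upstream rate limits matter more than speed.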

¹ Backpressure is a term that came from porting the Reactive Extensions library to Java. Just for argument's (sanity's?) sake, consider what Erik Meijer says regarding backpressure.