node.js: determine length of stream before it is piped to final destination
Some context behind this question: I am taking an image buffer, compressing it with pngquant, and then piping the compressed image to the response. Something like:
// https://www.npmjs.com/package/pngquant
const PngQuant = require('pngquant');
// start with base64-encoded png image data:
var base64data = '.......';
// then create buffer from this:
var imgBuffer = Buffer.from(base64data, 'base64');
// set up pngquant...
const optionsArr = [ ..... ];
const myPngQuanter = new PngQuant(optionsArr);
// convert buffer into stream:
const stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(imgBuffer);
// pipe the image buffer (stream) through pngquant (to compress it) and then to res...
bufferStream.pipe(myPngQuanter).pipe(res);
I want to determine the compression ratio achieved by the pngquant operation. I can find the starting size easily enough:
const sizeBefore = imgBuffer.length;
I also need the size of the compressed stream, and this information must be available before the stream is piped to res, because I need to add headers to res based on the compression statistics.
To obtain sizeAfter, I have tried the length-stream module, where you can insert a listener into the pipe (between myPngQuanter and res) to determine the length as it passes through. Whilst this does seem to work to determine the length of the compressed stream, it doesn't happen in time to add any headers to res. I've also tried stream-length, but can't get it to work at all.
Any help is appreciated.
Fundamentally, a stream carries no real length information (a stream can be infinite, e.g. one reading from /dev/random), so the simplest option I can see is to use another intermediate buffer. Unfortunately, pngquant has no option to operate on buffers, but short of using a different package entirely there is not much you can do about that.
Second edit, since stream buffers might not work:
There is a package called stream-to-array, which allows easy implementation of a stream-to-buffer conversion. As per the README, the code would be modified to:
const toArray = require('stream-to-array');

toArray(bufferStream.pipe(myPngQuanter))
  .then(function (parts) {
    // parts may contain non-Buffer chunks, so normalise them first
    const buffers = parts
      .map(part => Buffer.isBuffer(part) ? part : Buffer.from(part));
    const compressedBuffer = Buffer.concat(buffers);
    console.log(compressedBuffer.length); // here is the size of the compressed data
    res.write(compressedBuffer);
  });
Or with await, if you happen to be in an async context:
const toArray = require('stream-to-array');

const parts = await toArray(bufferStream.pipe(myPngQuanter));
const buffers = parts.map(part => Buffer.isBuffer(part) ? part : Buffer.from(part));
const compressedBuffer = Buffer.concat(buffers);
console.log(compressedBuffer.length); // here is the size of the compressed data
res.write(compressedBuffer);