Sunday 20 March 2016

Need help understanding Node streams and crazy memory usage

Hello /r/node! I have a very simple example here which opens a file read stream and a file write stream, and simply pipes one to the other:

    var fs = require('fs');

    console.log("BEFORE: ", process.memoryUsage());

    var inp = fs.createReadStream( '/var/tmp/30mb.log' );
    var outp = fs.createWriteStream( '/var/tmp/joe.txt' );

    outp.on('finish', function() {
        console.log("AFTER: ", process.memoryUsage());
    } );

    inp.pipe( outp );

The file being read is 30 MB, but I wouldn't expect any large memory increase, because I'm using streams and pipes. It should read/write in 16K chunks or something, right? But in every version of Node I've tested, this effectively loads the entire file into memory:

Node v0.12:
    BEFORE: { rss: 18169856, heapTotal: 9751808, heapUsed: 3934048 }
    AFTER:  { rss: 52125696, heapTotal: 11803648, heapUsed: 4326008 }

Node v4.2:
    BEFORE: { rss: 19476480, heapTotal: 7408736, heapUsed: 3623808 }
    AFTER:  { rss: 39321600, heapTotal: 9472608, heapUsed: 4832224 }

Node v5.8:
    BEFORE: { rss: 15433728, heapTotal: 7523616, heapUsed: 4076352 }
    AFTER:  { rss: 46604288, heapTotal: 10619424, heapUsed: 6069584 }

Can anyone explain why this is happening, and/or how I can copy a file without having Node's memory explode in size?
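[Editor's note, not part of the original post] One way to check whether the data really is streaming in small pieces is to count the chunks as they flow through the read stream. The sketch below is illustrative only: it reuses the file paths from the post, and the explicit highWaterMark (fs read streams default to 64 KB) is just there to make the expected chunk size visible.

    // Sketch: count chunks while piping, to confirm the file is not read in one piece.
    var fs = require('fs');

    var chunks = 0;
    var bytes = 0;

    // highWaterMark set explicitly; 64 KB is also the default for fs read streams.
    var inp = fs.createReadStream( '/var/tmp/30mb.log', { highWaterMark: 64 * 1024 } );
    var outp = fs.createWriteStream( '/var/tmp/joe.txt' );

    // A 'data' listener can sit alongside pipe(); both receive the same chunks.
    inp.on('data', function( chunk ) {
        chunks++;
        bytes += chunk.length;
    });

    outp.on('finish', function() {
        console.log('Copied ' + bytes + ' bytes in ' + chunks + ' chunks');
        console.log("AFTER: ", process.memoryUsage());
    });

    inp.pipe( outp );

If this reports hundreds of small chunks for a 30 MB file, the data is in fact streaming. In that case the growing rss is more likely memory the process has allocated (for example Buffers, which live outside the V8 heap) and not yet returned to the OS, rather than the whole file being held at once; the fact that heapUsed barely moves in the numbers above points the same way. Running node with --expose-gc and calling global.gc() before each process.memoryUsage() call is one way to make the before/after comparison less noisy.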

Submitted March 21, 2016 at 02:12AM by cgijoe_jhuckaby
