
Streams in Node.js


Page 1: Streams in Node.js

Streams

berggeist007 / pixelio.de

Page 2: Streams in Node.js

WHO AM I?

• Sebastian Springer

• Munich, Germany

• works @mayflowerphp

• https://github.com/sspringer82

• @basti_springer

• Consultant, Trainer, Author

Page 3: Streams in Node.js

We should have some ways of connecting programs like garden hose - screw in another segment when it becomes necessary to massage data in another way.

Douglas McIlroy

CC-BY-SA 4.0

Page 4: Streams in Node.js

What is a stream?

Paul-Georg Meister / pixelio.de

Page 5: Streams in Node.js

$ ls -l /usr/local/lib/node_modules | grep 'js' | less

Page 6: Streams in Node.js

Source → Step → Step → Step → Sink
(Input)                       (Output)

Steps can be inserted or removed at any point in the pipeline.

Page 7: Streams in Node.js

Streams are EventEmitters

EventEmitter

Callbacks

Event

on('event', callback)

emit('event' [, arg1][, arg2])

Page 8: Streams in Node.js

Where should you use streams?

selbst / pixelio.de

Page 9: Streams in Node.js

Pipe any given input via multiple steps to an output. Steps in between can be exchanged on demand.

Page 10: Streams in Node.js

Streams in Node.js

http, fs, child_process, tcp, zlib, crypto

Page 11: Streams in Node.js

Example

Source: MySQL (relational DB)

Step 1: Adapt format

Step 2: Download profile images

Sink: MongoDB (document-oriented DB)

Page 12: Streams in Node.js

Different types of streams

Karl-Heinz Laube / pixelio.de

Page 13: Streams in Node.js

Stream types

• Readable: Read information (Source)

• Writable: Write information (Sink)

• Duplex: readable and writable

• Transform: (Base: Duplex) Output is calculated based on input

Page 14: Streams in Node.js

Readable Streams

Andreas Hermsdorf / pixelio.de

Page 15: Streams in Node.js

Readable Streams in Node.js

http.ClientResponse
fs.createReadStream
process.stdin
child_process.stdout

Page 16: Streams in Node.js

ReadStream

var fs = require('fs');

var options = {
  encoding: 'utf8',
  highWaterMark: 2
};

var stream = fs.createReadStream('input.txt', options);
var chunk = 1;

stream.on('readable', function () {
  console.log(chunk++, stream.read());
});

Page 17: Streams in Node.js

Errors in ReadStreams

var rs = require('fs')
  .createReadStream('nonExistant.txt');

rs.on('error', function (e) {
  console.log('ERROR', e);
});

ERROR { [Error: ENOENT: no such file or directory, open 'nonExistant.txt']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: 'nonExistant.txt' }

Page 18: Streams in Node.js

ReadStream Modes

• Flowing Mode: Information flows automatically and as fast as possible.

• Paused Mode (Default): Information has to be fetched via read() manually.

Page 19: Streams in Node.js

Flowing Mode

stream.on('data', function (data) {});

stream.resume();

stream.pipe(writeStream);

Page 20: Streams in Node.js

Paused Mode

stream.pause();

stream.removeAllListeners('data');

stream.unpipe();

Page 21: Streams in Node.js

Events

• readable: Next chunk is available.

• data: Data is automatically read.

• end: There is no more data.

• close: Stream was closed.

• error: There was an error.

Page 22: Streams in Node.js

Object Mode

By default, streams carry Strings and Buffers. In Object Mode you can stream any JavaScript object; the encoding and chunk-size options are ignored.

Page 23: Streams in Node.js

"use strict";

var Readable = require('stream').Readable;

class TemperatureReader extends Readable {
  constructor(opt) {
    super(opt);
    this.items = 0;
    this.maxItems = 10;
  }

  _read() {
    if (this.items++ < this.maxItems) {
      this.push({
        date: new Date(2015, 9, this.items + 1),
        temp: Math.floor(Math.random() * 1000 - 273) + '°C'
      });
    } else {
      this.push(null);
    }
  }
}

var tr = new TemperatureReader({objectMode: true});
var tempObj;

tr.on('readable', function () {
  while (null !== (tempObj = tr.read())) {
    console.log(JSON.stringify(tempObj));
  }
});

Page 24: Streams in Node.js

Writable Streams

I-vista / pixelio.de

Page 25: Streams in Node.js

Writable Streams in Node.js

http.ClientRequest
fs.createWriteStream
process.stdout
child_process.stdin

Page 26: Streams in Node.js

WriteStream

var ws = require('fs')
  .createWriteStream('output.txt');

for (var i = 0; i < 10; i++) {
  ws.write(`chunk ${i}\n`);
}

ws.end('DONE');

Page 27: Streams in Node.js

Events

• drain: If write() returned false, the drain event signals that the stream can accept data again.

• pipe/unpipe: Emitted when a Readable Stream is piped into, or unpiped from, this stream.

Page 28: Streams in Node.js

Buffering

• cork() buffers subsequent write operations in memory; uncork() flushes the buffered data to the sink.

Page 29: Streams in Node.js

Buffering

var ws = require('fs')
  .createWriteStream('output.txt');

ws.write('START');
ws.cork();

for (var i = 0; i < 10; i++) {
  ws.write(`chunk ${i}\n`);
}

setTimeout(function () {
  ws.uncork();
  ws.end('DONE');
}, 2000);

Page 30: Streams in Node.js

Piping

Rolf Handke / pixelio.de

Page 31: Streams in Node.js

Piping

Source → Sink
(Input)   (Output)

Page 32: Streams in Node.js

Piping

var fs = require('fs');

var read = fs.createReadStream('input.txt');
var write = fs.createWriteStream('pipe.txt');

write.on('pipe', function () {
  console.log('piped!');
});

read.pipe(write);

Page 33: Streams in Node.js

WriteStream

"use strict";

var Writable = require('stream').Writable;

class WriteStream extends Writable {
  _write(chunk, enc, done) {
    console.log('WRITE: ', chunk.toString());
    done();
  }
}

var ws = new WriteStream();

for (var i = 0; i < 10; i++) {
  ws.write('Hello ' + i);
}

ws.end();

Page 34: Streams in Node.js

WriteStream

_writev(chunks, callback)

Alternative to _write that receives all buffered chunks at once; instead of a separate encoding parameter, each entry carries the chunk together with its encoding.

Page 35: Streams in Node.js

Duplex Streams

Rainer Sturm / pixelio.de

Page 36: Streams in Node.js

Duplex Streams

Duplex Streams implement both the Readable and the Writable interface. Duplex Streams are the base class for Transform Streams.

Page 37: Streams in Node.js

Duplex Streams

tcp sockets, zlib streams, crypto streams

Page 38: Streams in Node.js

Duplex Streams

var Duplex = require('stream').Duplex;

class DuplexStream extends Duplex {
  _read() { ... }
  _write() { ... }
}

Page 39: Streams in Node.js

Transform Streams

Dieter Schütz / pixelio.de

Page 40: Streams in Node.js

Transform Streams

Transform Streams turn their input into a defined output according to given rules.

They build on Duplex Streams but offer a much simpler API.

Page 41: Streams in Node.js

Transform Streams

"use strict";

var fs = require('fs');

var read = fs.createReadStream('input.txt');
var write = fs.createWriteStream('transform.txt');

var Transform = require('stream').Transform;

class ToUpperCase extends Transform {
  _transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

var toUpperCase = new ToUpperCase();

read.pipe(toUpperCase)
  .pipe(write);

Page 42: Streams in Node.js

Transform Streams

_flush(callback)

Called once all input has been consumed, before the end event is emitted.

Page 43: Streams in Node.js
Page 44: Streams in Node.js

Gulp

The streaming build system.

Page 45: Streams in Node.js

Gulp

$ npm install --global gulp

$ npm install --save-dev gulp

$ vi gulpfile.js

$ gulp

Page 46: Streams in Node.js

Gulp

var gulp = require('gulp');

var babel = require('gulp-babel');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');
var rename = require('gulp-rename');

gulp.task('scripts', function () {
  return gulp.src('js/*.js')
    .pipe(concat('all.js'))
    .pipe(gulp.dest('dist'))
    .pipe(babel())
    .pipe(rename('all.min.js'))
    .pipe(uglify())
    .pipe(gulp.dest('dist'));
});

gulp.task('default', ['scripts']);

Page 47: Streams in Node.js

Gulp

$ gulp
[16:09:10] Using gulpfile /srv/basti/gulpfile.js
[16:09:10] Starting 'scripts'...
[16:09:10] Finished 'scripts' after 178 ms
[16:09:10] Starting 'default'...
[16:09:10] Finished 'default' after 13 μs

Page 48: Streams in Node.js

Questions?

Rainer Sturm / pixelio.de

Page 49: Streams in Node.js

CONTACT

Sebastian Springer [email protected]

Mayflower GmbH Mannhardtstr. 6 80538 München Deutschland

@basti_springer

https://github.com/sspringer82