4. Migrating Large Amounts of Data by Using Streams

In the preceding chapter, we made sure that our users will enjoy our CLI, and we published our first release. Now we turn to a more advanced topic: streams. Streams are a powerful Node.js feature for processing large amounts of data. With traditional buffering we quickly run into memory problems, because all the data simply doesn't fit into the computer's memory. Streams let us process the data in small slices instead. Node.js streams work like Unix streams on the terminal, where you pipe data from a producer into a consumer by using the pipe operator (|).
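To make the idea concrete, here is a minimal sketch (not the book's migration example; the file names are placeholders) that copies a large file chunk by chunk instead of reading it into memory all at once:

// Copy a large file in small chunks instead of buffering it completely.
// The file names are placeholders for illustration.
const fs = require('fs');

const source = fs.createReadStream('./big-input.csv');   // producer
const target = fs.createWriteStream('./big-output.csv'); // consumer

// pipe() hands each chunk from the producer to the consumer as it arrives,
// much like `cat big-input.csv > big-output.csv` on the shell.
source.pipe(target);

target.on('finish', () => console.log('done'));

Because only one chunk is held in memory at a time, the same code works for a few kilobytes or for many gigabytes of data.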