Jon Linnell

How to pipe data into a Node.js script

28 October 2022

Most folks who work with a terminal will be familiar with this kind of incantation:

cat file.txt | pbcopy

In this example, we get the contents of file.txt and pipe it (using the | operator) to the clipboard (on macOS); we transfer the output of one command into the input of another.

This is a super-useful tool, and it's not unusual for *nix nerds to pipe data from command to command to get what they need.

For some jobs, it's tempting to create a Node script to do the high-level processing of data in a pipeline. So how do we access that data getting piped our way?

Reading stdin in Node.js

In Node.js, we can use the process.stdin Readable stream.

The older, more commonly found way to handle stdin data in Node is to listen for "readable" events and use a while loop to accumulate data from stdin until the stream ends.

let data = "";

process.stdin.on("readable", () => {
  let chunk;
  while (null !== (chunk = process.stdin.read())) {
    data += chunk;
  }
});

process.stdin.on("end", () => {
  // process all the data and write it back to stdout
  process.stdout.write(data);
});

This is fine, but a little clunky. Thankfully, process.stdin now implements an async iterator that makes this much easier to do.

let data = "";

async function main() {
  for await (const chunk of process.stdin) data += chunk;

  // process all the data and write it back to stdout
  process.stdout.write(data);
}

main();

Much more concise.

You'll notice that once we're done with the processing, we write the output to process.stdout; this allows the data to be used by the next command in the pipeline, e.g. pbcopy or grep.

Avoid console.log(), as it appends a newline.

Reading data line-by-line

The chunk in the above example is an arbitrary segment of bytes up to a limit defined by the runtime; chunk boundaries bear no relation to line breaks. This isn't ideal if we know our input arrives line by line, say as the output of grep, or the contents of a file.

In this case, we can use the readline API available in Node.js, and hook up stdin as an input stream.

const readline = require("node:readline");

const rl = readline.createInterface({
  input: process.stdin,
});

rl.on("line", (line) => {
  // process a line at a time
  process.stdout.write(`Line: ${line.slice(0, 64)}...\n`);
});

The readline interface also implements an async iterator, so just as we did with the Readable stream above, we can refactor this to use a for await...of loop:

const readline = require("node:readline");

async function main() {
  const rl = readline.createInterface({
    input: process.stdin,
  });

  for await (const line of rl) {
    // process a line at a time
    process.stdout.write(`line: ${line}\n`);
  }
}

main();

Happy piping!


This article was written by Jon Linnell, a software engineer based in London, England.