Web Streams

Joris Verbogt
May 13 2022
Posted in Engineering & Technology

Processing your data on the fly

Ever since streams were introduced in NodeJS, much work has been done to bring this powerful feature to the Web for everyone. With Web Streams, you can finally use them in the browser too!

What are Streams?

Streams are easy to understand by thinking about what happens when a large piece of content, or in fact any data resource, is split into smaller pieces that are then processed one by one. The idea itself is not new: the internet already consists of 'streams' of binary data packets, and streams in this context are a high-level abstraction of that principle. The concept allows you to start consuming data as it arrives and progressively display or process it for further use, without first loading all the data into your data handling routine.

Streams come in three flavors: readable, writable, and transform.

Readable streams are where the chunks of data originate, like a file or HTTP request. These chunks can then be written (or 'piped') to a writable stream. In between, they can be modified (or 'transformed') through a transform stream.
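
As a minimal sketch of how the three flavors fit together (the uppercase transform here is just an illustration):

// a readable stream that produces two chunks of text
const readable = new ReadableStream({
  start(controller) {
    controller.enqueue('hello, ')
    controller.enqueue('streams')
    controller.close()
  }
})

// a transform stream that uppercases each chunk
const uppercase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

// a writable stream that logs every chunk it receives
const writable = new WritableStream({
  write(chunk) {
    console.log(chunk)
  }
})

// pipe the chunks from the readable, through the transform, into the writable
readable.pipeThrough(uppercase).pipeTo(writable) // logs 'HELLO, ' and 'STREAMS'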

Web Streams (see MDN documentation) are an implementation for the Web that is supported in most modern browsers. They are heavily inspired by the NodeJS streams API (some would say it is an improvement) and since NodeJS 18, they are available as an alternative in NodeJS too.
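
In NodeJS 18, for example, the classes are exposed both as globals and through the node:stream/web module, and existing NodeJS streams can be bridged over. A quick sketch (note that Readable.toWeb() is still marked experimental in NodeJS at the time of writing):

// NodeJS 18, as an ES module
import { Readable } from 'node:stream'
import { createReadStream } from 'node:fs'

// bridge a classic NodeJS readable stream into a Web ReadableStream
const webStream = Readable.toWeb(createReadStream('large-file'))

// consume it with the standard Web Streams reader API
const reader = webStream.getReader()
const { done, value } = await reader.read()
console.log(done ? 'empty file' : 'first chunk: ' + value.byteLength + ' bytes')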

Example

In this blog post, we will focus on using streams to process a large data file, updating a progress indicator as it is being processed. First of all, the fetch() API exposes the response body as a readable stream:

fetch('large-file')
  .then(response => {
    if (!response.ok) {
      throw Error(response.statusText)
    } else if (!response.body) {
      throw Error('ReadableStream not yet supported in this browser.')
    } else {
      // response.body is a ReadableStream
    }
  })
  .catch(error => {
    console.error(error)
  })

Now, we can start processing the stream:

fetch('large-file')
  .then(response => {
    if (!response.ok) {
      throw Error(response.statusText)
    } else if (!response.body) {
      throw Error('ReadableStream not yet supported in this browser.')
    } else {
      // we create a new response with a readable stream
      let total = 0
      return new Response(
        new ReadableStream({
          start(controller) {
            const reader = response.body.getReader()
            function read() {
              reader.read().then(({done, value}) => {
                if (done) {
                  // this was all data
                  controller.close()
                } else {
                  // put the data chunk into our response stream
                  total += value.byteLength
                  console.log('loaded chunk of ' + value.byteLength + ' bytes, total ' + total + ' bytes so far')
                  controller.enqueue(value)
                  read()
                }
              }).catch(error => {
                console.error(error)
                controller.error(error)
              })
            }
            read()
          }
        })
      )
    }
  })
  .then(response => response.arrayBuffer())
  .then(buffer => {
    console.log('done loading ' + buffer.byteLength + ' bytes')
  })
  .catch(error => {
    console.error(error)
  })

This will print on the console:

...
loaded chunk of 1769472 bytes, total 6094848 bytes so far
loaded chunk of 196608 bytes, total 6291456 bytes so far
loaded chunk of 983040 bytes, total 7274496 bytes so far
loaded chunk of 786432 bytes, total 8060928 bytes so far
loaded chunk of 327680 bytes, total 8388608 bytes so far
loaded chunk of 786432 bytes, total 9175040 bytes so far
loaded chunk of 983040 bytes, total 10158080 bytes so far
loaded chunk of 327680 bytes, total 10485760 bytes so far
done loading 10485760 bytes
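
To turn this running total into an actual progress indicator, it can be combined with the Content-Length response header. A sketch of that idea (it assumes the server actually sends Content-Length, which is not guaranteed for compressed or chunked responses):

fetch('large-file')
  .then(response => {
    // Content-Length may be absent, so treat the percentage as optional
    const contentLength = response.headers.get('Content-Length')
    const totalBytes = contentLength ? parseInt(contentLength, 10) : 0
    let loaded = 0
    return new Response(
      new ReadableStream({
        start(controller) {
          const reader = response.body.getReader()
          function read() {
            reader.read().then(({done, value}) => {
              if (done) {
                controller.close()
              } else {
                loaded += value.byteLength
                if (totalBytes > 0) {
                  console.log('progress: ' + Math.round(100 * loaded / totalBytes) + '%')
                }
                controller.enqueue(value)
                read()
              }
            }).catch(error => controller.error(error))
          }
          read()
        }
      })
    )
  })
  .catch(error => console.error(error))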

Transform Streams

Instead of just loading the data and passing it back to the browser (like in the example above), we could process the chunks and transform them into another stream:

fetch('large-file')
  .then(response => response.body)
  .then(body => body.pipeThrough(new TransformStream({
    transform(chunk, controller) {
      // generate a stream of chunk sizes
      controller.enqueue(chunk.byteLength)
    }
  })))
  .then(sizes => {
    let total = 0
    return new Response(
      new ReadableStream({
        start(controller) {
          const reader = sizes.getReader()
          function read() {
            reader.read().then(({done, value}) => {
              if (done) {
                // this was all data
                controller.close()
              } else {
                total += value
                console.log('loaded chunk of ' + value + ' bytes, total ' + total + ' bytes so far')
                controller.enqueue(value)
                read()
              }
            }).catch(error => {
              console.error(error)
              controller.error(error)
            })
          }
          read()
        }
      })
    )
  })
  .catch(error => {
    console.error(error)
  })

Again, this will log similar output to the browser console.
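
Writing your own TransformStream is not always necessary: the platform ships a few ready-made ones. For instance, TextDecoderStream decodes binary chunks into strings on the fly. A short sketch, assuming large-file contains UTF-8 text:

fetch('large-file')
  .then(response => {
    // TextDecoderStream is a built-in TransformStream that turns bytes into strings
    const textStream = response.body.pipeThrough(new TextDecoderStream('utf-8'))
    const reader = textStream.getReader()
    function read() {
      reader.read().then(({done, value}) => {
        if (done) return
        console.log('decoded a chunk of ' + value.length + ' characters')
        read()
      })
    }
    read()
  })
  .catch(error => console.error(error))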

More use cases

This example is of course very minimal, and streams can be used for very powerful processing jobs in the browser:

  • create a stream of data and offload processing to a Service Worker (service workers support Web Streams; see the sketch after this list)
  • process image data of large photo libraries as they are loaded
  • encrypt and decrypt communication channels (e.g., through Websocket connections) on the fly from within the browser
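
To give a taste of the first item, a service worker's fetch handler can respond with a transformed stream. A hypothetical sketch (the uppercasing is just a stand-in for real processing, and this only works for readable, non-opaque responses):

// sw.js
self.addEventListener('fetch', event => {
  event.respondWith(
    fetch(event.request).then(response => {
      // decode, transform, and re-encode the body as it streams through
      const transformed = response.body
        .pipeThrough(new TextDecoderStream())
        .pipeThrough(new TransformStream({
          transform(chunk, controller) {
            controller.enqueue(chunk.toUpperCase())
          }
        }))
        .pipeThrough(new TextEncoderStream())
      return new Response(transformed, { headers: response.headers })
    })
  )
})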

More examples can be found at the MDN examples GitHub repo.

Conclusion

With Web Streams, the worlds of NodeJS streams and the browser once again become a bit more unified. Streams can be used in a number of use cases on the web, especially when large data sets or images are involved.

As always, we hope you liked this article. And if you have anything to add, maybe you are a good fit for a Developer position at Notificare. We are currently looking for a Core API Developer, so check out the job description. If modern JavaScript is your thing, don't hesitate to apply!
