A Node to Workers Story

Node.js allows developers to build web services with JavaScript. However, you’re on your own when it comes to registering a domain, setting up DNS, managing the server processes, and setting up builds.

There’s no reason to manage all these layers on separate platforms. For a site on Cloudflare, these layers can be on a single platform. Serverless technology simplifies developers’ lives and reframes our current definition of backend.

In this article I will breeze through a simple example of how converting a former Node server into a Worker untangled a part of my team's code base. The conversion to Workers for this example can be found at this PR on GitHub.


Cloudflare Marketplace hosts a variety of apps, most of which are produced by third-party developers, but some are produced by Cloudflare employees.

The Spotify app is one of those apps that was written by the Cloudflare Apps team. This app requires an OAuth flow with Spotify to retrieve the user's token and gather playlists, artists, and other Spotify profile-specific information. While Cloudflare manages the OAuth authentication portion, the app owner – in this case Cloudflare Apps – manages the small integration service that uses the token to call Spotify and formats an appropriate response. Mysteriously, this Spotify OAuth integration broke.

Teams at Cloudflare are keen to remain agile, adaptive, and constantly learning. The current Cloudflare Apps team no longer comprises the original team that developed the Spotify OAuth integration. As such, this current team had no idea why the app broke. Although we had various alerting and logging systems, the Spotify OAuth server was lost in the cloud.

Our first step in tackling the issue was tracking down where exactly the OAuth flow lived. After shuffling through several of the potential platforms – GCloud, AWS, DigitalOcean… – we discovered the service was on Heroku. The more platforms introduced, the more complexity in deploys and access management.

I decided to reduce the number of layers in our service by simply creating a serverless Cloudflare Worker with no maintenance, no new logins, and no unique backend configuration.

Here’s how I did it.

Goodbye Node

The old service used Node.js and Express.

app.post('/blah', function (request, response) { ... });

This states that for every POST to an endpoint /blah, execute the callback function with a request and response object as arguments.

Cloudflare Workers are built on top of the Service Workers spec. Instead of mutating the response and calling methods on the response object like in Express, we need to respond to ‘fetch’ events. The code below adds an event listener for fetch events (incoming requests to the worker), receiving a FetchEvent as the first parameter. The FetchEvent has a special method called respondWith that accepts an instance of Response or a Promise which resolves to a Response.

addEventListener("fetch", event => {
  event.respondWith(new Response("Hello world!"));
});

To avoid reimplementing the routing logic in my worker, I made my own app.

const app = {
  get: (endpoint, fn) => {
    const url = new URL(request.url);
    if (url.pathname === endpoint && request.method === "GET")
      return fn(request);
    return null;
  },
  post: (endpoint, fn) => {
    const url = new URL(request.url);
    if (url.pathname === endpoint && request.method === "POST")
      return fn(request);
    return null;
  }
};

Now with app set up, I can call app.get(..) in my handler much like I did in Node. I just need to make sure the handler returns the response from the matching route.

async function handleRequest(request) {
  let lastResponse = app.post("/", async function (request) {..});
  if (lastResponse) {
    return lastResponse;
  }
  lastResponse = app.get("/", async function (request) {..});
  if (lastResponse) {
    return lastResponse;
  }
}

Checking lastResponse after each route ensures that we keep trying every endpoint and method until one matches.
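One loose end in this pattern: if no route matches, handleRequest returns undefined and respondWith has nothing to respond with. A minimal sketch of a fallback response (the notFound helper is my own name for illustration, not part of the original service; Response is available globally in Workers and in Node 18+):

```javascript
// Fallback for when no app.get/app.post route matched the request,
// so handleRequest never returns undefined.
function notFound() {
  return new Response(
    JSON.stringify({
      proceed: false,
      errors: [{ type: "404", message: "Not found" }]
    }),
    { status: 404, headers: { "Content-Type": "application/json" } }
  );
}
```

The handler can then end with `return notFound();` after the last route check.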

The other thing that needs to change is how the response is returned. Before, the route returned via response.json(), so the final response was of JSON type.

response.json({
         proceed: false,
         errors: [{ type: '400', message: error.toString() }]
});

In Workers, I need to return an instance of Response to the respondWith function. I replaced every instance of response.json or response.sendStatus with a new Response object.

return new Response(
  JSON.stringify({
    proceed: false,
    errors: [{ type: "400", message: res.error }]
  }),
  { headers: { "Content-Type": "application/json" } }
);

Now for the most beautiful part of the transition: deleting useless config.

Our Express server was set up to export app as a module and insert credentials so that Heroku, or whatever non-serverless server, could pick it up, build it, and run it.

Though I can import libraries into Workers via webpack, for this application it's overkill. I also have access to fetch and other native Service Worker functions.

const {getJson} = require('simple-fetch')
module.exports = function setRoutes (app) {
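Since fetch is built into the Workers runtime, the simple-fetch dependency can disappear entirely. A minimal sketch of a getJson replacement using native fetch (the bearer-token header is illustrative, not the app's actual Spotify call):

```javascript
// Replacement for simple-fetch's getJson using the platform's native fetch.
// The token parameter and Authorization header are illustrative placeholders.
async function getJson(url, token) {
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` }
  });
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}
```

No require, no package.json entry; the worker runtime provides everything the call needs.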

Getting rid of modules and deployment config, I removed the files: Procfile, credentials.json, package.json, development.js, heroku.js, and create-app.js. Routes.js simply becomes worker.js.

This was a demo of how Workers made my life as a programmer easier. Future developers working with my code can read it without ever looking at any configuration. Even a purely vanilla JavaScript developer can come in, since there are no builds to manage and no hair to pull out.

With serverless I can now spend time on doing what I love – development.

Source: Cloudflare