Adding a Redis cache to an Express app

Redis is, according to its own website, “an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker”.

In other words, Redis can be used as a blazing fast cache, and it’s dead simple to use.

Let’s use as an example a simple Express application. It doesn’t really matter what it does, just know that when the user visits the / route, the greet function is called and it takes between 1 and 3 seconds to run.

const express = require('express')
const app = express()
const port = 3000

const wait = ms => new Promise(resolve => setTimeout(resolve, ms))

function greet(req, res) {
  const between1and3seconds = 1000 + Math.floor(Math.random() * 2000)
  wait(between1and3seconds).then(() =>
    res.send(`Hello, I just waited ${between1and3seconds} ms`),
  )
}

app.get('/', greet)
app.listen(port, () => console.log(`Example app listening on port ${port}!`))

Every call will return new data, but each call will take between 1 and 3 seconds.
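
The delay expression used in greet can be checked in isolation: 1000 + Math.floor(Math.random() * 2000) always yields an integer between 1000 and 2999 ms.

```javascript
// The same expression greet() uses: a random integer delay in [1000, 2999] ms.
function randomDelay() {
  return 1000 + Math.floor(Math.random() * 2000)
}

// Sample it many times to confirm the bounds hold.
for (let i = 0; i < 10000; i++) {
  const d = randomDelay()
  if (d < 1000 || d > 2999) throw new Error(`out of range: ${d}`)
}
console.log('every sampled delay is between 1000 and 2999 ms')
```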

Using Redis, it’s easy to establish a cache. The first call is stored in the Redis in-memory database, and each subsequent call is returned from cache. This of course means that each subsequent call returns the same data.
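
Conceptually, what the cache does is simple: store each response under a key (here, the URL) together with a timestamp, and serve it back until it expires. A minimal in-memory sketch of the idea, in plain JavaScript with no Redis involved, might look like this:

```javascript
// Minimal TTL cache sketch: a Map from key to { value, storedAt }.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs
    this.store = new Map()
  }

  get(key) {
    const entry = this.store.get(key)
    if (!entry) return undefined
    if (Date.now() - entry.storedAt > this.ttlMs) {
      this.store.delete(key) // expired: drop it and report a miss
      return undefined
    }
    return entry.value
  }

  set(key, value) {
    this.store.set(key, { value, storedAt: Date.now() })
  }
}
```

express-redis-cache does essentially this, but keeps the entries in Redis, so the cache survives application restarts and can be shared between processes.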

The first step is to install Redis, the easiest way for macOS users is to use brew:

$ brew install redis

The Redis website has a download section with alternatives for other operating systems and even provides Docker images.

Once Redis is installed, simply start it. Personally I don’t like services running in the background so I created a small script:

#!/bin/zsh
redis-server /usr/local/Homebrew/etc/redis.conf

If you use Docker, you can use a simple docker-compose.yml file:

version: '3'
services:
  redis:
    container_name: redis-cache
    image: redis
    ports:
      - '6379:6379'

And then start Redis using docker-compose up -d.

Regardless of the method and environment, setting up Redis should be a two-step process: install, then start. No other configuration is required.

The next step is to connect our application to Redis. In order to do this, we’re going to add a new dependency:

$ npm install express-redis-cache

Then, we’re going to configure our cache. By default the cache never expires. While that may suit some situations, we’ll usually want cached data to expire after a while, so let’s change our cache a bit so entries expire after 10 seconds.

const ExpressRedisCache = require('express-redis-cache')
const cache = ExpressRedisCache({
  expire: 10, // optional: expire every 10 seconds
})

And while we could use the cache on our existing route '/', we’re going to create a new route instead, so we can compare the two:

app.get('/cached', cache.route(), greet)

The first call to /cached is still going to be slow, but every single call in the next 10 seconds is going to be lightning fast.

Obviously, since no new data is fetched, the greeting message remains the same, until a new non-cached call is made 10 seconds after the first one.

This is a good technique to use when:

  • The data returned by a service does not change very frequently
  • A particular route is slowed down by multiple services
  • An API called from the Express application is rate limited: call it once and retrieve from cache subsequent times

This is not a good technique when:

  • A service uses cookies or headers as parameters, since by default the URL is used as the cache key. However, it is possible to cache per user session and to cache conditionally, which comes in handy for caching GraphQL queries.
  • The data changes faster than the cache expires.
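
For the conditional case, express-redis-cache exposes per-request controls: setting res.use_express_redis_cache = false in a middleware that runs before cache.route() skips the cache for that request, and res.express_redis_cache_name overrides the cache key. A sketch, where the authorization check is a hypothetical condition:

```javascript
// Hypothetical middleware: only use the Redis cache for anonymous requests.
// express-redis-cache reads res.use_express_redis_cache before serving/storing.
function skipCacheIfAuthenticated(req, res, next) {
  res.use_express_redis_cache = !req.headers.authorization
  next()
}

// Usage with the route from above:
// app.get('/cached', skipCacheIfAuthenticated, cache.route(), greet)
```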

Also, be mindful of REST semantics: while it makes sense to cache a GET request, it rarely makes sense to cache a POST request.

Further reading:

Full code

Previously Paris and Geneva, currently New York. Can also be found at https://dev.to/arnaud