In one of my bigger projects, we are using Redis with Node.js as a buffer for large sets of monitoring data.

When we scaled that setup from a test case with just one data-sending instance to about 150 instances, Redis ran out of memory quite fast. Depending on the version, the operating system and a few other factors, that leads either to a crash of the Redis node or to a significant performance decrease.

Redis with Node.js: when zipping is appropriate

Important to know: Redis' limit for storing data is its server's memory limit. Basically, that is what makes Redis so fast. Here is what the Redis FAQ tells us:

Redis is an in-memory but persistent on disk database, so it represents a different trade off where very high write and read speed is achieved with the limitation of data sets that can’t be larger than memory. (Source Redis FAQ)
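
If you want to keep an eye on how close you are getting to that limit, you can ask the server from Node.js. The following is just a minimal sketch, assuming the callback-style node_redis client used throughout this post; the INFO command returns a plain-text report that includes the current memory usage:

var redis = require('redis'),
  client = redis.createClient();

client.info(function (err, reply) {

  if (err) {
    console.log('Error reading server info!');
    return;
  }

  // the memory section of the INFO report contains used_memory_human,
  // the currently allocated memory in a human readable form
  var match = reply.match(/used_memory_human:(\S+)/);
  console.log('Redis currently uses ' + (match ? match[1] : 'unknown') + ' of memory');

});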

However, for large data sets like ours, that leaves you with a difficult decision: buy more, expensive memory for your server, or spend CPU power on the client to reduce the data size by compressing it. Which way is the right one for your specific setup is something you have to figure out yourself. A good starting point is this study about a similar problem:

On the other hand, it seems that when we have an application that uses Redis mostly for reading stuff from it, we might think about compressing the input data. However as I mentioned before – try not to overdo it and implement such a mechanism only when you really need it.
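
If you are unsure whether compression is worth it for your data, a quick experiment helps. The following sketch deflates a sample payload (the record structure is made up for illustration) and compares the sizes; keep in mind that for very small values the zipped result can even be larger than the original:

var zlib = require('zlib');

// a made-up monitoring record standing in for your real payload
var sample = JSON.stringify({
  host: 'web-01',
  cpu: 0.42,
  memory: 0.81,
  timestamp: Date.now()
});

zlib.deflate(sample, function (err, zipped) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  console.log('raw bytes:    ' + Buffer.byteLength(sample));
  console.log('zipped bytes: ' + zipped.length);

});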

Now, have you made your decision? Read on if you want to know how to implement compression with Node.js.

Compress Data for Redis with Node.js

The implementation is built on the following Node.js modules:

  • redis
  • zlib

Here is how you write zipped data to Redis:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';
var redisValue = 'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.';

zlib.deflate(redisValue, function (err, zippedValue) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  // you have to encode the binary zip-data to base64 to be able to read it
  // later
  client.set(redisKey, zippedValue.toString('base64'), function (err) {

    if (err) {
      console.log('Error saving to redis!');
      return;
    }

    console.log('Zipped value saved!');

  });
});
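
If you write values like this in more than one place, it can be handy to wrap both steps in a small helper. This is just a sketch using the same callback style as above; setZipped is a made-up name, not part of node_redis:

var zlib = require('zlib');

function setZipped(client, key, value, callback) {
  zlib.deflate(value, function (err, zippedValue) {
    if (err) {
      return callback(err);
    }
    // store the zipped data base64-encoded, just like above
    client.set(key, zippedValue.toString('base64'), callback);
  });
}

// usage: setZipped(client, redisKey, redisValue, function (err) { ... });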

And here is how you read it back:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';

client.get(redisKey, function (err, zippedValue) {

  if (err) {
    console.log('Error reading from redis!');
    return;
  }

  // decode the base64 string back into binary zip-data before inflating
  zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, redisValue) {

    if (err) {
      console.log('Error inflating!');
      return;
    }

    // inflate returns a Buffer, so convert it back to a string for display
    console.log(redisValue.toString());

  });
});
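
The matching read helper, again just a sketch with a made-up name (getZipped), follows the same pattern:

var zlib = require('zlib');

function getZipped(client, key, callback) {
  client.get(key, function (err, zippedValue) {
    if (err) {
      return callback(err);
    }
    // decode the base64 string and unzip it back to the original value
    zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, value) {
      if (err) {
        return callback(err);
      }
      callback(null, value.toString());
    });
  });
}

// usage: getZipped(client, redisKey, function (err, value) { console.log(value); });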

Be aware of the base64 encoding: it is required so that you can store the binary zipped data as a Redis string and read it back in JavaScript without corrupting it. You can find more information on that, for example, here.
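
To illustrate what the encoding does, here is the round trip in isolation: the zipped Buffer is turned into a plain base64 string for storage and converted back into a Buffer before inflating:

var binary = Buffer.from([0x78, 0x9c, 0x01, 0x00]); // some arbitrary binary bytes
var asString = binary.toString('base64');           // safe to store as a Redis string
var backToBinary = Buffer.from(asString, 'base64'); // identical to the original Buffer

console.log(asString);                    // 'eJwBAA=='
console.log(binary.equals(backToBinary)); // true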