In one of my bigger projects, we are using Redis with Node.js as a buffer for large sets of monitoring data.

When we scaled that deployment from a test case with just one data-sending instance to about 150 instances, Redis ran out of memory quite fast. Depending on the Redis version, the operating system and some other factors, that leads either to a crash of the Redis node or to a significant performance decrease.

Redis with Node.js: when zipping is appropriate

Important to know is that Redis’ limit for storing data is its server’s memory limit. Basically, that’s what makes Redis so fast. This is what the Redis FAQ tells us:

Redis is an in-memory but persistent on disk database, so it represents a different trade off where very high write and read speed is achieved with the limitation of data sets that can’t be larger than memory. (Source: Redis FAQ)

However, for large data sets like ours, that leaves you with a difficult decision: buy more (expensive) memory for your server, or spend CPU power on your client to reduce the data size by compressing it. Which way is right for your specific setup is something you have to sort out yourself. A good starting point is this study about a similar problem:

On the other hand, it seems that when we have an application that uses Redis mostly for reading stuff from it, we might think about compressing the input data. However as I mentioned before – try not to overdo it and implement such a mechanism only when you really need it.

Now, did you make your decision? Read on if you want to know how to implement this with Node.js.

Compress Data for Redis with Node.js

The implementation is built on the following Node.js modules:

  • redis
  • zlib

Here is how you write zipped data to Redis:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';
var redisValue = 'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim   ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.';

zlib.deflate(redisValue, function (err, zippedValue) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  // you have to encode the binary zip-data to base64 to be able to read it
  // later
  client.set(redisKey, zippedValue.toString('base64'), function (err) {

    if (err) {
      console.log('Error saving to redis!');
      return;
    }

    console.log('Zipped value saved!');

  });
});

And that’s how you read it:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';

client.get(redisKey, function (err, zippedValue) {

  if (err) {
    console.log('Error reading from redis!');
    return;
  }

  // you have to decode the base64 string back into the binary zip-data
  zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, redisValue) {

    if (err) {
      console.log('Error inflating!');
      return;
    }

    // redisValue is a Buffer; toString() gives back the original text
    console.log(redisValue.toString());

  });
});

Be aware of the base64 encoding: it’s required so you are able to write and read the binary zipped data as a string in JavaScript. You can find more information on that, for example, here.
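If you are unsure whether compression actually pays off for your data, a quick measurement helps. The following is a minimal sketch, where the sample string is just a stand-in for a representative chunk of your own monitoring data:

var zlib = require('zlib');

// Stand-in for a representative chunk of your own data.
var sample = JSON.stringify({ host: 'web-01', cpu: 0.42, mem: 1337, ts: Date.now() });

zlib.deflate(sample, function (err, zipped) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  console.log('raw bytes:    ' + Buffer.byteLength(sample));
  console.log('zipped bytes: ' + zipped.length);
  console.log('base64 bytes: ' + zipped.toString('base64').length);

});

Keep in mind that base64 adds roughly a third on top of the compressed size, so for very small values like this sample the zipped-and-encoded string can even be bigger than the raw data. Compression only pays off for larger, repetitive payloads.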

I am a great fan of Elasticsearch, the ElasticPress plugin and the ElasticPress WooCommerce module. The other day, though, while working on a client’s WooCommerce shop, for the first time I wasn’t satisfied with the quality of the search results.

After some digging with the Debug Bar plugin and its ElasticPress extension, I could narrow the problem down to a combination of many similar titles in our database and the Elasticsearch fuzziness parameter. Of course, the fantastic folks at 10up provide a filter for that in their ElasticPress plugin.

So here is how to disable fuzziness in the search:

function themeslug_deactivate_ep_fuzziness( $fuzz ) {
    // A fuzziness of 0 makes Elasticsearch match the search term exactly.
    return 0;
}
add_filter( 'ep_fuzziness_arg', 'themeslug_deactivate_ep_fuzziness' );

Of course you can also raise the fuzziness with the same method. If you want to adjust fuzziness depending on other search parameters, the filter provides two more arguments, $search_fields and $args, which might help.
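As a sketch of how that could look, assuming the raw search term is available in the query args under 's' (the threshold and the rule below are made up for illustration, not part of the ElasticPress API):

function themeslug_adjust_ep_fuzziness( $fuzz, $search_fields, $args ) {
    // Assumption: the raw search term sits in the query args under 's'.
    $search_term = isset( $args['s'] ) ? $args['s'] : '';

    // Example rule: allow one typo for short terms, demand exact matches for longer ones.
    return ( strlen( $search_term ) <= 5 ) ? 1 : 0;
}
add_filter( 'ep_fuzziness_arg', 'themeslug_adjust_ep_fuzziness', 10, 3 );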

 

WooCommerce is the WordPress plugin for webshops, without question. For a shop that is not running thousands of products, it delivers very nice performance, too. However, there are limitations and drawbacks when running WooCommerce with many products. That’s mostly because it is built upon the WordPress database schema, which forces WooCommerce to store a lot of data in the wp_postmeta table.


In my current installation, WooCommerce creates 26 meta fields for each product, so the product-related meta-table rows would grow beyond a million for more than 38,461 products (1,000,000 / 26 ≈ 38,462). Adding other theme- and plugin-related fields, you might cross that mark much earlier. Depending on your server architecture, such a big meta table can make some database queries really slow.

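If you want to check how close your own installation is to that mark, a rough count like the following sketch gives you the numbers. This is my own helper snippet, not part of WooCommerce; run it inside WordPress, for example via WP-CLI:

// Rough sketch: count the postmeta rows that belong to WooCommerce products.
global $wpdb;

$product_meta_rows = $wpdb->get_var(
    "SELECT COUNT(*)
       FROM {$wpdb->postmeta} pm
       INNER JOIN {$wpdb->posts} p ON p.ID = pm.post_id
      WHERE p.post_type = 'product'"
);

$total_meta_rows = $wpdb->get_var( "SELECT COUNT(*) FROM {$wpdb->postmeta}" );

echo "Product meta rows: {$product_meta_rows} of {$total_meta_rows} rows in wp_postmeta\n";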

Running such a big shop should make you think about your server architecture. The odds are high that you can’t avoid an upgrade in the long run. However, sometimes a quick workaround is the only way to get things running again. Following are two quick and dirty fixes to make your admin panel faster when running WooCommerce with thousands of products.

WooCommerce Status Dashboard

With WooCommerce activated, you see an admin dashboard widget that sums up your WooCommerce status, with sales per month, the most sold product, order status and stock status. As it turns out, the stock status queries are very slow, depending on the meta-table size. Here is how to turn them off in your functions.php file:

function themeslug_deactivate_stock_reports( $from ) {
    global $wpdb;
    // Replace the FROM clause with one that matches no rows, so the slow stock queries return instantly.
    return "FROM {$wpdb->posts} as posts WHERE 1=0";
}
add_filter( 'woocommerce_report_low_in_stock_query_from', 'themeslug_deactivate_stock_reports' );
add_filter( 'woocommerce_report_out_of_stock_query_from', 'themeslug_deactivate_stock_reports' );

Note that turning off these reports will result in 0-values in your dashboard!


 

Slow WooCommerce Product Edit Page for WordPress >= 4.4.0

Another effect of a big meta table is a slow product edit page in wp-admin (it even affects the edit pages of posts and pages). Since WordPress 4.4.0, an extra query against the meta table is made, which might cause long loading times. And of course, there is a hook to disable it. Place the following code in your functions.php file:


function themeslug_postmeta_form_keys() {
    // Returning a non-null value skips the slow query that collects existing meta keys for the custom fields dropdown.
    return false;
}
add_filter( 'postmeta_form_keys', 'themeslug_postmeta_form_keys' );

Be aware that this fix not only makes the edit page faster, it also turns off functionality in the meta-field box!
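For context, this is roughly what happens inside WordPress core when the edit page builds the custom fields box (a paraphrased sketch of meta_form(), not verbatim core code):

// Paraphrased sketch of meta_form() in wp-admin/includes/template.php:
$keys = apply_filters( 'postmeta_form_keys', null, $post );

if ( null === $keys ) {
    // Only if the filter returns null does WordPress run the expensive
    // "SELECT DISTINCT meta_key FROM wp_postmeta ..." query that fills
    // the dropdown of existing custom field keys. Returning false from
    // the filter skips that query, at the price of an empty dropdown.
}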

Other measures

There is much more you can do to make WooCommerce with thousands of products faster, like optimizing your server architecture or using tools like Elasticsearch or a Redis cache. If you need to use the above workarounds, you should always consider some of these measures in the long run!