WooCommerce is the most used webshop system in the world, and it’s awesome! However, using WordPress as its application framework has some drawbacks. In particular, WooCommerce can become very slow once a shop contains many products. When I point that out in client meetings, I inevitably get the same questions: how many products exactly make the WooCommerce front-end and back-end slow, and what can you do about it?

Despite all the knowledge I have about server architecture, clean theming and WordPress coding, that question isn’t easy to answer. That’s why I set up my own little test app.

Recommendation: Many slow WooCommerce sites simply don’t have the right type of hosting. A good WordPress/WooCommerce hosting service with great support and performance-optimizing add-ons like Redis and Elasticsearch is Kinsta. Read more about their WooCommerce hosting here.

Slow WooCommerce with Many Products – The Test Environment

I’ve done all of the following tests on my local computer. It’s a Dell XPS 13 with the following hardware:

  • Intel Core i7-7500U
  • 8GB LPDDR3 1866MHz
  • 256GB PCIe Solid State Drive

Operating System and services:

  • Ubuntu 14.04
  • Apache 2.4.7
  • MySQL 5.5.54
  • PHP 7.0.16

The WordPress installation I’m testing with is a clean install of:

  • WordPress 4.7.2
  • WooCommerce 2.6.14
  • Twenty Seventeen Theme

No other plugins or code adjustments.

Furthermore, I wrote a small npm application which incrementally adds products to WooCommerce through its API. After each insertion of data, the application pauses for five seconds to give garbage collection and MySQL a break, and then performs some tests, like reading products from the WooCommerce API or requesting the “Shop” page from the front-end, while measuring both the request times and the average CPU and memory usage of the MySQL process.

The inserted products are simple WooCommerce products with auto-generated lorem-ipsum titles and descriptions and a random price.
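
To give a rough idea of how the tool works, one test step looks more or less like the sketch below. It is simplified and purely illustrative: insertProducts stands for the actual WooCommerce API calls, and the URLs are just examples; the real code lives in the repository linked below.

// Simplified sketch of one test iteration (illustrative, not the actual implementation)
async function runIteration(step) {
  // 1. add a chunk of simple products through the WooCommerce API
  await insertProducts(100); // hypothetical helper, inserts POST_INSERT_CHUNK products

  // 2. pause to give garbage collection and MySQL a break
  await new Promise(function (resolve) { setTimeout(resolve, 5000); });

  // 3. measure front-end request times
  var shopTime = await timeRequest('http://localhost/shop/');
  var productTime = await timeRequest('http://localhost/product/sample-product/');

  return { step: step, shop: shopTime, product: productTime };
}

// measure the duration of a single GET request in milliseconds
function timeRequest(url) {
  var http = require('http');
  return new Promise(function (resolve, reject) {
    var start = Date.now();
    http.get(url, function (res) {
      res.resume(); // discard the body, we only care about the timing
      res.on('end', function () { resolve(Date.now() - start); });
    }).on('error', reject);
  });
}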

Check out the test application on GitHub; feedback and contributions are very welcome:

https://github.com/danielauener/slow-woocommerce-tests

After cloning, go to the app folder and run

npm install

Then adjust the configuration parameters in index.js to your environment (roughly the kind of settings sketched below) and run the application with a simple

npm start
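
The option names below are purely illustrative (check index.js in the repository for the actual ones), but the settings you have to touch typically boil down to the WooCommerce API credentials and the local MySQL connection:

// illustrative configuration, see index.js in the repository for the real option names
var config = {
  // the WordPress/WooCommerce installation under test
  siteUrl: 'http://localhost/wordpress',

  // WooCommerce REST API credentials (WooCommerce > Settings > API > Keys)
  consumerKey: 'ck_xxxxxxxxxxxxxxxx',
  consumerSecret: 'cs_xxxxxxxxxxxxxxxx',

  // MySQL connection used to read the table sizes
  mysql: {
    host: 'localhost',
    user: 'root',
    password: '',
    database: 'wordpress'
  }
};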

After the test cycle completes, the test data is written to a CSV file in the benchmark directory of the app.

What’s tested

There are three questions I wanted to answer.

  1. Does WooCommerce really get slower with many products, and by how much?
  2. Is it WooCommerce that slows things down, or is it actually WordPress?
  3. If we can confirm a slow WooCommerce with many products, why is that happening?

Besides these general questions, I was interested in two use cases. The obvious one was request times for the WooCommerce front-end, because those directly impact conversions and sales. The second one was the insertion speed of new products into a big database. That one matters because most shops with around 10 000 products or more rely on some kind of import process, which adds load to the entire server. Since such imports can run for hours, that’s quite relevant.

Slow WooCommerce performance with 50 000 Products?

In my first test I wanted to check a really huge product database, so I could get a full picture of what’s happening. After some tries and time calculations, I decided to run with 50 000 products, starting my test tool with the following environment variables (the MySQL and WordPress configuration lives in the index.js file).


MYSQL_PID=4333 ITERATIONS=500 POST_INSERT_CHUNK=100 npm start
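
The MYSQL_PID variable is the process id of the local MySQL server, so the tool knows which process to sample for CPU and memory usage. On Linux, such sampling can be done by shelling out to ps; here is a minimal sketch (not necessarily how the test app implements it):

var exec = require('child_process').exec;

// read the current CPU and memory usage of a process via ps (Linux)
function sampleProcess(pid, callback) {
  exec('ps -p ' + pid + ' -o %cpu= -o %mem=', function (err, stdout) {
    if (err) { return callback(err); }
    var parts = stdout.trim().split(/\s+/);
    callback(null, { cpu: parseFloat(parts[0]), mem: parseFloat(parts[1]) });
  });
}

// example: sampleProcess(4333, function (err, usage) { console.log(usage); });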

CPU and Meta Table

After a good night’s sleep, the test was finally through. Judging from how my computer behaved during the test (noisy and warm), there had to be considerable CPU usage, which you can see in the first chart. The other two lines in the chart are the number of posts added in each step (confirming the test went as planned) and the size of the wp_postmeta table.

Slow WooCommerce with 50 000 products: CPU-Usage

As you can see, the size of the meta table (yellow) rises much faster than the size of the products table (green), which we will discuss later. The CPU usage seems to increase logarithmically, slowly approaching 100%.

Just to clarify: the CPU usage is measured only while the test requests are made, not during the product insert calls.

Requests

The next chart shows how the CPU usage affects the front-end request times. After every product insertion cycle, test requests were made to the shop page, which lists the first ten products, and to a single product page.

Slow WooCommerce with 50 000 products: Request Speed

Overall, the chart shows that the single product page loading times stay quite stable the whole time. The shop page loading times, however, increase right from the beginning. That’s what I would expect, because the shop page queries the entire products table and needs to load meta data for at least ten products, while the single product page only has to do that for one.

Furthermore, we can see two quite obvious changes in the chart. Around 15 000 products (step 150), the CPU approaches 50% and we get a bigger variation in the loading times. The single product times still remain fairly stable, but the shop page times clearly increase. The same happens again when the CPU usage passes 70%, around 35 000 products.

Insertion Speed

The time needed to add new products obviously grows with the number of products already in the database, which is expected. I didn’t expect such a steep increase, though. I would have thought you had to run WooCommerce with many more products before going from around 4 seconds per 100 products to well over a minute.

Slow WooCommerce with 50 000 products: Insertion Speed

For the last dip in insertion time around 45 000 products I have no explanation. It might be that some resources were freed on my computer, but I’m not really sure.

Slow WooCommerce performance with 10 000 Products?

To double-check the previous results, I ran the same tests with only 10 000 products. I wanted to see whether a test with fewer products would lead to the same conclusions, and at the same time reduce the test time for further runs. Before this new test, I restored the initial database from before the first test, then ran the following command:


MYSQL_PID=4333 ITERATIONS=100 POST_INSERT_CHUNK=100 npm start

CPU

The CPU usage rises even with only 10 000 products. It goes up to about 30%, which matches the value at 10 000 products in the previous test. Meta table size and product count behave as expected.

Slow WooCommerce with 10 000 products: CPU-Usage

Requests

The requests show the same picture. The shop page request time seems to rise in line with the CPU usage, just as it did in the 50 000 test below the 15 000 product mark. The single product page loads steadily in 100 to 150 ms.

Slow WooCommerce with 10 000 products: Request Speed

Insertion Speed

The same goes for the insertion speed, which shows a picture similar to the one above.

Slow WooCommerce with 10 000 products: Insertion Speed

What Causes a Slow WooCommerce with Many Products

Now, one of my questions was what causes the slow WooCommerce queries, and whether the problem is actually related to WooCommerce at all. From experience with prior projects, I already had a pretty good idea what the reason was. However, it’s always nice to confirm your assumptions with data.

Big Meta Table = Long Loading Times

My experience is that WooCommerce performance problems with many posts are mainly related to the size of the meta table. As we have seen above, the table grows quite fast when you add more products; after adding 50 000 products, it contained around 650 000 rows. To test whether the meta-table assumption was accurate, I added a function to my application which removes all WooCommerce meta fields after each product insertion cycle. The following command runs the 10 000 test with that addition.


MYSQL_PID=4333 ITERATIONS=500 POST_INSERT_CHUNK=100 npm start
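
The cleanup step itself boils down to deleting every meta row that belongs to a product. A minimal sketch with the mysql module, assuming the default wp_ table prefix (the actual function is part of the test app and may differ in detail):

var mysql = require('mysql');

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'wordpress'
});

// delete all meta rows belonging to WooCommerce products
function removeProductMeta(callback) {
  var sql =
    'DELETE pm FROM wp_postmeta pm ' +
    'JOIN wp_posts p ON p.ID = pm.post_id ' +
    "WHERE p.post_type = 'product'";
  connection.query(sql, callback);
}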

CPU

The CPU chart already shows that the idea seems to be correct. Without the big meta table, the CPU usage still increases slightly, but compared to before it is almost stable.

Slow WooCommerce: CPU-usage without meta

Requests

The request chart gives the same impression. The shop page request time follows the CPU usage values and stays stable between 80 and 100 milliseconds.

Slow WooCommerce: Request times without meta

Insertion Speed

Only the insertion speed paints a different picture. Here, removing the meta fields brings no performance advantage at all. I’ve added the grey line of insertion times from the previous test, and it is almost identical. That was a surprise, I have to admit; I thought there would be at least some difference.

Slow WooCommerce: Inserts without meta fields

Conclusions

So, that was fun. But is there something we can do about it? Can we still use WooCommerce for more than 15 000 products?

Yes, you can use WooCommerce even with many products. But with the knowledge gathered here, we can make some decisions early on, when building a shop, that will help us down the road.

Some tips to avoid slow WooCommerce:

  1. Before you start building, check how many products the shop will be selling, now and in the future.
  2. Avoid more meta fields than absolutely necessary; that includes meta fields on other post types as well.
  3. As much as we love Advanced Custom Fields: for a big shop, you might want to skip it. For every field you add to a product with ACF, you get two meta fields in the database.
  4. Think about a possible import process very early and test it several times while building the shop. You need to know how much time it will take.
  5. Inform everyone about possibly long import sessions and the extra load they put on the servers.
  6. Use a good caching plugin, which takes load off your server and frees resources for other things.

Even if you do everything right, at some point all precautions won’t help anymore. Then you have to think about widening your infrastructure. There are tools which can help, like Elasticsearch or Redis, and there are surely a hundred ways to code your way around WooCommerce limitations (e.g. two of my workarounds for the admin panel below). And, last but not least, there is always the possibility of a bigger server or a more advanced architecture.

In one of my bigger projects, we are using Redis with Node.js, as a buffer for large sets of monitoring data.

When we scaled that up from a test case with just one data-sending instance to about 150 instances, Redis ran out of memory quite fast. Depending on versions, the operating system and some other things, that leads either to a crash of the Redis node or to a significant performance decrease.

Redis with Node.js, when zipping is appropriate

Important to know is that Redis’ limit for storing data is its server’s memory limit. Basically, that’s what makes Redis so fast. This is what the Redis FAQ tells us:

Redis is an in-memory but persistent on disk database, so it represents a different trade off where very high write and read speed is achieved with the limitation of data sets that can’t be larger than memory. (Source: Redis FAQ)

However, for large data sets like ours, that leaves you with a difficult decision: buy more expensive memory for your server, or use CPU power on the client to reduce the data size by compressing it. Which way is right for your specific setup is something you have to sort out yourself. A good starting point is this study about a similar problem:

On the other hand, it seems that when we have an application that uses Redis mostly for reading stuff from it, we might think about compressing the input data. However as I mentioned before – try not to overdo it and implement such a mechanism only when you really need it.

Now, did you make your decision? Read on if you want to know how to implement this with Node.js.

Compress Data for Redis with Node.js

The implementation is built on the following Node.js modules:

  • redis
  • zlib

Here is how you write zipped data to Redis:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';
var redisValue = 'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim   ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.';

zlib.deflate(redisValue, function (err, zippedValue) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  // you have to encode the binary zip-data to base64 to be able to read it
  // later
  client.set(redisKey, zippedValue.toString('base64'), function (err) {

    if (err) {
      console.log('Error saving to redis!');
      return;
    }

    console.log('Zipped value saved!');

  });
});

And that’s how you read it:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';

client.get(redisKey, function (err, zippedValue) {

  if (err) {
    console.log('Error reading from redis!');
    return;
  }

  // decode the base64 string back into the binary zip data
  // (Buffer.from replaces the deprecated new Buffer constructor)
  zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, redisValue) {

    if (err) {
      console.log('Error inflating!');
      return;
    }

    // zlib.inflate returns a Buffer, so convert it back to a string
    console.log(redisValue.toString());

  });
});

Be aware of the base64 encoding. It’s required so you can write and later read back the binary zipped data as a Redis string value from JavaScript. You can find more information, for example, here.

I am a great fan of Elasticsearch, the ElasticPress plugin and the ElasticPress WooCommerce module. The other day, though, while working on a client’s WooCommerce shop, I wasn’t satisfied with the quality of the search results for the first time.

After some digging with the Debug Bar plugin and its ElasticPress extension, I could narrow the problem down to a combination of many similar titles in our database and the Elasticsearch fuzziness parameter. Of course, the fantastic folks at 10up provide a filter for that in their ElasticPress plugin.

So here is how to disable fuzziness in the search:

function themeslug_deactivate_ep_fuzziness( $fuzz ) {
    return 0;
}
add_filter( 'ep_fuzziness_arg', 'themeslug_deactivate_ep_fuzziness' );

Of course, you can also raise the fuzziness with the same method. If you want to adjust the fuzziness depending on other search parameters, the filter provides two more arguments, $search_fields and $args, which might help.

Recommendation: If you want to use WordPress with Elasticsearch, you can get both with Kinsta. Read more on how to speed up WordPress with their Elasticsearch solution on the Kinsta blog.

WooCommerce is the WordPress plugin for webshops, without question. For a shop that is not running with thousands of products, its performance is very nice, too. However, there are limitations and drawbacks when running WooCommerce with many products. That’s mostly because it is built upon the WordPress database schema, which forces WooCommerce to save a lot of data in the wp_postmeta table.


In my current installation, WooCommerce creates 26 meta fields for each product, so the product-related meta-table rows would grow beyond a million for more than 38 461 products. Adding other theme- and plugin-related fields, you might cross that mark much earlier. Depending on your server architecture, such a big meta table can make some database queries really slow.
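
If you want to know where your own installation stands, you can count the meta rows per product directly in the database. A small sketch using the mysql module (the query works just as well in any MySQL client), with the default wp_ table prefix assumed:

var mysql = require('mysql');

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'wordpress'
});

// average number of meta rows per product
connection.query(
  "SELECT COUNT(*) / (SELECT COUNT(*) FROM wp_posts WHERE post_type = 'product') AS meta_per_product " +
  "FROM wp_postmeta pm JOIN wp_posts p ON p.ID = pm.post_id WHERE p.post_type = 'product'",
  function (err, rows) {
    if (err) { throw err; }
    console.log('Meta rows per product: ' + rows[0].meta_per_product);
    connection.end();
  }
);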


Running such a big shop should make you think about your server architecture. The odds are high that you can’t avoid an upgrade in the long run. However, sometimes a quick workaround is the only way to get things running again. Here are two quick and dirty fixes to make your admin panel faster when running WooCommerce with thousands of products.

WooCommerce Status Dashboard

With WooCommerce activated, you see an admin dashboard widget that sums up your WooCommerce status, with sales per month, top-selling product, order status and stock status. As it turns out, the stock status queries are very slow, depending on the size of the meta table. Here is how to turn them off in your functions.php file:

function themeslug_deactivate_stock_reports($from) {
 global $wpdb;
 return "FROM {$wpdb->posts} as posts WHERE 1=0";
}
add_filter( 'woocommerce_report_low_in_stock_query_from', 'themeslug_deactivate_stock_reports' );
add_filter( 'woocommerce_report_out_of_stock_query_from', 'themeslug_deactivate_stock_reports' ); 

Note that turning off these reports will result in zero values in your dashboard!

The WooCommerce status dashboard widget

Slow WooCommerce Product Edit Page for WordPress >= 4.4.0

Another effect of a big meta table shows on the product edit page in wp-admin (it affects the edit pages of posts and pages as well). Since WordPress 4.4.0, an extra query is made against the meta table, which can cause long loading times. And of course there is a hook to disable it. Place the following code in your functions.php file.


function themeslug_postmeta_form_keys() {
 return false;
}
add_filter('postmeta_form_keys', 'themeslug_postmeta_form_keys');

Be aware that this fix not only makes the edit page faster, it also turns off functionality in the custom fields meta box!

Other measures

There is much more you can do to make WooCommerce with thousands of products faster, like optimizing your server architecture or using tools like Elasticsearch or a Redis cache. If you need to use the above workarounds, you should definitely consider some of these measures in the long run!