When I’m running WordPress inside a Docker container, I usually only mirror the wp-content directory from the container to the host. You shouldn’t change anything in the other WordPress directories anyway, and you avoid a lot of volume syncing. The downside of this approach is that you can’t run the wp command from the host; you have to run WP-CLI inside Docker, which means logging in to the container, running the commands and logging out again. Assuming your container is called wp-container, the following commands would do that:

  
docker exec -ti wp-container bash
wp plugin list
exit
  

It’s just a small inconvenience, but why not save some commands when we can?

I’m using gulp as a task runner for most of my projects, so I added a task which forwards all commands to WP-CLI inside Docker. It even checks whether the container is running and whether WP-CLI is installed inside the container. If not, it is installed automatically.

var gulp = require('gulp');
var spawn = require('child_process').spawn;

// Checks that every given container name shows up in `docker ps`
var checkContainers = function(names, done) {
  var exec = require('child_process').exec;
  exec('docker ps --format {{.Names}}', function(error, stdout, stderr) {
    done(names.map(function(name) {
      return stdout.split("\n").indexOf(name) >= 0;
    }).reduce(function(running, next) {
      return running && next;
    }, true));
  });
};

// Checks whether WP-CLI is available inside the given container
var checkWPCli = function(container, done) {
  var exec = require('child_process').exec;
  exec('docker exec ' + container + ' bash -c \'wp\'', function(error, stdout, stderr) {}).on('exit', function(code) {
    done(127 !== code);
  });
};

// Downloads WP-CLI inside the container and makes it globally executable
var installWPCli = function(container, done) {
  var exec = require('child_process').exec;
  exec('docker exec ' + container + ' bash -c \'apt-get update && apt-get install -y less && curl -O https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar && chmod +x wp-cli.phar && mv wp-cli.phar /usr/local/bin/wp && wp --allow-root cli\'', function(error, stdout, stderr) {}).on('exit', function(code) {
    done(0 === code);
  });
};

// Forwards everything after the -c flag to WP-CLI inside the container
var runWPCli = function(container, done) {
  var flagIndex = process.argv.indexOf('-c');
  var command = flagIndex < 0 ? [] : process.argv.slice(flagIndex + 1);

  if (!command.length) {
    console.log('ERROR: Provide a valid wp-cli command!');
    return done();
  }

  var exec = require('child_process').exec;
  exec('docker exec ' + container + ' bash -c \'wp --allow-root ' + command.join(' ') + '\'', function(error, stdout, stderr) {
    console.log(stdout);
  }).on('exit', function(code) {
    done();
  });
};

gulp.task( 'wp', function(done) {

  checkContainers(['wp-container'], function(containerRunning) {

    if (!containerRunning) {
      console.log('ERROR: wp-container is not running. Try "docker-compose up -d"');
      return done();
    }

    checkWPCli('wp-container', function(wpCliRunning) {
      if (!wpCliRunning) {

        console.log('WARNING: wp cli not installed, trying auto install ...');

        installWPCli('wp-container', function(wpCliInstalled) {

          if (!wpCliInstalled) {
            console.log('ERROR: wp cli could not be installed!');
            return done();
          }

          console.log('SUCCESS: wp cli installed!');
          runWPCli('wp-container', done);

        });
      } else {

        runWPCli('wp-container', done);

      }
    });
  });

});

With that gulp task in place, you can run the following commands, and every other WP-CLI command as well:

gulp wp -c "plugin list"
gulp wp -c "plugin install advanced-custom-fields"

What do you think?

Is that too much hassle for running some wp commands, or would you use the gulp forwarder in your daily work? Let me know!

In the last couple of months I’ve worked a great deal with WordPress as a backend system, providing data to a JavaScript frontend via the REST API. As I couldn’t find a complete solution for my favorite frontend framework, my “getting started” process always repeated the following steps:

  • set up the WordPress development environment (Docker)
  • integrate the JavaScript frontend into the code base
  • set up the environment for the frontend (Docker)
  • make the two environments talk to each other (not always as easy as you might think)
  • set up a simple JavaScript framework to query the WordPress API from the frontend

As that procedure eats up a whole lot of project hours, it was time for an easier solution. So, together with my colleague Oliwer at Northosts, and with much appreciated input and testing from my friend Samuel, I started working on a WordPress/Nuxt.js hybrid development environment. Our goal was to provide a one-command installation which makes you ready to go in 5 minutes.

Now we are finally ready for our first release, so say hello to WUXT!

WUXT Components

WUXT combines a couple of components you almost certainly already know from other projects, adds some convenient new functionality and solves a handful of configuration problems. What follows is a short overview of all components.

Docker containers: To provide a consistent development experience on all platforms, we use Docker (what else?). Three containers are needed: one for the WordPress database (mysql.wuxt), one for WordPress (wp.wuxt) and one for nuxt.js (front.wuxt). Everything is set up with docker-compose. We were extra careful to base everything on the official images (mysql, wordpress and node) and to skip custom Dockerfiles, which should make it a lot easier to upgrade the environment. The relevant files from the containers are mirrored to the host with volumes, which makes the database persistent (_db folder) and lets you access everything important in WordPress (wp-content directory) and nuxt.js (nuxt folder).

WordPress REST API extensions: The WordPress API is awesome, but to use WordPress’ full strength in our nuxt.js frontend, we needed some additional functionality. Activating our WUXT theme adds endpoints for menus, the front-page settings and getting pages and posts by slug, and it includes meta data from well-known plugins. We are constantly adding new endpoints to support even more use cases.

WordPress connector in nuxt.js: It’s not difficult to set up some requests to the WordPress API from nuxt.js, but we wanted a standardized way. That’s why we included the wpapi node module in our nuxt application (you always have access to it via the $wp object). Even here we made some adjustments, to make sure the connection to the WordPress REST API can be established both from the nuxt container to the wp container and from the client application to the wp container. Of course, we also added some shortcuts to support our new endpoints for menus, front-page, custom post types etc.
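
To give a feeling for the connector, here is a minimal sketch that queries the WordPress backend with the wpapi module directly, the same module the $wp object wraps; inside nuxt.js you would go through $wp instead. The endpoint URL assumes the default backend port mentioned further down.

var WPAPI = require('wpapi');

// Point wpapi at the WUXT WordPress backend (default port, see "Getting started")
var wp = new WPAPI({ endpoint: 'http://localhost:3080/wp-json' });

// Fetch the five most recent posts and print their titles
wp.posts().perPage(5).then(function (posts) {
  posts.forEach(function (post) {
    console.log(post.id, post.title.rendered);
  });
}).catch(function (err) {
  console.log('Request failed: ' + err.message);
});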

Development tools: Docker is great, but sometimes it makes a developer’s life a little more difficult. Logging in to the container to run a WP-CLI command can be a hassle, and file permissions when working with volumes might be another. To make that a little easier, we added gulp tasks that give you access to WP-CLI or install new node modules inside the container, without logging in to it. We even provide a post-type generator for gulp, which lets you create automatically loaded, API-ready post types in seconds.

Nothing of the above is rocket science, but I think the combination and packaging into one complete environment should make a difference for everyone developing with WordPress and nuxt.js. Please try it out and give us feedback, issues or some of your spare time to make WUXT better.

Getting started

First clone the repository to a directory of your choice, then change to that directory and simply start your containers (you need a running Docker installation, of course):

docker-compose up -d

That starts the following containers:

  • MySql (mysql.wuxt) 
  • WordPress (wp.wuxt)
  • nuxt.js (front.wuxt)

Your containers are available at

  • Frontend: http://localhost:3000
  • Backend: http://localhost:3080 (admin: http://localhost:3080/wp-admin)
  • Database: docker exec -ti mysql.wuxt bash

Setup WordPress

After starting the containers for the first time, you have to go through the usual WordPress installation at http://localhost:3080/install.php. Then log in to wp-admin and select the wuxt theme to activate all the API extensions. Additionally, you might want to activate the ACF plugin to make working with meta data easier. Last but not least, you have to set the permalink structure to “Post Name” in the WordPress settings.

To check if everything is running, visit http://localhost:3080 and verify that the WUXT info screen is showing.

Then check that the REST API at http://localhost:3080/wp-json is returning a JSON object and you are good to go.
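
If you prefer to script that check, a minimal Node.js sketch like the following does the same thing; it simply verifies that the REST API index answers with parseable JSON.

var http = require('http');

// Request the REST API index and make sure the response parses as JSON
http.get('http://localhost:3080/wp-json', function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    try {
      JSON.parse(body);
      console.log('REST API is up and returning JSON');
    } catch (e) {
      console.log('REST API did not return valid JSON');
    }
  });
}).on('error', function (err) {
  console.log('Request failed: ' + err.message);
});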

Setup nuxt.js

Nuxt should have been started automatically inside the Docker container. The command we use to run the nuxt.js server is yarn dev. Check if the frontend is running by opening http://localhost:3000. You should be greeted by the WUXT intro screen.

Finally, check that BrowserSync is running by making a minor change to the front page; the change should be visible in the browser right away.

Feedback

Please let us know what you think about WUXT, what you did with it and what you couldn’t do. Please clone, fork, open issues, comment or contribute, it’s much appreciated.

WooCommerce is the most used webshop system in the world, and it’s awesome! However, using WordPress as its application framework has some drawbacks. In particular, you might end up with a very slow WooCommerce once the shop holds many products. When I point that out in client meetings, I inevitably get the same questions: how many products exactly result in a slow WooCommerce frontend/backend, and what can you do about it?

Despite all the knowledge I have about server architecture, clean theming and WordPress coding, that question isn’t easy to answer. That’s why I set up my own little test app.

Recommendation: Many slow WooCommerce sites simply don’t have the right type of hosting. A good WordPress/WooCommerce hosting service with great support and performance-optimizing technical add-ons like Redis and ElasticSearch is Kinsta. Read more about their WooCommerce hosting here.

Slow WooCommerce with Many Products – The Test Environment

I’ve done all of the following tests on my local computer. It’s a Dell XPS 13 with the following hardware:

  • Intel Core i7-7500U
  • 8GB LPDDR3 1866MHz
  • 256GB PCIe Solid State Drive

Operating System and services:

  • Ubuntu 14.04
  • Apache 2.4.7
  • MySql 5.5.54
  • PHP 7.0.16

The WordPress installation I’m testing with is a clean install of:

  • WordPress 4.7.2
  • WooCommerce 2.6.14
  • Twenty Seventeen Theme

No other plugins or code adjustments.

Furthermore, I wrote a small npm application which incrementally adds products to WooCommerce using the API. After each insertion of data, the application pauses for five seconds, to give garbage collection and MySql a break, and then performs some tests, like reading products from the WooCommerce API or requesting the “Shop” page from the front-end, while measuring both the request time and the average MySql CPU and memory usage.

The inserted products are simple WooCommerce products, with auto-generated lorem-ipsum titles and descriptions and a random price.
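
To illustrate the kind of insert the test app performs, here is a minimal sketch using the woocommerce-api node module; the URL, the API keys and the product fields are placeholders, not the actual test code.

var WooCommerceAPI = require('woocommerce-api');

// Placeholder credentials -- generate real keys under WooCommerce > Settings > API
var WooCommerce = new WooCommerceAPI({
  url: 'http://localhost',
  consumerKey: 'ck_xxx',
  consumerSecret: 'cs_xxx',
  wpAPI: true,
  version: 'wc/v1'
});

// Insert one simple product with a lorem-ipsum title and a random price
WooCommerce.post('products', {
  name: 'Lorem ipsum dolor',
  type: 'simple',
  regular_price: String(Math.round(Math.random() * 10000) / 100),
  description: 'Lorem ipsum dolor sit amet, consectetur adipiscing elit.'
}, function (err, data, res) {
  if (err) {
    return console.log('Insert failed: ' + err);
  }
  console.log('Inserted product: ' + res);
});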

Check out the test application on GitHub; feedback and contributions are very welcome:

https://github.com/danielauener/slow-woocommerce-tests

After cloning, go to the app folder and run

npm install

Then adjust the configuration parameters in index.js to your environment and run the application with a simple

npm start

After completing the test cycle, the test data is written to a CSV file in the benchmark directory of the app.

What’s tested

There are three questions I wanted to answer.

  1. Is WooCommerce really getting slower with many products, and by how much?
  2. Is it WooCommerce slowing down the performance, or is it actually WordPress that is slow?
  3. If we can confirm a slow WooCommerce with many products, why is that happening?

Besides these general questions, I was interested in two use cases. The obvious one was request times for the WooCommerce front-end, because that impacts conversions and sales. The second one was the insertion speed of new products into a big database. That one is important because most shops with around 10 000 products or more most likely have some kind of import process, which might add load to the entire server. Since we are talking about imports that can go on for hours, that’s quite relevant.

Slow WooCommerce performance with 50 000 Products?

In my first test I wanted to check a really huge product database, so I could get a full picture of what’s happening. After some trial runs and time calculations, I decided to go with 50 000 products, starting my test tool with the following environment variables (the MySql and WP configuration lives in the index.js file):


MYSQL_PID=4333 ITERATIONS=500 POST_INSERT_CHUNK=100 npm start

CPU and Meta Table

After a good night’s sleep, the test was finally through. Judging from how my computer behaved during the test (noisy and warm), there had to be some CPU usage, which you can see in the first image. The other two lines in the chart are the number of added posts in each step (confirming the test went as planned) and the size of the wp_postmeta table.

Slow WooCommerce with 50 000 products: CPU-Usage

As you can see, the size of the meta table (yellow) is rising much faster than the size of the products table (green), which we will discuss later. The CPU usage seems to increase logarithmically, slowly approaching 100%.

Just to clarify: the CPU usage is measured only while the test requests are made, not during the insert product calls.

Requests

The next chart I’ve created shows how the CPU usage affects the front-end request times. After every product insertion cycle, test requests were made to the shop page, which lists the first ten products, and to a single product page.
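
As a rough illustration of how such a measurement can be taken (not the exact code of the test app), timing a front-end request in Node.js can look like this; the shop URL is a placeholder for your local installation.

var http = require('http');

// Measure how long a front-end request takes, in milliseconds
function timeRequest(url, done) {
  var start = Date.now();
  http.get(url, function (res) {
    // consume the body so the 'end' event fires
    res.on('data', function () {});
    res.on('end', function () {
      done(null, Date.now() - start);
    });
  }).on('error', done);
}

timeRequest('http://localhost/shop/', function (err, ms) {
  if (err) {
    return console.log('Request failed: ' + err.message);
  }
  console.log('Shop page loaded in ' + ms + ' ms');
});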

Slow WooCommerce with 50 000 products: Request Speed

The overall development of the chart shows that the single product page loading times stay quite stable the whole time. The shop page loading times, however, increase right from the beginning. That’s what I would expect, because the shop page queries the entire products table and needs to load meta data for at least ten products, while the single product page only needs to do that once.

Furthermore, we can see two quite obvious changes in the chart. Around 15 000 products (step 150), the CPU approaches 50% and we get a bigger variation in the loading times. The single product times still seem quite stable, but the shop page times clearly increase. The same happens again when the CPU usage passes 70%, around 35 000 products.

Insertion Speed

The time needed to add new products obviously grows with the number of products already in the shop, which is expected. I didn’t expect such an increase, though. I would have thought that you would have to operate WooCommerce with many more products to get from around 4 seconds per 100 products to well over a minute.

Slow WooCommerce with 50 000 products: Insertion Speed

For the last dip in insertion time, around 45 000 products, I have no explanation. It might be that some resources were freed on my computer; I’m not really sure, though.

Slow WooCommerce performance with 10 000 Products?

To double-check the previous results, I ran the same tests with only 10 000 products. I wanted to see whether a test with fewer products would lead to the same conclusions, and also to decrease the test time for further runs. Before this new test, I restored the initial database from before the other test, then ran the following command:


MYSQL_PID=4333 ITERATIONS=100 POST_INSERT_CHUNK=100 npm start

CPU

The CPU usage rises even with 10 000 products. It goes up to about 30%, which matches the 10 000 product value of the previous test. Meta table and product count behave as expected.

Slow WooCommerce with 10 000 products: CPU-Usage

Requests

Even the requests show the same picture. The shop page request time seems to rise in line with the CPU usage, just as in the 50 000 test below the 15 000 product mark. The single product page loads steadily in between 100 and 150 ms.

Slow WooCommerce with 10 000 products: Request Speed

Insertion Speed

The same goes for the insertion speed, which shows a similar picture as above.

Slow WooCommerce with 10 000 products: Insertion Speed

What Causes a Slow WooCommerce with Many Products

Now, one of my questions was what causes slow WooCommerce queries, and whether the problem is actually related to WooCommerce at all. From experience with prior projects I already knew the answer and had a pretty good idea of the reason. However, it’s always nice to confirm your assumptions with data.

Big Meta Table = Long Loading Times

My experience was that WooCommerce performance problems with many products are mainly related to the size of the meta table. As we have seen above, the table grows quite fast when you add more products: after adding 50 000 products, it contained around 650 000 rows. To test whether the meta-table assumption was accurate, I added a function to my application which removes all WooCommerce meta fields after each product insertion cycle. The following command runs the 10 000 test with that addition.


MYSQL_PID=4333 ITERATIONS=500 POST_INSERT_CHUNK=100 npm start
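
The meta cleanup itself can be pictured as roughly the following sketch, using the mysql module; the credentials and the wp_ table prefix are assumptions, not the actual test code.

var mysql = require('mysql');

// Placeholder credentials for the local test database
var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'secret',
  database: 'wordpress'
});

// Remove all meta rows that belong to WooCommerce products
connection.query(
  "DELETE pm FROM wp_postmeta pm " +
  "JOIN wp_posts p ON p.ID = pm.post_id " +
  "WHERE p.post_type = 'product'",
  function (err, result) {
    if (err) {
      return console.log('Cleanup failed: ' + err.message);
    }
    console.log('Removed ' + result.affectedRows + ' meta rows');
    connection.end();
  }
);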

CPU

The CPU chart already shows that the idea seems to be correct. Without the big meta table, the CPU usage still increases slightly, but stays almost stable compared to before.

Slow WooCommerce: CPU-usage without meta

Requests

The requests chart gives us the same impression. The shop page request time follows the CPU usage values and stays stable between 80 and 100 milliseconds.

Slow WooCommerce: Request times without meta

Insertion Speed

Only the insertion speed paints a different picture. Here we get no performance advantage at all from removing the meta fields. I’ve added the grey line of insertion times from the previous test, and it is actually almost identical. That was a surprise, I have to admit; I thought there would be at least some difference.

Slow WooCommerce: Inserts without meta fields

Conclusions

So, that was fun. But is there something we can do about it? Shouldn’t we use WooCommerce for more than 15 000 products?

Yes, you should use WooCommerce, even with many products. But with the knowledge we gathered here, we are able to make some decisions early on when building a shop, which should help us along the road.

Some tips to avoid slow WooCommerce:

  1. Check before you start building how many products the shop will be selling, now and in the future.
  2. Avoid more meta fields than absolutely necessary; that includes meta fields on other post types as well.
  3. As much as we love Advanced Custom Fields: for a big shop, you might want to skip it. For every field you add to a product with ACF, you get two meta fields in the database.
  4. Think about an eventual import process very early and test it several times while building the shop. You need to know how much time it will take.
  5. Inform everyone about possibly long import sessions and the extra load they put on the servers.
  6. Use a good caching plugin, which takes load off your server and frees resources for other things.

Even if you do everything right, at some point all precautions won’t help anymore. Then you have to think about widening your infrastructure. There are some tools which can help, like ElasticSearch or Redis, and there are surely a hundred ways to code your way around WooCommerce limitations (e.g. two of my workarounds for the admin panel). And, last but not least, there is always the possibility of a bigger server or a more advanced architecture.

In one of my bigger projects, we are using Redis with Node.js as a buffer for large sets of monitoring data.

When we scaled that up from a test case with just one data-sending instance to about 150 instances, Redis ran out of memory quite fast. Depending on the version, the operating system and some other things, that leads either to a crash of the Redis node or to a significant performance decrease.

Redis with Node.js: when zipping is appropriate

It’s important to know that Redis’ limit for storing data is its server’s memory limit. Basically, that’s what makes Redis so fast. This is what the Redis FAQ tells us:

Redis is an in-memory but persistent on disk database, so it represents a different trade off where very high write and read speed is achieved with the limitation of data sets that can’t be larger than memory. (Source Redis FAQ)

However, for large data sets like ours, that leaves you with a difficult decision: buy more, expensive memory for your server, or use CPU power on the client to reduce the data size by compressing it. Which way is the right one for your specific setup is something you have to sort out yourself. A good starting point is this study about a similar problem:

On the other hand, it seems that when we have an application that uses Redis mostly for reading stuff from it, we might think about compressing the input data. However as I mentioned before – try not to overdo it and implement such a mechanism only when you really need it.

Now, did you make your decision? Read on if you want to know how to implement this with Node.js.

Compress Data for Redis with Node.js

The implementation is built on the following Node.js modules:

  • redis
  • zlib

Here is how you write zipped data to Redis:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';
var redisValue = 'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim   ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.';

zlib.deflate(redisValue, function (err, zippedValue) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  // you have to encode the binary zip-data to base64 to be able to read it
  // later
  client.set(redisKey, zippedValue.toString('base64'), function (err) {

    if (err) {
      console.log('Error saving to redis!');
      return;
    }

    console.log('Zipped value saved!');

  });
});

And that’s how you read it:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';

client.get(redisKey, function (err, zippedValue) {

  if (err) {
    console.log('Error reading from redis!');
    return;
  }

  // the binary zip data has to be base64-decoded before it can be inflated
  zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, redisValue) {

    if (err) {
      console.log('Error inflating!');
      return;
    }

    console.log(redisValue.toString());

  });
});

Be aware of the base64 encoding. It’s required to be able to write and read the binary zipped data as a string in JavaScript. You can find more information here, for example.