Current WordPress REST API extension list

If you want to have all the endpoints as a plugin, there is one in the plugin directory: WUXT Headless WordPress API Extensions

I love the WordPress REST API and am switching more and more from theme development to a headless WP approach with a nice front-end framework. Right now I’m favoring Nuxt.js, which is built on Vue.js (check out wuxt, my very own dockerized nuxt/wp development environment).

To use WP’s full strength with the REST API, I’ve collected/built a useful snippet library of WordPress REST API extensions. I’ll try to maintain the following list as development goes on. All of the following extensions can be embedded in the functions.php file. If you’re wondering about the wuxt_ prefix: I’ve got the code from my Wuxt project, and the prefix is as good as any.

Front-page extension

Everything starts with a nice front page, but there is no obvious way to get the WordPress front page via the REST API. To read the settings, you have to be authorized, which makes things unnecessarily complicated. So here is a custom endpoint for getting the front page.

GET: /wp-json/wp/v2/front-page

<?php

/**
 * Adds a front-page endpoint which returns the page configured as the
 * front page (or the latest posts if no static front page is set)
 */
add_action( 'rest_api_init', 'wuxt_front_page_route' );


function wuxt_front_page_route() {
    register_rest_route( 'wp', '/v2/front-page', array(
        'methods'  => 'GET',
        'callback' => 'wuxt_get_front_page'
    ) );
}


function wuxt_get_front_page( $object ) {

    $request  = new WP_REST_Request( 'GET', '/wp/v2/posts' );

    $frontpage_id = get_option( 'page_on_front' );
    if ( $frontpage_id ) {
        $request  = new WP_REST_Request( 'GET', '/wp/v2/pages/' . $frontpage_id );
    }

    $response = rest_do_request( $request );
    if ($response->is_error()) {
        return new WP_Error( 'wuxt_request_error', __( 'Request Error' ), array( 'status' => 500 ) );
    }

    $embed = $object->get_param( '_embed' ) !== NULL;
    $data = rest_get_server()->response_to_data( $response, $embed );

    return $data;

}

?>

Menu endpoint

Right now, there is no way I know of to get menus from the WordPress REST API. I’m not sure if it’s the most effective way, but here is my custom WordPress REST API endpoint for menus. It also registers a standard ‘main’ menu location, which is used as the default if no location is requested.

Note: Most of the work for this snippet is done by the Menu class by Michael Cox; you have to include it to get the endpoint to work.

GET: /wp-json/wp/v2/menu?location=<location>

<?php

/**
 * Adds a menu endpoint
 */
require_once('/path/to/Menu.php');

add_action('init', 'wuxt_register_menu');
add_action('rest_api_init', 'wuxt_route_menu');

function wuxt_register_menu()
{
    register_nav_menu('main', __('Main menu'));
}

function wuxt_route_menu()
{
    register_rest_route('wp', '/v2/menu', array(
        'methods' => 'GET',
        'callback' => 'wuxt_get_menu',
    ));
}

function wuxt_get_menu($params)
{
    $params = $params->get_params();
    $theme_locations = get_nav_menu_locations();

    if (!isset($params['location'])) {
        $params['location'] = 'main';
    }

    if ( ! isset( $theme_locations[$params['location']] ) ) {
        return new WP_Error( 'wuxt_menu_error', __( 'Menu location does not exist' ), array( 'status' => 404 ) );
    }

    $menu_obj = get_term( $theme_locations[$params['location']], 'nav_menu' );

    // bail if the location exists but has no menu assigned to it
    if ( ! $menu_obj || is_wp_error( $menu_obj ) ) {
        return new WP_Error( 'wuxt_menu_error', __( 'No menu assigned to this location' ), array( 'status' => 404 ) );
    }

    $menu_name = $menu_obj->name;
    $menu = new Menu( $menu_name );
    return $menu->getTree();
}

?>

Filtering Categories and taxonomies

When filtering taxonomies with a REST API request, you are stuck with OR-queries, because the category endpoint doesn’t give you the full complexity of a tax_query. That means you can get posts which are either in category A or B. The following adjustment doesn’t give you the full complexity either, but it lets you switch all tax_queries to an AND-relation, so that you can select posts which are both in category A and B.

GET: /wp-json/wp/v2/posts/?categories=1,2&and=true

<?php

  /**
   * Adds an AND relation to REST category filter queries
   */
  add_action( 'pre_get_posts', 'wuxt_override_relation' );

  function wuxt_override_relation( $query ) {

      // bail early when not a rest request
      if ( ! defined( 'REST_REQUEST' ) || ! REST_REQUEST ) {
          return;
      }

      // bail if the "and" relation is not requested
      if ( ! isset( $_GET['and'] ) || ! $_GET['and'] || 'false' === $_GET['and'] ) {
          return;
      }

      $tax_query = $query->get( 'tax_query' );
      if ( ! is_array( $tax_query ) ) {
          return;
      }

      // switch every taxonomy clause to an AND relation
      foreach ( $tax_query as $index => $tax ) {
          $tax_query[ $index ]['operator'] = 'AND';
      }

      $query->set( 'tax_query', $tax_query );

  }

?>

Loading ACF Meta-fields

Integrating meta fields from the Advanced Custom Fields plugin into the REST API responses can be done with this plugin. If you need a simpler solution (only integrating meta fields into post objects, not writing them), or simply a bit more control, the following snippet can get you started. It sets all ACF fields to show_in_rest, which lets them appear in the post objects’ meta section:

<?php

    /**
     * Register meta fields from ACF
     */
    add_action( 'init', 'wuxt_register_acf_meta' );

    function wuxt_register_acf_meta() {

        if ( ! function_exists( 'acf_get_field_groups' ) ) {
            return;
        }

        // expose every ACF field as a meta field in the REST API
        foreach ( acf_get_field_groups() as $acf_field_group ) {
            foreach ( acf_get_fields( $acf_field_group ) as $field ) {
                register_meta( 'post', $field['name'], array( 'show_in_rest' => true ) );
            }
        }

    }

?>

SEO-Data

The register_meta trick is handy for other plugins, too. If you want to integrate data from our favorite SEO add-on, Yoast WordPress SEO, into the post objects, you can do it like this:

<?php

    /**
     * Register meta fields for WordPress SEO
     */
    add_action( 'init', 'wuxt_register_yoast_meta' );

    function wuxt_register_yoast_meta() {
      if(in_array('wordpress-seo/wp-seo.php', apply_filters('active_plugins', get_option('active_plugins')))){

          $allowed_yoast_keywords = array(
              '_yoast_wpseo_title',
              '_yoast_wpseo_bctitle',
              '_yoast_wpseo_metadesc',
              '_yoast_wpseo_focuskw',
              '_yoast_wpseo_meta-robots-noindex',
              '_yoast_wpseo_meta-robots-nofollow',
              '_yoast_wpseo_meta-robots-adv',
              '_yoast_wpseo_canonical',
              '_yoast_wpseo_redirect',
              '_yoast_wpseo_opengraph-description',
          );

          foreach( $allowed_yoast_keywords as $field) {
              register_meta( 'post', $field, array( 'show_in_rest' => true ) );
          }

      }
    }

?>

Building URLs

If you are building a front-end app on top of WordPress, you have to think about how to structure your URLs. WordPress has two default post types (posts & pages), and the URL does not distinguish which type you are requesting, so http://wp-site.expl/something might lead to a page or a post, depending on the type of the object with the slug something.

That means that if you want to mirror that behaviour in your app, you have to make two requests for each URL: one searching pages, one searching posts. To get away with a single request, use the following endpoint.

GET: /wp-json/wp/v2/slug/<post-or-page-slug>

<?php

/**
 * Adds a slug endpoint for getting the page or post for a given slug
 */
add_action('rest_api_init', 'wuxt_slug_route');


function wuxt_slug_route()
{
    register_rest_route('wp', '/v2/slug/(?P<slug>\S+)', array(
        'methods'  => 'GET',
        'callback' => 'wuxt_get_slug'
    ));
}


function wuxt_get_slug($object)
{

    $slug = $object->get_param('slug');

    $request = new WP_REST_Request('GET', '/wp/v2/posts');
    $request->set_param('slug', $slug);

    $response = rest_do_request($request);

    if (!$response->data) {

        $request = new WP_REST_Request('GET', '/wp/v2/pages');
        $request->set_param('slug', $slug);

        $response = rest_do_request($request);
    }

    if (!$response->data) {
        return new WP_Error('wuxt_no_such_slug', __('Slug does not exist'), array('status' => 404));
    }

    $embed = $object->get_param('_embed') !== NULL;
    $data = rest_get_server()->response_to_data($response, $embed);

    return $data[0];
}

?>

More to come …

Hope you found some of the code above useful. Send me your extensions in the comments and I will happily integrate them!

Since WordPress 5.0 the Gutenberg block editor is here, and it seems like curiosity about what you can do with it is slowly surpassing the panic about what Gutenberg could do to all your beloved projects. To get the block editor into one of your existing projects, there are some hurdles to clear:

  • getting to know Gutenberg’s editing experience
  • building at least one block-ready template
  • deactivating the classic editor (not really a hurdle)
  • and … convincing your client

Say you tackled all these problems and your Gutenberg migration project is good to go: how do you start?

Step by Step: Gutenberg for specific posts or pages

In my (very humble) experience of one finished and one ongoing block-editor migration project: if you can’t start from scratch, you should do it step by step, post by post and page by page. First group your posts and pages by special features and make a plan for how you can build them in Gutenberg. Which blocks do you need, can you apply your own color scheme, what about forms, custom fields, shortcodes etc.?

With that plan in place, you can pick one pilot post and start to shape it the Gutenberg way. At the same time you don’t want to interfere with your client’s possible changes to other posts, and you want to give them the chance to get to know the block editor in a limited playground. So now you have to do some coding.

  1. Deactivate Gutenberg everywhere. I know, you just activated it. But remember, you basically want to preserve the classic experience on the entire site, so that has to be the default. This time we don’t use the plugin, though. We use this line of code in the functions.php file:

    // Disable Gutenberg for all posts
    add_filter('use_block_editor_for_post', '__return_false', 5);

  2. The next step is to activate the Gutenberg block editor for specific posts and pages. I think the best way is to use a special meta field, which I called use_gutenberg. If that field is true, you just activate the Gutenberg block editor with the appropriate filter (use_block_editor_for_post).

    function theme_enable_gutenberg_post_meta($can_edit, $post) {

    	if (empty($post->ID)) return $can_edit;

    	if (get_post_meta($post->ID, 'use_gutenberg', true)) return true;

    	return $can_edit;

    }
    add_filter('use_block_editor_for_post', 'theme_enable_gutenberg_post_meta', 10, 2);

If you add the meta field with ACF, like I did, a simple true/false field named use_gutenberg on the relevant post types is all you need.
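
If you prefer to register that field in code rather than in the ACF admin UI, a minimal sketch with acf_add_local_field_group() could look like the following. The group key, the labels and the page-only location rule are placeholder assumptions, adjust them to your setup:

    // Sketch: register the use_gutenberg toggle as a local ACF field group.
    // Group key, labels and the location rule are placeholders.
    add_action('acf/init', 'theme_register_use_gutenberg_field');

    function theme_register_use_gutenberg_field() {

    	if (!function_exists('acf_add_local_field_group')) return;

    	acf_add_local_field_group(array(
    		'key'      => 'group_use_gutenberg',
    		'title'    => 'Editor settings',
    		'fields'   => array(
    			array(
    				'key'           => 'field_use_gutenberg',
    				'label'         => 'Use Gutenberg',
    				'name'          => 'use_gutenberg',
    				'type'          => 'true_false',
    				'ui'            => 1,
    				'default_value' => 0,
    			),
    		),
    		'location' => array(
    			array(
    				array(
    					'param'    => 'post_type',
    					'operator' => '==',
    					'value'    => 'page',
    				),
    			),
    		),
    	));
    }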

Now you can transfer the group of pages and finally the entire site to Gutenberg in a very controlled and fail-safe way. Your clients will thank you!

Do you have thoughts about Gutenberg migrations, or other experiences with bringing the block editor to existing sites? Please let me know in the comments.

 

In one of my bigger projects, we are using Redis with Node.js as a buffer for large sets of monitoring data.

When we scaled that up from a test case with just one data-sending instance to about 150 instances, Redis ran out of memory quite fast. Depending on versions, the operating system and some other things, that leads to either a crash of the Redis node or a significant performance decrease.

Redis with Node.js, when zipping is appropriate

Important to know is that Redis’ limit for storing data is its server’s memory limit. Basically, that’s what makes Redis so fast. This is what the Redis FAQ tells us:

Redis is an in-memory but persistent on disk database, so it represents a different trade off where very high write and read speed is achieved with the limitation of data sets that can’t be larger than memory. (Source Redis FAQ)

However, for large data sets like ours, that leaves you with a difficult decision: buy more expensive memory for your server, or use CPU power on the client to reduce the data size by compressing it. Which way is the right one for your specific setup is something you have to sort out yourself. A good starting point is this study about a similar problem:

On the other hand, it seems that when we have an application that uses Redis mostly for reading stuff from it, we might think about compressing the input data. However as I mentioned before – try not to overdo it and implement such a mechanism only when you really need it.

Now, did you make your decision? Read on if you want to know how to implement this with Node.js.

Compress Data for Redis with Node.js

The implementation is built on the following Node.js modules:

  • redis
  • zlib

Here is how you write zipped data to Redis:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';
var redisValue = 'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim   ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.';

zlib.deflate(redisValue, function (err, zippedValue) {

  if (err) {
    console.log('Error deflating!');
    return;
  }

  // you have to encode the binary zip-data to base64 to be able to read it
  // later
  client.set(redisKey, zippedValue.toString('base64'), function (err) {

    if (err) {
      console.log('Error saving to redis!');
      return;
    }

    console.log('Zipped value saved!');

  });
});

And that’s how you read it:

var zlib = require('zlib'),
  redis = require('redis'),
  client = redis.createClient();

client.on("error", function (err) {
  console.log("Error " + err);
});

var redisKey = 'add_your_key_here';

client.get(redisKey, function (err, zippedValue) {

  if (err) {
    console.log('Error reading from redis!');
    return;
  }

  // you have to decode the base64 string back to binary zip-data before
  // inflating it
  zlib.inflate(Buffer.from(zippedValue, 'base64'), function (err, redisValue) {

    if (err) {
      console.log('Error inflating!');
      return;
    }

    // inflate gives us a Buffer, convert it back to a string
    console.log(redisValue.toString());

  });
});

Be aware of the base64 encoding. It’s required so you are able to write and read binary zipped data as strings in JavaScript. You can find more information about that in the Node.js Buffer documentation, for example.

I am a great fan of Elasticsearch, the ElasticPress plugin and the ElasticPress WooCommerce module. The other day, when I was working on a client’s WooCommerce shop, it was the first time I wasn’t satisfied with the quality of the search results, though.

After some digging with the Debug Bar plugin and its ElasticPress extension, I could narrow the problem down to a combination of many similar titles in our database and the Elasticsearch fuzziness parameter. Of course the fantastic folks at 10up provide a filter for that in their ElasticPress plugin.

 

So here is how to disable fuzziness in the search:

function themeslug_deactivate_ep_fuzziness( $fuzz ) {
    return 0;
}
add_filter( 'ep_fuzziness_arg', 'themeslug_deactivate_ep_fuzziness' );

Of course you can also raise the fuzziness with the same method. If you want to adjust fuzziness depending on other search parameters, the filter provides two more arguments, $search_fields and $args, which might help.
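
Here is a minimal sketch of that idea. It assumes that $args carries the WP_Query arguments (so the search term is available as $args['s']) and uses an arbitrary length threshold as an example, only allowing a bit of fuzziness for longer search terms:

function themeslug_adjust_ep_fuzziness( $fuzz, $search_fields, $args ) {
    // assumption: $args holds the WP_Query arguments, so 's' is the search term
    $search_term = isset( $args['s'] ) ? $args['s'] : '';

    // allow a little fuzziness for longer terms only (the threshold is just an example)
    if ( strlen( $search_term ) >= 8 ) {
        return 1;
    }

    return 0;
}
add_filter( 'ep_fuzziness_arg', 'themeslug_adjust_ep_fuzziness', 10, 3 );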

Recommendation: If you want to use WordPress with Elasticsearch, you can get both with Kinsta. Read more on how to speed up WordPress with their Elasticsearch solution on the Kinsta blog.

 

WooCommerce is the WordPress plugin for web shops, without question. For a shop that is not running thousands of products, it has very nice performance, too. However, there are limitations and drawbacks when running WooCommerce with many products. That’s mostly because it is built upon the WordPress database schema, which forces WooCommerce to save a lot of data in the wp_postmeta table.

In my current installation, WooCommerce creates 26 meta fields for each product, so the product-related meta-table rows would grow beyond a million for more than 38,461 products (1,000,000 / 26 ≈ 38,461). Adding other theme- and plugin-related fields, you might cross that mark much earlier. Depending on your server architecture, such a big meta table can make some database queries really slow.

Running such a big shop should make you think about your server architecture. The odds are high that you can’t avoid an upgrade in the long run. However, sometimes a quick workaround is the only way to get things running again. Here are two quick and dirty fixes to make your admin panel faster when running WooCommerce with thousands of products.

WooCommerce Status Dashboard

With WooCommerce activated, you see an admin dashboard widget that sums up your WooCommerce status, with sales per month, most sold product, order status and stock status. As it turns out, the stock-status queries are very slow, depending on the meta-table size. Here is how to turn them off in your functions.php file:

function themeslug_deactivate_stock_reports( $from ) {
    global $wpdb;
    return "FROM {$wpdb->posts} as posts WHERE 1=0";
}
add_filter( 'woocommerce_report_low_in_stock_query_from', 'themeslug_deactivate_stock_reports' );
add_filter( 'woocommerce_report_out_of_stock_query_from', 'themeslug_deactivate_stock_reports' );

Note that turning off these reports will result in 0-values in your dashboard!

Slow WooCommerce product edit page for WordPress >= 4.4.0

Another effect of a big meta table shows on the product edit page in wp-admin (it even affects the edit pages of posts and pages). Since WordPress 4.4.0 an extra query against the meta table is made there, which might cause long loading times. And of course there is a hook to disable it. Place the following code in your functions.php file.


function themeslug_postmeta_form_keys() {
 return false;
}
add_filter('postmeta_form_keys', 'themeslug_postmeta_form_keys');

Be aware that this fix not only makes the edit page faster, it also turns off functionality in the custom-fields meta box!

Other measures

There is much more you can do to make WooCommerce with thousands of products faster, like optimizing your server architecture or using tools like Elasticsearch or a Redis cache. If you need to use the above workarounds, you should always consider some of these measures in the long run!