Until recently, the internal Rails services that make up our Mobile platform utilised action caching for a lot of requests. When an action is rendered, the response is compressed and cached in memcached, ready to be served by the Rails app the next time that action is called.
Now we’re going one better: caching the raw page in memcached and allowing it to be served directly from nginx. This provides a speed improvement from ~10ms down to ~1.5ms per request, as we bypass the Rails stack completely and cut down on connections to our upstream application servers.
To do this, we needed to make some modifications to Rails’ page caching facility, as it only caches to disk. Enter memcaches_page, a gem we’ve written to share this logic across all our services. Drop this into your `Gemfile` and use `caches_page` as normal. It’ll use your existing `cache_store` settings and store the page using the `fullpath` as the key. Now all we need to do is update our nginx configuration to serve this:
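A sketch of that configuration using nginx’s built-in memcached module, assuming memcached runs locally and falling back to the Rails app on a cache miss (the memcached address and the `mobile_app` upstream name are illustrative, not our actual setup):

```nginx
location / {
    default_type       text/html;

    # Matches the Rails fullpath (path plus query string) used as the cache key.
    set $memcached_key $request_uri;
    memcached_pass     127.0.0.1:11211;

    # On a cache miss (or memcached being unavailable), fall through to Rails.
    error_page 404 502 504 = @rails;
}

location @rails {
    proxy_pass http://mobile_app;
}
```

Cached pages are now served straight out of memcached by nginx; anything not in the cache still reaches the Rails app as before.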
The benefit of this caching is clear: running requests unnecessarily through the Rails stack blocks more important requests from being fulfilled. So not only does this change speed up fetching of the endpoints we cache, it also frees up the app servers to process uncacheable actions such as logging in, searching and sending messages.
The next steps on our continual quest to keep our mobile platform nice and speedy include:
- Switching to JRuby in order to use native threads, allowing us to serve more concurrent requests with a far lower memory overhead.
- Enabling HTTP Streaming. This allows mobile devices to get a head start on downloading CSS and JS assets whilst we finish preparing the response.
- Parallelising requests from the mobile application rather than fetching member information and message text one-by-one.
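The last step can be sketched with plain Ruby threads; `fetch_in_parallel` and the simulated latencies below are illustrative, not our production code:

```ruby
# Run independent fetches concurrently and collect their results in order.
# Each fetcher is any callable; in production it would be an HTTP call to
# an internal service (member information, message text, and so on).
def fetch_in_parallel(fetchers)
  fetchers
    .map { |fetcher| Thread.new { fetcher.call } }
    .map(&:value) # Thread#value joins the thread and returns its result
end

# Two simulated 50ms requests complete in roughly 50ms total,
# rather than ~100ms when fetched one-by-one.
member_info  = -> { sleep 0.05; { name: 'Alice' } }
message_text = -> { sleep 0.05; 'Hello!' }
results = fetch_in_parallel([member_info, message_text])
```

The same idea applies whatever HTTP client the app uses, since the threads spend almost all of their time waiting on I/O.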
There are some fantastic engineering challenges around our mobile platform. We’re using some great technologies like JRuby, Redis, RabbitMQ, nginx and memcached to eke out as much performance as possible. If you get excited about a 5ms speedup in a request, love to make graphs slope downwards with a single deployment, and generally care about the user’s experience, you’d fit right in here. Check out our jobs page for our latest openings.