here at rapidrabbit we deliver many thousand requests per second. we do this with only a handful of servers and ruby on rails, by employing some very clever caching using redis and nginx.
once the cache is written, nginx accesses it directly via a module, which makes it around 500-2,000 times faster than any rails controller.
today i am going to show you how to do it.
as a prerequisite i assume you have a running redis server and a working nginx with the HttpRedis module installed. if not, ask your admin how to set them up.
aside from that you also need the, big surprise, ‘redis’ gem in your rails application.
so go ahead and add
gem 'redis'
to your Gemfile.
create a
config/initializers/redis.rb
with the following content:
# Rails.env is more reliable in an initializer than ENV['RAILS_ENV'], which may be unset
if Rails.env.production?
  $redis_cache = Redis.new(:host => 'your.redis.master.org', :port => 6379, :db => 1)
else
  $redis_cache = Redis.new(:host => 'localhost', :port => 6379, :db => 1)
end
we just define a global variable holding our connection to the redis database. i prefer to switch the db from ‘0’ to ‘1’ in case you are already using redis for something else and don’t want to clutter it up with caching keys.
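if you want to sanity-check the connection, a quick rails console session (a minimal sketch, key and value are throwaway examples) looks like this:

$redis_cache.ping
# => "PONG"
$redis_cache.setex('/test', 10, '<html>hello</html>') # expires after 10 seconds
$redis_cache.get('/test')
# => "<html>hello</html>"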
now we can start caching using a simple after_filter. first, add this to your app/controllers/application_controller.rb:
def save_cache_to_redis
  # key by the request path so it matches the $redis_key ($uri) that nginx looks up below
  $redis_cache.setex(request.path, @cache_lifetime, response.body)
end
as you can see this is fairly simple and doesn’t really need that much explaining: you save the whole html output to a redis key named after the request path, and setex makes it expire after @cache_lifetime seconds.
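one thing to keep in mind: the filter as written caches every response it sees, including error pages. a hedged variant that only stores successful GET responses could look like this (a sketch, assuming rails 3 where response.status is an integer):

def save_cache_to_redis
  # only cache successful GET requests that actually set a lifetime
  return unless request.get? && response.status == 200 && @cache_lifetime
  $redis_cache.setex(request.path, @cache_lifetime, response.body)
end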
so in the controller that should be cached, you write:
after_filter :save_cache_to_redis, :except => [:create, :destroy, :update]

...

def index
  @cache_lifetime = 86400 # after 1 day the request will hit the controller again
  ...
end
and voila, now your complete response will get saved into the redis store. but how do we actually deliver from the cache?
as i already mentioned, we will do this directly from nginx without any involvement from rails. why?
because this is god damn fast. how fast can it be? a 2 ghz dual-core server can easily deliver 300-400 mb/s of http responses (at 50kb response size)…that fast.
so edit the nginx configuration for the domain that needs caching (or ask your admin nicely) to look something like this:
upstream redis {
  server localhost:6379;
  keepalive 1024 single;
}

upstream yourunicornupstream {
  server unix:/home/appuser/app/shared/unicorn.sock;
}

server {
  listen 80;
  server_name your.website.com;
  root /home/appuser/app/current/public;
  error_log /dev/null crit; # real men don't log

  location / {
    set $redis_db "1";
    set $redis_key $uri;
    default_type text/html;
    redis_pass redis;
    error_page 404 405 502 504 = @fallback;
  }

  location @fallback {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;
    proxy_redirect off;

    if (!-f $request_filename) {
      proxy_pass http://yourunicornupstream;
      break;
    }
  }
}
and bam! your server now asks redis if it has a key for the url in question, and if not, falls back to your unicorn.
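to convince yourself the whole chain works, here is a small smoke test in ruby (hostname and path are placeholders; a sketch, not part of the setup):

require 'net/http'

uri = URI('http://your.website.com/thingies')
first  = Net::HTTP.get_response(uri) # miss: nginx falls back to unicorn, which fills the cache
second = Net::HTTP.get_response(uri) # hit: nginx answers straight from redis
puts first.code, second.code         # both should print "200"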
but what about that cache lifetime?
since it is said that the two big problems in ruby on rails apps are ‘naming things’ and ‘cache invalidation’, this is no easy question.
but i’ll give it a shot anyway.
there are two basic scenarios: either you know exactly when you want to refresh the cache, or you just want it to be ‘fresh enough’.
if the second case is true for you, just pick a caching time high enough to make your site faster, but not so high that outdated content sticks around for too long.
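as a rough illustration, per-action lifetimes could look like this (the numbers are made up, tune them to your content):

def index
  @cache_lifetime = 300     # list pages that change often: 5 minutes is usually 'fresh enough'
end

def about
  @cache_lifetime = 604800  # pages that almost never change: 1 week
end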
in the first case, here is how to invalidate your cache:
let’s say you cache the index view of some model. that means if the model gets created/deleted/edited, you need to invalidate the cache.
so in your app/models/thingy.rb
class Thingy < ActiveRecord::Base
  after_create  :invalidate_cache
  after_destroy :invalidate_cache
  ...

  def invalidate_cache
    $redis_cache.keys("/thingies/*").each do |key|
      $redis_cache.del(key)
    end
  end
end
of course this is only a modest example of how far you can go with your invalidation, but i’ll leave the rest up to you.
UPDATE!
i have written a second article regarding far better invalidation, you should check it out ;)
i hope you got something out of reading this; if so, drop me a comment.
have fun.
Could this technique work with user authentication using Devise?
yes, but then you’d have to modify it so that the rails controller delivers the cached result, which would be a lot slower.
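for the curious, a rough sketch of what that modification could look like (hypothetical controller, rails 3 style, where rendering in a before_filter halts the chain):

class ThingiesController < ApplicationController
  before_filter :authenticate_user!                # devise checks the session first
  before_filter :serve_cache_from_redis, :only => [:index, :show]

  private

  def serve_cache_from_redis
    cached = $redis_cache.get(request.path)
    render :text => cached if cached               # skips the action on a cache hit
  end
end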
keep in mind that using the ‘keys’ command on a large redis database will slow you down considerably. we had a situation where running keys was timing out commands that came after it, and all hell broke loose. so beware.
one alternative is to keep a reference to your cached views’ urls in a redis set, so when you go to invalidate, instead of hitting all the keys in your database to find what you need to delete, you simply get the members of the set and delete those.
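a minimal sketch of that idea (the set name 'cached:thingies' is made up), reusing the $redis_cache connection from the article:

# when writing the cache, also remember the key in a set
$redis_cache.setex(request.path, @cache_lifetime, response.body)
$redis_cache.sadd('cached:thingies', request.path)

# when invalidating, delete exactly those keys instead of scanning with 'keys'
def invalidate_cache
  $redis_cache.smembers('cached:thingies').each { |key| $redis_cache.del(key) }
  $redis_cache.del('cached:thingies')
end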
i know…we ran into the same issue ;) …i’m already writing a second article covering this problem…this solution only works for “normal” websites with a few thousand keys in the database…
so if you stay tuned i’ll deliver the more powerful stuff soon…
check this out:
http://over9000.org/rails/rails-caching-with-redis-invalidation-done-right
better?
whoot. nicely done.
This is pretty cool, but this is what a reverse proxy is meant for.
Set proper HTTP caching headers and use a fast reverse proxy (such as Varnish).
You will get the benefit of Redis, as in everything kept in memory and accessed fast, but at the same time you still benefit from client-side caching.