When generating a new Rails app, caching is already set up. This makes getting started as quick as wrapping some view fragments using the cache helper or using Rails.cache.fetch to store the results of an external API call. The cached data is kept in one of Rails' cache stores, which can hold it in memory, in Memcached or Redis, or even write it straight to disk.
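To give a quick idea of what that looks like, here's a minimal Rails.cache.fetch sketch; the WeatherClient and the cache key are hypothetical stand-ins for whatever slow call you want to cache:

```ruby
# Return the cached forecast if present; otherwise run the block,
# store its result for 10 minutes, and return it.
def forecast_for(city)
  Rails.cache.fetch("weather/#{city}", expires_in: 10.minutes) do
    WeatherClient.forecast(city) # hypothetical external API call
  end
end
```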
But which of the different cache stores is best for which situation? In this overview, we'll take a look at each of the options.
👋 If you like this article, there's a lot more we've written about Ruby (on Rails) performance; check out our Ruby performance monitoring checklist.
File store
The :file_store is used by default when no cache store is specifically configured. As the name implies, it writes cache entries to the file system. These end up in the tmp/cache directory in the application's root, unless you select a different location by explicitly setting the cache_store configuration to :file_store and passing a path to the directory you'd like to use.
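As a sketch, configuring the file store with an explicit directory could look like this; the path is just an example:

```ruby
# config/environments/production.rb
Rails.application.configure do
  # Write cache entries to an explicit directory (path is illustrative);
  # without the path argument, tmp/cache is used.
  config.cache_store = :file_store, Rails.root.join("tmp", "file_cache").to_s
end
```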
The cache files are saved to disk and aren't purged automatically, which means you'll need to remove them yourself to make sure they don't fill up your disk. For example, running Rails.cache.cleanup periodically will keep your cache free of expired entries.
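One way to do that is a small rake task that you schedule with cron or a recurring job; the task name here is just an example:

```ruby
# lib/tasks/cache.rake (task name is illustrative)
namespace :cache do
  desc "Remove expired entries from the Rails cache"
  task cleanup: :environment do
    Rails.cache.cleanup
  end
end
```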
The file store is useful in development because each entry is stored as a separate file in the cache directory, allowing you to drop an item from the cache by removing its file. This comes in handy for testing cache invalidation, for example.
In production, the file store is useful when all server processes run on the same file system. Although it's slower than memory-based cache stores, the file store can be great if the cache can potentially grow very large.
Memory store
Since version 5.0, Rails automatically sets up the :memory_store in the development configuration when generating a new application. When using the memory store, cached data is kept in memory in the Ruby web server's process.
Because the data is kept in memory in the same process as your web server, the memory store is great to use in development as it's automatically cleared whenever you restart your development web server.
By default, the in-memory store will use 32 megabytes, but you can override that by passing the :size option when configuring the cache store.
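For example, a sketch of raising the limit in the development configuration:

```ruby
# config/environments/development.rb
Rails.application.configure do
  # Keep the cache in the web server process, capped at 64 megabytes.
  config.cache_store = :memory_store, { size: 64.megabytes }
end
```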
When the cache is full, the least recently used entries are automatically dropped from the cache to make room for new entries.
Using the memory store in production is possible (and the fastest solution you can find), but isn't recommended for systems running multiple server processes. Since processes can't access each other's caches, each process would need to maintain its own copy of the cache.
Memcache store
The :mem_cache_store uses the Dalli gem and Memcached to store entries in a centralized, in-memory cache.
The data is kept in a separate process instead of the Ruby server process. Because of that, the cache isn't dropped when your app restarts but stays in memory as long as the Memcached server continues running. When that's restarted, you'll start with a fresh cache.
The Memcache store will assume the cache server is running on localhost by default, but you can pass one or multiple addresses to use remote servers.
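A sketch of pointing the store at remote Memcached servers; the hostnames are made up, and leaving the addresses out makes Rails assume a server on localhost:

```ruby
# config/environments/production.rb
Rails.application.configure do
  # Hostnames are illustrative; pass no addresses to use localhost.
  config.cache_store = :mem_cache_store, "cache-1.example.com", "cache-2.example.com"
end
```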
Memcached uses a maximum cache size of 64 MB by default, but that can be changed using command line options or in the memcached.conf file. Like the memory store, it will start removing the least recently used items when the cache reaches its maximum size.
The :mem_cache_store is the go-to cache store for production environments. By using a central Memcached server, the cache can be shared between multiple web server processes and even across multiple hosts when using a remote Memcached server.
Redis cache store
Rails 5.2 introduced the :redis_cache_store, which allows you to store cache entries in Redis, much like you would with the Memcache store.
To use Redis as a Rails cache store, use a dedicated Redis cache that's set up as an LRU (Least Recently Used) cache instead of pointing the store at your existing Redis server, to make sure entries are dropped from the store when it reaches its maximum size.
The Redis store works with the Redis gem (including Redis::Distributed) and hiredis, and it supports a number of configuration options, such as setting one or more remote servers.
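As a minimal sketch, assuming the Redis URL comes from an environment variable (the variable name and URL are just examples), and with the LRU eviction policy configured on the Redis server itself:

```ruby
# config/environments/production.rb
Rails.application.configure do
  # URL and ENV variable name are illustrative. The Redis server should be
  # configured with a maxmemory limit and an allkeys-lru eviction policy.
  config.cache_store = :redis_cache_store, { url: ENV.fetch("REDIS_URL", "redis://localhost:6379/0") }
end
```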
Redis will periodically write the dataset to disk, so most of your cached data will survive when the cache server restarts. In development, cached items can be removed from the console using Rails.cache.delete.
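For example, in a rails console session (the key is hypothetical, matching whatever key was used when writing the entry):

```ruby
Rails.cache.delete("weather/amsterdam") # removes a single entry by its key
```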
In production, Redis rivals Memcached in providing a centralized cache store. While its usage in Rails apps isn't as widespread as Memcached's yet, the Redis store will most likely become a popular Rails cache store in the future.
Which cache store to use?
In general, Rails' file and memory stores are great for development and can be used in production for smaller applications, as long as you know and understand their caveats. The production-grade Memcached and Redis stores are usually better choices for bigger production apps, especially when running multiple web servers on multiple hosts.
This concludes our overview of cache stores in Rails. Be sure to check out the Cache stores section in the Rails guides for more information and configuration options for each store.
How did you like this article and the previous articles in the AppSignal Academy series? We have some more articles about caching in Rails lined up, but please don't hesitate to let us know what you'd like us to write about (caching-related or otherwise) next!