Introduction to Redis Cache
Every developer feels ecstatic when the number of visitors to their website surges. But more visitors also means more data to store, and the same developer soon starts pondering how to handle this ever-growing pile of data. One way to solve this is to expand your database storage, but that does not address problems such as rising costs and multiplying database requests that lead to longer response times.
The other solution is caching, which can help your cause greatly. Caching is nothing new to most of us, as we have all handled or seen cached data in our browsers. But for the uninitiated, caching is a process that helps pages load faster and helps application owners avoid the huge cost of expanding the database.
One of the most popular caching servers is Redis. Redis offers great advantages such as high speed, support for multiple programming languages, and scalability, to name a few. All these features make it an ideal choice as a caching server.
In this post, we are going to understand more about caching, how it is beneficial, and why Redis is a popular choice for caching.
Table of Contents
- What is Caching?
- How does caching improve the user journey?
- What is Redis?
- Why use Redis for caching?
- How to install Redis on Ubuntu
- Wrapping Up
What is Caching?
In simple words, caching is the process of storing data in a temporary storage component from which it can be retrieved easily, so the data is served faster.
Caching works on the simple principle of efficiently reusing previously fetched or computed data that has been stored in a layer on top of the actual storage. The data in the cache is typically kept in RAM, which is what makes retrieval so fast.
In the illustration here, the first scenario is a system without a cache. In this, whenever a user places a request through the application, the server queries the database each time a request is received and responds with the data received from the database.
In the second scenario, the system has a dedicated cache. Whenever a user places a request through the application, the cache server is queried first, and if the information is available, it is returned to the application directly. If the information is not in the cache server, the database is queried. Along with the response, the information is now stored in the cache server to serve the next request.
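The cache-first flow described above is often called the cache-aside pattern. Here is a minimal Python sketch of it; the plain dict and the `fetch_from_db` function are hypothetical stand-ins for a real cache server and database:

```python
# Conceptual cache-aside sketch: a plain dict stands in for the cache
# server, and fetch_from_db is a made-up (slow) database call.
db_calls = 0

def fetch_from_db(user_id):
    """Pretend database lookup -- the expensive path."""
    global db_calls
    db_calls += 1
    return {"id": user_id, "name": f"user-{user_id}"}

cache = {}  # stands in for Redis

def get_user(user_id):
    if user_id in cache:              # cache hit: no database round trip
        return cache[user_id]
    profile = fetch_from_db(user_id)  # cache miss: query the database...
    cache[user_id] = profile          # ...and store the result for next time
    return profile

get_user(42)     # miss -> hits the database
get_user(42)     # hit  -> served from the cache
print(db_calls)  # only one database call for two requests
```

The second request never touches the database, which is exactly the saving the illustration describes.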
How does caching improve the user journey?
Let us understand this with an example. Suppose you visit your favorite shopping portal and complete an order. Now to do this, let us assume the application had to fetch your profile details ten times for various steps (fetching shipping address, fetching previous cart items, stored payment details, etc.). Each time the server hits the database and responds to the application, it takes 2 seconds. So the total time you wait for the application to load is 2 * 10 = 20 seconds.
Now assume this application has a cache server. It still takes the full 2 seconds to fetch the information from the database the first time, but after that, the information is stored in the cache. Retrieving details from the cache is blazingly fast; let us assume it takes half the time of a database request (1 second). So your total wait during your shopping journey is 2 + 1 * 9 = 11 seconds. This is almost half the time you spent waiting in the first scenario, which means your experience in the second scenario is much better due to the lower wait time. Just as importantly, it helps your application improve its search ranking, as page load time plays an important role in determining it.
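As a quick sanity check, the arithmetic can be reproduced directly, counting the first fetch as a database miss and the remaining nine as cache hits:

```python
# Times in seconds, matching the shopping-journey example above.
db_time, cache_time, fetches = 2, 1, 10

no_cache_total = db_time * fetches                        # every fetch hits the database
with_cache_total = db_time + cache_time * (fetches - 1)   # first fetch misses, the rest hit

print(no_cache_total)    # 20
print(with_cache_total)  # 11
```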
Now that we have seen caching and how it is beneficial, in the next section we are going to dive into a popular cache server called Redis.
What is Redis?
As you know by now, a cache is a dedicated service that keeps frequently requested content in temporary storage, which is the cache itself, so it can be served without going back to the underlying data store on every request. One of the databases that is used very effectively as a cache server is Redis.
Redis is an open-source NoSQL database that works on the principle of a key-value store: every piece of data is stored against a unique key, and that key is used to look the data up again. Redis stands for Remote Dictionary Server and is popularly used as a database, cache, message broker, and queue. Redis supports various data structures such as strings, hashes, lists, and bitmaps, to name a few.
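As an illustration, a short redis-cli session against a local server might look like the following; the key names and values here are made-up examples:

```
127.0.0.1:6379> SET user:1000 "Alice"
OK
127.0.0.1:6379> GET user:1000
"Alice"
127.0.0.1:6379> HSET user:1000:profile name "Alice" city "Oslo"
(integer) 2
127.0.0.1:6379> LPUSH recent:orders 1001 1002
(integer) 2
127.0.0.1:6379> INCR pageviews
(integer) 1
```

`SET`/`GET` work with plain strings, `HSET` with hashes, `LPUSH` with lists, and `INCR` atomically increments a numeric counter.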
Redis typically works in a primary/replica configuration and also supports sharding, which makes it quite easy to distribute a dataset across multiple Redis instances. Another important feature of Redis is that its operations are atomic. This means you can safely set or increment a key, add or remove elements from a set, or increase a counter, even when many clients are doing so concurrently.
Why use Redis for caching?
Simple to use
Redis is quite easy to use as there are fewer lines of code required to manage data in your applications.
Support for multiple languages
Redis has client libraries for most programming languages, including Java, Python, JavaScript (Node.js), R, and Go, to name a few.
High availability
Redis has a primary-replica architecture combined with asynchronous replication. This means data can be replicated to multiple replica servers, which directly improves read performance. It also facilitates faster recovery in case of an outage on the primary server.
Persistence
Although Redis keeps the entire dataset in memory, it can persist data to disk. It supports point-in-time snapshots, with flexible policies based on elapsed time and the number of updates since the last save, as well as an append-only file (AOF) persistence mode that logs every write.
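These policies are configured in redis.conf; the directives below are standard, though the exact thresholds shown are just example values:

```
# Snapshot (RDB) policy: save if at least 1 key changed in 900 s,
# or 10 keys in 300 s, or 10000 keys in 60 s
save 900 1
save 300 10
save 60 10000

# Append-only file (AOF) persistence, fsynced once per second
appendonly yes
appendfsync everysec
```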
High performance
Redis delivers blazing-fast performance because it stores the entire dataset in primary memory. It also supports pipelining of commands and multi-key commands that operate on several values at once, which reduces round trips between the client and the server.
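The benefit of pipelining is fewer network round trips, not faster individual commands. The following self-contained Python sketch models that idea; the dict-backed store and the `execute` function are illustrative stand-ins, not the redis-py API:

```python
# Conceptual model of pipelining: commands are queued and flushed in a
# single batch instead of paying one network round trip per command.
store = {}
round_trips = 0

def execute(commands):
    """Send a batch of (op, key, value) commands in one round trip."""
    global round_trips
    round_trips += 1            # one exchange for the whole batch
    replies = []
    for op, key, value in commands:
        if op == "SET":
            store[key] = value
            replies.append("OK")
        elif op == "GET":
            replies.append(store.get(key))
    return replies

# Without pipelining: three commands, three round trips.
for cmd in [("SET", "a", 1), ("SET", "b", 2), ("GET", "a", None)]:
    execute([cmd])

# With pipelining: the same three commands, one round trip.
execute([("SET", "a", 1), ("SET", "b", 2), ("GET", "a", None)])

print(round_trips)  # 4 in total: 3 unpipelined + 1 pipelined
```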
Scalability
One of the important requirements for any cache system is scalability, and Redis facilitates that effortlessly. Redis can be scaled horizontally to manage any surge in the demand for RAM. This is achieved through clustering, where the cluster is made up of multiple Redis servers. Each of these servers (also called shards) is responsible for a subset of the cache's keyspace, so you can scale simply by adding more shards.
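The key-to-shard mapping can be sketched as hashing each key and taking it modulo the shard count. Real Redis Cluster uses CRC16 of the key modulo 16384 hash slots; the `zlib.crc32` function and shard names below are stand-ins for illustration:

```python
import zlib

# Hypothetical shard names; a real cluster would hold host:port pairs.
SHARDS = ["shard-0", "shard-1", "shard-2"]

def shard_for(key: str) -> str:
    """Deterministically map a key to one shard of the keyspace."""
    slot = zlib.crc32(key.encode()) % len(SHARDS)
    return SHARDS[slot]

for key in ["user:1000", "user:1001", "cart:42"]:
    print(key, "->", shard_for(key))
```

Because the mapping is deterministic, every client routes a given key to the same shard, and adding shards grows the total keyspace capacity.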
Local Cache Replicas
It is possible to deploy Redis on the same server that runs the application's processes, so it can act as a local private cache. Once this is done, the replication feature of Redis can be used to maintain local replicas of a central shared cache.
How to install Redis on Ubuntu?
Before starting with the installation of Redis on Ubuntu, ensure that you are logged in with sudo privileges.
The first step is to update the apt packages list by running the below command in your SSH terminal:
sudo apt update
Once this is done, install Redis with the below command:
sudo apt install redis-server
Once the installation is complete, Redis will start automatically. You may check the status of the service with the following command:
sudo systemctl status redis-server
If everything has been set up correctly, the output will show the service as active (running). This indicates your Redis server is ready to be used.
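You can also verify that the server answers queries with redis-cli; a healthy server replies to a ping like this:

```
$ redis-cli ping
PONG
```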
Wrapping Up
Caching has always been one of the most popular mechanisms for serving webpage content to visitors quickly, and Redis has become the go-to database for many organizations' caching needs. Twitter, GitHub, Weibo, and Snapchat are some of the well-known organizations that have incorporated Redis into their ecosystems. One of the reasons for its popularity and wide adoption is that it is open source, which means it can be used by anyone across the globe. In this post, apart from understanding in detail how caching helps us and why Redis is a good fit for caching, we also saw how to set up Redis on your system. In an upcoming post, we will help you implement Redis caching in ASP.NET.