Redis Passes $300 Million in Annualized Recurring Revenue
Redis, the world’s fastest real-time data platform, announced that it has passed $300 million in annual recurring revenue (ARR). The milestone comes as the company sees rising demand for infrastructure that powers AI and agentic systems.
“We’re entering a new phase in the evolution of AI infrastructure where context is becoming the locus of innovation,” says Redis CEO Rowan Trollope.
“We’re starting to see the emergence of ‘systems of decision,’ real-time data infrastructure that sits at the front of the stack where agentic decisions are being made, providing the necessary context and operational data to drive AI applications. This is Redis’ traditional place in the stack, so developers have realized that we’re a natural fit for those types of workloads.”
Redis was named the top choice (42 percent) for agent memory storage by developers in the 2025 Stack Overflow developer survey, and in December saw nearly one million downloads of Redis Vector Library (RedisVL), the company’s free library that provides the building blocks for AI use cases. RedisVL downloads tripled from September to December and are up 10X since December 2024.
Redis’ AI solutions serve as a real-time context engine for AI systems – a single platform that searches, gathers, and serves AI data, providing the memory, caching, and coordination agents need to perform personalized tasks quickly, accurately, and at scale. With tools such as vector search and storage, the Agent Memory Server, and LangCache, a managed semantic caching service, Redis delivers fast search and performance at scale while accurately supplying context and reducing LLM latency.
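The context-retrieval pattern described above can be sketched in a few lines of plain Python. This is an illustrative toy, not the RedisVL or Agent Memory Server API; the `AgentMemory` class and the hard-coded three-dimensional embeddings are hypothetical stand-ins for a real vector store and a real embedding model:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class AgentMemory:
    """Toy in-process stand-in for a vector store holding agent context."""
    def __init__(self):
        self._items = []  # list of (embedding, payload) pairs

    def add(self, embedding, payload):
        self._items.append((embedding, payload))

    def search(self, query_embedding, k=3):
        """Return the k payloads whose embeddings are most similar to the query."""
        scored = sorted(
            self._items,
            key=lambda item: cosine_similarity(item[0], query_embedding),
            reverse=True,
        )
        return [payload for _, payload in scored[:k]]

memory = AgentMemory()
memory.add([1.0, 0.0, 0.0], "user prefers concise answers")
memory.add([0.0, 1.0, 0.0], "user timezone is UTC+5:30")
memory.add([0.9, 0.1, 0.0], "user asked about caching last session")

# An agent retrieves the most relevant context before acting.
print(memory.search([1.0, 0.05, 0.0], k=2))
```

A production system would replace the linear scan with an approximate-nearest-neighbor index, which is what makes this kind of lookup fast at scale.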
Redis also reduces expensive, energy-intensive calls to LLMs by caching common responses and by enabling retrieval-augmented generation (RAG), which reduces the computational load on models.
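The idea behind caching common responses can be shown with a minimal semantic cache: if a new prompt’s embedding is close enough to one already answered, the stored response is reused and the model call is skipped. This is a sketch of the general technique, not the LangCache API; `fake_llm`, the embeddings, and the 0.95 threshold are all hypothetical:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def fake_llm(prompt):
    """Stand-in for an expensive LLM call (hypothetical)."""
    return f"answer to: {prompt}"

class SemanticCache:
    """Reuse a stored response when a new prompt's embedding is
    within a similarity threshold of a previously cached prompt."""
    def __init__(self, threshold=0.95):
        self.threshold = threshold
        self._entries = []  # list of (prompt_embedding, response) pairs

    def lookup(self, embedding):
        for cached_embedding, response in self._entries:
            if cosine_similarity(cached_embedding, embedding) >= self.threshold:
                return response  # cache hit: skip the model call entirely
        return None

    def store(self, embedding, response):
        self._entries.append((embedding, response))

cache = SemanticCache()
llm_calls = 0

def answer(prompt, embedding):
    global llm_calls
    cached = cache.lookup(embedding)
    if cached is not None:
        return cached
    llm_calls += 1
    response = fake_llm(prompt)
    cache.store(embedding, response)
    return response

# Two near-identical questions share a single LLM call.
answer("What is Redis?", [1.0, 0.0])
answer("what's redis?", [0.99, 0.01])
print(llm_calls)  # 1
```

The savings compound with traffic: every semantic hit is one fewer round trip to the model, which is what the article means by reducing expensive, energy-intensive LLM calls.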
"The defining challenge we face as AI matures will be finding a way to scale the compute resources it requires in a sustainable way", says Dave Easton, a partner at Generation Investment Management.
"Advancements that help reduce unnecessary computation and improve how data is managed and accessed will be essential as AI adoption accelerates, and technologies like Redis are a piece of this complex puzzle".
Easton continues: “This financial milestone, powered by Redis’ growth in AI use cases over the past couple of years, is an encouraging sign that sustainability and scale can go hand in hand.”
Trollope says that he expects Redis’ growth to continue to accelerate as agentic AI becomes more commonplace inside businesses.
“We have an incredibly diverse customer base,” Trollope says. “We’re a core technology for companies building some of the foundational pieces of AI, like models and coding agents, while also being a bedrock piece of infrastructure for enterprises building complex AI systems.”
“This is the most exciting moment in technology of my lifetime, and it’s incredible to have Redis at the center of it,” adds Trollope.



