Redis announces major AI advancements and intention to acquire Decodable at Redis Released 2025

Redis outlines an expansive AI strategy featuring significant acquisitions and innovative services, propelling its growth as a key player in AI infrastructure.

  • Friday, 5th September 2025, by Aaron Sandhu

Redis, the world's fastest data platform, recently unveiled an impactful expansion to its AI strategy during the Redis Released 2025 event. The keynote address by CEO Rowan Trollope highlighted several key initiatives, including the acquisition of Decodable, the introduction of LangCache, and numerous advancements to bolster Redis' position as a critical infrastructure layer for AI applications.

"As AI enters its next phase, the challenge isn't proving what language models can do; it's giving them the context and memory to act with relevance and reliability," Trollope noted. He emphasised how Redis' strategic acquisition of Decodable will streamline data pipeline developments, enabling data conversion into actionable context swiftly and efficiently within Redis.

Decodable, established by Eric Sammer, offers a serverless platform that simplifies the ingestion, transformation, and delivery of real-time data. By joining forces with Redis, Decodable aims to enhance AI capabilities and seamlessly connect developers with real-time data sources.

Redis also premiered LangCache, a fully-managed semantic caching service that cuts latency and token usage by up to 70% in LLM-reliant applications. The caching solution optimises performance and reduces costs significantly, supporting Redis’ mission to bolster AI agent efficiency.

The key advantages of LangCache include:

  • Up to 70% reduction in LLM API costs in high-traffic scenarios
  • 15x faster response times for cache hits compared to live LLM inference
  • Enhanced user experiences with lower latency and consistent outputs
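The idea behind a semantic cache like LangCache is that a new prompt whose embedding is sufficiently similar to a previously answered prompt can reuse the stored response instead of triggering a fresh LLM call. The sketch below illustrates that mechanism in plain Python; the `SemanticCache` class, the similarity threshold of 0.9, and the linear scan are illustrative assumptions, not LangCache's actual API or implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class SemanticCache:
    """Toy semantic cache: return a stored LLM response when a new
    prompt's embedding is close enough to a cached prompt's embedding."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = []  # list of (embedding, response) pairs

    def get(self, embedding):
        """Return the best cached response above the similarity
        threshold, or None on a cache miss."""
        best, best_sim = None, 0.0
        for emb, response in self.entries:
            sim = cosine_similarity(embedding, emb)
            if sim >= self.threshold and sim > best_sim:
                best, best_sim = response, sim
        return best

    def put(self, embedding, response):
        self.entries.append((embedding, response))
```

A cache hit skips the LLM entirely, which is where the latency and token savings come from: paraphrased prompts ("what's the capital of France?" vs. "France's capital city?") embed close together and resolve against the same stored answer.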

Redis continuously adapts to the swift advancements in AI. Recent integrations make it easier for developers to leverage existing AI frameworks and tools. New integrations with AutoGen and Cognee, along with LangGraph enhancements, provide scalable memory solutions for agents and chatbots.

Developers can now:

  • Utilise AutoGen for a fast-data memory layer
  • Leverage Cognee to manage memory through summarisation and reasoning
  • Implement LangGraph enhancements for reliability
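The "memory layer" these integrations provide boils down to keeping a bounded, per-session history of recent exchanges that gets fed back to the agent as context. The following framework-agnostic sketch shows that pattern with only the standard library; in practice the buffer would live in Redis (typically a capped list per session) rather than in a Python dict, and the class and parameter names here are illustrative assumptions.

```python
from collections import deque

class ConversationMemory:
    """Rolling memory buffer for a chat agent: keeps only the last
    `max_turns` messages per session, analogous to how a capped
    Redis list is commonly used as an agent memory layer."""

    def __init__(self, max_turns=5):
        self.max_turns = max_turns
        self.sessions = {}  # session_id -> deque of (role, text)

    def append(self, session_id, role, text):
        """Record one message; oldest messages fall off the end."""
        buf = self.sessions.setdefault(
            session_id, deque(maxlen=self.max_turns)
        )
        buf.append((role, text))

    def history(self, session_id):
        """Return the retained messages, oldest first."""
        return list(self.sessions.get(session_id, []))
```

Storing this buffer in Redis instead of process memory is what makes it scalable: any agent replica can read the same session history, and eviction policies keep memory bounded.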

Additional Redis for AI Enhancements

Redis' evolution continues with key improvements to hybrid search and data compression for AI applications. The upgrades include:

  • Hybrid search improvements using Reciprocal Rank Fusion, integrating text and vector rankings
  • Support for int8 quantised embeddings, yielding 75% memory savings and 30% faster search speeds
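Both upgrades above are simple to sketch. Reciprocal Rank Fusion merges ranked lists (e.g. a text ranking and a vector ranking) by giving each document a score of 1/(k + rank) in every list it appears in; int8 quantisation stores each embedding dimension in 1 byte instead of float32's 4 bytes, which is the 75% memory saving. The constant k=60 is the value commonly used in the RRF literature, and the symmetric quantisation scheme below is an illustrative assumption, not Redis' exact implementation.

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several ranked lists of document IDs into one ranking.
    Each document contributes 1 / (k + rank) per list it appears in;
    documents are returned sorted by total fused score, best first."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

def quantize_int8(vec):
    """Symmetric int8 quantisation of one embedding vector:
    float32 (4 bytes/dim) -> int8 (1 byte/dim). Returns the
    quantised values and the scale needed to approximately
    reconstruct the original floats."""
    scale = (max(abs(x) for x in vec) / 127) or 1.0
    return [round(x / scale) for x in vec], scale
```

A document ranked first by the text scorer and second by the vector scorer ends up ahead of one that only one scorer found, which is the behaviour hybrid search wants: agreement between signals is rewarded without requiring score calibration between BM25 and cosine similarity.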

These latest updates ensure that Redis remains a pivotal platform for developing high-quality, reliable AI agents and frameworks.
