Despite the explosion in AI adoption, many businesses still run on architectures in which enterprise data is split between operational systems (OLTP) and analytical systems (OLAP). This separation was dictated by legacy infrastructure that made it impractical to run day-to-day applications and analytical workloads on the same platform. That divide is now a source of friction, waste and delays across teams.
This split created a disconnect: developers concentrated on keeping applications running, while analysts were left working with data that was often outdated or incomplete. Modern cloud architecture has removed some of the technical barriers, but the divide persists, sustained by legacy software, vendor lock-in and long-standing working practices. It’s time to rethink this model and move towards a unified data stack that reflects the rise of AI agents and applications.
Tackling the legacy bottleneck
Once data lands in a transactional system, it becomes both tricky and expensive to move. Proprietary storage formats and tightly coupled architectures trap data inside operational systems and block integration with modern data and AI workflows. The result is that businesses end up working around infrastructure that no longer fits their needs.
Modern AI agents and applications require rapid, reliable access to live data. When operational data is stuck in legacy environments, it becomes much harder to enable automation, personalisation or real-time decision-making. This slows development, and it also limits responsiveness, scalability and the ability to extract timely insights from rapidly growing data volumes.
An increasing number of businesses are now, understandably, seeking alternatives that remove these constraints and offer a unified, responsive foundation for modern data-driven systems.
Bridging the divide between operations and analytics
The original OLTP/OLAP split was logical at a time when computing capabilities were limited: running analytics alongside operational workloads simply wasn’t viable. With cloud-native storage built on open table formats, businesses no longer need separate pipelines to make operational data available for analytics; an analytical engine can read the operational tables in place, as the sketch below illustrates. Yet many organisations still rely on outdated architectures where operational data must be extracted, transformed and loaded before it can be analysed, introducing delays, duplication and overhead.
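To make that concrete, here is a minimal sketch using the open-source deltalake Python package to read a table stored in Delta Lake, one of the common open table formats. The table path and column names are hypothetical; the point is that no extract-transform-load step sits between the operational write and the analytical read.

    # Read an operational table stored in an open format directly;
    # no extract-transform-load pipeline sits in between.
    from deltalake import DeltaTable

    # Hypothetical path to an orders table written by the operational system.
    orders = DeltaTable("s3://warehouse/sales/orders")

    # Analysts query the same files the application writes to.
    df = orders.to_pandas()
    print(df.groupby("region")["amount"].sum())

    # Open formats also version the table, so an earlier snapshot
    # can be read back for reproducible reporting.
    snapshot = DeltaTable("s3://warehouse/sales/orders", version=42)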
The negative impact is not to be overlooked. Analysts base decisions on outdated information. Developers spend time maintaining fragile pipelines instead of building new capabilities. Innovation slows and opportunity costs mount.
In response, more and more businesses are shifting to unified data architectures, where operational and analytical workloads share a single data foundation, served by engines optimised for each specific task. This reduces complexity, improves efficiency and enables faster iteration, all of which are critical benefits in the AI era. A rough sketch of the idea follows below.
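As one illustration of “one foundation, many engines”, the same hypothetical Delta table from the earlier sketch can be queried in place by an engine built for analytics, here DuckDB with its delta extension (reading from S3 would additionally require the httpfs extension and credentials):

    # Two engines, one copy of the data: the operational system writes
    # the table; an analytical engine queries it where it lives.
    import duckdb

    con = duckdb.connect()
    con.sql("INSTALL delta; LOAD delta;")

    # The analytical query runs directly over the operational table's files.
    result = con.sql("""
        SELECT region, SUM(amount) AS revenue
        FROM delta_scan('s3://warehouse/sales/orders')
        GROUP BY region
        ORDER BY revenue DESC
    """)
    print(result)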
Readying the data stack for intelligent agents
AI agents are driving a step-change in application development: they perform complex, multi-step tasks by reasoning over proprietary data and interacting with other components in real time. Able to coordinate decisions and actions across an entire data ecosystem, agents mark an evolution beyond basic automation into fundamental parts of organisational operations.
For this shift to succeed, infrastructure must evolve. AI agents require low-latency access to live data, seamless integration across systems and modern development workflows. A new concept known as a lakebase addresses these needs. It delivers the reliability of an operational database and the openness of a data lake in one centralised place, so teams can run transactions and analytics without juggling systems. It enables fast access to data, scales easily through separated storage and compute, and fits modern development habits such as instant branching and versioning. Built for today’s AI-driven workloads, a lakebase lets both developers and AI agents build, test and ship applications quickly, without the constraints of old OLTP setups; the sketch below shows what the transactional side of that workflow might look like.
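A minimal sketch, assuming a Postgres-compatible lakebase endpoint (the host, credentials and table are hypothetical, and any Postgres driver would look much the same): an agent records a transaction and immediately reads live state back for its next step, while the same table remains open to analytical engines through the lake.

    # Low-latency transactional access for an application or AI agent.
    # The endpoint, credentials and table are hypothetical illustrations.
    import psycopg2

    conn = psycopg2.connect(
        host="lakebase.example.com",  # hypothetical Postgres-compatible endpoint
        dbname="sales",
        user="agent",
        password="change-me",  # placeholder credential
    )

    with conn, conn.cursor() as cur:
        # The agent records an order as a normal OLTP write...
        cur.execute(
            "INSERT INTO orders (region, amount) VALUES (%s, %s)",
            ("EMEA", 129.99),
        )
        # ...and immediately reads live state back to plan its next step.
        cur.execute("SELECT COUNT(*) FROM orders WHERE region = %s", ("EMEA",))
        print(cur.fetchone()[0])

    conn.close()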
Shaping the next generation of data platforms
It’s increasingly evident that a unified data stack will underpin modern systems. As AI becomes integrated into every aspect of a business, infrastructure that removes silos and unites operational and analytical systems will be essential for enabling teams to innovate and grow without constraints.
Legacy OLTP systems, with their rigid and tightly coupled architectures, have fallen out of step with what modern, AI-driven businesses demand. Unified, open platforms that support transactional operations and real-time intelligence without compromise are crucial for AI-native applications.
This shift won’t occur overnight, but the organisations that start to reduce fragmentation, adopt open standards and build for agent-driven systems will be best positioned to succeed in the era of AI.