Why Closed Data Ecosystems Can't Keep Up with AI Agents


The Rise of AI Agents and the Data Challenge

Artificial intelligence agents are transforming how organizations interact with their data. Unlike human analysts who run a few queries per day, these agents can execute ten to a hundred times more queries in the same period. This dramatic increase in query volume exposes a critical weakness in traditional data architectures: closed data ecosystems. When every query must travel through the same expensive compute path, costs spiral and performance suffers.


Anjan Kundavaram, Chief Product Officer at Fivetran, recently discussed this challenge on The New Stack podcast. He warns that relying on a single, powerful compute engine for all queries is inefficient. As he puts it, “It’s kind of like using a Lamborghini to mow the lawn all the time.” The analogy highlights the waste of treating every request with the same high-powered resources, even when a lighter, cheaper option would suffice.

Why Closed Data Stacks Are a Costly Bottleneck

Closed data stacks—where data is locked into a single vendor's ecosystem—were designed for human-scale analytics. They work well when a few analysts run a handful of complex queries. But agents change the game. They don't need immediate answers; they can spend extra time searching for cost-effective routes. In a multi-engine stack, an agent can route a complex analytical question to an expensive engine and a simple lookup to a cheaper one. In a closed stack, every question goes through the same expensive door, driving up costs without delivering proportional value.

The shift in economics that Kundavaram describes is counterintuitive. He notes, “An agent could go spend more time if the agent thinks you’re going to save 10x the cost.” This flexibility is lost in closed systems, where agents have no choice but to use the one available compute resource, no matter the task.
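To make the idea concrete, here is a minimal sketch of cost-aware routing in a multi-engine stack. The engine names, prices, and capability flags are illustrative assumptions, not details from Fivetran or the podcast; the point is only that an agent can pick the cheapest engine capable of handling a given query.

```python
# Hypothetical sketch: an agent routes each query to the cheapest capable engine.
# Engine names and per-query costs are made-up illustrative values.
from dataclasses import dataclass

@dataclass
class Engine:
    name: str
    cost_per_query: float   # dollars per query, illustrative
    handles_complex: bool   # can run heavy analytical workloads

ENGINES = [
    Engine("warehouse", cost_per_query=0.50, handles_complex=True),
    Engine("lightweight_sql", cost_per_query=0.05, handles_complex=False),
]

def route(query_is_complex: bool) -> Engine:
    """Return the cheapest engine that can answer the query."""
    capable = [e for e in ENGINES if e.handles_complex or not query_is_complex]
    return min(capable, key=lambda e: e.cost_per_query)

# A simple lookup goes to the cheap engine; a heavy join goes to the warehouse.
print(route(query_is_complex=False).name)  # lightweight_sql
print(route(query_is_complex=True).name)   # warehouse
```

In a closed stack there is only one entry in `ENGINES`, so this choice disappears: every query pays the warehouse price.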

The Triple Whammy of Unconsolidated Data

Even if you have an open stack, problems arise when the data and context the AI needs are scattered across many systems. Kundavaram warns of a “triple whammy” that hits organizations with fragmented data landscapes. First, AI agents produce poor answers because they lack consolidated context. Second, agents run far more queries than humans, multiplying costs. Third, those queries are fed with weak context, wasting resources on low-quality outputs.

“It’s going to be like a triple whammy,” Kundavaram says. The result is a data environment where costs explode and AI outcomes disappoint.
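A back-of-envelope calculation shows how these three factors compound. All numbers below are assumptions chosen for illustration (the article itself only gives the 10–100x query-volume range), but they show why cost per useful answer, not cost per query, is the figure that explodes.

```python
# Illustrative "triple whammy" arithmetic -- every number is an assumption.
human_queries_per_day = 10     # a human analyst's workload
agent_multiplier = 50          # agents run 10-100x more queries; assume 50x
cost_per_query = 0.50          # single expensive engine, illustrative price
useful_fraction = 0.4          # weak context => many low-quality answers

agent_queries = human_queries_per_day * agent_multiplier
daily_cost = agent_queries * cost_per_query
cost_per_useful_answer = daily_cost / (agent_queries * useful_fraction)

print(f"daily cost: ${daily_cost:.2f}")                    # $250.00
print(f"per useful answer: ${cost_per_useful_answer:.2f}") # $1.25
```

Consolidating context raises `useful_fraction`, and routing simple queries to cheaper engines lowers the effective `cost_per_query`; either lever cuts the cost per useful answer without restricting agent access.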


Rethinking Control: Innovate, Don't Restrict

When faced with rising query costs, many data leaders instinctively clamp down on access. Kundavaram argues this is exactly the wrong approach. He shares an example from a large company where analytics budgets skyrocketed. The internal response was to impose controls, but Kundavaram advised the opposite: “Let’s innovate.”

He believes that the productivity unlock from agentic analytics only materializes if organizations refuse the lockdown instinct. Instead, they should invest in open infrastructure and semantic discipline. By strategically allocating queries to appropriate compute engines and ensuring data is well-organized, companies can harness agents' power without breaking the bank.

The Open Data Infrastructure Approach

Fivetran is actively promoting what it calls Open Data Infrastructure. At Google Cloud Next, the company introduced an Open Data Infrastructure Data Access Benchmark. This benchmark is designed to expose hidden costs and make it harder for vendors to quietly tax customers’ AI workloads. The message is clear: closed data stacks won’t survive the agent era.

As reported by TNS earlier this year, most enterprise data systems were never built with agent swarms in mind. Fivetran argues that openness—allowing multiple compute engines, consolidating data, and enforcing semantic consistency—is essential for cost-effective AI. The company has been working on interoperability for data lakes on Google Cloud, demonstrating that the technology to support this shift already exists.

In conclusion, the agent era demands a fundamental rethinking of data architecture. Closed ecosystems create bottlenecks that throttle performance and inflate costs. By embracing open infrastructure and resisting the urge to restrict access, organizations can unlock the full potential of AI agents while keeping budgets under control.
