How the AI Boom is Forcing Enterprises to Reckon with Years of Neglected Data Foundations
David LaRue, Founder of IQ4Hire and former CIO and CDO, explains how AI exposes weak data and process foundations, making unglamorous cleanup the real driver of ROI.

Key Points
The enterprise push for AI exposes long-standing weaknesses in data quality, architecture, governance, and process alignment that now directly limit trust and ROI.
David LaRue, Founder of IQ4Hire, frames AI as a stress test that reveals whether organizations have done the foundational work needed to scale.
He outlines a disciplined path forward built on auditing base systems, cleaning and containing data, aligning tools with real workflows, and sustaining change over time.
“There’s a push to get really rapid results from AI systems, and there’s an ROI associated with that. What’s painful is the prep work that has to be done on the data side before you can actually get good ROI from AI.”

The enterprise rush to adopt AI is hitting a wall: the state of the existing foundation. The promised ROI of generative and agentic AI is exposing long-standing issues with architecture, data governance, and security. The new wave of AI raises the stakes on old problems, and the unglamorous work of process and data cleanup is fast becoming the price of admission for success.
David LaRue is a technology and digital strategy executive with deep experience leading large-scale transformation. As the Founder of the consultancy IQ4Hire, he brings an operator’s perspective shaped by years in CIO and CDO roles responsible for enterprise architecture, data, and growth. That background informs his view of AI not as a shortcut to value, but as a force that reveals whether an organization’s foundations are truly ready.
"There’s a push to get really rapid results from AI systems, and there’s an ROI associated with that. What’s painful is the prep work that has to be done on the data side before you can actually get good ROI from AI," LaRue says. That prep work is the foundation AI can’t skip. As teams like Marketing—which LaRue notes has the "number one spend on AI right now"—move quickly to adopt new tools, speed collides with siloed, inconsistent data. AI only intensifies the problem by combining flawed sources at scale, making transparency non-negotiable. If leaders can’t see how data moves from source to outcome, trust in the results breaks down.
Compliance minefield: "There’s a lot of groundwork that has to be done to coalesce and normalize data," he says. "Without it, a simple query can surface information someone should never see, exposing company secrets to learning models." That same risk extends into regulated environments, where AI use quickly becomes a compliance issue. "When a call center scores agents with AI, you have to ask how much of that data is being used to train the models and whether it’s compliant. These systems train on your data unless you buy specific licensing to contain it, and that’s where things get very complex."
On the same page: Ideally, this technical work leads to a single source of truth the business can rely on. It’s a familiar challenge for executives who have asked why the numbers from different departments don’t line up. Without that alignment, even the most advanced AI can be rendered ineffective. "You have to ask, are the systems putting out the same data?" LaRue notes. "Does the data from web analytics line up with the ERP system and with finance? Does everybody agree on the definitions? Because nobody is going to believe the outcome unless you believe the data."
Process vs. practice: The gap between systems and outcomes often shows up in human process rather than technology. "Business leaders document what they think the process is, but when you watch the work being done, it’s never exactly the same," LaRue says. "The first step is validating what’s actually happening."
LaRue frames transformation as a discipline, not a technology choice. Progress starts with an unvarnished assessment of existing data, systems, and architecture, rather than jumping to new tools. From there, the work has to be broken into contained, manageable efforts that reduce risk while steadily rebuilding the foundation AI depends on.
Back to basics: "Before applying new technologies, you have to go back and look at your base data and your base systems," LaRue insists, calling architectural review a critical first step. From there, progress depends on breaking technical debt into contained moves. "You have to decompartmentalize it and say, 'I can replace this with a new tool or move it from something I haven’t touched in years to something newer,'" he says, describing change as a series of deliberate upgrades rather than a single overhaul.
In perpetuity: But even with a solid plan, many organizations stumble at the point where value should compound. "You can get a company to buy a program, but can you implement it and keep it running?" asks LaRue. "If you don’t keep validating it with new people, retraining them, and reinforcing the processes and systems, you won’t get the value. Sustainment is what determines whether the work actually pays off over time."
While advocating for this foundational work, LaRue remains a pragmatist, acknowledging the political reality of needing quick wins. "I think you should pick off some of these new opportunities, the low-hanging fruit, absolutely," he concludes. "But these technical debt pieces are typically there because they’re hard to address, and they’re costly and time-consuming and involve change management."