Picture this: Your company just closed a six-figure deal for an AI platform. The vendor demo was flawless, the board signed off, and IT spent three months getting everything configured. Six months later, a handful of power users love it, most of your team has gone back to their old workflows, and your CFO is asking why productivity numbers haven't moved.
This isn't a hypothetical. It's happening across enterprises right now. A new study has finally put a price tag on it.
A Whatfix-commissioned study found that enterprises stand to lose $10.9 million annually due to poor digital adoption — not because the AI tools are broken or the wrong choice, but because employees aren't actually using them effectively. $10.9 million, gone every year, from a problem that isn't on most technology roadmaps.
The Real Problem Isn't the AI
There's a persistent myth in enterprise technology: buy a good-enough tool, and people will use it. The past decade of failed CRM rollouts, abandoned ERP systems, and now underperforming AI deployments has disproved this pretty conclusively.
WalkMe's research found that enterprises wasted over $104 million on underused technology in 2024. The research covered more than 46,000 enterprise applications, and the perception gap it found was stark: executives estimated employees regularly used around 37 apps, while the actual number was closer to 625, most of them cobbled together in unsanctioned workarounds because the official tools weren't cutting it.
Meanwhile, 79% of executives say they're confident about meeting their AI goals. Only 28% of their employees feel trained to use the tools. A 51-point gap between what leadership believes and what the workforce can actually do is an adoption problem, and no amount of confident executive surveys will close it.
70% of enterprise AI failures trace back to people and process issues — inadequate change management, missing training, resistance from employees, no clear goals — versus just 20% for infrastructure and 10% for the actual algorithms. The technology holds up fine. What collapses is the organization trying to absorb it.
What "Poor Digital Adoption" Looks Like Day-to-Day
The most expensive form is shelfware: the AI tool that passed the procurement checklist but lives permanently minimized on employees' desktops. In 2025, 42% of companies abandoned most of their AI initiatives, up sharply from 17% the year before. Most of those initiatives didn't die because the software failed; the rollout strategy created friction that employees simply routed around.
Shadow workarounds are sneakier. When official tools don't fit real workflows, employees don't stop working. They find unofficial routes: exporting data to spreadsheets, using personal ChatGPT accounts instead of the sanctioned corporate AI, reverting to email threads for processes that were supposed to move to a collaboration platform. WalkMe's analysis identified dozens of shadow AI tools (Perplexity, Claude, Glean) quietly gaining usage across enterprises while the official AI stack sat underutilized. Companies end up paying enterprise AI licensing fees while employees use consumer tools that skip every security and governance guardrail.
Then there's the quieter damage: teams adopting unevenly. One division becomes fluent with an AI tool; another treats it as optional. Adoption becomes tribal. On average, employees waste 36 working days every year on technology frustrations, including navigating tools they weren't properly trained on. The efficiency gains projected in your business case concentrate in a fraction of your workforce, and the org-wide numbers stay flat.
Why Mid-Market Companies Get Hit Hardest
Large enterprises have the budget for dedicated change management offices, internal training academies, and digital transformation teams focused exclusively on adoption. Mid-market companies — typically $50M to $1B in revenue — rarely have that luxury.
RSM's 2025 AI survey found that generative AI adoption among middle-market firms has surged to 91%, up from 77% the year before. Only one in four of those organizations reports that AI is fully integrated into core operations. The rest are somewhere between "running pilots" and "signed a contract, hoping for the best."
That gap is structural. Mid-market IT teams are stretched across infrastructure, security, compliance, and support. Change management is often somebody's part-time second job. When AI tools land on employees' desks without embedded workflows, in-context training, or utilization tracking, they don't stick.
Mid-market companies end up carrying enterprise-sized AI bills while their adoption rates look more like a Series A startup's.
The Three Layers — and Where Most Rollouts Actually Fail
The organizations getting real returns from AI aren't treating deployment as a one-time IT project. They run it as a continuous organizational change initiative. There are three distinct layers to a successful rollout, and most companies only fund one of them.
The part everyone does: tool selection. Vendor evaluation, security reviews, proof-of-concept testing, contract negotiation. It gets all the attention and all the budget. The right tool is a prerequisite, not a strategy.
Where it actually falls apart: workflow integration. This doesn't mean the tool connects to your existing software stack via API. It means the tool fits into the actual, specific, day-to-day work of the people who are supposed to use it.
An AI writing assistant that requires employees to leave their primary work environment and open a separate tab gets used occasionally. The same capability embedded directly in the tool where writing already happens gets used constantly. The technology can be identical. Adoption rates won't be.
This layer requires answering a deceptively simple question: at what exact moment in an employee's day does this tool create value, and are we meeting them there? Forrester research commissioned by Whatfix makes this concrete: enterprises using Digital Adoption Platforms (in-context training and usage analytics embedded directly in applications) achieve faster time-to-value and higher returns than those relying on standalone documentation or training programs. The tool has to meet the employee in their workflow, not the other way around.
The layer nobody budgets for: ongoing measurement. Most companies measure deployment success by seat count — how many licenses are active, how many employees completed the onboarding checklist. These metrics tell you almost nothing about whether value is being generated.
What matters is utilization depth: how frequently employees use the tool, which features they engage with, where they drop off, and whether their usage is generating the intended outcomes. The Whatfix 2026 ROI report found that digital transformation initiatives tied to specific, measurable outcomes outperform those that aren't, by a margin worth building your entire measurement strategy around. Without this layer, you're flying blind. You might be one of the 79% of executives feeling confident about AI goals while your workforce quietly belongs to the 72% that doesn't know how to use the tools effectively.
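To make utilization depth concrete, here's a minimal sketch of the arithmetic. It assumes a hypothetical usage-event export, usage_events.csv, with one row per event and columns user_id, week, and feature; the file name and schema are illustrative, not any particular vendor's format.

```python
from collections import defaultdict
import csv

# week -> user_id -> set of features that user touched that week
events_by_week = defaultdict(lambda: defaultdict(set))

with open("usage_events.csv", newline="") as f:
    for row in csv.DictReader(f):
        events_by_week[row["week"]][row["user_id"]].add(row["feature"])

weeks = sorted(events_by_week)

for week in weeks:
    users = events_by_week[week]
    # Depth: average number of distinct features per active user
    depth = sum(len(feats) for feats in users.values()) / len(users)
    print(f"{week}: {len(users)} active users, {depth:.1f} features per active user")

# Drop-off: users active in the first observed week who are silent in the latest
churned = events_by_week[weeks[0]].keys() - events_by_week[weeks[-1]].keys()
print(f"Dropped off since {weeks[0]}: {len(churned)} users")
```

Even this crude cut separates "licenses are active" from "value is being generated": flat active-user counts with shrinking depth is exactly the pattern seat-count metrics hide.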
The Goldman Sachs Contrast
Goldman Sachs isn't a mid-market company. But how they approached AI deployment looks so different from the typical enterprise playbook that it's worth examining.
Goldman deployed AI assistants to all 46,000+ employees globally — not through a phased multi-year migration, but following a focused pilot with 10,000 employees that proved the model before scaling. More than 50% of employees actively use the platform, with a stated goal of 100% adoption by end of 2026. For context: most companies that announce enterprise-wide AI deployments are still measuring success by whether the tool is installed.
CIO Marco Argenti's approach rested on building for specific roles rather than generic users: instead of deploying a one-size-fits-all tool and leaving adoption to chance, Goldman built role-specific "copilots" tailored to the work of developers, analysts, and bankers, embedded in the workflows where that work actually happens. In a March 2025 Goldman podcast, Argenti described the trajectory toward what he called a "hybrid workforce" and acknowledged that getting there requires confronting AI illiteracy at every level of the organization, not just among frontline employees.
Argenti has also been direct about where the hard work lives: making AI agents understand and reflect company culture is one of the most difficult parts of any deployment. The technical integration is the easier problem. Getting an organization to genuinely change how it works is where most efforts stall — and it's where Goldman has spent the most effort.
Goldman also wired governance into the platform before scaling: security reviews, compliance checks, and audit trails built into the foundation from the start. Most organizations treat governance as the department that slows things down; Goldman treated it as part of the platform itself, which meant no retrofit delays when scale came.
No mid-market company can replicate Goldman's resources. But the strategic logic is fully portable: start with the use cases where gains are easiest to measure, build for specific roles, target adoption rather than deployment, and measure continuously.
A 3-Question Adoption Audit You Can Run This Week
Before signing another AI contract, answer these three questions about the tools you already have. This takes 30 minutes in a leadership meeting and will surface more signal about your AI strategy than most vendor presentations.
Question 1: What percentage of licensed seats are actively used each week?
Not activated, not checked off the onboarding list — actually opened and used to do work. Pull utilization data from your vendors. Every enterprise SaaS tool tracks this, and if yours doesn't offer it, that's its own red flag. Weekly active usage below 60% signals an adoption problem, not a tool problem. Buying more AI tools on top of that will compound the issue, not fix it.
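The arithmetic itself is trivial, which is part of the point. A minimal sketch, with made-up seat counts (the 60% line is the threshold from above, not an industry constant):

```python
# Hypothetical numbers: replace with figures from your vendor's usage export.
licensed_seats = 500
weekly_active_users = 212  # distinct users who actually opened the tool this week

active_pct = 100 * weekly_active_users / licensed_seats
print(f"Weekly active usage: {active_pct:.0f}% of licensed seats")  # 42%

if active_pct < 60:
    print("Below 60%: treat this as an adoption problem, not a tool problem.")
```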
Question 2: Can your managers describe, in one sentence, how each AI tool fits into their team's specific daily workflow?
Ask three managers this question tomorrow, unprompted. Blank stares, vague references to "it helps with productivity," or contradictory answers from each person all point to the same thing: the workflow integration layer is missing. The tool was deployed. It wasn't adopted.
Question 3: When did we last review utilization data, and what did we change based on it?
If the answer is "we haven't" or "we checked during rollout," the measurement layer is missing. Adoption isn't a launch event. Identify one metric — weekly active users, feature engagement rate, support ticket volume — and commit to reviewing it monthly.
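That monthly review doesn't require a BI project. A minimal sketch, assuming a hypothetical month-by-month series pulled by hand from your vendor's analytics (the numbers and the 5% alert threshold are illustrative):

```python
# Hypothetical average weekly-active-user counts by month; swap in real exports.
wau_by_month = {"2025-07": 198, "2025-08": 205, "2025-09": 171, "2025-10": 164}

months = sorted(wau_by_month)
for prev, curr in zip(months, months[1:]):
    # Month-over-month change in the one metric you committed to watching
    change = 100 * (wau_by_month[curr] - wau_by_month[prev]) / wau_by_month[prev]
    flag = "  <-- investigate" if change < -5 else ""
    print(f"{curr}: {wau_by_month[curr]} WAU ({change:+.1f}% MoM){flag}")
```

The deliverable isn't the script; it's the standing agenda item that forces someone to look at the trend and change something when it dips.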
These are the questions most organizations skip because they're distracted by whatever AI announcement dropped this week.
The Meter Is Already Running
The $10.9 million annual loss from poor digital adoption isn't a future risk. For most enterprises deploying AI right now, it's already happening. Every week of low utilization, every workaround an employee builds to avoid the official tool, every team that treats AI as optional — these are active costs, not missed opportunities.
The technology layer of AI implementation is largely figured out. The workforce readiness layer isn't, and most organizations are still pretending otherwise while the meter runs.