The tool was delivered. It works technically. The training was done. And yet, three months later, half of the team has returned to Excel and the other half uses the tool only because the boss insists.
On paper, the project is a success; operationally, it is not.
Around 70% of digital transformation initiatives do not meet their objectives. [1]
72% of failures are attributed to insufficient management support and employee resistance. [2]
Bain found that 88% of business transformations fall short of their original ambitions. [1]
These data do not describe technically failed projects: they describe technically correct projects that did not generate adoption.
The Four Real Causes of Non-Adoption
1) The tool solves the problem of the client, not the user
The chief operating officer sees the problem from his level: “I need visibility of X, I need Y to be registered, I need Z to be approved in a traceable way.”
The operator who is going to use the tool ten times a day sees it from his own: “this screen has too many fields, this flow has three more steps than I need, this is slower than the Excel I was already using”. [6]
Prosci's ADKAR framework identifies "Desire" (the user needs to want to use the tool, not just know how) as one of the necessary conditions for adoption. [3]
A tool that solves the client's problem but adds friction for the user who operates it does not generate that Desire.
Without observing real users during discovery, that mismatch is not detected until the tool is in production.
2) There is no owner of the process after launch
The tool is launched. The project is closed. The team that built it moves on to the next project.
Who gets the alert when the flow fails?
Who decides if that bug is urgent or can wait?
Who updates the tool when a business rule changes? [6]
Without a defined owner from the start, the tool is software in production without maintenance.
Failures pile up. Workarounds come back. Within six months, no one remembers exactly how it works.
3) Training occurs once and there is no subsequent support
Launch training teaches how to use the tool. It doesn't teach why to use it, what happens if something goes wrong, or how to solve cases that weren't in the demo.
83% of employees who experience change fatigue say their organizations don't provide enough resources to help them adapt. [2]
Without a post-launch support process, the tool is left at the mercy of the memory of those who attended the initial training.
4) Actual adoption is not measured
“Users have it assigned” is not adoption. Adoption is how many times a day each user opens it and for what purpose.
Companies that continuously monitor adoption metrics report up to 30% more ROI on digital initiatives. [5]
Without those metrics defined from the start, there is no warning sign when the tool starts to be ignored. Abandonment is silent until someone asks and discovers that no one uses it.
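To make "adoption is how many times a day each user opens it" concrete, here is a minimal sketch of an adoption report built from a hypothetical event log (user, day, action). The schema, the 14-day inactivity threshold, and the user names are illustrative assumptions, not part of any specific tool:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log exported from the tool: (user_id, day, action).
events = [
    ("ana", date(2025, 3, 3), "open"),
    ("ana", date(2025, 3, 3), "approve"),
    ("ana", date(2025, 3, 4), "open"),
    ("luis", date(2025, 3, 3), "open"),
    ("marta", date(2025, 2, 10), "open"),  # last opened weeks ago
]

def adoption_report(events, assigned_users, as_of, max_idle_days=14):
    """Count opens per user and flag silent abandonment:
    users who have the tool assigned but have not opened it recently."""
    opens = defaultdict(int)
    last_seen = {}
    for user, day, action in events:
        if action == "open":
            opens[user] += 1
            last_seen[user] = max(last_seen.get(user, day), day)
    inactive = sorted(
        u for u in assigned_users
        if u not in last_seen or (as_of - last_seen[u]).days > max_idle_days
    )
    return dict(opens), inactive

opens, inactive = adoption_report(
    events, {"ana", "luis", "marta", "pedro"}, as_of=date(2025, 3, 10)
)
print(opens)     # opens per user
print(inactive)  # assigned but not actually using the tool
```

The point of the `inactive` list is exactly the warning sign the text describes: "users have it assigned" (pedro, marta) diverges from actual usage, and that divergence becomes visible before someone discovers it by accident.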
What changes when adoption is designed from the start
Adoption is not a training module at the end of the project. It's a dimension of design from the start.
It means involving real users in discovery, designing the tool for the workflow of the person who will use it (not just for the visibility of the person in charge of it), defining a process owner before go-live, establishing usage metrics from day one, and designing post-launch support before delivery.
Organizations that consolidate their shadow IT before implementing new tools achieve around 23% more adoption. [4]
60% of employees build software outside of IT supervision because the process of adopting the official tool has more friction than its informal alternative. [7]
Adoption is designed before building, not recovered after delivery.
A tool that no one uses is not a finished project.
It is a project that cost time and money and did not produce the operational change that justified it.
The difference between an internal tool that generates adoption and one that is abandoned in six months is usually not in the technology: it is whether someone asked the team that was going to use it what they really needed, and whether someone was left responsible for it to work well after the launch.
References
1. Melting Spot Blog. (2025). Digital Transformation Failure Rate 2025 — Why 70% of Projects Still Fail. https://blog.meltingspot.io/why-digital-transformation-projects-fail/ — In 2026, around 70% of digital transformation initiatives are still not meeting their objectives. Gartner estimates that only about 48% of projects meet or exceed their objectives. Bain (2024) found that 88% of transformations do not achieve their original ambitions. The global cost of these failed initiatives is estimated at $2.3 trillion annually.
2. Mooncamp. (2024). 65+ Change Management Statistics for Success in 2026. https://mooncamp.com/blog/change-management-statistics — Only 34% of major change initiatives achieve complete success. 72% of transformation failures are attributed to insufficient management support (33%) and employee resistance (39%). Transformations focused on technology rather than strategic objectives are twice as likely to fail.
3. Prosci. (2024). Change Management Best Practices — 12th Edition. https://www.prosci.com/blog/change-management-trends-2024-and-beyond — Prosci's ADKAR (Awareness, Desire, Knowledge, Ability, Reinforcement) framework identifies the five conditions necessary for an individual to successfully adopt change. The absence of any of the five generates resistance or abandonment. In internal tool projects, the most frequently omitted are Desire (the user doesn't see why they should use the new tool) and Reinforcement (there's no post-launch follow-up).
4. Deloitte. (2023). Automation with Intelligence: 2022 Global Automation Survey. Deloitte Insights. https://www.deloitte.com/us/en/insights/topics/talent/intelligent-automation-2022-survey-results.html — Organizations that consolidate their shadow IT before implementing automation achieve around 23% more adoption of new flows than those that automate on a fragmented ecosystem. Adoption is also higher when the automated process was designed with direct input from the users who are going to use it.
5. Number Analytics. (2025). A Complete Guide to Technology Adoption Rates in 2024. https://www.numberanalytics.com/blog/complete-guide-technology-adoption-rates-2024 — Companies that continuously monitor adoption metrics report up to 30% more ROI on digital initiatives (Gartner, 2023). Without adoption metrics defined from the start, there is no warning sign when the tool starts to be ignored.
6. Nielsen Norman Group. (2024). Discovery: Definition. https://www.nngroup.com/articles/discovery-phase/ — The design must be based on a real understanding of users, their needs, and their work contexts. A tool that solves the boss's problem but not that of the user who operates it on a daily basis does not generate adoption, no matter how good it is technically.
7. Retool. (2026). The Build vs. Buy Shift: How Vibe Coding and Shadow IT Have Shaped Enterprise Software. BusinessWire. https://www.businesswire.com/news/home/20260217548274 — 60% of respondents built software outside of IT supervision in the past year. The pattern is the same one that generates abandoned tools: the process of adopting the official tool is more expensive than the informal workaround you are already familiar with.