Even though the Project Management Institute (PMI) began delivering its Agile Certified Practitioner program some years ago, many veteran project managers still disregard agile methods and stress the importance of Waterfall approaches. Their rationale is simple:
Premise: many projects fail because of poor planning, analysis and design; hence, more upfront planning and better analysis decrease the risk of failure.
The literature over the last decade clearly shows the reasons for project failure. Whether it's Forbes, Standish, PMI or Gartner, they all report very similar delivery outcomes for traditional project management frameworks.
The Project Management Institute identifies six factors that must be met for a project to be successful:
- Customers are happy
- Costs don't exceed the budget
- It works as designed
- People use it
- The people who funded the project are happy with it
- It meets the goals that drove the project
According to many sources over the last decade, the outcomes defined by PMI for success are rarely achieved. So, why do projects fail?
Why do the experts say projects fail?
IAG Consulting
- Companies with poor business analysis capability will have three times as many project failures as successes.
- 68% of companies are more likely to have a marginal project or outright failure than a success due to the way they approach business analysis. In fact, 50% of this group's projects were "runaways", meeting any two of the following: taking over 180% of target time to deliver; consuming in excess of 160% of estimated budget; or delivering under 70% of the target required functionality.
- Companies pay a premium of as much as 60% on time and budget when they use poor requirements practices on their projects.
- Over 41% of the IT development budget for software, staff and external professional services will be consumed by poor requirements at the average company, using average analysts, compared with the optimal organization.
- The vast majority of projects surveyed did not utilize sufficient business analysis skill to consistently bring projects in on time and budget. The level of competency required is higher than that employed within projects for 70% of the companies surveyed.
Forrester
- Poorly defined applications (miscommunication between business and IT) contribute to a 66% project failure rate, costing U.S. businesses at least $30 billion every year.
Meta Group
- 60% – 80% of project failures can be attributed directly to poor requirements gathering, analysis, and management.
Gartner
- 50% of projects are rolled back out of production.
- 40% of problems are found by end users.
Carnegie Mellon
- 25% – 40% of all spending on projects is wasted as a result of re-work.
Dynamic Markets Limited
- Up to 80% of budgets are consumed fixing self-inflicted problems.
Is more upfront planning and analysis the solution?
The natural tendency in traditional, 20th-century project management practice is to attempt to reduce risk through more planning, user research, requirements analysis and definition. Unfortunately, these activities just create a false sense of security that the future solution is truly knowable through extensive analysis, definition and planning. It's arrogant at best. You can't predict a hand of poker, so why would anyone think that planning and requirements gathering will accurately predict people's requirements 6, 12 or 24 months away?
In a complex environment, where more is unknown than known, the only true way to reduce risk is to apply complexity theory. Can we predict the outcome? Can we know what problems will occur during integration? Do stakeholders know what they want before they see it? Can we account today for what might happen tomorrow, next month, or in six months' time? Complexity theory helps us understand the future by doing the following:
- Conducting “fail fast” experiments with short timeframes or work cycles. Essentially, don't do months of work before asking for feedback or assessing alignment to a goal; do a few weeks of work, produce an outcome, and then assess what to do next.
- Examining the results when the experiment has concluded and providing fast feedback. Do this in 2-week cycles (“Sprints”) and produce something that can be inspected. This means don't just document or collect requirements: produce a working solution. People can comment on a working solution more easily, and give feedback with greater certainty about what they want next, than they can on requirements in a Word document.
- Then, choosing what experiment to conduct next. Only with feedback on something tangible can you truly know what to do next: deliver more of the same, or change tack and deliver something different. Ultimately, if you've chosen the wrong solution, you have only lost two weeks instead of months.
This way of working is what Dave Snowden’s Cynefin framework refers to as “Probe, Sense, Respond”.
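As a purely illustrative sketch, the probe, sense, respond cycle described above can be expressed as a simple loop. Nothing here comes from Cynefin or Scrum themselves; every name and signal in this snippet is hypothetical, chosen only to make the idea concrete:

```python
def probe_sense_respond(experiments, max_cycles=6):
    """Run short experiments in order; after each one, use the feedback
    signal to decide whether to persevere with the plan or pivot.

    Each experiment is a callable returning (signal, outcome), where
    signal is "persevere" or "pivot". All names are hypothetical.
    """
    outcomes = []
    queue = list(experiments)
    for _ in range(max_cycles):
        if not queue:
            break
        experiment = queue.pop(0)        # Probe: run one short work cycle
        signal, outcome = experiment()   # Sense: inspect a tangible result
        outcomes.append(outcome)
        if signal == "pivot":            # Respond: abandon the stale plan,
            queue.clear()                # losing two weeks, not months
    return outcomes


# Two cycles of feedback; the "pivot" signal stops the pre-planned queue.
plan = [
    lambda: ("persevere", "checkout flow works"),
    lambda: ("pivot", "users ignore the dashboard"),
    lambda: ("persevere", "never reached"),
]
print(probe_sense_respond(plan))
```

The point of the sketch is the shape of the loop: feedback from each small, inspectable outcome decides the next experiment, rather than a plan fixed months in advance.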
More upfront analysis, requirements documentation, and planning – typified by linear, Waterfall delivery frameworks – will only reduce risk and project failure in a “complicated” or “simple” domain. It won't help in a complex domain.
Experiments are designed to build knowledge
Good planning, analysis and design are critical to project success, as are communication and a shared vision of what is being delivered. The fallacy is assuming that all this effort must be done entirely up-front, in totality, by specialists, and then handed down to developers in the form of documentation for them to interpret. Documented requirements can only capture explicit knowledge. They can't capture the tacit knowledge gained about the subtleties of the context and its associated needs.
An experiment, on the other hand, creates a shared experience. The team collectively discovers whether an assumed solution creates the desired outcome.

Agile frameworks are more successful
The CHAOS Report by the Standish Group (2020) shows that agile projects are more successful than Waterfall projects, with fewer challenged and fewer failed projects.