The 6 Constraints That Lead To Failed Software and IT Projects


We know software development can be fundamentally risky, but why? What are the causes of project failure? And how can these be mitigated?

‘Let us not look back in anger, nor forward in fear, but around in awareness.’ (James Thurber)


Six constraints affect whether a software project will be successful, challenged or a failure. In order of likelihood, these are: Budget, Customer Satisfaction, Quality, Met User Requirements, Risk and Time. The industry benchmark is 31% success, 50% challenged and 19% failure. In reviewing WorkingMouse’s last 15 projects, 46.7% were successful, 46.7% challenged, and 6.7% failed. This article also details strategies for mitigating each of the constraints.

Project States & Outcomes

The first step to raising our awareness is understanding the definition of a project’s state. Previously, I had a black and white view of this. However, since researching, I’ve found the Standish Group CHAOS reports add a ‘Challenged’ state. This means the project is over budget, over time, or delivered with fewer requirements than it needs to fulfil its purpose.

On top of defining the states, the report also identified the industry’s percentages of outcomes across all software projects. The figures are exclusive of size and complexity, whilst including all types, from custom software projects to off-the-shelf purchases.

The three states and their rates are:

  1. Success: 31%. Released and in use within budget and time, with required features.
  2. Challenged: 50%. Late, over budget, or with fewer than the required features.
  3. Failure: 19%. Cancelled before completion, or delivered but never used.

WorkingMouse Project Outcomes

I wasn’t surprised to see the majority sitting in the newly defined challenged state given the above numbers. It spurred me on to compare our health against the industry standard.

To achieve a fair review, I assessed the last 15 projects we’ve completed. We excluded scopes and focused solely on product development software projects.

Projects that were on budget, met requirements, and were delivered to production on time were considered a success.

Based upon this research, our product success, challenged and failure rates are: Success = 46.7%, Challenged = 46.7%, Failure = 6.7%.

I promise that I haven’t omitted data to make us look good. I will honestly say that a company such as WorkingMouse is only as good as its last project and skills. A look further back into our history may provide less favourable results. In saying that, this resonated with our value of being scientific but not heartless. As long as we learn and move on, we will constantly improve.

Mitigating The 6 Constraints

After reviewing the outcome percentages, we tagged the causes for challenged or failed projects. I thought it would be helpful to understand the likelihood of each constraint and mention the strategies we deploy to mitigate these, in descending order.

🏦 Budget 47%

No surprises here: public enemy #1, the dollar! In this example, we’re a professional services software development company.

We have three key strategies to mitigate budget overruns.

Scientifically Estimate

We estimate in a way that takes in the risks and complexity of each requirement whilst also allowing for all possible factors. As the build progresses, estimations may change, and there are always unknowns. The further we try to predict the future, the more inaccurate we are.
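The article doesn’t publish the estimation formula, but one common way to fold risk and complexity into an estimate is the three-point (PERT) technique; a sketch with hypothetical numbers:

```typescript
// Three-point (PERT) estimate: weight the most-likely case, but let the
// optimistic and pessimistic bounds pull the figure toward the risk.
function pertEstimate(optimistic: number, mostLikely: number, pessimistic: number): number {
  return (optimistic + 4 * mostLikely + pessimistic) / 6;
}

// The standard deviation gives a feel for how uncertain the estimate is.
function pertStdDev(optimistic: number, pessimistic: number): number {
  return (pessimistic - optimistic) / 6;
}

// Hypothetical requirement: 3 days best case, 5 expected, 13 if risks land.
const days = pertEstimate(3, 5, 13); // (3 + 20 + 13) / 6 = 6
console.log(days, pertStdDev(3, 13));
```

The wider the spread between best and worst case, the bigger the standard deviation, which is exactly the signal that a requirement is risky and its estimate should not be trusted too far into the future.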

Development Options

Commonly referred to as trade-off sliders. Every engagement we undertake is on a fixed-time, variable-scope basis, which works hand in hand with the Scrum process. It enables the Product Owner to set a budget based on the estimations. As the time nears completion, more scope may be added or varied or, more commonly, the remainder is de-prioritised.

Prioritisation

This is the key to working successfully, and one of the Product Owner’s hardest calls to make. We ensure that every requirement is set to a Must, Should or Could before development. We focus our development iterations on the Musts so that when time is running low, it’s the Shoulds and Coulds that end up on the editing room floor.
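As a sketch of fixed-time, variable-scope planning (the types and numbers are mine, not WorkingMouse’s), ordering the backlog Musts-first means anything that doesn’t fit the remaining time is exactly the Shoulds and Coulds:

```typescript
type Priority = "Must" | "Should" | "Could";

interface Requirement {
  title: string;
  priority: Priority;
  estimateDays: number;
}

const rank: Record<Priority, number> = { Must: 0, Should: 1, Could: 2 };

// Order the backlog Musts-first, then greedily fill the remaining budget;
// whatever doesn't fit is what lands on the "editing room floor".
function planIteration(backlog: Requirement[], daysAvailable: number): Requirement[] {
  const ordered = [...backlog].sort((a, b) => rank[a.priority] - rank[b.priority]);
  const planned: Requirement[] = [];
  let used = 0;
  for (const req of ordered) {
    if (used + req.estimateDays <= daysAvailable) {
      planned.push(req);
      used += req.estimateDays;
    }
  }
  return planned;
}
```

With nine days left, a five-day Must and a four-day Should make the cut while a three-day Could is de-prioritised, which is the trade-off slider in action.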

😌 Customer Satisfaction 33%

This was a surprise to me, as it came up four times across 15 projects. It’s also an important constraint to define, as I believe it can be vague. I would describe this as the customer being satisfied that their objective has been achieved. However, I would build upon this and argue that satisfaction is not covered unless they are happy with the journey to get there. So, satisfaction should include the process, the outcome and the engagement they received along the way.

In terms of measurement, we use a CSAT score on a four-weekly basis, captured through interviews by the account manager. Our average result since implementing this method has been 60% against a goal of 75%. Given that ~53% of our projects were challenged or failed, this seems to line up. It also shows opportunities for improvement. There are two key strategies that we’re focusing on here.
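The article doesn’t state the exact formula, but CSAT is conventionally calculated as the percentage of respondents who answer 4 or 5 on a 1-to-5 scale; a minimal sketch:

```typescript
// Conventional CSAT: the percentage of respondents rating 4 or 5
// ("satisfied" or "very satisfied") on a 1-5 scale. This is the standard
// definition, not necessarily WorkingMouse's exact internal formula.
function csat(ratings: number[]): number {
  if (ratings.length === 0) return 0;
  const satisfied = ratings.filter((r) => r >= 4).length;
  return (satisfied * 100) / ratings.length;
}

// Hypothetical four-weekly interview round: 3 of 5 customers satisfied.
console.log(csat([5, 4, 5, 2, 3])); // 60
```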


Way of Working

We follow a Lean Scrum process with well defined and documented stages and Standard Operating Procedures. We call this our Way of Working, and it enables both parties to understand each stage’s inputs, outputs, workflow and expectations.

People Continuity

This is key in a high-touch, low-volume environment. Customers become frustrated when the team changes from Scoping to Development and into Support. Having to re-explain and elaborate on issues, problems and solutions is a severe drain on customer effort. We now mitigate this by keeping teams consistent and having Squad Leads (project managers) involved in all active stages of Scoping and Development and, where possible, persisting the development team from Scope to Development and back again.

Over the last two years, we’ve introduced the concept of flexible velocity into our projects. Customers can speed up or slow down their projects with extra team members; however, we achieved this at the cost of a fixed squad structure. We’re moving back to the fixed squad process. On top of this, the account manager doesn’t change. This forms a good-sized triangle of communication between the Product Owner (customer), the Squad Lead and the Account Manager. The account manager and customer focus on strategy, whilst the Squad Lead focuses on tactics. The key here is trust and placing the needs of the product first.

🙌 Met User Requirements 27%

Another tricky one. Let’s keep this as simple as possible. ‘Met User Requirements’ = the scope of work was achieved and released within the time and budget, and was enough to solve the problem the build was attempting to resolve.

Start with a problem

When embarking on a new build or engagement, we always start with a defined problem. It ensures we’re not taking unvalidated orders, saves time, and means that we can solve the problem without a pre-defined solution.

Focus on the end-user

Throughout Scope and Development, we focus on the end-user. There are many ways we do this: user discovery interviews, user prototyping, user stories in development and beta releases, to name a few. To understand all the aspects of this, I recommend reading our Guide to creating a successful software product.

As before, under Budget: prioritisation.

👌 Quality 27%

Okay, there’s a whole can of worms here regarding quality, so I am going to keep it simple again. Quality = the quality of the developed product. There are numerous aspects to this, so we’ll focus on the top ones.

User Acceptance Testing

At the end of every iteration, the project is deployed to beta. The Product Owner then has the opportunity to validate that the work matches the defined user acceptance criteria on every ticket.


Automated Testing

This is the process of ensuring every ticket created has an automated or documented test confirming it works correctly. This way, you know if a release has detracted from other requirements by causing a test to fail. Once all tests pass, you know the software is working as required and is safe to release to production.
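As an illustration (the feature and all names here are hypothetical, not from the article), a ticket’s acceptance criteria can be captured as an executable check, so a later release that breaks the behaviour fails the test before reaching production:

```typescript
// Hypothetical feature under test: a ticket "Users can reset their
// password" with the acceptance criterion "reset tokens are single-use".
class PasswordReset {
  private used = new Set<string>();

  redeem(token: string): boolean {
    if (this.used.has(token)) return false; // token already spent
    this.used.add(token);
    return true;
  }
}

// The ticket's acceptance criteria written as an automated test.
function testTokenIsSingleUse(): void {
  const reset = new PasswordReset();
  if (!reset.redeem("abc123")) throw new Error("first redeem should succeed");
  if (reset.redeem("abc123")) throw new Error("second redeem must be rejected");
}

testTokenIsSingleUse();
```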

Performance & Technology

At WorkingMouse, we commonly use a C# and React tech stack called C#Bot from Codebots. We maintain C#Bot constantly to ensure it is up to date regarding open-source technologies, security and performance.

Code Quality

We always have a minimum of two developers on projects, and all code has to pass a peer review process before being considered complete as part of quality assurance.


Security

We follow a Triple-A approach to security and ensure that all applications pass the OWASP Top 10.

⚠️ Risk 13%

Unfortunately, risk is everywhere and in everything when working in custom software development. The best way to address risk is through a clear and transparent approach.

Risk Register

We maintain a risk register for the lifetime of every project and document each risk’s likelihood of eventuating and its potential impact. We discuss the risk register in every meeting, and any project stakeholder can flag a risk.
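As a sketch of what such a register might look like (the fields and 1-to-5 scales here are illustrative, not WorkingMouse’s actual template), a common scheme scores each entry as likelihood times impact and discusses the riskiest items first:

```typescript
type Level = 1 | 2 | 3 | 4 | 5;

interface RiskEntry {
  description: string;
  raisedBy: string;   // any project stakeholder can flag a risk
  likelihood: Level;  // chance of the risk eventuating
  impact: Level;      // damage to budget/time/scope if it does
}

// Likelihood x impact scoring; treat the scale itself as illustrative.
const severity = (r: RiskEntry): number => r.likelihood * r.impact;

// Sort the register so the riskiest items lead each meeting's discussion.
function agenda(register: RiskEntry[]): RiskEntry[] {
  return [...register].sort((a, b) => severity(b) - severity(a));
}
```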

⏳ Time 13%

In our business as a professional services agency, time is money! However, due to all of the above constraints, it’s often the most affected.


Sometimes, contributing factors will force a firm deadline. In this case, we can increase the team velocity to meet the demand. However, the trade-off is that the speed gain diminishes with every team member added, due to extra communication lines and complexity. We usually say each developer adds a factor of 1.6 instead of 2 to the velocity.
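One way to read that rule of thumb is that each added developer multiplies velocity by 1.6 rather than the naive 2.0; a hypothetical sketch of the arithmetic, not a formula the article states for larger teams:

```typescript
// Reading of the 1.6 rule: adding a developer multiplies velocity by 1.6
// instead of 2.0, because extra communication lines eat into the gain.
// Illustrative only; the article quotes the factor for one added developer.
function teamVelocity(baseVelocity: number, developers: number): number {
  return baseVelocity * Math.pow(1.6, developers - 1);
}

// One developer at 10 points per iteration; a pair reaches 16, not 20.
console.log(teamVelocity(10, 2));
```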


There are so many unknown unknowns in software development. Time is only a social construct, and deadlines usually come from customer promises. Our recommendation to our customers is to promise what, but not when. This way, you avoid unrealistic expectations. Quality is a must, so it’s better to wait and get it right the first time than to rush it through.


A more informed conversation can take place when you’re aware of these constraints and the strategies to mitigate them. However, the key to dealing with them is understanding the limitations and ensuring the team relationship comes first. If the project sponsor and team understand that delays happen, it’s okay, as long as everyone is learning from those mistakes and mitigating the project’s constraints going forward.



David Burkett

Growth enthusiast and resident pom
