We know software development can be fundamentally risky, but why? What are the causes of project failure? And, how can these be mitigated?
'let us not look back in anger, nor forward in fear, but around in awareness.' (James Thurber)
Six constraints affect whether a software project will be successful, challenged or a failure. In order of likelihood, these are: budget, customer satisfaction, quality, met user requirements, risk and time. The industry benchmark is 31% success, 50% challenged and 19% failure. In reviewing WorkingMouse's last 15 projects, 46.7% were successful, 46.7% challenged, and 6.7% failed. The article also details strategies for mitigating each of the constraints.
Project States & Outcomes
The first step to raising our awareness is understanding the definition of a project's state. Previously, I had a black and white view of this. However, since researching, I've found the Standish Group's CHAOS reports add a 'Challenged' state. This means the project was over budget or time, or was delivered with fewer requirements than it needed to fulfil its purpose.
On top of defining the states, the report also identified the industry's percentages of outcomes across all software projects, irrespective of size and complexity, and covering everything from custom software projects to off-the-shelf purchases.
The three states and their rates are:
- Success: 31%. Released and in use within budget and time, with the required features.
- Challenged: 50%. Late, over budget, or delivered with fewer features than required.
- Failure: 19%. Cancelled before completion, or delivered but never used.
WorkingMouse Project Outcomes
I wasn't surprised to see the majority sitting in the newly defined challenged state given the above numbers. It spurred me on to compare our health against the industry standard.
To achieve a fair review, I assessed the last 15 projects we've completed. We excluded scopes and focused solely on product development software projects.
Projects that were on budget, met requirements and were delivered to production on time were considered a success.
Based upon this research, our success, challenged and failure rates are: Success = 46.7%, Challenged = 46.7%, Failure = 6.7%.
I promise that I haven't omitted data to make us look good. I will honestly say that a company such as WorkingMouse is only as good as its last project and skills. A look further back into our history may provide less favourable results. In saying that, this resonates with our value of scientific but not heartless. As long as we learn and move on, we will constantly improve.
Mitigating The 6 Constraints
After reviewing the outcome percentages, we tagged the causes of challenged or failed projects. I thought it would be helpful to understand the likelihood of each constraint and mention the strategies we deploy to mitigate these, in descending order.
🏦 Budget 47%
No surprises here, public enemy #1: the dollar bucks! For context, we're a professional services software development company.

We have three key strategies to mitigate budget overruns.
We estimate in a way that takes in the risks and complexity of each milestone while allowing for all possible factors. As the build progresses, estimations may change, and there are always unknowns. The further we try to predict into the future, the less accurate we are.
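To make the idea concrete, here is a minimal sketch of how risk and complexity can widen an estimate into a range rather than a single number. The multiplier values and function are illustrative assumptions, not WorkingMouse's actual estimation model.

```python
# Hedged sketch: base effort scaled by complexity and risk multipliers.
# The 1.5x pessimistic band is an illustrative assumption only.
def estimate_milestone(base_days, complexity=1.0, risk=1.0):
    """Return an (optimistic, pessimistic) range; further-out milestones
    warrant a wider uncertainty band."""
    expected = base_days * complexity * risk
    return (round(expected), round(expected * 1.5))

print(estimate_milestone(10, complexity=1.2, risk=1.1))  # -> (13, 20)
```

Presenting a range rather than a point estimate makes the inherent uncertainty visible to the product owner from the start.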
Our second strategy is commonly referred to as trade-off sliders. Every engagement we undertake is on a fixed-time, variable-scope basis, which works hand in hand with the Scrum process. It enables the product owner to set a budget based on the estimations. As the time nears completion, more scope may be added or varied or, more commonly, the remainder is de-prioritised.
Prioritisation is the key to working successfully and one of the product owner's hardest calls to make. We ensure that every requirement is set to Must, Should, Could or Won't before development. We focus our development iterations on the Musts so that when time is running low, it's the Shoulds and Coulds that end up on the editing room floor.
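The MoSCoW cut described above can be sketched in a few lines: sort the backlog by priority and fill the iteration until capacity runs out. The tickets and estimates here are hypothetical examples, not real project data.

```python
# Minimal MoSCoW-style iteration planning sketch (hypothetical tickets).
MOSCOW_ORDER = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

def plan_iteration(backlog, capacity_days):
    """Select tickets in MoSCoW order until iteration capacity is used up."""
    planned, remaining = [], capacity_days
    for ticket in sorted(backlog, key=lambda t: MOSCOW_ORDER[t["priority"]]):
        if ticket["priority"] == "Won't":
            continue  # Won'ts never make the cut
        if ticket["estimate_days"] <= remaining:
            planned.append(ticket["name"])
            remaining -= ticket["estimate_days"]
    return planned

backlog = [
    {"name": "login",     "priority": "Must",  "estimate_days": 3},
    {"name": "reporting", "priority": "Could", "estimate_days": 5},
    {"name": "export",    "priority": "Should","estimate_days": 4},
    {"name": "dark mode", "priority": "Won't", "estimate_days": 2},
]
print(plan_iteration(backlog, 8))  # -> ['login', 'export']
```

With 8 days of capacity, the Must and Should fit, and the Could is the first thing left on the editing room floor.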
😌 Customer Satisfaction 33%
This was a surprise to me, as it came up four times across 15 projects. It's also an important one to define, as I believe it can be vague. I would describe this as the customer being satisfied that their objective has been achieved. However, I'd argue that satisfaction isn't covered unless they're also happy with the journey to get there. So satisfaction should include the process, the outcome and the engagement they received along the way.
In terms of measurement, we use a CSAT score captured every four weeks through interviews by the account manager. Our average result since implementing this method has been 60% against a goal of 75%. Given that ~53% of our projects were challenged or failed, this seems to line up. It also shows opportunities for improvement. There are two key strategies that we're focusing on here.
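For readers unfamiliar with CSAT: under the common definition, the score is the percentage of respondents who rate themselves satisfied (typically 4 or 5 on a 5-point scale). A minimal sketch, with made-up ratings:

```python
# Common CSAT definition: satisfied responses / total responses * 100.
# The ratings below are illustrative, not real survey data.
def csat_score(ratings, satisfied_threshold=4):
    satisfied = sum(1 for r in ratings if r >= satisfied_threshold)
    return round(100 * satisfied / len(ratings), 1)

print(csat_score([5, 4, 3, 5, 2]))  # 3 of 5 satisfied -> 60.0
```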
We follow a Lean Scrum Process, and it has well defined and documented stages with Standard Operating Procedures. We call this our Way of Working, and it enables both parties to understand each stage's inputs, output, workflow, and expectations.
This is key in a high-touch, low-volume environment. Customers become frustrated when the team changes from Scoping to Development and into Support. Having to re-explain and elaborate issues, problems and solutions is a severe drain on customer effort. We now mitigate this by keeping teams consistent and having Squad Leads (project managers) involved in all active stages of Scoping and Development and, where possible, persisting the development team from Scope to Development and back again.
Over the last two years, we've introduced the concept of flexible velocity into our projects. Customers can speed up or slow down their projects with extra team members; however, we achieved this at the cost of a fixed squad structure. We're moving back to the fixed squad process. On top of this, the account manager doesn't change. This forms a good-sized triangle of communication between the Product Owner (customer), Squad Lead and Account Manager. The account manager and customer focus on strategy, whilst the Squad Lead focuses on tactics. The key here is trust and placing the needs of the product first.
🙌 Met user Requirements 27%
Another tricky one. Let's keep this as simple as possible. 'Met user requirements' = the scope of work was achieved and released within time, and the budget was enough to solve the problem the build was attempting to resolve.
Start with a problem
When embarking on a new build or engagement, we always start with a defined problem. It ensures we're not taking unvalidated orders, saves time, and means that we can solve the problem without a pre-defined solution.
Focus on the end-user
Throughout Scoping and Development, we focus on the end-user. There are many ways we do this: user discovery interviews, user prototyping, user stories in development and beta releases, to name a few. To understand all the aspects of this, I recommend reading our Guide to creating a successful software product.
As before, under Budget: prioritisation.
👌 Quality 27%
Okay, there's a whole can of worms here regarding quality, so I'm going to keep it simple again. Quality = the quality of the developed product. There are numerous aspects to this, so we'll focus on the top ones.
User Acceptance Testing
At the end of every milestone, the project is deployed to beta. The Product owner then has the opportunity to validate that the work matches the defined user acceptance criteria on every ticket.
Automated Testing
This is the process of ensuring every ticket created has an automated or documented test to verify it's working correctly. This way, you know if a release has detracted from other requirements by causing a test to fail. Once all tests pass, you know the software is working as required and is safe to release to production.
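To illustrate the per-ticket testing idea, here's a hypothetical ticket ("invoice totals must include GST at 10% and round to whole cents") with its acceptance criteria expressed as an automated test. The feature, function name and figures are all invented for the example:

```python
# Hypothetical feature under test for one ticket's acceptance criteria.
def invoice_total(subtotal_cents, gst_rate=0.10):
    """Total in cents including GST, rounded to whole cents."""
    return round(subtotal_cents * (1 + gst_rate))

# The ticket's acceptance criteria as a regression test: if a later release
# breaks invoicing, this fails before anything reaches production.
def test_invoice_total_includes_gst():
    assert invoice_total(1000) == 1100  # $10.00 -> $11.00 incl. GST
    assert invoice_total(999) == 1099   # rounds to whole cents

test_invoice_total_includes_gst()
print("acceptance tests passed")
```

Each ticket accumulates tests like this, so the whole suite acts as a safety net for every subsequent release.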
Performance & Technology
At WorkingMouse, we commonly use a C# and React tech stack called C#Bot from Codebots. We maintain C#Bot constantly to ensure it stays up to date regarding open-source technologies, security and performance.
We always have a minimum of two developers on projects, and all code has to pass a peer review process before being considered complete as part of quality assurance.
⚠️ Risk 13%
Unfortunately, risk is everywhere and in everything when working in custom software development. The best way to address risk is through a clear and transparent approach.
We maintain a risk register for the lifetime of every project, documenting each risk's likelihood and potential impact. We discuss the risk register in every meeting, and any project stakeholder can flag a risk.
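A risk register can be as simple as a scored list. The sketch below assumes a common likelihood-times-impact scoring scheme (1-5 scales) and purely illustrative risks; it isn't WorkingMouse's actual register format.

```python
# Minimal risk-register sketch: score = likelihood x impact, highest first.
# Scales (1-5) and the risks themselves are illustrative assumptions.
def by_score(register):
    """Sort so the highest likelihood x impact risks are reviewed first."""
    return sorted(register, key=lambda r: r["likelihood"] * r["impact"],
                  reverse=True)

register = [
    {"risk": "Third-party API deprecation", "likelihood": 2, "impact": 4},
    {"risk": "Key developer unavailable",   "likelihood": 3, "impact": 3},
    {"risk": "Scope creep",                 "likelihood": 4, "impact": 3},
]
for r in by_score(register):
    print(f'{r["risk"]}: score {r["likelihood"] * r["impact"]}')
```

Reviewing the register top-down in every meeting keeps the biggest threats visible to all stakeholders.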
⏳ Time 13%
In our business as a professional services agency, time is money! However, due to all of the above constraints, it's often the most affected.
Sometimes, contributing factors will force a firm deadline. In this case, we can increase the team's velocity to meet the demand. However, the trade-off is diminishing returns with every team member added, due to extra communication lines and complexity. We usually say each additional developer adds a factor of 1.6, rather than 2, to the velocity.
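One way to read that rule of thumb: a second developer takes a team from 1.0x to roughly 1.6x (not 2.0x) throughput, i.e. each extra developer contributes about 0.6 of a solo developer. This interpretation and the numbers below are illustrative assumptions, not a precise model.

```python
# Illustrative model: each additional developer contributes 0.6 of a solo
# developer's throughput, reflecting communication overhead.
def effective_velocity(team_size, first=1.0, marginal=0.6):
    return round(first + marginal * (team_size - 1), 2)

for n in range(1, 5):
    print(n, "developers ->", effective_velocity(n), "x velocity")
# 2 developers -> 1.6x, matching the 1.6-instead-of-2 rule of thumb
```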
There are so many unknown unknowns in software development. Time is only a social construct, and deadlines usually come from customer promises. Our recommendation to customers is to promise what, but not when. This way, you avoid unrealistic expectations. Quality is a must, so it's better to wait and get it right the first time than to rush it through.
A more informed conversation can take place when you're aware of these constraints and the strategies to mitigate them. However, the key to dealing with them is understanding the limitations and ensuring the team relationship comes first. If the project sponsor and team understand that delays happen, it's okay, as long as everyone is learning from those mistakes and mitigating the project's constraints going forward.