Legacy software is arguably the number one problem faced by the software industry today. As time passes, the software's entropy increases and it becomes harder and harder to work with. The accumulation of technical debt can eventually reach a point where momentum is lost and a rewrite seems feasible. A rewrite can feel like the right option, but the astutely aware will recognise that the new software will also suffer from software entropy, and so the cycle of legacy continues.
"Insanity: doing the same thing over and over again and expecting different results." - Albert Einstein
A software modernisation project is essentially about identifying the requirements and functionality of the legacy system, and then making those requirements available in a newer, better system. This is a simplified viewpoint, as software modernisation projects usually take the opportunity to address other issues as well, but this postulate will form the basis of the argument that follows.
The problem is that the requirements of a software system are written into the code by developers. A developer reads through the requirements, interprets their meaning, and then writes the code. This may seem normal, but that's because the problem is subtle. If a requirement changes, then the developer must update the code. Alternatively, if the code changes, then someone must update the requirements. The connection between requirements and code is a manual one. If the developer is removed from the project, i.e. leaves the business or is otherwise no longer available, then the problem is amplified even further.
The manual connection between the requirements and the code is weak. Ideally, if there were a stronger connection, the requirements could remain while the code was modernised underneath.
Requirements are an abstract representation of the code. Abstraction is a fundamental principle of computer science. So, the question is: can we leverage the concept of abstraction to make a stronger connection between the requirements and code? The answer is yes, but let's first look at an example of abstraction in science to cement our understanding.
Physicists are currently searching for the Theory of Everything (ToE). The theory will be able to describe the complexity of the universe within a single framework. The challenge is to combine the two modern pillars of general relativity and quantum mechanics into a theory of everything. What has been discovered is that the world is made up of layers of increasingly smaller size. For example, an apple is made of atoms; atoms are made of protons, neutrons and electrons; protons and neutrons are made of quarks; and string theorists believe that quarks and electrons are made from strings. String theory is a possible path to a theory of everything.
Each layer of the physical world can be used to build a higher layer and eventually you end up with the world as we see it. In a sense, each higher layer is more abstract than the one below. Drawing on this as an analogy, computer scientists use a similar approach to representing software. Software is made from models such as the requirements. The models are built from elements in the meta-model such as entities, services, etc. The meta-model elements are built from the meta-meta-model which provides the basic building blocks of everything.
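The layering described above can be made concrete with a small sketch. The class and element names below (`Entity`, `Service`, and the `Customer`/`Billing` model elements) are purely hypothetical, chosen to illustrate the idea: the meta-model supplies the building blocks, and a model of the software is assembled from them.

```python
# Meta-model layer: the building blocks that models are made from.
# These class names are illustrative, not from any particular modelling tool.
class Entity:
    """A meta-model element representing a thing the software tracks."""
    def __init__(self, name, fields):
        self.name = name        # e.g. "Customer"
        self.fields = fields    # field name -> type name, e.g. {"email": "str"}

class Service:
    """A meta-model element representing behaviour the software offers."""
    def __init__(self, name, operations):
        self.name = name
        self.operations = operations

# Model layer: a concrete model of our (hypothetical) software,
# built entirely from meta-model elements.
model = [
    Entity("Customer", {"name": "str", "email": "str"}),
    Service("Billing", ["invoice_customer", "refund_customer"]),
]

for element in model:
    print(type(element).__name__, element.name)
```

In the same way that atoms build apples, meta-model elements build models, and the model captures the requirements in a form a machine can read rather than prose only a developer can interpret.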
The names physicists use are far better than the ones computer scientists came up with, and the naming may be part of the reason why modelling in computer science is so misunderstood. Regardless, if you keep the analogy above in mind you will be fine. The important point is this: by formalising the layers, we now have the makings of a strong connection between the requirements and the code. This is where it starts to get really interesting.
Imagine that we have created a model of the software. The model is a good abstract representation of the software and includes the requirements, state and behaviour - all describing the intent of the software. At this point, it's possible to use code generators (or as we like to call them - robots) to write the code for you. This is possible because there is a strong connection between the layers, and the model sufficiently describes the software.
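To make the idea of a "robot" tangible, here is a minimal sketch of a code generator, under the same assumptions as before: a hypothetical meta-model with a single `Entity` element. The generator walks the model and emits Python source for each entity; everything here is illustrative, not a real product.

```python
# Meta-model: the building block our model is made from (hypothetical).
class Entity:
    def __init__(self, name, fields):
        self.name = name
        self.fields = fields  # field name -> type name, e.g. {"email": "str"}

def generate_class(entity):
    """The 'robot': generate Python source for a data class from a model entity."""
    args = ", ".join(f"{f}: {t}" for f, t in entity.fields.items())
    body = "\n".join(f"        self.{f} = {f}" for f in entity.fields)
    return (
        f"class {entity.name}:\n"
        f"    def __init__(self, {args}):\n"
        f"{body}\n"
    )

# Model: the abstract representation of the software.
model = [Entity("Customer", {"name": "str", "email": "str"})]

# The robot writes the code for us.
for entity in model:
    print(generate_class(entity))
```

Running this prints a ready-to-use `Customer` class. The key point is that the model, not the generated code, is the artefact we maintain: change the model and regenerate, or swap the generator to target a different language or framework, and the requirements survive the modernisation.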
This hypothetical scenario is a reality today. The relevant areas of academic research fall under several umbrellas, including model-driven engineering, domain-specific languages and product-line engineering. As always, there is a large gap between research and industry. At the moment, industry is starting to experiment with how to unlock this inherent power but, like all doors, you need the right key.
Finally, let's return to the problem stated at the beginning of this article: legacy is plaguing the software industry. If the requirements of a software system are captured at a higher level of abstraction in models, and the models are used as first-class artefacts in the software process by having robots generate the code, the software becomes far less susceptible to legacy. This is the beginning of a software developer's Theory of Everything and, like the physicists, we have a long journey ahead. However, with plenty of emerging evidence, this is how to migrate to the cloud.