Legacy systems represent a massive, long-term business investment, which makes a migration project an inevitable consequence of aging software. Legacy system migration has many traps, but like all traps, if you know they are there they can be avoided or even turned into an advantage. In this article I will talk about some of the do’s and don’ts from my experience working on these types of projects. At the end of the article I will give some practical advice and lay out a high-level strategy that has worked for me.
Conceptually, software applications can be split into two general parts: state and behaviour. The state of an application is its data and what that data means. The behaviour of an application is its functionality and how it acts. This split is ingrained into most software developers; we learn it very early through techniques like object-oriented programming, where an object is a way to encapsulate state and specify behaviour.
By splitting the application into these two general parts we can begin to form a plan of how to migrate a legacy system to the cloud.
There is great variation in how you end up needing to undergo a data migration. You may be doing a software modernisation project, in which case you only have to move from a legacy database to a cloud database. Or you may be part of a merger and acquisition, in which case you often end up with two or three separate systems that need to be combined. The challenges are different, but either way the most important thing is that you will need to do a lot of analysis and mapping.
Do not hand-code the mapping! You must find a tool, or set of tools, that will manage the mapping and perform the data migration. I use an approach to software called model-driven engineering (MDE), and one of the common things we do is called model-to-model (M2M) transformations. If you treat the legacy database schema as one model and the cloud database schema as another, then you can use M2M transformations to get from one to the other. My favourite M2M tools come from Eclipse.
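To make the idea concrete, here is a minimal sketch in Python of a declarative M2M-style mapping. Every name in it (the legacy columns, the target fields, the conversion functions) is a hypothetical illustration, not a real system; full M2M tools such as those in the Eclipse modelling ecosystem operate on entire metamodels, but the principle is the same: the mapping is data that drives a small engine, not hand-written per-field code.

```python
# Sketch of a model-to-model (M2M) mapping. The legacy schema and the cloud
# schema are both treated as models, and a declarative mapping table drives
# the transformation. All column/field names here are hypothetical.

# One row from the (hypothetical) legacy model.
legacy_row = {"CUST_ID": 42, "CUST_NM": "Acme Ltd", "CUST_ADDR": "1 Main St"}

# Declarative mapping: legacy column -> (target field, conversion function).
mapping = {
    "CUST_ID": ("customer_id", int),
    "CUST_NM": ("customer_name", str.strip),
    "CUST_ADDR": ("address", str.strip),
}

def transform(row, mapping):
    """Apply the mapping to one legacy row, producing a cloud-model record."""
    return {target: convert(row[src])
            for src, (target, convert) in mapping.items()}

cloud_record = transform(legacy_row, mapping)
```

Because the mapping is plain data, changing it when the analysis uncovers a new column is an edit to a table, not a code change scattered across the migration scripts.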
One last point I would like to make on data migration concerns the quality of the data you will encounter. A word of warning: it's most likely going to be horrible. Over the years it will have been updated and manipulated by many developers. Foreign keys will be missing, it won't be in third normal form, columns will be incorrectly named, or, my favourite, there will be the super column where someone dumps an array of data. All these problems add up, and they are the reason data migration is so difficult. This emphasises the need for a systematic mapping tool.
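As a hypothetical illustration of the super-column problem, here is a small Python sketch that splits a delimited dump of values back into the child rows it should have been. The column name, delimiter, and field names are all assumptions for the example.

```python
# The "super column" anti-pattern: one column holding a delimited array of
# values that should have been rows in a child table. Names are illustrative.

legacy_row = {"id": 7, "name": "Jane", "phone_numbers": "555-0101;555-0102;"}

def explode_super_column(row, column, child_field, delimiter=";"):
    """Normalise one multi-valued column into separate child rows,
    each keyed back to the parent record's id."""
    values = [v.strip() for v in row[column].split(delimiter) if v.strip()]
    return [{"parent_id": row["id"], child_field: v} for v in values]

phone_rows = explode_super_column(legacy_row, "phone_numbers", "phone_number")
# Each phone number becomes its own row referencing the parent record.
```

In practice the delimiter is rarely consistent and the values rarely clean, which is exactly why this kind of cleanup belongs in a systematic, testable transformation rather than ad hoc scripts.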
Data is the state of an application, while functionality is the behaviour. An example would be something like a workflow: it is a common business problem to raise some sort of work request, delegate that request to a worker, and then track the request through to its completion. If you can think of any sort of functionality in your favourite software, that is the behaviour.
Now for the bad news. Believe it or not, data migration is the easier of the two. Functionality migration is far more time consuming. The reason for this is simple. Once you have identified what the functionality is - hopefully documenting it into some requirements or design documents - a software developer then has to write the code for it. It’s like starting a greenfield project.
Now for the good news. Recall the approach to building software I mentioned earlier, model-driven engineering. Another common technique we use is the model-to-text (M2T) transformation. These are effectively code generators, robots of a sort, that can do a lot of the heavy lifting when it comes to writing the functionality. Using these techniques can save a lot of time, lift the quality of the project, and reduce the risk of the project failing.
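The M2T idea can be sketched in a few lines of Python: a small model of an entity is fed through a text template to produce source code. Real M2T tools in the Eclipse ecosystem are far richer than this, and the entity and template here are purely illustrative assumptions.

```python
# Sketch of a model-to-text (M2T) transformation: generate code from a model.
# The entity model and template below are hypothetical examples.

entity = {"name": "WorkRequest", "fields": ["title", "assignee", "status"]}

TEMPLATE = """class {name}:
    def __init__(self, {args}):
{assignments}"""

def generate_class(entity):
    """Render the entity model through the template, producing Python source."""
    args = ", ".join(entity["fields"])
    assignments = "\n".join(
        f"        self.{field} = {field}" for field in entity["fields"]
    )
    return TEMPLATE.format(name=entity["name"], args=args,
                           assignments=assignments)

print(generate_class(entity))
```

Once the templates exist, adding the fiftieth entity costs almost nothing, which is where the time savings and consistency come from.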
1. Do more analysis
2. Do testing, testing and more testing
3. Do use a scientific approach
4. Do set both technology and organisation KPIs
5. Do use model-driven engineering
6. Don’t do it all at once
7. Don’t underestimate the size of the problem
8. Don’t leave the data migration as an afterthought
9. Don’t think it’s going to be easy
10. Don’t stress!
There is no silver bullet or magic wand that can be waved to solve this problem. You have to roll up your sleeves and work hard to achieve the desired result. The solution I use is a six-step process that we have found works for us.
1. Ensure there will be a systematic path for the data at the end. This is achieved by an automated step that reverse engineers the legacy database into the plugin model.
2. Next, a business analyst creates the requirements backlog that describes the behaviour of the application.
3. After analysing that backlog, determine which capabilities of the plugin model you want to switch on to match the application.
4. Then create the new, mobile-responsive user interface.
5. You then get to generate the new cloud application! Developers are then assigned to augment the generated application until it meets all of the requirements.
6. The sixth and final step is the migration itself! This is where you migrate customers from the legacy application to the new cloud application. Thanks to step 1, this process can be fully automated (with the exception of overly complex scenarios, of course).
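The automated reverse engineering in step 1 can be sketched as follows. This uses SQLite from the Python standard library purely for illustration (a real legacy system is more likely Oracle, DB2, or similar), and the table and the shape of the schema model are assumptions for the example.

```python
# Sketch of step 1: reverse-engineer a legacy database schema into an
# in-memory model that later transformation steps can consume.
# SQLite is used only for illustration; the table is hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

def reverse_engineer(conn):
    """Build a simple schema model: {table: [(column, type), ...]}."""
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows are (cid, name, type, notnull, default, pk).
        columns = conn.execute(f"PRAGMA table_info({table})").fetchall()
        model[table] = [(col[1], col[2]) for col in columns]
    return model

schema_model = reverse_engineer(conn)
# schema_model: {'customer': [('id', 'INTEGER'), ('name', 'TEXT')]}
```

With the schema captured as a model rather than a pile of DDL, the later mapping and generation steps have something systematic to work against, which is what makes the final migration step largely automatable.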
Legacy migration can seem like a scary process to a lot of businesses, but it’s often a necessary one. Using a tried and tested method such as our six-step approach can not only help you avoid some of the traps but even turn the legacy migration process into a benefit for your company.