The journey of modernising your architecture starts with a choice. Which is the right one for you?
So far in this series, we have outlined the importance of a conscious mindset in making changes; described legacy as outdated systems that cannot keep up with growing business demands BUT run your business; and stated the importance of knowing why you are modernising. Next comes the choice of which way to move forward.
Knowing why your business needs to modernise enables you to identify the changes that will be required. These changes fall into two categories: technological, such as moving to the cloud, or organisational, such as culture.
To address these changes, there are two main ways forward. The first is starting from scratch with a blank slate, aka the Big Bang. The second is the incremental (phased) approach whereby change is introduced piece by piece. In this post, we will look at the choice through the lens of a Big Bang approach. The incremental approach will be compared in the next post.
Most of us have either been in this situation or toyed with the idea of rewriting an existing system, and probably came up with one of the following compelling reasons for going with the Big Bang approach.
- Lower cost and shorter lead time compared to an incremental evolution
- Developers are more motivated when working on greenfield projects
- Freedom to organise teams/work as needed
- No constraints from the design choices of the past
- No need to burden yourself with understanding the legacy system, or to waste effort changing it just to access required functionality
If you have been part of a Big Bang approach, I’m sure you can relate to the following narrative.
Everything starts in earnest, with motivation and expectations sky high. The teams organise themselves quickly and start coding immediately. Code is written at an unbelievable rate and proofs of concept are produced left, right and centre.
The months fly by and everything seems to be going well until, one day, someone asks how far along you are and when you will be ready. You talk about all the great stuff that has been developed so far, but you cannot demo any of it in a live system. You start to realise that the light at the end of the tunnel is not getting any closer. In fact, it seems to be receding further into the distance.
It seems that the complexity was significantly underestimated, especially when it comes to downstream parts of the system, like the data lake, reporting, compliance, etc.
You could say that the first cracks have started to appear in the Big Bang approach: There is nothing to show in production for all that effort.
Production is the only thing that matters
Nothing erodes stakeholder confidence or developer motivation faster than having nothing to demonstrate in production. What’s worse, without production volumes you can’t know how the system will actually behave.
Of course, you could just replicate production volumes. I have yet to see this work in practice, though: actual usage patterns in production, with network delays from browsers and the like, never match the patterns replicated by in-house performance test servers.
But there is a more critical underlying issue here. What production volumes could you even replicate? This is a new, rewritten system that isn’t in production. No volumes or usage patterns are available. You would have to base performance tests on assumptions, leaving validation until you can get real production usage data, i.e. go live.
At this point, Martin Fowler’s definition of Big Bang predicts what will happen:
“The only thing a Big Bang rewrite guarantees is a Big Bang!”
Apart from the technical risks, there are other significant downsides that cannot be overlooked:
- Business is basically feature-frozen as it waits for the new implementation.
- Data migration from the legacy system to the new system is a commonly overlooked but critical part of the process. Scheduling the go-live and the data migration to happen at the same time requires significant planning and effort.
- Not sustainable in the long term. A Big Bang is a project with an end. It is like making a New Year’s resolution to get fit: once you reach the goal, you stop and focus on the rest of life’s challenges. What happens? You slip back into the same behaviour as before – slacker territory or, in the case of the Big Bang, software that slowly degrades into tomorrow’s legacy, waiting for the next big bang.
Is an incremental approach better?
In our next blog post about modernising legacy we will delve into whether the incremental or phased approach is a better option.
If you don’t feel like waiting for the next piece to get published, contact me and we can talk about this sooner!