Chris Oldwood from The OldWood Thing
As the pendulum swings ever closer towards being leaner and focusing on simplicity I grow more concerned about how this is beginning to affect software architecture. By breaking our work down into ever smaller chunks and then focusing on delivering the next most valuable thing, how much of what is further down the pipeline is being factored into the design decisions we make today?
Part of the idea behind being leaner is an attempt to reduce the waste caused by speculative requirements, which in the past has led many a project into a state of “analysis paralysis” where they can’t decide what to build because the goalposts keep moving. By focusing on delivering something simpler much sooner we begin to receive some return on our investment earlier and also shape the future based on practical feedback from today, rather than trying to guess what we need.
When we’re building those simpler features that sit nicely upon our existing foundations we have much less need to worry about the cost of rework from getting it wrong as it’s unlikely to be expensive. But as we move from independent features to those which are based around, say, a new “concept” or “pillar” we should spend a little more time looking further down the backlog to see how any design choices we make might play out later.
Thinking to Excess
The term “overthinking” implies that we are doing more thinking than is actually necessary; trying to fit everyone’s requirements in and getting bogged down in analysis is definitely an undesirable outcome of spending too much time thinking about a problem. As a consequence we are starting to think less and less up-front about the problems we solve, to try and ensure that we only solve the problem we actually have and not the problems we think we’ll have in the future. Solving those problems that we are only speculating about can lead to overengineering if they never manage to materialise, or if they could have been solved more simply when the facts were eventually known.
But how much thinking is “overthinking”? If I have a feature to develop and only spend as much effort thinking as I need to solve that problem then, by definition, any more thinking than that is “overthinking it”. But not thinking about the wider picture is exactly what leads to the kinds of architecture and design problems that begin to hamper us later in the product’s lifetime, and “later” might be measured not in years but in days or weeks if we are looking to build a set of related features that all sit on top of a new concept or pillar.
Hence, it feels to me that some amount of overthinking is necessary to ensure that we don’t prematurely pessimise our solution and paint ourselves into a corner. We should factor work further down the backlog into our thoughts to help us see the bigger picture, and shape our decisions today so that they bias our thinking towards our anticipated future rather than an arbitrary one.
Acting on our impulses prematurely can lead to overengineering if we implement what’s in our thoughts without having a fairly solid backlog to draw on, and overengineering is wasteful. In contrast, a small amount of overthinking – thought experiments – is relatively cheap and can go towards helping to maintain the integrity of the system’s architecture.
One has to be careful quoting old adages like “a stitch in time saves nine” or “an ounce of prevention is worth a pound of cure” because they can send the wrong message and lead us back to where we were before – stuck in The Analysis Phase. That said, I want us to avoid “throwing the baby out with the bathwater” and forgetting exactly how much thinking is required to achieve sustained delivery in the longer term.
The one phrase I always want to mean this is “think globally, act locally” because it sounds like it promotes big-picture thinking while only implementing what we need today, but that’s probably stretching it too far.