Ragile software development – a longitudinal reflection on post-agile methodology
Ok, so there is no such thing as a ‘ragile’ software development method. Nevertheless, for a number of converging reasons, I have recently been given cause to reflect on the history of rapid, lightweight and agile approaches to software development, and on the currently dominant ideology of methods. I use the label ‘ragile’ as an indicator of where we might have been, or where we might still go, in developing software in the post-agile era.
There’s a scene in Woody Allen’s movie ‘Sleeper’, where two doctors from 200 years into the future discuss beliefs about diet. It goes like this:
“You mean there was no deep fat? No steak or cream pies or… hot fudge?”
“Those were thought to be unhealthy… precisely the opposite of what we now know to be true.”
Scientists tend to realise that this week’s theory is just that, and may be replaced by a new theory at any time, based on our empirical observations. Diet promoters tend to take the opposite view: everything in the past was wrong, but now we know the truth. I hope that software developers are more like scientists than fad dietitians, and will embrace change in their thinking.
Received wisdom has it, perhaps, that the problems of the waterfall approach to software development have been overcome by a legion of certified Scrum Masters leading their agile organisations to the continuous delivery of quality applications. If only we were so enlightened. Royce himself, in his famous 1970 ‘waterfall’ paper, stated “I believe in this concept, but the implementation described above is risky and invites failure.” He was talking about Figure 2 in his paper, the oft-copied waterfall diagram. It seems that few bothered to read the rest of the paper and the rather more challenging figures within it, consigning the software industry to decades of misunderstanding. The assumption that software development was in a chronic pre-agile crisis is also a misreading of history. Participants at the 1968 NATO conference, the apocryphal source of terms like ‘software engineering’ and ‘software crisis’, acknowledged that many large data processing systems were working perfectly well, thank you. DeGrace and Stahl told us in 1990 that software was a ‘wicked problem’, but ‘wicked’ does not mean ‘insoluble’, though it does mean that we should not expect there to be one right answer.
The software industry has seen a series of new ideas about software development over the decades, many of them based on leveraging improvements in the hardware and software tools available to us. Kent Beck, in the first XP book, referred to fully utilising these new tools, turning up the dial on all the best practices at once, possibly to 11 (or was that just Spinal Tap?). Almost a decade earlier, James Martin had published his ‘Rapid Application Development’ book, stressing the value of, among other things, prototyping, code generation, metrics, visualisation tools, process support tools and shared, reusable domain models. Later, in 1996, Steve McConnell’s ‘Rapid Development’ emphasised many of the same ideas, stressing productivity tools, risk management and best practices. Both authors prefigured many practices of the lightweight (later agile) methods that were emerging in the late 1990s: iterative, timeboxed development, customer engagement, small teams, adapting to changing requirements and quality assurance.
An underlying theme in rapid development is the combination of domain modelling and automated tools, including code generation. Similar themes appear in Agile Modelling, Model Driven Development and Domain Driven Design. Tools like the Eclipse Modeling Framework, and those based on the naked objects pattern, such as Apache Isis, put modelling at the heart of systems development.
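To make the naked objects pattern concrete, here is a minimal sketch of a domain class in the spirit of Apache Isis. The annotation names follow Isis’s applib conventions, but treat the exact API, and the Customer class itself, as illustrative assumptions; the point is that the domain class alone carries the model, and the framework generates the user interface and persistence around it.

```java
// A minimal sketch of the naked objects pattern, in the spirit of Apache Isis.
// The annotation names follow Isis's applib conventions, but the exact API and
// the Customer example are assumptions for illustration: the domain class alone
// carries the model, and the framework builds the UI and persistence around it.
import org.apache.isis.applib.annotation.Action;
import org.apache.isis.applib.annotation.DomainObject;
import org.apache.isis.applib.annotation.Title;

@DomainObject                 // exposed directly by the framework, no UI code
public class Customer {

    private String name;
    private boolean active = true;

    @Title                    // tells the generated UI how to label instances
    public String getName() { return name; }
    public void setName(final String name) { this.name = name; }

    public boolean isActive() { return active; }

    @Action                   // surfaces automatically as an action in the UI
    public Customer deactivate() {
        this.active = false;
        return this;
    }
}
```

Nothing outside the domain class has to be hand-written to obtain a working, if generic, application; the object model is the application.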
The agile methods movement is at a point of change (hmmm, isn’t that the definition of a crisis?). Recent efforts have revisited the lessons of the Japanese car industry, with Lean and Kanban, in a search for ever ‘lighter’ processes, while at the same time a vision of agile as a fixed, traditional methodology has become established (endemic, even). This has recently caused two signatories of the original agile manifesto to refer to ‘the failure of agile’ (Andrew Hunt) and to state that ‘agile is dead’ (Dave Thomas). From another perspective, Vaughn Vernon lamented in his 2013 book on domain driven design that in Scrum “a product backlog is thrust at developers as if it serves as a set of designs”.
So, coming back to ‘ragile’, what is it? Well, no more than an acknowledgement that there may be a collection of practices from both rapid and agile development (and from model/domain driven development) that remain relevant to the future of software development. Such an approach would emphasise tools, leverage prototypes, include shared domain models, embrace code generation, automate as much as possible (including estimation and project management) and deliver continuously. It might also be considered radical, in the particular sense of going back to the roots of a phenomenon: some of the ideas of Martin and McConnell were much harder to put into practice in the 1990s than they are now.

Can there be any software developers who do not use code generators of one type or another? They generate object property methods, test stubs, service end points and a host of other components; they refactor code and create user interfaces and database schemas. Rails and Grails developers work with domain models as a matter of course, allowing frameworks to build whole architectures automatically, as the sketch below illustrates. It’s time to rethink how these strands might become a method for the 2020s, one able to cope with the distributed and parallel domain models of the cloud-based, in-database, Internet-of-Things, plugin-driven, Web 3.0 applications of the future.
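In the same spirit, here is a hedged sketch of a domain class as the single source of truth, using standard JPA annotations; a persistence provider such as Hibernate can generate the database schema from it, much as Rails and Grails scaffold whole applications from their domain models. The OrderItem entity and its fields are invented purely for illustration.

```java
// A sketch of a domain class as the single source of truth, using standard
// JPA annotations (jakarta.persistence, formerly javax.persistence). A
// persistence provider such as Hibernate can generate the database schema
// from this class alone; the OrderItem entity and its fields are invented
// here purely for illustration.
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;

@Entity                          // maps to a generated ORDERITEM table by default
public class OrderItem {

    @Id
    @GeneratedValue              // primary key and its generator are derived
    private Long id;

    @Column(nullable = false)    // becomes a NOT NULL column in the schema
    private String productCode;

    private int quantity;        // column name and SQL type are inferred

    // accessors are exactly the 'object property methods' mentioned above,
    // and are themselves routinely produced by IDE code generators
    public Long getId() { return id; }
    public String getProductCode() { return productCode; }
    public void setProductCode(final String productCode) { this.productCode = productCode; }
    public int getQuantity() { return quantity; }
    public void setQuantity(final int quantity) { this.quantity = quantity; }
}
```

With a setting such as Hibernate’s hibernate.hbm2ddl.auto=update, the schema appears with no hand-written DDL at all; this is the kind of leverage Martin and McConnell could only approximate with the tools of the 1990s.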