The New Stack and Methodology: Interplay

(This piece — written in collaboration with Bhushan Nigale — is the fifth in a series that explores the evolution of the software technology stack and software development methodologies in the last two decades. In this instalment Bhushan and I examine the interplay between the old and the new worlds, and also look at how the stack and methodologies play together in this evolution.)

The first four articles in this series examined the transition from the traditional stack and Waterfall methodology (common two decades ago) to the New Stack we see today in cloud-native products and the Agile/LEAN methodologies now common in software development. Those articles looked at the drivers that led to these changes and at their impact, and also discussed some of the key challenges the new stack and methodologies brought with them.

Given all this, it’s fair to ask: where does this leave the traditional stack or the Waterfall methodology? Where are they used (or relevant) even today? Do they have a role to play in the future? How do they co-exist with the new stack and methodologies?

This article explores the interplay between old and new, and also how the stack and methodologies play together. 

The traditional stack today

The traditional stack, dominant two decades ago, is still widely in use today. It figures mostly in the enterprise software products built around the 1990s and deployed ‘on-premise’. Some of these products have been rewritten for the cloud, others have followed the ‘lift and shift’ path, but a majority — close to 60 percent [1] — remain where they were originally deployed: in the on-premise data centres maintained by the IT departments of enterprises.

These legacy enterprise products — and thus the traditional stack they are based on — can be expected to stay operational for decades. The reasons for this are many.

Firstly, the large investment (both in hardware and software) that has gone into these systems creates a lot of inertia. Having invested so much, organizations are naturally inclined to keep these systems running for as long as possible.

Next there’s the tricky matter of switching costs — costs that include not just building or buying new software, but also migration costs, end-user training, and so on — which need to be justified: unless there’s a compelling business reason, such transformation projects do not get the budget.

Then there’s the question of skill. Enterprise IT departments are experienced in maintaining and operating the traditional stack, but they lack the skills the new stack demands. Unless there’s a demographic change — which can take decades — this factor will continue to play a role in decisions involving a move to a new architecture.

Ultimately, it is a matter of business priority. These enterprise products are also typically ‘systems of record’, which do not face the same kind of demands — to change fast or scale flexibly — as ‘systems of engagement’ (or, in the B2C world, consumer-facing apps) do. And while they may be mission-critical, these transactional systems are often not seen as strategic: so why touch them if most of the innovation is happening elsewhere anyway? As long as the data in these systems of record can be accessed quickly and put to use (for AI-related capabilities, for instance), there is little business need to rebuild these solutions on the new stack.

So these legacy products built on the traditional stack will continue to be in use for the foreseeable future. One important consequence of this is the rise of Robotic Process Automation (RPA) tools in the software industry [2]. These tools make up for the deficiencies in legacy software (such as missing APIs or fragmented toolsets) and add a layer that further reduces the need to modernize legacy solution landscapes among enterprise customers.

The New Methodology: Impact

(This piece — written by Bhushan Nigale — is the fourth in a series that explores the evolution of the software technology stack and software development methodologies in the last two decades. In this instalment Bhushan examines the consequences of the widespread adoption of Agile and Lean.)

In the second article of the series I presented the various forces that have led to the evolution of software development practices from the Waterfall model to Lean and Agile. We saw a variety of proximate causes behind this evolution: the increasing role software plays in all spheres of our lives, the massive changes in software architecture and the mainstreaming of Open Source software, the increasing consumerization of IT, and the changing demographics of the software industry.

This article examines the consequences of these changes. Mainly, it answers the question: did Agile and Lean deliver on their promise? When a species evolves to adapt to a new environment, manifest changes appear. Can we discern such changes in the industry, for instance in workplaces and in the roles played by practitioners? If we live in a post-Waterfall world, what are the obvious signposts of the changes these methodologies have ushered in?

In what follows, I provide an overview of how other industries have begun to adopt Agile, to what extent hierarchies still matter, the growing primacy of teams over individuals, and the rising importance of roles such as the Product Manager.

Agile delivers

The agile movement that arose from the Agile Manifesto is now so widespread that software development organizations consider it the de facto style for delivering innovation at scale. Software development and implementation projects are risky, failure-plagued endeavors: while the statistics differ widely, reliable studies (such as the Standish Group’s annual CHAOS report) suggest that as many as two-thirds of technology projects end in partial or total failure.

With its emphasis on involving end-users as early as possible and collaborating with them continuously, on smaller release cycles, and on a clear articulation of user requirements, Agile addresses the most crucial reasons for these failures: users become stakeholders, invested in the success of the project, rather than just consumers of software ‘thrown over the fence’ to them (e.g. by IT departments). The transparency of progress improves trust and the health of interdepartmental relationships – truth is the best disinfectant.

Hierarchies matter less 

A counterintuitive, but welcome, change has been the gradual flattening of organizational hierarchies. While Lean originated in manufacturing companies, traditionally hierarchical with a ‘command-and-control’ operational model, its fundamental principle of putting customer value first meant that employees had to be more empowered for the principle to live in practice. Thus a product owner several levels below the unit head takes significant decisions and is accountable for the success of the product: announcements of brand-new products and cloud services are increasingly made by Product Managers rather than by development department heads.

The New Methodology: Origins

(This piece — written by Bhushan Nigale — is the second in a series that explores the evolution of the software technology stack and software development methodologies in the last two decades. It examines the journey from the Waterfall model to Agile and LEAN, outlining the main factors that drove this change.)

A benefit of spending over two decades in an industry is that one develops the perspective to separate hype from substance. This viewpoint is especially useful in an industry like software, where minor feature increments are hailed as innovation, and press releases, blogs and tweets tout routine upgrades as revolutionary. Having lived through several such hype cycles, which have a high probability of going bust, one learns to exercise caution and to appreciate genuinely path-breaking innovations (the first article in this series — written by Manohar Sreekanth — lists the technology changes that have stayed).

Innovation in software development methodologies is even harder to achieve and sustain. A paradigm shift is rare – at least in the original sense of the term (Thomas Kuhn used it to denote a fundamental change in the basic experimental practices of a scientific discipline). Inertia is difficult to overcome, especially if established methodologies seem to be getting the job done.

I’ve been privileged to witness and experience firsthand such a paradigm shift in software development, namely the shift from Waterfall to Agile. The shift has been so complete that new entrants to the industry have little – if any – familiarity with the older methodologies. Agile is simply their default mode now.

Examining and reviewing this shift is both useful and important, because the promises of any established order need to be constantly reexamined as flaws and digressions inevitably creep in. Over time, unless tended carefully, practices tend to drift back to older routines — regression towards the mean is an iron-clad statistical law. Understanding the older practices and the change drivers that led to their evolution helps us better appreciate the advances and detect costly deviations. An appreciation of the historical developments helps practitioners not only to address flaws, but also to iterate on the methodology and adapt it to changing operational environments.

A variety of forces have led to this evolution from Waterfall to Agile: the increasing role software plays in all spheres of our lives, the massive changes in software architecture and the mainstreaming of Open Source software, the increasing consumerization of IT, and the changing demographics of the software industry. We examine these factors in this article and treat the consequences of these changes in a subsequent one.

From Waterfall to Agile

The previous article in this series traced the fundamental change in the technology stack used to build software applications. A parallel evolution in the methodology of developing software has accompanied these mammoth technological shifts.

When I entered the industry in the late 1990s, Waterfall had none of the negative labels one finds associated with it today. Terms such as ‘Software Requirements Document’ and ‘Handover to Maintenance’ were ubiquitous and carried a certain respect – passing a Quality Assurance Gate was a big milestone that invited celebration. The software development process flowed from a high perch (hence ‘Waterfall’) of analysis and design to the plains of testing and release, where software was then finally delivered to customers.

But cracks had already started to appear. Disenchantment was rising, both with the long delivery cycles and with the obsession with adherence to strict development processes. The internet – which, as we saw in the previous article, broke the traditional stack – was triggering foundational changes in how software was consumed, and these consumption-driven pressures were now being transmitted to how software was built. Consumers wanted their software delivered faster and better, even as it began to occupy an increasingly central place in their lives.
