This is the first post in a series that will offer our management team's reflections on DCG's 20th anniversary and the state of the software industry.
This year, DCG celebrates its 20th anniversary – all 20 years of which I’ve spent with the company! As such, it is an opportune time to look back on the software industry and reflect on where we have been and where we are today.
Software: A Changing Landscape
Probably everyone would agree that the business of developing and deploying software is in constant flux. New practices, techniques and tools are continuously being introduced to the development environment with varying degrees of success and sustainability – we’ve seen it time and time again over the past 20 years.
These changes are often influenced by the evolving needs of the business. Cost has always been a business driver, but over the past several years companies have become increasingly focused on software quality and speed (time to market) – and how software performance data can be used to drive strategic decisions.
To address this newfound concern, more and more of our clients are turning to a tried and trusted solution: the use of quantitative and qualitative measurement data to govern software development practices and outcomes – one of DCG’s core offerings.
For example, twenty years ago we saw the rise of the outsourcing mega-deal. In hopes of lowering costs, companies looked to offshore sourcing alternatives with third-party suppliers. These mega-deal, multi-year outsourcing arrangements required service level agreements and measures to be established in order to monitor and manage risks. What we learned from these early outsourcing initiatives was that they did not always deliver the cost savings that had been anticipated.
Of course, outsourcing is still very much part of the norm in today’s software development environment, but organizations have developed the ability to make better decisions as to which portions of their development and maintenance activities they choose to outsource. Through the use of now accepted industry performance measures, IT departments are deploying quantitative benchmarks and qualitative assessments to help them better understand levels of productivity and quality performance. Using this information, they are better able to evaluate and select potential cost-saving sourcing strategies.
To that end, the use of software sizing techniques, such as Function Point Analysis, has become a key factor in the effective measurement and management of software, as IT departments and C-level management have increasingly required sound, quantitative data that can measure the value contribution of IT.
Of Course, Some Things Remain the Same
But while proven solutions are being used to address newer business concerns, some issues have continued to plague software over the years. The most costly of these occur at the front end of the software development lifecycle: the lack of proper requirements development, ineffective cost estimating and the failure to properly set customer expectations.
Issues associated with these shortcomings have long been recognized by most organizations, and yet proven solutions to these problems have been slow to gain ground. There seems to be a continued reluctance to make the investment necessary to correct them. One major reason these front-end issues are not aggressively addressed is that organizations typically aren’t aware of what these issues are costing them on the back end! For example, poorly defined requirements and shortcuts taken to reduce costs and speed up delivery can have a devastating impact on the quality of the software deliverable, resulting in extended maintenance activities and costs.
We routinely have organizations come to us seeking help in addressing these issues. Our solution involves both some of our standard techniques (sizing), as well as newer ones that have developed over the past 20 years (Agile).
From a sizing perspective, we know that a key first step is to determine the size and complexity of the software problem domain. Using that information, we are able to help our clients improve requirements clarity, more effectively estimate project costs and, ultimately, properly set customer expectations – proving the old adage, “You can’t properly manage what you don’t measure.”
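To make the sizing idea concrete, here is a simplified sketch – not DCG’s actual methodology – of how a Function Point count can feed a rough cost estimate. The component weights are the standard IFPUG average-complexity values; the project counts and the cost-per-function-point rate are purely hypothetical, chosen for illustration:

```python
# Illustrative sketch of an unadjusted Function Point (FP) count.
# A real count classifies each component as low/average/high complexity;
# for simplicity we apply the IFPUG average-complexity weights throughout.
AVERAGE_WEIGHTS = {
    "EI": 4,    # External Inputs
    "EO": 5,    # External Outputs
    "EQ": 4,    # External Inquiries
    "ILF": 10,  # Internal Logical Files
    "EIF": 7,   # External Interface Files
}

def unadjusted_function_points(counts):
    """Sum each component count multiplied by its average-complexity weight."""
    return sum(AVERAGE_WEIGHTS[component] * n for component, n in counts.items())

# Hypothetical project inventory and cost rate (illustrative values only).
counts = {"EI": 10, "EO": 6, "EQ": 4, "ILF": 5, "EIF": 2}
ufp = unadjusted_function_points(counts)  # 40 + 30 + 16 + 50 + 14 = 150
cost_per_fp = 800                         # assumed dollars per function point
print(f"Size: {ufp} FP, rough estimate: ${ufp * cost_per_fp:,}")
```

Even in this toy form, the value is plain: once a size is agreed upon up front, requirements scope, cost estimates and customer expectations can all be anchored to the same quantitative baseline.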
Similar to sizing, Agile practices have been around for the past twenty years. But it seems that only recently have IT organizations come to realize that by using Agile practices and technologies, they can gain the advantage of working more closely with their business partners to establish prioritized requirements and incrementally deliver value.
Even though the issues described here seem to continually plague software, there are solutions – both old and new – that exist to mitigate them, if an organization wishes to do so. In this respect, it’s time for software organizations to step out of the past and move into the future.
Software will never stop evolving. It’s fun – and sometimes necessary – to talk about and play with the new tools and techniques that come along. But on the whole, while the priorities of IT – and the business – may change, it’s always worthwhile to consider how trusted solutions can be of value.
Vice President, Software Performance Management