I’d like to call attention to a recent paper by Donald J. Reifer that appeared in the Journal of Cyber Security and Information Systems, “Software Productivity Progress during the First Decade of the 21st Century.”
As the title indicates, the paper summarizes the progress various industries have made in software productivity during the first decade of the 21st century, using data from 1,000 completed projects, none of which is over 10 years old.
Thousands of dollars are spent each year to improve productivity, which IEEE Standard 1045-1992 defines as the ratio of units of output generated to units of input used to generate them. But is productivity actually improving, or are we wasting those dollars?
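The IEEE 1045 ratio is simple to compute once you pick your output and input units. A minimal sketch, with hypothetical figures (SLOC and staff-months are common but not mandated choices):

```python
# Productivity per IEEE Std 1045-1992: units of output divided by
# units of input used to generate them. All figures below are
# hypothetical, purely to illustrate the ratio.

def productivity(output_units: float, input_units: float) -> float:
    """Return output per unit of input (e.g., SLOC per staff-month)."""
    if input_units <= 0:
        raise ValueError("input_units must be positive")
    return output_units / input_units

# Example: a project delivering 24,000 source lines over 48 staff-months
print(productivity(24_000, 48))  # 500.0 SLOC per staff-month
```

The hard part in practice is not the arithmetic but agreeing on consistent output and input measures, which is what makes cross-project comparisons like Reifer's possible.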
Reifer concludes that, thanks to considerable improvements in the technology software organizations are using, productivity has risen substantially over the last decade. On the whole, the empirical evidence suggests these investments are worth it.
The upfront costs may be considerable and seemingly risky, but they are offset over time by improved productivity. With data such as this in hand, IT departments have the evidence they need to reassure the business (and themselves) that these investments are valuable.
For the details of how Reifer came to these conclusions, you can read his paper here.