CIOs Need to Lead the Digital Transformation

Gartner recently released the results of its “2017 CEO Survey: CIOs Must Scale Up Digital Business.”  As I read through it, I saw many links to the messages I have been communicating about driving the business value of software, messages that will also be discussed in my new book, “The Business Value of Software,” being published by CRC Press in August 2017. 

The Gartner research found that the top priorities for CEOs in 2017 are 1) growth, 2) technology-related business change and 3) product improvement and innovation.  These three priorities are interconnected and are driven by the digital transformations occurring at many organizations.  Therefore, it is essential that the CIO and their team be intimately involved in the strategic discussions related to these three areas. 

Part of these strategic discussions needs to be measuring the success of the initiatives.  This is a topic I have discussed in depth when talking about visualizing the value of software, and Gartner emphasizes it in their report.  In order to drive the value of a software development initiative, for example, it is essential to clearly understand the goals and objectives of the initiative and to collaborate with the business unit to discuss, define, measure, and prioritize projects based on their ability to deliver on the value expectations.  The Gartner research found that 53 percent of the respondents could not provide a clear metric for success.  It is critical not only that the C-suite have distinct metrics for a software development initiative, such as revenue, sales and profit goals, but also that they communicate these goals to the entire technology team that is making the day-to-day tactical decisions that will impact the strategic direction of the project and, ultimately, the business value. 

The report also highlights that 57 percent of organizations will be building up their in-house information technology and digital capabilities in 2017 versus 29 percent that will be outsourcing this function.  Either way, the IT/digital team needs to be considered a partner in developing solutions that drive business value and not just a tactical arm that develops and implements the solutions.

CIOs need to step up.  They should establish and lead the digital strategy for their organization, collaborating tightly with the appropriate business unit managers and then communicating the goals to the IT team in order to deliver on the expected business value.  By defining metrics based on business value, the success of a project can be measured throughout the development lifecycle, stakeholders can be held accountable, and projects can be modified throughout the process to realign them with their goals and objectives. 

If you are interested in help with your value delivery metrics, feel free to contact me.

Michael D. Harris

CEO

 

Written by Michael D. Harris at 12:20

Algorithms: What are They Worth and What Might They Cost You?

Every so often, I read an article that gets me thinking in a different way about software value and software risk.  Danilo Doneda of Rio de Janeiro State University and Virgilio Almeida of Harvard University recently published an article entitled “What is Algorithm Governance?”[1]

Doneda and Almeida suggest that the time may have come to apply governance to algorithms because of the growing risks of intentional or unintentional “… manipulation, biases, censorship, social discrimination, violations of privacy and property rights and more,” through the dynamic application of a relatively static algorithm to a relatively dynamic data set.  

By way of example, we have probably all experienced the unintended consequences of applying a reasonably well-understood algorithm to new data.  We all have a basic grasp of what the Google search algorithm will do for us, but some of you might have experienced embarrassment like mine when I typed in a perfectly innocent search term without thinking through the possible alternative meanings of that set of words (no, I’m not going to share).  At the other end of the spectrum from the risk of relatively harmless misunderstandings, there is a risk that algorithms can be intentionally manipulative – the VW emission control algorithm that directed different behavior when it detected a test environment is a good example. 

For those of us who deal with outsourced software development, it is impossible to test every delivered algorithm against every possible set of data and then validate the outcomes.

If we consider software value from a governance perspective, it would be desirable to understand how many algorithms we own and what they are worth.  Clearly, the Google search algorithm is worth more than my company.  But are there any algorithms in your company’s software that represent trade secrets or even simple competitive differentiators?  Which are the most valuable?  How could their value be improved?  Are they software assets that should be inventoried and managed?  Are they software assets that could be sold or licensed?  If companies can gather and sell data, then why not algorithms?

From a software metrics perspective, it should be easy to identify and count the algorithms in a piece of software.  Indeed, function point analysis might be a starting point, using its rules for counting unique transactions, each of which presumably involves one or more algorithms, though it would be necessary to identify those algorithms that are used by many unique transactions (perhaps as a measure of the value of the algorithm?).  Another possible perspective on the value of an algorithm might be based on the nature of the data it processes.  Again, function points might offer a starting point here, but Doneda and Almeida offer a slightly different perspective.  They mention three characteristics of the data that feeds “Big Data” algorithms, “… the 3 V’s: volume (more data are available), variety (from a wider number of sources), and velocity (at an increasing pace, even in real time).”  It seems to me that these characteristics could be used to form a parametric estimate of the risk and value associated with each algorithm. 
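As a rough illustration of how such a parametric score might be assembled, here is a minimal sketch that combines an algorithm’s reuse count with simple 1–5 ratings for the three V’s.  The weights, scales and the example algorithm are my own assumptions for illustration, not anything proposed by Doneda and Almeida.

```python
# A minimal sketch of a parametric value/risk score per algorithm.
# The weights and the 1-5 rating scales are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Algorithm:
    name: str
    reuse_count: int   # number of unique transactions that invoke it
    volume: int        # 1-5: how much data it processes
    variety: int       # 1-5: how many distinct data sources feed it
    velocity: int      # 1-5: how fast (even real-time) the data arrives

def value_score(a: Algorithm) -> float:
    # More reuse and more data exposure -> more business value (assumed weights).
    return 2.0 * a.reuse_count + 1.0 * (a.volume + a.variety + a.velocity)

def risk_score(a: Algorithm) -> float:
    # The same drivers scale the risk, but velocity (real-time decisions)
    # is weighted more heavily here, again purely by assumption.
    return 1.5 * a.reuse_count + 1.0 * (a.volume + a.variety) + 2.0 * a.velocity

if __name__ == "__main__":
    pricing = Algorithm("dynamic_pricing", reuse_count=12, volume=4, variety=3, velocity=5)
    print(f"{pricing.name}: value={value_score(pricing):.1f}, risk={risk_score(pricing):.1f}")
```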

It is interesting to me that these potential software metrics appear to scale similarly for software value and software risk.  That is, algorithms that are used more often are more valuable yet carry with them more risk.  The same applies to algorithms that are potentially exposed to more data. 

[1] Doneda, Danilo & Almeida, Virgilio A.F. “What is Algorithm Governance?” IEEE Computer Edge, December 2016.

 

Mike Harris, CEO

Written by Michael D. Harris at 15:07

How Software Estimation Impacts Business Value

Software estimation, in simple terms, is the prediction of the cost, effort and/or duration of a software development project based on some foundation of knowledge.  Once an estimate is created, a budget is generated from the estimate, and the flow of activity (the planning process) runs from the budget.  

Software estimation can significantly impact business value because it impacts business planning and budgeting. 

One challenge is that most organizations have a portfolio of software development work that is larger than they can accomplish, so they need a mechanism to prioritize the projects based on the value they deliver to the business.  This is where estimation can help: it predicts the future value of the project to the business and estimates the cost of the project in resources and time.  Unfortunately, the estimates are often created by the people who are performing the actual day-to-day work, not by estimation experts.  Worse, new estimates from the people doing the work are typically based on their recall of previous estimates, not on previous project actuals – very few organizations take the time to record the actuals after a project is completed.  To most accurately estimate a software development project’s future business value, it is best to generate the estimate based on the actuals from similar past projects and statistical modeling of the parameters that are different for the next project. 
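As a minimal sketch of that idea, assuming a small repository of past project actuals exists, the example below fits a simple size-to-effort relationship from completed projects and applies it to the size of the next one.  The historical data, the use of function points as the size measure and the linear model are all illustrative assumptions, not a prescribed method.

```python
# A minimal sketch: estimate effort for a new project from the actuals of
# similar past projects using a simple least-squares fit of effort vs. size.
# The historical data points and the linear model are illustrative assumptions.
from statistics import mean

# (size in function points, actual effort in person-months) from completed projects
history = [(120, 14), (200, 25), (310, 41), (450, 60), (520, 72)]

sizes = [s for s, _ in history]
efforts = [e for _, e in history]

# Ordinary least-squares slope/intercept for effort = a * size + b
s_bar, e_bar = mean(sizes), mean(efforts)
a = sum((s - s_bar) * (e - e_bar) for s, e in history) / sum((s - s_bar) ** 2 for s in sizes)
b = e_bar - a * s_bar

def estimate_effort(size_fp: float) -> float:
    """Predict effort (person-months) for a project of the given size."""
    return a * size_fp + b

if __name__ == "__main__":
    print(f"Estimated effort for a 380 FP project: {estimate_effort(380):.1f} person-months")
```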

Of course, an estimate is only an estimate, no matter who develops it.  You can’t predict all the factors that may require modifications to the plan.  This is where the estimation cone of uncertainty comes in.  The cone starts wide because there is quite a bit of uncertainty at the beginning around the requirements of a project.  As decisions are made and the team discovers some of the unknown challenges that the project presents, the cone of uncertainty narrows toward the final estimate. 
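To make the cone concrete, the sketch below applies the range multipliers commonly cited for the cone of uncertainty (for example, in Boehm’s and McConnell’s work) to a single point estimate.  Treat the multipliers and the 1,000-hour example as illustrative rather than definitive.

```python
# A minimal sketch of the cone of uncertainty: the same point estimate carries
# very different ranges depending on how far the project has progressed.
# Multipliers are the commonly cited ones (Boehm/McConnell); treat as illustrative.
CONE = {
    "Initial concept":             (0.25, 4.00),
    "Approved product definition": (0.50, 2.00),
    "Requirements complete":       (0.67, 1.50),
    "UI design complete":          (0.80, 1.25),
    "Detailed design complete":    (0.90, 1.10),
}

def estimate_range(point_estimate_hours: float, phase: str) -> tuple[float, float]:
    low, high = CONE[phase]
    return point_estimate_hours * low, point_estimate_hours * high

if __name__ == "__main__":
    for phase in CONE:
        lo, hi = estimate_range(1000, phase)
        print(f"{phase:30s}: {lo:6.0f} - {hi:6.0f} hours")
```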

With regard to business value, the cone of uncertainty is significant because of the impact that the rigid adoption of early estimates can have on the budgeting and planning processes, especially if the software development effort is outsourced.

 

I see software estimation as both a form of planning and an input to the business planning process.  However, there is a significant cross-section of the development community that believes #NoEstimates is the wave of the future.  This is a movement within the Agile community based on the premise that software development is a learning process that will always involve discovery and be influenced by rapid external change.  They believe that this dynamic environment of ongoing change makes detailed, up-front plans a waste of time, as software estimates can never be accurate.  Using #NoEstimates techniques requires breaking down stories into manageable, predictable chunks so that teams can predictably deliver value.  The ability to predictably deliver value gives organizations a tool to forecast delivery.  In my view, the #NoEstimates philosophy really isn’t “not estimating” – it is just estimating differently. 
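As an illustration of that “estimating differently,” the sketch below forecasts how many more weeks a backlog will take by resampling a team’s historical weekly story throughput – a Monte Carlo style of forecasting often associated with #NoEstimates thinking.  The throughput figures and backlog size are invented for the example.

```python
# A minimal sketch of throughput-based forecasting: Monte Carlo simulation over
# a team's historical weekly story counts to forecast weeks-to-complete a backlog.
# The historical throughput and the backlog size are illustrative assumptions.
import random

weekly_throughput = [4, 6, 5, 3, 7, 5, 4, 6]  # stories finished in recent weeks
backlog = 60                                   # similarly sized stories remaining

def simulate_weeks(backlog: int, history: list[int], trials: int = 10_000) -> list[int]:
    results = []
    for _ in range(trials):
        remaining, weeks = backlog, 0
        while remaining > 0:
            remaining -= random.choice(history)  # resample a past week's throughput
            weeks += 1
        results.append(weeks)
    return results

if __name__ == "__main__":
    runs = sorted(simulate_weeks(backlog, weekly_throughput))
    p50 = runs[len(runs) // 2]
    p85 = runs[int(len(runs) * 0.85)]
    print(f"50% likely within {p50} weeks; 85% likely within {p85} weeks")
```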

Whether you use classic estimation methodologies that leverage plans and performance against those plans to generate feedback and guidance, or follow the #NoEstimates mindset that uses both functional software and throughput measures as feedback and guidance, the goal is usually the same.  Both are a form of planning and an input to the business planning processes that are aimed at driving the business value of each software development initiative. 

Written by Michael D. Harris at 11:16

Microservices in Software Architecture

Software value can take many forms, but the ability to respond quickly and flexibly to new business challenges separates “just so” software architecture from high-value software architecture.  To this end, over the past 20 years, we have seen many steps down the path from monolithic applications to client-server to service-oriented architectures (SOA).  Now, organizations seeking to maximize the business value of their software architectures are adopting microservices architectures. 

Microservices, as the name suggests, should represent the smallest unit of functionality that aligns to a core business capability. 

That’s not to say that each business process or transaction is a single microservice, but rather that business processes and transactions are “composed” using microservices.  Sounds like SOA?  Well, yes, it did to me too, at first.  The major difference, I think, is that this time the industry has got out ahead of the curve, learned from the challenges that we all had/have with SOA and built the necessary infrastructure to standardize and support the microservices from the beginning.  For example:

  • Microservice APIs are standardized.
  • Microservices are natively able to communicate with each other through industry-wide adoption of pre-existing standards like HTTP and JSON (a minimal sketch follows this list).
  • Microservices can be formally defined using standards like the “RESTful API Modeling Language” (RAML) so that developers reusing the microservices can depend on the functionality contained within the microservice and resist the urge to rewrite their own version “just in case.”  Indeed, a collaboration hub like MuleSoft’s Anypoint Exchange encourages merit-based reuse of microservices by capturing the reviews and ratings of other developers who have used that microservice.
  • Microservices can be implemented in different programming languages.
  • Tools are available to manage the complexity of microservices, e.g., MuleSoft’s Anypoint Platform.
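As a minimal sketch of the JSON-over-HTTP communication mentioned in the second bullet, here is a single-capability service built only with the Python standard library.  The service name, port and payload fields are invented for illustration and do not reflect any particular vendor’s API.

```python
# A minimal sketch of a microservice: one narrowly scoped capability exposed
# over HTTP/JSON, which a composing application can call like any other service.
# The endpoint, port and payload fields are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceQuoteHandler(BaseHTTPRequestHandler):
    """Single-capability service: return a price quote for a SKU."""

    def do_GET(self):
        # Expect paths like /quote?sku=ABC123 ; anything else is a 404.
        if not self.path.startswith("/quote"):
            self.send_error(404, "Unknown resource")
            return
        sku = self.path.split("sku=")[-1] if "sku=" in self.path else "UNKNOWN"
        body = json.dumps({"sku": sku, "price": 19.99, "currency": "USD"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Run the service; a composing application would call it with a plain HTTP GET
    # and parse the JSON, e.g. urllib.request.urlopen("http://localhost:8000/quote?sku=ABC123")
    HTTPServer(("localhost", 8000), PriceQuoteHandler).serve_forever()
```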

This last bullet hints at some of the challenges of a microservices architecture.  Development needs to be highly automated, with continuous integration and automated deployment, to keep track of all the microservices that need to be composed into a particular application.  The adoption of a microservices approach also requires strong discipline from developers and the DevOps team.  Fortunately, the “small is beautiful” nature of most microservices means that the development teams can (and should) be small, so team discipline and communication can be maximized. 

Implementing a microservices architecture is not something to try on your own for the first time. 

There are a number of companies that have already developed strong experience in architecting and developing microservices, including our own Spitfire Group, which has completed a number of implementations, including a back-office upgrade for a real estate firm.

I believe that organizations should seriously consider enhancing the business value of their software by implementing a microservices architecture for their “leading edge” products or services.  By “leading edge,” I mean those software-based products or services that are most subject to change as the business environment changes.  They are probably customer-facing applications that have to respond to competitive changes in weeks, not months.  They are probably going to be applications whose software value rests on their being fit for purpose all the time.

Written by Michael D. Harris at 13:52

The Software Development Productivity Benchmarking Guide


If you missed the news, DCG Software Value, LEDAmc, and TI Métricas recently announced the release of “The Software Development Productivity Benchmarking Guide.” 

It contains actionable benchmarking guidance and information that will enable organizations to track their progress both internally and against industry standards, facilitating the creation of high-quality software and improving resource and budget management.

The guide is available to all international software metrics organizations, including IFPUG, ISBSG, NESMA, and COSMIC, as well as to any independent company that is interested in implementing or improving its benchmarking practice.

Don't miss out on this resource; it's free and available for download here.

Written by Default at 05:00

"It's frustrating that there are so many failed software projects when I know from personal experience that it's possible to do so much better - and we can help." 
- Mike Harris, DCG President
