CIOs Need to Lead the Digital Transformation

Gartner recently released the results of its “2017 CEO Survey: CIOs Must Scale Up Digital Business.”  As I read through it, I saw many parallels to the messages I have been communicating about driving the business value of software, which are also discussed in my new book, “The Business Value of Software,” being published by CRC Press in August 2017. 

The Gartner research found that the top priorities for CEOs in 2017 are 1) growth, 2) technology-related business change and 3) product improvement and innovation.  These three priorities are interconnected and are driven by the digital transformations occurring at many organizations.  It is therefore essential that the CIO and their team be intimately involved in the strategic discussions related to these three areas. 

Part of these strategic discussions needs to be measuring the success of the initiatives.  This is a topic that I have discussed in depth when talking about visualizing the value of software, and Gartner emphasizes it in their report.  To drive the value of a software development initiative, for example, it is essential to clearly understand the goals and objectives of the initiative, and to collaborate with the business unit to discuss, define, measure, and prioritize projects based on their ability to deliver on the value expectations.  Gartner found that 53 percent of respondents could not provide a clear metric for success.  It is critical not only that the C-suite have distinct metrics for a software development initiative, such as revenue, sales and profit goals, but also that they communicate these goals to the entire technology team that is making the day-to-day tactical decisions that will impact the strategic direction of the project and, ultimately, the business value. 

The report also highlights that 57 percent of organizations will be building up their in-house information technology and digital capabilities in 2017 versus 29 percent that will be outsourcing this function.  Either way, the IT/digital team needs to be considered a partner in developing solutions that drive business value and not just a tactical arm that develops and implements the solutions.

CIOs need to step up.  They should establish and lead the digital strategy for their organization, collaborating tightly with the appropriate business unit managers and then communicating the goals to the IT team in order to deliver on the expected business value.  By defining metrics based on business value, the success of a project can be measured throughout the development lifecycle, stakeholders can be held accountable, and projects can be modified throughout the process to realign them with their goals and objectives. 

If you are interested in help with your value delivery metrics, feel free to contact me.

Michael D. Harris, CEO

 

Written by Michael D. Harris at 12:20

Software Value: Impact on Software Process Improvement

Business value has not always been the primary driver of software process improvement, but that is changing.  This is the main point of an excellent article by Richard Turner in the March/April edition of CrossTalk, “The impact of Agile and Lean on Process Improvement.”

Turner’s article is a concise and refreshingly frank walk through the history of software process improvement from the perspective of an expert who has been intimately involved.  With a hint of frustration that I certainly share, Turner captures perfectly the thinking that has led to a move away from process improvement initiatives like CMMI in commercial software development organizations:

“One of the drawbacks of earlier process improvement approaches was the concept and distribution of value. The overall value of the process improvement was often situational at best and nebulous at worst.  Where it was seen as a necessity for competitive credibility [as was the case for my development group at Sanchez Computer Associates back in 2001], the value was in passing the audit rather than in any value to the organization and the customer.  In other cases, the value was essentially associated with the success of one or two champions and disappeared if they failed, changed positions or left the company [as I did].  On those occasions where PI was primarily instituted for the actual improvement of the organization, the internal focus on practices was often valued as a way of cutting costs, standardizing work [We certainly needed to make our processes repeatable] or deploying better predictive management capabilities rather than improving the product or raising customer satisfaction.”

While I agree with 95% of Turner’s analysis here, in my experience both passing the audit and standardizing our processes raised customer satisfaction.  We went from having one customer ready to give us a reference to most of our customers being referenceable, on the basis of solid evidence that we had fixed the reliability of our software development. 

Turner contrasts historic process improvement initiatives, mostly targeted at waterfall operations, where business value was not a prime driver, with today’s initiatives where, “With the emergence of Agile and Lean, the concept of value became more aligned with outcomes.  The focus on value stream and value-based decision making and scheduling brought additional considerations to what were considered best practices.”

Turner recognizes that in today’s Agile and Lean software development teams, the teams themselves are responsible for their own processes.  Mostly, this is a strength, because creative people are likely to optimize processes under their control out of simple self-interest (which benefits the organization).  Where this falls down, in my experience, is where, “These organizations rely on cross-fertilization of personnel across multiple projects to improve the organization as a whole.”  To put it bluntly, this rarely happens.  Teams can be self-organizing, but groups of teams don’t typically self-organize.  Hence, there is still a place for organizational process improvement – with a lean, software-value-driven emphasis – in the most modern software development organization.  By way of evidence, Scrum teams working together on the same program struggle to develop ways to coordinate and synchronize their efforts unless a framework such as SAFe is introduced through a process improvement initiative. 

That said, I will leave the last word to Turner, “Process improvement that does not improve the ability to adapt has little value.”

 

Michael D. Harris, CEO


Using Software Value to Drive Organizational Transformation

I was delighted to read a thought leadership article from McKinsey recently, “How to start building your next-generation operating model,” that emphasizes some key themes that I have been pushing for years (the quotes below are from the article):

  • The importance of orienting the organization around value streams to maximize the flow of business value – “One credit-card company, for example, shifted its operating model in IT from alignment around systems to alignment with value streams within the business.”
  • Perfection is the enemy of good enough – “Successful companies prioritize speed and execution over perfection.”
  • Continuous improvement relies on metrics to identify which incremental, experimental improvements work and which don’t.  Benchmarking and trend analysis help to prioritize areas where process improvement can offer the most business value – “Performance management is becoming much more real time, with metrics and goals used daily and weekly to guide decision making.”
  • Senior leaders “hold themselves accountable for delivering on value quickly, and establish transparency and rigor in their operations.”
  • “Leading technology teams collaborate with business leaders to assess which systems need to move faster.”


There is one “building block” for transformation in the article to which I am a recent convert, so kudos to the McKinsey team for including it in this context.  Their “Building Block #2” is “Flexible and modular architecture, infrastructure and software delivery.”  We are all familiar with the flexible infrastructure that cloud provides, but I have been learning a lot recently about the flexible, modular architecture and software delivery for application development and application integration that is provided by microservices frameworks such as the Anypoint Platform™ from MuleSoft.

While they promote organizing IT around business value streams, the McKinsey authors identify a risk to be mitigated: each value stream should build up software, tools and skills specific to that stream, which runs contrary to the tendency in many organizations to make life easier for IT by picking a standard set of software, tools and skills across the whole organization.  I agree that it would be a shame indeed if the agile and lean principles that started life in IT software development were constrained by legacy IT attitudes as they roll out into the broader organization.

There are a lot more positive ideas for organizational transformation in the article, so I recommend that you take a few minutes to read it.  My only small gripe is that while the authors emphasize organizing around value throughout, they do not mention prioritizing by business value.  Maybe at the high level at which McKinsey operates in organizations, that concept is taken for granted.  My experience is that as soon as you move away from the top level, if business value priorities are not explicit, then managers and teams will use various other criteria for prioritization, and the overall results may be compromised. 


Algorithms: What are They Worth and What Might They Cost You?

Every so often, I read an article that gets me thinking in a different way about software value and software risk.  Danilo Doneda of Rio de Janeiro State University and Virgilio Almeida of Harvard University recently published an article entitled, “What is Algorithm Governance?”[1]

Doneda and Almeida suggest that the time may have come to apply governance to algorithms because of the growing risks of intentional or unintentional, “… manipulation, biases, censorship, social discrimination, violations of privacy and property rights and more,” through the dynamic application of a relatively static algorithm to a relatively dynamic data set.  

By way of example, we have probably all experienced the unintended consequences of the application of a reasonably well understood algorithm to new data.  We all have a basic grasp of what the Google search algorithm will do for us but some of you might have experienced embarrassment like mine when I typed in a perfectly innocent search term without thinking through the possible alternative meanings of that set of words (No, I’m not going to share).  At the other end of the spectrum from the risk of relatively harmless misunderstandings, there is a risk that algorithms can be intentionally manipulative – the VW emission control algorithm that directed different behavior when it detected a test environment is a good example. 

For those of us who deal with outsourced software development, it is impossible to test every delivered algorithm against every possible set of data and then validate the outcomes.

If we consider software value from a governance perspective, it should be desirable to understand how many algorithms we own and what they are worth.  Clearly, the Google search algorithm is worth more than my company.  But are there any algorithms in your company’s software that represent trade secrets or even simple competitive differentiators?  Which are the most valuable?  How could their value be improved?  Are they software assets that should be inventoried and managed?  Are they software assets that could be sold or licensed?  If companies can gather and sell data, then why not algorithms?

From a software metrics perspective, it should be easy to identify and count the algorithms in a piece of software.  Indeed, function point analysis might be a starting point, using its rules for counting unique transactions, each of which presumably involves one or more algorithms, though it would be necessary to identify those algorithms that are used by many unique transactions (perhaps as a measure of the value of the algorithm?).  Another possible perspective on the value of an algorithm might be the nature of the data it processes.  Again, function points might offer a starting point here, but Doneda and Almeida offer a slightly different perspective.  They mention three characteristics of the data that feeds “Big Data” algorithms, “… the 3 V’s: volume (more data are available), variety (from a wider number of sources), and velocity (at an increasing pace, even in real time).”  It seems to me that these characteristics could be used to form a parametric estimate of the risk and value associated with each algorithm. 

It is interesting to me that these potential software metrics appear to scale similarly for software value and software risk.  That is, algorithms that are used more often are more valuable yet carry with them more risk.  The same applies to algorithms that are potentially exposed to more data. 
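To make the parametric idea above concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption, not a validated model: the profile fields, the 1-5 ratings for the three V’s, the cap on transaction counts, and the equal weighting of usage and data exposure are all hypothetical choices, and the single score deliberately reflects the observation that value and risk scale together.

```python
# Hypothetical parametric score for an algorithm's value/risk,
# combining usage (cf. function point transaction counts) with
# Doneda and Almeida's "3 V's" of the data the algorithm processes.
from dataclasses import dataclass

@dataclass
class AlgorithmProfile:
    name: str
    transactions_using: int  # unique transactions that invoke it
    volume: int              # 1-5 rating: how much data it sees
    variety: int             # 1-5 rating: how many distinct sources
    velocity: int            # 1-5 rating: how fast data arrives

def parametric_score(a: AlgorithmProfile) -> float:
    """Return a 0-100 ranking score; value and risk rise together."""
    data_exposure = (a.volume + a.variety + a.velocity) / 15   # 0..1
    usage = min(a.transactions_using, 50) / 50                 # cap, 0..1
    return round(100 * (0.5 * usage + 0.5 * data_exposure), 1)

search = AlgorithmProfile("search ranking", transactions_using=40,
                          volume=5, variety=4, velocity=5)
payroll = AlgorithmProfile("payroll rounding", transactions_using=3,
                           volume=2, variety=1, velocity=1)
print(parametric_score(search))   # widely used, heavily data-exposed
print(parametric_score(payroll))  # narrow use, little data exposure
```

A ranked inventory like this would let an organization ask the governance questions above (which algorithms to manage as assets, which carry the most risk) in a repeatable way.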

[1] Doneda, Danilo & Almeida, Virgilio A.F. “What is Algorithm Governance.” IEEE Computer Edge. December 2016.

 

Mike Harris, CEO


How Does Cybersecurity Drive the Business Value of Software?

Software brings tremendous value to organizations, but today it also carries significant risk.  Malicious cyberattacks continue to rise at a rapid pace.  According to the Identity Theft Resource Center and CyberScout, data breaches increased by 40 percent in 2016 – after a record year in 2015.  With the ongoing upsurge in data breaches, software can be seen by many as a potential liability for an organization.  We are such a data-driven economy today that criminals have realized they can cause serious damage to companies, governments and other entities by hacking into their information systems and stealing, corrupting or deleting valuable data.  These breaches are extremely costly to organizations – not only financially, but also to their reputations. 

Just look at Target.  In 2013, hackers stole the payment card and personal data of as many as 110 million customers, costing the retail giant approximately $162 million, in addition to decreased sales and a black eye to its reputation (for a short period of time). 

 It’s no wonder that “94 percent of CISOs are concerned about breaches in their publicly facing assets in the next 12 months, particularly within their applications,” according to a January 2017 Bugcrowd study.  However, despite these concerns, another survey of over 500 IT decision makers found that 83 percent of the respondents actually release their code before testing it or resolving known weaknesses (Veracode, September 2016). 

Software is typically at the foundation of cybersecurity attacks.  In fact, the Software Engineering Institute stated that 90 percent of reported security incidents result from exploits against defects in the design or code of software.  If a network router is hacked, most likely the hacker went through the router’s software, not its hardware.  These breaches can pose such a significant threat to an organization’s value that software developers must make application security an integral part of the software development lifecycle. 

By finding and fixing vulnerabilities early in the software development lifecycle, there is less risk to the business and more potential for increased business value from the software.  For example, Adobe Flash Player is a product used by many websites to enable interactivity and multimedia.  In 2015, it had more than 300 patches (TechBeacon’s Application Security Buyer’s Guide).  Developing these patches is a resource drain (both time and money).  On balance, though, the risk Adobe would run by not providing these patches could be significant and could negatively impact Adobe’s value as well as the value of the organizations using its product. 
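The “on balance” argument here is essentially an expected-loss comparison: patching is worth it when the probability-weighted cost of a breach exceeds the cost of the patch. A toy calculation, with entirely made-up figures, illustrates the shape of that reasoning:

```python
# Illustrative expected-loss comparison for one hypothetical flaw.
# All figures are invented for the example, not real Adobe numbers.
patch_cost = 50_000        # engineering cost of one patch cycle
breach_probability = 0.05  # assumed chance of exploitation this year
breach_cost = 4_000_000    # assumed direct + reputational damage

expected_loss_unpatched = breach_probability * breach_cost
print(expected_loss_unpatched)               # probability-weighted loss
print(expected_loss_unpatched > patch_cost)  # patching is the cheaper bet here
```

With these assumptions the expected loss from not patching is four times the patch cost; with different probabilities or damage estimates the balance can tip the other way, which is exactly why the business needs to be part of the decision.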

So, if an application has, let’s say, 500 known weaknesses, the organization may not have the time or money to fix all of them before an imminent release.  They need to collaborate with the business unit to determine which vulnerabilities pose the highest risk to the business (negative business value) and which ones, if remediated, will deliver the most value to the business.  It is not unusual for developers to fix the vulnerabilities that are easiest to resolve; however, it is critical to take a step back and prioritize identified vulnerabilities based on business value.  
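One way to operationalize that kind of triage is to rank each vulnerability by likelihood of exploitation times business impact, then fill the available remediation capacity from the top rather than grabbing the easiest fixes first. The sketch below is a hypothetical scheme: the fields, scores, identifiers and capacity figure are all invented for illustration.

```python
# Hypothetical triage: rank known vulnerabilities by business risk
# (likelihood x impact), not by ease of fix, and fill the capacity.
from typing import List, NamedTuple

class Vulnerability(NamedTuple):
    vuln_id: str
    exploit_likelihood: float  # 0..1, chance of exploitation soon
    business_impact: float     # 0..1, relative revenue/reputation damage
    effort_days: float         # estimated remediation effort

def triage(vulns: List[Vulnerability], capacity_days: float) -> List[str]:
    """Greedily fix the highest-risk items that fit in the capacity."""
    ranked = sorted(vulns,
                    key=lambda v: v.exploit_likelihood * v.business_impact,
                    reverse=True)
    plan, spent = [], 0.0
    for v in ranked:
        if spent + v.effort_days <= capacity_days:
            plan.append(v.vuln_id)
            spent += v.effort_days
    return plan

backlog = [
    Vulnerability("VULN-001", 0.9, 0.8, 5),  # high risk, moderate effort
    Vulnerability("VULN-002", 0.2, 0.1, 1),  # easy fix, low risk
    Vulnerability("VULN-003", 0.7, 0.9, 8),  # high impact, costly to fix
]
print(triage(backlog, capacity_days=10))
```

Note that with ten days of capacity the plan takes the highest-risk item first and only then back-fills with the cheap, low-risk fix; an easiest-first policy would have started with VULN-002 regardless of its negligible business risk.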

 

Mike Harris, CEO

