ICEAA Professional Development & Training Workshop

Last week DCG's Tom Cagley, VP of Consulting, and David Lambert, Software Sizing and Estimation Specialist, attended the International Cost Estimating and Analysis Association's (ICEAA) annual Professional Development and Training Workshop in San Diego.

It was a great event, bringing together professionals from government, industry and academic cost communities to discuss one of our favorite topics here at DCG: Estimation.

The event program featured speakers, training and an exhibition hall, with the goal of furthering attendees' understanding and appreciation of using data-driven estimating and analysis techniques.

Tom and David both presented at the event - and both presentations are available for download:

  • Agile Estimation Using Functional Metrics: Learn how combining the discipline of functional metrics with the collaborative approaches found in Agile parametric estimation has proved to be an effective way to estimate in an Agile environment.
  • Put Some SNAP in Your Estimating Model: Learn how the sizing framework, the Software Non-Functional Assessment Process (SNAP), can be used to size non-functional software requirements, providing a more complete and accurate assessment of the size of a project.

Download the presentations here and contact Tom or David for more information or with any questions. More information about DCG's Estimation Solutions and Consulting Services is available here and here.

We enjoyed the conference (and were happy to see such an interest in estimation), and we hope to attend again in the future!

Written by Default at 05:00

What is #NoEstimates?

This report can be downloaded here.

Scope of this Report

Estimation is one of the lightning rod issues in software development and maintenance. Over the past few years the concept of #NoEstimates has emerged and become a movement within the Agile community. Due to its newness, #NoEstimates has several camps revolving around a central concept of not generating task-level estimates. The newness of the movement also means there are no (or very few) large example projects that can be used as references.[1] Finally, there are no published quantitative studies comparing the results of work performed using #NoEstimates techniques to the results of other methods. In order to have a conversation, we need to begin by establishing a shared context and language across the gamut of estimating ideas, whether Agile or not. Without a shared language that includes #NoEstimates, we will not be able to compare the concept to classical estimation concepts.

Context

#NoEstimates Context:

There are two main groups or camps of thought leaders in the #NoEstimates movement (the two camps probably reflect more of a continuum of ideas than absolutes). The first camp argues that a team should break work down into small chunks and then immediately begin completing those small chunks (doing the highest value first). The chunks build up quickly to a minimum viable product (MVP) that can generate feedback, so the team can hone its ability to deliver value. This camp leverages continuous feedback and re-planning to guide work, and luminaries like Woody Zuill often champion it. A second camp begins in a similar manner – by breaking the work into small pieces, prioritizing on value (and perhaps risk), and delivering against an MVP to generate feedback – but it also measures throughput. Throughput is a measure of how many units of work (e.g. stories or widgets) a team can deliver in a specific period of time. Continuously measuring the team's throughput provides a tool to understand when work needs to start in order for it to be delivered within a period of time. Average throughput is used to provide the team and other stakeholders with a forecast of the future. This is very similar to the throughput measures used in Kanban. People like Vasco Duarte, who practice #NoEstimates from a lean or Kanban perspective, champion this second camp, and we recently heard David Anderson, the Kanban visionary, discuss a similar #NoEstimates position using throughput as a forecasting tool. Both camps in the #NoEstimates movement eschew developing story- or task-level estimates. The major difference is the use of throughput to provide forecasting, which is central to bottom-up estimating and planning at the lowest level of the classic estimation continuum.
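
To make the second camp's approach concrete, here is a minimal sketch of a throughput-based forecast. The throughput history and backlog size are purely illustrative numbers, not data from any real team:

    # Minimal sketch: forecasting from average throughput.
    # The throughput history and backlog size are hypothetical.
    completed_per_period = [7, 9, 6, 8, 10]  # stories finished in each two-week period
    backlog_remaining = 42                   # stories still to deliver

    avg_throughput = sum(completed_per_period) / len(completed_per_period)
    periods_needed = backlog_remaining / avg_throughput

    print(f"Average throughput: {avg_throughput:.1f} stories per period")
    print(f"Forecast: roughly {periods_needed:.1f} periods to clear the backlog")

No individual story is estimated; the forecast emerges from measured delivery.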

Classic Estimation Context:

Estimation as a topic is often a synthesis of three related but different concepts: budgeting, estimation and planning. Because these three concepts are often conflated, it is important to understand the relationship between them. They are typical of a normal commercial organization, although they might be called different things depending on your business model.

An estimate is a finite approximation of cost, effort and/or duration based on some basis of knowledge (known as the basis of estimation). The flow of activity conflated as estimation often runs from budgeting, to project estimation, to planning. In most organizations, the act of generating a finite approximation typically begins as a form of portfolio management in order to generate a budget for a department or group.

The budgeting process helps make decisions about which pieces of work are to be done. Most organizations have a portfolio of work that is larger than they can accomplish, therefore they need a mechanism to prioritize. Most portfolio managers, whether proponents of an Agile or a classic approach, would defend using value as a key determinant of prioritization. Value requires having some type of forecast of cost and benefit of the project over some timeframe. Once a project enters a pipeline in a classic organization, an estimate is typically generated. The estimate is generally believed to be more accurate than the original budget due to the information gathered as the project is groomed to begin.

Plans break down stories into tasks, often with personnel assigned, generate an estimate of effort at the task level, and sum those task estimates into higher-level estimates. Any of these steps can (but should not) be called estimation. The three-level process described above, if misused, can cause several team and organizational issues. Proponents of the #NoEstimates movement often classify these issues as estimation pathologies. Jim Benson, author of Personal Kanban, established a taxonomy of estimation pathologies[2] that includes:

1. Guarantism – a belief that an estimate is actually correct.

2. Swami-itis – a belief that an estimate is a basis for sound decision making.

3. Craftosis – an assumption that estimates can be done better.

4. Reality Blindness – an insistence that estimates are prima facie implementable.

5. Promosoriality – a belief that estimates are possible (planning facility).

Estimates are by definition imprecise and can only be accurate within a range of confidence; however, these facts are often “forgotten” in favor of the single-number contract. Acting as if any of these pathologies were true has generated the anger and frustration needed to fuel the #NoEstimates movement.

When done correctly, both #NoEstimates and classic estimation are tools to generate feedback and create guidance for the organization. In its purest form, #NoEstimates uses delivered functionality to generate feedback and to provide guidance about what is possible. The less absolutist “Kanban’er” form of #NoEstimates uses both functional software and throughput measures as feedback and guidance tools. Classic estimation uses plans and performance against the plan to generate feedback and guidance. The goal is usually the same; it is just that the mechanisms are very different.

Budgeting, Estimation, Planning, #NoEstimates and the Agile Planning Onion

There are many levels of estimation, including budgeting, high-level estimation and task planning (detailed estimation). We can link a more classic view of estimation to the “Agile Planning Onion” popularized by Mike Cohn. In the Agile Planning Onion, strategic planning is on the outside of the onion and the planning that occurs in the daily sprint meetings is at the core. Budgeting is a strategic form of estimation that most corporate and governmental entities perform. Other than in its most extreme form, budgeting is generally not a practice eschewed by #NoEstimates proponents. Estimation exists in the middle layers of the Agile Planning Onion (the product and release layers). In classic estimation, these estimates are often developed using top-down techniques such as analogy or parametric estimation using function points, story points or tee-shirt sizing. #NoEstimates proponents leveraging Kanban techniques perform this level of estimation as forecasts using average flow rates and queuing theory (an application of Little’s Law). The resistance at this level has generated the perception that size-based estimation here (and, later, planning at the task level) generates several pathological behaviors within organizations. The final layers of the planning onion, iteration and daily planning, are generally the areas of highest concern to the #NoEstimates movement. While tasks may be identified, effort is not assigned.
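
Little's Law itself is simple enough to show in a few lines. This sketch, using entirely hypothetical numbers, illustrates how average work-in-progress and average throughput yield an expected lead time, which in turn tells a team when an item must start in order to meet a date:

    # Little's Law: average lead time = average WIP / average throughput.
    # All numbers are hypothetical.
    avg_wip = 12.0        # items in progress or queued, on average
    avg_throughput = 4.0  # items finished per week, on average

    avg_lead_time_weeks = avg_wip / avg_throughput  # = 3.0 weeks

    weeks_until_due = 5.0
    latest_start = weeks_until_due - avg_lead_time_weeks
    print(f"Expected lead time: {avg_lead_time_weeks:.1f} weeks")
    print(f"Start within {latest_start:.1f} weeks to meet the due date")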

It should be noted that while effort estimates are not done at the planning layers, or generally at the estimation layer, most teams adopt rules to break work down into predictable units. Rules or guidelines are often established that affect story and task size. The use of rules to govern granularity is one of the reasons flow measures can be used to forecast when work needs to begin in order to meet date or dependency requirements. Johanna Rothman stated in her article “The Case for #NoEstimates” that “when you deliver small, valuable chunks of work every day or more often” you can avoid estimation. The critical words are small and every day, which requires the team to understand how to groom stories to the desired granularity. Whether through rules or feedback, using these techniques to groom stories could easily be construed as a crude form of estimation.

Scenarios:

Standard Corporate Environments:

Organizational budgeting (strategy and portfolio): Continuous flow and other #NoEstimates techniques don’t answer the central questions most organizations need to answer, which include:

1. How much money should I allocate for software development, enhancements and maintenance?

2. Which projects or products should we fund?

3. Which projects will return the greatest amount of value?

While most budgets are scientific guesses, there is a need to understand at least some approximation of the size and cost of the work on the overall backlog.

High Level Estimation (product and release):

Release plans and product road maps could easily be built from forecasts for teams that have a track record of delivering value on a regular basis. The idea of #NoEstimates can be applied at this level of planning and estimation IF the right conditions are met. Conditions include:

1. Stable Teams
2. Agile Mindset (both team and organizational levels)
3. Well-groomed stories

The classic questions of when, what and how much can be answered in this environment for work done by single teams or by scaled Agile programs.

It should be noted that the example used by Woody Zuill, which reflects the purest form of #NoEstimates (start, deliver, get feedback, and then do more), comes from an environment where all of these factors are present.

Task Level Estimation (iteration and daily):

Task-level planning is at the heart of #NoEstimates discussions. Stable teams that are able to consistently accept and deliver what is expected do not have any need to plan effort at a task level.

Commercial / Contractual Work:

Raja Bavani, Senior Director at Cognizant Technology Solutions, stated in a recent conversation that he thought #NoEstimates was a non-starter in a contractual environment.

Conclusion

Estimation is a form of planning. Planning is considered an important competency in most business environments. Planning activities abound, from planning the corporate picnic to planning the acquisition and implementation of a new customer relationship management system. Most planning activities center on answering a few very basic questions. When will “it” be done? How much will “it” cost? What is “it” that I will actually get? Rarely does the question of how much effort it will take get asked, except as a proxy for how much it will cost. As the work progresses, the questions shift to whether we are going to meet the date, budget or scope. Answering those questions can be accomplished by any number of techniques. Using #NoEstimates techniques still requires most organizations to budget. Using #NoEstimates techniques requires breaking down stories into manageable, predictable chunks so that teams can predictably deliver value. The ability to predictably deliver value provides organizations with a tool to forecast delivery. #NoEstimates really isn’t “not estimating” . . . it is just estimating differently.

Sources

1. The C3 project was used to hone and prove many of the Agile techniques (eXtreme Programming and wikis, for example) and acted as a training ground for many luminaries of the early Agile movement.

2. http://herdingcats.typepad.com/my_weblog/2015/03/five-estimating-pathologies-and-their-corrective-actions.html (accessed 4/27/15) and http://moduscooperandi.com/blog/modus-list-3-our-five-estimate-pathologies/ (accessed 4/27/15)

Written by Default at 05:00

Estimating – Have the Discussion

Doesn't it seem like the phrase "estimating accuracy" is something of an oxymoron? If an estimate is an approximation, then how accurate should we expect an estimate to be? And what should your customer expect with regard to an estimate he or she is given?

A proper estimating procedure considers three primary input parameters: the size of the project, the complexity of the project and the delivery team's ability to produce the desired deliverable. In order to produce an "accurate" estimate, all three parameters need to be accurately represented. The entire estimating process is dependent upon having detailed information available upon which to base the estimate. Typically, additional information becomes available as a software project progresses through its lifecycle. Quite often, early estimates are only accurate to within +/- 100%. This is usually understood and accepted within the organization.
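
As an illustration only (the parameter values below are invented, not drawn from any DCG model), the arithmetic behind an estimate built from these three inputs can be as simple as:

    # Toy estimate from the three inputs named above: size, complexity,
    # and team delivery rate. All values are hypothetical.
    size_fp = 250         # project size in function points
    complexity = 1.2      # multiplier: 1.0 = typical, higher = more complex
    delivery_rate = 10.0  # function points per person-month for this team

    effort_pm = (size_fp * complexity) / delivery_rate
    print(f"Effort estimate: {effort_pm:.0f} person-months")  # -> 30

However precise this arithmetic looks, early in a project all three inputs are themselves approximations – hence the wide accuracy ranges noted above.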

Based on this common occurrence, it is easy to make the argument that most estimating practices do not result in accurate predictions. So what is the value in investing time to estimate a project?  Perhaps we should change our perspective and look not at the end result, but instead at the process of estimating. We have already said that accuracy is dependent upon getting or having the right information. Well, the source of that information varies. It can come from documentation or from the local subject matter expert. And, some of the information will be incomplete or subject to interpretation. Further, some necessary information may be missing altogether, thereby increasing the risk of an inaccurate estimate. 

But, it's important to remember that the key player in the estimating process is your customer or end user. The proper perspective to take for a successful estimating practice is to make sure that you have properly set expectations. Yes, the estimate is not going to be 100% accurate. But, by engaging the customer several things occur. First, the customer has an increased level of awareness with regard to what goes into an estimate. Their expectations are now properly set in terms of potential risks and impacts. And, ideally they have some accountability in the accuracy of the estimate.

Having a formal discussion with your customer regarding the risk factors inherent to an estimate and the potential impact they may have is likely as valuable (if not more so) than the so-called accuracy of the estimate.


David Herron
VP, Software Performance Management

Written by David Herron at 05:00

Why Are So Many of Our Projects Late, Over Budget or Deliver Less Than Was Promised?

Scope of this Report

This report identifies evidence that projects are late, over budget or deliver less than promised. It then considers various potential causes of these failures, including culture, process and estimation, and how getting these things right can contribute to success.

What evidence is there that projects are late, over budget or deliver less than promised?

Most organizations develop business cases to initiate change.[1] These business cases require a narrative explanation of the change and the associated financial return on investment.

Dan Galorath, noted software estimation expert, cites government data: “A recent US government report showed 81% of budget or $57 billion in IT projects in danger of failing. Detailed reports on the hearings can be found here. Of 413 IT projects identified by OMB and federal agencies NEARLY 80% OF THEM WERE IDENTIFIED AS HAVING BEEN POORLY PLANNED. The scorecard for IT projects shows much progress but much work left to do.”

The PMI Pulse Report 2014 pointed up some stark statistics. “Only 56 percent of strategic initiatives meet their original goals and business intent. This poor performance results in organizations losing $109 million for every $1 billion invested in projects and programs. High-performing organizations successfully complete 89 percent of their projects, while low performers complete only 36 percent successfully.”

Culture, Process or Estimation Issue?

Are process, culture or estimation responsible for these failures? Any or all of them can have a significant impact on a project’s performance. The tendency is always to blame the supplier for the failure – ‘Company X failed to deliver the ABCD project on time for the XYZ government’ is not an uncommon headline. In truth, it is normally a combination of all three.

We need to look at potential sources of failure from several directions:

  • Culture: Is the organisation working as one towards a common, transparent, communicated goal?
  • The Governance Process: What decisions need to be made, who makes them and are they tracked to completion?
  • Backlog Prioritisation and Change Control: Was there a product backlog or its equivalent effectively managed and prioritised?
  • Is estimation effective?: Are estimates based on facts or opinions?
  • Is the development model effective?: Agile, Iterative or Waterfall methods are found, but are they effectively policed?

Weaknesses or failures in any of the above will put a project at risk. It is incumbent on both the supplier and the business teams to ensure that strong, robust processes are in place to de-risk the project.

Culture

The PMI Pulse report consists of responses to a voluntary questionnaire and is therefore self-selecting, but it is a valuable resource for discussion of what seems to be a constant refrain over the years.

The clear message of the report is that the most successful organisations in terms of project delivery have strong processes backed by effective measurement and project management offices.

Success comes on the back of success. Companies with effective traditional development methods can adapt quickly and effectively to agile methods. The key here is the word “effective” – Kotter suggests that without urgency, transformation cannot happen – and change is harder for some companies than others. The whole organisation adopts agile because of effective leadership, visible sponsors and a commitment to succeed. Such organisations are either consciously or culturally Lean. To them facts influence decisions; changes to process are tracked and monitored, and successful change remains while unsuccessful change is found early and discarded. During projects deviations from the norm are analysed and corrections are made. Projects seldom fail.

Contrast that with poor-process organisations that change methods to follow the latest trends. For them a change in process is an excuse for chaos. Typically we see blame cultures, with poor communications and absent sponsors. Use of metrics is poor and often concentrates only on cash and time to market. If a project falls behind schedule, the typical response is to throw people at it – hordes of heroes are bound to help. Once again we have people repeating the same behaviour expecting a different result. That’s the definition of insanity often, incorrectly, attributed to Einstein. Whoever said it was right. In this instance insanity comes from not analysing the reasons for failure but looking for quick “obvious” answers, and doing so repeatedly.

Process – Governance

We look for an IT governance framework which has similar characteristics to the model proposed by Weill and Ross in 2004:

  • Identify what decisions you need to make;
  • Identify who makes those decisions – an individual;
  • Identify how those decisions will be made e.g. what data is needed, who else should contribute opinions.

Effective organisations have clear governance based on effective leadership and visible sponsorship. They avoid committee decision making and clearly communicate decisions. Crucially they have effective monitoring and measurement activities so that deviations from course are made knowingly or are corrected quickly with little drama.

Process - Development methods

All development methods demand process. Some, such as waterfall, can be process intensive. Agile by contrast is process light, but it’s not process absent. Rather, we can say that Agile is less prescriptive.

Many effective organisations use waterfall or iterative development with defined methods backed by strong metrics and effective reporting. The best performers can adapt to agile when it fits the situation and they continue to be successful.

Agile works best when thought of as a lean process, which means that once you commit to building some user functionality you should build it only once. That means taking a disciplined approach to defining the minimum marketable features, refining the product backlog and delivering sufficient documentation to enable maintenance. In ineffective organisations, it can become a game.

Two conversations we have had recently underscore the need for discipline in the use of agile methods. In both cases, productivity in terms of delivered functionality was reported as low. When probed, the reason was that both organisations were content to develop and re-develop the same functionality a number of times – to get it right in the end. The business and the development teams seemed to accept that delivering a business change using agile methods allowed for infinite changes of mind. This adoption of agile is not cost effective and gives rise to concern about how effectively agile methods are being used. Instead of the oft-quoted “paralysis by analysis” we see in waterfall, we have “endless enhancement,” and in either case a lot of time, energy, creativity and money is wasted.

Again the lesson is that effective development methods work to their maximum potential when the right amount of control and monitoring of progress is applied. Measurement and reporting are often seen as an expensive overhead. We have found that effective measurement and reporting consumes about 1% of project budget (1.5% to 2% on small projects and as low as 0.5% on major programmes). Companies that want to manage by facts recognise this as money well spent, as it enables effective management. Those that look for easy cost cuts generally take out “overhead” first, preferring to chart their course through the icebergs in the dark with a small rudder.

Estimation

Estimating is difficult, and the key thing to remember is that it is only an estimate. Too often, estimates become written in blood as the initial and final answer. An estimate should be a living thing throughout the project and should be revisited when something significant changes in the project or when we know sufficiently more to refine it.

In a waterfall development, estimates should be performed at least at Requirements, at the end of High-Level Design and at the start of Construction, and should be reviewed at implementation. Agile may not have the traditional phases, but early estimates are still just as important and just as useful.

However, estimates can be revisited at any time during the lifecycle, such as when requirements shift or other variables come into play that will impact originally stated outcomes.

Failure to review and maintain project estimates means you can’t manage risk or use any contingency.

Early Estimates and the Challenges

Early in the project lifecycle, cost and schedule estimates are generated based on the best information available. As is often the case, this information lacks detail and is most likely ambiguous. This presents several problems for the estimator. For example, ambiguous requirements make it difficult to determine a proper size.

However, by making and documenting stated assumptions, the estimate produced early in the lifecycle can be effectively managed and customer expectations can be properly set.

What Should Estimates Be?

Estimating is a risk assessment activity. The wise project manager can use a well-developed project estimate to properly set and manage end user expectations. Transparency is the watchword here. By sharing stated assumptions with the user and by helping them to understand the basis for the estimate, you are engaging them and making them share in the accountability for the estimate.

For example, if they are aware that the estimate is based on their requirements, and the general feeling is that the requirements are somewhat incomplete, then it can safely be assumed that another estimate will be required when more data is available – and that the new estimate will probably differ from the original.

“I want it delivered NOW!”

This dynamic shows itself, not so subtly, when management doesn’t really want an estimate at all; they want the software delivered when they want it delivered.

How many times have we seen a situation where the sales/marketing group, the business users or even our own senior management has requested a software solution with a fixed delivery date already attached to it? And even though the user or senior manager may ask for an estimate, they really aren’t interested in the response unless they are told what they want to hear – or, alternatively, the supplier is simply told what the answer already is.

In this type of management environment, the IT organization doesn’t invest much time in its estimating practice because it doesn’t realize the power of good estimation as a vehicle to properly manage the project and its customers’ expectations. The net result is the IT organisation’s best endeavours in a blind attempt to deliver the requirement – and usually a project that starts with a high tariff of risk.

Expert Estimates?

One perceived problem with expert estimates is that of memory. Estimates based on memory are subject to the cognitive bias of the estimator; involving others provides a balance that helps to cancel the potentially negative impacts of that bias.

Single expert estimates tend to be either too high or too low, depending on the estimator responsible and the culture. For example, the “Scotty from Star Trek” syndrome (bias) creeps into play with expert estimates: the seasoned estimator estimates high, knowing the estimate will be corrected anyway by the PM, and the ultimate result may be a sensible figure.

The other typical scenario is that it is easy for experts to estimate how long a piece of work would take them to do, but it is much more difficult for them to estimate for less experienced colleagues, so we tend to get under-estimates.

Don’t accept just a single number for an estimate; three-point estimates and the estimate assumptions are key tools for reviewing and validating estimates.

Using three-point estimating techniques also allows a more reasoned view of the estimate. In reality, when we estimate we usually think in ranges. How long does it take you to get to work each day? It might be 30 minutes on average, but 20 minutes on quiet roads and 50 minutes in rush hour. Combining all three estimates (optimistic of 20, average of 30 and pessimistic of 50) gives you a much better view of the risk and what contingency you may need to use.
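
One common way to combine the three points (treating the 30-minute average as the most likely value) is the PERT weighted mean, with the spread giving a feel for contingency. A minimal sketch using the commute numbers above:

    # Three-point (PERT) estimate using the commute example.
    optimistic, most_likely, pessimistic = 20, 30, 50  # minutes

    expected = (optimistic + 4 * most_likely + pessimistic) / 6  # ~31.7
    spread = (pessimistic - optimistic) / 6                      # ~5.0

    print(f"Expected: {expected:.1f} minutes, spread: {spread:.1f} minutes")
    print(f"With two spreads of contingency: about {expected + 2 * spread:.0f} minutes")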

Expert estimates are a key estimate to obtain but there is great value in obtaining another estimate to reconcile this estimate against.

In the Agile space, the benefit of normalizing various experts’ estimates is often formalized through “planning poker,” which constrains the estimate values that experts are allowed to choose and then requires the experts to justify and ultimately reconcile their estimates with each other. Given how effective it is in Scrum planning, the same process could and should apply more widely to expert estimates in general.
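
The value constraint at the heart of planning poker is easy to sketch. Assuming a common (though not universal) card deck and invented votes, snapping raw opinions to cards makes disagreements visible and forces the justification step:

    # Sketch of planning poker's value constraint (votes are hypothetical).
    DECK = [1, 2, 3, 5, 8, 13, 20, 40, 100]  # one common card set

    def nearest_card(estimate):
        """Snap a raw estimate to the closest card in the deck."""
        return min(DECK, key=lambda card: abs(card - estimate))

    raw_votes = [4, 6, 15]  # three experts' gut-feel story points
    print([nearest_card(v) for v in raw_votes])  # [3, 5, 13]: the 3 vs. 13 gap triggers discussion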

How Else Should We Address the Estimating Issue?

We can view popular estimation techniques through two separate lenses: the data they rely on (experiential versus historical) and the way an estimate is produced (algorithmic versus collaborative). Viewed this way, many of the experience-based techniques leverage collaborative techniques to combat their perceived weaknesses.

Historical data is used in both model-based and expert estimates. Estimating without memory of the past is not possible. The bigger issue is whether models derived from historical data are clearly superior to expert estimates. If you are trying to remove the need for expert estimators, the answer is unfortunately ... no.

Finally, expert estimates require a level of expertise that is sometimes not readily available, which makes tools necessary to validate estimates. If estimates are important and the required level of expertise is not available, the choice is far starker: estimates generated from models leveraging historical data in calibrated tools are the only logical choice.

Volatility and the Impact of Change?

Another key characteristic of a failing or delayed project is the degree of volatility and change. Studies show the exponentially rising cost of change in the later stages of a project, particularly with a Waterfall methodology.

Agile is designed to accommodate change, but change can occasionally become an excuse: “I don’t need to know what I want; I can keep changing my mind and we’ll be fine if we use Agile” can be a client view. This can lead to the same “priority” story being redeveloped multiple times until the client has worked out what they want, and the Agile project fails because it runs out of time or money. Is the methodology at fault? Of course not, but perhaps a requirements or design “spike” could have been implemented with the client to help them clarify their ideas.

Governance of change is the key: if you know what the business is going to look like at the implementation of the project, then the project will control change and is much more likely to succeed.

Deviation from the Norm?

Often, changes to applications with regular release cycles tend to be of a similar size, with the same team doing the work. The expert estimates roll up into complexity matrices and sensible size metrics, and all should be well – as long as the estimators continually update their historical records and test the results against external databases. Yet we still see the same mistakes repeated in the hope that a miracle will happen.

The challenge comes when we deviate from the norm. A compressed timescale or a significant increase in size will invalidate the project’s current estimating methodology. People will assume we can deliver at the same rate, and the project is set to fail.

Lawrence H. Putnam published an empirical software estimation model in 1978. The software equation is noted below, where Size is the product size (whatever size measure is used by your organization is appropriate; Putnam uses ESLOC, Effective Source Lines of Code, throughout his books):

Size = Productivity × (Effort / B)^(1/3) × Time^(4/3)

  • B is a scaling factor and is a function of the project size.
  • Productivity is the Process Productivity, the ability of a particular software organization to produce software of a given size at a particular defect rate.
  • Effort is the total effort applied to the project in person-years.
  • Time is the total schedule of the project in years.

In practical use, when making an estimate for a software task, the software equation is solved for effort:

Effort = B × [Size / (Productivity × Time^(4/3))]^3

Parametric estimating toolsets understand this likely impact and use this equation, or their own bespoke calculation engines, to deal with it; they can make a major contribution to increasing the chance of success of a project by setting realistic expectations.
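
A rough sketch of the effort calculation makes the schedule sensitivity obvious. The size, productivity and scaling values below are purely hypothetical; a calibrated commercial tool would derive them from historical data:

    # Putnam software equation solved for effort (all inputs hypothetical).
    size = 50_000                  # product size in ESLOC
    process_productivity = 10_000  # calibrated process productivity
    B = 0.39                       # scaling factor, a function of project size

    for time_years in (1.5, 1.2):
        effort = B * (size / (process_productivity * time_years ** (4 / 3))) ** 3
        print(f"Schedule of {time_years} years -> about {effort:.0f} person-years")

Compressing the schedule from 1.5 years to 1.2 years more than doubles the effort in this toy example – exactly the non-linear behaviour that invalidates “same rate” assumptions.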

Tracking and Monitoring

The final area to consider is effective tracking and control of the project. Continuous review of the project’s velocity (Agile) or rate of delivery (size measure per time period) will indicate the project’s real status and chance of success. For example, if the development team reports the project is 80% complete three weeks in a row, then the project is likely in trouble.
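
Even a crude automated check can surface that “80% complete three weeks running” pattern. A minimal sketch, with hypothetical status data:

    # Flag a project whose reported percent-complete has stalled.
    reports = [72, 80, 80, 80]  # % complete from the last four weekly reports

    def looks_stalled(history, window=3):
        """True if the last `window` reports show no change."""
        tail = history[-window:]
        return len(tail) == window and len(set(tail)) == 1

    if looks_stalled(reports):
        print("Warning: no reported progress for three weeks - investigate")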

Again, the PMI report indicates that organisations that deliver successful projects, irrespective of the methods used, tend to have functioning and effective Programme Management Offices (PMOs), and these PMOs gather and analyse data effectively to support the successful delivery of business initiatives.

Conclusion

There is no need for projects to be delivered late, over budget or with less scope than promised. Such failures come from organisations that don’t understand that the light in the tunnel is actually a train coming at full speed.

Successful delivery of a project requires a culture of effective business processes, effective estimation and sound development processes.

Strong business processes linking effective business vision, realistic expectations and close communication between the customer and the supplier are key elements of successful delivery.

Development methods are only as good as the organisation that uses them. Whatever end of the process spectrum, it’s the effective use of the end to end processes that delivers the goods without drama.

The fundamental transformation of the idea to money comes with the estimate. Good estimates are living things that change with the circumstances.

Effective estimation requires an organization to commit resources to the development and execution of a well-defined software estimating practice, backed by a PMO that delivers effective data analysis.

Sources

  1. Dan Galorath on Estimating Blog, http://www.galorath.com/wp/government-it-project-woes-and-estimating-total-ownership-costs.php
  2. PMI Pulse Report 2014, http://www.pmi.org/Business-Solutions/Pulse.aspx
  3. “IT Governance: How Top Performers Manage IT Decision Rights for Superior Results,” Peter Weill and Jeanne Ross, 2004
  4. “DCG Works With Leading Customer Management Company to Implement Measurement and Governance Program for Data-based Decision Making,” /insights/publications/measurement-program-for-data-based-decision-making/
  5. “Is There a Business Case for Better Estimation?” DCG Trusted Advisor Report, July 2013
  6. “What are the benefits, if any, of estimating my software projects through the use of a vendor-developed estimating model?” DCG Trusted Advisor Report, July 2014
  7. Putnam Model, http://en.wikipedia.org/wiki/Putnam_model
Written by Default at 05:00

Estimating Software Maintenance

I recently read an interesting and informative article that presents a unique and proven approach for estimating maintenance and support activities using a new type of "sizing" model. The authors, Anjali Mogre and Penelope Estrada Nava, share their experience based on the work they have done at Atos worldwide.

The article opens with a positioning statement identifying issues with estimating software maintenance and support activities. Simply put, there is no reliable standard for sizing and estimating the effort associated with maintenance and support activities. Examples of software maintenance are described as correcting faults, software migrations, design improvements, adapting to different technical environments, etc.

Noted in the article are existing standards and practices, such as ISO/IEC standard 15939, IFPUG Function Point Analysis, lines-of-code measures and benchmark data from the International Software Benchmarking Standards Group (ISBSG). While each is valuable in its own right, the authors point out the shortcomings of each when it comes to properly sizing and forecasting the size and complexity of a maintenance effort.

The proposed solution is based on an internally developed size measure and the ITIL definition of a service request (problem ticket). The size measure is called a "Work Point," and it is defined as a function of the type and complexity of a ticket (as defined by ITIL).
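
The exact weights are specific to the article's method, so the sketch below invents a small table purely for illustration; it shows only the general shape of a Work Point lookup driven by ITIL ticket type and complexity:

    # Hypothetical Work Point lookup (weights invented for illustration).
    WORK_POINTS = {
        ("incident", "low"): 1, ("incident", "medium"): 3, ("incident", "high"): 5,
        ("change", "low"): 2,   ("change", "medium"): 5,   ("change", "high"): 8,
    }

    tickets = [("incident", "low"), ("change", "high"), ("incident", "medium")]
    total = sum(WORK_POINTS[t] for t in tickets)
    print(f"Open queue: {total} work points")  # a size basis for effort estimates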

However, the best parts of the article are the details that follow. An in-depth review of the practices used in this method is presented, with detailed examples and explanations. The practical application of this approach allows one to imagine and consider how these techniques could be applied to one's own needs to improve one's ability to manage a maintenance workload.

At the conclusion of the article we learn that not only is this an effective technique for sizing and estimating maintenance activities, but it has also been used to baseline productivity, measure performance improvements and set productivity targets.

If you are struggling with your maintenance and support work stream (and who isn’t), then this is a must read.

Download the article here.


David Herron
VP Software Performance Management

Written by David Herron at 05:00

"It's frustrating that there are so many failed software projects when I know from personal experience that it's possible to do so much better - and we can help." 
- Mike Harris, DCG Owner

Subscribe to Our Newsletter
Join over 30,000 other subscribers. Subscribe to our newsletter today!