Software Development Blogs: Programming, Software Testing, Agile, Project Management

Herding Cats - Glen Alleman
Deep Learning Resources about Performance-Based Project Management® And the Principles, Practices, and Processes to Increase Probability of Success
Updated: 8 hours 42 min ago

Connecting Project Benefits to Business Strategy for Success

Tue, 08/23/2016 - 05:01

The current PMI Pulse titled Delivering Value: Focus on Benefits during Project Execution provides some guidance on how to manage the benefits side of an IT project. But the article misses the mark on an important concept. Here is a chart from the paper suggesting the metrics of the benefits.

But where do these metrics come from?

[Chart from the PMI Pulse report: suggested benefit metrics]

The question is where do the measures of the benefits listed in the above chart come from? 

The answer is they come from the Strategy of the IT function. Where is the strategy defined? The answer is in the Balanced Scorecard. This is how ALL connections are made in enterprise IT projects. Why are we doing something? How will we recognize that it's the right thing to do? What are the measures of the outcomes, connected to each other, to the top-level strategy, and to the strategic needs of the firm?

When you hear we can't forecast the benefits in the future from our work, you can count on the firm spending a pile of money for probably not much value. Follow the steps starting on page 47 in the presentation above and build the 4 perspectives and connect the initiatives.
  • Stakeholder - what does the business need in terms of beneficial outcomes?
  • Internal Processes - what governance processes will be used to produce these outcomes?
  • Learning and Growth - what people, information, and organizational elements will be needed to execute the process to produce the beneficial outcomes?
  • Budget - what are you willing to spend to achieve these beneficial outcomes?

As always, each of these is a random variable operating in the presence of uncertainty, creating risk that they will not be achieved. As always, this means making estimates of both the beneficial outcomes and the cost to achieve them.

Like all non-trivial projects, estimating is a critical success factor. Uncertainty is unavoidable. Making decisions in the presence of uncertainty is unavoidable. Having some hope that the decision will result in a beneficial outcome requires making estimates of that outcome and choosing the most likely beneficial outcome.

Anyone telling you otherwise is working in a de-minimis project. 
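To make the point about random variables concrete, here is a minimal sketch (not from the original post) of treating both the benefit and the cost as probability distributions and simulating the outcome. The triangular distributions and dollar figures are illustrative assumptions only.

```python
import random

def simulate_net_value(trials=100_000):
    """Monte Carlo over benefit and cost treated as random variables.

    The triangular (low, high, mode) parameters below are illustrative
    assumptions, not data from any real project.
    """
    net = []
    for _ in range(trials):
        benefit = random.triangular(400_000, 1_200_000, 650_000)
        cost = random.triangular(300_000, 700_000, 450_000)
        net.append(benefit - cost)
    net.sort()
    p_loss = sum(1 for v in net if v < 0) / trials
    return p_loss, net[int(0.20 * trials)], net[int(0.80 * trials)]

if __name__ == "__main__":
    p_loss, p20, p80 = simulate_net_value()
    print(f"Probability of a net loss: {p_loss:.1%}")
    print(f"20th/80th percentile net value: ${p20:,.0f} / ${p80:,.0f}")
```

With a model like this, choosing the most likely beneficial outcome becomes a comparison of distributions rather than a comparison of single numbers.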

So Let's Apply These Principles to a Recent Post

A post, Uncertainty of benefits versus costs, has some ideas that need addressing ...

  • Return on an investment is the benefits minus the costs.
    • And both are random variables subject to reducible and irreducible uncertainties.
    • Start by building a model of these uncertainties.
    • Apply that model and update it with data from the project as it proceeds.
  • Most people focus way too much on costs and not enough on benefits.
    • Why? This is bad management. This is naive management. Stop doing stupid things on purpose.
    • Risk Management (from the underlying uncertainties) is how adults manage projects - Tim Lister.
    • Behave like an adult, manage the risk.
  • If you are working on anything innovative, your benefit uncertainty is crazy high.
    • Says who?
    • If you don't have some statistically confident sense of what the payoff is going to be, you'd better be ready to spend money to find out before you spend all the money.
    • This is naive project management and naive business management.
    • It's counter to the first bullet - ROI = (Value - Cost)/Cost.
    • Having an acceptable level of confidence in both Value and Cost is part of Adult Management of other people's money.
  • But we can't estimate based on data, it has to be guesses!
    • No, estimates are not guesses unless done by a Child.
    • Estimates ARE based on data. This is called Reference Class Forecasting. Also parametric models use past performance to project future performance.
    • If your cost estimation might be off by +/- 50%, but your benefit estimation could be off by +/- 95% (or more), you're pretty much clueless about what the customer wants. Or you're spending money on an R&D project to find out. This is one of those examples conjectured by inexperienced estimators. This is not how it works in any mature firm.
    • Adults don't guess, they estimate.
    • Adults know how to estimate. Lots of books, papers, and tools.
  • So we should all stop doing estimates, right?
    • No - an estimate is a forecast and a commitment.
    • The commitment MUST have a confidence level.
    • We have 80% confidence of launching on or before the 3rd week in November 2014 for 4 astronauts in our vehicle to the International Space Station. This was a VERY innovative system. This is why a contract for $3.5B was awarded. This approach is applicable to ALL projects.
    • Only de minimis projects have no deadline or Not to Exceed target cost.

All projects are probabilistic. All projects have uncertainty in cost and benefits. Estimating both cost and benefit, continuously updating those estimates, and taking action to correct unfavorable variances from plan, is how adults manage projects.

  Related articles Strategy is Not the Same as Operational Effectiveness Decision Making Without Estimates? Local Firm Has Critical Message for Project Managers Architecture -Center ERP Systems in the Manufacturing Domain The Purpose Of Guiding Principles in Project Management Herding Cats: Large Programs Require Special Processes Herding Cats: The Problems with Schedules Quote of the Day The Cost Estimating Problem Estimating Processes in Support of Economic Analysis
Categories: Project Management

Invoking "Laws" Without a Domain or Context

Thu, 08/18/2016 - 22:31

It seems to be common to invoke Laws in place of actual facts when trying to support a point. Here are two recent ones I've encountered from some Agile and #NoEstimates advocates. Two of my favorites are:

  • Goodhart's Law
  • Hofstadter's law

These are not Laws in the same way as the Laws of Physics, Laws of Chemistry, or Laws of Queuing theory - which is why it's so easy to misapply them, misuse them, and use them to obfuscate the situation and hide behind fancy terms which have no meaning to the problem at hand. Here are some real laws.

  • Newton's Law(s), there are three of them:
    • First law: When viewed in an inertial reference frame, an object either remains at rest or continues to move at a constant velocity, unless acted upon by a net force.
    • Second law: In an inertial reference frame, the vector sum of the forces F on an object is equal to the mass m of that object multiplied by the acceleration vector a of the object: F = ma.
    • Third law: When one body exerts a force on a second body, the second body simultaneously exerts a force equal in magnitude and opposite in direction on the first body.
  • Boyle's Law - For a fixed mass of gas at constant temperature, the volume is inversely proportional to the pressure. PV = constant.
  • Charles's Law - For a fixed mass of gas at constant pressure, the volume is directly proportional to the kelvin temperature. V = constant x T.
  • 2nd Law of Thermodynamics - states that the total entropy of an isolated system always increases over time, or remains constant in ideal cases where the system is in a steady state or undergoing a reversible process. The increase in entropy accounts for the irreversibility of natural processes, and the asymmetry between future and past.
  • Little's Law - which is L = λW, which asserts that the time average number of customers in a queueing system, L, is equal to the rate at which customers arrive and enter the system, λ, times the average sojourn time of a customer, W. And just to be clear, the statistics of the processes in Little's Law are IID - Independent, Identically Distributed - and Stationary. That is rarely the case in software development, where Little's Law is often misused (see the sketch below).
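As a simple illustration of the formula in the last bullet (the numbers here are invented for the sketch and assume the stability conditions that rarely hold in software development):

```python
def items_in_system(arrival_rate_per_day: float, avg_days_in_system: float) -> float:
    """Little's Law, L = lambda * W, for a stable, stationary system."""
    return arrival_rate_per_day * avg_days_in_system

# Illustrative only: 3 work items arrive per day, each spends 5 days in the
# system on average, so roughly 15 items are in progress at any time.
print(items_in_system(3.0, 5.0))  # 15.0
```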

Misuse of Goodhart's Law

This post, like many other posts, was stimulated by a conversation on social media. Sometimes the conversations trigger ideas that have lain dormant for a while. Sometimes, I get a new idea from a word or a phrase. But most of the time, they come from a post that was either wrong, misinformed, or, worse, misrepresenting principles.

The OP claimed Goodhart's Law was the source of most of the problems with software development. See the law below. 

But the real issue with invoking Goodhart's Law has several dimensions. Goodhart's Law is named after the economist who originated it, Charles Goodhart. Its most popular formulation is: "When a measure becomes a target, it ceases to be a good measure." This law is part of a broader discussion of making policy decisions with macroeconomic models.

Given that the structure of an econometric model consists of optimal decision rules of economic agents, and that optimal decision rules vary systematically with changes in the structure of series relevant to the decision maker, it follows that any change in policy will systematically alter the structure of econometric models.

What this says, again, is that when the measure becomes the target, that target impacts the measure, changing the target.

So first a big question

Is this macroeconomic model a correct operational model for software development processes - where measuring changes the target?

Setting targets and measuring performance against that target is the basis of all closed loop control systems used to manage projects. In our domain this control system is the Risk Adjusted Earned Value Management System (EVMS). EVM is a project management technique for measuring project performance and progress in an objective manner. A baseline of the planned value is established, work is performed, physical percent complete is measured, and the earned value is calculated. This process provides actionable information about the performance of the project using Quantifiable Backup Data (QBD) for the expected outcomes of the work, for the expected cost, at the expected time, all adjusted for the reducible and irreducible uncertainties of the work.
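As a minimal sketch of the Earned Value arithmetic just described (the Budget At Completion, Planned Value, Earned Value, and Actual Cost figures below are illustrative, not from any program):

```python
def evm_metrics(bac: float, pv: float, ev: float, ac: float) -> dict:
    """Standard Earned Value indices from BAC, Planned Value, Earned Value,
    and Actual Cost."""
    cpi = ev / ac   # cost efficiency: value earned per dollar spent
    spi = ev / pv   # schedule efficiency: value earned vs. value planned
    return {
        "CV": ev - ac,        # cost variance
        "SV": ev - pv,        # schedule variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,     # simple Estimate At Completion if CPI holds
    }

# Illustrative: $1M budget, 40% of work planned, 35% earned, $420k spent.
print(evm_metrics(bac=1_000_000, pv=400_000, ev=350_000, ac=420_000))
```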

Without setting a target to measure against, we have:

  • No baseline control.
  • No measures of effectiveness.
  • No measures of performance.
  • No technical performance measures.
  • No Key Performance Parameters.

With no target and no measures of progress toward the target ... 

We have no project management, no program controls, we have no closed loop control system.

With these missing pieces, the project is doomed on day one. And then we're surprised it runs over cost and schedule, and doesn't deliver the needed capabilities in exchange for the cost and time invested.

When you hear Goodhart's Law is the cause of project failure, you're likely talking to someone with little understanding of managing projects with a budget and due date for the needed capabilities - you know, an actual project. So what this means in economics and not in project management is ...

... when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it. - Mario Biagioli, Nature (volume 535, page 201, 2016)

Note the term Economy, not cost, schedule, and technical performance measures of projects. For measuring the goals and activities of monetary policy, Goodhart's Law might be applicable. For managing development of products with other people's money, probably not.

Gaming of the system is certainly possible on projects. But unlike the open economy, those gaming the project measures can be made to stop, with a simple command. Stop gaming or I'll find someone else to take your place.

Misuse of Hofstadter's Law

My next favorite misused law is this one, which is popular among the #NoEstimates advocates who claim estimating can't be done.

Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law - Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid

Hofstadter's Law is about the development and use of self-referencing systems. The statement about how long it takes is itself a self-referencing statement. He's speaking about the development of a Chess playing program - and doing so from the perspective of 1978-style software development. The game playing programs use a look-ahead tree with branches of the moves and countermoves. The art of the program is to avoid exploring every branch of the look-ahead tree down to the terminal nodes. In chess - actual chess - people, not the computer, have the skill to know which branches to look down and which branches not to look down.

In the early days (before 1978) people used to estimate that it would be ten years until the computer was a world champion. But after ten years (1988) it was still estimated that that day was still ten years away.

This notion is part of the recursive Hofstadter's Law which is what the whole book is about. The principle of Recursion and Unpredictability is described at the bottom of page 152. 

For a set to be recursively enumerable (the condition for traversing the look-ahead tree for all position moves) means it can be generated from a set of starting points (axioms), by the repeated application of rules of inference. Thus, the set grows and grows, each new element being compounded somehow out of previous elements, in a sort of mathematical snowball. But this is the essence of recursion - something being defined in terms of simpler versions of itself, instead of explicitly.

Recursive enumeration is a process in which new things emerge from old things by fixed rules. There seem to be many surprises in such processes ...

So if you work on the development of recursive enumeration based software systems, then yes - estimating when you'll have your program working is likely going to be hard. Or if you work on the development of software that has no stated Capabilities, no Product Roadmap, no Release Plan, and no Product Owner or Customer with even the slightest notion of what Done looks like in units of measure meaningful to the decision makers, then probably you can apply Hofstadter's Law. Yourdon calls this type of project a Death March project - good luck with that.

If not, then DO NOT fall prey to the misuse of Hofstadter's Law by those likely to not have actually read Hofstadter's book, nor have the skills and experience to understand the processes needed to produce credible estimates.

So once again, it's time to call BS when quotes are misused.

Related articles Agile Software Development in the DOD Empirical Data Used to Estimate Future Performance Thinking, Talking, Doing on the Road to Improvement Herding Cats: The Misuse Hofstadter's Law Just Because You Say Words, It Doesn't Make Then True There is No Such Thing as Free Doing the Math Building a Credible Performance Measurement Baseline Your Project Needs a Budget and Other Things
Categories: Project Management

Invoking "Laws" Without a Domain or Context

Thu, 08/18/2016 - 22:31

It seems to be common invoke Laws in place of actual facts when trying to support a point. Here's two recent ones I've encountered with some Agile and #NoEstimates advocates. Two of my favorite are:

  • Goodhart's Law
  • Hofstadter's law

These are not Laws in the same way as the Laws of Physics, Laws of Chemistry, Laws of Queuing theory - which is why it's so easy to misapply them, misuse them, use them to obviscate the situation and hide behind fancy terms which have no meaning to the problem at hand. Here's some real laws.

  • Newton's Law(s), there are three of them:
    • First law: When viewed in an inertial reference frame, an object either remains at rest or continues to move at a constant velocity, unless acted upon by a net force.
    • Second law: In an inertial reference frame, the vector sum of the forces F on an object is equal to the mass m of that object multiplied by the acceleration vector a of the object: F = ma.
    • Third law: When one body exerts a force on a second body, the second body simultaneously exerts a force equal in magnitude and opposite in direction on the first body.
  • Boyle's Law -¬†For a fixed mass of gas at constant temperature, the volume is inversely proportional to the pressure.¬†pv = Constant.
  • Charle's Laws - For a fixed mass of gas at constant pressure, the volume is directly proportional to the kelvin temperature.¬†V = Constant x T
  • 2nd Law of Thermodynamics¬†states that the total entropy of an isolated system always increases over time, or remains constant in ideal cases where the system is in a steady state or undergoing a reversible process. The increase in entropy accounts for the irreversibility of natural processes, and the asymmetry between future and past.
  • Little's Law - which is¬†l = őĽw, which asserts that the time average number of customers in a queueing system, l, is equal to the rate at which customers arrive and enter the system, őĽ, times¬†the average sojourn time of a customer, w. And just to be clear the statistics of the processes in Little's Law are IID - Independent, Identicially Distribution and Stationary. Rarely the case in software development, where Little's Law is misused often.

Misuse of Goodhart's Law

This post, like many of other posts, was stimulated by a conversation on social media. Sometimes the conversations trigger ideas that have laid dormant for awhile. Sometimes, I get a new idea from a word or a phrase. But most of the time, they come from a post that was either wrong, misinformed, or worse misrepresenting  no principles.

The OP claimed Goodhart's Law was the source of most of the problems with software development. See the law below. 

But the real issue with invoking Goodhart's Law has several dimensions, using Goodhart's Law named after the economist who originated it, Charles Goodhart. Its most popular formulation is: "When a measure becomes a target, it ceases to be a good measure." This law is part of a broader discussion of making policy decision on macro economic models. 

Given that the structure of an econometric model consists of optimal decision rules of economic agents, and that optimal decision rules vary systematically with changes in the structure of series relevant to the decision maker, it follows that any change in policy will systematically alter the structure of econometric models.

What this says is again when the measure becomes the target, that target impacts the measure, changing the target. 

So first a big question

Is this macroeconomic model a correct  operational model for software development processes - measuring changes the target?

Setting targets and measuring performance against that target is the basis of all closed loop control systems used to manage projects. In our domain this control system is the Risk Adjusted Earned Value Management System (EVMS). EVM is a project management technique for measuring project performance and progress in an objective manner. A baseline of the planned value is established, work is performed, physical percent complete is measured, and the earned value is calculated. This process provides actionable information about the performance of the project using Quantifiable Backup Data (QBD) for the expected outcomes of the work, for the expected cost, at the expected time all adjusted for the reducible and irreducible uncertanties of the work.

Without setting a target to measure against, we have:

  • No baseline control.
  • No measures of effectiveness.
  • No measures of performance.
  • No technical performance measures.
  • No Key Performance Parameters.

With no target and no measures of progress toward the target ... 

We have no project management, no program controls, we have no closed loop control system.

With these missing pieces, the project is doomed on day one. And then we're surprised it runs over cost and schedule, and doesn't deliver the needed capabilities in exchange for the cost and time invested.

When you hear Goodhart's Law is the cause of project failure, you're likely talking to someone with little understanding of managing projects with a budget and do date for the needed capabilities - you know an actual project. So what this means in economics and not in project management is ...

... when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it. - Mario Biagioli, Nature (volume 535, page 201, 2015)

Note the term Economy, not cost, schedule, and technical performance measures of projects. Measuring the goals and activities of monetary policy Goodhart might be applicable. For managing development of products with other people's money, probably not.

Gaming of the system is certainly possible on projects. But unlike the open economy, those gaming the project measures can be made to stop, with a simple command. Stop gaming or I'll find someone else to take your place.

Misuse of Hofstadter's Law

My next favorite misused law is this one, which is popular among the #Noestimates advocates who claim estimating can't be done. 

Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law¬†‚ÄĒ‚ÄĮDouglas Hofstadter,¬†G√∂del, Escher, Bach: An Eternal Golden Braid

Hofstadter's Law is about the development and use of self-referencing systems. The statement is about how long it takes is itself a self-referencing statement. He's speaking about the development of a Chess playing program - and doing so from the perspective of 1978 style software development. The game playing programs use a look ahead tree with branches of the moves and countermoves. The art of the program is to avoid exploring every branch of the look ahead tree down to the terminal nodes. In chess - actual chess, people - not the computer - have the skill to know what branches to look down and what branches to not look down. 

In the early days (before 1978) people used to estimate that it would be ten years until the computer was a world champion, But after ten years (1988) it was still estimated that day was still ten years away. 

This notion is part of the recursive Hofstadter's Law which is what the whole book is about. The principle of Recursion and Unpredictability is described at the bottom of page 152. 

For a set to be recursively enumerable (the condition to traverse the look ahead tree for all position moves), means it can be generated from a set of starting points (axioms), by the repeated application of rules of inference. Thus, the set grows and grows, each new element being compounded somehow out of previous elements, in a sort of mathematical snowball. But this is the essence of recursion - something being defined in terms of simpler versions of itself, instead of explicitly. 

Recursive enumeration is a process in which new things emerge from old things by fixed rules. There seem to be many surprises in such processes ...

So if you work on the development of recursive enumeration based software systems, then yes - estimating when you'll have your program working is likely going to be hard. Or if you work on the development of software that has no stated Capabilities, no Product Roadmap, no Release Plan, no Product Owner or Customer that may have even the slightest notion of what Done Looks like in units of measure meaningful to the decision makers, then probably you can apply Hofstadter's Law. Yourdan calls this type of project A Death March Project - good luck with that.

If not, then DO NOT fall prey to the misuse of Hofstadter's Law by those likely to not have actually read Hofstadter's book, nor have the skills and experience to understand the processes needed to produce credible estimates.

So once again, time to call BS, when quotes are misused

Related articles Agile Software Development in the DOD Empirical Data Used to Estimate Future Performance Thinking, Talking, Doing on the Road to Improvement Herding Cats: The Misuse Hofstadter's Law Just Because You Say Words, It Doesn't Make Then True There is No Such Thing as Free Doing the Math Building a Credible Performance Measurement Baseline Your Project Needs a Budget and Other Things
Categories: Project Management

A Growth Job

Thu, 08/18/2016 - 16:23
  • Is never permanent.
  • Makes you like yourself.
  • Is fun.
  • Is sometimes tedious, painful, frustrating, monotonous, and at the same time gives a sense of accomplishment.
  • Bases compensation on productivity.
  • Is complete: One thinks, plans, manages and is the final judge of one's work.
  • Addresses real needs in the world at large - people want what you do because they need it.
  • Involves risk-taking.
  • Has a few sensible entrance requirements.
  • Ends automatically when a task is completed.
  • Encourages self-competitive excellence.
  • Causes anxiety because you don't necessarily know what you're doing.
  • Is one where you manage your time, money and people, and where you are accountable for specific results, which are evaluated by people you serve.
  • Never involves saying Thank God It's Friday.
  • Is where the overall objectives of the organizations are supported by your work.
  • Is where good judgment is one, maybe the only, job qualification.
  • Gives every jobholder the chance to influence, sustain or change organizational objectives.
  • Is when you can quit or be fired at any time.
  • Encourages reciprocity and parity between the boss and the bossed.
  • Is when we work from a sense of mission and desire, not obligation and duty.

From If Things Don't Improve Soon I May Ask You to Fire Me - Richard K. Irish

Related articles IT Risk Management Applying the Right Ideas to the Wrong Problem Build a Risk Adjusted Project Plan in 6 Steps
Categories: Project Management

The Problems with Schedules #Redux #Redux

Wed, 08/17/2016 - 17:35

Here's an article, recently referenced by a #NoEstimates twitter post. The headline is deceiving; the article DOES NOT suggest we don't need deadlines, but rather that deadlines set without a credible assessment of their achievability are the source of many problems on large programs...

[Image: headline of the referenced article]

The Core Problem with Project Success

There are many core Root Causes of program problems. Here are 4 from research at PARCA:

[Chart: four root causes of program problems, from PARCA (Gary Bliss)]

  • Unrealistic performance expectations missing Measures of Effectiveness and Measures of Performance.
  • Unrealistic Cost and Schedule estimates based on inadequate risk adjusted growth models.
  • Inadequate assessment of risk and unmitigated exposure to these risks without proper handling plans.
  • Unanticipated Technical issues with no alternative plans and solutions to maintain effectiveness.

Before diving into the details of these, let me address another issue that has come up around project success and estimates. There is a common chart used to show poor performance of projects that compares Ideal project performance with the Actual project performance. Here's the notional replica of that chart.

[Chart: notional replica of the Ideal versus Actual project performance comparison]

This chart shows several things

  • The notion of Ideal is just that - notional. All that line says is this was the baseline Estimate at Completion for the project work. It says nothing about the credibility of that estimate, or the possibility that one or all of the Root Causes above are in play.
  • Then the chart shows that many projects cost more or take longer (costing more) in the sample population of projects.
  • The term Ideal is a misnomer. There is no ideal in the estimating business. Just the estimate.
    • The estimate has two primary attributes - accuracy and precision.
  • The charts (even the notional charts) usually don't say what the accuracy or precision is of the values that make up the line.

So let's look at the estimating process and the actual project performance 

  • There is no such thing as the ideal cost estimate. Estimates are probabilistic. They have a probability distribution function (PDF) around the Mode of the possible values from the estimate. This Mode is the Most Likely value of the estimate. If the PDF is symmetric (as shown above) the upper and lower limits are usually done in some 20/80 bounds. This is typical in our domain. Other domains may vary.
  • This says here's our estimate with an 80% confidence.
  • So now if the actual cost or schedule, or some technical parameter, falls inside the acceptable range (the confidence interval) it's considered GREEN. This range of variances addresses the uncertainty in both the estimate and the project performance (a sketch of producing such bounds follows this list).
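Here is a minimal sketch of producing 20/80 bounds around a Most Likely value; the right-skewed lognormal shape and its parameters are assumptions chosen for illustration, not the distribution used in any particular domain.

```python
import random

def bounds_20_80(most_likely_cost: float, sigma: float = 0.35, trials: int = 50_000):
    """Sample an assumed lognormal cost distribution whose mode equals the
    Most Likely cost, and return the 20th/80th percentile bounds."""
    mu = sigma ** 2  # lognormal mode = exp(mu - sigma^2), so this puts the mode at 1.0
    samples = sorted(most_likely_cost * random.lognormvariate(mu, sigma)
                     for _ in range(trials))
    return samples[int(0.20 * trials)], samples[int(0.80 * trials)]

low, high = bounds_20_80(most_likely_cost=2_500_000)
print(f"Most Likely $2.5M; 20/80 bounds: ${low:,.0f} to ${high:,.0f}")
```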

But here are three problems. First, there is no cause stated for that variance. Second, the ideal line can never be ideal. The straight line is the estimate of the cost (and schedule) and that estimate is probabilistic. So the line HAS to have a probability distribution around it - the confidence interval on the range of the estimate. The resulting actual cost or schedule may well be within the acceptable range of the estimate. Third, are the estimates being updated when work is performed or new work is discovered, and are those updates the result of changing scope? You can't state we met our estimate if the scope is changing. This is core Performance Measurement Baseline stuff we use every week where we work.

As well, since the ideal line has no probabilistic attributes in the original paper(s), as noted above - here's how we think about cost, schedule, and technical performance modeling in the presence of the probabilistic and statistical processes of all project work. †

So let's be clear. NO point estimate can be credible. The Ideal line is a point estimate. It's bogus on day one and continues to mislead as more data is captured from projects claimed to not match the original estimate. Without the underlying uncertainties (aleatory and epistemic) in the estimating model, the ideal estimates are worthless. So when the actual numbers come in and don't match the ideal estimate there is NO way to know why.

Was the estimate wrong (and all point estimates are wrong) or was one or all of Mr. Bliss's root causes the cause of the actual variance?

So another issue with the Ideal Line is that there are no confidence intervals around the line. What if the actual cost came inside the acceptable range of the ideal cost? Then would the project be considered on cost and on schedule? Add to that the coupling between cost, schedule, and the technical performance as shown above.

The use of the Ideal is Notional. That's fine if your project is Notional.

What's the reason a project or a collection of projects doesn't match the baselined estimate? That estimate MUST have an accuracy and precision number before being useful to anyone.

  • Essentially that straight line is likely an unquantified point estimate. And ALL point estimates are WRONG, BOGUS, WORTHLESS. (Yes, I am shouting on the internet).
  • Don't ever make decisions in the presence of uncertainty with point estimates.
  • Don't ever do analysis of cost and schedule variances without first understanding the accuracy and precision of the original estimate.
  • Don't ever make suggestions to make changes to the processes without first finding the root cause of why the actual performance has a variance with the planned performance.

 So what's the summary so far:

  • All project work is probabilistic, driven by the underlying uncertainty of many processes. These processes are coupled - they have to be for any non-trivial project. What are the coupling factors? The non-linear couplings? Don't know these? Then there's no way to suggest much of anything about the time phased cost and schedule.
  • Knowing the reducible and irreducible uncertainties of the project is the minimal critical success factor for project success.
  • Don't know these? You've doomed the project on day one.

So in the end, any estimate we make in the beginning of the project MUST be updated as the project proceeds. With this past performance data we can make improved estimates of the future performance, as shown below. By the way, when the #NoEstimates advocates suggest using past data (empirical data) and don't apply the statistical assessment of that data to produce a confidence interval for the future estimate (a forecast is an estimate of a future outcome), they have only done half the work needed to inform those paying what the likelihood of the future cost, schedule, or technical performance is.

[Chart: updating estimates of future performance with past performance data]
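One way to do the other half of that work is sketched below: resample (bootstrap) the observed past throughput to produce a forecast with a confidence level rather than a single number. The throughput history and backlog size are invented for the illustration.

```python
import random

def forecast_weeks(remaining_items: int, weekly_throughput_history: list, trials: int = 20_000):
    """Bootstrap past weekly throughput to forecast weeks remaining,
    returning the 50th and 80th percentile outcomes."""
    outcomes = []
    for _ in range(trials):
        done, weeks = 0, 0
        while done < remaining_items:
            done += random.choice(weekly_throughput_history)  # resample history
            weeks += 1
        outcomes.append(weeks)
    outcomes.sort()
    return outcomes[len(outcomes) // 2], outcomes[int(0.80 * len(outcomes))]

history = [4, 6, 3, 5, 7, 2, 5, 4]  # invented: items completed per week
p50, p80 = forecast_weeks(remaining_items=60, weekly_throughput_history=history)
print(f"50% confidence: {p50} weeks; 80% confidence: {p80} weeks")
```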

So Now To The Corrective Actions of The Causes of Project Variance

If we take the 4 root causes in the first chart - courtesy of Mr. Gary Bliss, Director Performance Assessment and Root Cause Analysis (PARCA), let's see what the first approach is to fix these

Unrealistic performance expectations missing Measures of Effectiveness and Measures of Performance

  • Defining the Measures of Performance, the resulting Measures of Effectiveness, and the Technical Performance Measures of the resulting project outcomes is a critical success factor.
  • Along with the Key Performance Parameters, these measures define what DONE looks like in units of measure meaningful to the decision makers.
  • Without these measures, those decision makers and those building the products that implement the solution have no way to know what DONE looks like.

Unrealistic Cost and Schedule estimates based on inadequate risk adjusted growth models

  • Here's where estimating comes in. All project work is subject to uncertainty. Reducible (Epistemic) uncertainty and Irreducible (Aleatory) uncertainty.
  • Here's how to Manage in the Presence of Uncertainty.
  • Both these cause risk to cost, schedule, and technical outcomes.
  • Determining the range of possible values for aleatory and epistemic uncertainties means making estimates from past performance data or parametric models.

Inadequate assessment of risk and unmitigated exposure to these risks without proper handling plans

  • This type of risk is held in the Risk Register.
  • This means making estimates of the probability of occurrence, the probability of impact, the probability of the cost to mitigate, the probability of any residual risk, and the probability of the impact of this residual risk (a sketch of this arithmetic follows this list).
  • Risk management means making estimates.
  • Risk management is how adults manage projects. No risk management, no adult management. No estimating, no adult management.
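A minimal sketch of the arithmetic a single risk register entry carries (all probabilities and dollar figures are illustrative assumptions):

```python
def risk_exposure(p_occur: float, impact: float,
                  mitigation_cost: float, p_residual: float, residual_impact: float):
    """Expected exposure for one risk register entry, unmitigated vs. mitigated.

    Unmitigated exposure is probability times impact; the mitigated case is the
    mitigation cost plus the expected cost of the residual risk.
    """
    unmitigated = p_occur * impact
    mitigated = mitigation_cost + p_residual * residual_impact
    return unmitigated, mitigated

# Illustrative: 30% chance of a $500k impact; $60k of mitigation leaves a
# 10% chance of a $100k residual impact.
u, m = risk_exposure(0.30, 500_000, 60_000, 0.10, 100_000)
print(f"Unmitigated expected exposure: ${u:,.0f}")
print(f"Mitigation cost plus residual exposure: ${m:,.0f}")
```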

Unanticipated Technical issues with no alternative plans and solutions to maintain effectiveness

  • Things go wrong; it's called development.
  • When things go wrong, where's Plan B? Maybe even Plan C.

When we hear we can't estimate, planning is hard or maybe not even needed, we can't forecast the future, let's ask some serious questions.

  • Do you know what DONE looks like in meaningful units of measure?
  • Do you have a plan to get to Done when the customer needs you to, for the cost the customer can afford?
  • Do you have the needed resources to reach Done for the planned cost and schedule?
  • Do you know something about the risk to reaching Done and do you have plans to mitigate those risks in some way?
  • Do you have some way to measure physical percent complete toward Done, again in units meaningful to the decision makers, so you can get feedback (variance) from your work to take corrective actions to keep the project going in the right direction?

The answers should be YES to these Five Immutable Principles of Project Success

If not, you're late, over budget, and have a low probability of success on Day One.

†NRO Cost Group Risk Process, Aerospace Corporation, 2003

Related articles Applying the Right Ideas to the Wrong Problem Herding Cats: #NoEstimates Book Review - Part 1 Some More Background on Probability, Needed for Estimating A Framework for Managing Other Peoples Money Are Estimates Really The Smell of Dysfunction? Five Estimating Pathologies and Their Corrective Actions Qualitative Risk Management and Quantitative Risk Management
Categories: Project Management

Range of Domains in Software Development

Tue, 08/16/2016 - 17:45

Once again I've encountered a conversation about estimating where there was a broad disconnect between the world I work in - Software Intensive System of Systems - and our approach to Agile software development, and someone claiming things that would be unheard of here.

Here's a briefing I built to sort out where on the spectrum you are before proceeding further, since what works in your domain may actually be forbidden in mine.

So when someone starts stating what can or can't be done, what can or can't be known, what can or can't be a process - ask what domain do you work in?

Paradigm of agile project management from Glen Alleman

Related articles Agile Software Development in the DOD How Think Like a Rocket Scientist - Irreducible Complexity Herding Cats: Value and the Needed Units of Measure to Make Decisions The Art of Systems Architecting Complex Project Management
Categories: Project Management

Value and the Needed Units of Measure to Make Decisions

Sun, 08/14/2016 - 20:18

For some reason the notion of value is a big mystery in the agile community. Many blogs, tweets, and books are spent speaking about Value as the priority in agile software development.

We focus on Value over cost. We produce Value at the end of every Sprint. Value is the most important aspect of Scrum-based development.

Without units of measure of Value beyond time and money, there can be no basis of comparison between one value based choice and another.

In the Systems Engineering world where we work, there are four critical units of measure for all we do.

  • Measures of Effectiveness - these are operational measures of success that are closely related to the achievements of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions.
    • MOE's are stated in units of measure meaningful to the buyer
    • They focus on capabilities independent of any technical implementation
    • They are connected with mission success
    • MOE's belong to the End User
  • Measures of Performance - characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
    • MOP's are attributes that assure the system's capability to perform
    • They are an assessment of the system to assure it meets the design requirements to satisfy the Measures of Effectiveness
    • MOP's belong to the project
  • Technical Performance Measures - are attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal
    • TPMs assess the design process
    • They define compliance¬†to performance requirements
    • They identify technical risk
    • They are limited to critical thresholds, and
    • They include projected performance
  • Key Performance Parameters - represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessing, or termination of the program
    • KPP's have a threshold or objective value
    • They characterize the major drivers of performance
    • The buyer defines the KPP's during the operational concept development process - KPP's say what DONE looks like

So when you read about value and don't hear about the units of measure of Effectiveness and Performance and their TPM's and KPP's, it's going to be hard to have any meaningful discussion about the return on investment for the cost needed to produce that value.

Here's how these are related...

[Diagram: how Measures of Effectiveness, Measures of Performance, TPM's, and KPP's are related]

Related articles Capabilities Based Planning First Then Requirements Systems Thinking, System Engineering, and Systems Management What Can Lean Learn From Systems Engineering?
Categories: Project Management

Large Programs Require Special Processes

Wed, 08/03/2016 - 15:51

Acquisition Category 1 (ACAT 1) programs are large - large is greater than $5B, yes Billion with a B. This Integrated Program Management Conference (IPMC) 2013 presentation addresses the issues of managing these programs in the presence of uncertainty.

Like all projects or programs, uncertainty comes in two forms - Irreducible (Aleatory) and Reducible (Epistemic). Both these uncertainties create risk. Both are present no matter the size - from small to mega. When we hear estimates are hard, we're not good at estimating, estimates are possibly misused, or any other dysfunction around estimating, those are just symptoms of the problem. Trying to fix the symptom does little to actually fix the problem. Find the root cause, fix that, and the symptom then has a probability of going away.

This may look like a unique domain, but the core principles of managing in the presence of uncertainty are immutable.

Forecasting cost and schedule performance from Glen Alleman
Categories: Project Management

Agile for Large Scale Government Programs

Wed, 08/03/2016 - 01:32

It would seem counterintuitive to apply Agile (Scrum) to large Software Intensive System of Systems. But it's not. Here's how we do it with success.

Agile in the government from Glen Alleman

Related articles The Microeconomics of a Project Driven Organization GAO Reports on ACA Site All Project Work is Probabilistic Work
Categories: Project Management

Why We Don't Need to Question Everything

Tue, 08/02/2016 - 16:15

[Image: Tilting at Windmills]

It's popular in some agile circles to question everything. This raises the question - is there any governance process in place? No? Then you're pretty much free to do whatever you want with the money provided to you to build software. If there is a Governance process in place, then that means there are decision rights in place as well. These decision rights almost always belong to the people providing the money for you to do your work.

Questioning those governance processes and questioning the principles, processes, and procedures that implement the governance processes usually starts with the owners of the governance process. If there is a mechanism for assessing the efficacy of the governance, that's where the questioning starts. Go find that place, put in your suggestions for improvement, become engaged with the Decision Rights Owners, and then provide your input.

Standing outside the governance process shouting challenge everything is tilting at windmills.

So when you hear that phrase, ask do you have the right to challenge the governance process?

Related articles Planning is the basis of decision making in the presence of uncertainty What is Governance? Why We Need Governance
Categories: Project Management

Systems, Systems Engineering, Systems Thinking

Fri, 07/29/2016 - 15:27

On our morning road bike ride, the conversation came around to Systems. Some of our group are, like me, techies; a few others are business people in finance and ops. The topic was what's a system and how does that notion impact our world. The retailer in the group had a notion of a system - grocery stores are systems that manage the entire supply chain from field to basket.

Here's my reading list that has served me well for those interested in Systems

  • Systems Engineering: Coping with Complexity, Richard Stevens, Peter Brook, Ken Jackson, Stuart Arnold
  • The Art of Systems Architecting, Mark Maier and Eberhardt Rechtin
  • Systems Thinking: Coping with 21st Century Problems, John Boardman and Brian Sauser
  • Systemantics: How Systems Work and Especially How They Fail, John Gall
  • The Art of Modeling Dynamic Systems: Forecasting for Chaos, Randomness and Determinism, Foster Morrison
  • Systems Thinking: Building Maps for Worlds of Systems, John Boardman and Brian Sauser
  • The Systems Bible: The Beginner's Guide to Systems Large and Small, John Gall
  • A Primer for Model-Based Systems Engineering, 2nd Edition, David Long and Zane Scott
  • Thinking in Systems: A Primer, Donella Meadows

These are all actionable outcomes books. 

Systems of information-feedback control are fundamental to all life and human endeavor, from the slow pace of biological evolution to the launching of the latest space satellite ... Everything we do as individuals, as an industry, or as a society is done in the context of an information-feedback system. - Jay W. Forrester

Related articles Systems Thinking, System Engineering, and Systems Management Estimating Guidance Can Enterprise Agile Be Bottom Up? Essential Reading List for Managing Other People's Money Systems Thinking and Capabilities Based Planning Herding Cats: Systems, Systems Engineering, Systems Thinking What Can Lean Learn From Systems Engineering?
Categories: Project Management

Assessing Value Produced in Exchange for the Cost to Produce the Value

Fri, 07/29/2016 - 04:56

A common assertion in the Agile community is we focus on Value over Cost.

Both are equally needed. Both must be present to make informed decisions. Both are random variables. As random variables, both need estimates to make informed decisions.

To assess the value produced by the project we first must have targets to steer toward. A target Value must be measured in units meaningful to the decision makers - Measures of Effectiveness and Performance that can monetize this Value.

Value cannot be determined without knowing the cost to produce that Value. This is fundamental to the Microeconomics of Decision making for all business processes.

The Value must be assessed using...

  • Measures of Effectiveness - an Operational measure of success that is closely related to the achievements of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions.
  • Measures of Performance - a Measure that characterizes physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
  • Key Performance Parameters - a Measure that represents the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessing, or termination of the program.
  • Technical Performance Measures - Attributes that determine how well a system or system element satisfies or is expected to satisfy a technical requirement or goal.

Without these measures attached to the Value there is no way to confirm that the cost to produce the Value will break even. The Return on Investment to deliver the needed Capability is, of course,

ROI = (Value - Cost)/Cost

So the numerator and the denominator must have the same units of Measure. This can usually be dollars. Maybe hours. So when we hear ...

The focus on value is what makes the #NoEstimates idea valuable - ask in what units of measure is that Value? Are those units of measure meaningful to the decision makers? Are those decision makers accountable for the financial performance of the firm?
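A minimal sketch of the ROI arithmetic above, with Value monetized in the same units as Cost (the dollar figures are illustrative only):

```python
def roi(monetized_value: float, cost: float) -> float:
    """ROI = (Value - Cost) / Cost, with Value and Cost in the same units."""
    return (monetized_value - cost) / cost

# Illustrative: a capability monetized at $900k, delivered for $600k.
print(f"ROI = {roi(900_000, 600_000):.0%}")  # 50%
```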

 

Related articles The Reason We Plan, Schedule, Measure, and Correct Estimating Processes in Support of Economic Analysis The Microeconomics of Decision Making in the Presence of Uncertainty
Categories: Project Management

Quote of the Day

Fri, 07/29/2016 - 04:54

If you can't explain what you are doing as a process, then you don't know what you are doing - Deming

Process is the  answer to the question How do we do things around here? All organizations should have a widely accepted Process for making decisions.  "A New Engineering Profession is Emerging: Decision Coach," IEEE Engineering Management Review, Vol. 44, No. 2, Second Quarter, June 2016

Related articles Plan Management Three Increasingly Mature Views of Estimate Making in IT Projects What's in a Domain?
Categories: Project Management

Quote of the Day

Fri, 07/29/2016 - 04:53

Without a Plan, you're making it up as you go. If you make it up as you go, it's the basis of getting it horribly wrong - Salman Rushdie

Related articles Architecture -Center ERP Systems in the Manufacturing Domain IT Risk Management Why Guessing is not Estimating and Estimating is not Guessing
Categories: Project Management

The Myth of "Discover by Doing"

Wed, 07/27/2016 - 13:58

There is a popular Agile and No Estimates phrase...

It is by doing the work we discover the work we must do. 

This of course ignores the notion of engineering  or designing a solution to the needed Capabilities of the customer, BEFORE starting coding. It is certainly the case that some aspects of the software solution can only be confirmed when the working software is available for use. But like all good platitudes in the agile community, there is no domain or context as to where this phrase is applicable. Where can coding start before there is some sort of framework for how the needed capabilities will be delivered? 

  • Shall we just start coding and see what comes out?
  • How about just buying a COTS product and start installing it to see if it is going to meet our needs?

This sounds not only naive, but sounds like we're wandering around looking for a solution without any definition of what the problem is. When that is the approach, the solution may not be recognized as the solution when it appears. Agile is certainly the basis of dealing with emerging requirements. But all good agile processes have some sense of what the customer is looking for.

This understanding of what capabilities the customer needs starts with a Product Roadmap. The Product Roadmap is a plan that matches short-term and long-term business goals with specific technology solutions to help meet those goals.

A Plan is a Strategy for success. All strategies have a hypothesis. A Hypothesis need to be tested. This is what working software does. It tests the hypothesis of the Strategy described in the Product Roadmap.

So if you have to do the work to discover what work must be done, you've got an Open Loop control system. To close the loop this emergent work needs to have a target to steer toward. With this target the working software can be compared to the desired working software. The variance between the two used to take corrective actions to steer toward the desired goals.

And of course, since the steering target (goal) and the path to this goal are both random variables - estimates will be needed to close the loop of the control processes used to reach the desired outcomes that meet the Capabilities requested by the customer.

Related articles Mr. Franklin's Advice Root Cause of Project Failure Herding Cats: The Myth of "Discover by Doing" Estimating and Making Decisions in Presence of Uncertainty
Categories: Project Management

Quote of the Day

Thu, 07/21/2016 - 19:03

A skeptic will question claims, then embrace the evidence. A denier will question claims, then reject the evidence. - Neil deGrasse Tyson

Think of this whenever there is a conjecture that has no testable evidence of the claim. And think ever more when those making the conjectured claim refuse to provide evidence. When that is the case, it is appropriate to ignore the conjecture altogether.

Categories: Project Management

Quote of the Day

Tue, 07/19/2016 - 15:08

Where You Stand Depends On Where You Sit

This notion is the basis of all good discussions. It is also the basis of discussions that get us in trouble. For example, I sit in a FAR 34.2/DFARS 234.2 Federal Procurement paradigm and a similar paradigm for Enterprise IT based in ISO 12207, ITIL V3.1, CMMI, or similar governance processes.

Both these domains are guided by a Governance framework for spending other people's money, planning for that spend, performing the work with that money, reporting the progress to plan for that spend to produce the needed value, and taking corrective actions when the outcomes don't match the plan to increase the probability that the needed value (Capabilities) will be delivered for the planned cost to keep the Return On Investment on track.

This paradigm is independent of the software development method - traditional or agile.

If you work where the customer has a low need to know the cost, schedule, or what will be produced for the money, then you likely sit somewhere else. 

Categories: Project Management

Symptoms versus Root Causes

Mon, 07/18/2016 - 01:01

There is a popular graph showing project performance versus the estimated project performance in "Schedule Estimation and Uncertainty Surrounding the Cone of Uncertainty," Todd Little, IEEE Software, May/June 2006.

[Chart: project performance versus estimated performance, from Little, IEEE Software 2006]

This chart (above)  shows data from samples of software development projects and is used by many in the agile community and by #NoEstimates advocates to conjecture that estimates are usually wrong. In fact estimates can easily be wrong for many reasons. But knowing why they are wrong is rarely the outcome of the discussion. 

This approach is missing a critical understanding about assessing the effectiveness of any estimating process. It starts with the notion of an ideal estimate. That is a post hoc assessment of the project. The ideal estimate is only known after the project is over.

Next is another critical issue. 

Did the projects (in the graph above) overrun the estimate because the estimate was wrong or because the project was executed wrongly?

In our domain of complex projects, many of which are Software Intensive System of Systems, there are four primary Root Causes of Unanticipated Cost and Schedule Growth, shown below.

[Chart: four primary Root Causes of Unanticipated Cost and Schedule Growth]

The top graph shows samples from the programs in the business. But it does not show WHY those programs ended up greater than the initial estimates. The graph is just a collection of data after the fact. What was the Root Cause of this unanticipated cost (and schedule) growth? The paper goes on to speak about an Estimating Quality Factor, but that is only one of the four core Root Causes of cost growth from the PARCA research. As well, the paper mentions other causes of growth, similar to some PARCA causes.

But each project in the graph is not assigned a specific (maybe even more than one) cause. Without that assignment it is not possible to determine if estimating was the Root Cause, or as stated above one or more of the three other causes was the reason the project was above the Ideal line. 

In the notion of Ideal I will assume the estimate was Ideal if the project actuals matched the estimate.

The problem here is those advocating estimates are flawed, a waste, not needed, do harm, or are even evil use this chart to support their claim. Without stating what the Cause for the deviation from the Ideal is. Just that it IS. Not Why. There is no basis to make any claims about the issues with estimating. 

Without the Why, NO corrective action can be taken to improve the performance of either the project or any of the four listed (and many other) processes, including the estimating process for the project. This is the fundamental basis of Root Cause Analysis. And it comes down to this ...

Without the Root Cause for the undesirable outcome, no corrective action can be taken. You're just treating the Symptom. Those conjecturing a Cause (say Estimating is the smell of Dysfunction) have no basis on which to make that statement. It's a house built on sand. Same for the Cone of Uncertainty - also popularly misused, since the vertical scale is rarely calibrated for the specific domain; instead some broad and uncalibrated range is provided for notional purposes.

And as a notional concept the Cone of Uncertainty is useful. As a practical concept, only when the vertical scale is calibrated (Cardinal versus Ordinal) can it be used to assess the uncertainty of the estimates during specific periods of the project. This is another misuse of statistical data analysis.
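One way such a calibration might be done (a sketch only, with invented data): collect the actual-to-estimate ratios of completed projects for estimates made at a given point in the life cycle, then use the percentiles of those ratios as the band for that point, instead of a notional cone.

```python
def calibrated_band(actual_over_estimate: list, low_pct: float = 0.20, high_pct: float = 0.80):
    """Percentile band of actual/estimate ratios from completed projects,
    usable as a calibrated uncertainty range at one estimation point."""
    ratios = sorted(actual_over_estimate)
    n = len(ratios)
    return ratios[int(low_pct * n)], ratios[int(high_pct * n)]

# Invented ratios for estimates made at project sanction (1.0 = on estimate).
sanction = [0.9, 1.1, 1.4, 1.0, 1.8, 1.2, 0.95, 1.6, 1.3, 1.05]
low, high = calibrated_band(sanction)
print(f"Calibrated 20/80 band at sanction: {low:.2f}x to {high:.2f}x of the estimate")
```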

There are databases that provide information needed to calibrate that chart as well. But that's another post.

For now, take care when seeing the first chart, to ask: do you know the cause for the projects that are above the Ideal Line? No? Then all you can say is we don't know why, but there are a lot of projects in our organization that have a Symptom of cost overrun, and we have no way to know why. And therefore we have no way to know how to fix this symptom. Just that we've observed it. But can't fix it.

To start to apply Root Cause Analysis, here is an introduction to the method we use on our Software Intensive Systems.

Root cause analysis master plan from Glen Alleman

 

 

Categories: Project Management

A collection of #NoEstimates Responses

Sun, 07/10/2016 - 19:40

The question of the viability of #NoEstimates starts with a simple principles based notion.

Can you make a non-trivial (NOT de minimis) decision in the presence of uncertainty without estimating?

The #NoEstimates advocates didn't start there. They started with "Estimates are a waste, stop doing them." Those advocates also started with the notion that estimates are a waste for the developers. Not considering that those who pay their salary have a fiduciary obligation to know something about cash demands and profit resulting from that decision in the future.

The size of the "value at risk" is also the starting point for estimates. If the project is small (de minimis), meaning if we overrun significantly no one cares, then estimating is likely a waste as well. The size of the project, whether small or multi-millions, doesn't influence the decision to estimate. The decision is determined by "value at risk," and that is determined by those paying, NOT by those consuming. So the fact I personally work on larger projects does not remove the principle of "value at risk." Any client's (internal or external) V@R may be much different than my personal experience - but it's not our money.

Next comes the original post from Woody - "you can make decisions with No Estimates." If we are having a principles-based conversation (which NE'ers don't), then that statement violates the principles of Microeconomics. When making decisions in the presence of uncertainty (and I'm assuming all projects of interest have uncertainty), estimates are needed to make decisions. In MicroEcon those decisions are based on the Opportunity Cost, and since making the best choice for the project involves assessing the probabilistic outcomes of those choices, estimating is required.

Real options is a similar process in IT based on estimating. Vasco stated long ago he was using RO along with Bayesian Decision Making. I suspect he was tossing around buzz words without knowing what they actually mean.

From the business side, the final principle is Managerial Finance. This is the basis of business management of its financial assets. The balance sheet is one place these are found. Since the future returns from the expenses of today and the "potential" expenses of the future are carried in that balance sheet, estimating is needed there as well for the financial well-being of the firm.

These three principles - Value at Risk, the MicroEconomics of Decision Making, and Managerial Finance - are ignored by the NE advocates when they start with the conjecture that "decisions can be made without estimates," and continue on with "estimates are a waste of developers' time, they should be coding not estimating."

It's the view of the world that, as a developer, "it's all about me." Never looking at their paycheck and asking where the money came from. That's a common process and one I went through when I started my career 35 years ago as a FORTRAN developer for Missile Defense radar systems, and our boss had us take out our paychecks (a paper check in those days) and look at the upper left hand corner. "That money doesn't come from the Bank of America, El Segundo Blvd, Redondo Beach, it comes from the US Air Force. You young pups need to stay on schedule and make this thing work as it says in our contract."

In the end, the NE conversation can be about the issues in estimating - and there are many, and Steve McConnell speaks to those. I work large federal acquisition programs - IT and embedded systems. And many times the "over target baseline" root cause is from "bad estimating." But the Root Cause of those bad estimates is not corrected by Not Estimating, as #NoEstimates would have us believe.

As posted on this blog before and sourced from the Director of "Performance Assessment and Root Cause Analysis," unanticipated growth in cost has 4 major sources:

1. Unrealistic performance expectations - that will cost more money.
2. Unrealistic cost and schedule estimates - the source of poor estimating.
3. Inadequate assessment of risk and unmitigated risk exposure.
4. Unanticipated technical issues.

Research where I work sometimes (www.ida.org) has shown these are core to the problems of cost overruns in nearly every domain from ERP to embedded software intensive systems. It is unlikely those 4 root causes are not universal.

So what's #NoEstimates trying to fix? They don't say, just "exploring new ways." In what governance frameworks? They don't say. They don't actually say much of anything, just "estimates are waste, stop doing them and get back to coding."

As my boss in 1978 reminded us newly minted Master's degreed coders, "it ain't your money, it's the USAF's money, act accordingly - give me an estimate of when this thing you're building can be used to find SS-9's coming our way." Since then I've never forgotten his words: "business (in that case TRW) needs to know how much, when, and what, if it's going to stay in business."

 

Categories: Project Management