
Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Processes, and Practices to Increase Probability of Project Success

Quote of the Day

Sun, 06/25/2017 - 18:48

Great spirits have always found violent opposition from mediocrities. The latter cannot understand it when a man does not thoughtlessly submit to hereditary prejudices but honestly and courageously uses his intelligence. - Albert Einstein, quoted in The Tao of Systems Engineering: An Engineer's Survival Guide, Ronald Paul Sherwin (Kindle Locations 324-326)

Categories: Project Management

Quote of the Day

Sat, 06/24/2017 - 15:39

As to methods, there may be a million and then some, but principles are few. The man who grasps principles can successfully select his own methods. The man who tries methods, ignoring principles, is sure to have trouble. - Harrington Emerson

from an advertisement for The Book of Business, in Colliers: National Weekly, 1917 

When we hear a claim or conjecture that some method fixes some named dysfunction, and that claim or conjecture clearly violates established principles, one of several outcomes can occur. Ask: what do you mean, or what principle informs your conjecture here? Many times this will be the start of a dialogue on the topic, and learning on both sides can begin. Other times, those being asked that question take offense at being asked to explain their position. In the latter case, it is best to move on, since willfully ignoring the principles that form the basis of the discussion is not going to lead to a shared exchange of ideas.

 

Categories: Project Management

Quote of the Day

Fri, 06/23/2017 - 18:32

"We trained hard, but it seemed that every time we were beginning to form up into teams, we would be reorganized. I was to learn later in life that we tend to meet any new situation by reorganizing; and a wonderful method it can be for creating the illusion of progress while producing confusion, inefficiency, and demoralization." - Petronius Arbiter (210 B.C.)

Categories: Project Management

Just a Reminder About Estimating Software Projects

Fri, 06/23/2017 - 17:52

Here's a clear and concise discussion of the estimating topic.

And just a reminder for making decisions in the presence of uncertainty.

There is NO means of making credible decisions in the presence of uncertainty without first estimating the outcome of that decision.

Categories: Project Management

Decisions Without Estimates?

Thu, 06/22/2017 - 20:52

A question was posted at an agile conference: can you make a decision without an estimate? Like many discussions in the agile domain, the claim is made without any evidence that it is true, or that it can even be true in principle. This type of fallacy is common.

First a principle ...

There is NO means of making credible decisions in the presence of uncertainty without first estimating the outcome of that decision. This is a foundation principle of the economics of probabilistic decision making. To suggest you can make such a decision without estimates  willfully ignores that principle.

Decisions without estimates

Let's look at each of these from the point of view of Managerial Finance and Economics of Decision Making in the presence of Uncertainty. These two points of view are the basis of any credible business management process.

It's not your money, spend it wisely

First, let's establish a singular framing assumption

  • All project work is uncertain - this uncertainty comes in two types: Reducible (Epistemic) and Irreducible (Aleatory).
  • These uncertainties are probabilistic and statistical, respectively.
  • Knowing the behaviors of these uncertainties - the probabilistic and statistical models - is part of deciding what to do.
  • The impact on the project from these probabilities is also probabilistic. Knowing this impact requires assessing the probabilities.

If you want to know the probability of occurrence of some Epistemic uncertainty, or the statistical process of some Aleatory activity, you need to estimate. Don't guess. Don't assume. Estimate. This process is the basis of all risk management. And Risk Management is How Adults Manage Projects - Tim Lister.
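
A minimal sketch of what this looks like in practice - the budget, distributions, and risk probability below are illustrative assumptions, not data from any real project. It estimates the outcome of a "build this feature" decision by simulating both the aleatory variation in cost and an epistemic risk event, then reports the confidence that the work fits the budget.

```python
import random

TRIALS = 10_000
BUDGET = 50_000.0  # assumed budget for the decision

def simulated_cost() -> float:
    # Aleatory (irreducible) variation: labor cost naturally varies;
    # modeled here with a triangular distribution (low, high, most likely).
    labor = random.triangular(30_000, 70_000, 42_000)
    # Epistemic (reducible) uncertainty: an assumed 20% chance that a
    # licensing issue adds rework; buying knowledge could reduce this.
    rework = 15_000 if random.random() < 0.20 else 0.0
    return labor + rework

costs = [simulated_cost() for _ in range(TRIALS)]
p_within_budget = sum(c <= BUDGET for c in costs) / TRIALS

print(f"P(cost <= ${BUDGET:,.0f}) = {p_within_budget:.0%}")
# Decision input: proceed only if this confidence meets the threshold
# set by those paying for the work.
```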

So with this in mind let's look at the conjectured process that can be performed on projects without estimating in the presence of uncertainty.

  • Longest Entry Barrier First - so how would you know which is the longest without making an estimate, since that longest is likely a probabilistic number in the future? If you know the longest up front, it's already happened.
  • Prototyping First - OK, which feature do we prototype? How much effort does the prototype take? What are we expecting to learn from the prototype before we start spending the customer's money?
  • Strategic Items First - strategies are hypotheses. Hypotheses have uncertainties. Those uncertainties impact the validation of the strategy. How can you assess the strategy without making estimates of the impact of the outcome of the hypothesis?
  • Customer Oriented First - does the customer have absolute confidence in what comes first? Does the customer operate in the presence of uncertainty?
  • High Risk First - all risk comes from uncertainty: Epistemic uncertainty and Aleatory uncertainty. No decision can be made in the presence of risk - derived from its uncertainties - without estimating the impact of that risk, the cost to mitigate that risk, and the residual risk after mitigation. Risk Management is How Adults Manage Projects - Tim Lister. Be an adult, manage the project as a Risk Manager. Be an adult, estimate the impact of the risk on the probability of success.
  • Management Decision - OK, management decides. Are there any uncertainties resulting from that decision? No? Proceed. Yes? Is there any estimate of the impact of that decision on the probability of success? Just because management decided does not remove the uncertainty, unless that uncertainty has been analyzed.
  • Reducing Cost First - how much cost, from what sources, and what's the probability that the cost reduction will be effective?
  • Minimal Scope First - how do you know what minimal is in the presence of uncertainty without estimating?
  • POC to Decide - same as Management Decision.

There is NO principle by which you can make a credible decision in the presence of uncertainty (reducible and irreducible) without estimating the impact of that decision on the probability of success of your project.

 

Categories: Project Management

Quote of the Day

Thu, 06/22/2017 - 16:29

Management of Software Development projects is not like a Rocky movie, where strong desire overcomes lack of skill, experience, and talent.

Skill, experience, tools, principles, processes, practices, and talent are all needed to increase the probability of success. Estimates are rarely for those spending the money; they are for those paying for the Value produced by those spending the money. Not understanding this, not seeking the highest possible level of each of these, or tolerating less than effective application of them, will result in disappointing results for those paying for and those developing the software system.

Categories: Project Management

Book of the Month

Thu, 06/22/2017 - 15:22

The Death of Expertise describes how established knowledge is challenged and why this is critical in our modern society.

With nearly unlimited access to information - the information age - we've become narcissistic and misguided intellectual egalitarians.

With everything from WebMD to Wikipedia at hand, normal people are now experts.

Any idea now demands to be taken seriously. Unsubstantiated claims have the same visibility as tested theories and evidence-based principles, processes, and practices.

In our software development domain, anyone with a blog and a Twitter account can make statements. Present company included.

The trouble is fact checking these claims takes effort.

With the right tag lines, those making bogus claims can collect followers with ease.

This book is a rearguard action on behalf of those who actually know what they are talking about.

Categories: Project Management

Quote of the Day

Wed, 06/21/2017 - 15:33

While management and leadership are related and often treated as the same, their central functions are different. Managers clearly provide some leadership, and leaders obviously perform some management.
However, there are unique functions performed by leaders that are not performed by managers. My observation over the past forty years ... is that we develop a lot of good managers, but very few leaders.
Let me explain the difference in functions that they perform:

  • A manager takes care of where you are; a leader takes you to a new place.
  • A manager is concerned with doing things right; a leader is concerned with doing the right things.
  • A manager deals with complexity; a leader deals with uncertainty.
  • A manager creates policies; a leader establishes principles.
  • A manager sees and hears what is going on; a leader hears when there is no sound and sees when there is no light.
  • A manager finds answers and solutions; a leader formulates the questions and identifies the problems.

- James E. Colvard

from "Systems Engineering Newsletter, June 2017, Project Performance International, PO Box 2385, Ringwood North Victoria, 3134, Australia

Categories: Project Management

Misinterpretations of the Cone of Uncertainty

Tue, 06/20/2017 - 20:31

The Cone of Uncertainty is a framing assumption used to model the needed reduction in some parameter of interest in domains ranging from software development to hurricane forecasting.

This extended post covers

  1. The framing assumptions of the Cone of Uncertainty.
  2. The Cone of Uncertainty as a Technical Performance Measure.
  3. Closed Loop Stochastic Adaptive control in the presence of Evolving Uncertainty.

These topics are connected in a simple manner.

All project work is uncertain (probabilistic and statistical). Uncertainty comes in two forms - Epistemic and Aleatory. Uncertainty creates Risk. Risk management requires active reduction of risk. Active reduction requires we have a desired reduction goal, perform the work needed to move the parameter toward the goal - inside the control limits - and measure progress toward the reduction goal. Management of this reduction work and measurement of the progress toward the goals is a Closed Loop Control System paradigm. Closed Loop Control has a goal, an action, a measurement, and a corrective action. These activities, their relationships, and their values are defined in a PLAN for increasing the probability of success for the project. The creation and management of the Plan is usually performed by the Program Planning and Controls group where I work.

Framing Assumptions for the Cone of Uncertainty 

Of late, the Cone of Uncertainty has become the mantra of No Estimates advocates claiming that data is needed BEFORE the Cone is of any use, and that without this data the Cone has no value in the management of the work. This, of course, is NOT the correct use of the Cone. The fallacy comes from a collection of data that did not follow the needed and planned reduction of uncertainty for the cost estimates of a set of software development projects.

The next fallacy of this conjecture is that the root cause of why those projects did not have their uncertainty reduced in some defined and desirable manner was never provided. The result of the observation is a symptom of something. But the cause was not sought in any attributable way. This is a common problem in low maturity development organizations. We blame something for our problem, without finding out the cause of that something, or even whether that something is the actual cause.

In our domain, Root Cause Analysis is mandated before ANY suggested change for improvement, prevention, or corrective action is taken. Our RCA method is the Apollo Method. Apollo is distinctly different from other root cause approaches in that each effect requires both a Condition and an Action. Please read the Apollo book to see why this is critical and how it is done. Then every time you hear some statement about an observed problem, you can ask: what's the cause (both condition and action)?

So when you hear I have data that shows the Cone (or, for that matter, anything) does not follow the expected and desired outcomes, and there is no Root Cause to explain that unexpected outcome - ignore that conjecture until the reason for the observed outcome is found, corrective actions have been identified, and those corrective actions have been confirmed to actually fix the observed problem.

So let's recap what the Cone of Uncertainty is all about from those who have created the concept. Redefining the meaning and then arguing my data doesn't match that is a poor start to improving the probability of project success.

The Cone of Uncertainty describes the uncertainty in the behaviors or measurement of a project parameter and how that uncertainty needs to be reduced as the project proceeds to increase the Probability of Project Success.
If that parameter is NOT being reduced at the planned rate, then the Probability of Success is not increasing in the planned and needed manner.

If the parameter of interest is not being reduced as needed, go find out why and fix it, or you'll be late, over budget, and the technical outcome will be unacceptable. The Cone of Uncertainty does NOT need data to validate that it is the correct paradigm. The Cone of Uncertainty is the framework for improving the needed performance of the project. It's a Principle.

Here's some background on the topic of the Cone of Uncertainty. Each of these can be found with Google.

  1. "Misinterpretations of the 'Cone of Uncertainty' in Florida during the 2004 Hurricane Season," Kenneth Broad, Anthony Leiserowitz, Jessica Weinkle, and Marissa Steketee, American Meteorological Society, May 2007
  2. "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty'," Pongtip Aroonvatanaporn, Chatchai Sinthop, and Barry Boehm
  3. "Shrinking the Cone of Uncertainty with Continuous Assessment for Software Team Dynamics in Design and Development," Pongtip Aroonvatanaporn, Ph.D. Thesis, University of Southern California, August 2012
  4. "Improving Software Development Tracking and Estimation Inside the Cone of Uncertainty," Pongtip Aroonvatanaporn, Thanida Hongsongkiat, and Barry Boehm

The Cone of Uncertainty as a Technical Performance Measure

Technical Performance Measures are one of four measures describing how a project is making progress to plan. These measures - combined - provide insight into the probability of program success.

  • Measure of Effectiveness
  • Measure of Performance
  • Technical Performance Measure
  • Key Performance Parameters

Here's a workshop that we give twice a year at the College of Performance Management's conference. The definition of a TPM is on page 14. 

The reduction of Uncertainty (as in the Cone shape of the Uncertainty) can be considered a Technical Performance Measure. The Planners of the program define the needed uncertainty at a specific point in time, measure that uncertainty at that point in time and assess the impact on the probability of program success by comparing planned versus actual.

This approach is the same for any other TPM - Risk, Weight, Throughput, and even Earned Value Management parameters. This is the basis of the closed loop control system used to manage the program with empirical data from past performance compared to the planned performance at specific points in time.

This Closed Loop Program Control Process provides the decision makers with actionable information. The steering target - the MOE, MOP, TPM, KPP - is defined up front and evolves as the program progresses with new information. The same holds for the uncertainties in the program measures. This approach is also the basis of Analysis of Alternatives and other trades when it is determined that the desired measures cannot be achieved. Decisions are made to adjust the work, adjust the requirements, or adjust the measures.

Closed Loop Control Systems in the Presence of Emerging Stochastic Behaviours

Risk management is how adults manage projects - Tim Lister. All risk comes from uncertainty. Uncertainty comes in two forms - Epistemic (reducible) and Aleatory (irreducible). Here's an overview of how to manage in the presence of uncertainty.

 

Let's look at the Closed Loop Control System paradigm for managing projects. Control systems exist in many domains. From simple fixed setpoint systems like your thermostat controlling the HVAC system, or a PID controller running a cracking column at a refinery, to multi-dimensional evolving target control systems guiding an AIM-9L missile (which I wrote SW for), to multi-target acquisition systems needed to steer midcourse interceptors to their targets, to cloud-based enterprise ERP systems, to autonomous flight vehicles operating in mixed environments, with mission planning while in flight and collision avoidance with manned vehicles - the current generation of swarm UAVs in the battlefield.

A concept that spans all these systems - and is shared with project management control systems (we work in Program Planning and Controls, as a working title) - is the idea of a PLAN.

  • What is the Plan to keep the temperature in the room at 76 degrees? It's a simple plan: run the A/C or heater, measure the current temperature, compare that to the desired temperature, and adjust appropriately - all the way up to ...
  • An evolving battle management system, where the control loop for individual vehicles interacts with the control loops of other vehicles (manned and unmanned) as the mission, terrain, and weather evolve.
  • What are the units of measure for this Plan? What are the probabilistic and statistical behaviors of these units of measure? How can we measure the progress of the project toward compliance with these measures in the presence of uncertainty?
  • How can the variance analysis of these measures be used to take corrective actions to keep the program GREEN?

All these control systems share the same framing assumptions - there is a goal, the needed capabilities to accomplish that goal are known, and there is a PLAN by which that goal can be accomplished. That PLAN may evolve - with manual intervention or with autonomous intervention - as the Mission evolves and the situation evolves.

Project work is a complex, adaptive, stochastic process with evolving Goals, resources, and situations, all operating in the presence of uncertainty. [Figure: the moving parts of any non-trivial project.]
In the control system for project management, like the control system for those non-trivial examples above - the PLAN for the behavior of the parameter of interest is the starting point. The feedback (closed loop feedback) of the variance between the desired Value and the Actual Value at the time of the measurement is connected with other information on the program to define the corrective actions needed to Keep the Program Green.

Like control systems in automation with closed loop control, a project control system must be implemented that controls cost, ensures technical performance, and manages schedule. All successful control systems require measurements. All closed loop control systems have a plan to steer toward - a temperature for a simple thermostat, all the way up to a target cost, schedule, and technical performance for a complex software project. In the project management world, those measurements (inputs/calculated values) are called metrics, and the greater the frequency of measurement (up to a point - the Nyquist sample rate), the more accurate the control. All measurements need to be compared to an expectation - the Set Point. When deviations are noticed, action is required to modify the processes that produce the output - the product or service produced by the project.
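
A minimal sketch of the Set Point idea, using the thermostat analogy and assumed numbers. The loop samples the controlled variable, compares it to the set point, and triggers a corrective action when the deviation exceeds a tolerance - the same measure, compare, and act cycle a project control system performs with its metrics.

```python
SET_POINT = 76.0      # the desired value - the "plan" to steer toward
TOLERANCE = 1.0       # acceptable deviation before corrective action
SAMPLES = 8           # how often we measure drives how soon we can act

def read_temperature(t: int) -> float:
    # Stand-in for a real sensor: the room drifts warmer over time.
    return 74.0 + 0.8 * t

for t in range(SAMPLES):
    measured = read_temperature(t)
    deviation = measured - SET_POINT
    if abs(deviation) <= TOLERANCE:
        action = "no action"
    elif deviation > 0:
        action = "cooling on"   # corrective action steering back to the set point
    else:
        action = "heating on"
    print(f"t={t}: measured={measured:.1f}  deviation={deviation:+.1f}  {action}")
```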

Here are some resources on closed loop control systems for project management, based on following the plan for the performance of the Parameter of Interest:

  1. "Feedback Control in Project-Based Management," L. Scibile,¬†ST Division ‚Äď Monitoring and Communication Group (ST/MC) CERN, Geneva, Switzerland
Back to the Cone of Uncertainty

The notion of the Cone of Uncertainty has been around for a while - see Barry Boehm's work in "Software Engineering Economics," Prentice-Hall, 1981.

But first, let's establish a framing assumption. When you hear of projects where uncertainty is not reduced as the project progresses, ask a simple question: why is this the case? Why, as the project progresses - with new information, delivered products, and reduced risk - is the overall uncertainty not being reduced? Go find the root cause of this before claiming uncertainty doesn't reduce. Uncertainty, as a principle, should be reducing on all projects through the direct action of Project Management. If uncertainty is not reducing, the cause may be bad management, an out-of-control project, or you're working in a pure research world where things like that happen.

As well, never measure any project parameter as a ratio alone. Relative numbers - ordinal values - are meaningless when making decisions. Relative uncertainty is one example: relative to what? Cardinal numbers are the measures used to make decisions.

So a quick review again. What is the Cone of Uncertainty?

  • The Cone is a project management framework describing the uncertainty aspects of estimates (cost and schedule) and other project attributes (cost, schedule, and technical performance parameters). Estimates of cost, schedule, and technical performance on the left side of the cone have a lower probability of being precise and accurate than estimates on the right side of the cone. This is due to many reasons. One is the level of uncertainty early in the project - the Aleatory and Epistemic uncertainties that create risk to the success of the project. Other uncertainties that create risk include:
    • Unrealistic performance expectations with missing Measures of Effectiveness and Measures of Performance
    • Inadequate assessment of risks and unmitigated exposure to these risks without proper handling plans
    • Unanticipated technical issues without alternative plans and solutions to maintain effectiveness
  • Since all project work contains uncertainty, reducing this uncertainty - which reduces risk - is the role of the project team and their management: either the team itself, the Project or Program Manager, or, on larger programs, the Risk Management owner.

Here's another simple definition of the Cone of Uncertainty: 

The Cone of Uncertainty describes the evolution of the measure of uncertainty during a project. For project success, uncertainty not only must decrease over time, but must also diminish in its impact on the project's outcome. This is done by active risk management, through probabilistic decision-making. At the beginning of a project, there is usually little known about the product or work results. Estimates are needed but are subject to a large level of uncertainty. As more research and development is done, more information is learned about the project, and the uncertainty then decreases, reaching 0% when all risk has been mitigated or transferred. This usually happens by the end of the project.

So the question is: how much variance reduction needs to take place in the project attributes (risk, effectiveness, performance, cost, schedule - shown below), and at what points in time, to increase the probability of project success? This is the basis of Closed Loop Project Control. Estimates of the needed reduction of uncertainty, estimates of the possible reduction of uncertainty, and estimates of the effectiveness of these reduction efforts are the basis of the Closed Loop Project Control System.

This is the paradigm of the Cone of Uncertainty - it's a planned development compliance engineering tool, not an after-the-fact data collection tool.

The Cone is NOT the result of the project's past performance. The Cone IS the planned boundaries (upper and lower limits) of the needed reduction in uncertainty (or other performance metrics) as the project proceeds. When actual measures of cost, schedule, and technical performance are outside the planned cone of uncertainty, corrective actions must be taken to move those uncertainties back inside the cone if the project is going to meet its cost, schedule, and technical performance goals.

If your project's uncertainties are outside the planned boundaries at the time when they should be inside the planned boundaries, then you are reducing the probability of project success.

The measures that are modeled in the Cone of Uncertainty are the quantitative basis of a control process that establishes the goal for the performance measures. Capturing the actual performance, comparing it to the planned performance, and checking compliance with the upper and lower control limits provides guidance for making adjustments that keep the variables performing inside their acceptable limits.
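
Here is a minimal sketch of the Cone used this way - the planned limits and the actual variances are assumed values for illustration. The upper and lower limits are defined before the work starts, the measured variance is compared to them at defined points in time, and a breach signals the need for corrective action.

```python
# Planned cone: (percent complete, lower control limit, upper control limit)
# for the variance of the cost estimate. Defined BEFORE work starts.
planned_cone = [
    (0.10, -0.40, +0.60),
    (0.30, -0.25, +0.35),
    (0.50, -0.15, +0.20),
    (0.70, -0.10, +0.12),
    (0.90, -0.05, +0.05),
]

# Actual measured variance at the same points (hypothetical project data).
actuals = {0.10: +0.45, 0.30: +0.30, 0.50: +0.25, 0.70: +0.08, 0.90: +0.04}

for pct, lcl, ucl in planned_cone:
    actual = actuals[pct]
    inside = lcl <= actual <= ucl
    status = "inside the cone" if inside else "OUTSIDE the cone - corrective action needed"
    print(f"{pct:.0%} complete: actual {actual:+.2f}, planned band "
          f"{lcl:+.2f}..{ucl:+.2f} -> {status}")
```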

The Benefits of the Use of the Cone of Uncertainty 

The planned value, the upper and lower control limits, and the measures of actual values from a Closed Loop Control System form a measurement-based feedback process to improve the effectiveness and efficiency of the project management processes by:

  • Analyzing trends that help focus on problem areas at the earliest point in time - when the variable under control starts misbehaving, intervention can be taken. No need to wait until the end to find out you're not going to make it.
  • Providing early insight into error-prone products that can then be corrected earlier, and thereby at lower cost - when the trends are headed toward the UCL and LCL, intervention can take place.
  • Avoiding or minimizing cost overruns and schedule slips by detecting them early enough in the project to implement corrective actions - by observing trends toward breaches of the UCL and LCL.
  • Performing better technical planning, and making adjustments to resources based on discrepancies between planned and actual progress.

A critical success factor for all project work is Risk Management. And risk management includes the management of all kinds of risks - risks from all kinds of sources of uncertainty, including technical risk, cost risk, schedule risk, and management risk. Each of these uncertainties, and the risks they produce, can take on a range of values described by probability and statistical distribution functions. Knowing what ranges are possible, and knowing what ranges are acceptable, is a critical project success factor.

We need to know the Upper Control Limit (UCL) and Lower Control Limit (LCL) of the ranges of all the variables that will impact the success of our project. We need to know these ranges as a function of time. With this paradigm, we have logically connected project management processes with control system processes: when the variances created by uncertainty go outside the UCL and LCL, corrective action is needed. Here's a work-in-progress paper, "Is there an underlying Theory of Project Management," that addresses some of the issues with control of project activities.

This is the critical concept in successful project management:

We must have a Plan for the critical attributes - Mission Effectiveness, Technical Performance, Key Performance Parameters - of the end items. If these are not compliant, the project has become subject to one of the Root Causes of program performance shortfall. We must have a burndown or burnup plan for producing the end item deliverables for the program that match those parameters over the course of the program. Of course, we have a wide range of possible outcomes for each item in the beginning. And as the program proceeds, the variance measures on those items move toward compliance with the target number - in the example below, vehicle Weight.

[Figure: Cone of Uncertainty for vehicle weight, with upper and lower control limits.]

Here's another example of the Cone of Uncertainty - in this case, the uncertainty in the weight of a vehicle as designed by an engineering team. The UCL and LCL are defined BEFORE the project starts. These are the allowable ranges of the weight values for the object at specific points in time. When the actual weight or the projected weight goes outside that range, Houston, we have a problem. These limits are used to inform the designers of the progress of the project as it proceeds. Staying inside the control limits is the planned progress path to the final goal - in this case, the target weight.

Wrap Up of this Essay

The Cone of Uncertainty is a signaling boundary of a Closed Loop Control System used to manage the project to success, with feedback from the comparison of the desired uncertainty in some parameter to the actual uncertainty in that parameter.

This boundary is defined BEFORE work starts, to serve as the PLANNED target to steer toward for a specific parameter. In simpler closed loop control systems, this is called the Set Point.†

† The setpoint is the desired or target value for an essential variable of a system, often used to describe a standard configuration or norm for the system. In project management or engineering development, the set point is a stochastic variable that is evolving as the program progresses and may be connected in non-linear ways with other set points for other parameters.

Categories: Project Management

Quote of the Day

Mon, 06/19/2017 - 03:55

Under the current imperfect administration of the Universe, most new ideas are false, so most ideas for improvement make matters worse.

Categories: Project Management

The Smell of Dysfunction and Corrective Action

Sat, 06/17/2017 - 17:57

When we hear about some dysfunction in our work, a product, or a system, Root Cause Analysis can provide the corrective action needed to remove the dysfunction and, most importantly, prevent the dysfunction from returning.

There are Eight Questions needed to gain insight into the source of the problem before conjecturing any fix. In the Apollo Root Cause Analysis method, we have both Events (Actions) and Conditions, so these Eight need to be restated:

  1. What was the harm from the event or the condition that created the primary effect?
  2. What was the significance of this effect? How is this measured? 
  3. What was the set-up for the event or the condition? In Apollo, the cause and the effect are the same thing in an infinite chain. My cause is your effect, which creates a new cause for a new effect.
  4. What triggered the event? Either an Action or a Condition. Find out, write that down, record the source of the observation and the connection in the chain of cause and effect.
  5. What made the harm as severe as it was? Assess - an after action report - what damage was done. 
  6. What kept the harm from being a lot worse? Look for preventative measures that are already in place.
  7. What should be learned from the event? This is the basis of another method, the Phoenix method of Root Cause Analysis, where Lessons Learned are captured and used to prevent or correct future problems.
  8. What should be done about the event? This is the Corrective Action Plan (CAP) for the Primary Effect. In Apollo, this is defined by removing one or more cause and effect paths.

Here's a framework for applying Apollo

Root cause analysis master plan from Glen Alleman
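
As a minimal sketch of the Apollo idea that every effect requires both an Action and a Condition, and that causes chain into further effects, here is one way such a chain might be represented. The structure and the example chain are my own illustration, not the Apollo method's notation.

```python
from dataclasses import dataclass, field

@dataclass
class Cause:
    description: str
    kind: str                         # "effect", "action", or "condition"
    causes: list["Cause"] = field(default_factory=list)

def print_chain(node: Cause, depth: int = 0) -> None:
    # Walk the cause-and-effect chain; every effect is itself a cause upstream.
    print("  " * depth + f"{node.kind}: {node.description}")
    for c in node.causes:
        print_chain(c, depth + 1)

# Hypothetical chain for a late delivery.
late_delivery = Cause("Release slipped two weeks", "effect", causes=[
    Cause("Rework of the payment feature", "action", causes=[
        Cause("Requirement was ambiguous", "condition"),
        Cause("No review of acceptance criteria", "action"),
    ]),
    Cause("Only one developer knew the payment code", "condition"),
])

print_chain(late_delivery)
```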

When you hear something like estimates are the smell of dysfunction, you'll now know that it is complete and utter malarkey, since no cause has been stated and no corrective action has been devised that can be tested to prevent or correct that undefined smell.

So don't let anyone get away with making an unsubstantiated statement about some problem they have observed and some suggested correction to that problem. It's intellectual laziness to make those statements. And it's intellectual laziness not to challenge them.

Categories: Project Management

Performance-Based Project Management®

Sat, 06/17/2017 - 14:00

Successfully managing projects - no matter the domain, the size, or the development or engineering method - starts with a set of Principles, Practices, and Processes.

Here are those Principles and Practices. The Principles are Immutable. The Practices are universal. The Processes need to be tailored to your domain.

When hearing about some new, supposedly innovative way of delivering value in exchange for money, ask: does that idea fit into any known set of Principles and Practices of project success?

Principles and Practices of Performance-Based Project Management® from Glen Alleman
Categories: Project Management

The Five Laws of Software Estimating are Wrong

Fri, 06/16/2017 - 17:11

There's a blog post from a few years back that has resurfaced: The 5 Laws of Software Estimates. It's one of those posts that's heavy on opinion and light on principles. Time to revisit this ill-informed idea of why, what, and how we need to use estimates to make decisions in the presence of uncertainty.

Each claimed Law of Software Estimating below is followed by the Fact of Software Estimating.

Law 1: Estimates are waste

A waste for whom?

Developers think they are waste, but it's not their money.

To those paying the developers, estimates provide actionable information needed to make decisions:

  • Can we afford to develop this feature?
  • If we develop this feature, when will it start earning its keep for the business?
  • What tradeoffs¬†are possible¬†in cost and schedule for this feature compared to other features?
Law 2: Estimates are non-transferrable

It's claimed that software estimates are not fungible.

Estimates are transferrable if you keep track of what you develop and codify those Features for future reference.

I work in a domain where a Central Repository is kept of all work performed and used to estimate other projects. This is called Reference Class Forecasting, and it is done in all mature project management domains.

This is basic business process improvement and part of any credible estimating process.
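
Here is a minimal sketch of Reference Class Forecasting from such a repository - the repository contents and feature classes below are hypothetical. A new piece of work is estimated from the recorded actuals of completed work in the same class, rather than from a fresh guess.

```python
import statistics

# Central repository of completed work: (feature class, actual hours).
repository = [
    ("report", 60), ("report", 75), ("report", 90), ("report", 70),
    ("integration", 160), ("integration", 210), ("integration", 185),
    ("ui_form", 30), ("ui_form", 45), ("ui_form", 38),
]

def reference_class_estimate(feature_class: str) -> tuple[float, float]:
    """Return (median, 80th percentile) hours for the reference class."""
    actuals = [h for cls, h in repository if cls == feature_class]
    median = statistics.median(actuals)
    p80 = statistics.quantiles(actuals, n=10)[7]   # 80th percentile
    return median, p80

median, p80 = reference_class_estimate("integration")
print(f"New 'integration' feature: ~{median:.0f} hours most likely, "
      f"plan {p80:.0f} hours for 80% confidence")
```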

Law 3: Estimates are Wrong

 Estimates are NOT Guesses

This is a fundamental category error in knowledge - assigning an incorrect meaning to a term.

Either a lack of knowledge about estimating or a willful lack of knowledge about estimating. I suspect the latter, since every high school statistics class introduces the notion of an estimate and the two attributes of that estimate:

  • Precision - what precision is needed to make your decision?
  • Accuracy - what accuracy is needed to make your decision?

Define these before you start estimating, not afterward. 

"We need to know the cost for the product is at or below $1,200,000 with 80% confidence" is a statement a CIO might make to a development team.

Or: "we need to know you can produce that product for that cost on or before the end of the 2nd quarter."
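
A minimal sketch of what a statement like "at or below $1,200,000 with 80% confidence" means in practice - the three-point cost estimate below is assumed for illustration. The cost distribution is simulated, its 80th percentile is read off, and the confidence of meeting the CIO's threshold is computed.

```python
import random

random.seed(7)
TRIALS = 20_000

# Assumed three-point estimate of total cost: optimistic, pessimistic, most likely.
costs = sorted(random.triangular(900_000, 1_600_000, 1_150_000) for _ in range(TRIALS))

p80_cost = costs[int(0.80 * TRIALS)]                          # 80th percentile cost
confidence_at_1_2m = sum(c <= 1_200_000 for c in costs) / TRIALS

print(f"Cost at 80% confidence: ${p80_cost:,.0f}")
print(f"Confidence of finishing at or below $1,200,000: {confidence_at_1_2m:.0%}")
```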

Law 4: Estimates are Temporary

Yes, they are. That's why the Estimate to Complete and the Estimate at Completion are continually updated with new data, better models, and empirical data from delivered work.

This is obvious to any mature estimating organization.

Law 5: Estimates are Necessary

Correct. Businesses cannot make decisions about whether or not to build software without having some idea of the cost and time involved.

So why even introduce the first four as Laws? Rather, state the first four as dysfunctions that need to be removed to increase the Value of estimates.

So before you go off and Stop Estimating, ask yourself - or better, confirm with those paying for your work - whether there really is no need to know how much it will cost to produce the Value asked for, and no need to know when that Value will arrive, to some degree of precision and accuracy.

If there is no need, then estimates likely don't need to be made.

If there is a need for all the right business reasons, then ignore the first 4 Laws, and learn how to make estimates in the presence of uncertainty for those whose money you are spending.

Categories: Project Management

Quote of the Day

Thu, 06/15/2017 - 05:11

Have a roadmap instead of a road - Adrian Bolboaca

Without the map, we can't know if we're on the right road. But the map is not the terrain, so both are needed for success.

Categories: Project Management

Math for the Day

Tue, 06/13/2017 - 13:51


Categories: Project Management

Five Immutable Principles of Project Success Requires Estimating

Sat, 06/10/2017 - 21:02

There are Five Immutable Principles of Project Success.

Each question requires an answer with sufficient confidence to make decisions about spending other people's money - decisions that increase the probability of project success.

All project work is probabilistic (actually statistical and probabilistic), with reducible and irreducible uncertainties that create reducible and irreducible risks to the success of the project.

Let's look at what estimates are needed to answer each of the Five questions:

What does DONE look like in units of measure meaningful to the decision makers?

DONE is always in units of business value. That Business Value has variance and that variance needs to be known to assure that the range of Value is tolerable for those paying. 

DONE is a delivered Capability that can be (should be) measured in a unit of Effectiveness and Performance. [1]

A core principle of Managerial Finance is Value cannot be determined without knowing the Cost to achieve that Value. 

So we need to know something about both measures - Cost and Value - to decide that what we think is DONE is affordable, arrives at the needed time to fulfill our business needs, and the cost of this Value can be sustained over the Value's delivery lifecycle.

Without estimates, we have no confidence that the units of measure of DONE will deliver the needed Value, or that the Cost to achieve that Value is achievable before spending all the money and using all the time.

What's our plan to reach DONE as needed for the customer to receive the expected Value from our work?

A Plan is a Strategy for the success of the project. Strategies are hypotheses. Hypotheses require tests to confirm they represent the reality of the situation. 

With these hypotheses, we need to answer the question what's our confidence that the plan we have will actually work?

There is a bigger and better question though ... what confidence DO WE NEED in the plan for the Plan to be credible?

Anyone can construct a Plan. Is it a Credible Plan? Good question. Each requires us to make an estimate of the confidence in the elements of the Plan.

In our Software Intensive System of Systems world, this is called Probability of Program Success (PoPS).

What resources will we need to satisfy the business goals of the customer?

 For the project to be successful, we need resources. People, Facilities, Money, and Time.

How many people, of what skills? What facilities, tools? How much money? How much time?

Each of these needs is probabilistic when it comes to fulfilling the need. What range of variability can we tolerate? This requires making estimates and assessing that impact on the probability of success of the project.

What impediments will we encounter that will reduce the probability of reaching our business goal as needed for the customer to pay back the cost of delivering the needed Value?

Risk Management is How Adults Manage Projects - Tim Lister.

All risk management operates in the presence of uncertainty. Making decisions in the presence of uncertainty requires estimating the outcomes of those decisions, the costs associated with those decisions.

You can't manage projects in the absence of risk management.

You can't apply risk management without estimating.

No risk management? No adult management.

How are we going to measure progress toward DONE, in some unit meaningful to those paying for our work?

 Physical Percent Complete is the best measure of progress to plan. This is the beauty of Agile Development. Working Software is the Measure of Progress.

But that working software's arrival rate is statistically distributed. The notion of a burndown chart and the projection of progress to show the completion date ignores the statistical nature of software development and the statistical nature of the past performance. 

Many examples from the No Estimates community show wild variances in past performance, but not the impact of those variances on the confidence of the completion date or cost.

To show a confidence range on the completion date from the past performance - the empirical data - we need to estimate the possible ranges of performance in the future from the past performance AND from the possible risks in the future.
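
Here is a minimal sketch of producing that confidence range from the empirical data - the throughput history and backlog size are assumed. Past iteration throughput is resampled to simulate possible futures, giving percentiles for the completion date instead of a single straight-line burndown projection; estimates of future risks would then widen these ranges further.

```python
import random

random.seed(11)
past_throughput = [6, 9, 4, 11, 7, 3, 10, 8]   # stories finished per past iteration
BACKLOG = 60                                    # stories remaining
TRIALS = 10_000

def iterations_to_finish() -> int:
    remaining, iterations = BACKLOG, 0
    while remaining > 0:
        remaining -= random.choice(past_throughput)   # resample past performance
        iterations += 1
    return iterations

results = sorted(iterations_to_finish() for _ in range(TRIALS))
p50 = results[int(0.50 * TRIALS)]
p85 = results[int(0.85 * TRIALS)]

avg = sum(past_throughput) / len(past_throughput)
print(f"Naive average projection: {BACKLOG / avg:.1f} iterations")
print(f"50% confidence: done within {p50} iterations")
print(f"85% confidence: done within {p85} iterations")
```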

Estimates are needed to apply the Five Immutable Principles of Project Success.

[1] System Analysis, Design, and Development: Concepts, Principles, and Practices, Charles S. Wasson, Wiley Series in Systems Engineering and Management, 2006

Categories: Project Management

The Economics of Decision Making on Software Projects

Sat, 06/10/2017 - 01:13

The classic paper "Software Engineering Economics," Barry Boehm, IEEE Transactions on Software Engineering, Vol. SE-10(1), 1984, pp. 4-21, opens with a dictionary definition of economics as a social science concerned chiefly with the description and analysis of the production, distribution, and consumption of goods and services.

A broader definition is

Economics is the study of how people make decisions in resource-limited situations. And those decisions are made in the presence of uncertainty.

Macroeconomics is about how people make decisions on a global scale. Microeconomics is about how people make decisions on a personal scale - decisions for individuals and organizations. For software development, there are many decisions to be made: Which Feature do we develop next? Are we ready for the Release? Do we have enough time and money to develop the needed Features before the business needs to put those Features to work to earn back their investment?

Since there are always limited resources - time, money, and people - we need some ability to make decisions in the presence of these limited resources. As well, there is uncertainty on every project: Reducible (Epistemic) and Irreducible (Aleatory). Reducible uncertainty can be reduced; irreducible uncertainty cannot. We can spend money to reduce Epistemic uncertainty. Only margin can protect the project from irreducible uncertainty.
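
A minimal sketch of sizing margin for irreducible uncertainty - the duration distribution below is assumed for illustration. The margin is the gap between a high-confidence duration and the most likely duration: no amount of work buys it down, so it must be carried as margin.

```python
import random

random.seed(3)
TRIALS = 20_000
MOST_LIKELY = 22   # most likely task duration in days

# Aleatory variation of the duration: it simply varies; it cannot be "fixed".
durations = sorted(random.triangular(18, 35, MOST_LIKELY) for _ in range(TRIALS))

p80 = durations[int(0.80 * TRIALS)]
margin = p80 - MOST_LIKELY

print(f"Most likely duration: {MOST_LIKELY} days")
print(f"80th percentile duration: {p80:.1f} days")
print(f"Schedule margin to carry for aleatory variation: {margin:.1f} days")
```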

One model that is often misunderstood by the agile community is the Cone of Uncertainty surrounding estimates, as well as the technical and operational aspects of the project.

If your project's uncertainties are not being reduced in some declining, predefined manner, then you're going to be late and over budget, and your deliverables are unlikely to work. So when those uncertainties are outside the upper and lower bounds of the Cone of Uncertainty, you'd better start looking for ways to fix the project.

[Figure: the Cone of Uncertainty - planned reduction of estimate variance along the project timeline.]
So if we're going to be making decisions in the presence of uncertainty - somewhere along the horizontal axis of the project, with the resulting uncertainty range - we're going to need to estimate not only what the range should be for that phase of the project, but also what effort and duration will be needed to reduce the variance of that uncertainty and keep that range inside the cone.

Categories: Project Management

Why Do Agilists Use Waterfall As Their Stalking Horse?

Wed, 06/07/2017 - 16:48

Many agilists use Waterfall as the Boogeyman of software development. They show pictures of a Design-Code-Test process with no feedback loops and linear connections from the beginning to the end of the project.

Here in the DOD, Waterfall went away long ago as an acquisition strategy. Here's one of many guidance documents. Waterfall is dead - stop using it as a whipping boy. If you're doing waterfall in your enterprise IT, you're doing it wrong. Stop, and learn how to apply modern techniques - one of which is Agile, but there are others. Google dod 5000.2 procurement agile to see how it's done.


Categories: Project Management

D-Day

Wed, 06/07/2017 - 02:35


 

Categories: Project Management