
Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Success S=k log W

Some More Background on Probability, Needed for Estimating

Sat, 03/28/2015 - 15:05

The continued lack of understanding of the underlying probability and statistics of making decisions in the presence of uncertainty continues to plague the discussion of estimating software.

All elements of all projects are statistical in nature. This statistical behaviour - reducible or irreducible stochastic processes - creates uncertainty.

Event-based uncertainties have a probability of occurrence and a probability of impact once that uncertainty becomes a reality. These are epistemic uncertainties - epistemology is the study of knowledge. Epistemic means knowing, or in this case not knowing. We can buy knowledge. This is a core concept of the agile paradigm: we are buying down risk by building software to test the uncertainties of the project deliverables. This is the basis of saying agile is about risk management. But I suspect those saying that without being able to do the math - as we say in our domain - don't realize what they are actually saying.
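As a notional sketch of what buying down epistemic risk looks like in numbers - all values here are illustrative assumptions, not data from any project:

```python
# Expected monetary value (EMV) of an event-based (epistemic) risk.
# All numbers are illustrative assumptions.

p_occur = 0.30                 # probability the uncertainty becomes a reality
impact = 200_000               # cost impact if it does ($)
emv_before = p_occur * impact  # $60,000 exposure

# "Buying knowledge": a $15,000 spike/prototype that, if it works,
# drops the probability of occurrence to 0.05.
spike_cost = 15_000
p_after = 0.05
emv_after = p_after * impact   # $10,000 residual exposure

net_value = emv_before - emv_after - spike_cost
print(f"Exposure before: ${emv_before:,.0f}")
print(f"Exposure after:  ${emv_after:,.0f}")
print(f"Net value of buying down the risk: ${net_value:,.0f}")
```

If the net value is positive, the knowledge was worth buying. That's the risk management math behind building software to test an uncertainty.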

The naturally occurring variances are aleatory. They are always there; they are irreducible. That is, they can't be fixed. Work effort and duration are aleatory. The ONLY fix for aleatory uncertainty and the resulting risk is margin: cost margin, schedule margin, technical performance margin. You can't buy the fix to aleatory uncertainty.
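Here's a minimal sketch of what that margin looks like, assuming an illustrative lognormal duration distribution (the distribution and its parameters are assumptions for the example, not data from a real project):

```python
import numpy as np

rng = np.random.default_rng(42)

# Aleatory uncertainty in task duration: illustrative lognormal with a
# median of ~20 days and natural (irreducible) variation around it.
durations = rng.lognormal(mean=np.log(20), sigma=0.3, size=100_000)

median = np.median(durations)
p80 = np.percentile(durations, 80)  # duration we can commit to at 80% confidence

print(f"Median duration:  {median:5.1f} days")
print(f"80th percentile:  {p80:5.1f} days")
print(f"Schedule margin:  {p80 - median:5.1f} days")
```

The margin - the gap between the most likely duration and the confidence level you need - can't be removed; it can only be funded.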

Found a book today: Discover Probability: How to Use It, How to Avoid Misusing It, and How It Affects Every Aspect of Your Life, Arieh Ben-Naim, World Scientific.

This is one of those must-read books for anyone working in a domain where probability and statistics dominate the decision making process. Unlike How To Measure Anything, The Flaw of Averages, and How Not to Be Wrong: The Power of Mathematical Thinking - which are very good books, but populist in that they contain little in terms of actual mathematics - this book is in between. Lots of narrative, but math as well. Not as heavy as Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, but in the middle.

In The End

You can't make decisions in the presence of uncertainty without estimating the outcome of your decision in the future. Using empirical data is preferred. But that empirical data MUST be adjusted for future uncertainty: past variances, sampling errors, poor representations of the actual process, and the plethora of other drivers of uncertainty. Having small, simple samples without variances - and, most of all, confirming the past actually does represent the future, doing that mathematically rather than just announcing it - is needed for any estimate of the future to have any credibility. Otherwise it's just an uninformed bad guess.

Categories: Project Management

Critical Success Factors of IT Forecasting

Fri, 03/27/2015 - 19:11

The estimation problem in enterprise IT and Software Intensive Systems has been with us for decades, if not from the beginning of the software business. While not software, but a hardware implementation of an algorithm, Alan Turing's problem was to tell his boss when he'd be able to crack the Enigma Code.

We still struggle with software estimates. Tom DeMarco's quote is an important starting point, which I'll repeat here from the paper below:

The estimator's charter is not to state what developers should do, but rather to provide a reasonable projection of what they will do.

For those interested in further understanding the estimation problem, the paper is a very good starting point.

A key point from a reference, "The Inaccurate Conception," CACM, 51(3):13-16, 2008, says:

"When a weather forecast indicates a 40% chance of rain, and it rains on you, was the forecast accurate? If it doesn't rain on you, was the forecast inaccurate? Thought of in these terms, the concept of accuracy takes on a different meaning. It seems that the fact that it does or does not rain on you is not a particularly good measure of the accuracy of the rain estimate."

"The accuracy of a weather forecast is not whether it rains or not but whether it rains at the likelihood it was forecast to rain. Similarly, the accuracy of an estimate on a project is not whether the project achieves its goals but whether it correctly forecasts the probability of achieving its goals."

For some more background, here are some further readings from the ACM Library:

  • What is a Good Estimate? Whether Forecasting is Valuable, Phillip G. Armour, CACM, Vol. 56 No. 6, June 2013
  • Estimation is Not Evil, Phillip G. Armour, CACM, Vol. 57 No. 1, March 2014
Categories: Project Management

Quote of the Day

Fri, 03/27/2015 - 15:38

I have a card on my desk with a cute quote

May the Sun Always Shine Down on You ...as a constant reminder that there is a giant nuclear-powered fireball in the sky just barely holding it together.

Simply said, It's Not About You.

Categories: Project Management

Incremental Delivery of Features May Not Be Desirable

Thu, 03/26/2015 - 14:54

There is a popular notion in agile that incremental delivery of Features allows the project to stop at any time and deliver value to those paying for the work. This is another concept of agile that, in the absence of a domain and context, is pretty much meaningless in terms of actionable decision making.

We need to know ahead of time what collection of features produces what business value to the customer. Arbitrarily stopping the project may be too soon or too late. If too soon, we'll leave the current features orphaned; they won't have what is needed to provide value to the customer.

Stopping too late leaves the delivered features with partially completed work that either pollutes the functionality or prevents complete use of that functionality.

What we need is a PLAN. A business capability deployment plan like the one below for a health insurance provider network system. This system integrates legacy systems with a new Insurance ERP system.

Project Maturity Flow is the Incremental Delivery of Business Capabilities on the Planned Need Date.

Incremental Capabilities

Partial completion may provide value to the customer. Or it may not, depending on what is provided. The pilot with demo grade data conversion doesn't do the providers much good as it's not real data and it's not real enrollment. Getting the shared group matrix up and running can test the integration across the provider network. The real value starts when real data is in the Data Mart and accessible by the new ERP system.

So Once Again

When we hear the platitudes of deliver early, deliver often or incremental delivery can be stopped at any time and value delivered to the customer, please ask for a picture like this and have that person point to where and what on the picture can be stopped at what time.

Building this picture is part of Project Governance - part of the business management planning process of how and when to spend the customer's money in an agile emerging domain where requirements certainly do change. But it's not about coding and the needs of the coders. It's about business management and managing in the presence of the microeconomics of software development. This project governance paradigm should be in place for any non-trivial effort.

Principles of program governance from Glen Alleman

And the Final Notion

Of course, to do this successfully we need estimates of everything in order to assure the business plan is going to be met with some degree of confidence: time to reach a useful capability, cost to reach that capability, risks in reaching that capability as planned, resource demands, and a myriad of other random variables. With this information and a statistical model (Monte Carlo, COD, Method of Moments, Design Structure Matrix, Probabilistic Value Stream Map), we can make decisions about the business choices.

Without these processes, any non-trivial project is just coding until time and money runs out.

Categories: Project Management

Flaw of Averages

Wed, 03/25/2015 - 15:41

There is a popular notion in agile estimating that a time series of past performance can be used to forecast the future performance of the project. Here's a clip of that time series. It is conjectured that data like this can be used to make decisions about the future. And certainly data about the past CAN be used and IS used to make decisions (forecast-based decisions), but there are serious flaws in the suggested approach.

[Chart: time series of past performance]

The first flaw is the Flaw of Averages.

Sam Savage's book The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty is a good place to start. From a 2000 newspaper article, this cartoon says it all.

[Cartoon: the Flaw of Averages]
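A minimal numeric version of the flaw (my illustration, with assumed numbers): when two tasks merge, a plan built on average durations understates the average finish, because the maximum of the averages is not the average of the maximums.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two parallel tasks feed one milestone; each averages 10 days with
# natural variation. The milestone completes at the LATER of the two.
a = rng.normal(10, 3, 100_000)
b = rng.normal(10, 3, 100_000)

plan_on_averages = max(a.mean(), b.mean())  # the "plan" says ~10 days
actual_average = np.maximum(a, b).mean()    # reality averages ~11.7 days

print(f"Plan built on averages: {plan_on_averages:.1f} days")
print(f"Average actual finish:  {actual_average:.1f} days")
```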

But the Flaw of Averages is deeper (pun intended) than the simple cartoon. Any time series of data, like the first chart, can be used to forecast the possible outcomes from past performance. The Autoregressive Integrated Moving Average (ARIMA) function in R can easily show that a time series like the first chart has a huge swing in possible outcomes. Below is an example from a much better behaved time series - in this case the Schedule Performance Index of a software intensive system.

[Chart: ARIMA forecast of a Schedule Performance Index time series]

When we see poor man's solutions to estimating future outcomes, be careful not to become that poor man by spending your customer's money in ways that bankrupt the project.

Buy Savage's book, google software estimating, read up on ARIMA and time series analysis, download R and all its free books and documents. Here are some more resources on estimating software projects.

Don't fall prey to simple, and many times simple-minded, approaches to managing other people's money. Math is required when dealing with the probabilistic processes found on ALL projects. The future is always uncertain and NEVER the same as the past. This is the case even for the production processes found at Toyota. Margin and risk reduction activities are always needed. Knowing how much margin and which risk reduction activities are needed is part of the planning process for any non-trivial project.

Categories: Project Management

Calculating Value from Software Projects - Estimating is a Risk Reduction Process

Tue, 03/24/2015 - 15:41

Software can provide increased benefits to internal users, with the firm paying for the development or acquisition of that software. Software can also reduce costs to the users in exchange for the cost of development or acquisition of the software.

This exchange of benefit for cost is the basis of all business decision making where the governance of resources is in place. This governance process assumes there are limited resources - cost, time, capacity for work, available labor, equipment, and any other element that enters into the production of the output of the firm.

Benefits produced in exchange for this cost also include risk reduction, as well as increased revenue, and lowered cost. But again this risk reduction is in exchange for cost - money paid for reduced risk.

To determine benefits we can use a decision framework †

  • Business scope - measures deliverable benefits in either cost or revenue.
  • Value and Potential Impact:
    • Improvement benefits - based on existing valuation tools or what-if calculations.
    • Scaling benefits - the total value of the associated business change resulting from the software acquisition and deployment.
    • Risk reduction benefits - the expected value of the risks being mitigated.
    • Future options - using Black-Scholes, Real Options, or Monte Carlo Simulation to determine future benefits.
  • Project cost - assess using standard project costing tools.
  • Direct savings - through operational cost reduction.
  • Real cash effect - netted off against direct savings.

Making Decisions in the Presence of Uncertainty ‡

It is better to be approximately right than precisely wrong - Warren Buffett

This quote is many times misused to say we can't possibly estimate so let's not estimate, let's just get started coding and we'll see what comes out. 

Since all project work is based on uncertainty - reducible and irreducible - risk is the result of this uncertainty. So first we need to determine the Value at Risk before we can say how we are going to manage in the presence of this uncertainty.

Risk is a quantity that has relevance on its own. What's the risk that if I make this next turn on a double-black run in Breckenridge, there will be ice and I'll fall? Hard to say on the first run of the day, so I'll slow down as I come around the tree line. On the third run in a row, I've got the experience of that run today and the experience of skiing fast and hard on that run for several years to know more about the risk.

Since statistical uncertainty drives projects, the resulting risk from that uncertainty is probabilistic. When interdependencies between work elements, capacity for work, technical and operational processes, changing understanding, and a myriad of other variables are also uncertain, we are faced with a problem: no deterministic assessment of time, cost, or capabilities can be performed for our project. We must have a probabilistic process. And of course this probabilistic process requires estimating the range of values possible for each variable that interacts with the other variables.

Sampling from the past (empirical data) is a good way to start, but those past samples tell us nothing about the future statistical process unless all work in the future is identical to the work in the past. That assumption is naive at best and dangerous at worst. Such naive assumptions are many times the root cause of major cost overruns in our space and defense software intensive systems business. Those same naive assumptions are applicable across all software domains.

When there is a network of work activities - as there is on any non-trivial project - each activity is a probabilistic process. The notion of independent work in the agile paradigm must always be confirmed before assuming a simple queuing process can be put to work. So when you hear about Little's Law and bootstrapping simulations, confirm the interdependencies of the work. The model in the chart below, and the probability distribution function below that, are from a Monte Carlo simulation where the interdependencies of the work activities are modeled by the tool - in this case RiskyProject, which provides a Risk Register for reducible risks and the means of modeling the irreducible uncertainty in the duration of the work, and shows the coupling between the work elements' Cruciality Index and other indicators of trouble to come from the status of past performance.

[Chart: RiskyProject schedule model with risk register]

The chart below says that the activity being modeled (all the activities in the network are modeled; I just picked this one) has a 54% chance of completing on or before Oct 18th, 2015.

Probabilistic Finish
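The mechanics behind such a chart can be sketched in a few lines. This is a notional three-activity network - the durations, dependency structure, and target date are illustrative assumptions, not RiskyProject's model:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Notional network: A and B run in parallel, C starts when both finish.
# Triangular(optimistic, most likely, pessimistic) durations in days.
a = rng.triangular(8, 10, 16, N)
b = rng.triangular(5, 7, 14, N)
c = rng.triangular(12, 15, 25, N)

finish = np.maximum(a, b) + c  # dependency: C waits on the later of A and B

target = 30.0                  # days to the target date
print(f"P(finish <= {target:.0f} days) = {(finish <= target).mean():.0%}")
print(f"80% confidence finish: {np.percentile(finish, 80):.1f} days")
```

The interdependency - the max at the merge point - is exactly what simple independent-queue models miss.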

The End

If your project is small enough to be able to see all the work in one place, see how to produce outcomes from all that work, and has a low value at risk, making business-based estimates using risk reduction is probably not needed.

If your project has interdependencies between the work elements, the work is of sufficient duration that you can't see the end in enough detail to remove all the statistical variances, and the value at risk is sufficiently high that the business impact of not showing up on time, on budget, with the needed outcomes will be unacceptable - then the process of managing in the presence of uncertainty must be able to estimate all the interacting variables.

It's this simple

  • No risk, low impact for being wrong, low value at risk project - no need to worry about the future, just start and work out problems as you go.

But when we hear we don't need to estimate to make decisions and no domain and context is provided, those making that conjecture haven't considered the domain or context either. They're either unaware of the probability and statistics of projects or they're intentionally ignoring them. Since they ignore these fundamental processes of all non-trivial projects, ignore their advice.

‡ How to Measure Anything: Finding the Value of Intangibles in Business, 3rd Edition, Douglas Hubbard

† A New Framework for IT Investment Decisions, Anthony Barnes

Categories: Project Management

Capabilities Based Planning First Then Requirements

Sun, 03/22/2015 - 16:23

When I hear about requirements churn, bad requirements management - which is really bad business management - and emergent requirements that turn over 20% a month for a complete turnover in 4 months, it's clear there is a serious problem in understanding how to manage the development of a non-trivial project.

Let's start here: what capabilities does this project need to produce when it is done? The order of the capabilities depends not only on the business's ability to absorb each capability, but also on the value stream of those capabilities in support of the business strategy.

The picture at the bottom shows a value stream of capabilities for a health insurance provider network system. The notion of INVEST in agile has to be tested for any project. Dependencies exist and are actually required for enterprise projects. See the flow of capabilities chart below. Doing the work in an independent order would simply not work.

Once we have the needed capabilities, and know their dependencies, we can determine - from the business strategy - what order they need to be delivered.
The Point

When you hear about all the problems with requirements - or anything to do with software development - stop and remember: it is trivial to point out problems. The classical example of this trivial approach is estimates are the smell of dysfunction. This is a Dilbert-cartoon management method. It's not only lame, it's not managing projects as an adult. Adults don't whine, they provide solutions.

So here's a place to start with requirements management. Each of these books informs our Command Media for requirements elicitation and management for software intensive systems. Professional journals provide up-to-date guidance as well. There are also tools for requirements management. But don't start with tools, start with a process. Analytic Hierarchy Process (AHP) is my favorite.

There is no reason not to have a credible requirements process - don't let the whiners dominate the conversation. Provide solutions to the problem.
Categories: Project Management

Quote of the Day

Sun, 03/22/2015 - 15:40

Science is the great antidote to the poison of enthusiasm and superstition.
- Adam Smith, Wealth of Nations

If you hear a conjecture or a claim that sounds like it is not what you were taught in school, doesn't seem to make sense in a common sense way, or appears to violate established principles of science, math, or business - ask for the numbers.

Categories: Project Management

Risk Management is How Adults Manage Projects

Sat, 03/21/2015 - 23:50

The quote in the title is from Tim Lister. It speaks volumes about project management and project failure. It also means that managing risk is managing in the presence of uncertainty. And managing in the presence of uncertainty means making estimates about the impact of our decisions on future outcomes. So you can invert the statement when you hear we can make decisions in the absence of estimates.

For those interested in managing projects in the presence of uncertainty and the risk that uncertainty creates, here's a collection from the office library, in no particular order

Categories: Project Management

The Microeconomics of Decision Making in the Presence of Uncertainty - Re-Deux

Sat, 03/21/2015 - 22:43

Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources.

All engineering is constrained optimization: how do we take the resources we've been given and deliver the best outcomes? That's what microeconomics is. Unlike models of mechanical engineering or classical physics, the models of microeconomics are never precise. They are probabilistic, driven by the underlying statistical processes of the two primary actors - suppliers and consumers.

Let's look at both in light of the allocation of limited resources paradigm.

  • Supplier = development resources - these are limited in both time and capacity for work, and likely in talent as well, along with the production of latent defects, which cost time and money to remove.
  • Consumer = those paying for the development resources have limited time and money. Limited money is obvious - they have a budget. Limited time, since the time value of money is part of the Return on Capital equation used by the business. Committing capital (not real capital - software development is usually carried on the books as an expense) needs a time when that capital investment will start to return value.

In both cases time, money, and capacity for productive value are limited (scarce) and compete with each other and with the needs of both the supplier and the consumer. In addition, since the elasticity of labor costs is limited by the market, we can't simply buy cheaper to make up for time and capacity. It's done of course, but always to the detriment of quality and actual productivity.

So cost is inelastic, time is inelastic, capacity for work is inelastic, and other attributes of the developed product are constrained. The market need is likely constrained as well. Business needs are rarely elastic - oh, we really didn't need to pay people in the timekeeping system; let's just collect the time sheets, and we'll run payroll when that feature gets implemented.

Enough Knowing, Let's Have Some Doing

With the principles of microeconomics applied to software development, there is one KILLER issue that, if willfully ignored, ends the conversation for any business person trying to operate in the presence of limited resources - time, money, capacity for work.

The decisions being made about these limited resources are being made in the presence of uncertainty. This uncertainty - as mentioned - is based on random processes. Random processes produce imprecise data: data drawn from random variables. Random variables with variances, instability (stochastic processes), and non-linear stochastic behavior.

Quick Diversion Into Random Variables

There are many mathematical definitions of random variables, but for this post let's use a simple one.

  • A variable is an attribute of a system or project that can take on multiple values. The value of this variable may be fixed: for example, the number of people on the project can be known by counting them and writing that down. When someone asks, you can count and say 16.
  • When the values of the variable are random, the variable can take on a range of values just like the non-random variable, but we don't know exactly what those values will be when we want to use that variable to answer a question. If the variable is a random variable and someone asks what will be the cost of this project when it is done, you'll have to provide a range of values and the confidence for each of the numbers in that range.

A simple example - silly but illustrative - would be HR wanting to buy special shoes for the development team, with the company logo on them. If we could not for some reason (it doesn't matter why) measure the shoe size of all the males on our project, we could estimate how many shoes of what size would be needed from the statistical distribution of male shoe sizes for a large population of male coders.

[Chart: distribution of male shoe sizes]

This would get us close to how many shoes of what size we need to order. This is a notional example, so please don't place an order for actual shoes. But the underlying probability distribution of the values the random variable can take on can tell us about the people working on the project.
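A minimal sketch of that notional calculation, assuming male shoe sizes follow roughly a normal distribution (the mean of 10.5 and standard deviation of 1.5 are illustrative assumptions, not anthropometric data):

```python
import numpy as np

rng = np.random.default_rng(3)

team_size = 40
# Illustrative assumption: sizes ~ Normal(10.5, 1.5), rounded to half sizes.
sizes = np.round(rng.normal(10.5, 1.5, team_size) * 2) / 2

# Tally the order quantity for each size drawn from the distribution.
for s in sorted(set(sizes)):
    print(f"size {s:4.1f}: order {(sizes == s).sum()}")
```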

Since all the variables on any project are random variables, we can't know their exact values at any one time. But we can know their possible ranges and the probabilities of any specific value when asked to produce that value for making a decision.

The variability of the population values and its analysis should be seen not as a way of making precise predictions about the project outcomes, but as a way of ensuring that all relevant outcomes produced by these variables have been considered, that they have been evaluated appropriately, and that we have a reasonable sense of what will happen for the multitude of values produced by a specific variable. It provides a way of structuring our thinking about the problem.

Making Decisions In The Presence of Random Variables

To make a decision - a choice among several options - means making an opportunity cost decision based on random data. And if there is only one option, then the choice is either take it or don't.

This means the factors that go into that decision are themselves random variables: labor, productivity, defects, capacity, quality, usability, functionality, produced business capability, time. Each is a random variable, interacting in nonlinear ways with the other random variables.

To make a choice in the presence of this paradigm, we must make estimates of not only the behavior of the variables, but also the behavior of the outcomes.

In other words

To develop software in the presence of limited resources driven by uncertain processes for each resource (time, money, capacity, technical outcomes), we must ESTIMATE the behaviors of these variables that inform our decision.

It's that simple and it's that complex. Anyone conjecturing that decisions can be made in the absence of estimates of the future outcomes of those decisions is willfully ignoring the microeconomics of business decision making in the software development domain.

For those interested in further exploring the core principles of the software development business beyond this willful ignorance, here's a starting point.

These are the tip of the big pile of books, papers, journal articles on estimating software systems. 

A Final Thought on Empirical Data

Making choices in the presence of uncertainty can be informed by several means:

  • We have data from the past
  • We have a model of the system that can be simulated
  • We have reference classes from which we can extract similar information

This is empirical data. But there are several critically important questions that must be answered if we are not going to be disappointed with our empirical data outcomes:

  • Is the past representative of the future?
  • Is the sample of data from the past sufficient to make sound forecasts of the future? The number of samples needed greatly influences the confidence intervals on the estimates of the future.

Calculating the number of samples needed for a specific level of confidence requires some statistics, but here's a place to start. Suffice it to say, those conjecturing estimates based on past performance (number of story points in the past) will need to produce the confidence calculation before any non-trivial decisions are made on their data. Without those calculations, using past performance will be very sporty when spending other people's money.
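A minimal sketch of that calculation, using the standard sample size formula n = (z·σ/E)² for estimating a mean to within a margin of error E - the velocity numbers are illustrative assumptions:

```python
import math

# Illustrative past performance: story points completed per sprint.
velocities = [21, 34, 18, 27, 24, 31, 22, 26]

n = len(velocities)
mean = sum(velocities) / n
sigma = math.sqrt(sum((v - mean) ** 2 for v in velocities) / (n - 1))

z = 1.96  # z-score for 95% confidence
E = 2.0   # acceptable margin of error, in story points

needed = math.ceil((z * sigma / E) ** 2)
print(f"mean velocity {mean:.1f}, sample std dev {sigma:.1f}")
print(f"samples needed for ±{E:.0f} points at 95% confidence: {needed}")
print(f"samples we actually have: {n}")
```

Eight sprints of history against the few dozen samples needed - that's the gap between anecdote and forecast.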

Thanks to Richard Askew for suggesting the addition of the random variable background

Categories: Project Management

Feynman Lectures Now Online

Sat, 03/21/2015 - 15:43

The Feynman Lectures were a staple of my education, including having Feynman come to UC Irvine and speak to the Student Physics Society on his current work in Quantum Electrodynamics (QED).

The 3-volume set is still in our library. Mine are hardbound; there are paperbacks available now.

The books are not actually very good textbooks. The Lectures are just that - transcriptions of lectures. When reading them, you can hear Feynman talk, in the way several other authors write the way they talk.

The point of this post is the Lectures are now available electronically at The Feynman Lectures on Physics.

Anyone interested in physics, or who has ever heard of Richard Feynman, should take a look.

My memory - after many decades - is that Feynman loved students in ways not all physics professors do. He was a professional teacher as well as a physicist. His Nobel Prize never got in the way of his love of students. One of our own Nobel Laureates, Fred Reines, had a similar view of students - both undergrads and grad students. Dr. Reines would invite us to his house for BBQ and entertain us with stories of Los Alamos and other adventures.

Take a look, see how a true teacher writes about a topic he loves.

Categories: Project Management

Five Estimating Pathologies and Their Corrective Actions

Fri, 03/20/2015 - 19:03

Jim Benson has a thought-provoking post on the Five Pathologies of estimating. Each is likely present in an organization that has not moved up the maturity scale - maturity levels in the CMMI paradigm. The post, like many notions in the agile world starting with the Manifesto, assumes the software development process is broken - ML 1. With that assumption there is little to direct the effort toward identifying the root causes and taking corrective actions, as found in higher maturity levels.

CMMI Maturity Levels

Let's See What Corrective Actions Will Remove the Symptoms produced by a Root Cause

Each item in the post could be a root cause or just a symptom of the root cause. A full Root Cause Analysis would be needed for a specific domain, but I'll make suggestions below that can be broadly applied to each. These responses are extracted from Steve Loving's "Mitigations for the Planning Fallacy" given at the PMI Silicon Valley Symposium.

  • Guarantism - The belief an estimate is actually correct. The commitment to an estimate as a fact, that is, we guarantee the price of the work will be $X.XX.
    • First let's establish a fundamental principle of all estimating processes: no point estimate is correct in the absence of a confidence level.
    • All numbers on projects are random numbers. Only the checkout price at the grocery store is a fixed number.
    • Is that $600 plumbing quote an 80% confidence number or a 30% confidence number?
    • When the receiver of the quote doesn't ask for the confidence level, and further ask for the uncertainties in that quote - the reducible and irreducible uncertainties - then the provider of the quote is off the hook. And the receiver of the quote has now locked that number in her mind.
    • This is the anchoring and adjustment problem.
    • CORRECTIVE ACTION - no quote accepted without a statistical confidence based on past performance or some parametric assessment.
  • Promisoriality - The belief that estimates are possible. This is labeled the Planning Fallacy.
    • Using the Planning Fallacy without also reading the solutions to the Planning Fallacy is a root cause.
    • Bent Flyvbjerg has much to say on the Planning Fallacy and the corrective actions.
    • Reference Class Forecasting, parametric modeling, simple wideband Delphi, and many other estimating techniques can address the Planning Fallacy.
    • CORRECTIVE ACTION - Using the Planning Fallacy without reading the solutions is a low maturity behavior.
  • Swami-itis - The belief that an estimate is a basis for sound decision making. All projects are probabilistic processes driven by an underlying statistical network of interconnected activities. Failing to recognize the need to model this system is the root cause of naive and ill-informed estimates.
    • Our domain - software intensive systems - is driven by Systems Engineering and Monte Carlo Simulation of the System of Systems.
    • Tools are available - some simple, some complex. But the paradigm is that systems engineering drives our thought processes.
  • Craftosis - The assumption that estimating can be done better: we believe we will get better at our estimates, that estimation is a skill. This may be true - our estimating skills can be honed. However, the Planning Fallacy and the realities of how we work put a cap on the accuracy we are able to attain.
    • We're back to the misuse of the Planning Fallacy.
    • So let's revisit the Planning Fallacy's foundations so we don't misuse it again:
      • The context of planning provides many examples in which the distribution of outcomes in past experience is ignored. Scientists and writers, for example, are notoriously prone to underestimate the time required to complete a project, even when they have considerable experience of past failures to live up to planned schedules. - Kahneman and Tversky (1979)
      • Kahneman and Tversky tell us, as does Flyvbjerg: don't do stupid things on purpose. Don't underestimate when you have direct experience from the past that those underestimates were wrong.

An Interlude to Address the Planning Fallacy

The Planning Fallacy is a set of cognitive biases present across all levels of expertise and all subject matters

  • Planners/PMs/project teams are exhibiting the Planning Fallacy when they:
    • Use internal, idealized scenarios about the future.
    • Ignore past information.
    • Fall prey to false optimism.
    • Engage in the estimating processes with unacknowledged high motivational forces.
  • The Planning Fallacy has two components:
    • Anchors that influence the planning process - data introduced early in planning, even spurious data, that then wrongly influence estimates.
    • Related to the Planning Fallacy, found in novices and experts alike, are limits in cognition when probabilistic thinking is involved - this becomes an issue when project teams must deal with the likelihood of risk events. This is seen in the Healthcare.gov project. Identifying and Managing Project Risk: Essential Tools for Failure-Proofing Your Project, Second Edition, Tom Kendrick, 2009.
  • The Planning Fallacy has several dimensions:
    • Plans
    • Past
    • Optimism
    • Motivation
    • Anchor
    • Models
  • Solutions to the Planning Fallacy are straightforward:
    • Premortem - what were all the possible pitfalls in this imaginary project failure?
    • Reference Classes - the Reference Class is a database of projects. A project team can compare their project to projects in the database. There are several external databases for most IT projects. (See the sketch after this list.)
    • Most Likely Development - look at parts of the project that have the highest probability of cost overrun, schedule overrun, or negative environmental impacts.
    • Bayesian analysis - adjust judgment of the probability of success of the planning efforts.
    • Explicit models - simulation models, including Monte Carlo and Method of Moments.
    • Structured techniques - processes that rely on data and detail, repeated multiple times throughout estimation phases.
    • Decomposition - using the Work Breakdown Structure to force more detail and illuminate faulty estimating models and resulting biases.
    • Scrum - stories and tasks are ranked and scored by the Scrum team using story points.
    • Theory of Constraints - use real-time mitigation for projects via buffer management.
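A minimal sketch of the Reference Class idea mentioned above (my illustration - the overrun ratios are made-up stand-ins for a real reference class database):

```python
import numpy as np

# Made-up reference class: actual cost / estimated cost for comparable
# past projects. A real database would hold many more observations.
overrun_ratios = np.array([1.05, 1.40, 0.95, 1.25, 1.80, 1.10,
                           1.55, 1.00, 1.35, 2.10, 1.20, 1.45])

inside_view = 500_000  # the team's own bottom-up estimate ($)

# Outside view: budget at the percentile of the reference class that
# matches the confidence you need, not at the raw estimate.
uplift = np.percentile(overrun_ratios, 80)
outside_view = inside_view * uplift

print(f"Inside view:  ${inside_view:,.0f}")
print(f"80th percentile uplift: {uplift:.2f}")
print(f"Outside view (80% confidence): ${outside_view:,.0f}")
```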

OK, Back to the Post

  • Reality Blindness - The insistence that estimates are implementable. In business, we create estimates and plans and begin work immediately, rarely with the expectation that those plans will change.
    • This is a very naive process, a low maturity level in the CMMI sense.
    • Plans change; that's why it's called a plan.
    • When plans change, for all the right and possibly wrong reasons, estimates must be adjusted.
    • In our formal Federal Acquisition Regulation world, Baseline Change Requests mandate a new Estimate to Complete and Estimate at Completion.

Summary

The post showed the symptoms of poor planning processes. The root causes are well known and the corrective actions readily available. The challenge is to have the will and fortitude to actually do the right thing.

This is a challenge in our high maturity space and defense business. So that's the real problem. It is not what many suggest - that estimating can't be done. It can. But you have to want to do it.

Like nearly every root cause, it comes down to the people (page 18 below). Without a Root Cause Analysis, the original post just points out the symptoms and provides no means to take corrective actions, leaving 2 of the 3 steps for making improvements to our probability of project success missing.

Some Resource Materials

Final Comment

The key here is to understand that Benson's five symptoms of poor planning all have root causes. Each root cause has a corrective action. Each corrective action is hard to implement and hard to sustain, but for mission critical projects manifestly important to apply.

And most important, when we hear estimating can't be done, do your homework: look for thought leaders in estimating, read about estimating in all kinds of domains, and don't believe any conjecture without first testing it in the broader realm of research-based, evidence-based, peer-reviewed, field-validated concepts. The notion that we can make decisions about spending other people's money in the absence of estimates of how much, when we'll be done, and what we'll be able to deliver is, as Alistair suggests about #NoEstimates, a pile of self-contradictory statements.

Time to reconnect with the business process of managing other people's money.

 

Categories: Project Management

Release Early and Release Often

Thu, 03/19/2015 - 14:23

After a conversation of sorts about release early and release often, it's clear that without a domain this notion is a very nice platitude, with no way to test its applicability in an actual business and technical environment.

Like many conversations based on platitudes, it ends with do what works for you, with no actionable outcomes.


Let's Look from a Domain Point of View

The notion of releasing early assumes the customer can take the produced value early - that there are no externalities in the system. Since most systems are actually systems of systems, the ability to accept early and often releases means there are no externalities. No externalities means - in our systems engineering dominated world of enterprise and software intensive systems - that the system is simple.

So for simple systems, sure, early and often might be useful. It needs to be tested, though, against the business and the business rhythm. Let's look at several domains I work in:

  • Enterprise IT - an ERP system integrated with legacy systems and with producers and consumers of information. Early releases into production impact other systems. This drives churn for those systems, which must also take early and often releases, causing further churn. This wastes resources. As well, ERP systems have externalities from legacy and other business systems and processes.
  • Real Time Systems - have interactions with external systems - the system under control. Early may mean that the system under control is not ready to use the release and has to be modified to accept the release when it is ready.

For Non-Trivial Systems - There's A Better Philosophy

Turn early and often into plan and release - with margin for irreducible uncertainty and buy-down for reducible uncertainty - for each of the needed capabilities, at the planned time, for the planned cost.

These releases and their dates are of course incremental value produced by the project against the planned value for the project as a whole. But the delivery of this value must coincide with the business's ability not only to accept the value, but to put this value to work.

The chart below shows an enterprise project with externalities in a health insurance domain. You can see that early and often has dependencies. It always has dependencies in any non-trivial system. All enterprise systems have interdependencies and externalities.

[Chart: enterprise capability flow with externalities]

The business must be able to accept the produced value into production. The last paragraph of the RERO philosophy is untested and likely unsubstantiated in practice - like many of the agile philosophies, devoid of a domain and a context.

I'm a direct user of agile, in Enterprise and DOD environments. But when I hear a phrase in the absence of a domain, context in that domain, and a specific assessment of the system architecture and related business process architecture, it tells me it's just a platitude.

And like all platitudes, you can't object to the words unless you have a basis in reality to stand on. Which is why it's easy to produce platitudes but hard to apply them.

Categories: Project Management

Managing in Presence of Uncertainty

Thu, 03/19/2015 - 13:19

On Twitter there are several threads which speak to an underlying issue of managing in the presence of uncertainty - ranging from we can make decisions without estimating, to you can't possibly estimate the outcomes that occur in the future, to the notion that my idea doesn't need to be tested outside my personal anecdotal experience, all the way to my idea violates the core principles of business decision making, but please apply it anyway.

The principles of successful project management are applicable across a broad spectrum of domains when there is a value at risk sufficient to concern those funding the project. Insufficient concern about loss of the investment? No one cares what you do or how you do it.

But if we have a non-trivial project, here are five immutable principles for managing in the presence of uncertainty when spending other people's money.

[Chart: five immutable principles of managing in the presence of uncertainty]

But these principles need processes and practices. There's a book on that: Performance-Based Project Management®. But there are tons of other resources as well. Here's one we're applying currently in our domain.


So when you hear we can't or don't need to estimate what will happen in the future, what you are really hearing is we have no intention of managing in the presence of uncertainty.

Categories: Project Management

When Is Delivering Early Not That Much Value?

Wed, 03/18/2015 - 16:49

There is a phrase in agile: deliver early and deliver often. Let's test this potential platitude in the enterprise and software intensive systems business.

Let's look at Deliver Often first. Often needs to match the business rhythm of the project or the business. This requires answers to several questions:

  • How often can the business accept new features into the workflow processes?
  • Do users need training for these new features? If so, is there an undue burden on the training organization for new training sessions?
  • Are there external processes that need to stay in sync with these new features?
  • Are there changes to data or reports as a result of these new features?

Deliver Early is more problematic

  • Can the organization put the features to work?
  • Are there dependencies on external connections?

So let's look at the enterprise or software intensive systems domain. How about showing up as planned? This of course means having a Plan. The picture below is an enterprise system that has planned capabilities in a planned order, with planned features.

[Chart: planned capabilities delivered in a planned order with planned features]

So before succumbing to the platitudes of agile, determine the needs for successful project completion in your domain. Then ask whether that platitude is actually applicable to your domain.

Categories: Project Management

Quote of the Day

Tue, 03/17/2015 - 13:47

It's not so much about the creativity of the work, it's about emoting on an economically based schedule.
- Sean Penn talking with Jon Krakauer

At the end of the day all the work we do is about converting money into something of value. For that value to have value to those paying, we need to provide beneficial outcomes at a time they can be put to use and for a cost that is less than the value they produce. Since all our project work is based on managing in the presence of uncertainty, we need the ability to make decisions based on estimates - since the future is emerging in front of us and the past is not likely to be representative of that future.

Failure to realize this means a disconnection between cost and value.

 

Categories: Project Management

Managing Projects By The Numbers

Sun, 03/15/2015 - 19:42

When we hear we don't need deadlines; we don't need estimates; we can slice work small enough to have each and every activity be the same size, with no variance; we can use small samples with ±30% variance to forecast future outcomes in the absence of any uncertainties; we're Bad at Estimates; and the plethora of other reasons for not doing the work needed to be proper stewards of our customer's money - we're really saying we weren't paying attention in high school statistics class.

All project activities are probabilistic, driven by the underlying uncertainties of the individual random processes. When coupled together - and only trivial projects have independent work activities - cost, schedule, and technical activities drive each other in non-linear, non-deterministic ways. Managing in the presence of this uncertainty means risk reduction work or margin. Both are needed.

When it's not our money, we are obligated to be stewards of the money from those paying for the production of the value. This is a core principle of all business operations.

The notion that those on the cost side of the balance sheet have an equal voice to those on the revenue side when it comes to managing the firm's money to produce products or services seems to be lost on those not accountable for managing the money.

Categories: Project Management

Pi Day

Sat, 03/14/2015 - 14:30

March 14, 2015

3/14/15

at

9:26:53 AM

π = 3.141592653

Categories: Project Management

Estimating Probabilistic Outcomes? Of Course We Can!

Fri, 03/13/2015 - 16:30

At a client site in Sacramento, CA, off and on since last December. Driving back to the hotel tonight, there was a radio show on the current Uniform California Earthquake Rupture Forecast (Version 3).

[Image: UCERF3 earthquake forecast postcard]

A former Colorado neighbor is an earthquake researcher, and we've had discussions about probabilistic estimation of complex systems and complexity sciences.

The notion that we can't predict - to some level of confidence - outcomes in the future is of course simply not correct. Earthquake prediction is not technically possible in the populist sense; it's a complex probabilistic process.

Making forecasts - estimates of future outcomes - for software development projects is much less complex. The processes used to make these estimates range from past performance time series to multi-dimensional parametric models. Several tools are available for these parametric models. Steve McConnell provided an original one I used a decade or so ago. Steve provides some background on making estimates where he speaks of the 10 deadly sins:

  • Confusing targets with estimates - the bug-a-boo of all #NoEstimates advocates. It's simple - DON'T DO THIS.
  • Saying yes when you mean no - no quantitative data and guessing means bad estimates - DON'T DO THIS.
  • Committing too early - use the cone of uncertainty.
  • Assuming underestimating has no impact on the project - DON'T DO THIS.
  • Estimating in the Impossible Zone - an estimate in the impossible zone has essentially zero probability of coming true.
  • Overestimating savings from the use of new tools - DON'T DO THIS.
  • Using only one estimating technique - DON'T DO THIS.
  • Not using estimating software - DON'T DO THIS.
  • Not including risk factors - the primary sin of the simple small samples of stories or story points used to linearly forecast future performance. DON'T DO THIS.
  • Providing off-the-cuff estimates - this is called guessing. DON'T DO THIS.

When you need to estimate - as you do on any non-trivial project - make sure you're not committing any of the 10 sins Steve mentions.
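One simple guard against several of those sins - off-the-cuff numbers, committing in the impossible zone, using a single technique - is a three-point estimate. A minimal sketch using the PERT beta approximation; the task names and values are illustrative assumptions:

```python
# PERT three-point estimate: optimistic (O), most likely (M), pessimistic (P).
# Mean = (O + 4M + P) / 6, std dev = (P - O) / 6 per the beta approximation.
# Task values are illustrative assumptions.
tasks = {
    "data model": (3, 5, 12),
    "services":   (5, 8, 20),
    "UI":         (4, 6, 15),
}

total_mean, total_var = 0.0, 0.0
for name, (o, m, p) in tasks.items():
    mean = (o + 4 * m + p) / 6
    sd = (p - o) / 6
    total_mean += mean
    total_var += sd ** 2  # variances add ONLY if the tasks are independent
    print(f"{name:10s} mean {mean:5.1f} days, sd {sd:4.1f}")

total_sd = total_var ** 0.5
print(f"total: {total_mean:.1f} ± {total_sd:.1f} days")
# Committing at the sum of the optimistic values is estimating in the
# impossible zone; commit at mean + 1 sigma (~84% confidence) instead.
print(f"84% confidence commit: {total_mean + total_sd:.1f} days")
```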

So Why The Earthquakes and SW Estimates?

One process is a very complex and emerging science. The other is a well developed mathematical process.

There is so much misinformation about estimating software development, it's hard to know where to start - from outright wrong math, to misuse of mathematical concepts, to failure to acknowledge that estimates aren't for developers, they're for those paying the developers.

Categories: Project Management

Empirical Data Used to Estimate Future Performance Re-Deux

Fri, 03/13/2015 - 05:05

Came across this puzzling tweet today

... real empirical data & using probability to forecast are worlds apart. I don't buy "estimation uses data" argument.

This always reminds me of Wolfgang Pauli's remark to a colleague who showed him a paper from an author who wanted Pauli to comment on it...

Das ist nicht nur nicht richtig, es ist nicht einmal falsch!
It is not only not right, it is not even wrong

So Let's Look at a Simple Forecasting Process

First, forecasting is about the future. Estimates are about the past, present, and future. So estimates of future cost, schedule, and technical performance can be called forecasts.

A project's past performance data is a time series, gathered from things that happened in the past, based on intervals of time. These intervals should be evenly spaced. They don't have to be, but unevenness makes the analysis more complex. The example below is done with R, a statistical programming language found at www.r-project.org. R is used in a wide variety of domains. I was introduced to R through our son's work in cellular biology, when he pointed out I'd get a D in the BioStats class he taught: stop making linear projections, unadjusted for the variances of the past, and most of all unadjusted for variances created from uncertainty in the future. Come on Dad, get with the program of making risk informed decisions. Here's a good reference on how to do this at a much broader scale: Risk Informed Decision Making Handbook.

Below is an R plot from historical data of a project cost parameter, forecasting the possible values of this parameter into the future using ARIMA (Autoregressive Integrated Moving Average). ARIMA is built into R, which can be downloaded for free. R and its statistical analysis capabilities are used in our domain to develop estimates. Using past performance - in the example below, of a cost index - we can forecast the range, and the confidence bounds on that range, for cost index values.
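For readers without R, here's a minimal equivalent sketch in Python using statsmodels - the series is made-up data standing in for a real cost index, and the ARIMA(1,1,1) order is an illustrative choice, not one fitted to the data:

```python
from statsmodels.tsa.arima.model import ARIMA

# Made-up monthly cost index observations standing in for project history.
history = [1.02, 0.98, 1.05, 1.10, 1.04, 0.97, 1.08, 1.12,
           1.06, 1.01, 1.09, 1.15, 1.11, 1.07, 1.13, 1.18]

# Illustrative order; a real analysis selects it from the data
# (ACF/PACF plots, information criteria).
fitted = ARIMA(history, order=(1, 1, 1)).fit()

forecast = fitted.get_forecast(steps=6)
mean = forecast.predicted_mean
bounds = forecast.conf_int(alpha=0.20)  # 80% confidence interval

for i, (m, (lo, hi)) in enumerate(zip(mean, bounds), start=1):
    print(f"month +{i}: {m:.3f}  (80% CI {lo:.3f} .. {hi:.3f})")
```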

[Chart: ARIMA forecast of a project cost index with confidence bounds]

The chart above is from the paper below.

Earned Value Management Meets Big Data, from Glen Alleman.

So when you hear about empirical data being used for forecasting the future, or you hear the statement in the opening of this post, ask several questions:
  • Do you have any experience forecasting future outcomes from past performance that is mathematically credible?
  • Did you adjust your forecast for past variances?
  • Did you adjust your forecast for future uncertainty?

No? Then it's unlikely your number will have any chance of being correct.

We Have No Empirical Data, Now What?

Here's a continuation of the Tweet stream

Have you ever been asked to estimate something and haven't got any empirical data? This happens all the time. New teams are put together in new domains and asked to estimate, which really means commit. I don't see too many managers gathering real data about their projects and using it to forecast lead times.

Here's the way to solve this non-problem problem.

This is a short list, just from my office book shelf. The office library has dozens of other books and the files have many dozens of recent papers on estimating software in the absence of empirical data. Google will find you 100's more.

So here's the final outcome: whenever we hear about the reasons we can't estimate, it's simply not true, never was true, never will be true.

Categories: Project Management