Making decisions in the presence of uncertainty about the future outcomes resulting from those decisions is an important topic in the project management, product development, and engineering domains. The first question in this domain is...
If the future is not identical to the past, how can we make a decision in the presence of this future uncertainty?
The answer is we need some means of taking what we know about the past and the present and turning it into information about the future. This information can be measurements of actual activities - cost, duration of work, risks, dependencies, performance and effectiveness measures - as well as models and simulations of past and future activities, reference classes, and parametric models.
If the future is identical to the past and the present, then all this data can show us a simple straight-line projection from the past into the future.
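As a sketch of that straight-line projection (the throughput numbers below are hypothetical), carrying the historical mean forward looks like this:

```python
import statistics

# Hypothetical past throughput: features completed in each of six sprints.
past_throughput = [7, 8, 6, 9, 8, 7]

# If the future is identical to the past, the projection is just the
# historical mean carried forward in a straight line.
mean_rate = statistics.mean(past_throughput)  # 7.5 features per sprint

remaining_features = 40
sprints_needed = remaining_features / mean_rate
print(f"Straight-line forecast: {sprints_needed:.1f} sprints remaining")
```

The projection is only as good as the assumption behind it: that the future throughput will look like the past.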
But there are some questions:
The answers to these and many other questions can be found in the mathematics of probability and statistics. Here are some popular misconceptions of mathematical concepts.
Modeling is the Key to Decision Making
"All models are wrong, some are useful," George¬†Box and Norman R. Draper (1987). Empirical Model-Building and Response Surfaces, p. 424, Wiley. ISBN 0471810339.¬†
We can't possibly estimate activities in the future if we don't already know what they are
We actually do this all the time. But more importantly, there are simple step-by-step methods for making credible estimates about unknown - BUT KNOWABLE - outcomes.
This notion of unknown but knowable is critical. If we really can't know - if it is unknowable - then the work is not a project. It is pure research. So move on, unless you're a PhD researcher.
Here's a little dialog showing how to estimate most anything in the software development world.
With your knowledge and experience in the domain and a reasonable understanding of what the customer wants (no units of measure for reasonable by the way, sorry), let's ask some questions.
I have no pre-defined expectation of the duration. That is, I have no anchor to start from. If I did, and didn't have a credible estimate, I'd be a Dilbert manager - and I'm not.
Microeconomics of Decision Making
Making decisions about the future in the presence of uncertainty can be addressed by microeconomic principles. Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources. Projects have limited resources; businesses have limited resources. All human endeavors have limited resources - time, money, talent, capacity for work, skills, and other unknowns.
The microeconomics of decision making involves several variables
Formally, defining this choice problem is simple: there is a state space S, whose elements are called states of nature and represent all the possible realizations of uncertainty; there is an outcome space X, whose elements represent the possible results of any conceivable decision; and there is a preference relation ≿ over the mappings from S to X. †
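A toy rendering of that formal setup, with hypothetical states, probabilities, and payoffs (and expected value standing in for the preference relation):

```python
# States of nature S with hypothetical probabilities P(s).
states = {"demand_high": 0.4, "demand_low": 0.6}

# Acts are mappings from S to outcomes X (here, payoffs in $K).
acts = {
    "build_feature":  {"demand_high": 120, "demand_low": -30},
    "defer_decision": {"demand_high": 40,  "demand_low": 10},
}

# One simple preference relation over acts: rank by expected value.
def expected_value(act: str) -> float:
    return sum(p * acts[act][s] for s, p in states.items())

best = max(acts, key=expected_value)
print(best, expected_value(best))  # build_feature 30.0
```

Real valuations use richer preference relations than expected value, but the structure - states, outcomes, and a ranking over acts - is the same.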
This of course provides little in the way of making a decision on a project. But the point here is that making decisions in the presence of uncertainty is a well developed discipline. Conjecturing it can't be done simply ignores this discipline.
The Valuation of Project Deliverables
It's been conjectured that focusing on value is the basis of good software development efforts. When it is suggested that this value is independent of cost, that is misinformed. Valuation, and the resulting Value used to compare choices, is the process of determining the economic value of an asset, be it a created product, a service, or a process. Value is defined as the net worth, or the difference between the benefits produced by the asset and the costs to develop or acquire the asset, all adjusted appropriately for probabilistic risk, at some point in time.
This valuation has several difficulties:
The valuation of the outcomes of software projects depends on the analysis of these underlying costs and benefits. A prerequisite for cost-benefit analysis is the identification of the relevant value and cost drivers needed to produce that value. Both cost and value are probabilistic, driven by uncertainty - both reducible and irreducible uncertainty.
In addition to measurable benefits and costs of the software project, the valuation process must consider uncertainty. Uncertainty arises from different sources. Natural (aleatory) uncertainty is irreducible. This uncertainty relates to variations in the environment variables. Dealing with irreducible uncertainty requires margin for cost, schedule, and the performance of the outcomes - for both value and cost.
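One way to size that margin, sketched with a hypothetical triangular duration distribution for a single task:

```python
import random

random.seed(42)  # reproducible sketch

# Aleatory (irreducible) variation in one task's duration, modeled here
# with a hypothetical triangular PDF: best 8, most likely 10, worst 16 days.
samples = sorted(random.triangular(8, 16, 10) for _ in range(10_000))

p50 = samples[len(samples) // 2]          # median outcome
p80 = samples[int(len(samples) * 0.80)]   # 80% confidence outcome

# Margin is the gap between the commitment level and the median:
# commit at P80, not at the median, to protect against natural variation.
print(f"P50 = {p50:.1f} days, P80 = {p80:.1f} days, margin = {p80 - p50:.1f} days")
```

The specific percentile is a management choice; the point is that irreducible variation is handled with margin, not wished away.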
Event-based (epistemic) uncertainty is reducible. That is, we can buy down this uncertainty with our actions. We can pay money to find things out. We can pay money to improve the value delivered from the cost we invest to produce that value.
Parameter uncertainty relates to the estimation of parameters (e.g., the reliability of the average number of defects). Model uncertainty relates to the validity of specific models used (e.g., the suitability of a certain distribution to model the defects). There is a straightforward taxonomy of uncertainty for software engineering that includes additional sources such as scope error and assumption error. The standard approach to handling uncertainty is defining probability distributions for the underlying quantities, allowing the application of a standard calculus. Other approaches based on fuzzy measures or Bayesian networks consider different types of prior knowledge. ‡
The Final Point Once Again
The conjecture that we can make informed decisions about choices in an uncertain future, in the absence of estimates of the impacts of those choices, has no basis in the mathematics of decision making.
This conjecture is simply not true. Any attempt to show this can be done has yet to materialize in any testable manner. This is where the basic math skills come into play. There is no math that supports this conjecture. Therefore there is no way to test this conjecture. It's personal opinion uninformed by any mathematics.
Proceed with caution when you hear this.
† Decision Theory Under Uncertainty, Johanna Etner, Meglena Jeleva, Jean-Marc Tallon, Centre d'Economie de la Sorbonne 2009.64
‡ "Estimates, Uncertainty and Risk," IEEE Software, 69-74 (May 1997), Kitchenham and Linkman; and "Belief Functions in Business Decisions," Studies in Fuzziness and Soft Computing, Vol. 88, Srivastava and Mock
Decision making is hard. Decision making is easy when we know what to do. When we don't know what to do, there are conflicting choices that must be balanced in the presence of uncertainty for each of those choices. The bigger issue is that important choices are usually ones where we know the least about the outcomes and about the cost and schedule to achieve those outcomes.
Decision science evolved to cope with decision making in the presence of uncertainty. This approach goes back to Bernoulli in the early 1700s, but remained an academic subject into the 20th century, because there was no satisfactory way to deal with the complexity of real life. Just after World War II, the fields of systems analysis and operations research began to develop. With the help of computers, it became possible to analyze problems of great complexity in the presence of uncertainty.
In 1938, Chester Barnard, author of The Functions of the Executive, imported the term "decision making" from the lexicon of public administration into the business world. This term replaced narrower descriptions such as "resource allocation" and "policy making."
Decision analysis functions at four different levels
Each level focuses on different aspects of the problem of making decisions. And it is decision making that we're after. The purpose of the analysis is not to obtain a set of numbers describing decision alternatives. It is to provide the decision-maker the insight needed to choose between alternatives. These insights typically have three elements:
Now To The Problem at Hand
It has been conjectured ...
The key here, and the critical unanswered question, is how can a decision about an outcome in the future, in the presence of that uncertain future, be made in the absence of estimating the attributes going into that decision?
That is, if we have less than acceptable knowledge about a future outcome, how can we make a decision about the choices involved in that outcome?
Dealing with Uncertainty
All project work operates in the presence of uncertainty. The underlying statistical processes create probabilistic outcomes for future activities. These activities may be probabilistic events, or the naturally occurring variances of the processes that make up the project.
Clarity of discussion through the language of probability is one of the bases of decision analysis. The reality of uncertainty must be confronted and described, and the mathematics of probability is the natural language to describe uncertainty.
When we don't have this clarity of language - when mathematical terms are redefined or misused in the conversation - agreeing on the ways (and there are many ways) of making decisions in the presence of an uncertain future becomes bogged down in approaches that can't be tested in any credible manner. What remains is personal opinion, small-sample anecdotes, and attempts to solve complex problems with simplistic approaches.
For every complex problem there is an answer that is clear, simple, and wrong. - H. L. Mencken
The estimator's charter is not to state what the developers should do, but rather to provide a reasonable projection of what they will do. - Tom DeMarco
Here are a few resource materials for estimating cost, schedule, and technical outcomes on software-intensive systems. In a meeting about managing risk in the presence of uncertainty, it became clear we need to integrate estimating with risk, technical performance measures, measures of effectiveness, measures of performance, cost, and schedule.
Some Recent Resources
There are 1,117 other papers and articles in the Software Cost Estimating folder on my server. These are just a very small sample of how to make estimates.
The notion that we can make decisions in the presence of uncertainty in the absence of estimating (#NoEstimates) the outcomes of those decisions can only work if there are no opportunity costs at risk in the future. That is, there is nothing at risk in making a choice between multiple outcomes.
Our daughter is an elementary teacher in Austin, Texas. A nice school, the number 2 school in Texas.
While visiting this week, we were talking about a new book a group of us are working on. While showing her the TOC, she said Dad, we do all that stuff (minus the finance side) every day, week, month, semester, and year. It's not that hard. That's what we've been trained to do. OK, but talent, dedication, skill, and a gift for teaching help.
Here's how an elementary school teacher sees her job as the Project Manager of 20 young clients.
There was an interesting post on the #NoEstimates thread that triggered memories of our hiking and climbing days with our children (now grown and gone) and our neighbor who has summited many of the highest peaks around the world.
The quote was Getting better at estimates is like using time to plan the Everest climb instead of climbing smaller mountains for practice.
A couple background ideas:
In our neighborhood are several semi-pro mountain climbers. People move to Colorado for the outdoor life: skiing, mountain and road biking, hiking, and climbing.
Now to the Tweet suggesting that getting better at estimating can be replaced by doing (climbing) smaller projects. Turns out estimates are needed for those smaller mountains; estimates are needed for all hiking and climbing. But first...
Let's start with those Things.
No matter how prepared you are, you need a plan. Practice on lower peaks is necessary but far from sufficient for success. Each summit requires planning in depth. For Longs Peak you need a Plan A, a Plan B, and possibly a Plan C. Most of all you need strong estimating skills and the accompanying experience to determine when to invoke each plan. People die on Longs because they foolishly think they can beat the odds and press on rather than invoking Plan B.
So the suggestion that you can summit something big, like any of the Seven Summits, without both deep experience and deep planning likely means never being heard from again.
So the OP is likely speaking without having summited much of anything; it's hard to tell, with no experience resume attached.
The estimating part is basic. Can we make it to the Keyhole on Longs Peak before the afternoon storms come in? On Everest, can we make it to the Hillary Step before 1:00 PM? No? Turn back; you're gonna die if you continue.
Can we make it to the delivery date at the pace we're on now, AND with the emerging situation for the remaining work, AND for the cost we're trying to keep, AND with the capabilities the customer needs? Remember, the use of past performance is fine If and Only If the future is something like the past, or we know something about how the future is different from the past.
When the future is not like the past, we need a Plan B. And that plan has to have estimates of our future capabilities, our cost expenditure rate, and our abilities to produce the needed capabilities.
ALL PLANNING IN THE PRESENCE OF UNCERTAINTY REQUIRES - MANDATES ACTUALLY - ESTIMATING.
Ask any hiker, climber, development manager, or business person. Time to stop managing by platitudes and start managing by the principles of good management.
The Art of Systems Architecting† is a book that changed the way I look at the development of software-intensive systems. As a manager of software in the system of systems domain, this book created a clear and concise vision of how to assemble all the pieces of the system into a single cohesive framework.
One of the 12 principles of the Agile Manifesto is The best architectures, requirements, and designs emerge from self-organizing teams. The self-organizing team part is certainly good. But good architectures don't emerge, unless it's the Ball of Mud architecture. Good architecture is a combination of science, engineering, and art. Hence the title of the book.
Systems architecting borrows from other architectures, but the basic attributes are the same: †
Why Do We Care About This?
When we hear of some new and possibly different approach to anything, we need to ask: what is the paradigm this idea fits into? If it is truly new, what paradigm does it replace? How does that replacement maintain the needed information from the old paradigm used for success, what parts of the old paradigm are replaced for the better, and how can we be assured that it is actually better?
One answer starts with the architecture of the paradigm. In the case of managing projects, this is the programmatic architecture: the Principles, Practices, and Processes of the Programmatic Architecture.
Five Immutable Principles of project success can be found in...
With these principles we can apply Five Practices guided by these Principles
With the Principles and Practices in place, Processes can be defined for the specific needs of the domain.
So with the Principles, Practices, and Processes in place, we can now ask
When it is suggested a new approach be taken, where does that approach fit in the Principles, Practices, and Processes that are in place now? If there is no place, how does this new suggestion fulfill the needs of the business that are in place? If those needs aren't fulfilled, does the business acknowledge that those needs are no longer needed?
If not, the chances of this new idea actually being accepted by the business are slim to none.
There are several paradigms for Systems Thinking, ranging from psychobabble to hard-core Systems Engineering. A group of colleagues is starting a book with the working title Increasing The Probability of Project Success; several of the chapters are based on Systems Thinking.
But first, some background on Systems Theory, Systems Thinking, and Systems Engineering.
Systems Theory is the interdisciplinary study of systems in general, with the goal of elucidating principles that can be applied to all types of systems at all nesting levels in all fields of research. Systems Engineering is an interdisciplinary field of engineering that focuses on how to design and manage complex engineering systems over their life cycles. Systems Management (MSSM, USC, 1980) is an umbrella discipline encompassing systems engineering, managerial finance, contract management, program management, human factors, and operations research, in military, defense, space, and other complex systems disciplines.
Here are two book references that inform our thought processes.
This book is the basis of thinking about systems. It's a manufacturing and Industrial Engineering paradigm. Software-intensive systems fit in here as well, since interfaces between system components define the complexity aspects of all systems of systems.
This book opens with an Einstein quote: In the brain, thinking is doing. As engineers - yes, software engineering is alive and well in many domains, no matter how much we think we have to do - we can plan, prepare, and predict, but action occurs through doing.
So when we hear any suggestion, ask: how can this be put to work in some measurable way to assess the effectiveness and performance of the outcomes?
This is the companion mapping-processes book. Systems Thinking is the process of understanding how systems influence one another within a world of systems, and has been defined as an approach to problem solving that views our "problems" as parts of an overall system, rather than reacting to a specific part or outcome.
There are many kinds of systems: hard systems, software systems, evolutionary systems. It is popular to mix these, but that creates confusion and removes the ability to connect concepts with actionable outcomes.
Cynefin is one of those popular approaches that has no units of measure for complex, complicated, chaotic, and obvious. Just soft, self-referencing words.
So in our engineering paradigm this approach is not very useful.
Along with these approaches are some other seminal works.
In The End
Everything's a system. Interactions between components are where the action is and where the problems come from. Any non-trivial system has interactions that must be managed as system interactions. This means modeling these interactions, estimating the impacts of these interactions, and defining the behaviors of these interactions before, during, and after their development.
This means recognizing the criteria for a mature and effective method of managing in the presence of uncertainty.
Project work is random. Most everything in the world is random: the weather, commuter traffic, the productivity of writing and testing code. Few things actually take as long as they are planned. Cost is less random, but there are variances in the cost of labor and the availability of labor. Mechanical devices have variances as well.
The exact fit of a water pump on a Toyota Camry is not the same for each pump. There is a tolerance in the mounting holes and in the volume of water pumped. This is a variance in the technical performance.
Managing in the presence of these uncertainties is part of good project management. But there are two distinct paradigms of managing in the presence of these uncertainties.
In the first case we have empirical data. In the second case we don't. There are two approaches to modeling what the system will do in terms of cost and schedule outcomes.
Bootstrapping the Empirical Data
With samples of past performance and the proper statistical assessment of those samples, we can re-sample them to produce a model of future performance. This bootstrap resampling shares the principle of the second method - Monte Carlo Simulation - but with several important differences.
This bootstrapping method is quick and easy to apply and produces a result just as quickly. But it has issues that must be acknowledged.
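A minimal sketch of that resampling, assuming a small set of hypothetical observed task durations:

```python
import random

random.seed(7)  # reproducible sketch

# Hypothetical observed durations (days) for completed work items.
observed = [3, 5, 4, 6, 8, 4, 5, 7, 3, 6]

def bootstrap_totals(observed, items_remaining, trials=5_000):
    """Resample past durations (with replacement) to forecast remaining work."""
    return [sum(random.choices(observed, k=items_remaining))
            for _ in range(trials)]

totals = sorted(bootstrap_totals(observed, items_remaining=20))
p80 = totals[int(len(totals) * 0.80)]
print(f"80% of trials finish the remaining 20 items within {p80} days")
```

Note that every resampled future here is built only from the observed past; that is precisely the limitation discussed below.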
Monte Carlo Simulation
This approach is more general and removes many of the restrictions on the statistical confidence of bootstrapping.
Just as a reminder, in principle both the parametric and the non-parametric bootstrap are special cases of Monte Carlo simulations used for a very specific purpose: estimating some characteristics of the sampling distribution. But like all principles, in practice there are larger differences when modeling project behaviors.
In the more general approach of Monte Carlo Simulation, the algorithm repeatedly creates random data in some way, performs some modeling with that random data, and collects some result.
In practice, when we hear Monte Carlo simulation we are talking about a theoretical investigation, e.g. creating random data with no empirical content - or from reference classes - used to investigate whether an estimator can represent known characteristics of this random data. The (parametric) bootstrap, by contrast, refers to an empirical estimation and is not necessarily a model of the underlying processes, just a small sample of observations independent of the actual processes that generated that data.
The key advantage of MCS is that we don't necessarily need past empirical data. MCS can be used to advantage if we have it, but we don't need it for the Monte Carlo Simulation algorithm to work.
This approach can be used to estimate some outcome, as in the bootstrap, but also to theoretically investigate some general characteristic of a statistical estimator (cost, schedule, technical performance) that is difficult to derive from empirical data.
MCS removes the roadblock heard in many critiques of estimating - we don't have any past data on which to estimate. No problem: build a model of the work and the dependencies between that work, assign statistical parameters to the individual or collected PDFs, and run the MCS to see what comes out.
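Here's a minimal sketch of that approach. The task network and the triangular PDFs below are hypothetical; no past data is required, only a model of the work and its dependencies:

```python
import random

random.seed(11)  # reproducible sketch

# Hypothetical network: A and B run in parallel, C depends on both, D on C.
# Each task gets a triangular duration PDF (best, most likely, worst) in days.
tasks = {"A": (5, 7, 12), "B": (4, 6, 15), "C": (3, 4, 8), "D": (2, 3, 6)}

def simulate_once() -> float:
    # random.triangular takes (low, high, mode).
    d = {name: random.triangular(low, high, mode)
         for name, (low, mode, high) in tasks.items()}
    return max(d["A"], d["B"]) + d["C"] + d["D"]  # C waits for both A and B

runs = sorted(simulate_once() for _ in range(10_000))
p50, p80 = runs[len(runs) // 2], runs[int(len(runs) * 0.80)]
print(f"P50 finish = {p50:.1f} days, P80 finish = {p80:.1f} days")
```

Commercial tools do this over full schedule networks with correlations; the sketch shows only the core loop of sampling the PDFs and propagating them through the dependencies.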
This approach has several critical advantages:
So Here's the Killer Difference
Bootstrapping models make several key assumptions, which may not be true in general. So they must be tested before accepting any of the outcomes.
Monte Carlo Simulation models provide key value that bootstrapping can't.
The critical difference between Bootstrapping and Monte Carlo Simulation is that MCS can show what the future performance has to be to stay on schedule (within variance), on cost, and have the technical performance meet the needs of the stakeholder.
When the process of defining the needed behavior of the work is done, a closed loop control system is put in place. This needed performance is the steering target. Measures of actual performance compared to needed performance generate the error signals for taking corrective actions. Just measuring past performance and assuming the future will be the same is Open Loop control. Any non-trivial project management method needs a closed loop control system.
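A stripped-down sketch of that error signal, with hypothetical planned and actual cumulative progress (the 5% threshold is an arbitrary illustration):

```python
# Closed-loop control sketch: compare needed (planned) performance to
# measured performance and generate an error signal each period.
planned = [10, 20, 30, 40, 50]  # planned cumulative % complete
actual  = [9, 17, 24, 33, 41]   # measured cumulative % complete

for period, (p, a) in enumerate(zip(planned, actual), start=1):
    error = p - a  # the error signal that drives corrective action
    if error > 5:
        action = "take corrective action"
    elif error > 0:
        action = "monitor the variance"
    else:
        action = "on or ahead of plan"
    print(f"period {period}: variance {error:+d}% -> {action}")
```

The essential feature is the comparison against the steering target, not against yesterday's actuals alone.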
Bootstrapping can only show what the future will be like if it is like the past, not what it must be like. In Bootstrapping this future MUST be like the past. In MCS we can tune the PDFs to show what performance has to be to manage to that plan. Bootstrapping is reporting yesterday's weather as tomorrow's weather - just like Steve Martin in LA Story. If tomorrow's weather turns out not to be like yesterday's weather, you're gonna get wet.
MCS can forecast tomorrow's weather by assigning PDFs to future activities that are different from past activities; then we can make any needed changes in that future model to alter the weather to meet our needs. This is in fact how weather forecasts are made - with much more sophisticated models of course - here at the National Center for Atmospheric Research in Boulder, CO.
This forecasting (estimating the future state) of possible outcomes, and the alteration of those outcomes through management actions - change dependencies, add or remove resources, provide alternatives to the plan (on-ramps and off-ramps of technology, for example), buy down risk, apply management reserve, assess impacts of rescoping the project, etc. - is what project management is all about.
Bootstrapping is necessary but far from sufficient for any non-trivial project to show up on or before the need date (with schedule reserve), at or below the budgeted cost (with cost reserve), and have the product or service provide the needed capabilities (with technical performance reserve).
Here's an example of that probabilistic forecast of project performance from an MCS tool (Risky Project). This picture shows the probability for cost, finish date, and duration. It is built on time-evolving PDFs assigned to each activity in a network of dependent tasks, which models the work stream needed to complete as planned.
When that future work stream is changed to meet new requirements, unfavorable past performance and the needed corrective actions, or changes in any or all of the underlying random variables, the MCS can show us the expected impact on key parameters of the project so management intervention can take place - since Project Management is a verb.
The connection between the Bootstrap and Monte Carlo simulation of a statistic is simple.
Both are based on repetitive sampling and then direct examination of the results.
But there are significant differences between the methods (hence the difference in names and algorithms). Bootstrapping uses the original, initial sample as the population from which to resample. Monte Carlo Simulation uses a data generation process, with known values of the parameters of the Probability Distribution Function. A common algorithm for MCS is Lurie-Goldberg. Monte Carlo is used to test that the results of the estimators produce desired outcomes on the project - and if not, to allow the modeler and her management to change those estimators and then manage to the changed plan.
Bootstrap can be used to estimate the variability of a statistic and the shape of its sampling distribution from past data. Assuming the future is like the past, it can make forecasts of throughput, completion, and other project variables.
In the end, the primary difference (and again the reason for the name difference) is that Bootstrapping is based on unknown distributions - sampling and assessing the shape of the distribution in Bootstrapping adds no value to the outcomes - while Monte Carlo is based on known or defined distributions, usually from Reference Classes.
When we hear all the difficulties, misuse, abuse, and inabilities for making software development estimates, the real question is: what does the business think of this?
Good question. When our children were young - 6 or 7 - they asked the big question once they started receiving an allowance for household chores.
What's the reason we have money? The best answer I could think of was money provides the vehicle for the exchange of value. I pay you for doing something around the house which is of value to me or your mother. That money you receive can then be used to buy something of value for you. You can exchange your money for that valued thing.
Money is the carrier of value between two parties in the transaction.
The basis of a successful business is the exchange of cost (money) for value. Either internally for building a product or externally for providing a service to an outside firm.
In practice this value, and the cost to achieve this value, operate in the presence of uncertainty in non-trivial situations. If I see a bicycle helmet I want (notice I didn't say need, since I have a perfectly good and nice helmet), I can look at the price tag and determine if my old helmet needs to be replaced. That is, is the value of the new helmet sufficient that I'm willing to pay the cost and absorb the cost of the old helmet?
The price tag for the helmet is clear. I might be able to get a discount. The value of the helmet is up to me. As a minimum, the value is equal to the cost. This is the basis of Earned Value Management. But the value of the helmet may be immeasurable if it saves me from a brain trauma if I were to crash. I've never crashed or been hit on my road bike. A few small spills on the mountain bike.
The Business Notion of Value in Exchange for Cost
In the business of software development the exchange process takes place in the presence of uncertainty; there are no price tags attached to the development process the way there is one on the helmet. Uncertainty about the value - beyond the cost. Uncertainty about the cost. Uncertainty about the risks, and all the other random variables associated with the project work.
When making decisions in the presence of these uncertainties, we can consider the opportunity cost. This is the loss of potential gain from other alternatives when one alternative is chosen. If I choose one path at one cost over another path at another cost, what's the lost opportunity? This decision-making process is the basis of Microeconomics as it is applied to Software Development.
Opportunity costs are fundamental costs in economics, and are used in computing cost benefit analysis of a project. Such costs, however, are not recorded in the account books but are recognized in decision making by computing the cash outlays and their resulting profit or loss.
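As a toy illustration (the expected net benefits below are hypothetical), the opportunity cost of a choice is the value of the best alternative forgone:

```python
# Hypothetical expected net benefits ($K) of three mutually exclusive options.
alternatives = {"option_a": 140, "option_b": 180, "option_c": 120}

chosen = "option_a"
best_forgone = max(v for k, v in alternatives.items() if k != chosen)
opportunity_cost = best_forgone - alternatives[chosen]
print(f"Opportunity cost of {chosen}: ${opportunity_cost}K")  # $40K
```

In practice each of those net-benefit figures is itself an estimate of a random variable, which is the point of the question that follows.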
So let's ask a simple question...
How can we make those opportunity cost decisions in the presence of the uncertainty of their specific value? Meaning the "value" of the Value - an actual number of Dead Presidents (dollars) - is a random variable, dependent on a variety of things. Can we make a decision in the presence of this uncertainty? This uncertainty is around the precision and accuracy of our knowledge of this "value."
The answer is we need to estimate the value of the number of Dead Presidents for both the cost of the value and the value of the value. Both sides of the equation are needed. This means we need to know something about the value returned in exchange for our cost to acquire that value. A common - and simple - way to measure that is through the Return on Investment (ROI). The investment in this case is the cost we've assigned to the expected value to be returned. The formula is
ROI = (Value - Cost) / Cost
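Since both Value and Cost are random variables, the ROI is one too. A sketch with hypothetical triangular distributions for each:

```python
import random

random.seed(3)  # reproducible sketch

def roi_sample() -> float:
    # Hypothetical triangular PDFs ($K): random.triangular(low, high, mode).
    value = random.triangular(80, 200, 120)
    cost = random.triangular(50, 110, 70)
    return (value - cost) / cost  # ROI = (Value - Cost) / Cost

samples = sorted(roi_sample() for _ in range(10_000))
median_roi = samples[len(samples) // 2]
p_positive = sum(r > 0 for r in samples) / len(samples)
print(f"Median ROI = {median_roi:.0%}, P(ROI > 0) = {p_positive:.0%}")
```

The decision is then made on the distribution of ROI - including the probability it is negative - not on a single point value.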
The two¬†variables¬†in most instances are random variables. If we're¬†Investing in a new helmet like a really nice Specialized S-works, we can make the trade offs - the opportunity costs, between the $250.00 for the helmet and the potential immeasurable value of¬†saving my head on the mountain bike, while riding with our collegiate racer son. The current helmet will likely do that job as well, but that carbon fiber S-works helmet will look very cool while riding next to our sons ¬†S-works Epic 29√©r. See you can buy¬†cool.
But now we need to determine the opportunity cost for a software system, a feature in a software system, or something else related to a software system. That is, we need to decide in the presence of the uncertainty created by the randomness of the many variables of the software process.
How can we do that? Well, and here's the punch line...
WE NEED TO ESTIMATE.
Can we make that decision in the absence of an estimate? Sure, we can guess, we can make a blind decision, we can just decide and suffer the consequences of that decision. But here's the rub, as Will Shakespeare once had Hamlet say: if it's not our money we're exchanging for the produced or acquired value, those providing the money have a vested interest in knowing how much money will be spent, and when the value produced or acquired for that money will be returned.
This is the basis of Microeconomics, which is defined as the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources.
So when someone says we can make decisions without estimates, they're wrong, unless they have no concern for the loss resulting from the write-off of the opportunity cost. It is doubtful that those paying for the value have much interest in that.
Can we make decisions without an estimate of the outcomes? Not in any credible way I know of. If there is a credible way, it has yet to be stated. Slicing is estimating. Using past performance for forecasting (estimating outcomes in the future) is estimating.
So the answer to the conjecture that we can make decisions in the presence of uncertainty in the absence of an estimate is also simple: we can't.
And anyone saying they can needs to come to grips with the principles of the microeconomics of software development.
There are several BS detector paradigms. One of my favorites is Carl Sagan's. This one has been adapted from Reality Charting, the book that goes along with the Apollo Method Root Cause Analysis process we use in our governance process in our domain. Any outage, any hard break, any disruption to the weekly release process is cause for Root Cause Analysis.
Here's Carl's checklist applied to the #NoEstimates conjecture that decisions can be made in the absence of estimates:
Try to prove your hypothesis wrong. Every truth is provisional, and the purpose of science is to disprove that which we think we know.
Use carefully designed experiments to test all hypotheses.
So it's becoming clear that #NoEstimates does not pass the smell test of the basic BS meter.
The Big Questions
The rhetoric on #NoEstimates has now reached a fever pitch: paid conferences, books, blatant misrepresentations. Time to call BS and move on. This is the last post. I've met many interesting people, in both good and bad ways, and will stay in touch. So long, and thanks for all the fish, as Douglas Adams said. Those with the money will have the final say on this idea.
The current misrepresentation approach is to quote people like Bent Flyvbjerg - who, by the way, does superior work in the domain of mass transportation and public works. Bent, along with many others (one of whom is a client), has studied the problems with Mega Projects. The classic misuse of these studies starts with reading the introduction of a report and going no further. Here's a typical summary.
9/10 costs are underestimated. 9/10 benefits are overestimated. 9/10 schedules are underestimated.
OK, we all know that's what the report says, now what?
No? Then the use of reports, and broad unqualified clips from reports, is just Lying With Statistics.
The classic example from the same source states that Steve McConnell PROVES estimates can't be done in his book - which is, of course, the antithesis of both the title and the content of that book.
This approach is pervasive in places where doing your homework appears to be a step that was skipped.
From our own research in DOD ACAT1 programs (>$5B, which qualifies as a megaproject), here's the Root Cause of program problems in our domain.
When we hear some extracted statement from a source in another domain - Bent's domain, for example, is large construction infrastructure projects: roads, rail, ports - moved to our domain without the underlying details of the data, the root causes, and all the possible corrective actions that could have avoided the problem in the first place, that idea is basically bogus. Don't listen. Do your own investigation, and learn how not to succumb to those who Lie With Statistics.
So let's look at some simple questions for when we hear about problems with our projects, or with projects in other domains from those trying to convince us those problems are applicable to our domain.
These questions, and their lack of answers, are at the heart of most project performance problems. Pointing out all the problems is very easy. Providing corrective actions once the root cause is discovered is harder - necessarily harder, by the way - because Risk Management is How Adults Manage Projects.
First, let's look at what Bent says:
He states that the political economy of megaprojects - massive investments of a billion dollars or more in infrastructure or technology - consistently ends up costing more, with smaller benefits than projected, and almost always ends up with costs that exceed the benefits.
So the first question is: are we working in the megaproject domain? If not, can we assert Bent's assessments are applicable? If we haven't confirmed that, then we're Lying with Statistics. (Read Huff's book to find out why.)
Flyvbjerg then explores the reasons for the poor predictions and poor performance of giant investment projects and what might be done to improve their effectiveness. Have we explored the reasons why our projects overrun? No? Then we haven't done our homework and are speculating on false data - another How to Lie With Statistics.
Stating that projects overrun 9 out of 10 times, without also finding the reasons why, is the perfect How to Lie with Statistics: make a statement, provide no supporting data, be the provocateur.
When we read a statement without a domain or context, without a corrective action, taken out of context to convey a message different from the one the original author intended, and without evidence that it is applicable in our domain, that is Lying with Statistics. Don't listen; go find out for yourself.
It's been popular recently in some agile circles to mention we use the 5 Whys when asking about dysfunction. This common and misguided approach assumes - wrongly - that causal relationships are linear and that problems come from a single source. For example:
Estimates are the smell of dysfunction. Let's ask the 5 Whys to reveal these dysfunctions
There is a natural tendency to assume that in asking the 5 Whys there is a single thread connecting cause and effect from beginning to end. This single source of the problem - the symptom - is labeled the Root Cause. The question is: is that root cause the actual root cause? The core problem is that the 5 Whys is not really seeking a solution, but just eliciting more symptoms masked as causes.
A simple example from Apollo Root Cause Analysis illustrates the problem.
Say we're in the fire prevention business. If preventing fires is our goal, let's look for the causes of the fire and determine the corrective actions needed to actually prevent the fire from occurring. In this example, let's say we've identified 3 potential causes of fire. There is ...
So what is the root cause of the fire? To prevent the fire - and, in the follow-on example, prevent a dysfunction - we must find at least one cause of the fire that can be acted on to meet the goals and objectives of preventing the fire AND that is within our control.
Here's a briefing used now for our development and deployment processes in the health insurance domain
Root cause analysis master plan from Glen Alleman
The notion that Estimates are the smell of dysfunction in a software development organization, and the practice of asking the 5 Whys in search of the Root Cause of that dysfunction, are equally flawed.
The need to estimate or not estimate has not been established. It is presumed that the estimating process creates the dysfunction, and then the search - through the 5 Whys - is a false attempt to categorize the root causes of this dysfunction. The supposed dysfunction is then reverse-engineered to be connected to the estimating process. This is not only a naïve approach to solving the dysfunction; it inverts the logic by ignoring the question of the need to estimate. Without confirmation that estimates are needed or not needed, the search for the cause of the dysfunction has no purposeful outcome.
The decision that estimates are needed or not needed does not belong to those being asked to produce the estimates. That decision belongs to those consuming the estimate information in the decision-making process of the business - those whose money is being spent.
And of course those consuming the estimates need to confirm they are operating their decision-making processes in some framework that requires estimates. It could very well be that those providing the money don't actually need an estimate: the value at risk may be low enough - say, 100 hours of development for a DB upgrade. But when the value at risk is sufficiently large - and that determination is again made by those providing the money - then a legitimate need to know how much, when, and what is made by the business. In this case, decisions are based on the Microeconomics of opportunity cost for uncertain outcomes in the future.
This is the basis of estimating and of determining the real root causes of the problems with estimates. Saying we're bad at estimating is NOT the root cause, and it is never a reason not to estimate. If we are bad at estimating, and if we do have confirmation and optimism biases, then fix them. Remove the impediments to producing credible estimates, because those estimates are needed to make decisions in any non-trivial value-at-risk work.
"When I use a word," Humpty Dumpty said in a rather scornful tone, "it means just what I choose it to mean - neither more nor less."
"The question is," said Alice, "whether you can make words mean so many different things."
"The question is," said Humpty Dumpty, "which is to be master."
Through the Looking Glass, Chapter 6
The mantra of #NoEstimates is that No Estimates is not about Not Estimating. Along with that oxymoron comes
Forecasting is Not Estimating
This of course redefines the standard definitions of both terms. Estimating is a rough calculation or judgment of a value, number, quantity, or extent of some outcome.
An estimate is an approximation, prediction, or projection of a quantity based on experience and/or information available at the time, with the recognition that other pertinent facts are unclear or unknown.
Forecasting is a prediction of a future event.
Both estimating and forecasting result in a probabilistic output in the presence of uncertainty.
Slicing is Not Estimating??
Slicing is cutting work into smaller pieces so that a "standard" size can be used to project the work effort and completion time. This is a standard basis of estimate in many domains. So slicing is Not Estimating in the #NoEstimates paradigm? In fact, slicing IS estimating - another inversion of the term.
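To see why slicing is itself estimating, here is a minimal sketch (all numbers hypothetical) that projects completion time from a count of standard-size slices and observed throughput. The result is a projection of an uncertain future quantity - which is exactly what an estimate is:

```python
def forecast_completion(remaining_slices, weekly_throughputs):
    """Project weeks to finish using past throughput (slices per week).

    Using past performance this way IS estimating: the output is a
    projection of an uncertain future outcome, not a known fact.
    """
    if not weekly_throughputs:
        raise ValueError("need at least one observed week")
    avg = sum(weekly_throughputs) / len(weekly_throughputs)
    return remaining_slices / avg

observed = [6, 8, 7, 9, 6]              # slices completed in past weeks
weeks = forecast_completion(72, observed)
print(f"projected completion in {weeks:.1f} weeks")  # -> 10.0 weeks
```

Calling the output a "projection" or a "forecast" doesn't change its nature; it is a point estimate built from a reference class of past performance.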
No means Yes
Using past performance to estimate future performance is core to all estimating processes. A time series used to estimate possible future outcomes is easily produced with ARIMA, 4 lines of R, and some raw data, as shown in The Flaw of Averages. But as described there, care is needed to confirm the future is like the past.
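The text points at ARIMA in R; as a stand-in, here is a minimal pure-Python AR(1) sketch of the same idea - fit a model to past performance, then roll it forward. The velocity numbers are hypothetical, and a real analysis would use a proper library (for example statsmodels' ARIMA, or R's arima) and would first check that the series is stationary, i.e. that the future resembles the past:

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = a + b*x[t-1] (a minimal AR(1) model)."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def forecast(series, steps):
    """Roll the fitted AR(1) model forward 'steps' periods."""
    a, b = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(steps):
        last = a + b * last
        out.append(last)
    return out

velocity = [21, 23, 22, 25, 24, 26]   # hypothetical past sprint velocities
print(forecast(velocity, 3))          # projected next three periods
```

Note that this, too, is estimating: the model's coefficients and its projections are all inferences about an uncertain future drawn from past data.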
When We Redefine Words to Suit Our Needs, We're Humpty Dumpty
Lewis Carroll's Alice in Wonderland is a political allegory of 19th century England. When #NoEstimates redefines established mathematical terms like Forecasting and Estimating, and ignores the underlying mathematics of time series forecasting - ARIMA, for example - its proponents are willfully ignoring established practices and replacing them with their own untested conjectures.
Key here: ways to make decisions with NO ESTIMATES. OK, show how that is not actually an estimating technique - no matter how simple or flawed an estimating technique it may be.
When we hear things like ...
Why promising nothing delivers more and planning always fails,
It's in doing the work that we discover the work that we must do,
If estimates were real, the team wouldn't have to know the delivery date; they'd just work naturally and be on date.
You have to ask: do these posters have any understanding that it's not their money? That all project work is probabilistic? That nonlinear, non-stationary, stochastic processes drive uncertainty for all work in ways that cannot be controlled by disaggregating the work (slicing), or by assuming that work elements are independent of other work elements, in all but the most trivial project contexts?
Systems Thinking and Probability†
All systems where optimum technical performance is needed require a negative feedback loop as the basis for controlling the work, in order to arrive on the planned date, with the planned capabilities, for the planned budget. If there is no need to arrive as planned or as needed, then no control system is needed; just spend until told to stop.
The negative feedback loop, as a control system, is the opposite of the positive feedback loop. In chemistry, a positive feedback loop is best referred to as an explosion. In project management, a positive feedback loop results in a project that requires an ever greater commitment of resources, beyond what was anticipated, to produce the needed capabilities. That is cost and schedule overrun and a lower probability of technical success.
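A toy sketch of the difference (the gain values are hypothetical): under negative feedback, each control cycle's corrective action shrinks the deviation from plan; under positive feedback, each deviation compounds the next one:

```python
def step(deviation, gain):
    """One control cycle: next deviation = deviation * (1 + gain).
    A negative gain corrects toward plan; a positive gain compounds error."""
    return deviation * (1 + gain)

def run(gain, cycles=10, start=100.0):
    """Apply the feedback loop for a number of cycles."""
    d = start
    for _ in range(cycles):
        d = step(d, gain)
    return d

negative = run(gain=-0.5)   # corrective action each cycle: deviation shrinks
positive = run(gain=+0.5)   # each slip makes the next slip bigger: "explosion"
print(f"negative feedback: {negative:.2f}, positive feedback: {positive:.2f}")
```

Starting from the same 100-unit deviation, the negatively controlled loop converges toward the plan while the positively fed loop diverges by orders of magnitude - the cost and schedule "explosion" described above.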
A project is a type of complex adaptive system that acquires information about its environment and the interactions between the project elements, identifies information of importance, and places that information within a context, model, or schema, and then acts on this information to make decisions.
The individual members of the project act as complex adaptive systems themselves and exert influence on the selection of both the schema and the adaptive forces used to make decisions. The extent to which learning produces adaptive or maladaptive behavior determines the survival or failure of the project and of the organization producing the value from the project.
Managing in the Presence of Uncertainty
Uncertainty creates risk: reducible risk and irreducible risk. This risk is, by its nature, probabilistic. Complex systems tend to organize themselves into a normal distribution of outcomes ONLY if each individual element of the system is Independent and Identically Distributed. If this is not the case, long-tailed distributions result, and these are the source of Black Swans. These Black Swans are the unanticipated cost and schedule performance problems and technical failures we are familiar with in the literature. The project explodes.
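A Monte Carlo sketch (hypothetical duration distributions, standard library only) illustrates the point: summing thin-tailed IID task durations yields a project total that concentrates near its median, while a heavy right tail keeps extreme totals - the Black Swans - in play:

```python
import random

def total_duration(sampler, tasks=50, trials=2_000, seed=7):
    """Monte Carlo the total project duration; return median and 99th percentile."""
    rng = random.Random(seed)
    totals = sorted(sum(sampler(rng) for _ in range(tasks)) for _ in range(trials))
    p50 = totals[len(totals) // 2]
    p99 = totals[int(len(totals) * 0.99)]
    return p50, p99

thin = lambda rng: rng.gauss(10, 2)             # thin-tailed IID durations
fat  = lambda rng: rng.lognormvariate(2.0, 1.0) # heavy right tail

print("thin tails  p50/p99:", total_duration(thin))
print("heavy tails p50/p99:", total_duration(fat))
```

The ratio of the 99th percentile to the median is much larger for the heavy-tailed case: the tail risk does not average away, which is why the distributional assumptions behind an estimate matter as much as the estimate itself.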
So We Arrive at the End
To manage in the presence of an uncertain future for cost, delivery date, and the delivered capabilities that produce the value in exchange for the cost, we need some mechanism to inform our decision-making process based on these random variables - the random variables that create risk, risk that must be reduced to increase the probability of success: the reducible risk and the irreducible risk.
This mechanism is the ability to estimate the impact of any decision while making the trade-off between decision alternatives. This is the basis of Microeconomics: the trade-off of a decision based on the opportunity cost of the collection of decision alternatives.
Anyone conjecturing that decisions can be made in the presence of uncertainty, without estimating the impacts of those decisions, has willfully ignored the foundational principles of Microeconomics.
The only possible way to make decisions in the absence of estimating the impact of a decision is when the decision has a trivial value at risk. Many decisions are just that: if I decide wrong, the outcome has little or no impact on cost, schedule, or needed technical performance. In this case, Not Estimating is a viable option. For all other conditions, Not Estimating results in a Black Swan explosion of the customer's budget, timeline, and expected beneficial outcomes based on the produced value.
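The microeconomic trade-off described above can be sketched as a simple expected-value comparison. The alternatives, probabilities, and payoffs below are entirely hypothetical; the point is that choosing requires estimating the probability-weighted outcomes, and the opportunity cost is the value of the best alternative forgone:

```python
def expected_value(outcomes):
    """Probability-weighted value of one alternative: [(probability, payoff), ...]."""
    return sum(p * v for p, v in outcomes)

# Hypothetical alternatives competing for the same budget.
build_feature = [(0.6, 250_000), (0.4, 50_000)]    # uncertain market payoff
upgrade_db    = [(0.9, 120_000), (0.1, 80_000)]    # safer, smaller payoff

ev = {"build_feature": expected_value(build_feature),
      "upgrade_db": expected_value(upgrade_db)}
chosen = max(ev, key=ev.get)
# Opportunity cost: the expected value of the best alternative NOT chosen.
opportunity_cost = max(v for k, v in ev.items() if k != chosen)
print(f"choose {chosen}; opportunity cost = ${opportunity_cost:,.0f}")
```

Without estimates of those probabilities and payoffs there is nothing to compare, and the "decision" is just a guess with someone else's money.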
† Technical Performance Measurement, Earned Value, and Risk Management: An Integrated Diagnostic Tool for Program Management, Commander N. D. Pisano, SC, USN, Program Executive Officer Air NSW, Assault and Special Missions Programs (PEO(A)). Nick is a colleague. This paper is from 1991, defining how to plan and assess performance for complex, emergent systems.
Thanks to all my neighbors, friends, and colleagues for their service.
The book Software for Your Head was a seminal work when we were setting up our Program Management Office in 2002, for a mega-project to remove nuclear waste from a very contaminated site in Golden, Colorado.
Here's an adaptation of those ideas to the specifics of our domain and problems
Software for your mind from Glen Alleman
This approach was a subset of a much larger approach to managing in the presence of uncertainty, very high risk, and even higher rewards, all on a deadline and a fixed budget, as was stated in the Plan of the Week:
Do this every week, guided by the 3 year master plan and make sure no one is injured or killed.
That project is documented in the book Making the Impossible Possible, summarized here.
Making the impossible possible from Glen Alleman
We've been doing this for 20 years and therefore you can as well
is a common phrase heard when we ask in what domain does your approach work? Of course, without a test of that idea outside the domain of the anecdotal example, it's going to be hard to know if the idea is actually credible beyond those examples.
So if we hear we've been successful in our domain doing something - or, better yet, NOT doing something, like NOT estimating - ask: in what domain have you been successful? Then the critical question: is there any evidence that the success in that domain is transferable to another domain? This briefing provides a framework - from my domain of aircraft development - illustrating that domains vary widely in their needs, constraints, governance processes, and the applicable and effective approaches to delivering value.
Paradigm of agile project management from Glen Alleman
Google seems to have forgotten how to advance the slides on the Mac, so click on the presentation title (Paradigm of Agile PM) to do that. Safari works.
Education is not the learning of facts, but the training of the mind to think. - Albert Einstein
So if we're going to learn how to think about managing the spending of other people's money in the presence of uncertainty, we need some basis of education.
Uncertainty is a fundamental and unavoidable feature of daily life - personal life and the life of projects. To deal with this uncertainty intelligently, we must represent and reason about these uncertainties. There are formal ways of reasoning (the logical systems found in the Formal Logic and Artificial Intelligence domains) and informal ways of reasoning (based on the probability and statistics of cost, schedule, and technical performance in the Systems Engineering domain).
If Twitter, LinkedIn, and other forum conversations have taught me anything, it's that many participants base their discussion on personal experience and opinion. Experience informs opinion. That experience may be based on gut feel learned in the school of hard knocks. But there are other ways to learn as well: ways to guide your experience and inform your opinion, ways based on education and frameworks for thinking about solutions to complex problems.
Samuel Johnson has served me well with his quote...
There are two ways to knowledge: we know a subject ourselves, or we know where we can find information upon it.
Hopefully the knowledge we know ourselves has some basis in fact, theory, and practice, vetted by someone outside ourselves, someone beyond our personal anecdotal experience.
Here's my list of essential readings that form the basis of my understanding, opinions, principles, practices, and processes as they are applied in the domains I work in: enterprise IT, defense, space, and their software-intensive systems.
So In The End
This list is the tip of the iceberg for access to the knowledge needed to manage in the presence of uncertainty while spending other people's money.
Any process that does not have provisions for its own refinement will eventually fail or be abandoned
- W. R. Corcoran, PhD, P.E., The Phoenix Handbook: The Ultimate Event Evaluation Manual for Finding Profit Improvement in Adverse Events, Nuclear Safety Review Concepts, 19 October 1997.