
Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Success

Whose Budget is it Anyway?

Wed, 05/27/2015 - 05:30

When we hear things like ...

Why promising nothing delivers more and planning always fails,
or
 It's in doing the work that we discover the work that we must do,
or
 If estimates were real the team wouldn't have to know the delivery date, they just work naturally and be on date.

You have to ask whether these posters have any understanding that it's not their money. That all project work is probabilistic. That nonlinear, non-stationary, stochastic processes drive uncertainty for all work in ways that cannot be controlled by disaggregating the work (slicing), or by assuming that work elements are independent of other work elements, in all but the most trivial of project contexts.

Systems Thinking and Probability†

All systems where optimum technical performance is needed require a negative feedback loop as the basis for controlling the work in order to arrive on the planned date, with the planned capabilities, for the planned budget. If there is no need to arrive as planned or as needed, then no control system is needed, just spend until told to stop.

The negative feedback loop as a control system is the opposite of the positive feedback loop. In chemistry a positive feedback loop is best referred to as an explosion. In project management a positive feedback loop results in a project that requires a greater commitment of resources than was anticipated to produce the needed capabilities. That is cost and schedule overrun and a lower probability of technical success.
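To make the direction of the two loops concrete, here is a minimal sketch (Python, with purely hypothetical numbers) contrasting a damped, negative-feedback correction of cost variance with a compounding, positive-feedback one. The gain values and period count are illustrative assumptions, not a model of any real project.

# Minimal sketch: negative vs. positive feedback acting on a cost variance.
# The gain values and the initial variance are hypothetical.

def run_project(feedback_gain, periods=12):
    """Accumulate cost variance when each period's overrun is either
    damped (gain < 1, negative feedback) or amplified (gain > 1, positive feedback)."""
    variance = 10.0      # overrun observed in the first period
    cumulative = 0.0
    for _ in range(periods):
        cumulative += variance
        variance *= feedback_gain   # corrective action shrinks it, or compounding grows it
    return cumulative

print("negative feedback (gain 0.5):", round(run_project(0.5), 1))   # variance dies out
print("positive feedback (gain 1.5):", round(run_project(1.5), 1))   # variance explodes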

A project is a type of complex adaptive system that acquires information about its environment and the interactions between the project elements, identifies information of importance, and places that information within a context, model, or schema, and then acts on this information to make decisions.

The individual members of the project act as complex adaptive systems themselves and exert influence on the selection of both the schema and the adaptive forces used to make decisions. The extent to which learning produces adaptive or maladaptive behavior determines the survival or failure of the project and the organization producing the value from the project.

Managing in the Presence of Uncertainty

Uncertainty creates risk - reducible risk and irreducible risk. This risk by its nature is probabilistic. Complex systems tend to organize themselves into a normal distribution of outcomes ONLY if each individual element of the system is Independent and Identically Distributed. If this is not the case, long-tailed distributions result, and these are the source of Black Swans. These Black Swans are the unanticipated cost and schedule performance problems and technical failures we are familiar with in the literature. The project explodes.
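A small simulation makes the point. The sketch below (Python with numpy; the lognormal parameters and the shared risk factor are illustrative assumptions) sums the durations of 20 tasks, once when the tasks are Independent and Identically Distributed and once when they all share a common project-wide risk factor. The correlated case has a noticeably fatter right tail relative to its median - the territory where the Black Swans live.

import numpy as np

rng = np.random.default_rng(42)
n_tasks, n_trials = 20, 10_000

# Case 1: task durations are Independent and Identically Distributed.
# The total tends toward a normal distribution (Central Limit Theorem).
iid = rng.lognormal(mean=2.0, sigma=0.5, size=(n_trials, n_tasks)).sum(axis=1)

# Case 2: the same marginal durations, but every task shares a common
# project-wide risk factor, so the tasks are no longer independent.
common = rng.lognormal(mean=0.0, sigma=0.5, size=(n_trials, 1))
correlated = (rng.lognormal(mean=2.0, sigma=0.5, size=(n_trials, n_tasks)) * common).sum(axis=1)

for name, total in (("independent", iid), ("correlated", correlated)):
    p50, p95 = np.percentile(total, [50, 95])
    print(f"{name:12s} median={p50:7.1f}  95th percentile={p95:7.1f}  tail ratio={p95 / p50:.2f}")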


So We Arrive at the End

To manage in the presence of an uncertain future for cost, delivery date, and delivered capabilities to produce the value in exchange for the cost, we need some mechanism to inform our decision making process based on these random variables. The random variables that create risk. Risk that must be reduced to increase the probability of success. The reducible risk and the irreducible risk.

This mechanism is the ability to estimate the impact of any decision while making the trade-off between decision alternatives - this is the basis of Microeconomics - the trade-off of a decision based on the opportunity cost of the collection of decision alternatives.
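As a minimal sketch of that trade-off, assume two hypothetical alternatives whose costs and benefits are only known as ranges (the names and numbers below are invented for illustration). Estimating the probabilistic impact of each choice is what lets us compare them and price the opportunity cost of picking one over the other.

import numpy as np

rng = np.random.default_rng(7)
trials = 20_000

# Hypothetical alternatives: each has an uncertain cost and an uncertain benefit,
# expressed here as simple (low, high) ranges in the same monetary units.
alternatives = {
    "build in-house":    {"cost": (80, 140), "benefit": (150, 220)},
    "buy and integrate": {"cost": (60, 100), "benefit": (120, 180)},
}

expected_value = {}
for name, a in alternatives.items():
    cost = rng.uniform(*a["cost"], size=trials)
    benefit = rng.uniform(*a["benefit"], size=trials)
    expected_value[name] = (benefit - cost).mean()

best = max(expected_value, key=expected_value.get)
for name, ev in expected_value.items():
    # opportunity cost of a choice = value of the best alternative foregone
    print(f"{name:18s} expected net value={ev:6.1f}  opportunity cost={expected_value[best] - ev:6.1f}")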

Anyone conjecturing that decisions can be made in the presence of uncertainty without estimating the impacts of those decisions has willfully ignored the foundational principles of Microeconomics.

The only possible way to make decisions in the absence of estimating the impact of a decision is when the decision has a trivial value at risk. Many decisions are just that. If I decide wrong, the outcome has little or no impact on cost, schedule, or needed technical performance. In this case Not Estimating is a viable option. For all other conditions, Not Estimating results in a Black Swan explosion of the customer's budget, time line, and expected beneficial outcomes based on the produced value.

† Technical Performance Measurement, Earned Value, and Risk Management: An Integrated Diagnostic Tool for Program Management, Commander N. D. Pisano, SC, USN, Program Executive Officer Air NSW, Assault and Special Missions Programs (PEO(A)). Nick is a colleague. This paper is from 1991 defining how to plan and assess performance for complex, emergent systems. 

Categories: Project Management

Memorial Day

Sun, 05/24/2015 - 19:27

In case you thought it was about the 3-day weekend, parties, and the beach ...

Thanks to all my neighbors, friends, and colleagues for their service.

Categories: Project Management

The Dysfunctional Approach to Using "5 Whys"

Fri, 05/22/2015 - 18:29

It's been popular recently in some agile circles to mention we use the 5 Whys when asking about dysfunction. This common and misguided approach assumes - wrongly - that causal relationships are linear and problems come from a single source. For example:

Estimates are the smell of dysfunction. Let's ask the 5 Whys to reveal these dysfunctions

The natural tendency is to assume that in asking the 5 Whys there is a single thread connecting cause and effect from beginning to end. This single source of the problem - the symptom - is labeled the Root Cause. The question is: is that root cause the actual root cause? The core problem is that the 5 Whys is not really seeking a solution, but just eliciting more symptoms masked as causes.

A simple example illustrates the problem from Apollo Root Cause Analysis.

Say we're in the fire prevention business. If preventing fires is our goal, let's look for the causes of the fire and determine the corrective actions needed to actually prevent fire from occurring. In this example let's say we've identified 3 potential causes of fire. There is ...

  1. An ignition source
  2. Combustible material
  3. Oxygen

So what is the root cause of the fire? To prevent the fire - and in the follow-on example prevent a dysfunction - we must find at least one cause of the fire that can be acted on to meet the goals and objectives of preventing the fire AND that is within our control.

If we decide to control the combustible materials, then the root cause is the combustibles. Same for the oxygen. This can be done by inerting a confined space, say with nitrogen. Same for the ignition sources. This traditional Root Cause Analysis pursues a preventative solution that is within our control and meets the goals and objectives - prevent fire. But this is not actually the pursuit of the Root Cause. By pursuing this approach, we stop on a single cause that may or may not result in the best solution. We're misled into a categorical thinking process that looks for solutions. This doesn't mean there is no root cause. It means a root cause cannot be labeled until we have decided which solutions we are able to implement. The root cause is actually secondary to and contingent on the solution, not the inverse. Only after solutions have been established can we identify the actual root cause of the fire not being prevented.

The notion that Estimates are the smell of dysfunction in a software development organization, and asking the 5 Whys in search of the Root Cause, is equally flawed.

The need to estimate or not estimate has not been established. It is presumed that it is the estimating process that creates the dysfunction, and then the search - through the 5 Whys - is the false attempt to categorize the root causes of this dysfunction. The supposed dysfunction is then reverse engineered to be connected to the estimating process. This is not only a naïve approach to solving the dysfunction, it inverts the logic by ignoring the need to estimate. Without confirmation that estimates are needed or not needed, the search for the cause of the dysfunction has no purposeful outcome.

The decision that estimates are needed or not needed does not belong to those being asked to produce the estimates. That decision belongs to those consuming the estimate information in the decision making process of the business - those whose money is being spent.

And of course those consuming the estimates need to confirm they are operating their decision making processes in some framework that requires estimates. It could very well be that those providing the money to be spent by those providing the value don't actually need an estimate. The value at risk may be low enough - say 100 hours of development for a DB upgrade. But when the value at risk is sufficiently large - and that determination is again made by those providing the money - then a legitimate need to know how much, when, and what is made by the business. In this case, decisions are based on the Microeconomics of opportunity cost for uncertain outcomes in the future.

This is the basis of estimating and the determination of the real root causes of the problems with estimates. Saying we're bad at estimating is NOT the root cause. And it is never the reason not to estimate. If we are bad at estimating, and if we do have confirmation and optimism biases, then fix them. Remove the impediments to produce credible estimates. Because those estimates are needed to make decisions in any non-trivial value at risk work. 

 

Categories: Project Management

Software for the Mind

Fri, 05/22/2015 - 00:21

The book Software for Your Head was a seminal work when we were setting up our Program Management Office in 2002 for a mega-project to remove nuclear waste from a very contaminated site in Golden, Colorado.

Here's an adaptation of those ideas to the specifics of our domain and problems

Software for your mind, from Glen Alleman. This approach was a subset of a much larger approach to managing in the presence of uncertainty, very high risk, and even higher rewards, all on a deadline and a fixed budget. As was stated in the Plan of the Week:
  • Monday - Where are we going this week?
  • Daily - What are we doing along the way?
  • Friday - Where have we come to?

Do this every week, guided by the 3 year master plan and make sure no one is injured or killed.

That project is documented in the book Making the Impossible Possible summarized here.

Making the impossible possible, from Glen Alleman.
Categories: Project Management

We've Been Doing This for 20 Years ...

Thu, 05/21/2015 - 03:58

We've been doing this for 20 years and therefore you can as well

Is a common phrase used when asked: in what domain does your approach work? Of course without a test of that idea outside the domain in which the anecdotal example is used, it's going to be hard to know if that idea is actually credible beyond those examples.

So if we hear we've been successful in our domain doing something - or better yet NOT doing something, like say NOT estimating - ask in what domain have you been successful? Then the critical question: is there any evidence that the success in that domain is transferable to another domain? This briefing provides a framework - from my domain of aircraft development - illustrating that domains vary widely in their needs, constraints, governance processes, and the applicable and effective approaches to delivering value.

Paradigm of agile project management, from Glen Alleman. Google seems to have forgotten how to advance the slides on the Mac, so click on the presentation title (Paradigm of Agile PM) to do that. Safari works.
Categories: Project Management

Essential Reading List for Managing Other People's Money

Mon, 05/18/2015 - 15:58

Education is not the learning of facts, but the training of the mind to think - Albert Einstein 

So if we're going to learn how to think about managing the spending of other people's money in the presence of uncertainty, we need some basis of education.

Uncertainty is a fundamental and unavoidable feature of daily life - personal life and the life of projects. To deal with this uncertainty intelligently we must represent and reason about these uncertainties. There are formal ways of reasoning (logical systems for reasoning found in the Formal Logic and Artificial Intelligence domains) and informal ways of reasoning (based on the probability and statistics of cost, schedule, and technical performance in the Systems Engineering domain).

If Twitter, LinkedIn, and other forum conversations have taught me anything, it's that many participants base their discussion on personal experience and opinion. Experience informs opinion. That experience may be based on gut feel learned from the school of hard knocks. But there are other ways to learn as well. Ways to guide your experience and inform your opinion. Ways based on education and frameworks for thinking about solutions to complex problems.

Samuel Johnson has served me well with his quote...

There are two ways to knowledge: we know a subject ourselves, or we know where we can find information upon it.

Hopefully the knowledge we know ourselves has some basis in fact, theory, and practice, vetted by someone outside ourselves, someone beyond our personal anecdotal experience.

Here's my list of essential readings that form the basis of my understanding, opinion, principles, practices, and processes as they are applied in the domains where I work - enterprise IT, defense and space, and their software-intensive systems.

  • Making Hard Decisions: An Introduction to Decision Analysis, Robert T. Clemen
    • Making decisions in the presence of uncertainty is part of all business and technical endeavors.
    • This book and several other should be the start when making decisions about how much, when, and what.
  • Apollo Root Cause Analysis: Effective Solutions to Everyday Problems, Every Time, Dean L. Gano.
    • There is a powerful quote from Chapter 1 of this book
      • STEP UP TO FAIL
      • Ignorance is a most wonderful thing.
      • It facilitates magic.
      • It allows the masses to be led.
      • It provides answers when there are none.
      • It allows happenings in the presence of danger.
      • All this, while the pursuit of knowledge can only destroy the illusion. Is it any wonder mankind chooses ignorance?
    • This book is the starting point for all that follows. I usually only come to an engagement when there is trouble.
    • No need for improvement if there's no trouble.
    • Without a Root Cause Analysis process and corrective actions all problems are just symptoms. And treating the symptoms does little to make improvements to any situation.¬†
    • So this is the seminal book, but any RCA process is better than none.
  • The Phoenix Handbook, William R. Corcoran, PhD, P.E., Nuclear Safety Review Concepts, 19 October 1997 version.
    • This was a book and process used at Rocky Flats for Root Cause Analysis
  • Effective Complex Project Management: An Adaptive Agile Framework for Delivering Business Value, Robert K. Wysocki, J. Ross
    • All project work is probabilistic.
    • All project work is complex. Agile software development is not the same as project management.
    • For agile software development beyond a handful of people in the same room as their customer, project management is needed.
    • This book tells you where to start in performing the functions of Project Management in the Enterprise domain.
  • The Art of Systems Architecting, 2nd Edition, Mark W. Maier and Eberhardt Rechtin, CRC Press
    • Systems have architecture. This architecture is purpose built.
    • The notion that the best architectures, requirements, and designs emerge from self-organizing teams needs to be tested in a specific domain.
    • Many domains have reference architectures; DODAF and TOGAF are two examples.
    • Architectures developed by self-organizing teams may or may not be useful over the life of the system. It depends on the skills and experience of the architects. Brian Foote has a term for self-created architectures - the big ball of mud. So care is needed: failing to test the self-organizing team's ability to produce a good architecture is a risk.
    • The Rechtin book can be your guide for that test.
  • Systems Engineering: Coping With Complexity, Richard Stevens, Peter Brook, Ken Jackson, Stuart Arnold
    • All non-trivial projects are systems.
    • Systems are complex, they contain complexity
    • Defining complex, complexity, complicated needs to be done with care.
    • Much misinformation is around in the agile community about these terms, usually used to make some point about how hard it is to manage software development projects.
    • In fact there is a strong case that much of the complexity and the complex aspects in software development are simply the result of bad management of the requirements.
  • Forecasting and Simulating Software Development Projects: Effective Modeling of Kanban & Scrum Projects using Monte Carlo Simulation, Troy Magennis
    • When we hear Control in a non-deterministic paradigm is an illusion at best, delusion at worst, start with Troy's book to see that this conjecture is actually not true.
    • If the system you're working on is truly non-deterministic - that is, chaotic - you've got yourself a long road because you're on a Death March project. Run away as fast as you can.
  • Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul R. Garvey, CRC Press.
    • All project variables are probabilistic. Managing in the presence of the uncertainty created by the statistical processes that produce this probabilistic behavior is part of all project management.
    • This book speaks to the uncertainty in cost.
    • Uncertainty in schedule and technical performance are the other two variables.
    • Assuming deterministic variables or assuming you can't manage in the presence of uncertainty are both naive and ignore the basic mathematics of making decisions in the presence of uncertainty
  • Estimating Software-Intensive Systems: Projects, Products and Processes, Richard D. Stutzke, Addison Wesley.
    • A Software-Intensive System is any system where software contributes essential influences to the design, construction, deployment, and evolution of the system as a whole. [IEEE 42101:2011]
    • Such systems are by their nature complex, but estimating the attributes of such systems is a critical success factor in all modern business and technology functions.
    • For anyone conjecturing that estimates can't be made for complex systems, this book is mandatory reading.
    • Estimates are hard, but they can be done, and are done.
    • So when you hear that conjecture, ask how do you know those estimates can't be made? Where's your evidence that counters the work found in this book? Not anecdotes, opinions, or conjectures, but actual engineering assessment with the mathematics?
  • Effective Risk Management: Some Keys to Success, 2nd Edition, Edmund H. Conrow.
  • Project Risk Management: Processes, Techniques and Insight, Chris Chapman and Stephen Ward.
    • These two books are the core of Tim Lister's quote:
    • Risk Management is How Adults Manage Projects
    • Risk management involves estimating
  • The Economics of Iterative Software Development: Steering Toward Business Results, Walker Royce, Kurt Bittner, and Mike Perrow.
    • All software development is a MicroEconomics paradigm.
    • Microeconomics studies the behavior of individuals and small organizations in making decisions on the allocation of limited resources.
    • When you hear about conjectures for improving software development processes that violate Microeconomics, ignore them.
    • These limited resources are people, time, and money
  • Assessment and Control of Software Risks, Capers Jones.
    • Since all management is risk management, here's a book that clearly states how to manage in the presence of uncertainty.
  • Software Cost Estimating with COCOMO II, Barry Boehm et al.
    • This is the original basis of estimating with parametric processes (a minimal sketch of the parametric idea appears after this list)
    • Numerous tools and processes are based on COCOMO
    • Parametric estimating makes use of Reference Classes, same as Monte Carlo Simulation
    • With a parametric model or a Reference Class model, estimates of future outcomes can be made in every domain where we're not inventing new physics. This means there is no reason not to estimate for any software system found in any normal business environment.
    • This is not to say everyone can estimate. Nor should they. The excuse of we've never done this before really means you should go find someone who has.
  • Facts and Fallacies of Software Engineering, Robert L. Glass
    • There are many fallacies in the development of software
    • This book exposes most of them and provides corrective actions
  • How to Measure Anything: Finding the Value of Intangibles in Business, Douglas Hubbard
    • When we hear we can't measure that, read this book.
    • This book has a great description of Monte Carlo Simulation (used everywhere in our domains).
      • Monte Carlo started at Los Alamos during the bomb development process
      • MCS samples a large number of values from the Probability Distribution Function that represents the statistical processes being modeled.
      • MCS has some relatives; bootstrapping is one, though it operates in a different manner, using past performance as the sample population.
  • Hard Fact, Dangerous Half-Truths & Total Nonsense, Jeffrey Pfeffer and Robert Sutton
    • This book was handed out at Ken Schwaber's "The State of Agile"
    • The key here is decisions are best made using facts. When facts aren't directly available, estimates of those facts are needed.
    • Making those estimates is part of every business decision, based on the Microeconomics of writing software for money.
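As a minimal sketch of the parametric idea referenced in the COCOMO item above, here is the classic Basic COCOMO (1981) form for an organic-mode project. The coefficients are the published 1981 organic-mode values; COCOMO II adds scale factors and effort multipliers not shown here, so treat this as an illustration of the shape of a parametric model, not a calibrated estimator.

def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO (Boehm, 1981), organic mode: effort in person-months and
    schedule in calendar months as a function of estimated size in KLOC."""
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

for size in (10, 50, 100):    # size in thousands of source lines of code
    effort, months = basic_cocomo(size)
    print(f"{size:4d} KSLOC -> {effort:6.1f} person-months over {months:5.1f} months")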

So In The End

This list is the tip of the iceberg for access to the knowledge needed to manage in the presence of uncertainty while spending other people's money.

Categories: Project Management

Quote of the Day

Fri, 05/15/2015 - 21:33

Any process that does not have provisions for its own refinement will eventually fail or be abandoned

- W. R. Corcoran, PhD, P.E., The Phoenix Handbook: The Ultimate Event Evaluation Manual for Finding Profit Improvement in Adverse Events, Nuclear Safety Review Concepts, 19 October 1997.

Categories: Project Management

Complex, Complexity, Complicated

Thu, 05/14/2015 - 18:47

In the agile community it is popular to use the terms complex, complexity, and complicated - often interchangeably and often wrongly. These terms are frequently overloaded with an agenda used to push a process or even a method.

First some definitions

  • Complex - consisting of many different and connected parts. Not easy to analyze or understand. Complicated or intricate. When a system or problem is considered complex, analytical approaches, like dividing it into parts to make the problem tractable, are not sufficient, because it is the interactions of the parts that make the system complex, and without these interconnections the system no longer functions.
  • Complex System - a functional whole, consisting of interdependent and variable parts. Unlike conventional systems, the parts need not have fixed relationships, fixed behaviors or fixed quantities, and their individual functions may be undefined in traditional terms.
  • Complicated - containing a number of hidden parts, which must be revealed separately because they do not interact. Mutual interaction of the components creates nonlinear behaviors of the system. In principle all systems are complex. The number of parts or components is irrelevant in the definition of complexity. There can be complexity - nonlinear behaviour - in small systems or large systems.
  • Complexity - there is no standard definition of complexity. It is a view of systems that suggests simple causes result in complex effects. Complexity as a term is generally used to characterize a system with many parts whose interactions with each other occur in multiple ways. Complexity can occur in a variety of forms:
    • Complex behaviour
    • Complex mechanisms
    • Complex situations
    • Complex systems
    • Complex data
  • Complexity Theory - states that critically interacting components self-organize to form potentially evolving structures exhibiting a hierarchy of emergent system properties. This theory takes the view that systems are best regarded as wholes, and studied as such, rejecting the traditional emphasis on simplification and reduction as inadequate techniques on which to base this sort of scientific work.

One more item we need is the types of Complexity

  • Type 1 - fixed systems, where the structure doesn't change as a function of time.
  • Type 2 - systems where time causes changes. This can be repetitive cycles or change with time.
  • Type 3 - moves beyond repetitive systems into organic where change is extensive and non-cyclic in nature.
  • Type 4 - are self organizing, where we can combine the internal constraints of closed systems, like machines, with the creative evolution of open systems, like people.

And Now To The Point

When we hear complex, complexity, complex systems, or complex adaptive system, pause to ask: what kind of complex are you talking about? What Type of complex system? To what system are you applying the term complex? Have you classified that system in a way that actually matches a real system?

It is common for the terms complex, complicated, and complexity to be interchanged, and for software development to be classified or mis-classified as one, two, or all three. It is also common to toss around these terms with no actual understanding of their meaning or application.

We need to move beyond buzz words - words like Systems Thinking. Building software is part of a system. There are interacting parts that, when assembled, produce an outcome. Hopefully a desired outcome. In the case of software the interacting parts are more than just the parts. Software has emergent properties - a Type 4 system, built from Type 1, 2, and 3 systems. With changes in time and uncertainty, modeling these systems requires stochastic processes. These processes depend on estimating behaviors as a starting point.

The understanding that software development is an uncertain (stochastic) process is well known, starting in the 1980s [1] with COCOMO. Later models, like the Cone of Uncertainty, made it clear that these uncertainties themselves evolve with time. The current predictive models based on stochastic processes include Monte Carlo Simulation of networks of activities, Real Options, and Bayesian Networks. Each is directly applicable to modeling software development projects.

[1] Software Engineering Economics, Barry Boehm, Prentice-Hall, 1981.

Categories: Project Management

Monte Carlo Simulation of Project Performance

Thu, 05/14/2015 - 17:30

Project work is random. Most everything in the world is random. The weather, commuter traffic, the productivity of writing and testing code. Few things actually take as long as they are planned. Cost is less random, but there are variances in the cost of labor and the availability of labor. Mechanical devices have variances as well.

The exact fit of a water pump on a Toyota Camry is not the same for each pump. There is a tolerance in the mounting holes, the volume of water pumped. This is a variance in the technical performance.

Managing in the presence of these uncertainties is part of good project management. But there are two distinct paradigms of managing in the presence of these uncertainties.

  1. We have empirical data of the variances. We have samples of the hole positions and sizes of the water pump mounting plate for the last 10,000 pumps that were installed. We have samples of how long it took to write a piece of code and the attributes of the code that are correlated to that duration. We have empirical measures.
  2. We have a theoretical model of the water pump in the form of a 3D CAD model, with the materials modeled for expansion, drilling errors of the holes, and other static and dynamic variances. We model the duration of work using a Probability Distribution Function and a three-point estimate of the Most Likely, Pessimistic, and Optimistic durations. These can be derived from past performance, but we don't have enough actual data to produce the PDF and have a low enough sampling error for our needs.

In the first case we have empirical data. In the second case we don't. There are two approaches to modeling what the system will do in terms of cost and schedule outcomes.
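As a minimal sketch of the second paradigm, here is a three-point estimate turned into a PDF and sampled. The numbers are hypothetical, and a triangular distribution is used as a simple stand-in for whatever PDF is actually appropriate (a PERT/Beta shape is another common choice).

import numpy as np

rng = np.random.default_rng(1)

# Three-point estimate for one task, in days: optimistic, most likely, pessimistic.
optimistic, most_likely, pessimistic = 5.0, 8.0, 16.0

# Sample a triangular PDF built from the three-point estimate.
samples = rng.triangular(optimistic, most_likely, pessimistic, size=50_000)

p50, p80 = np.percentile(samples, [50, 80])
print(f"mean={samples.mean():.1f} days, median={p50:.1f} days, 80% confidence={p80:.1f} days")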

Bootstrapping the Empirical Data

With samples of past performance and the proper statistical assessment of those samples, we can re-sample them to produce a model of future performance. This bootstrap resampling shares the principle of the second method - Monte Carlo Simulation - but with several important differences.

  • The researcher - and we are researching what the possible outcomes might be from our model - does not know nor have any control of the Probability Distribution Function that generated the past sample. You take what you got.
  • As well, we don't have any understanding of why those samples appear as they do. They're just there. We get what we get.
  • This last piece is critical because it prevents us from defining what performance must be in place to meet some future goal. We can't tell what performance we need because we have no model of the needed performance, just samples from the past.
  • This results from the statistical condition that there is a PDF for the process that is unobserved. All we have is a few samples of this process.
  • With these few samples, we're going to resample them to produce a modeled outcome. This resampling locks in the behavior of the future using the samples from the past, which may or may not actually represent the true underlying behavior. This may be all we can do, because we don't have any theoretical model of the process.

This bootstrapping method is quick, easy, and produces a quick and easy result. But it has issues that must be acknowledged.

  • There is a fundamental assumption that the past empirical samples represent the future. That is, the samples contained in the bootstrapped list and their resampling are also contained in all the future samples.
  • Said in a more formal way
    • If the sample of data we have from the past is a reasonable representation of the underlying population of all samples from the work process, then the distribution of parameter estimates produced from the bootstrap model on a series of resampled data sets will provide a good approximation of the distribution of that statistic in the population.
    • With this sample data and its parameters (statistical moments) we can make a good approximation of the future.
  • There are some important statistical behaviors that must be considered, starting with the assumption that the future samples are statistically identical to the past samples.
    • Nothing is going to change in the future
    • The past and the future are identical statistically
    • In the project domain that is very unlikely
  • With all these conditions, for a small project with few if any interdependencies and a static work process with little variance, bootstrapping is a nice quick-and-dirty approach to forecasting (estimating the future) based on the past. A minimal sketch follows.
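Here is that minimal bootstrap forecast, assuming a hypothetical set of past cycle times and a hypothetical backlog size. Note the assumption baked into the resampling: the future is statistically identical to the past.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical past cycle times (days) for completed work items - all we have.
past_durations = np.array([3, 5, 4, 8, 2, 6, 5, 9, 4, 7])

remaining_items = 20      # hypothetical items left in the backlog
trials = 10_000

# Resample the past with replacement to forecast the remaining work.
resampled = rng.choice(past_durations, size=(trials, remaining_items), replace=True)
totals = resampled.sum(axis=1)

p50, p80 = np.percentile(totals, [50, 80])
print(f"median forecast = {p50:.0f} days, 80% confidence = {p80:.0f} days")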

Monte Carlo Simulation

This approach is more general and removes many of the restrictions on the statistical confidence of bootstrapping.

Just as a reminder, in principle both the parametric and the non-parametric bootstrap are special cases of Monte Carlo simulations used for a very specific purpose: estimate some characteristics of the sampling distribution. But like all principles, in practice there are larger differences when modeling project behaviors.

In the more general approach of Monte Carlo Simulation, the algorithm repeatedly creates random data in some way, performs some modeling with that random data, and collects some result.

  • The duration of a set of independent tasks
  • The probabilistic completion date of a series of tasks connected in a network (schedule), each with a different Probability Distribution Function evolving as the project moves into the future.
  • A probabilistic cost correlated with the probabilistic schedule model. This is called the Joint Confidence Level. Both cost and schedule are random variables with time-evolving changes in their respective PDFs.

In practice when we hear Monte Carlo simulation we are talking about a theoretical investigation, e.g. creating random data with no empirical content - or from reference classes - used to investigate whether an estimator can represent known characteristics of this random data, while the (parametric) bootstrap refers to an empirical estimation that is not necessarily a model of the underlying processes, just a small sample of observations independent from the actual processes that generated that data.

The key advantage of MCS is we don't necessarily need past empirical data. MCS can be used to advantage if we do, but we don't need it for the Monte Carlo Simulation algorithm to work.

This approach could be used to estimate some outcome, as in the bootstrap, but also to theoretically investigate some general characteristic of a statistical estimator (cost, schedule, technical performance) which is difficult to derive from empirical data.

MCS removes the roadblock heard in many critiques of estimating - we don't have any past data on which to estimate. No problem: build a model of the work and the dependencies between that work, assign statistical parameters to the individual or collected PDFs, and run the MCS to see what comes out.
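As a minimal sketch of that idea, the code below simulates a small, entirely hypothetical network of dependent tasks, each with a three-point estimate turned into a triangular PDF, and then re-runs it with a tuned PDF for one task to show the other use of MCS described below: finding what future performance must be to meet a target.

import numpy as np

rng = np.random.default_rng(3)
trials = 20_000

def simulate(tasks):
    """tasks: {name: (optimistic, most_likely, pessimistic, [predecessors])},
    listed so predecessors appear before the tasks that depend on them.
    Returns simulated project finish times for a simple finish-to-start network."""
    finish = {}
    for name, (opt, likely, pess, preds) in tasks.items():
        duration = rng.triangular(opt, likely, pess, size=trials)
        start = np.max([finish[p] for p in preds], axis=0) if preds else np.zeros(trials)
        finish[name] = start + duration
    return np.max(np.column_stack(list(finish.values())), axis=1)

# Hypothetical plan, durations in days: design -> (build, test_env) -> integrate.
plan = {
    "design":    ( 8, 10, 15, []),
    "build":     (20, 25, 40, ["design"]),
    "test_env":  ( 5,  7, 14, ["design"]),
    "integrate": (10, 12, 20, ["build", "test_env"]),
}
print("baseline P80 finish:", round(np.percentile(simulate(plan), 80), 1), "days")

# What must performance be? Tune the PDF of "build" (say, by adding staff) and re-run.
plan["build"] = (15, 20, 30, ["design"])
print("tuned    P80 finish:", round(np.percentile(simulate(plan), 80), 1), "days")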

This approach has several critical advantages:

  • The first is a restatement - we don't need empirical data, although it will add value to the modeling process.
    • This is the primary purpose of Reference Classes
    • They are the raw material for defining possible future behaviors from the past.
  • We can make judgments about what the future will be like or, most importantly, what the future MUST be like to meet our goals, run the simulation, and determine if our planned work will produce the desired result.

So Here's the Killer Difference

Bootstrapping models make several key assumptions, which may not be true in general. So they must be tested before accepting any of the outcomes.

  • The future is like the past.
  • The statistical parameters are static - they don't evolve with time. That is the future is like the past, an unlikely prospect on any non-trivial project.
  • The sampled data is identical to the population data both in the past and in the future.

Monte Carlo Simulation models provide key value that bootstrapping can't.

  • Different Probability Distribution Functions can be assigned to the work as it progresses through time
  • The shape of that PDF can be defined from past performance, or defined from the needed performance.

The critical difference between Bootstrapping and Monte Carlo Simulation is MCS can show what the future performance has to be to stay on schedule (within variance), on cost, and have the technical performance meet the needs of the stakeholder.

Bootstrapping can only show what the future will be like if it is like the past, not what it must be like. In Bootstrapping this future MUST be like the past. In MCS we can tune the PDFs to show what performance has to be in order to manage to that plan. Bootstrapping is reporting yesterday's weather as tomorrow's weather - just like Steve Martin in LA Story. If tomorrow's weather turns out not to be like yesterday's weather, you're gonna get wet.

MCS can forecast tomorrow's weather by assigning PDFs to future activities that are different from past activities; then we can make any needed changes in that future model to alter the weather to meet our needs. This is in fact how weather forecasts are made - with much more sophisticated models of course - here at the National Center for Atmospheric Research in Boulder, CO.

This forecasting (estimating the future state) of possible outcomes, and the alteration of those outcomes through management actions to change dependencies, add or remove resources, provide alternatives to the plan (on-ramps and off-ramps of technology, for example), buy down risk, apply management reserve, assess impacts of rescoping the project, etc., is what project management is all about.

Bootstrapping is necessary but far from sufficient for any non-trivial project to show up on or before the need date (with schedule reserve), at or below the budgeted cost (with cost reserve), and have the product or service provide the needed capabilities (technical performance reserve).

Here's an example of that probabilistic forecast of project performance from an MCS tool (Risky Project). This picture shows the probability for cost, finish date, and duration. But it is built on time-evolving PDFs assigned to each activity in a network of dependent tasks, which models the work stream needed to complete as planned.

When that future work stream is changed to meet new requirements, unfavorable past performance and the needed corrective actions, or changes in any or all of the underlying random variables, the MCS can show us the expected impact on key parameters of the project so management intervention can take place - since Project Management is a verb.


The connection between the Bootstrap and Monte Carlo simulation of a statistic is simple.

Both are based on repetitive sampling and then direct examination of the results.

But there are significant differences between the methods (hence the difference in names and algorithms). Bootstrapping uses the original, initial sample as the population from which to resample. Monte Carlo Simulation uses a data generation process, with known values of the parameters of the Probability Distribution Function. The common algorithm for MCS is Lurie-Goldberg. Monte Carlo is used to test whether the results of the estimators produce desired outcomes on the project. And if not, it allows the modeler and her management to change those estimators and then manage to the changed plan.

Bootstrap can be used to estimate the variability of a statistic and the shape of its sampling distribution from past data. Assuming the future is like the past, we can then make forecasts of throughput, completion, and other project variables.

In the end the primary difference (and again the reason for the name difference) is that Bootstrapping is based on unknown distributions. Sampling and assessing the shape of the distribution in Bootstrapping adds no value to the outcomes. Monte Carlo is based on known or defined distributions, usually from Reference Classes.

Categories: Project Management

Just Because You Say Words, It Doesn't Make Them True

Wed, 05/13/2015 - 22:33

When we hear words about any topic - my favorite of course is all things project management - it doesn't make them true.

  • Earned Value is a bad idea in IT projects because it doesn't measure business value
    • Turns out this is actually true. The confusion was with the word VALUE.
    • In Earned Value, Value is Budgeted Cost of Work Performed (BCWP) in the DOD vocabulary, or Earned Value in the PMI vocabulary (a minimal calculation sketch of these terms appears after this list).
  • Planning is a waste
    • Planning is a Strategy for the successful completion of the project
    • It'd be illogical not to have a Strategy for the success of the project
    • So we need a plan.
    • As Ben Franklin knew, "Failure to Plan means we're Planning to Fail"
  • The Backlog is a waste and grooming the Backlog is a bigger waste
    • The backlog is a list of planned work to produce the value of the project
    • The backlog can change and this is the "change control paradigm" for agile.
    • Change control is a critical processes for all non-trivial projects
    • Without change control we don't have a stable description of what "Done" looks like for the project. Without having some sense of "Done" we're on a Death March project
  • Unit Testing is a waste
    • Unit testing is the first step of Quality Assurance and Independent Verification and Validation
    • Without UT, and in the absence of a QA and IV&V process, it will be "garbage in, garbage out" for the software.
    • Assuming the developers can do the testing is naive at best on any non-trivial project
  • Decisions can be made in the presence of uncertainty without estimates
    • This violates the principles of Microeconomics
    • Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources
    • All projects have uncertainty - reducible and irreducible.
    • This uncertainty creates risk. This risk impacts the behaviors of the project work.
    • Making decisions - choices - in the presence of these uncertainties and resulting risks needs to assess some behavior that is probabilistic.
    • This probabilistic behavior is driven by underlying statistical processes.
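As a minimal sketch of the Earned Value terms named in the first bullet above, here is the standard arithmetic with purely hypothetical status numbers. The EAC formula shown (BAC divided by CPI) is only one common form of the Estimate At Completion.

# Hypothetical status at the end of month 3 of a $100k project.
bcws = 30_000    # Budgeted Cost of Work Scheduled (Planned Value)
bcwp = 24_000    # Budgeted Cost of Work Performed (Earned Value)
acwp = 32_000    # Actual Cost of Work Performed (Actual Cost)
bac = 100_000    # Budget At Completion

cpi = bcwp / acwp      # cost efficiency: value earned per dollar spent
spi = bcwp / bcws      # schedule efficiency: value earned vs. value planned
eac = bac / cpi        # one common Estimate At Completion, assuming CPI holds

print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}, EAC = ${eac:,.0f}")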

So when we hear some phrase, idea, or conjecture - ask for evidence. Ask for the domain. Ask for examples. If you hear we're just exploring, ask who's paying for that. Because it is likely those words are unsubstantiated conjecture from personal experience, and not likely very useful outside that personal experience.

Categories: Project Management

The Fallacy of the Planning Fallacy

Wed, 05/13/2015 - 15:06

The Planning Fallacy is well documented in many domains. Bent Flyvbjerg has documented this issue in one of his books, Megaprojects and Risk. But the Planning Fallacy is more complex than just the optimism bias. Many of the root causes for cost overruns are based in the politics of large projects.

The planning fallacy is ...

...a phenomenon in which predictions about how much time will be needed to complete a future task display an optimistic bias (underestimate the time needed). This phenomenon occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. The bias only affects predictions about one's own tasks; when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed. The planning fallacy requires that predictions of current tasks' completion times are more optimistic than the beliefs about past completion times for similar projects and that predictions of the current tasks' completion times are more optimistic than the actual time needed to complete the tasks.

The critical notion here is that the bias applies to one's own estimates.

With all that said, there is still a large body of evidence that estimating is a major problem.†

I have a colleague who is the former Cost Analysis Director of NASA. He has three reasons projects get in cost, schedule, and technical trouble:

  1. We couldn't know - we're working in a domain where discovery is actually the case. We're inventing new physics, discovering new drugs that have never been discovered before. We're doing unprecedented development. Most people using the term "we're exploring" don't likely know what they're doing, and those paying are paying for that exploring. Ask yourself if you're in the education business or actually the research and development business.
  2. We didn't know - we could have known, but we just didn't want to. We couldn't afford to know. We didn't have time to know. We were incapable of knowing because we're outside our domain. Would you hire someone who didn't do his homework when it comes to providing the solution you're paying for? Probably not. Then why accept we didn't know as an excuse?
  3. We don't want to know - we could have known, but if we knew that'd be information that would cause this project to be canceled.

The Planning Fallacy

Daniel Kahneman (Princeton) and Amos Tversky (Stanford) describe it as "the tendency to underestimate the time, costs, and risks of future actions and overestimate the benefit of those actions". The results are time and cost overruns as well as benefit shortfalls. The concept is not new: they coined the term in the 1970s and much research has taken place since; see the Resources below.

So the challenge is to not fall victim to this optimism bias and become a statistic in the Planning Fallacy.

How do we do that?

Here's our experience:

  • Start with a credible systems architecture with the topology of the delivered system:
    • By credible I mean not a paper drawing on the wall, but a SysML description of the system and its components. SysML tools can be had for free, along with commercial products.
    • Defining the interactions between the components is the critical issue to discover the location for optimism. The Big Visible Chart from sysML needs to hang on the wall for all to see where these connections take place.
    • Without this BVC, the optimism is It's not that complicated, what could possibly be the issue with our estimates?
    • It's the interfaces where the project goes bad. Self-contained components have problems for sure, but when connected to other components this becomes a system of systems, and the result is an N-squared problem.
  • Look for reference classes for the components
    • Has anyone here done this before?
    • No? Do we know anyone who knows anyone who's done this before?
    • Is there no system like this system in the world?
    • If the answer to that is NO, then we need another approach - we're inventing new physics and this project is actually a research project - act appropriately
  • Do we have any experience doing this work in the past?
    • No? Then why would we get hired to work on this project?
    • Yes, but we've failed in the past?
      • No problem, did we learn anything?
      • Did we find the Root Cause of the past performance problems and take the corrective actions?
      • Did we follow a known process (Apollo) in that Root Cause Analysis and those corrective actions?
      • No? Then you're being optimistic that the problems won't come back
  • Do we have any sense of the Measures of the system that will drive cost?
    • Effectiveness - the operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
    • Performance - characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
    • Key Performance Parameters - represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessing, or termination of the program.
    • Technical Performance Measures - determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal
    • All the ...ilities
    • Without understanding these we have no real understanding of where the problems are going to be and the natural optimism comes out.
  • Do we know what technical and programmatic risks are going to be encountered in this project?
    • Do we have a risk register?
    • Do we know both the reducible and irreducible risks to the success of the project?
    • Do we have mitigation plans for the reducible risks?
    • For reducible risks without mitigation plans, do we have Management Reserve?
    • For irreducible risks do we have cost and schedule margin?
  • Do we have a Plan showing the increasing maturing of the deliverables of the project?
    • Do we know what Accomplishments must be performed to increase the maturity of the deliverable?
    • Do we know the Criteria for each Accomplishment, so we can measure the actual progress to plan?
    • Have we arranged the work to produce the deliverables in a logical network - or some other method like Kanban - that shows the dependencies between the work elements and the deliverables?
    • This notion of dependencies is very underrated.
      • The Kanban paradigm assumes this up front
      • Verifying there are actually NO dependencies is critical to all the processes based on having NO dependencies.
      • It seems rare that those verifications actually take place
      • This is an Optimism Bias in the agile software development world.
  • Do we have a credible, statistically adjusted, cost and schedule model for assessing the impact of any changes?
    • I'm confident our costs will not be larger than our revenue - sure, right. Show me your probabilistic model.
    • No model? Then we're likely being optimistic and don't even know it
    • Show Me The Numbers (a minimal sketch of such a probabilistic model follows this list).
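As a minimal sketch of what "show me your probabilistic model" can mean, the code below uses a hypothetical fixed-price value and a lognormal cost spread as a simple stand-in for the aggregated cost PDF a real model would build up from the work breakdown and its risks.

import numpy as np

rng = np.random.default_rng(4)
trials = 50_000

revenue = 1_200_000    # hypothetical fixed-price contract value

# Cost model: a most-likely cost with a right-skewed spread - a simple stand-in
# for the cost PDF a real model would aggregate from the work breakdown structure.
cost = rng.lognormal(mean=np.log(1_000_000), sigma=0.25, size=trials)

p50, p80 = np.percentile(cost, [50, 80])
print(f"median cost = ${p50:,.0f}, 80th percentile = ${p80:,.0f}")
print(f"probability cost exceeds revenue: {(cost > revenue).mean():.1%}")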

So With These And Others...We can remove the fallacy of the Planning Fallacy.

This doesn't mean our project will be successful. Nothing can guarantee that. But the Probability of Success will be increased.

In the end we MUST know the Mission we are trying to accomplish and the units of measure of that Mission in terms meaningful to the decision makers. Without that we can't know what DONE looks like. And without that, only our optimism will carry us along until it is too late to turn back.


Anyone using the Planning Fallacy as the excuse for project failure, not planning, not estimating, not actually doing their job as a project and business manager will likely succeed in the quest for project failure and get what they deserve: late, over budget, and the gadget they're building doesn't work as needed.

† Please note that just because estimating is a problem in all domains, that's NO reason not to estimate. Just as planning is a problem, that's no reason NOT to plan. Any suggestion that estimating or planning is not needed in the presence of an uncertain future - as it is on all projects - is willfully ignoring the principles of Microeconomics - making choices in the presence of uncertainty based on opportunity cost. To suggest otherwise confirms this ignorance.

Resources

These are some background materials on the Planning Fallacy problem, from the anchoring and adjustment point of view, that I've used over the years to inform our estimating processes for software-intensive systems. After reading through these I hope you come to a better understanding of many of the misconceptions about estimating and the fallacies of how it is done in practice.

Interestingly there is a poster on Twitter in the #NoEstimates thread that objects when people post links to their own work or the work of others. Please do not fall prey to the notion that everyone has an equally informed opinion, unless you yourself have done all the research needed to cover the foundations of the topics. Outside resources are the very lifeblood of informed experience and the opinions that come from that experience.

  1. Kahneman, Daniel; Tversky, Amos (1979). "Intuitive prediction: biases and corrective procedures". TIMS Studies in Management Science 12: 313–327.
  2.  "Exploring the Planning Fallacy" (PDF). Journal of Personality and Social Psychology. 1994. Retrieved 7 November 2014.
  3. Estimating Software Project Effort Using Analogies, 
  4. Cost Estimation of Software Intensive Projects: A Survey of Current Practices
  5. "If you don't want to be late, enumerate: Unpacking Reduces the Planning Fallacy". Journal of Experimental Social Psychology. 15 October 2003. Retrieved 7 November 2014.
  6. A Causal Model for Software Cost Estimating Error, Albert L. Lederer and Jayesh Prasad, IEEE Transactions On Software Engineering, Vol. 24, No. 2, February 1998.
  7. Assuring Software Cost Estimates? Is It An Oxymoron? 2013 46th Hawaii International Conference on System Sciences.
  8. A Framework for the Analysis of Software Cost Estimating Accuracy, ISESE'06, September 21–22, 2006, Rio de Janeiro, Brazil.
  9. "Overcoming the Planning Fallacy Through Willpower". European Journal of Social Psychology. November 2000. Retrieved 22 November 2014.
  10. Buehler, Roger; Griffin, Dale, & Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions". In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment, pp. 250–270. Cambridge, UK: Cambridge University Press.
  11. Buehler, Roger; Dale Griffin; Michael Ross (1995). "It's about time: Optimistic predictions in work and love". European Review of Social Psychology (American Psychological Association) 6: 1–32. doi:10.1080/14792779343000112.
  12. Lovallo, Dan; Daniel Kahneman (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review: 56–63.
  13. Buehler, Roger; Dale Griffin; Michael Ross (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology (American Psychological Association) 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366.
  14. Buehler, Roger; Dale Griffin; Johanna Peetz (2010). "The Planning Fallacy: Cognitive, Motivational, and Social Origins" (PDF). Advances in Experimental Social Psychology (Academic Press) 43: 9.
  15. "Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy". Group Dynamics: Theory, Research, and Practice. September 2005. Retrieved 22 November 2014.
  16. Stephanie P. Pezzoa, Mark V. Pezzob, and Eric R. Stone. "The social implications of planning: How public predictions bias future plans". Journal of Experimental Social Psychology, 2006, 221–227.
  17. "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?". American Psychological Association. September 2005. Retrieved 21 November 2014.
  18. "Focalism: A source of durability bias in affective forecasting.". American Psychological Association. May 2000. Retrieved 21 November 2014.
  19. Jones, Larry R; Euske, Kenneth J (October 1991). "Strategic misrepresentation in budgeting". Journal of Public Administration Research and Theory (Oxford University Press) 1 (4): 437–460. Retrieved 11 March 2013.
  20. Taleb, Nassim (2012-11-27). Antifragile: Things That Gain from Disorder. ISBN 9781400067824.
  21. "Allocating time to future tasks: The effect of task segmentation on planning fallacy bias". Memory & Cognition. June 2008. Retrieved 7 November 2014.
  22. "No Light at the End of his Tunnel: Boston's Central Artery/Third Harbor Tunnel Project". Project on Government Oversight. 1 February 1995. Retrieved 7 November 2014.
  23. "Denver International Airport" (PDF). United States General Accounting Office. September 1995. Retrieved 7 November 2014.
  24. Lev Virine and Michael Trumper. Project Decisions: The Art and Science, Vienna, VA: Management Concepts, 2008. ISBN 978-1-56726-217-9 
    • Michael and Lev provide the Risk Management tool we use - Risky Project.
    • Risky Project is a Monte Carlo Simulation tool for reducible and irreducible risk, drawing from probability distribution functions of the uncertainty in the project.
    • It is, by the way, an actual MCS tool, not one based on bootstrapping a small number of past samples many times over.
  25. Overcoming the planning fallacy through willpower: effects of implementation intentions on actual and predicted task-completion times.

  26. Anchoring and Adjustment in Software Estimation, Jorge Aranda and Steve Easterbrook, ESEC-FSE'05, September 5–9, 2005, Lisbon, Portugal.
  27. Anchoring and Adjustment in Software Estimation, Jorge Aranda, PhD Thesis, University of Toronto, 2005.
  28. Anchoring and Adjustment in Software Project Management: An Experimental Investigation, Timothy P. Costello, Naval Postgraduate School, September 1992.
  29. Anchoring Effect, Thomas Mussweiler, Birte Englich, and Fritz Strack
  30. Anchoring, Non-Standard Preferences: How We choose by Comparing with a Nearby Reference Point.
  31. Reference points and redistributive preferences: Experimental evidence, Jimmy Charité, Raymond Fisman, and Ilyana Kuziemko
  32. Anchoring and Adjustment, (YouTube), Daniel Kahneman. This anchoring and adjustment discussion is critical to how we ask the question how much, how big, and when.
  33. Anchoring unbound, Nicholas Epley and Thomas Gilovich 

  34. Assessing Ranges and Possibilities, Decision Analysis for the Professional, Chapter 12, Strategic Decision and Risk Management, Stanford Certificate Program. 
    • This book, by the way, should be mandatory reading for anyone suggesting that decisions can be made in the absence of estimates.
    • They can't, and I don't accept that they can, because they can't.
  35. Attention and Effort, Daniel Kahneman, Prentice Hall, The Hebrew University of Jerusalem, 1973.
  36. Availability: A Heuristic for Judging Frequency and Probability, Amos Tversky and Daniel Kahneman.
  37. On the Reality of Cognitive Illusions, Daniel Kahneman, Princeton University, Amos Tversky, Stanford University.

  38. Efficacy of Bias Awareness in Debiasing Oil and Gas Judgements, Matthew B. Welsh, Steve H. Begg, and Reidar B. Bratvold.
  39. The framing effect and risky decisions: Examining cognitive functions with fMRI, Cleotilde Gonzalez, Jason Dana, Hideya Koshino, and Marcel Just, Journal of Economic Psychology, 26 (2005), 1-20.

  40. Discussion Note: Review of Tversky & Kahneman (1974): Judgment under uncertainty: heuristics and biases, Michael Axelsen, UQ Business School, The University of Queensland, Brisbane, Australia.
  41. The Anchoring-and-Adjustment Heuristic, Why the Adjustments Are Insufficient, Nicholas Epley and Thomas Gilovich.

  42. Judgment under Uncertainty: Heuristics and Biases, Amos Tversky; Daniel Kahneman, Science, New Series, Vol. 185, No. 4157 (Sep. 27, 1974), pp. 1124-1131.

This should be enough to get you started and set the stage for rejecting any half-baked ideas about anchoring and adjustment, planning fallacies, no need to estimate, and the collection of other cockamamie ideas floating around the web on how to make credible decisions with other people's money.

Related articles The Reason We Plan, Schedule, Measure, and Correct Herding Cats: Five Estimating Pathologies and Their Corrective Actions Tunnel to Nowhere Root Cause Analysis The Flaw of Empirical Data Used to Make Decisions About the Future
Categories: Project Management

There is No Such Thing as Free

Tue, 05/12/2015 - 16:34

When our children were in High School, I strongly suggested they both take an economics class. Our daughter came home one day and announced at the dinner table

Dad, we learned something important today in Economics class.
What's that, dear? I said, knowing the answer.
There's no such thing as free, she said.

In finance, There Ain't No Such Thing As A Free Lunch (TANSTAAFL) refers to the opportunity cost paid when making a decision. The decision to consume one product usually comes with the trade-off of giving up the consumption of something else. 

In a recent conversation, I was introduced to the notion of Extreme Contracts:

  • Very short iterations, usually one week.
  • A flat fee for every iteration.
  • Money back guarantee.

The first bullet sounds like a good idea, provided one week can actually produce testable outcomes. The second bullet means that the variable will be the produced outcomes, since the probability that all work is of the same size, same risk, same effort is likely low on any non-trivial project.

The last bullet fails to acknowledge the principle of lost opportunity cost. This is the there is no such thing as free. When the delivered software is not what is needed, the cost of the software in Extreme Contracts is free. But the lost capabilities, in the time frame they are needed, are not free. They have a cost. This is the opportunity cost that is lost. 
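
A back-of-the-envelope illustration of that trade, with invented numbers (the fee, the weekly value of the capability, and the weeks lost are all assumptions, not data from any actual contract):

# Hypothetical numbers, for illustration only
iteration_fee = 10_000                # flat fee, refunded under the guarantee
weekly_value_of_capability = 25_000   # value the needed capability would have produced
weeks_lost = 3                        # weeks burned before the miss is discovered

refund = iteration_fee                                      # what "free" gives back
opportunity_cost = weekly_value_of_capability * weeks_lost  # what it does not

print(f"Fee refunded:     ${refund:,}")
print(f"Opportunity cost: ${opportunity_cost:,}")
# The software was free; the lost time in which the capability was needed was not.

The refund covers the fee; nothing refunds the weeks in which the needed capability did not exist.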

The basis of good project management - and especially, in our domain, of using Earned Value - is providing the needed capabilities at the needed time for the needed cost. Baked into that paradigm are all the costs, planned up front with the appropriate confidence levels, alternatives assessment, and margins and reserves: the discovery cost, the risk mitigation cost - both reducible and irreducible - and the cost recovery from productive use of the delivered product or service on the planned date for the planned cost.

This is the foundation of Capabilities Based Planning used in enterprise IT and software intensive systems development.

Capabilities based planning (v2) from Glen Alleman
Related articles The Reason We Plan, Schedule, Measure, and Correct How We Make Decisions is as Important as What We Decide. The Flaw of Empirical Data Used to Make Decisions About the Future
Categories: Project Management

Do The Math

Mon, 05/11/2015 - 16:18

All project work is probabilistic. All decision making in the presence of probabilistic systems requires making estimates of future emerging outcomes.

But to do this properly we need to have a standard set of terms that can form the basis of understanding the problem and the solution.

When those terms are redefined for whatever reason, the ability to exchange ideas is lost. For example, there is a popular notion that redefining terms in support of an idea is useful:

  • Forecasting and estimating are different things
    • Estimating is about past, present, and future
    • Forecasting is an estimate about the future
  • Monte Carlo Simulation is the same as bootstrap sampling
    • Monte Carlo is an algorithmic process of selecting data from a probability distribution function
    • Bootstrapping is resampling existing data from past samples
    • MCS uses a PDF, not past data (see the sketch below)
  • Probabilistic forecasting outperforms estimating every time
    • Probabilistic forecasting IS estimating
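
Here is a minimal sketch of that distinction, with made-up task durations and a made-up triangular distribution (nothing here comes from a specific tool or data set). Bootstrapping can only recombine the five values it has already seen; Monte Carlo Simulation draws from an assumed PDF and can produce tail outcomes the past sample never contained.

import random
import statistics

# Hypothetical past task durations in days - a small sample, for illustration only
past_durations = [3, 5, 4, 8, 6]

# Bootstrapping: resample the same five observed values, with replacement.
# No value outside {3, 4, 5, 6, 8} can ever appear in the forecast.
bootstrap_totals = [sum(random.choices(past_durations, k=10)) for _ in range(10_000)]

# Monte Carlo Simulation: draw from an assumed probability distribution function,
# here a triangular PDF (optimistic=2, most likely=5, pessimistic=15 days).
mcs_totals = [sum(random.triangular(2, 15, 5) for _ in range(10)) for _ in range(10_000)]

for name, totals in (("bootstrap", bootstrap_totals), ("monte carlo", mcs_totals)):
    totals.sort()
    print(f"{name:12s} mean={statistics.mean(totals):6.1f}  "
          f"p80={totals[int(0.8 * len(totals))]:6.1f}  max={totals[-1]:6.1f}")

Both produce an estimate of a future outcome; the difference is where the variation comes from - an assumed distribution versus a handful of past observations.
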
Related articles The Flaw of Empirical Data Used to Make Decisions About the Future Economics of Software Development Who Builds a House without Drawings? Herding Cats: Decision Analysis and Software Project Management
Categories: Project Management

Mr. Franklin's Advice

Mon, 05/11/2015 - 16:04

If you fail to plan, you are planning to fail. - Benjamin Franklin
Plans are nothing; planning is everything. - Dwight D. Eisenhower

So what does this actually mean in the project management domain?

Plans are strategies for the success of the project. Strategies are hypotheses. Hypotheses need to be tested to determine their validity. These tests - in the project domain - come from setting a plan, performing the work, assessing the compliance of the outcomes with the plan, and taking corrective actions in the next iteration of the work.

This seems obvious, but when we hear about the failures in the execution of the plans, we have to wonder what went wrong. Research has shown many Root Causes of project shortfalls. Here are four from our domain:

  • Unrealistic performance expectations, missing Measures of Effectiveness and Measures of Performance.
  • Unrealistic cost and schedule estimates based on inadequate risk adjusted growth models.
  • Inadequate assessments of risk and unmitigated exposure to these risks without proper handling plans.
  • Unanticipated technical issues without alternative plans and solutions to maintain effectiveness.

The root cause for each of these starts with the lack of the following

Unrealistic Performance Expectations

When we set out to define what performance is needed, we must have a means of testing that this expectation can be achieved. The root causes here are the absence of those means:

  • No Prototypes
  • No Modeling and Simulations of performance outcomes
  • No reference design to base modeling on to discover needed changes in baseline system architecture
  • No use of derived products

Unrealistic Cost and Schedule Estimates

  • No basis of estimate
  • Optimism bias
  • No parametric models¬†
  • No understanding of irreducible uncertainties in duration for work
  • No reference classes.

Inadequate Assessment of Risk

  • Not understanding "Risk management is how adults manage projects" - Tim Lister
  • No risk register connected to planning and scheduling processes
  • No Monte Carlo assessment of risk impacts on cost and schedule
  • No risk mitigation in baseline
  • Inadequate Management Reserve developed from modeling processes
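
To make the Monte Carlo and Management Reserve items above concrete, here is a hedged sketch with invented activity durations and one invented risk event; the triangular distributions, the 30% probability, and the impact range are all assumptions, not data from any real program.

import random

SIMULATIONS = 20_000
totals = []
for _ in range(SIMULATIONS):
    # Irreducible (aleatory) uncertainty: natural variation in three activities,
    # modeled with hypothetical triangular distributions (days).
    duration = (random.triangular(8, 15, 10)
                + random.triangular(4, 9, 5)
                + random.triangular(10, 22, 14))
    # Reducible (epistemic) risk: a discrete event that may or may not occur.
    if random.random() < 0.30:                   # assumed 30% probability of occurrence
        duration += random.triangular(3, 10, 5)  # assumed impact if it does occur
    totals.append(duration)

totals.sort()
most_likely = 10 + 5 + 14              # sum of the most-likely durations
p80 = totals[int(0.80 * SIMULATIONS)]  # completion duration at 80% confidence
print(f"Most likely (deterministic) total: {most_likely} days")
print(f"80th percentile from simulation:   {p80:.1f} days")
print(f"Implied schedule margin / reserve: {p80 - most_likely:.1f} days")

The baseline then carries explicit margin for the irreducible variation and explicit reserve for the reducible risk, rather than discovering both after the fact.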

Unanticipated Technical Issues

  • No Plan B
  • No in depth risk assessment of technologies
  • No "on ramps" or "off ramps" for technology changes

Each of these issues can be addressed through a Systems Engineering process using Measures of Effectiveness, Measures of Performance, and Technical Performance Measures. The planning process makes use of these measures to assess the credibility of the plan and the processes to test the hypothesis. 

Related articles Want To Learn How To Estimate? Debunking Let's Stop Guessing and Start Estimating Complex, Complexity, Complicated Estimates
Categories: Project Management

Let's Stop Guessing and Start Estimating

Thu, 05/07/2015 - 15:24

What’s the difference between estimate and guess?

One way to distinguish between them is the degree of care taken when we arrive at a conclusion. A conclusion about how much effort work will take. How much it will cost to perform that work. Whether that work will have any risk associated with it. 

Estimate is derived from the Latin word aestimare, "to value." The term estimate is also the origin of estimable, meaning "capable of being estimated" or "worthy of esteem", and of esteem, which means "regard."

To make an estimate means to judge - using some method - the extent, nature, or value of something, with the implication that the result is based on expertise, data, a model, or familiarity. An estimate is the resulting calculation or judgment of the outcome or result. The related term is approximation, meaning "close or near." Estimates have a measure of nearness to the actual value. We may not be able to know the actual value, but the estimate is close to that value. The confidence in the estimate adds more information about the nearness of the estimate to the actual value.

To guess is to believe or suppose, to form an opinion based on little or no evidence, or to be correct by chance or conjecture. A guess is a thought or idea arrived at by one of these methods. Guess is a synonym for  conjecture and surmise, which like estimate, can be used as a verb or noun.

One step between a guess and an estimate is an educated guess, a more casual estimate. An idiomatic term for this conclusion is "ballpark figure." The origin of this American English idiom, which alludes to a baseball stadium, is not certain. One conclusion is that it is related to "in the ballpark," meaning "close" in the sense that someone at such a location may not be in a precise spot but is in the stadium.

We could have a hunch or an intuition about some outcome, some numerical value. Or we could engage in guesswork or speculation.

An interesting case: "dead reckoning" now means the same thing as guesswork, though it originally referred to navigation based on reliable information. Near synonyms describing thoughts or ideas developed with more rigor include hypothesis and supposition, as well as theory and thesis.

A guess is a casual, spontaneous conclusion.
An estimate is based on some thought and/or data.

If those paying you can accept a Wild Ass Guess then you're probably done. If they have tolerance (risk tolerance) for losing their value at risk if your guess is wrong, then go ahead and guess. Otherwise some form of estimate is likely needed to inform your decision about some outcome in the future that is uncertain.

Related articles How We Make Decisions is as Important as What We Decide. The Flaw of Empirical Data Used to Make Decisions About the Future Build a Risk Adjusted Project Plan in 6 Steps Want To Learn How To Estimate?
Categories: Project Management

A Framework for Managing Other People's Money

Tue, 05/05/2015 - 17:06

Managing other people's money - our internal money, money from a customer, money from an investor - means making rational decisions based on facts. In an uncertain and emerging future, making those decisions means assessing the impact of that decision on that future in terms of money, time, and delivered value.

Making those decisions - in the presence of this uncertainty - implies we need to develop information about the variables that appear in that future. We can use past data, of course. That data needs to be adjusted for several factors:

  • Does this data in the past represent data in the future?
  • Does this data have a statistical assessment for variance, standard deviation, and other moments that we can use to assess the usefulness of the data for making decisions in the future?
  • Are there enough data points to create a credible forecast of the future?

No answers to these questions? Then the data you collected is not likely to have much value in making decisions about the future.
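
As a minimal sketch of the kind of statistical assessment those questions imply - using hypothetical weekly throughput samples, since no real data accompanies this post:

import math
import statistics

# Hypothetical weekly throughput samples (completed items per week)
samples = [14, 9, 17, 11, 20, 8, 15, 12]

n = len(samples)
mean = statistics.mean(samples)
sd = statistics.stdev(samples)               # sample standard deviation
cv = sd / mean                               # coefficient of variation
sem = sd / math.sqrt(n)                      # standard error of the mean
low, high = mean - 2 * sem, mean + 2 * sem   # rough ~95% interval (2 standard errors)

print(f"n = {n}, mean = {mean:.1f}, sd = {sd:.1f}, CV = {cv:.0%}")
print(f"Rough 95% interval for the mean: {low:.1f} to {high:.1f} items per week")
# A wide interval or a large CV from a handful of samples says the data, by
# itself, is a weak basis for decisions about the future.

With so few samples a proper t-based interval would be wider still, which only strengthens the point.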

Categories: Project Management

How We Make Decisions is as Important as What We Decide.

Mon, 05/04/2015 - 15:36

I have been working over the week on a release of a critical set of project capabilities and need a break from that. This post will be somewhat scattered, as I'm writing it in the lobby to get some fresh air.

Here's the post asking for a conversation about estimates. Here's a long response to that request.

Let's ignore the term FACT for the moment as untestable and see how to arrive at some answers for each statement. These answers are from a paradigm of Software Intensive Systems, where Microeconomics of decision-making is the core paradigm used to make decisions, based on Risk and Opportunity Costs from those decisions.

  • FACT: It is possible, and sometimes necessary, to estimate software tasks and projects.
    • It is always possible to estimate the future. This is well established in all domains. The mathematical means to make estimates are readily available in any book store, on any college campus, and on the web.
    • The confidence in the estimate's value is part of the estimating process. Measurement of Error, Variance, Confidence Intervals, Sample Sizes for past performance, and a myriad of other measures are also readily available for the asking.
    • The value at risk is one attribute of the estimate.
    • Low value at risk allows a wider range on the confidence value.
    • High value at risk requires higher confidence (see the sketch following this list).
  • FACT: Questioning the intent behind a request for an estimate is the professional thing to do
    • Introducing the profession card is a common tactic. Developing software, for the moment, is not a profession. A profession requires prolonged training and recognition of professional qualifications. The Directive on Recognition of Professional Qualifications (2005/36/EC) defines professions as "those practiced on the basis of relevant professional qualifications in a personal, responsible and professionally independent capacity by those providing intellectual and conceptual services in the interest of the client and the public".
    • Programmers are not professionals in that sense.
    • To the CFO with a CPA - which is a profession - the intent of estimates is to inform those accountable for the money, so they can make decisions about that money informed by the value at risk.
    • To question that intent assumes those making those decisions no longer have the fiduciary responsibility of being the stewards of the money, and that that responsibility has been transferred to those spending the money.
    • This would imply the separation of concerns in any governance-based business has been suspended.
  • FACT: #NoEstimates is a Twitter hashtag and was never intended to become a demand, a method or a black-and-white truth
    • The Hash Tag's original poster makes a clear and concise statement.
      • We can make decisions in the absence of estimating the impact of those decisions.
    • Until those original words are addressed, clarified, and possibly corrected or even withdrawn, the hashtag will remain contentious.
    • The original post would mean the principles of Microeconomics no longer apply to the development of software using other people's money in the presence of uncertainty.
    • Continually redefining the meaning of #NoEstimates to address this willful ignoring of Microeconomics is simply going in circles.
    • If it is possible to make a decision about the future in the presence of uncertainty about that future - uncertainty about the cost of achieving a beneficial outcome from the decision, and about the cost and time needed to achieve that probabilistic outcome - WITHOUT estimating, let's hear it.
    • And by the way, using past performance samples - and small ones at that - does not remove the need to estimate future outcomes. It only provides one way to inform the probabilistic behavior of that future outcome. It is still estimating. Calling it No Estimates while using past performance data, no matter how poorly formed, does not excuse the misuse of the term.
  • FACT: The #NoEstimates hashtag became something due to the interest it generated
    • This is a shouting fire in a theater approach to conversation.
    • Without a domain and governance paradigm, the notion of making decisions in the absence of estimates has no basis for being tested.
  • FACT: A forecast is a type of estimate, whether probabilistic, deterministic, bombastic or otherwise
    • Yes, forecasting is estimating outcomes in the future. The weather forecast is an estimate of the future behavior of the atmosphere.
    • Estimates of past and present can also be made.
  • FACT: Forecasting is distinct from estimation, at least in the common usage of the words, in that it involves using data to make the "estimate" rather than relying on a person or people drawing on "experience" or guessing
    • These definitions are not found outside the poster's personally selected operational definitions. No probability and statistics book uses that definition. If there is one, please provide the reference. Wikipedia definitions from other domains don't count. Find that definition in the estimating of software systems and let's talk further.
    • Texts like Estimating Software Intensive Systems do not make this distinction, nor do any other books, papers, or resources on estimating.
    • Estimating is about past, present, and future approximation of a value found in a system with uncertainty.
      • Estimate - a number that approximates a value of interest in a system with uncertainty.
      • Estimating - the process used to make such a calculation
      • To Estimate - find a value close to the actual value. 2 ≈ 2.3; 2 is an approximation of the value 2.3.
    • Forecasts are about future approximations of values found in systems with uncertainty.
    • Looking for definitions outside the domain of software development and applying them to fit the needs of the argument is disingenuous.
  • FACT: People who tweet with the hashtag #NoEstimates, or indeed any other hashtag, are not automatically saying "My tweet is congruent and completely in agreement with the literal meaning of the words in the hashtag"
    • Those who tweet with hashtag are in fact retweeting the notion that decisions can be made without estimates if they do not explicitly challenge that notion.
    • If that is not established, there is an implicit support of the original idea
  • FACT: The prevailing way estimation is done in software projects is single point estimation
    • This is likely a personal experience, since many stating that have limited experience outside their domain.
    • It is simply bad mathematics, well known to anyone who took a High School statistics class. If you did take that class and believe that, you get a D
  • FACT: The prevailing way estimates are used in software organizations is a push for a commitment, and then an excuse for a whipping when the estimate is not met.
    • Again likely personal experience.
    • If the poster said in my experience... that would establish the limits of the statement.
    • "IME" takes 3 letters. Those are rarely seen by those suggesting not estimating is a desirable approach to managing in the presence of uncertainty while spending other people's money.
    • Those complaining about the phrase spending other people's money are likely not doing that, or not doing it with a substantial value at risk.
  • FACT: The above fact does not make estimates a useless artifact, nor estimation itself a useless or damaging activity
    • Those proffering that decisions can be made without estimating have in FACT said estimating is damaging, useless, and a waste of time.
    • Until that is countered, it will remain the basis of NoEstimates.
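
As a sketch of the value-at-risk point near the top of this list - low value at risk tolerates a looser number, high value at risk demands higher confidence - here is a toy cost model with an assumed triangular distribution; the dollar figures are invented.

import random

# Assumed cost uncertainty for one work package, in $K (triangular PDF)
draws = sorted(random.triangular(80, 220, 120) for _ in range(50_000))

def percentile(p):
    return draws[int(p * len(draws))]

p50, p80, p90 = percentile(0.50), percentile(0.80), percentile(0.90)
print(f"P50 cost: {p50:.0f} K  - may be acceptable when little is at risk")
print(f"P80 cost: {p80:.0f} K  (margin over P50: {p80 - p50:.0f} K)")
print(f"P90 cost: {p90:.0f} K  (margin over P50: {p90 - p50:.0f} K) - high value at risk")

The underlying estimate does not change; what changes with the value at risk is the confidence level, and therefore the margin, that the owner of the money asks for.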

So if the OP is actually interested in moving away from the known problem of using estimates in a dysfunctional way, let's stop talking about how to make decisions without estimates and learn how to make the good estimates needed for good decisions.

This issue of Harvard Business Review is dedicated to Make Better Decisions. Start by reading how to make good decisions. There is a wealth of guidance on how to do that. Why use Dilbert-style management examples? We all know about those. How about some actionable corrective actions to the root causes of bad management, all backed up with data beyond personal anecdotes? It reminds me of early XP, where just try it was pretty much the approach. So if the OP is really interested in...

Let’s use our collective influence and intelligence to take the discussion forward to how we can cure the horrible cancer in our industry of Estimate = Date = Commitment.

Then there are nearly unlimited resources for doing that. The first is to call BS on the notion that decisions can be made without estimates without first stating where this is applicable. Acknowledge unequivocally that estimates are needed when the value at risk reaches a level deemed important by the owners of the money, and start acting like the professionals we want to be: gain a seat at the table to improve the probability of success of our projects with credible estimates of cost, effort, risk, productivity, production of value, and all the other attributes of that success. 

For those interested in exploring further how to provide value to those paying your salary, here are some posts on Estimating Books

Categories: Project Management

Forecasting the Future is Critical to All Success

Fri, 05/01/2015 - 18:47

Skate to where the puck will be. Full attribution to Gaping Void for this cartoon: http://gapingvoid.com/2010/05/03/daily-bizcard-11-fred-wilson/

Wayne makes realtime estimates on every skate stroke of where he is going, where the puck is going, and where all the defensemen are going to be when he plans to take his shot on goal.

When we hear that we can make decisions about the future without estimating the impact of those decisions, or using only small-sample, non-statistically adjusted measures, or ignoring the stochastic behaviors of the past and the future, we'll be on the losing end of the shot on goal.

There simply is no way out of the need to estimate the future for any non-trivial project funded by other people's money. Trivial project? Our own money? Act as you wish. No one cares what you do. But make a suggestion that it works somewhere else, and you'd better come to the table with some testable data independent of personal anecdotes. 

Related articles Root Cause Analysis The Reason We Plan, Schedule, Measure, and Correct The Flaw of Empirical Data Used to Make Decisions About the Future The Flaw of Averages and Not Estimating
Categories: Project Management

Fit for Purpose and Fit for Use

Fri, 05/01/2015 - 07:15

In the ITIL governance paradigm there are two terms used to test ideas, processes, and their outcomes.

Fit for Purpose and Fit for Use

Both these terms are used to describe value of an IT outcome. A product or service value is defined by fit to purpose (utility) and fit to use (warranty).

Fit to purpose, or utility, means that the service fulfills customer needs.

Fit for use, or warranty, means that the product or service is available when a user needs it.

From Philippe Kruchten's The Frog and the Octopus we get 8 factors defining a context from which to test any idea that we encounter.

  1. Size
  2. Criticality
  3. Business model
  4. Stable architecture
  5. Team distribution
  6. Governance
  7. Rate of change
  8. Age of system

So when we hear something that doesn't seem to pass the smell test, think of Carl's advice

Nice Hypothesis

And then when we hear you're just not willing to try out this idea, think of some more of his advice.


Then ask: how is this idea of yours Fit for Purpose and Fit for Use in those 8 context factors? Don't have an answer? Then maybe the personal anecdotes are not ready for prime time in the Enterprise domain.

Related articles Root Cause Analysis The Reason We Plan, Schedule, Measure, and Correct
Categories: Project Management

Domain is King, No Domain Defined, No Way To Test Your Idea

Thu, 04/30/2015 - 16:09

From Mark Kennaley's book. 

When we hear of a new and exciting approach to old and never-ending problems, we need to first ask: in what domain is your new and clever solution applicable?

No domain? Then it is not likely to have been tested outside your personal anecdotal experience.

Here are Mark's scaling factors. He uses Agile as the starting point, but I'm expanding them to any suggestion that has yet to be tested outside of personal anecdotes:

  • Team size - 1, maybe 2 ↔ 1000s.
    • If you've tested your idea on a small team, will it work in a larger setting?
    • There are projects where 1000s of engineers, developers, testers, and support people work on the same outcome.
    • There are projects where 2 people work on the same project.
    • The business process usually defines the team size, not the method. Writing code for a family-owned sprinkler company is not the same as writing code for a world-wide ERP system.
  • Geographic location - co-located ↔ across the planet.
    • Having a co-located team is wonderful in some instances.
    • In other instances this is physically not possible.
    • In other cases, the customer simply can't be with the development team, no matter how desirable that is.
  • Organizational Structure - single monolithic team ↔ Multiple Divisions
    • Small firms with single structures
    • Larger firms with "business units"
    • Even larger firms with separate companies under the same logo
    • Your method doesn't get to say how the firm is organized; it needs to adapt to that organization.
  • Compliance - None ↔ Life, financial, safety critical
    • Governance is a broad term, so is compliance
    • The notion of customer collaboration over contract negotiation is usually spoken by those with low value at risk.
  • Domain Complexity - Straightforward ↔ Very Complex
    • Complex and Complexity are relative terms.
    • For a developer who has built web pages for the warehouse, the autonomous rendezvous and dock flight software may appear complex.
    • To the autonomous rendezvous and dock developers, the GPS ground system of 37 earth stations may appear complex.
    • Pick the domain, then assess the complexity
  • Technical Complexity - same as above

What's the Point?

When a new approach is being proposed, without a domain, it's a solution looking for a problem to solve.

Categories: Project Management