
Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Success

What Happened to our Basic Math Skills?

Wed, 07/01/2015 - 15:29

Making decisions in the presence of uncertainty about the future outcomes of those decisions is an important topic in the project management, product development, and engineering domains. The first question in this domain is...

If the future is not identical to the past, how can we make a decision in the presence of this future uncertainty?

The answer is that we need some means of taking what we know about the past and the present and turning it into information about the future. This information can be measurements of actual activities - cost, duration of work, risks, dependencies, performance and effectiveness measures - as well as models and simulations of past and future activities, reference classes, and parametric models.

If the future is identical to the past and the present, then all this data can show us a simple straight line projection from the past to the future.
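When the future really is identical to the past, that projection is just a least-squares line through the actuals. A minimal sketch - the monthly cost figures are invented for illustration:

```python
# Straight-line (least-squares) projection of cumulative cost,
# valid ONLY if the future behaves like the past.
actuals = [100, 210, 290, 410, 500]  # cumulative cost by month (invented)

n = len(actuals)
xs = list(range(1, n + 1))
x_bar = sum(xs) / n
y_bar = sum(actuals) / n

slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, actuals)) / sum(
    (x - x_bar) ** 2 for x in xs
)
intercept = y_bar - slope * x_bar

# Project month 8 by extending the line.
month_8 = intercept + slope * 8
print(round(month_8, 1))  # prints 802.0
```

The projection is only as good as the assumption behind it, which is exactly the question the list below asks.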

But there are some questions:

  • Is the future like the past? Have we just assumed this? Or have we actually developed an understanding of the future by looking into what could possibly change from the past?
  • If there is no change, can that future be sustained long enough for our actions to have a beneficial impact?
  • If we discover the future may not be like the past, what is the statistical behavior of this future, how can we discover this behavior, and how will these changes impact our decision making processes?

The answers to these and many other questions can be found in the mathematics of probability and statistics. Here are some popular misconceptions of mathematical concepts.

Modeling is the Key to Decision Making

"All models are wrong, some are useful," George Box and Norman R. Draper (1987), Empirical Model-Building and Response Surfaces, p. 424, Wiley. ISBN 0471810339.

  • This book is about process control systems and the statistical process models used to design and operate the control systems in chemical plants. (This is a domain I have worked in and developed software for.)
  • This quote has been widely misused, quoted not only out of context but also entirely outside the domain it applies to.
  • All models are wrong says that every model is wrong because it is a simplification of reality. That is the definition of a model.
  • Some models, in the "hard" sciences, are only a little wrong. They ignore things like friction or the gravitational effect of tiny bodies. Other models are a lot wrong - they ignore bigger things. In the social sciences, big things are ignored.
  • Statistical models are descriptions of systems in mathematical language. In many cases we can add a layer of abstraction to enable an inferential procedure.
  • It is almost impossible for a single model to describe a real world phenomenon perfectly, given our own subjective view of the world, since our sensory system is not perfect.
  • But - and this is the critical misinterpretation of Box's quote - successful statistical inference does happen, because there is a certain degree of consistency in the world that we can exploit.
  • So our almost always wrong models do prove useful.

We can't possibly estimate activities in the future if we don't already know what they are

We actually do this all the time. More importantly, there are simple step-by-step methods for making credible estimates about unknown - BUT KNOWABLE - outcomes.
This notion of unknown but knowable is critical. If we really can't know - if it is unknowable - then the work is not a project. It is pure research. So move on, unless you're a PhD researcher.

Here's a little dialog showing how to estimate almost anything in the software development world.
With your knowledge and experience in the domain and a reasonable understanding of what the customer wants (no units of measure for reasonable, by the way, sorry), let's ask some questions.

I have no pre-defined expectation of the duration. That is, I have no anchor to start from. If I did, and didn't have a credible estimate, I'd be a Dilbert manager - and I'm not.

  • Me - now that you know a little bit about my needed feature, can you develop this in less than 6 months?
  • You - of course I can, I'm not a complete moron.
  • Me - good, I knew I was right to hire you. How about developing this feature in a week?
  • You - are you out of your mind? I'd have to be a complete moron to sign up for that.
  • Me - good, still confirms I hired the right person for the job. How about getting it done in 4 months?
  • You - well, that still seems like too long, but I guess it'll be more than enough time if we run into problems, or it turns out you don't really know what you want and change your mind.
  • Me - thanks for the confidence in my ability. How about 6 weeks for this puppy?
  • You - aw come on, now you're making me cranky. I don't know anyone, except someone who has done this already, who can do it in 6 weeks. That's a real stretch for me - a real risk of failure, and I don't want that. You hired me to be successful, and now you're setting me up for failure.
  • Me - good, just checking. How about 2½ months - about 10 weeks?
  • You - yea, that still sounds pretty easy, with some margin. I'll go for that.
  • Me - nice, I like the way you think. How about 7 weeks?
  • You - boy, you're a pushy one, aren't you? That's a stretch, but I've got some sense of what you want. It's possible, but I can't really commit to being done in that time. It'll be risky, but I'll try.
  • Me - good, let's go with 8½ weeks for now, and we'll update the estimate after a few weeks of you actually producing output I can look at.
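The dialog above is a bracketing exercise: each question probes a duration bound, and each answer narrows the interval of credible estimates. A minimal sketch of that procedure - the bounds, tolerance, and acceptance rule are all invented for illustration:

```python
# A sketch of the estimating dialog: each question probes a duration
# bound, each answer narrows the interval of credible estimates.
# Bounds, tolerance, and the acceptance rule are invented.

def bracket_estimate(too_short, too_long, acceptable, tolerance=2.0):
    """Narrow [too_short, too_long] (weeks) by probing midpoints.
    `acceptable(weeks)` stands in for the developer's judgment."""
    while too_long - too_short > tolerance:
        probe = (too_short + too_long) / 2
        if acceptable(probe):
            too_long = probe   # achievable - try something tighter
        else:
            too_short = probe  # too risky - back off
    return (too_short + too_long) / 2

# Hypothetical judgment: anything of 8 weeks or more is acceptable.
estimate = bracket_estimate(1, 26, lambda weeks: weeks >= 8)
print(round(estimate, 1))  # prints 8.0
```

The point is not the code but the process: a handful of yes/no judgments converges on a credible estimate without any anchor to start from.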

Microeconomics of Decision Making

Making decisions about the future in the presence of uncertainty can be addressed with microeconomics principles. Microeconomics is the branch of economics that studies the behavior of individuals and small organizations in making decisions about the allocation of limited resources. Projects have limited resources; businesses have limited resources. All human endeavors have limited resources - time, money, talent, capacity for work, and skills.

The microeconomics of decision making involves several variables:

  • Opportunity cost - the value of what we give up by taking an action. If we decide between A and B and choose B, the opportunity cost is the value of the A we gave up.
  • Marginal cost analysis - the impact of small changes in the "how much" decision.
  • Sunk cost - costs that have already been incurred and cannot be recovered.
  • Present Value - the value today of a future cost or benefit.
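The last item is directly computable. A minimal sketch of discounting a future benefit to today's value - the amount, discount rate, and horizon are invented for illustration:

```python
# Present Value: PV = FV / (1 + r)**n
# where r is the discount rate per period and n the number of periods.

def present_value(future_value, rate, periods):
    return future_value / (1 + rate) ** periods

# A $10,000 benefit three years out, discounted at 8% per year:
pv = present_value(10_000, 0.08, 3)
print(round(pv, 2))  # prints 7938.32
```

This is why costs and benefits occurring at different times cannot be compared until they are discounted to the same point in time.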

Formally, defining this choice problem is simple: there is a state space S, whose elements are called states of nature and represent all the possible realizations of uncertainty; there is an outcome space X, whose elements represent the possible results of any conceivable decision; and there is a preference relation ≿ over the mappings from S to X. †

This of course provides little in the way of making a decision on a project. But the point here is that making decisions in the presence of uncertainty is a well developed discipline. Conjecturing that it can't be done simply ignores this discipline.
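In the simplest case, that formalism reduces to comparing probability-weighted outcomes across the alternatives. A toy sketch - the states, probabilities, and payoffs are all invented for illustration:

```python
# Expected-value comparison of alternatives across states of nature.
# States, probabilities, and payoffs are invented for illustration.
states = {"demand_high": 0.3, "demand_med": 0.5, "demand_low": 0.2}

payoffs = {
    "build": {"demand_high": 500, "demand_med": 200, "demand_low": -150},
    "buy":   {"demand_high": 300, "demand_med": 180, "demand_low": 50},
}

def expected_value(alternative):
    # Probability-weighted payoff over all states of nature.
    return sum(p * payoffs[alternative][s] for s, p in states.items())

best = max(payoffs, key=expected_value)
print(best)  # prints build
```

Real decision analysis layers utility curves and risk attitudes on top of this, but the mechanics - enumerate states, weight outcomes, compare alternatives - are the same.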

The Valuation of Project Deliverables

It's been conjectured that focusing on value is the basis of good software development efforts. But the suggestion that this value is independent of cost is misinformed. Valuation - and the resulting Value used to compare choices - is the process of determining the economic value of an asset, be it a created product, a service, or a process. Value is defined as the net worth: the difference between the benefits produced by the asset and the costs to develop or acquire the asset, all adjusted appropriately for probabilistic risk, at some point in time.

This valuation has several difficulties:

  • Costs and benefits might occur at different points in time and need to be adjusted, or discounted, to account for the time value of money - the fundamental principle that money is worth more today than in the future under ordinary economic conditions.
  • Not all determinants of value are known at the time of the valuation, since there is uncertainty inherent in all project and business environments.
  • Intangible benefits like learning, growth or emergent opportunities, and embedded flexibility are primary sources of value in the presence of uncertainty.

The valuation of the outcomes of software projects depends on the analysis of these underlying costs and benefits. A prerequisite for cost-benefit analysis is the identification of the relevant value and of the cost drivers that produce that value. Both cost and value are probabilistic, driven by uncertainty - both reducible and irreducible.

Modeling Uncertainty

In addition to the measurable benefits and costs of the software project, the valuation process must consider uncertainty, which arises from different sources. Natural uncertainty (aleatory) is irreducible; it relates to variations in the environment's variables. Dealing with irreducible uncertainty requires margin for cost, schedule, and the performance of the outcomes - for both value and cost.

Event-based uncertainty (epistemic) is reducible. That is, we can buy down this uncertainty with our actions. We can pay money to find things out. We can pay money to improve the value delivered from the cost we invest to produce that value.

Parameter uncertainty relates to the estimation of parameters (e.g., the reliability of the average number of defects). Model uncertainty relates to the validity of specific models used (e.g., the suitability of a certain distribution to model the defects). There is a straightforward taxonomy of uncertainty for software engineering that includes additional sources such as scope error and assumption error. The standard approach of handling uncertainty is by defining probability distributions for the underlying quantities, allowing the application of a standard calculus. Other approaches based on fuzzy measures or Bayesian networks consider different types of prior knowledge. ‡
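As a concrete instance of "defining probability distributions for the underlying quantities," here is a sketch that models defects per release as Poisson and asks how likely an unusually bad release is - the mean of 4.0 defects per release is an invented figure:

```python
# Put a probability distribution on the uncertain quantity:
# defects per release modeled as Poisson with an estimated mean.
# The mean of 4.0 defects per release is an invented figure.
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

lam = 4.0  # estimated mean defects per release (illustrative)

# Probability of an unusually bad release: more than 6 defects.
p_more_than_6 = 1 - sum(poisson_pmf(k, lam) for k in range(7))
print(round(p_more_than_6, 4))  # ≈ 0.1107
```

With the distribution in hand, the standard calculus applies: we can compute tail probabilities, set thresholds, and test whether the model fits the observed defect counts.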

The Final Point Once Again

The conjecture that we can make informed decisions about choices in an uncertain future, in the absence of estimates of the impacts of those choices, has no basis in the mathematics of decision making.

This conjecture is simply not true. Any attempt to show it can be done has yet to materialize in any testable manner. This is where basic math skills come into play: there is no math that supports this conjecture, and therefore no way to test it. It's personal opinion, uninformed by any mathematics.

Proceed with caution when you hear this.

† Decision Theory Under Uncertainty, Johanna Etner, Meglena Jeleva, Jean-Marc Tallon, Centre d'Economie de la Sorbonne, 2009.64

‡ "Estimates, Uncertainty and Risk," Kitchenham and Linkman, IEEE Software, 69-74 (May 1997); and "Belief Functions in Business Decisions," in Studies in Fuzziness and Soft Computing, Vol. 88, Srivastava and Mock

Categories: Project Management

Making Decisions In The Presence of Uncertainty

Tue, 06/30/2015 - 03:57

Decision making is hard. Decision making is easy when we know what to do. When we don't know what to do, there are conflicting choices that must be balanced in the presence of the uncertainty surrounding each of those choices. The bigger issue is that important choices are usually the ones where we know the least about the outcomes and about the cost and schedule to achieve those outcomes.

Decision science evolved to cope with decision making in the presence of uncertainty. This approach goes back to Bernoulli in the early 1700s, but remained an academic subject into the 20th century, because there was no satisfactory way to deal with the complexity of real life. Just after World War II, the fields of systems analysis and operations research began to develop. With the help of computers, it became possible to analyze problems of great complexity in the presence of uncertainty.

In 1938, Chester Barnard, author of The Functions of the Executive, brought the term "decision making" from the lexicon of public administration into the business world, where it replaced narrower descriptions such as "resource allocation" and "policy making."

Decision analysis functions at four different levels:

  • Philosophy - uncertainty is a consequence of our incomplete knowledge of the world. In some cases, uncertainty can be partially or completely resolved before decisions are made and resources committed. In many important cases, complete information is not available or is too expensive (in time, money, or other resources) to obtain.
  • Decision framework - decision analysis provides concepts and language to help the decision-maker, making the decision-maker aware of the adequacy or inadequacy of the decision basis.
  • Decision-making process - decision analysis provides a step-by-step procedure that has proved practical in tackling even the most complex problems in an efficient and orderly way.
  • Decision-making methodology - decision analysis provides a number of specific tools that are sometimes indispensable in analyzing a decision problem. These tools include procedures for eliciting and constructing influence diagrams, probability trees, and decision trees; procedures for encoding probability functions and utility curves; and a methodology for evaluating these trees and obtaining information useful to further refine the analysis.
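Of those tools, the decision tree is the easiest to make concrete. Evaluating one is a recursive "rollback": chance nodes take the probability-weighted average of their branches, and decision nodes take the best branch. A sketch with an invented two-option tree:

```python
# Decision-tree "rollback": chance nodes average their branches by
# probability, decision nodes take the best branch.
# The two-option tree below is invented for illustration.

def rollback(node):
    kind, body = node
    if kind == "leaf":
        return body  # payoff at the end of this branch
    if kind == "chance":
        return sum(p * rollback(child) for p, child in body)
    if kind == "decision":
        return max(rollback(child) for child in body)
    raise ValueError(f"unknown node kind: {kind}")

tree = ("decision", [
    ("chance", [(0.6, ("leaf", 100)),   # risky option
                (0.4, ("leaf", -20))]),
    ("leaf", 40),                        # safe option
])
print(rollback(tree))  # prints 52.0
```

Here the risky option's expected value (52) beats the sure 40, so the rollback selects it - and the tree makes visible exactly which probabilities and payoffs that conclusion depends on.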

Each level focuses on a different aspect of the problem of making decisions. And it is decision making that we're after. The purpose of the analysis is not to obtain a set of numbers describing the decision alternatives. It is to provide the decision-maker the insight needed to choose between alternatives. These insights typically have three elements:

  • What is important to making the decision?
  • Why is it important?
  • How important is it?

Now To The Problem at Hand

It has been conjectured ...

No Estimates

The key here - and the critical unanswered question - is this: how can a decision about an outcome in the future, in the presence of that uncertain future, be made in the absence of estimates of the attributes going into that decision?

That is, if we have less than acceptable knowledge about a future outcome, how can we make a decision about the choices involved in that outcome?

Dealing with Uncertainty

All project work operates in the presence of uncertainty. The underlying statistical processes create probabilistic outcomes for future activities. These activities may be probabilistic events, or the naturally occurring variances of the processes that make up the project. 

Clarity of discussion through the language of probability is one of the foundations of decision analysis. The reality of uncertainty must be confronted and described, and the mathematics of probability is the natural language for describing uncertainty.

When we don't have that clarity of language - when redefining or misusing mathematical terms enters the conversation - agreeing on the ways (and there are many ways) of making decisions in the presence of an uncertain future becomes bogged down in approaches that can't be tested in any credible manner. What remains is personal opinion, small-sample anecdotes, and attempts to solve complex problems with simple, and simple-minded, approaches.

For every complex problem there is an answer that is clear, simple, and wrong. - H. L. Mencken


Information Technology Estimating Quality

Mon, 06/29/2015 - 01:07

The estimator's charter is not to state what the developers should do, but rather to provide a reasonable projection of what they will do. - Tom DeMarco

Here are a few resource materials for estimating cost, schedule, and technical outcomes of software intensive systems. In a meeting about managing risk in the presence of uncertainty, it became clear we need to integrate estimating with risk, technical performance measures, measures of effectiveness, measures of performance, cost, and schedule.

Some Recent Resources

There are 1,117 other papers and articles in the Software Cost Estimating folder on my server. These are just a very small sample of how to make estimates.

The notion that we can make decisions in the presence of uncertainty without estimating (#NoEstimates) the outcomes of those decisions can only work if there are no opportunity costs at risk in the future. That is, only if nothing is at risk in making a choice between multiple outcomes.


Everything I Learned About PM Came From an Elementary School Teacher

Thu, 06/25/2015 - 18:08

Our daughter is an elementary school teacher in Austin, Texas. A nice school - the number 2 school in Texas.

While visiting this week, we were talking about a new book a group of us are working on. When I showed her the TOC, she said, "Dad, we do all that stuff (minus the finance side) every day, week, month, semester, and year. It's not that hard. That's what we've been trained to do." OK - but talent, dedication, skill, and a gift for teaching help.

Here's how an elementary school teacher sees her job as the Project Manager of 20 young clients.

  • Plan before starting anything. It's going to go wrong, so know that up front and be able to recognize the train wreck coming and get out of the way.
    • The plan is a strategy for the successful completion of the project.
    • Without the plan, you don't know how to assess progress in terms meaningful to the decision makers. Measures of cost and schedule are measures of effectiveness. Measures of stories produced or features delivered aren't measures of capabilities produced.
    • A Capabilities Based Plan is that measure. What capabilities does the customer need to accomplish the business case or fulfill a mission?
    • In education, Bloom's Taxonomy, with TLOs and ELOs (terminal and enabling learning objectives), defines the capabilities the student will possess at the end of the course.
  • Have a notion of what done looks like, so when you get there, you can stop and move on.
    • Done is defined as possessing a capability to accomplish something.
    • Write this down in units of Effectiveness and Performance.
  • Have your Plan B always ready to go and then start thinking of Plan C when Plan B is under way. No Plan A ever lasts too long in the presence of chaos.
    • Risk management is how adults manage projects - Tim Lister
    • Adult supervision is the role of the teacher. Many times adult supervision is also the role of the project manager.
  • Make sure you‚Äôve got all the right resources lined up and ready to spring into action when things go wrong. Classroom aides, class leaders, parents, staff all ready to go when the plan goes in the ditch.
    • Resource planning is a critical success factor for all projects.
  • Know what can go wrong before you start, steer away from trouble and trouble will stay away.
    • Risk planning is planning. Planning is strategy.
    • Apply good risk management to all activities on the project. Perform some formal sequence of risk management - pick one. My favorite is the SEI Continuous Risk Management process.
  • Separate the trouble makers from the mainstream. You know them on day one.
    • Any good project manager can see trouble coming.
    • Isolate the troubled parts. Assign them to separate teams. Have them fix the problem so the rest of the project isn't impacted by them.
  • Show up early, prepare for the work, clean up afterward, so you can start "clean" again the next day. No less than 100% complete at the end of each period of performance. If not, you'll pay dearly for it later.
    • Being prepared is the major attribute of project success.
    • This means planning.
    • Letting things emerge is fine for small, non-critical projects with low value at risk.
  • Always ask "is this your best work?" and "did you put your name on it?" Otherwise you're creating re-work.
    • Set the highest quality standards possible.
  • No crying when it doesn't work. Redo it and get back on schedule; recess time is schedule margin - you get to stay in and finish your planned work.
    • No whining. Everyone put your "big boy" pants on and do the work needed to get the job done.
  • Take a break, go outside and play, think about what you're going to do next hour. Come back and do it.
    • Have retrospectives.
    • Look back for opportunities for improvement
    • Do Root Cause Analysis to find out the "real" why things didn't work
    • Have fun while still working hard

Is This Your Best Work


Climbing Mountains Requires Good Estimates

Mon, 06/22/2015 - 06:01

There was an interesting post on the #NoEstimates thread that triggered memories of our hiking and climbing days with our children (now grown and gone) and our neighbor who has summited many of the highest peaks around the world.

The quote was "Getting better at estimates is like using time to plan the Everest climb instead of climbing smaller mountains for practice."

A couple of background ideas:

  • The picture above is Longs Peak. We can see Longs Peak from our back deck in Niwot, Colorado. It's one of the 53 14,000-foot mountains in Colorado - the Fourteeners. Longs is one of 4 along the Front Range.

In our neighborhood are several semi-pro mountain climbers. People move to Colorado for the outdoor life, skiing, mountain and road biking, hiking, and climbing. 

Now to the tweet suggesting that getting better at estimating can be replaced by doing (climbing) smaller projects. It turns out estimates are needed for those smaller mountains - estimates are needed for all hiking and climbing. But first...

  • No one is going to climb Everest - and live to tell about it - without first having summited many other high peaks.
  • Anyone interested in the trials and tribulations of Everest should start with Jon Krakauer's Into Thin Air: A Personal Account of the Mt. Everest Disaster.
  • Before attempting - and attempting is the operative word here - any significant peak, several things have to be in place.

Let's start with those Things.

No matter how prepared you are, you need a plan. Practice on lower peaks is necessary but far from sufficient for success. Each summit requires planning in depth. For Longs Peak you need a Plan A, a Plan B, and possibly a Plan C. Most of all, you need strong estimating skills and the accompanying experience to determine when to invoke each plan. People die on Longs because they foolishly think they can beat the odds and press on when they should have invoked Plan B.

So the suggestion that you can summit something big, like any of the Seven Summits, without both deep experience and deep planning means you're likely never going to be heard from again.

So the OP is likely speaking from not having summited much of anything - hard to tell, since no experience resume was attached.

The estimating part is basic. Can we make it to the Keyhole on Longs Peak before the afternoon storms come in? On Everest, can we make it to the Hillary Step before 1:00 PM? No? Turn back - you're gonna die if you continue.
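That turn-around decision is a feasibility estimate: at the current pace, do we reach the checkpoint before the deadline? A minimal sketch with invented numbers:

```python
# The turn-around decision as a feasibility estimate: at the current
# pace, do we reach the checkpoint before the deadline?
# Distance, pace, and deadline are invented for illustration.

def eta_hours(distance_remaining_km, pace_km_per_hour):
    return distance_remaining_km / pace_km_per_hour

hours_until_storms = 4.5       # storms expected in 4.5 hours
eta = eta_hours(3.0, 0.5)      # 3 km of scramble at 0.5 km/h
decision = "continue" if eta < hours_until_storms else "turn back"
print(eta, decision)  # prints 6.0 turn back
```

The same rate-times-remaining-work arithmetic drives the software question that follows.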

Can we make it to the delivery date at the pace we're on now, AND with the emerging situation for the remaining work, AND for the cost we're trying to keep, AND with the capabilities the customer needs? Remember, the use of past performance is fine If and Only If the future is something like the past, or we know something about how the future differs from the past.

When the future is not like the past, we need a Plan B. And that plan has to have estimates of our future capabilities, our cost expenditure rate, and our ability to produce the needed capabilities.


Ask any hiker, climber, development manager, or business person. It's time to stop managing by platitudes and start managing by the principles of good management.


The Art of Systems Architecting

Sun, 06/21/2015 - 23:36

The Art of Systems Architecting† is a book that changed the way I look at the development of software intensive systems. As a manager of software in the system-of-systems domain, this book created a clear and concise vision of how to assemble all the pieces of a system into a single cohesive framework.

One of the 12 principles of the Agile Manifesto is The best architectures, requirements, and designs emerge from self-organizing teams. The self-organizing team part is certainly good. But good architectures don't emerge, unless it's the Ball of Mud architecture. Good architecture is a combination of science, engineering, and art. Hence the title of the book.

Systems architecting borrows from the other architecting disciplines, but the basic attributes are the same: †

  • The architect is principally the agent of the client, not the builder. The architect must act in the best interests of the client.
  • The architect works jointly with the client and the builder on the problem and the definition of the solution. Systems requirements - in the form of needed capabilities, their Measures of Effectiveness (MOE), Measures of Performance (MOP), Key Performance Parameters (KPP), and Technical Performance Measures (TPM) - are an input. The client will provide the requirements, but the architect is expected to jointly help the client determine the requirements.
  • The architect's product is an architectural representation: a set of abstracted designs of the system.
  • The product of the architect is not just a physical representation of the system. Cost estimates are part of any feasible deliverable as well. Knowing the value of some built item requires that we also know its cost. The system architecture must cover physical structure, system behavior, cost, performance, delivery schedule, and the other elements needed to clarify the client's priorities.
  • The initial architecture is a Vision of the future outcome of the work effort. This description is a set of specific models. These include the needed capabilities, the motives for the outcome, beliefs, and unstated assumptions. These distinctions are critical when creating standards for the architecture. TOGAF and DoDAF are examples of architecture standards.

Why Do We Care About This?

When we hear of some new and possibly different approach to anything, we need to ask: what is the paradigm this idea fits into? If it is truly new, what paradigm does it replace? How does that replacement retain the information from the old paradigm that was needed for success? What parts of the old paradigm are replaced for the better, and how can we be assured the replacement is actually better?

One answer starts with the architecture of the paradigm. In the case of managing projects, this is the programmatic architecture: the Principles, Practices, and Processes of the Programmatic Architecture.

Five Immutable Principles of project success can be found in...

5 Immutable Principles

With these principles in place, we can apply Five Practices guided by those Principles.


With the Principles and Practices in place, Processes can be defined for the specific needs of the domain.


So with the Principles, Practices, and Processes in place, we can now ask:

When it is suggested that a new approach be taken, where does that approach fit in the Principles, Practices, and Processes that are in place now? If there is no place for it, how does this new suggestion fulfill the needs of the business that are in place? If those needs aren't fulfilled, does the business acknowledge that those needs are no longer needed?

If not, the chances of this new idea actually being accepted by the business are slim to none.


Systems Thinking, System Engineering, and Systems Management

Sun, 06/21/2015 - 16:46

There are several paradigms of Systems Thinking, ranging from psychobabble to hard-core Systems Engineering. A group of colleagues is starting a book with the working title Increasing the Probability of Project Success; several of the chapters are based on Systems Thinking.

But first, some background on Systems Theory, Systems Thinking, and Systems Engineering.

Systems Theory is the interdisciplinary study of systems in general, with the goal of elucidating principles that can be applied to all types of systems at all nesting levels in all fields of research.

Systems Engineering is an interdisciplinary field of engineering that focuses on how to design and manage complex engineering systems over their life cycles. Systems Management (MSSM, USC, 1980) is an umbrella discipline encompassing systems engineering, managerial finance, contract management, program management, human factors, and operations research, in the military, defense, space, and other complex systems disciplines.

Here are two book references that inform our thought processes.

Systems Thinking - this book is the basis of thinking about systems. It's a manufacturing and Industrial Engineering paradigm. Software intensive systems fit in here as well, since interfaces between system components define the complexity aspects of all systems of systems.

This book opens with an Einstein quote: In the brain, thinking is doing. As engineers - and yes, software engineering is alive and well in many domains - no matter how much we think, we still have to do. We can plan, prepare, and predict, but action occurs through doing.

So when we hear any suggestion, ask: how can this be put to work in some measurable way to assess the effectiveness and performance of the outcomes?

Systems Thinking: Building Maps - this is the companion mapping-processes book. Systems Thinking is the process of understanding how systems influence one another within a world of systems, and has been defined as an approach to problem solving that views our "problems" as parts of an overall system, rather than reacting to a specific part or outcome.

There are many kinds of systems. Hard systems, software systems, evolutionary systems. It is popular to mix these, but that creates confusion and removes the ability to connect concepts with actionable outcomes. 

Cynefin is one of those popular approaches that has no units of measure for complex, complicated, chaotic, and obvious - just soft, self-referencing words.

So in our engineering paradigm this approach is not very useful.

Along with these approaches are some other seminal works.

In The End

Everything's a system. Interactions between components are where the action is and where the problems come from. Any non-trivial system has interactions that must be managed as system interactions. This means modeling these interactions, estimating the impacts of these interactions, and defining the behaviors of these interactions before, during, and after their development.

This means recognizing the criteria for a mature and effective method of managing in the presence of uncertainty.

  • Recognition by clients and providers of the need to architect the system.
  • Acceptance of a discipline for those functions using known methods.
  • Recognition of the separation of value judgements and technical decisions between client, architect, and builder.
  • Recognition that architecting is an art as well as a science, in particular in the development and use of nonanalytical techniques.
  • Effective utilization of an educated professional staff engaged in the process of systems-level architecting.
Related articles Eyes Wide Shut - A View of No Estimates Who's Budget is it Anyway? The Dysfunctional Approach to Using "5 Whys"
Categories: Project Management

Monte Carlo Simulation of Project Performance

Thu, 06/18/2015 - 15:32

Monte-Carlo-3

Project work is random. Most everything in the world is random: the weather, commuter traffic, the productivity of writing and testing code. Few things actually take as long as they are planned. Cost is less random, but there are variances in the cost of labor and the availability of labor. Mechanical devices have variances as well.

The exact fit of a water pump on a Toyota Camry is not the same for each pump. There is a tolerance in the mounting holes and in the volume of water pumped. These are variances in technical performance.

Managing in the presence of these uncertainties is part of good project management. But there are two distinct paradigms of managing in the presence of these uncertainties.

  1. We have empirical data of the variances. We have samples of the hole positions and sizes of the water pump mounting plate for the last 10,000 pumps that were installed. We have samples of how long it took to write a piece of code and the attributes of the code that are correlated to that duration. We have empirical measures.
  2. We have a theoretical model of the water pump in the form of a 3D CAD model, with the materials modeled for expansion, drilling errors of the holes, and other static and dynamic variances. We model the duration of work using a Probability Distribution Function (PDF) and a three-point estimate of the Most Likely, Pessimistic, and Optimistic durations. These can be derived from past performance, but we don't have enough actual data to produce the PDF with a low enough sampling error for our needs.

In the first case we have empirical data. In the second case we don't. There are two approaches to modeling what the system will do in terms of cost and schedule outcomes.
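The second paradigm can be sketched with Python's stdlib, which samples a triangular PDF directly from a three-point estimate. A minimal sketch; the duration numbers are hypothetical:

```python
import random

def sample_durations(optimistic, most_likely, pessimistic, n=10_000, seed=1):
    """Draw task durations from a triangular PDF built from a three-point estimate.
    Note random.triangular takes (low, high, mode)."""
    rng = random.Random(seed)
    return [rng.triangular(optimistic, pessimistic, most_likely) for _ in range(n)]

# Hypothetical task: 5 days optimistic, 8 most likely, 15 pessimistic
samples = sample_durations(5, 8, 15)
mean = sum(samples) / len(samples)  # the triangular mean is (O + M + P) / 3, about 9.33 days
```

Notice the mean sits well above the Most Likely value of 8 days: the right skew of the pessimistic tail is exactly what a single-point estimate hides.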

Bootstrapping the Empirical Data

With samples of past performance and the proper statistical assessment of those samples, we can re-sample them to produce a model of future performance. This bootstrap resampling shares the principle of the second method - Monte Carlo Simulation - but with several important differences.

  • The¬†researcher - and we are researching what the possible outcomes might be from our model - does not know nor have any control of the Probability Distribution Function that generated the past sample. You take what you got.¬†
  • As well we don;'t have any understanding of¬†Why those samples appear as they do. They're just there.¬†We get what we get.
  • This last piece is critical because it prevents us from defining what performance must be in place to meet some future goal. We can't tell what performance we need because we have not model of the¬†need performance, just samples from the past.
  • This results from the statistical conditions that there is a PDF for the process that ius unobserved. All we have is a few samples of this process.
  • With these few samples, we're going to resample them to produce a modeled outcome. This resampling locks in any behavior of the future using the samples from the past, which may or may not actually represent the¬†true underlying behavior. This may be all we can do because we don't have any theoretical model of the process.

This bootstrapping method is quick, easy, and produces a quick and easy result. But it has issues that must be acknowledged.

  • There is a fundamental assumption that the past empirical samples represent the future. That is, the samples contained in the¬†bootstrapped list and their resampling are also contained in all the future samples.
  • Said in a more formal way
    • If the sample of data we have from the past is a reasonable representation of the underlying population of all samples from the work process, then the distribution of parameter estimates produced from the bootstrap model on a series of resampled data sets will provide a good approximation of the distribution of those statistics in the population.
    • With this sample data and its parameters (statistical moments) we can make a good approximation of the future.
  • There are some important statistical behaviors that must be considered, starting with the assumption that future samples are statistically identical to the past samples:
    • Nothing is going to change in the future.
    • The past and the future are statistically identical.
    • In the project domain that is very unlikely.
  • With all these conditions, for a small project with few if any interdependencies and a static work process with little variance, bootstrapping is a nice quick and dirty approach to forecasting (estimating the future) based on the past.
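A minimal sketch of that quick and dirty bootstrap, assuming a hypothetical list of past task durations:

```python
import random

def bootstrap_forecast(past_durations, tasks_remaining, trials=10_000, seed=1):
    """Resample past durations (with replacement) to model total remaining duration.
    The key caveat applies: this assumes the future behaves exactly like the past sample."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.choice(past_durations) for _ in range(tasks_remaining))
        for _ in range(trials)
    )
    return {"p50": totals[trials // 2], "p80": totals[int(trials * 0.80)]}

past = [3, 5, 4, 6, 8, 4, 5, 7]  # hypothetical sampled durations, in days
forecast = bootstrap_forecast(past, tasks_remaining=10)
```

Note there is no way to ask this model "what must the durations be to finish by the need date?" - it can only replay the past.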

Monte Carlo Simulation

This approach is more general and removes many of the restrictions to the statistical confidence of bootstrapping.

Just as a reminder, in principle both the parametric and the non-parametric bootstrap are special cases of Monte Carlo simulations used for a very specific purpose: estimate some characteristics of the sampling distribution. But like all principles, in practice there are larger differences when modeling project behaviors.

In the more general approach of Monte Carlo Simulation, the algorithm repeatedly creates random data in some way, performs some modeling with that random data, and collects some result. For example:

  • The duration of a set of independent tasks.
  • The probabilistic completion date of a series of tasks connected in a network (schedule), each with a different Probability Distribution Function evolving as the project moves into the future.
  • A probabilistic cost correlated with the probabilistic schedule model. This is called the Joint Confidence Level. Both cost and schedule are random variables with time-evolving changes in their respective PDFs.

In practice, when we hear Monte Carlo simulation we are talking about a theoretical investigation, e.g. creating random data with no empirical content - or from reference classes - used to investigate whether an estimator can represent known characteristics of this random data. The (parametric) bootstrap, by contrast, refers to an empirical estimation and is not necessarily a model of the underlying processes, just a small sample of observations independent from the actual processes that generated that data.

The key advantage of MCS is we don't necessarily need past empirical data. MCS can use such data to advantage if we have it, but we don't need it for the Monte Carlo Simulation algorithm to work.

This approach can be used to estimate some outcome, as in the bootstrap, but also to theoretically investigate some general characteristic of a statistical estimator (cost, schedule, technical performance) which is difficult to derive from empirical data.

MCS removes the roadblock heard in many critiques of estimating - we don't have any past data on which to estimate. No problem: build a model of the work and the dependencies between that work, assign statistical parameters to the individual or collected PDFs, and run the MCS to see what comes out.
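Here's a minimal sketch of that algorithm for a serial chain of tasks, with hypothetical three-point estimates and no empirical data at all:

```python
import random

# Hypothetical serial schedule: (optimistic, most_likely, pessimistic) days per task
TASKS = [(2, 4, 9), (5, 8, 20), (3, 5, 10), (1, 2, 6)]

def simulate_completion(tasks, trials=20_000, seed=1):
    """Repeatedly sample each task's triangular PDF and collect the total durations."""
    rng = random.Random(seed)
    return sorted(
        sum(rng.triangular(o, p, m) for o, m, p in tasks)  # triangular(low, high, mode)
        for _ in range(trials)
    )

totals = simulate_completion(TASKS)
p80 = totals[int(len(totals) * 0.80)]  # the 80% confidence completion duration
```

A real tool models the network's dependencies and merge bias, not just a serial sum, but the principle is the same: defined PDFs in, a probabilistic completion distribution out.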

This approach has several critical advantages:

  • The first is a restatement - we don't need empirical data, although it will add value to the modeling process.
    • This is the primary purpose of Reference Classes.
    • They are the raw material for defining possible future behaviors from the past.
  • We can make a judgement of what the future will be like or, most importantly, what the future MUST be like to meet our goals, run the simulation, and determine if our planned work will produce the desired result.

So Here's the Killer Difference

Bootstrapping models make several key assumptions, which may not be true in general. So they must be tested before accepting any of the outcomes.

  • The future is like the past.
  • The statistical parameters are static - they don't evolve with time. That is, the future is like the past, an unlikely prospect on any non-trivial project.
  • The sampled data is identical to the population data both in the past and in the future.

Monte Carlo Simulation models provide key value that bootstrapping can't.

  • Different Probability Distribution Functions can be assigned to work as it progresses through time.
  • The shape of that PDF can be defined from past performance, or defined from the needed performance. This is a CRITICAL capability of MCS.

The critical difference between Bootstrapping and Monte Carlo Simulation is that MCS can show what the future performance has to be to stay on schedule (within variance), on cost, and have the technical performance meet the needs of the stakeholder.

When the process of defining the needed behavior of the work is done, a closed loop control system is put in place. This needed performance is the steering target. Measures of actual performance compared to needed performance generate the error signals for taking corrective actions. Just measuring past performance and assuming the future will be the same is Open Loop control. Any non-trivial project management method needs a closed loop control system.
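As an illustration of that error signal, here is a minimal sketch; the tolerance band and the percent-complete figures are hypothetical:

```python
def corrective_action_signal(needed, actual, tolerance=0.10):
    """Closed-loop control sketch: compare needed performance (the steering target)
    to actual measured performance and emit a signal when the variance leaves the band."""
    error = needed - actual
    if abs(error) <= tolerance * needed:
        return "on plan"
    return "corrective action needed" if error > 0 else "ahead of plan"

# Plan says 60% physical percent complete by now; measurement says 45%
status = corrective_action_signal(needed=0.60, actual=0.45)  # "corrective action needed"
```

Open loop control, by contrast, never computes the error term at all - it just extrapolates the actuals.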

Bootstrapping can only show what the future will be like if it is like the past, not what it must be like. In Bootstrapping this future MUST be like the past. In MCS we can tune the PDFs to show what performance has to be to manage to that plan. Bootstrapping is reporting yesterday's weather as tomorrow's weather - just like Steve Martin in LA Story. If tomorrow's weather turns out not to be like yesterday's weather, you're gonna get wet.

MCS can forecast tomorrow's weather by assigning PDFs to future activities that are different from past activities; then we can make any needed changes in that future model to alter the weather to meet our needs. This is in fact how weather forecasts are made - with much more sophisticated models of course - here at the National Center for Atmospheric Research in Boulder, CO.

This forecasting (estimating the future state) of possible outcomes, and the alteration of those outcomes through management actions - changing dependencies, adding or removing resources, providing alternatives to the plan (on ramps and off ramps of technology, for example), buying down risk, applying management reserve, assessing impacts of rescoping the project, etc. - is what project management is all about.

Bootstrapping is necessary but far from sufficient for any non-trivial project to show up on or before the need date (with schedule reserve), at or below the budgeted cost (with cost reserve), and have the product or service provide the needed capabilities (technical performance reserve).

Here's an example of that probabilistic forecast of project performance from a MCS (Risky Project). This picture shows the probability for cost, finish date, and duration. But it is built on time evolving PDFs assigned to each activity in a network of dependent tasks, which models the work stream needed to complete as planned.

When that future work stream is changed to meet new requirements, unfavorable past performance and the needed corrective actions, or changes in any or all of the underlying random variables, the MCS can show us the expected impact on key parameters of the project so management intervention can take place - since Project Management is a verb.


The connection between the Bootstrap and Monte Carlo simulation of a statistic is simple.

Both are based on repetitive sampling and then direct examination of the results.

But there are significant differences between the methods (hence the difference in names and algorithms). Bootstrapping uses the original, initial sample as the population from which to resample. Monte Carlo Simulation uses a data generation process, with known values of the parameters of the Probability Distribution Function. A common algorithm for correlating the variables in an MCS is Lurie-Goldberg. Monte Carlo is used to test that the results of the estimators produce desired outcomes on the project. And if not, it allows the modeler and her management to change those estimators and then manage to the changed plan.

Bootstrap can be used to estimate the variability of a statistic and the shape of its sampling distribution from past data and then - assuming the future is like the past - make forecasts of throughput, completion, and other project variables.

In the end, the primary difference (and again the reason for the name difference) is that Bootstrapping is based on unknown distributions - sampling and assessing the shape of the distribution adds no value to the outcomes - while Monte Carlo is based on known or defined distributions, usually from Reference Classes.

Related articles Do The Math Complex, Complexity, Complicated The Fallacy of the Planning Fallacy

Top 50 Influencers in Project Management

Wed, 06/17/2015 - 04:42

Screen Shot 2015-06-16 at 8.39.29 PM


Eyes Wide Shut - A View of No Estimates

Wed, 06/17/2015 - 03:15

When we hear all the difficulties, misuse, abuse, and inabilities for making software development estimates, the real question is what does the business think of this?

Good question. When our children were young - 6 or 7 - they asked the big question once they started receiving an allowance for household chores.

What's the reason we have money? The best answer I could think of was: money provides the vehicle for the exchange of value. I pay you for doing something around the house which is of value to me or your mother. The money you receive can then be used to buy something of value for you. You can exchange your money for that valued thing.

Money is the carrier of value between two parties in the transaction.

The basis of a successful business is the exchange of cost (money) for value. Either internally for building a product or externally for providing a service to an outside firm.

IMG00007-20101106-1403

In practice this value and the cost to achieve this value operate in the presence of uncertainty in non-trivial situations. If I see a bicycle helmet I want (notice I didn't say need, since I have a perfectly good and nice helmet), I can look at the price tag and determine if my old helmet needs to be replaced. That is, is the value of the new helmet sufficient that I'm willing to pay the cost and absorb the loss of the old helmet?

The price tag for the Helmet is clear. I might be able to get a discount. The value of the helmet is up to me. As a minimum the value is equal to the cost. This is the basis of Earned Value Management. But the value of the helmet may be immeasurable if it saves me from a brain trauma if I were to crash. I've never crashed or been hit on my road bike. A few small spills on the mountain bike.

The Business Notion of Value in Exchange for Cost

In the business of software development the exchange process takes place in the presence of uncertainty, there are no price tags attached to the development process in the way the helmet does. Uncertainty about the value - beyond the cost. Uncertainty about the cost. Uncertainty about the risks, and all the other random variables associated with the project work.

When making decisions in the presence of these uncertainties, we can consider the opportunity cost. This is the loss of potential gain from other alternatives when one alternative is chosen. If I choose one path for one cost over another path of another cost, what's the lost opportunity? This decision making process is the basis of Microeconomics as it is applied to Software Development.

Opportunity costs are fundamental costs in economics, and are used in computing cost benefit analysis of a project. Such costs, however, are not recorded in the account books but are recognized in decision making by computing the cash outlays and their resulting profit or loss.

So let's ask a simple question...

How can we make those opportunity cost decisions in the presence of the uncertainty of their specific value? Meaning, when the "value" of the Value - an actual number of Dead Presidents (dollars) - is a random variable dependent on a variety of things, can we make a decision in the presence of this uncertainty? This uncertainty is around the precision and accuracy of our knowledge of this "value."

The answer is we need to estimate the number of Dead Presidents on both sides - the cost of the value and the value of the value. Both sides of the equation are needed. This means we need to know something about the value returned in exchange for our cost to acquire that value. A common - and simple - way to measure that is the Return on Investment (ROI), where the investment is the cost we've assigned to the expected value to be returned. The formula is

ROI = (Value - Cost) / Cost
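Since both terms of this formula are random variables in practice, the ROI is a random variable too. A minimal sketch, with hypothetical three-point estimates for value and cost (the numbers are illustrative only, not from any real project):

```python
import random

def roi(value, cost):
    # ROI = (Value - Cost) / Cost
    return (value - cost) / cost

def median_roi(trials=10_000, seed=1):
    """Sample value and cost from hypothetical triangular PDFs; return the median ROI."""
    rng = random.Random(seed)
    samples = sorted(
        roi(rng.triangular(90, 200, 130),   # value: 90 optimistic, 200 pessimistic, 130 most likely ($1,000s)
            rng.triangular(60, 150, 90))    # cost: 60 optimistic, 150 pessimistic, 90 most likely
        for _ in range(trials)
    )
    return samples[trials // 2]

m = median_roi()  # a 50% confidence ROI, not a single-point answer
```

A point calculation like roi(125_000, 100_000) = 0.25 hides that spread; the decision maker needs the confidence in the return, not just the point value.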

Evade

The two variables are in most instances random variables. If we're investing in a new helmet like a really nice Specialized S-Works, we can make the trade-offs - the opportunity costs - between the $250.00 for the helmet and the potentially immeasurable value of saving my head on the mountain bike while riding with our collegiate racer son. The current helmet will likely do that job as well, but that carbon fiber S-Works helmet will look very cool while riding next to our son's S-Works Epic 29er. See, you can buy cool.

But now we need to determine the opportunity cost for a software system, a feature in a software system, or something else related to a software system. That is we need to decide in the presence of the uncertainty created by the randomness of the many variables of the software process.

How can we do that? Well, and here's the punch line...


Can we make that decision in the absence of making an estimate? Sure, we can guess, we can make a blind decision, we can just decide and suffer the consequences of that decision. But here's the rub, as Willy Shakespeare once had Hamlet say: if it's not our money we're exchanging for the produced or acquired value, those providing the money have a vested interest in knowing how much money is needed, and when the Value produced or acquired for the money will be returned.

This is the basis of Microeconomics which is defined as the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources.

So when someone says we can make decisions without estimates, they're wrong - unless they have no concern for the loss resulting from the write-off of the opportunity cost. It is doubtful that those paying for the value have much interest in that.


Can we make decisions without an estimate of the outcomes? Not in any credible way I know of. If there is a credible way, it has yet to be stated. Slicing is estimating. Using past performance for forecasting (estimating outcomes in the future) is estimating.

So the simple answer to the conjecture of making decisions in presence of uncertainty in the absence of an estimate is also simple


And anyone saying they can needs to come to grips with the principles of microeconomics of software development.

Related articles There is No Such Thing as Free Who's Budget is it Anyway? Just Because You Say Words, It Doesn't Make Then True Essential Reading List for Managing Other People's Money The Dysfunctional Approach to Using "5 Whys"

Carl Sagan's BS Detector

Sat, 06/06/2015 - 00:24

There are several BS detector paradigms. One of my favorites is Carl Sagan's. This one has been adapted from Reality Charting, the book that goes along with the Apollo Method Root Cause Analysis process we use in our governance process in our domain. Any outage, any hard break, any disruption to the weekly release process is cause for Root Cause Analysis.

Here's Carl's check list applied to the #NoEstimates conjecture that decisions can be made in the absence of estimates

  1. Seek independent facts. Remember, a fact is a cause supported by sensed evidence and should be independently verified by you before it can be deemed legitimate. If you cannot find sensed evidence of causal relationships you should be skeptical. 
    • Writing software for money is based on making decisions in the presence of uncertainty.
    • A primary process for making those decisions is Microeconomics and Opportunity Cost. What's the lost opportunity for one decision over another?
    • To do this we need to estimate both the cost that results from choosing one decision over another and the benefits - the opportunity - from that decision.
  2. Welcome open debate on all points of view. Suspend judgment about the event or claim until all cause paths have been pursued to your satisfaction using RealityCharting¬ģ.¬†
    • The notion that #NoEstimates is¬†exploring and seeking conversation is not evidenced.
    • Any challenge questions are labeled as trolling, harassing, and unwanted.
    • You'd think if #NoEstimates is the next big thing, responding to any and all challenges would be a terrific marketing opportunity. It's basic sales strategy: find all objections to your product's value proposition, overcome them, and the deal is closed.
  3. Always challenge authority. Ask to be educated. Ask the expert how they came to know what they know. If they cannot explain it to your satisfaction using evidence-based causal relationships then be very skeptical. 
    • This is not the same as challenge everything.
    • This is: when you hear something that is not backed by tangible evidence, challenge it.
    • Ask the person making the claim how they know it will work outside their personal anecdotal experience.
  4. Consider more than one hypothesis. The difference between a genius and a normal person is that when asked to solve a problem the genius doesn’t look for the right answer, he or she looks for how many possible solutions he or she can find. A genius fundamentally understands that there is always another possibility, limited by our fundamental ignorance of what is really happening. 
    • The claim of the self-proclaimed thought leaders - agile leaders were challenged, I've been challenged, therefore I must be an agile thought leader - needs to be tested with actual evidence.
  5. Don’t defend a position because it is yours. All ideas are prototypical because there is no way we can really know all the causes. Seek to understand before seeking to be understood. 
    • Use this on both sides of the conversation.
    • Where can non estimates be applied?
    • Where is it not¬†applicable
    • Provide evidence on both sides.
  6. Try to quantify what you think you know. Can you put numbers to it? 
    • Show the numbers
    • Do the math
  7. If there is a chain of causes presented, every link must work. Use RealityCharting¬ģ to verify that the chain of causes meets the advanced logic checks defined above and that the causes are sufficient in and of themselves.¬†
    1. If estimating is the smell of dysfunction, show the causal chain of the dysfunction.
    2. Confirm that estimating is actually the root cause of management dysfunction.
    3. Is misuse and abuse of estimates caused by the estimating process?
  8. Use Occam’s razor to decide between two hypothesis; If two explanations appear to be equally viable, choose the simpler one if you must. Nature loves simplicity.
    • When conjecturing that stopping estimates fixes the dysfunction, is this the simplest solution?
    • How about stopping Bad Management practices?
    • In the upcoming #NoEstimates book, Chapter 1 opens with a blatant Bad Management process of assigning a project to an inexperienced PM that is 100's of times bigger than anything she has ever seen.
    • Then blaming the estimating process for her failure.
    • This notion is continued with references to other failed projects, without ever seeking the actual root cause of the failure.
    • No evidence is ever presented to show that stopping estimates would have made the project successful.
  9. Try to prove your hypothesis wrong. Every truth is prototypical and the purpose of science is to disprove that which we think we know. 

    • This notion is lost on those conjecturing #NoEstimates is applicable beyond their personal anecdotal experience.
    • Testing the idea in external domains - not just finding a CEO that supports the notion.
    • The null hypothesis test, H0, is basic High School statistics.
    • It is missing entirely there.

    Use carefully designed experiments to test all hypotheses.
    • No such thing in the #NoEstimates paradigm

So it's becoming clear #NoEstimates does not pass the smell test of the basic BS meter.

The Big Questions

  1. What's the answer to: how can we make a decision in the presence of uncertainty, not estimate, and NOT violate the core principles of Microeconomics?
  2. It's not about the developers' like or dislike of estimates. When I was a developer - radar, realtime controls, flight avionics, enterprise IT - I never liked estimates. It's about business. It's not our money. This notion appears to be completely lost. It's the millennials' view of the world. We have two millennials (25 and 26): It's all about ME. Even if those suggesting this are millennials, the message appears to be it's all about me. Go talk to the CFO.

The End

The rhetoric on #NoEstimates has now reached a fever pitch: paid conferences, books, blatant misrepresentations. Time to call BS and move on. This is the last post. I've met many interesting people in both good and bad ways, and will stay in touch. So long, and thanks for all the fish, as Douglas Adams says. Those with the money will have the final say on this idea.


Myths Abound

Thu, 06/04/2015 - 18:35

Screen Shot 2015-06-04 at 8.43.58 AM

Myths, misinformation, and magic bullets abound in the software development business. No more so than in the management and estimating of complex development projects.

The Standish report is a classic example of applying How to Lie with Statistics. This book is a must read for anyone responsible for spending other peoples money.

The current misrepresentation approach is to quote people like Bent Flyvbjerg - who, by the way, does superior work in the domain of mass transportation and public works. Bent, along with many others (one of whom is a client), has studied the problems with Mega Projects. The classic misuse of these studies starts with reading the introduction of a report and going no further. Here's a typical summary.

9/10 Costs (are) underestimated. 9/10 Benefits (are) overestimated. 9/10 Schedules (are) underestimated.

OK, we all know that's what the report says, now what?

  • Do you have a root cause for each of the project's overages?
  • Did those root causes have sufficient assessment to show:
    • Primary Effect – any effect we want to prevent.
    • Action – momentary causes that bring conditions together to cause an effect.
    • Conditions – the fundamental causal elements of all that happens. A condition is made up of an effect and its immediate causes that represent a single causal relationship.
      • As a minimum, the causes in this set consist of an action and one or more conditions.
      • Causal sets, like causes, cannot exist alone.
      • They are part of a continuum of causes with no beginning or end, which leads to the next principle that¬†Causes and Effects are Part of an Infinite Continuum of Causes.

NO?, then the use of reports and broad unqualified clips from reports is just Lying With Statistics.

The classic example from the same source states that Steve McConnell PROVES estimates can't be done in his book - which is of course the antithesis of both the title and the content of the book.

This approach is pervasive in places where doing your homework appears to be a step that was skipped.

From our own research in DOD ACAT1 programs (>$5B qualifies for Megaprojects) here's the Root Cause of program problems in our domain.

Screen Shot 2015-06-04 at 9.07.45 AM

When we hear an extracted statement from a source in another domain - Bent's, for example, is large construction infrastructure projects: roads, rail, ports - moved to our domain without the underlying details of the data, the root causes, and all the possible corrective actions to avoid the problem in the first place, that idea is basically bogus. Don't listen. Do your own investigation, and learn how not to succumb to those who Lie With Statistics.

So let's ask some simple questions when we hear there are problems with our projects, or when problems in other domains are used to convince us they're applicable to our domain.

  • Was there a credible set of requirements that were understood by those initiating the project?
    • No? You're going to be over budget, late, and the product is not likely to meet the needs of those paying.
  • Was there a credible basis of estimate for the defined work?
    • No? Then whatever estimate was produced is not going to match what the project actually costs, its duration, or the planned value.
  • Was the project defined in a Risk Adjusted manner for both reducible and irreducible uncertainties?
    • No? Then those uncertainties are still there and will cause the project to be over budget and late.
    • If the reducible risks are unaddressed, they will come true, and their probabilistic outcomes will drive the project over budget, late, and unable to provide the needed capabilities.
    • If the irreducible risks don't have margin, you're late and over budget before you even start.
  • Was there a means of measuring physical percent complete in meeting the needed Effectiveness and Performance measures?
    • No? Then money will be spent and time will pass and there is no way to know if the project is actually producing value and meeting its goals.

These questions and their lack of answers are at the heart of most project performance problems. Pointing out all the problems is very easy. Providing corrective actions once the root cause is discovered is harder - mandatorily harder, by the way. Because Risk Management is How Adults Manage Projects.

First let's look at what Bent says

He states that the political economy of megaprojects - massive investments of a billion dollars or more in infrastructure or technology - consistently ends up costing more with smaller benefits than projected, and almost always ends up with costs that exceed the benefits.

So the first question is: are we working in the megaproject domain? No? Then we can't assert Bent's assessments are applicable. If we do, we're Lying with Statistics. (Read Huff's book to find out why.)

Flyvbjerg then explores the reasons for the poor predictions and poor performance of giant investment projects and what might be done to improve their effectiveness. Have we explored the reasons why our projects overrun? No? Then we haven't done our homework and are speculating on false data - another How to Lie With Statistics.

Stating that projects overrun 9 out of 10 times without also finding the reasons for this is the perfect How to Lie with Statistics: make a statement, provide no supporting data, be the provocateur.

The End

When we read a statement without a domain or context, without a corrective action, taken out of context and intended to convey a different message than the original author's, without evidence it is applicable in our domain - that is Lying with Statistics. Don't listen; go find out for yourself.

Related articles

  • The Dysfunctional Approach to Using "5 Whys"
  • There is No Such Thing as Free
  • Mr. Franklin's Advice
  • Essential Reading List for Managing Other People's Money
  • Eyes Wide Shut - A View of No Estimates
Categories: Project Management

The Dysfunctional Approach to Using "5 Whys" - Redux

Wed, 06/03/2015 - 21:54

It's been popular recently in some agile circles to mention we use the 5 Whys when asking about dysfunction. This common and misguided approach assumes - wrongly - that causal relationships are linear and problems come from a single source. For example:

Estimates are the smell of dysfunction. Let's ask the 5 Whys to reveal these dysfunctions

The natural tendency is to assume that in asking the 5 Whys there is a single thread connecting cause and effect from beginning to end. This single source of the problem - the symptom - is labeled the Root Cause. The question is: is that root cause the actual root cause? The core problem is that the 5 Whys is not really seeking a solution, but just eliciting more symptoms masked as causes.

A simple example illustrates the problem from Apollo Root Cause Analysis.

Say we're in the fire prevention business. If preventing fires is our goal, let's look for the causes of fire and determine the corrective actions needed to actually prevent fires from occurring. In this example let's say we've identified three potential causes of fire. There is ...

  1. An ignition source
  2. Combustible material
  3. Oxygen

So what is the root cause of the fire? To prevent the fire - and in the follow-on example, prevent a dysfunction - we must find at least one cause of the fire that can be acted on to meet the goals and objectives of preventing the fire AND is within our control.

Here's a briefing used now for our development and deployment processes in the health insurance domain

Root cause analysis master plan from Glen Alleman

The notion that Estimates are the smell of dysfunction in a software development organization and asking the 5 Whys in search for the Root Cause is equally flawed. 

The need to estimate or not estimate has not been established. It is presumed that the estimating process creates the dysfunction, and then the search - through the 5 Whys - is a false attempt to categorize the root causes of this dysfunction. The supposed dysfunction is then reverse engineered to be connected to the estimating process. This is not only a naïve approach to solving the dysfunction, it inverts the logic by ignoring the need to estimate. Without confirmation that estimates are needed or not needed, the search for the cause of the dysfunction has no purposeful outcome.

The decision that estimates are needed or not needed does not belong to those being asked to produce the estimates. That decision belongs to those consuming the estimate information in the decision-making process of the business - those whose money is being spent.

And of course those consuming the estimates need to confirm they are operating their decision-making processes in a framework that requires estimates. It could very well be that those providing the money don't actually need an estimate - the value at risk may be low enough, say 100 hours of development for a DB upgrade. But when the value at risk is sufficiently large - that determination made again by those providing the money - there is a legitimate business need to know how much, when, and what. In this case, decisions are based on the Microeconomics of opportunity cost for uncertain future outcomes.

This is the basis of estimating and the determination of the real root causes of the problems with estimates. Saying we're bad at estimating is NOT the root cause. And it is never the reason not to estimate. If we are bad at estimating, and if we do have confirmation and optimism biases, then fix them. Remove the impediments to produce credible estimates. Because those estimates are needed to make decisions in any non-trivial value at risk work. 


Related articles

  • Let's Get The Dirt On Root Cause Analysis
  • The Fallacy of the Planning Fallacy
  • Mr. Franklin's Advice
  • The Dysfunctional Approach to Using "5 Whys"
  • Essential Reading List for Managing Other People's Money
Categories: Project Management

Humpty Dumpty and #NoEstimates

Wed, 06/03/2015 - 06:06

When I use a word, Humpty Dumpty said in a rather scornful tone, it means just what I choose it to mean - neither more nor less.

The question is, said Alice, whether you can make words mean so many different things.

The question is, said Humpty Dumpty, which is to be master.

Through the Looking Glass, Chapter 6

The mantra of #NoEstimates is that No Estimates is not about Not Estimating. Along with that oxymoron comes

Forecasting is Not Estimating

  • Forecasting the future based on past performance is not the same as estimating the future from past performance.
  • The Humpty Dumpty logic is Forecasting ‚ȆEstimating.

This of course redefines the standard definitions of both terms. Estimating is a rough calculation or judgment of a value, number, quantity, or extent of some outcome.

An estimate is Approximation, prediction, or projection of a quantity based on experience and/or information available at the time, with the recognition that other pertinent facts are unclear or unknown.

  • Let‚Äôs estimate how many Great Horned Owls are in the county by sampling.
  • Let's estimate the total cost of this project using reference classes assigned to work element durations and running a Monte Carlo simulation.

Forecasting is a prediction of a future event

  • Let's produce a weather forecast for the next five days.

Both Estimating and Forecasting result in a probabilistic output in the presence of uncertainty
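A minimal sketch of what that probabilistic output looks like - a Monte Carlo simulation over hypothetical three-point (min, most likely, max) task estimates; every number here is an illustrative assumption:

```python
import random

# Hypothetical three-point estimates (min, most likely, max) in days
tasks = [(3, 5, 9), (2, 4, 8), (5, 8, 14)]

def simulate_total(tasks):
    """One Monte Carlo trial: sample each task's duration and sum them."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)

random.seed(7)
trials = sorted(simulate_total(tasks) for _ in range(10_000))
p50, p80 = trials[5_000], trials[8_000]
print(f"50% confidence: {p50:.1f} days, 80% confidence: {p80:.1f} days")
```

The output is a distribution, not a point: quoting only the most likely values (17 days here) ignores the right tail where overruns live.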

Slicing is Not Estimating??

Slicing work into smaller pieces, so that a "standard" size can be used to project the work effort and completion time, is a standard basis of estimate in many domains. Yet slicing is Not Estimating in the #NoEstimates paradigm. In fact slicing is Estimating - another inversion of the terms, where

No means Yes

Past Performance is #NoEstimates

Using past performance to estimate future performance is core to all estimating processes. Time series analysis to estimate possible future outcomes is easily done with ARIMA, four lines of R, and some raw data, as shown in The Flaw of Averages. But as described there, care is needed to confirm the future is like the past.
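As an illustrative sketch of that idea - a minimal AR(1) fit standing in for a full ARIMA model, with made-up weekly throughput as the past-performance data (the numbers and the AR(1) simplification are assumptions, not the blog's own model):

```python
import statistics

# Hypothetical weekly throughput (completed items) - past performance
history = [12, 14, 13, 16, 15, 17, 16, 18]

# Fit AR(1): x[t] ~ mean + phi * (x[t-1] - mean), a simple stand-in for ARIMA
mean = statistics.fmean(history)
dev = [x - mean for x in history]
phi = sum(a * b for a, b in zip(dev, dev[1:])) / sum(a * a for a in dev[:-1])

# Forecast the next 4 weeks; deviations decay toward the long-run mean
last = history[-1]
forecast = []
for _ in range(4):
    last = mean + phi * (last - mean)
    forecast.append(round(last, 1))
print(forecast)
```

The forecast decays toward the long-run mean of the sample - which is exactly the caveat above: the fitted mean and phi are only meaningful if the process that generated the past is still the process generating the future.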

When We Redefine Words to Suit Our Needs We're Humpty Dumpty

Lewis Carroll's Alice in Wonderland is a political allegory of 19th century England. When #NoEstimates redefines established mathematical terms like Forecasting and Estimating, and ignores the underlying mathematics of time series forecasting - ARIMA for example - they are willfully ignoring established practices and replacing them with their own untested conjectures.

No Estimates

The key claim here is ways to make decisions with NO ESTIMATES. OK, show how any of those ways is not actually an estimating technique - no matter how simple or flawed an estimating technique it may be.

Related articles

  • Mr. Franklin's Advice
  • There is No Such Thing as Free
  • The Fallacy of the Planning Fallacy
  • Do The Math
  • Monte Carlo Simulation of Project Performance
  • Essential Reading List for Managing Other People's Money
Categories: Project Management

Whose Budget is it Anyway?

Wed, 05/27/2015 - 05:30

When we hear things like ...

Why promising nothing delivers more and planning always fails,
 It's in doing the work that we discover the work that we must do,
 If estimates were real the team wouldn't have to know the delivery date, they just work naturally and be on date.

You have to ask: do these posters have any understanding that it's not their money? That all project work is probabilistic? That nonlinear, non-stationary, stochastic processes drive uncertainty for all work in ways that cannot be controlled by disaggregating the work (slicing), or by assuming that work elements are independent of other work elements, in all but the most trivial project contexts?

Systems Thinking and Probability†

All systems where optimum technical performance is needed require a negative feedback loop as the basis for controlling the work in order to arrive on the planned date, with the planned capabilities, for the planned budget. If there is no need to arrive as planned or as needed, then no control system is needed, just spend until told to stop.

The negative feedback loop as a control system, is the opposite of the positive feedback loop. In chemistry a positive feedback loop is best referred to as an explosion. In project management a positive feedback loop results in a project that requires greater commitment of resources to produce the needed capabilities beyond what was anticipated. That is cost and schedule overrun and lower probability of technical success.

A project is a type of complex adaptive system that acquires information about its environment and the interactions between the project elements, identifies information of importance, and places that information within a context, model, or schema, and then acts on this information to make decisions.

The individual members of the project act as complex adaptive systems themselves and exert influence on the selection of both the schema and the adaptive forces used to make decisions. The extent to which learning produces adaptive or maladaptive behavior determines the survival or failure of the project and the organization producing value from the project.

Managing in the Presence of Uncertainty

Uncertainty creates risk. Reducible risk and irreducible risk. This risk by its nature is probabilistic. Complex systems tend to organize themselves in a normal distribution of outcomes ONLY if each individual element of the system is Independent and Identically distributed. If this is not the case, long tailed distributions result and are the source of Black Swans. And these Black Swans are unanticipated cost and schedule performance problems and technical failures we are familiar with in the literature. The project explodes. 
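The independence point can be checked by simulation. In this illustrative sketch (the distributions, parameters, and shared-driver structure are all assumptions for the demonstration), twenty task durations are summed either as independent draws or with a shared risk driver coupling them:

```python
import random

random.seed(11)
N_TASKS, TRIALS = 20, 20_000

def simulate(correlated: bool) -> list[float]:
    """Sum N_TASKS task durations per trial; correlated tasks share a driver."""
    totals = []
    for _ in range(TRIALS):
        shared = random.expovariate(1.0)  # common risk driver for this trial
        if correlated:
            # every task inherits the same shared delay -> fat right tail
            t = sum(5 + shared + random.expovariate(1.0) for _ in range(N_TASKS))
        else:
            # tasks are independent and identically distributed
            t = sum(5 + 2 * random.expovariate(1.0) for _ in range(N_TASKS))
        totals.append(t)
    return sorted(totals)

spread = {}
for label, corr in (("independent", False), ("correlated", True)):
    t = simulate(corr)
    p50, p95 = t[TRIALS // 2], t[int(TRIALS * 0.95)]
    spread[label] = p95 - p50
    print(f"{label}: P50={p50:.0f}  P95={p95:.0f}  tail spread={p95 - p50:.0f}")
```

Both cases have the same mean total, but the correlated case has a far wider right tail - the tail where those unanticipated cost and schedule explosions live.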


So We Arrive at the End

To manage in the presence of an uncertain future for cost, delivery date, and delivered capabilities to produce the value in exchange for the cost, we need some mechanism to inform our decision making process based on these random variables. The random variables that create risk. Risk that must be reduced to increase the probability of success. The reducible risk and the irreducible risk.

This mechanism is the ability to estimate the impact of any decision while making the trade-off between decision alternatives - this is the basis of Microeconomics: the trade-off of a decision based on the opportunity cost of the collection of decision alternatives.

Anyone conjecturing that decisions can be made in the presence of uncertainty without estimating the impacts of those decisions has willfully ignored the foundational principles of Microeconomics.

The only possible way to make decisions in the absence of estimating the impact of a decision is when the decision has a trivial value at risk. Many decisions are just that: if I decide wrong, the outcome has little or no impact on cost, schedule, or needed technical performance. In this case Not Estimating is a viable option. For all other conditions, Not Estimating results in a Black Swan explosion of the customer's budget, timeline, and expected beneficial outcomes based on the produced value.
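That threshold can be framed as a one-line expected-value comparison. This sketch is purely illustrative - the function name, probabilities, and dollar figures are assumptions, not rules from the post:

```python
def estimating_pays_off(value_at_risk: float, p_bad_choice: float,
                        loss_fraction: float, estimate_cost: float) -> bool:
    """Compare the expected loss an estimate avoids against its cost.

    All inputs are illustrative assumptions:
      value_at_risk  - money committed by the decision
      p_bad_choice   - chance of picking the wrong alternative unaided
      loss_fraction  - share of the value lost when that happens
      estimate_cost  - cost of producing a credible estimate
    """
    expected_loss_avoided = value_at_risk * p_bad_choice * loss_fraction
    return expected_loss_avoided > estimate_cost

# Trivial value at risk: a 100-hour DB upgrade at, say, $100/hr
print(estimating_pays_off(10_000, 0.2, 0.5, 2_000))       # False - not worth it
# Non-trivial value at risk: a $2M development effort
print(estimating_pays_off(2_000_000, 0.2, 0.5, 20_000))   # True - clearly worth it
```

The point is not the particular numbers but the structure: whether to estimate is itself a Microeconomic trade-off driven by the value at risk.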

† Technical Performance Measurement, Earned Value, and Risk Management: An Integrated Diagnostic Tool for Program Management, Commander N. D. Pisano, SC, USN, Program Executive Officer Air NSW, Assault and Special Missions Programs (PEO(A)). Nick is a colleague. This paper is from 1991 defining how to plan and assess performance for complex, emergent systems. 

Related articles

  • Just Because You Say Words, It Doesn't Make Them True
  • There is No Such Thing as Free
  • Essential Reading List for Managing Other People's Money
  • The Dysfunctional Approach to Using "5 Whys"
  • Mr. Franklin's Advice
Categories: Project Management

Memorial Day

Sun, 05/24/2015 - 19:27

In case you thought it was about the 3-day weekend, parties, and the beach.

Thanks to all my neighbors, friends, and colleagues for their service.

Categories: Project Management

Software for the Mind

Fri, 05/22/2015 - 00:21

The book Software for Your Head was a seminal work when we were setting up our Program Management Office in 2002 for a mega-project to remove nuclear waste from a very contaminated site in Golden, Colorado.

Here's an adaptation of those ideas to the specifics of our domain and problems

Software for your mind from Glen Alleman

This approach was a subset of a much larger approach to managing in the presence of uncertainty, very high risk, and even higher rewards, all on a deadline and a fixed budget. As was stated in the Plan of the Week:
  • Monday - Where are we going this week?¬†
  • Daily - What are we doing along the way?
  • Friday - Where have we come to?

Do this every week, guided by the 3 year master plan and make sure no one is injured or killed.

That project is documented in the book Making the Impossible Possible summarized here.

Making the impossible possible from Glen Alleman

Related articles

  • The Reason We Plan, Schedule, Measure, and Correct
  • The Flaw of Empirical Data Used to Make Decisions About the Future
  • There is No Such Thing as Free
Categories: Project Management

We've Been Doing This for 20 Years ...

Thu, 05/21/2015 - 03:58

We've been doing this for 20 years and therefore you can as well

is a common phrase used when asked: in what domain does your approach work? Of course, without a test of that idea outside the domain in which the anecdotal example arose, it's going to be hard to know if that idea is actually credible beyond those examples.

So if we hear we've been successful in our domain doing something or better yet NOT doing something, like say NOT estimating, ask in what domain have you been successful? Then the critical question, is there any evidence that the success in that domain is transferable to another domain? This briefing provides a framework - from my domain of aircraft development - illustrating that domains vary widely in their needs, constraints, governance processes and applicable and effective approaches to delivering value.

Paradigm of agile project management from Glen Alleman

Google seems to have forgotten how to advance the slides on the Mac, so click on the presentation title (paradigm of agile PM) to do that. Safari works.

Related articles

  • The Reason We Plan, Schedule, Measure, and Correct
  • The Flaw of Empirical Data Used to Make Decisions About the Future
  • There is No Such Thing as Free
  • Root Cause Analysis
  • Domain is King, No Domain Defined, No Way To Test Your Idea
  • Mr. Franklin's Advice
Categories: Project Management

Essential Reading List for Managing Other People's Money

Mon, 05/18/2015 - 15:58

Education is not the learning of facts, but the training of the mind to think - Albert Einstein 

So if we're going to learn how to think about managing the spending of other people's money in the presence of uncertainty, we need some basis of education.

Uncertainty is a fundamental and unavoidable feature of daily life. Personal life and the life of projects. To deal with this uncertainty intelligently we represent and reason about these uncertainties. There are formal ways of reasoning (logical systems for reasoning found in the Formal Logic and Artificial Intelligence domains) and informal ways of reasoning (based on the probability and statistics of cost, schedule, and technical performance in the Systems Engineering domain).

If Twitter, LinkedIn, and other forum conversations have taught me anything, it's that many participants base their discussion on personal experience and opinion. Experience informs opinion. That experience may be based on gut feel learned in the school of hard knocks. But there are other ways to learn as well. Ways to guide your experience and inform your opinion. Ways based on education and frameworks for thinking about solutions to complex problems.

Samuel Johnson has served me well with his quote...

There are two ways to knowledge: we know a subject ourselves, or we know where we can find information upon it.

Hopefully the knowledge we know ourselves has some basis in fact, theory, and practice, vetted by someone outside ourselves, someone beyond our personal anecdotal experience

Here's my list of essential readings that form the basis of my understanding, opinion, principles, practices, and processes as they are applied in the domains I work - Enterprise IT, defense and space and their software intensive systems.

  • Making Hard Decisions: An Introduction to Decision Analysis, Robert T. Clemen
    • Making decisions in the presence of uncertainty is part of all business and technical endeavors.
    • This book and several other should be the start when making decisions about how much, when, and what.
  • Apollo Root Cause Analysis: Effective Solutions to Everyday Problems, Every Time, Dean L. Gano.
    • There is a powerful quote from Chapter 1 of this book
      • STEP UP TO FAIL
      • Ignorance is a most wonderful thing.
      • It facilitates magic.
      • It allows the masses to be led.
      • It provides answers when there are none.
      • It allows happenings in the presence of danger.
      • All this, while the pursuit of knowledge can only destroy the illusion. Is it any wonder mankind chooses ignorance?
    • This book is the starting point for all that follows. I usually only come to an engagement when there is trouble.
    • No need for improvement if there's no trouble.
    • Without a Root Cause Analysis process and corrective actions, all problems are just symptoms. And treating the symptoms does little to make improvements to any situation.
    • So this is the seminal book, but any RCA process is better than none.
  • The Phoenix Handbook, William R. Corcoran, PhD, P.E., Nuclear Safety Review Concepts, 19 October 1997 version.
    • This was a book and process used at Rocky Flats for Root Cause Analysis
  • Effective Complex Project Management: An Adaptive Agile Framework for Delivering Business Value, Robert K. Wysocki, J. Ross
    • All project work is probabilistic.
    • All project work is complex. Agile software development is not the same as project management.
    • For agile software development beyond a handful of people in the same room as their customer, project management is needed.
    • This book tells you where to start in performing the functions of Project Management in the Enterprise domain.
  • The Art of Systems Architecting, 2nd Edition, Mark W. Maier and Eberhardt Rechtin, CRC Press
    • Systems have architecture. This architecture is purpose built.
    • The notion that the best architectures, requirements, and designs emerge from self-organizing teams needs to be tested in a specific domain.
    • Many domains have reference architectures; DODAF and TOGAF are two examples.
    • Architectures developed by self-organizing teams may or may not be useful over the life of the system. It depends on the skills and experience of the architects. Brian Foote has a term for self-created architectures - big ball of mud. So care is needed to test the self-organizing team's ability to produce a good architecture.
    • The Rechtin book can be your guide for that test.
  • Systems Engineering: Coping With Complexity, Richard Stevens, Peter Brook, Ken Jackson, Stuart Arnold
    • All non-trivial projects are systems.
    • Systems are complex, they contain complexity
    • Defining complex, complexity, complicated needs to be done with care.
    • Much misinformation is around in the agile community about these terms, usually used to make some point about how hard it is to manage software development projects.
    • In fact there is a strong case that much of the¬†complexity and¬†complex aspects in software development are simply¬†bad¬†management¬†of the requirements
  • Forecasting and Simulating Software Development Projects: Effective Modeling of Kanban & Scrum Projects using Monte Carlo Simulation, Troy Magennis
    • When we hear Control in a non-deterministic paradigm is an illusion at best, delusion at worst, start with Troy's book to see that the conjecture is actually not true.
    • If the system you're working on is truly non-deterministic - that is, chaotic - you've got yourself a long road, because you're on a Death March project. Run away as fast as you can.
  • Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul R. Garvey, CRC Press.
    • All project variables are probabilistic. Managing in the presence of the uncertainty created by these statistical processes is part of all project management.
    • This book speaks to the uncertainty in cost.
    • Uncertainty in schedule and technical performance are the other two variables.
    • Assuming deterministic variables or assuming you can't manage in the presence of uncertainty are both naive and ignore the basic mathematics of making decisions in the presence of uncertainty
  • Estimating Software-Intensive Systems: Projects, Products and Processes, Richard D. Stutzke, Addison Wesley.
    • A Software Intensive System is any system where software contributes essential influences to the design, construction, deployment, and evolution of the system as a whole. [IEEE 42010:2011]
    • Such systems are by their nature complex, but estimating the attributes of such systems is a critical success factor in all modern business and technology functions.
    • For anyone conjecturing that estimates can't be made for complex systems, this book is mandatory reading.
    • Estimates are hard, but they can be done, and are done.
    • So when you hear that conjecture, ask: how do you know those estimates can't be made? Where's your evidence countering the work found in this book? Not anecdotes, opinions, or conjectures, but actual engineering assessment with the mathematics.
  • Effective Risk¬†Management:¬†Some¬†Keys to Success, 2nd¬†Edition, Edmund H. Conrow.
  • Project Risk Management: Processes, Techniques and Insight, Chris Chapman and Stephen Ward.
    • These two books are the core of Tim Lister's quote:
    • Risk Management is How Adults Manage Projects
    • Risk management involves estimating
  • The Economics of Iterative Software Development: Steering Toward Better Business Results, Walker Royce, Kurt Bittner, and Mike Perrow.
    • All software development is a Microeconomics paradigm.
    • Microeconomics describes the behavior of individuals and small organizations in making decisions about the allocation of limited resources.
    • When you hear about conjectures for improving software development processes that violate Microeconomics, ignore them.
    • These limited resources are people, time, and money
  • Assessment¬†and Control of¬†Software¬†Risks, Capers Jones.
    • Since all management is risk management, here's a book that clearly states how to manage in the presence of uncertainty.
  • Software Cost¬†Estimating¬†with COCOMO II, Barry Boehm et. al.
    • This is the original basis of estimating with parametric processes
    • Numerous tools and processes are based on COCOMO
    • Parametric estimating makes use of Reference Classes, same as Monte Carlo Simulation
    • With a parametric model or a Reference Class model, estimates of future outcomes can be made in every domain where we're not inventing new physics. This means there is no reason not to estimate for any software system found in a normal business environment.
    • This is not to say everyone can estimate. Nor should they. The excuse of we've never done this before really means you should go find someone who has.
  • Facts and Fallacies of Software Engineering, Robert L. Glass
    • There are many fallacies in the development of software
    • This book exposes most of them and provides corrective actions
  • How to Measure Anything: Finding the Value of Intangibles in Business, Douglas Hubbard
    • When we hear we can't measure that, read this book.
    • This book has a great description of Monte Carlo Simulation (used everywhere in our domains).
      • Monte Carlo started at Los Alamos during the bomb development process.
      • MCS samples a large number of values from a Probability Distribution Function that represents the statistical processes being modeled.
      • MCS has relatives; Bootstrapping is one, though it operates in a different manner, using past performance as the sample population.
  • Hard Fact, Dangerous Half-Truths & Total Nonsense, Jeffrey Pfeffer and Robert Sutton
    • This book was handed out at Ken Schwaber's "The State of Agile" talk.
    • The key here is that decisions are best made using facts. When facts aren't directly available, estimates of those facts are needed.
    • Making those estimates is part of every business decision, based on the Microeconomics of writing software for money.
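The bootstrapping relative of Monte Carlo mentioned under the Hubbard entry can be sketched in a few lines - resampling hypothetical past weekly throughput, with replacement, to forecast completion of a backlog (the throughput history and backlog size are made-up numbers for illustration):

```python
import random

# Hypothetical past performance: items completed in each of the last 10 weeks
throughput = [7, 9, 5, 8, 11, 6, 9, 8, 10, 7]
BACKLOG = 80

def weeks_to_finish(history: list[int]) -> int:
    """One bootstrap trial: resample past weeks until the backlog is done."""
    done, weeks = 0, 0
    while done < BACKLOG:
        done += random.choice(history)  # sample past performance with replacement
        weeks += 1
    return weeks

random.seed(3)
trials = sorted(weeks_to_finish(throughput) for _ in range(5_000))
print("P50:", trials[2_500], "weeks;  P85:", trials[4_250], "weeks")
```

The same caveat applies as with any reference-class method: the resampled distribution is only credible if the future work resembles the work that produced the history.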

So In The End

This list is the tip of the iceberg for access to the knowledge needed to manage in the presence of uncertainty while spending other people's money.

Related articles

  • Mr. Franklin's Advice
  • Want To Learn How To Estimate?
  • Two Books in the Spectrum of Software Development
Categories: Project Management

Quote of the Day

Fri, 05/15/2015 - 21:33

Any process that does not have provisions for its own refinement will eventually fail or be abandoned

- W. R. Corcoran, PhD, P.E., The Phoenix Handbook: The Ultimate Event Evaluation Manual for Finding Profit Improvement in Adverse Events, Nuclear Safety Review Concepts, 19 October 1997.

Categories: Project Management