Software Development Blogs: Programming, Software Testing, Agile, Project Management


Project Management

Is #NoEstimates a House Built on Sand?

Herding Cats - Glen Alleman - Mon, 08/17/2015 - 02:45

This is my last post on the topic of #NoEstimates. Let's start with my professional observation. All are welcome to provide counter examples. 

Estimates have little value to those spending the money.
Estimates are of critical value to those providing the money.

Since those spending the money usually appear not to recognize the need for estimating by those providing the money, the discussion has no basis for an exchange of ideas. Without acknowledging that in business there is a collection of immutable principles, those spending the money have little understanding of where the money to do their work comes from†.

Here are the business principles that inform how the business works when funding the development of value:

  • The future is uncertain, but this uncertainty can be modeled. It is not unknowable.
  • Managerial Accounting provides managers with accounting information to better inform themselves before deciding matters within their organizations, aiding their management and control functions.
  • Economic Risk Management identifies, analyzes, and accepts or mitigates the uncertainties encountered in managerial decision-making processes.

On the project management side, there are also immutable principles required for project success:

  • There is some notion of what Done looks like, in units of measure meaningful to the decision makers. Effectiveness and Performance are two standard measures in domains where systems thinking prevails.
  • There is a work Plan to reach Done. This Plan can be simple or it can be complex. But the order of the work and the dependencies between the work elements are the basis of all planning processes.
  • There is a plan for the resources needed to reach Done. This includes staff, facilities, and funding. This means knowing something about how much and when for these resources.
  • There is recognition of the risk involved in reaching Done, and a response to those risks.
  • There is some means of measuring physical progress against the Plan to reach Done, so corrective actions can be taken to increase the probability of success. Tangible outcomes from the Planned work are the preferred way to measure progress.

The discussion - of sorts - around No Estimates has reached a low point in shared understanding. But first, let me set the stage.

If the Business and Project Success principles are not accepted as the basis of discussion for any improvements, then there is no basis for discussion. Stop reading; there's nothing here for you. If these principles are acknowledged, then please continue.

In a recent post from one of the original authors of the #NoEstimates hashtag, it was said...

Quit estimates cold turkey. Get some kind of first-stab working software into the customer's hands as quickly as possible, and proceed from there. What does this actually look like? When a manager asks for an estimate up front, developers can ask right back, "Which feature is most important?" - then deliver a working prototype of that feature in two weeks. Deliver enough working code fast enough, with enough room for feedback and refinement, and the demand for estimates might well evaporate. Let's stop trying to predict the future. Let's get something done and build on that - we can steer towards better.

This is one of those overgeneralizations that, when questioned, gets strong pushback against the questioner. Let's deconstruct this paragraph a bit in the context of software development.

  • Quit estimates cold turkey - perhaps those paying for the work need to be consulted to determine whether they have any vested interest in knowing the cost and schedule of the value they're paying for.
  • Some kind of first-stab working software - sounds nice. But how much does that cost? And when can it be delivered? Can any feature in the requested list - the needed capabilities and their supporting technical and operational requirements - be done in two weeks? Can you show that can happen, or is that just a platitude repeated often enough that it has become a meme, without any actual evidence of being true?
  • Deliver enough working code fast enough, with enough room for feedback and refinement, and the demand for estimates might well evaporate - there are two cascaded IF conditions here. IF we deliver working code fast enough, and IF we leave room for feedback, THEN estimates MIGHT evaporate.

This last one is one of those IF PIGS COULD FLY type statements.

So Here's the Issue

If it is conjectured that we can make decisions in the presence of uncertainty - and all project work operates in the presence of uncertainty by its very definition, otherwise it'd be production - then how can we make a choice between alternatives if we can't estimate the outcomes of those choices?
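The point can be made concrete with a small sketch. All the numbers below are invented: two hypothetical alternatives ("build" vs. "buy") with made-up cost ranges, each modeled with a simple triangular distribution. The choice between them depends entirely on estimated outcomes, not just on intent:

```python
import random

random.seed(7)

def simulate_cost(most_likely, low, high, n=10_000):
    """Estimate an alternative's cost by sampling a triangular
    distribution over its plausible range (a deliberately minimal model)."""
    samples = sorted(random.triangular(low, high, most_likely) for _ in range(n))
    mean = sum(samples) / n
    p80 = samples[int(0.8 * n)]  # 80th-percentile cost
    return mean, p80

# Hypothetical alternatives: build in-house vs. buy and integrate
build_mean, build_p80 = simulate_cost(most_likely=350, low=250, high=700)
buy_mean, buy_p80 = simulate_cost(most_likely=450, low=420, high=550)

# "build" looks cheaper on average but carries more cost risk at the tail;
# without estimating both outcomes, there is no rational way to choose.
print(f"build: mean={build_mean:.0f}, 80th pct={build_p80:.0f}")
print(f"buy:   mean={buy_mean:.0f}, 80th pct={buy_p80:.0f}")
```

The specific distributions and thresholds are placeholders; the point is that any comparison of alternatives requires some probabilistic estimate of each alternative's outcome.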

This is the basis of Microeconomics and Managerial Finance. When the original posters of #NoEstimates make these types of statements, they're doing so of their own volition. It's likely their strongly held belief that decisions can be made without estimating the outcomes of those decisions.

So when questioning what principles these conjectures are based on returns scorn for asking - accusations of trolling, of being rude, of having no respect for the person making these unfounded, unsubstantiated, untested, domain-free statements - it seems almost laughable. At times it appears to be willful ignorance of the basic tenets of business decision making. I don't pretend to know what's in the minds of many #NE supporters. Having talked to some advocates who are skeptical, it turns out that on further questioning they are unwilling to disavow the notion that there is merit in exploring further.

This is a familiar course for climate change deniers: all the evidence is not in, so let's challenge everything and see what we can discover. This notion of challenging and exploring in the absence of established principles is not actually that useful. In a domain like managerial finance, the microeconomics of software development decision making, or decision making in general, the principles and practices are well established.

What's now known is that those principles and practices are not known to those making the conjecture that we should challenge everything. Much like the political climate deniers - Well, I'm not a scientist, but I heard on the internet there is some dissent in the measurements... So: I'm not familiar with probability and statistics, and I haven't taken a microeconomics class or read any Managerial Finance books, but I'm almost sure that those self-proclaimed thought leaders for #NoEstimates have something worth looking into.

Harsh? You bet it's harsh. Any idea presented in an open forum will be challenged when that idea willfully violates the principles on which business operates. Better be prepared to be challenged, and better be prepared to bring evidence that your conjecture has merit. This happens all the time in science, mathematics, and engineering. Carl Sagan's BS Detector is one place to start; John Baez's Crackpot Index is also useful in the science and math world.

No Estimates has now reached that level, with some outrageous claims. 

  • Not doing estimates improves project performance by 10X
  • Estimates are actually evil
  • Estimating destroys innovation
  • Steve McConnell proves in his book that estimates can't be done.
  • Todd Little's Figure 2 shows how bad we are at estimating - without, of course, reading the rest of the presentation showing how to correct these errors.

Making Credible Decisions in the Presence of Uncertainty

Decision making is the basis of business management. Here's an accessible text for learning to make decisions in the presence of uncertainty: Decision Analysis for the Professional. When there is any suggestion that decisions can be made without estimates, ask if the person making that conjecture has any evidence this is possible. Ask if they've read this book. Ask if their decision-making process has:

  • A decision making framework
  • A decision making process
  • A methodology for making decisions
  • A way for the decision-making process to work in the presence of complex organizations
  • A probability and statistics model for making decisions

Here's some more background on making decisions in the presence of uncertainty.

This is a sample of the many resources available for making decisions in the presence of uncertainty. There is also a large collection of resources on estimating software development projects. The one we use in our work is

These and other resources are the basis of understanding how to make decisions.

When it is conjectured we can decide without estimating, ask: have you any evidence whatsoever this is possible, beyond your personal opinion and anecdotal experience? No? Then please stop trying to convince me your unsubstantiated idea has any merit in actual business practice.

And this is why I've decided to stop writing about the nonsense of #NoEstimates. There is no basis for the discussion anchored in principles, practices, or processes of business based in managerial finance and Microeconomics of decision making.

It's a House Built On Sand

† I learned this in the first week of my first job after graduate school. 


Related articles:
  • Making Conjectures Without Testable Outcomes
  • Estimating Processes in Support of Economic Analysis
  • Root Cause of Project Failure
  • Herding Cats: How To Make Decisions
  • Estimating and Making Decisions in Presence of Uncertainty
  • Why Guessing is not Estimating and Estimating is not Guessing
Categories: Project Management

How To Make Decisions

Herding Cats - Glen Alleman - Sat, 08/15/2015 - 23:00

Decisions are about making Trade Offs for the project that are themselves about:

  • Evaluating alternatives.
  • Integrating and balancing all the considerations (cost, performance, Producibility, testability, supportability, etc.).
  • Developing and refining the requirements, concepts, capabilities of the product or services produced by the project or product development process.
  • Making trade studies and the resulting trade-offs that enable the selection of the best or most balanced solution to fulfill the business need or accomplishment of the mission.

The purpose of this process is to:

  • Identify the trade-offs - the decisions to be made - among requirements, design, schedule, and cost.
  • Establish the level of assessment commensurate with cost, schedule, performance, and risk impact, based on the value at risk for the decision.
    • Low value at risk, low impact: simple decision making - possibly even gut feel.
    • High value at risk, high impact: the decision-making process must take into account these impacts.
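As an illustration only - the thresholds below are invented, not from any standard or from the post - the idea of scaling decision rigor to value at risk can be sketched as:

```python
def decision_rigor(value_at_risk):
    """Map the value at risk of a decision to the level of analysis it
    warrants. Thresholds are illustrative placeholders, not a standard."""
    if value_at_risk < 10_000:
        return "gut feel / simple rule of thumb"
    if value_at_risk < 250_000:
        return "lightweight estimate with documented assumptions"
    return "formal trade study with probabilistic cost/schedule/risk model"

print(decision_rigor(5_000))
print(decision_rigor(50_000))
print(decision_rigor(1_000_000))
```

The point is the shape of the mapping, not the specific dollar values: the rigor of the decision process should be a function of what is at stake.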

Making decisions about capabilities and resulting requirements is the start of discovering what DONE looks like, by:

  • Establishing alternatives for the needed performance and functional requirements.
  • Resolving conflicts between these requirements in terms of the product's delivered capabilities.

Decisions about the functional behaviors and their options are next. These decisions:

  • Determine the preferred set of requirements for the needed capabilities. This, of course, is an evolutionary process as requirements emerge, working products are put to use, and feedback is obtained.
  • Determine how the customer assesses requirements for lower-level functions as each of the higher-level capabilities is assessed.
  • Evaluate alternatives to each requirement, each capability, and the assessed value of each capability - in units of measure meaningful to the decision makers.

Then comes the assessment of the cost effectiveness of each decision:

  • Develop the Measures of Effectiveness and Measures of Performance for each decision.
  • Identify the critical Measures of Effectiveness of each decision in fulfilling the project's business goal or mission. These Technical Performance Measures are used to assess the impact of each decision on the produced value of the project.

Each of these steps is reflected in the next diagram.


Value of This Approach

When we hear that estimates are not needed to make decisions, we need to ask how the following questions can be answered:

  • How can we have a systematized thought process, where decisions are based on measurable impacts?
  • How can we clarify our options, problem structure, and available trade-offs using units of measure meaningful to the decision makers?
  • How can we improve communication of ideas and professional judgment within our organization through a shared exchange of the impacts of our decisions?
  • How can we improve communication of rationale for each decision to others outside the organization?
  • How can we be assured of our confidence that all available information has been accounted for in a decision?

The decision-making process is guided by the identification of alternatives.

Decision-making is about deciding between alternatives. These alternatives need to be identified, assessed, and analyzed for their impact on the probability of success of the project.

These impacts include, but are not limited to:

  • Performance
  • Schedule
  • Cost
  • Risk
  • Affordability
  • Producibility
  • And all the other …ilities associated with the outcomes of the project

The effectiveness of our decision making follows the diagram below:


In the End - Have all the Alternatives Been Considered?

Until there is a replacement for the principles of Microeconomics, for each decision made on the project, we will need to know the impact on cost, schedule, technical parameters, and other attributes of that decision. To not know those impacts literally violates the principles of microeconomics and the governance framework of all business processes, where the value at risk is non-trivial.


When you hear that planning ahead by assessing our alternatives is overrated, or quit estimating cold turkey, think again. And ask for evidence of how to make decisions in the presence of uncertainty without making estimates, making trade-offs, evaluating alternatives - probabilistic alternatives - and all those other decision-making processes found in the managerial finance book you read in engineering, computer science, or business school.

† Derived from Module J: Trade Study Process, Systems Engineering, Boeing.

Related articles:
  • Making Conjectures Without Testable Outcomes
  • Estimating and Making Decisions in Presence of Uncertainty
  • Estimating Processes in Support of Economic Analysis
Categories: Project Management

Earned Value + Agile: a Match Made in Heaven?

Herding Cats - Glen Alleman - Sat, 08/15/2015 - 01:13

At a recent conference, the topic was the integration of Agile with Earned Value Management on programs subject to FAR 34.201 and DFARS 252.234-7001. Here's my presentation.

Turns out it is a match made in heaven. Since that conference, I've worked on a DOJ proposal where agile and EVM are mandated, along with the SDLC (Scrum) and the Agile development tool (TFS). The floodgates are now opening on Software Intensive System procurements requiring an EVMS. The NDIA (National Defense Industrial Association) is releasing an integration document in October defining how to make the match. Agile luminaries have spoken at DOD conferences and given advice to Undersecretaries on the topic. One office I work in is writing an implementation guide, as is the GAO.
Categories: Project Management

The Flaw of Averages

Herding Cats - Glen Alleman - Fri, 08/14/2015 - 04:30

In a recent post on forecasting capacity planning, a time series of data was used as the basis of the discussion.


Some static statistics were then presented.

With a discussion of the upper and lower ranges of the past data. The REAL question, though, is: what are the likely outcomes in the future, given the past performance data? That is, if we recorded what happened in the past, what is the likely data in the future?

The average and the upper and lower ranges from the past data are static statistics. That is, all the dynamic behavior of the past is wiped out by the averaging and deviation processes, so that information can no longer be used to forecast the possible outcomes of the future.

This is one of the attributes of The Flaw of Averages and How to Lie With Statistics, two books that should be on every manager's desk. That is, managers tasked with making decisions in the presence of uncertainty when spending other people's money.
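A minimal illustration of the point, using two made-up throughput histories that share an identical average: the mean alone cannot distinguish a stable process from a volatile one, which is exactly the information a forecast needs.

```python
import statistics

# Two hypothetical throughput histories with the SAME mean (20):
stable   = [20, 21, 19, 20, 22, 18, 20, 21, 19, 20]
volatile = [5, 38, 12, 31, 2, 40, 9, 33, 6, 24]

for name, series in [("stable", stable), ("volatile", volatile)]:
    mean = statistics.mean(series)
    sd = statistics.stdev(series)
    # Same mean, wildly different dispersion - averaging wiped that out
    print(f"{name}: mean={mean:.1f}, stdev={sd:.1f}")
```

Any forecast built only on the average would treat these two processes as identical, while their future behavior is nothing alike.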

We now have a Time Series and can ask the question: what is the range of possible outcomes in the future, given the values in the past? This can easily be done with a free tool - R. R is a statistical programming language, available free from the Comprehensive R Archive Network (CRAN). In R, there are several functions that can be used to make these forecasts. That is, what are the estimated values in the future from the past, and their confidence intervals?

Let's start with some simple steps:

  1. Record all the data from the past. For example, make a text file of the values in the first chart. Name that file NE.Numbers.
  2. Start the R tool. Better yet, download an IDE for R - RStudio is one. That way you have a development environment for your statistical work. As well, there are many free R books on statistical forecasting - estimating outcomes in the future.
  3. Read the Time Series of raw data from the file of values and assign it to a variable:
    • NETS = ts(NE.Numbers)
    • The ts function converts the raw data into an object - a Time Series - that can be used by the next function.
  4. With the Time Series now in the right format, apply the ARIMA function. ARIMA is Autoregressive Integrated Moving Average, also known as the Box-Jenkins algorithm. This is George Box of the famously misused and blatantly abused quote all models are wrong, some models are useful. If you don't have the full paper where that quote came from, and the book Time Series Analysis: Forecasting and Control, Box and Jenkins, please resist re-quoting it out of context. That quote has become a meme for those not having the background to do the math for time series analysis, and a mantra for willfully ignoring the math needed to actually make estimates of the future - forecasting - using time series of the past in ANY domain. ARIMA is the beginning basis of all statistical forecasting in science, engineering, and finance.
    • The ARIMA algorithm has three parameters - p, d, q:
    • p is the order of the autoregressive model.
    • d is the degree of the differencing.
    • q is the order of the moving average.
    • Here's the manual in R for ARIMA.
  5. With the original data turned into a Time Series and presented to the ARIMA function, we can now apply the Forecast function. This function provides methods and tools for displaying and analyzing univariate time series forecasts, including exponential smoothing via state space models and automatic ARIMA modelling.
  6. When applied to the ARIMA output, we get a Forecast series that can be plotted.

Here's what all this looks like in RStudio:

library(forecast)                            # provides forecast(); install.packages("forecast") if needed
NE.Numbers = scan("NE.Numbers")              # read the raw numbers from the file
NETS = ts(NE.Numbers)                        # convert the raw numbers to a time series
NETSARIMA = arima(NETS, order = c(0, 1, 1))  # fit an ARIMA(0,1,1) model
NEFORECAST = forecast(NETSARIMA)             # make a forecast using that model
plot(NEFORECAST)                             # plot it, with confidence bands

Here's the plot, with the time series from the raw data and the 80% and 90% confidence bands on the possible outcomes in the future. 
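For readers without R, the same idea can be sketched in plain Python. An ARIMA(0,1,1) model corresponds to simple exponential smoothing, so a rough stand-in - with invented data in place of NE.Numbers, and a normal-approximation 80% interval from in-sample residuals - looks like this:

```python
import statistics

def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing (the forecast form of ARIMA(0,1,1)).
    Returns the one-step-ahead forecast and a rough 80% prediction
    interval derived from in-sample residuals (z ~ 1.28 for 80%)."""
    level = series[0]
    residuals = []
    for x in series[1:]:
        residuals.append(x - level)          # one-step-ahead forecast error
        level = level + alpha * (x - level)  # update the smoothed level
    half_width = 1.28 * statistics.stdev(residuals)
    return level, (level - half_width, level + half_width)

# Made-up stand-in for the NE.Numbers series in the post
data = [12, 9, 15, 7, 22, 4, 18, 11, 25, 6, 14, 9]
point, (lo, hi) = ses_forecast(data)
print(f"next value ~ {point:.1f}, 80% interval ({lo:.1f}, {hi:.1f})")
```

With noisy data like this, the interval comes out very wide relative to the point forecast - which is exactly the post's argument: the average alone hides how little confidence the history actually supports.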


The Punch Line

You want to make decisions with other people's money when the 80% confidence interval on a possible outcome is itself a -56% to +68% variance? Really? Flipping coins gets a better probability of an outcome inside all the possible outcomes that happened in the past. The time series is essentially a random series with very low confidence of being anywhere near the mean. This is the basis of The Flaw of Averages.

Where I work, this would be a non-starter if we came to the Program Manager with a forecast of the Estimate to Complete based on an average with that wide a variance.

Possibly, where there is low value at risk, a customer that has little concern for cost and schedule overrun, and maybe where the work is actually an experiment with no deadline, no not-to-exceed budget, or any other real constraint. But if your project has a need date for the produced capabilities - a date when those capabilities need to start earning their keep and producing value that can be booked on the balance sheet - a much higher confidence in what the future NEEDS to be is likely going to be the key to success.

The Primary Reason for Estimates

First, estimates are for the business. Yes, developers can use them too. But the business has a business goal: make money at some point in the future on the sunk costs of today - the breakeven date. These sunk costs are recoverable - hopefully - so we need to know when we'll be even with our investment. This is how business works: decisions are made in the presence of uncertainty - not on the opinion of development saying we recorded our past performance as an average and projected that into the future. No, the business needs a risk-adjusted, statistically sound level of confidence that it won't run out of money before breakeven. What this means in practice is a management reserve, plus cost and schedule margin, to protect the project from naturally occurring variances and from the probabilistic events that derail all the best laid plans.
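A toy breakeven calculation - every number here is invented for illustration - sketches the arithmetic, and shows how a management reserve pushes the risk-adjusted breakeven date out:

```python
# Hypothetical figures, not from the post
sunk_cost = 600_000           # development spend before release
monthly_net_benefit = 50_000  # expected value produced per month after release

# Nominal breakeven: months needed to recover the sunk cost
breakeven_months = sunk_cost / monthly_net_benefit

# Managerial finance protects that date with a reserve for known variances
# (20% is an assumed figure for illustration)
management_reserve = 0.20 * sunk_cost
reserved_breakeven = (sunk_cost + management_reserve) / monthly_net_benefit

print(f"nominal breakeven:       {breakeven_months:.1f} months")
print(f"risk-adjusted breakeven: {reserved_breakeven:.1f} months")
```

The business question - will we run out of money before breakeven? - cannot even be posed, let alone answered with confidence, without estimates of cost and benefit over time.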

Now, developers may not think like this. But someone somewhere in a non-trivial business does - usually in the Office of the CFO. This is called Managerial Finance, and it's how serious money-at-risk firms manage.

So when you see time series like those in the original post, do your homework and show the confidence in the probability of the needed performance actually showing up. And by needed performance I mean the steering target used in the Closed Loop Control system - the target used to increase the probability that the planned value, which the Agilists so dearly treasure, actually appears somewhere near the planned need date and somewhere around the planned cost, so those paying for your work are not disappointed with a negative Return on Investment and don't label their spend as underwater.

So What Does This Mean in the End?

Even when you're using past performance - one of the better ways of forecasting the future - you need to give careful consideration to those past numbers. Averages and simple variances - which wipe out the actual underlying time series variances - are not only naive, they are bad statistics used to make bad management decisions.

Add to that the poorly formed notion that decisions can be made about future outcomes in the presence of uncertainty, in the absence of estimates about that future, and you've got the makings of management disappointment. The discipline of estimating future outcomes from past behaviors is well developed. The mathematics, and especially the terms used in that mathematics, are well established. Here are some sources we use in our everyday work. These are not populist books; they are math and engineering. They have equations, algorithms, and code examples. They are used where the value at risk is sufficiently high that management is on the hook for meeting the performance goals in exchange for the money assigned to the project.

If you work on a project that doesn't care too much about deadlines, budget overages, or what gets produced beyond the minimal products, then these books and related papers are probably not for you. And most likely, not estimating the probability that you'll overspend, show up seriously late, and fail to produce the needed capabilities to meet the Business Plans will be just fine. But if you are expected to meet the business goals in exchange for the spend plan you've been assigned, these might be a good place to start to avoid being a statistic (dead skunk in the middle of the road) in the next Chaos Report (no matter how poor its statistics are).

This, by the way, is an understanding I came to on the plane flight home this week. #NoEstimates is a credible way to run your project when those conditions are in place. Otherwise, you may want to read how to make credible forecasts of what the cost and schedule are going to be for the value produced with your customer's money, assuming they actually care about not wasting it.


Related articles:
  • IT Risk Management
  • Thinking, Talking, Doing on the Road to Improvement
  • Estimating Processes in Support of Economic Analysis
  • Making Conjectures Without Testable Outcomes
Categories: Project Management

No Signal Means No Steering Target, Means No Corrective Actions

Herding Cats - Glen Alleman - Wed, 08/12/2015 - 06:13

The management of projects involves many things: Capabilities, Requirements, Development, Staffing, Budgeting, Procurement, Accounting, Testing, Security, Deployment, Maintenance, Training, Support, Sales and Marketing, and other development and operational processes. Each of these has interdependencies with the other elements. Each operates in its own specific way on the project. Almost all have behaviors described by probabilistic models driven by underlying statistical processes.

Management in this sense is control in the presence of these probabilistic processes. And yes, we can control these items - it's a well developed practice, starting with Statistical Process Control, Monte Carlo Simulation, Bayesian Networks, Probabilistic Real Options, and other methods based on probabilistic processes.

The notion that these are not controllable is at its heart flawed and essentially misinformed. But this control requires information. It's been discussed before in Closed Loop Control, Closed Loop versus Open Loop, Staying on Plan Means Closed Loop Control, Use and Misuse of Control Systems, and Why Project Management is a Control System.

All these lead to Five Immutable Principles of Project Success. Along with these Principles, comes Practices, and Processes. But it's the Principles we're after as a start.

Each of the Principles makes use of information for managing the project. This information is the signal used by management to make decisions. These signals are used to compare the current state of the project (the system under management) to the desired state of the system. This is the basis of the Control System used to manage the System Under Management.

With the control system in place - whatever that may be, and there are many - the next step is to gather the information needed to make decisions using this control system. This means information about the past, present, and future of the system under management. Past data - when recorded - is available from the record of what happened. Present data should be readily available directly from the system. Future data presents a unique problem: rarely is information about the future recorded someplace; it's in the future and hasn't happened yet. But we can find sources for this future information. We may have models of what the system will do in the future. We may have similar systems from the past that can be used to create future data. There is a critical issue here, though: the future may not be like the past, or the future may be impacted in ways not found on similar projects. Still, we need to come up with this information if we are to make decisions about the future.

So if we're missing the model, or missing the similar project, what can we do? We can make estimates from the data or models in some probabilistically informed manner. This is the role of estimating: to inform our decision-making processes in the presence of uncertainty about possible future outcomes, knowing something about the past and present state of the system under management. With this probabilistic information, decisions can be made to take corrective actions to keep our project under control - that is, moving in the direction we planned in order to reach the goal of the project. This goal is usually...
... to provide needed capabilities to those paying for the project to meet some business goal or fulfill a mission strategy. To accomplish some beneficial outcome in exchange for the cost and time invested in development of the capabilities.

So what happens when we don't have information about the future? That is, when we choose to Not Estimate while facing uncertainty about the possible future states of the system, yet need to take corrective actions to keep the project moving in the desired direction? Without these estimates, we have no signal with which to take corrective action. We have an Open Loop Control system - a system that takes any path it wants; it has no control mechanism to keep it on track. The open loop control system is a non-feedback system in which the control input is determined using only the current state of the system and a model of the system. There is no feedback to determine if the system is achieving the desired output based on the reference input or set point. The system does not observe itself to correct itself and, as such, is more prone to errors and cannot compensate for disturbances. This means we're going to get what we're going to get, with no chance to steer the system toward our desired outcome.

For any non-trivial system, not estimating the future state, not creating the error signal between the desired future state and the current state, and not using that error signal to take corrective actions is considered Bad Management. This would be considered Doing Stupid Things On Purpose in most domains that spend other people's money to produce needed capabilities and the resulting value, for the planned cost, on the planned date.
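The open-loop versus closed-loop distinction can be sketched in a few lines of Python. This is a toy model, not anything from the post: monthly progress against a plan of 10 units per month, with random disturbances; the "estimate" in the closed-loop case is the shortfall against plan, fed back as an error signal with an assumed gain of 0.5.

```python
import random

def run_project(disturbances, closed_loop, target_per_month=10.0):
    """Simulate monthly progress against a plan. With closed_loop=True the
    estimated shortfall against the plan becomes an error signal driving a
    corrective action; with closed_loop=False the original plan is simply
    executed regardless of where the project actually is."""
    done = 0.0
    for month, bump in enumerate(disturbances, start=1):
        planned_to_date = target_per_month * month
        correction = 0.0
        if closed_loop:
            # error signal: where the plan says we should be at the start
            # of this month, minus where we actually are
            correction = 0.5 * (planned_to_date - target_per_month - done)
        done += target_per_month + correction + bump
    return done

# Twelve months of unplanned variance (a made-up disturbance model)
random.seed(1)
bumps = [random.gauss(0, 3) for _ in range(12)]

print("plan:        120.0")
print(f"open loop:   {run_project(bumps, closed_loop=False):.1f}")
print(f"closed loop: {run_project(bumps, closed_loop=True):.1f}")
```

With the same disturbances, the open-loop run accumulates every month's error, while the closed-loop run damps the accumulated shortfall - which is the whole argument for having an estimate-based error signal at all.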
Categories: Project Management

New Features on PlanningPoker.com

Mike Cohn's Blog - Tue, 08/11/2015 - 15:00

The PlanningPoker.com team has been quite busy. A couple of months ago, they launched a brand new design and mobile support. Since then, they have continued to improve the site based on your feedback.

Now, they’ve launched PlanningPoker.com’s first set of premium features. While these premium features do require a monthly fee, the base game will always be free.

Visit PlanningPoker.com to see these new features in action:

  • Import and export from JIRA and other backlog tools
  • Add and edit story details and acceptance criteria within the game
  • Monitor remaining team velocity while estimating
  • Build custom pointing scales
  • Save your default game settings

… and more.

Digital product development agency 352 Inc. is the team that created the new PlanningPoker.com, and they’re committed to making it the most helpful and efficient way to run your planning sessions. They are also open to your feedback on how to improve the product.

Give the new features a try and let them know what more you would like to see. To learn more about Planning Poker in general, check out our informational page on it.

Reasons for Continuous Planning

I’m working on the program management book, specifically on the release planning chapter. One of the problems I see in programs is that the organization/senior management/product manager wants a “commitment” for an entire quarter. Since they think in quarter-long roadmaps, that’s not unreasonable—from their perspective.

There is a problem with commitments and the need for planning for an entire quarter. This is legacy (waterfall) thinking. Committing is not what the company actually wants. Delivery is what the company wants. The more often you deliver, the more often you can change.

That means changing how often you release and replan.

Consider these challenges for a one-quarter commitment:

  1. Even if you have small stories, you might not be able to estimate perfectly. You might finish something in less time than you had planned. Do you want to take advantage of schedule advances?
  2. In the case of too-large stories, where you can’t easily provide a good estimate (where you need a percent confidence or some other mechanism to explain risk), you are, in my experience, likely to under-estimate.
  3. What if something changes mid-quarter, and you want more options or a change in what the feature teams can deliver? Do you want to wait until the end of a quarter to change the program’s direction?
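The under-estimation risk in that list can be made concrete with a quick Monte Carlo sketch. Everything here is a made-up illustration, not data from the post: 30 stories each estimated at 2 days, a team capacity of 65 days for the quarter, and a lognormal error factor whose positive bias models the tendency to under-estimate.

```python
import random

def quarter_slip_probability(n_stories=30, estimate=2.0, capacity=65.0,
                             sigma=0.5, bias=0.1, trials=10_000, seed=1):
    """Estimate the chance a quarter-long commitment slips.

    Each story's actual effort is its estimate times a lognormal error
    factor; bias > 0 skews actuals above estimates (under-estimation).
    Returns the fraction of simulated quarters whose total work exceeds
    the team's capacity.
    """
    rng = random.Random(seed)
    slips = 0
    for _ in range(trials):
        total = sum(estimate * rng.lognormvariate(bias, sigma)
                    for _ in range(n_stories))
        if total > capacity:
            slips += 1
    return slips / trials

print(f"Chance the quarter commitment slips: {quarter_slip_probability():.0%}")
```

Even modest per-story optimism compounds across a whole quarter, which is the argument for committing on a shorter cadence where the error has less room to accumulate.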

If you “commit” on a shorter cadence, you can manage these problems. (I prefer the term replan.)

If you consider a “commit” of no more than one month, you can see the product evolve, provide feedback across the program, and change what you do at every monthly milestone. That’s better.

Here’s a novel idea: Don’t commit to anything at all. Use continuous planning.


If you look at the one-quarter roadmap, you can see I show three iterations’ worth of stories as MVPs. In my experience, that is at least one iteration too much look-ahead knowledge. I know very few teams who can see six weeks out. I know many teams who can see to the next iteration. I know a few teams who can see two iterations.

What does that mean for planning?

Do continuous planning with short stories. You can keep the 6-quarter roadmap. That’s fine. The roadmap is a wish list. Don’t commit to a one-quarter roadmap. If you need a commitment, commit to one iteration at a time. Or, in flow/kanban, commit to one story at a time.

That will encourage everyone to:

  1. Think small. Small stories, short iterations, asking every team to manage their WIP (work in progress) will help the entire program maintain momentum.
  2. See interdependencies. The smaller the features, the clearer the interdependencies are.
  3. Plan smaller things and plan for less time so you can reduce the planning load for the program. (I bet your planning for one iteration or two is much better and takes much less time than your one-quarter planning.)
  4. Use deliverable planning (“do these features”) in a rolling wave (continue to replan as teams deliver).

These ideas will help you see value more often in your program. When you release often and replan, you build trust as a program. Your managers might stop asking for “commits.”

If you keep the planning small, you don’t need to gather everyone in one big room once a quarter for release planning. If you do continuous planning, you might never need everyone in one room for planning. You might want everyone in one room for a kickoff or to help people see who is working on the program. That’s different than a big planning session, where people plan instead of deliver value.

If you are managing a program, what would it take for you to do continuous planning? What impediments can you see? What risks would you have planning this way?

Oh, and if you are on your way to agile and you use release trains, remember that the release train commits to a date, not scope and date.

Consider planning and replanning every week or two. What would it take for your program to do that?

Categories: Project Management

How to Predict the Release of Your Project Without Estimating

From the Editor of Methods & Tools - Mon, 08/10/2015 - 15:41
We often hear that estimating is a must in project management. “We can’t make decisions without them” we hear often. This video shows examples of how you can predict a release date of a project without any estimates, relying only on easily available data. Learn how you can follow progress on a project at all […]
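One common way to forecast a release date without per-story estimates, in the spirit the video describes, is to resample the team's historical throughput. This sketch is my own illustration with hypothetical data, not the video's method: it repeatedly simulates burning down a backlog by drawing weekly story counts from recent history, then reports percentile outcomes.

```python
import random

def forecast_weeks(backlog_size, weekly_throughput_history,
                   trials=10_000, seed=7):
    """Monte Carlo forecast of weeks to empty a backlog.

    Uses only easily available data -- how many stories finished in each
    recent week -- with no per-story estimates. Each trial resamples that
    history until the backlog is done. Returns the 50th and 85th
    percentile week counts across all trials.
    """
    rng = random.Random(seed)
    outcomes = []
    for _ in range(trials):
        remaining, weeks = backlog_size, 0
        while remaining > 0:
            remaining -= rng.choice(weekly_throughput_history)
            weeks += 1
        outcomes.append(weeks)
    outcomes.sort()
    return outcomes[trials // 2], outcomes[int(trials * 0.85)]

# Hypothetical data: stories finished in each of the last 8 weeks.
history = [3, 5, 4, 2, 6, 4, 3, 5]
p50, p85 = forecast_weeks(40, history)
print(f"50% chance within {p50} weeks, 85% chance within {p85} weeks")
```

Reporting a range of percentiles rather than a single date communicates the uncertainty directly, and the forecast improves automatically as each new week of throughput data is appended to the history.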

Are 64% of Features Really Rarely or Never Used?

Mike Cohn's Blog - Wed, 08/05/2015 - 15:00

A very oft-cited metric is that 64 percent of features in products are “rarely or never used.” The source for this claim was Jim Johnson, chairman of the Standish Group, who presented it in a keynote at the XP 2002 conference in Sardinia. The data Johnson presented can be seen in the following chart.

Johnson’s data has been repeated again and again to the extent that those citing it either don’t understand its origins or never bothered to check into them.

The misuse or perhaps just overuse of this data has been bothering me for a while, so I decided to investigate it. I was pretty sure of the facts but didn’t want to rely solely on my memory, so I got in touch with the Standish Group, and they were very helpful in clarifying the data.

The results Jim Johnson presented at XP 2002 and that have been repeated so often were based on a study of four internal applications. Yes, four applications. And, yes, all internal-use applications. No commercial products.

So, if you’re citing this data and using it to imply that every product out there contains 64 percent “rarely or never used features,” please stop. Please be clear that the study was of four internally developed projects at four companies.
