Subscribe to Methods & Tools
if you are not afraid to read more than one page to be a smarter software developer, software tester or project manager!
I have a new article up on agileconnection.com called The Case for #NoEstimates.
The idea is to produce value instead of spending time estimating. We have a vigorous “debate” going on in the comments. I have client work today, so I will be slow to answer comments. I will answer as soon as I have time to compose thoughtful replies!
This column is the follow-on to How Do Your Estimates Provide Value?
If you would like to learn to estimate better or recover from “incorrect” estimates (an oxymoron if I ever heard one), see Predicting the Unpredictable. (All estimates are guesses. If they are ever correct, it’s because we got lucky.)
If you work in a domain, as I do, that needs to answer when will we provide that needed capability to produce the planned value for the planned amount of money, then estimating is going to be part of answering those questions. If you work where those paying for the work have no interest in knowing the answers, or have confidence you'll not overrun the cost and schedule and will deliver the needed capabilities as planned, then maybe estimates aren't needed. It would be interesting to hear from those actually paying for those outcomes, to see what they need to make decisions in the presence of uncertainty.
Here's some guidance for getting started with estimating software efforts.
And some tools to help out
So you see a trend here? There are nearly unlimited resources on how to estimate software development projects, how to manage in the presence of uncertainty, how to elicit requirements, and how to plan and schedule software projects.
So if you hear we're bad at estimates, that's likely the actual experience of the person making that statement, because that person hasn't yet learned how to estimate. Or when we hear estimates are a waste, it's likely it's not their money, so to them estimates take away from some other activity they see as more important. Why would they care if the project overruns its budget, is late, or doesn't produce the needed value? Or my favorite, estimates are the smell of dysfunction, offered with no domain, root cause, or corrective action suggested, because that's actually hard work, and it's much easier to point out bad management than to suggest good management.
Estimating is hard. Anything of value is hard. All the easy problems have been solved.
But if we are ever to get a handle on the root causes of software project failure modes, we need to seek out those root causes. And that means much more than just asking the 5 Whys. That's one of many steps in Root Cause Analysis and far from the complete solution to removing the symptoms of our problems. So start there, but never stop there.
Here are some approaches we use
Unanticipated cost growth is a symptom of failing to properly estimate in the first place, failing to update those estimates as the project progresses, and failing to deal with the underlying risks that drive that cost growth. The same is true for lateness and less-than-acceptable delivered capabilities. Once the estimate has been established in some credible form, and adjusted as the project progresses, you of course have to execute to the plan, or change the plan. It's a closed loop system.
The answer to the root causes that create the symptoms we so quickly label as smells of dysfunction is NOT to stop doing something, but to learn what the actual cause is. Stopping before this knowledge is acquired leaves the symptom still in place. And that doesn't help anyone.
When we hear about project planning and scheduling, we think about tasks being planned, organized in the proper order, durations assigned to the work, resources committed to perform the work.
This is all well and good, but without a risk-adjusted view of the planned work, it's going to be disappointing at best. There are some key root causes of most project failures. Let's start here.
Each of these has been shown, through research on failed programs, to contribute to cost and schedule impacts: unrealistic expectations of the project's deliverables, technical issues, naive cost and schedule estimating, and less-than-acceptable risk mitigation planning.
Project Management in Six Steps
Here's how to address cost and schedule estimating.
Develop a schedule. Whatever your feelings are for Gantt charts, sticky notes, or any of the handwaving processes you've learned to use, you need a sequence of the work, the dependencies, and the planned durations. Without something like that, you have no idea what work is needed to complete the project. Here's a straightforward Master Schedule for some flight avionics on a small vehicle. It's all software, and it has to complete as planned, otherwise the users can't do their job as planned. And since they're the ones paying for our work, they have an expectation of us showing up, near our budget, with the needed capabilities. Not the Minimum, the NEEDED.
Using a Monte Carlo tool (RiskyProject), here is a run for the probabilities of cost, duration, and completion dates. All project work is probabilistic; any notion that a deterministic plan can be successful is going to result in disappointment.
We usually call our planning sessions done when we can get the risk-adjusted Integrated Master Schedule to show a completion date on or before the need date at an 80% confidence level.
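The core idea behind a Monte Carlo scheduling tool like the one mentioned above can be sketched in a few lines: treat each task's duration as a distribution rather than a single number, simulate the project many times, and read the commitment date off a percentile. This is a minimal illustration, not the tool's actual algorithm; the task names and three-point estimates are hypothetical, and real tools also model network dependencies and risks.

```python
import random

# Hypothetical tasks with (low, most likely, high) duration estimates in days.
tasks = {
    "requirements": (5, 8, 14),
    "build":        (20, 30, 50),
    "test":         (10, 15, 25),
}

def simulate_completion(trials=10_000):
    """Sample each task duration from a triangular distribution and sum them."""
    totals = []
    for _ in range(trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks.values()))
    return sorted(totals)

totals = simulate_completion()
p80 = totals[int(0.8 * len(totals))]  # the duration we can commit to at 80% confidence
print(f"80th percentile total duration: {p80:.1f} days")
```

The 80th percentile of the simulated totals is the "on or before" number the planning session is looking for; committing to the most likely values instead would be overrun more often than not.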
With a resource-loaded schedule - or some external time-phased cost model - we can now show the probability of completing on or before the need date, and at or below the planned budget. The chart below informs everyone of the chances of success for the cost and schedule aspects of the project. Technically it has to work; the customer gets to say that. These are the Fit for Use and Fit for Purpose measures if we're in an IT Governance paradigm, or the Measures of Effectiveness and Measures of Performance in other paradigms. Those measures can be modeled as well, but I'm focusing on cost and schedule here.
With this information we can produce the needed margin and management reserve to protect the delivery date and budget. Showing up late and over budget with a product that works is usually not good business.
Do You Need all This?
What's the Value At Risk?
Don't know the value at risk? Don't have a clear vision of what done looks like in units of measure meaningful to the decision makers? You've got bigger problems. This approach won't help.
The notion that planning is of little value seems to be lost on those not accountable for showing up on or before the need date, at or below the needed cost, and with the planned capabilities needed to fulfill the business case or successfully accomplish the mission.
Yogi says it best in our project management domain. And it bears repeating when someone says let's get started and we'll let the requirements emerge. Or my favorite, let's get started and we'll use our performance numbers to forecast future performance, we don't need no stinkin' estimates.
Yogi says ...If you don't know where you are going, you'll end up someplace else.
This is actually a quote from Alice in Wonderland:
"Would you tell me, please, which way I ought to go from here?"
"That depends a good deal on where you want to get to," said the Cat.
"I don't much care where--" said Alice.
"Then it doesn't matter which way you go," said the Cat.
"--so long as I get SOMEWHERE," Alice added as an explanation.
"Oh, you're sure to do that," said the Cat, "if you only walk long enough."
(Alice's Adventures in Wonderland, Chapter 6)
This is often misquoted as If you don't know where you're going, any road will get you there. Which is in fact technically not possible, and not from Alice.
So What To Do?
We need a plan to deliver the value that is being exchanged for the cost of that value. We can't determine the resulting value - the benefit - until we first know the cost to produce that value, and then when that value will arrive.
Both these conditions are basic Managerial Finance 101 concepts based on Return on Investment:
ROI = (Value - Cost) / Cost
The first thing some will say is but value can't be monetized. Ask the CFO of your firm to see what she thinks about monetizing the outcomes of your work on the balance sheet. Better yet, don't embarrass yourself; read Essentials of Managerial Finance, Brigham and Weston. Mine is the 11th Edition; it looks like it's up to the 14th Edition.
As well, once it is established that both cost and value are random variables about measurements in the future, you'll need to estimate them to some degree of confidence if you're going to make decisions. These decisions are actually opportunity cost decisions about future outcomes. This is the basis of the Microeconomics of software development.
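When both terms of the ROI formula are random variables, the honest output is not a single ROI number but a distribution, and the confidence that the investment pays off at all. A minimal Monte Carlo sketch makes the point; the dollar figures and triangular distributions here are purely hypothetical.

```python
import random

random.seed(11)

def roi_samples(trials=20_000):
    """Sample uncertain cost and value and compute ROI = (Value - Cost) / Cost."""
    rois = []
    for _ in range(trials):
        cost  = random.triangular(80, 160, 100)   # $K: (low, high, most likely)
        value = random.triangular(90, 250, 140)   # $K
        rois.append((value - cost) / cost)
    return sorted(rois)

rois = roi_samples()
p_positive = sum(r > 0 for r in rois) / len(rois)  # confidence the work pays off
print(f"P(ROI > 0) ≈ {p_positive:.0%}")
```

Asking "what is the probability ROI is positive, and to what confidence?" is exactly the decision the business needs answered, and it cannot be answered without estimating both random variables.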
So when you hear we can make decisions about outcomes in the future in the presence of uncertainty - the basis of project work - without estimating - don't believe a word of it. Instead read Weston and you too will have the foundational skills to know better.
Because the closed loop management processes we need on project and product development require that we make decisions in the presence of uncertainty. There is simply no way to do that without estimating all the variances when we Plan, Do, Check, Act. Each is a random variable, with random outcomes. Each requires some assessment of what will happen if I do this. And that notion of let's just try it reminds me of my favorite picture of open loop, no estimates, no measurement, no corrective action management.
One suggestion from the #NoEstimates community is the use of empirical data of past performance. This is many times called yesterday's weather. First let's make sure we're not using just the averages from yesterday's weather. Even adding the variance to that small sample of past performance can lead to very naive outcomes.
We need to do some actual statistics on that time series. A simple R set of commands will produce the chart below from the time series of past performance data.
But that doesn't really help without some more work.
Getting Out of the Yesterday's Weather Dilemma
Let's use the chart below to speak about some sources of estimating NOT based on simple small samples of yesterday's weather. This is a Master Plan for a non-trivial project to integrate a half dozen or so legacy enterprise systems with a new health insurance ERP system for an integrated payer/provider solution:
This brings up a critical issue with all estimates. Did the numbers produced from past performance meet the expected values, or were they just the numbers we observed? Taking the observed numbers and using them to forecast the future is an Open Loop control system. What SHOULD the numbers have been to meet our goals? What SHOULD the goal have been? If we didn't know that, then there is no baseline to compare the past performance against to see if it will be able to meet the future goal.
I'll say this again - THIS IS OPEN LOOP control, NOT CLOSED LOOP. No amount of dancing around will get over this; it's a simple control systems principle found here: Open and Closed Loop Project Controls.
KPPs represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.
The connections between these measures are shown below.
With these measures, and tools for making estimates of the future - forecasts - using statistical methods, we can use yesterday's weather, tomorrow's models and related reference classes, and the desired MOEs, MOPs, KPPs, and TPMs to construct a credible estimate of what needs to happen. Then we measure what is happening, close the loop with an error signal, and take corrective action to stay on track toward our goal.
This all sounds simple in principle, but in practice of course it's not. It's hard work. But when you assess the value at risk to be outside the tolerance range, where the customer is unwilling to risk their investment, we need tools and processes to actually control the project.
In the business of building software intensive systems - estimating, performance forecasting, and management - closed loop control in the presence of uncertainty for all variables is the foundation needed for increasing the probability of success.
This means math is involved in planning, estimating, measuring, analysis, and corrective actions to Keep the Program Green.
When we have past performance data, here's one approach...
And the details of the math are in the conference paper.
I have an article up on agileconnection.com. It’s called How Do Your Estimates Provide Value?
I’ve said before that We Need Planning; Do We Need Estimation? Sometimes we need estimates. Sometimes we don’t. That’s why I wrote Predicting the Unpredictable: Pragmatic Approaches for Estimating Cost or Schedule.
I’m not judging your estimates. I want you to consider how you use estimates.
BTW, if you have an article you would like to write for agileconnection.com, email it to me. I would love to provide you a place for your agile writing.
The following was originally published in Mike Cohn's monthly newsletter. If you like what you're reading, sign up to have this content delivered to your inbox weeks before it's posted on the blog, here.
I’ve been in some debates recently about whether the ScrumMaster should be full time. Many of the debates have been frustrating because they devolved into whether a team was better off with a full-time ScrumMaster or not.
I’ll be very clear on the issue: Of course, absolutely, positively, no doubt about it a team is better off with a full-time ScrumMaster.
But, a team is also better off with a full-time, 100 percent dedicated barista. Yes, that’s right: Your team would be more productive, quality would be higher, and you’d have more satisfied customers, if you had a full-time barista on your team.
What would a full-time barista do? Most of the time, the barista would probably just sit there waiting for someone to need coffee. But whenever someone was thirsty or under-caffeinated, the barista could spring into action.
The barista could probably track metrics to predict what time of day team members were most likely to want drinks, and have their drinks prepared for them in advance.
Is all this economically justified? I doubt it. But I am 100 percent sure a team would be more productive if they didn’t have to pour their own coffee. Is a team more productive when it has a full-time ScrumMaster? Absolutely. Is it always economically justified? No.
What I found baffling while debating this issue was that teams who could not justify a full-time ScrumMaster were not really being left a viable Scrum option. Those taking the “100 percent or nothing” approach were saying that if you don’t have a dedicated ScrumMaster, don’t do Scrum. That’s wrong.
A dedicated ScrumMaster is great, but it is not economically justifiable in all cases. When it’s not, that should not rule out the use of Scrum.
And a note: I am not saying that one of the duties of the ScrumMaster is to fetch coffee for the team. It’s just an exaggeration of a role that would make any team more productive.
Making decisions in the presence of this uncertainty is part of our job as project managers, engineers, and developers, on behalf of those paying for our work.
It's also the job of the business, whose money is being spent on the projects to produce tangible value in exchange for that money.
From the introduction of the book to the left...
Science and engineering, our modern ways of understanding and altering the world, are said to be about accuracy and precision. Yet we best master the complexity of our world by cultivating insight rather than precision. We need insight because our minds are but a small part of the world. An insight unifies fragments of knowledge into a compact picture that fits in our minds. But precision can overflow our mental registers, washing away the understanding brought by insight. This book shows you how to build insight and understanding first, so that you do not drown in complexity.
So what does this mean for our project world?
In both these conditions we need to get organized in order to address the underlying uncertainties. We need to put structure in place in some manner. Decomposing the work is a common way in the project domain. From a Work Breakdown Structure to simple sticky notes on the wall, breaking problems down into smaller parts is a known successful way to address a problem.
With this decomposition, now comes the hard part. Making decisions in the presence of this uncertainty.
Reasoning about things that are uncertain is done with probability and statistics. Probability is a degree of belief.
I believe we have an 80% probability of completing on or before the due date for the migration of SQL Server 2008 to SQL Server 2012.
Why do we have this belief? Is it based on our knowledge from past experience? Is this knowledge sufficient to establish that 80% confidence?
The answers to each of these questions inform our belief.
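One simple way to ground a degree of belief in past experience is Laplace's rule of succession: with a uniform prior over the on-time rate, the posterior belief after observing past outcomes has a closed form. This is just one illustration of turning a track record into a degree of belief, and the track-record numbers are hypothetical.

```python
# Hypothetical track record: 8 of 10 comparable past migrations finished
# on or before their need date.
on_time, total = 8, 10

# Laplace's rule of succession (a uniform Beta(1, 1) prior on the on-time rate):
# the posterior mean degree of belief that the NEXT project finishes on time.
posterior_mean = (on_time + 1) / (total + 2)
print(f"Posterior degree of belief: {posterior_mean:.0%}")  # prints 75%
```

With only ten past projects, the data supports a 75% belief, not the 80% being claimed, which is exactly the kind of check the questions above are asking for.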
Chaos, Complexity, Complex, Structured?
A well-known agile thought leader made a statement today:
I support total chaos in every domain
This is unlikely to result in sound business decisions in the presence of uncertainty. Although there may be domains where chaos might produce usable results, when we need some degree of confidence that the money being spent will produce the needed capabilities, on or before the need date, at or below the budget needed to be profitable, and with the collection of all the needed capabilities to accomplish the mission or meet the business case, we're going to need to know how to manage our work to achieve those outcomes.
So let's assume - with a high degree of confidence - that we need to manage in the presence of uncertainty, but that we have little interest in encouraging chaos. Here's one approach.
So In The End
Since all the world's a set of statistical processes, producing probabilistic outcomes, which in turn create risk to any expected results when not addressed properly, the notion that decisions can be made in the presence of this condition without estimates can only be explained by willful ignorance of the basic facts of the physics of project work.
Good estimates are the key to project success. Estimates provide information to the decision makers to assess adherence to performance specifications and plans, make decisions, revise designs and plans, and improve future estimates and processes.
We use estimates and measurements to evaluate the feasibility and affordability of the products being built, choose between alternative designs, assess risk, and support business decisions. Engineers compare estimates of technical baselines with observed performance to decide if the product meets its functional and performance requirements. These are used by management to control processes and detect compliance problems. Process managers use capability baselines to improve production processes.
Developers, engineers, and planners estimate the resources needed to develop, maintain, enhance, and deploy products. Project planners use estimates for staffing and facilities. Planners and managers use estimates of resources to determine project cost and schedule and to prepare budgets and plans.
Managers compare estimates - cost and schedule baselines - with actual values to determine deviations from plan and understand the root causes of those deviations, needed to take corrective actions. Estimates of product, project, and process characteristics provide baselines to assess progress during the project.
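One standard way to compute those deviations from the cost and schedule baselines is earned value management; a minimal sketch follows. EVM is a well-known technique, not something specific to this post, and the status figures below are hypothetical.

```python
def variances(planned_value, earned_value, actual_cost):
    """Deviations of actual performance from the cost and schedule baselines."""
    cv  = earned_value - actual_cost    # cost variance: negative means over budget
    sv  = earned_value - planned_value  # schedule variance: negative means behind
    cpi = earned_value / actual_cost    # cost performance index
    spi = earned_value / planned_value  # schedule performance index
    return cv, sv, cpi, spi

# Hypothetical status: $100K of work planned to date, $80K earned, $90K spent.
cv, sv, cpi, spi = variances(100, 80, 90)
print(cv, sv, round(cpi, 2), round(spi, 2))  # $20K behind plan, $10K over budget
```

The variances are the error signal of the closed loop: compare them against the baseline, find the root cause, and take corrective action.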
Bad estimates affect all participants in the project or product development process. Incomplete and inaccurate estimates mean inadequate time and money are available for increasing the probability of project success.
The Nature of Estimation
The verb estimate means to produce a statement of the approximate value of some quantity that describes or characterizes an object. The noun estimate refers to the value produced by the verb. The object can be an artifact - software, hardware, documents - or an activity - planning, development, testing, or a process.
We make estimates when we cannot directly measure the value of that quantity, because:
Reasons to Estimate and Measure Size, Cost and Schedule
Reasons to Estimate Effort, Cost, and Schedule
Reasons to Estimate Capability and Performance
There are many sources of data for making estimates, some reliable, some not. Human subject matter expert based estimates have been shown to be the least reliable, accurate, and precise, due to the biases involved in the human processes of developing the estimate. Estimates based on past performance, while useful, must be adjusted for the statistical behaviors of the past and the uncertainty of the future.
If the estimate is misused in any way, this is not the fault of the estimate - noun or verb - but simply bad management. Fix that first, then apply proper estimating processes.
If your project or product development effort does none of these activities, or has no need for information on which to make a decision, then estimating is likely a waste of time.
But before deciding estimates are the smell of dysfunction, with NO root cause identified for corrective action, check first with those paying your salary, to see what they have to say about your desire to spend their money in the presence of uncertainty in the absence of an estimate.
† This post is extracted from Estimating Software-Intensive Systems: Projects, Products and Processes, Dr. Richard Stutzke, Addison Wesley. This book is a mandatory read for anyone working in a software domain on any project that is mission critical. This means if you need to show up on or before the need date, at or below your planned cost, with the needed capabilities - the Key Performance Parameters, without which the project will get cancelled - then you're going to need to estimate all the parameters of your project. If your project doesn't need to show up on time, stay on budget, or can provide less than the needed capabilities, there's no need to estimate. Just spend your customer's money; she'll tell you when to stop.
In Lean there is a supporting process called the 5S's, a workplace organization method named for a list of five Japanese words: seiri, seiton, seiso, seiketsu, and shitsuke. The list describes how to organize a workplace for efficiency and effectiveness by identifying and storing the items used, maintaining the area and items, and sustaining the new order. The decision making process usually comes from a dialogue about standardization, which builds understanding among the employees of how they should do their work.
At one client we are installing Microsoft Team Foundation Server for development, Release Management, and Test Management. The current process relies on the heroics of many on the team every Thursday night to get the release out the door.
We started the improvement of the development, test, and release process with Root Cause Analysis. In this domain cyber and data security are paramount, so when there is a cyber or data security issue, RCA is the core process to address the issue.
The results of the RCA have shown that the workplace is chaotic at times, code is poorly managed, testing struggles at deadline, and the configuration of the release base is inconsistent. It was clear we were missing tools, but the human factors were also a source of the problem - the symptoms of latent defects and a break/fix paradigm.
There are many ways to ask and answer the 5 Whys and apply the 5 S's, but until that is done, the actual causes determined, and the workplace cleaned up, the symptoms will continue to manifest in undesirable ways.
If we're going to start down the path of the 5 Whys and NOT actually determine the root cause and develop a corrective action plan, then that is in itself a waste.
Economics is called the Dismal Science. Economics is the branch of knowledge concerned with the production, consumption, and transfer of wealth. Economics is generally about the behaviors of humans and markets, given the scarcity of means to achieve certain ends.
How does economics apply to software development? We're not a market; we don't create wealth, at least directly; we create products and services that may create wealth. Microeconomics is a branch of economics that studies the behavior of individuals and their decision making on the allocation of limited resources. It's the scarcity of resources that is the basis of Microeconomics, and software development certainly operates in the presence of scarce resources. Microeconomics is closer to what we need to make decisions in the presence of uncertainty. The general economics processes are of little interest, so starting with big-picture economics books is not much use.
Software economics is a subset of Engineering Economics. A key aspect of all Microeconomics applied to engineering problems is the application of Statistical Decision Theory - making decisions in the presence of uncertainty. Uncertainty comes in two types: aleatory (the irreducible natural variability of a process) and epistemic (a reducible lack of knowledge).
Aleatory uncertainty can be addressed by adding margin to our work: time and money. Epistemic uncertainty, and the missing information behind it, has economic value to our decision making processes. That is, there is economic value in decision-based problems in the presence of uncertainty.
This missing information can be bought down with simple solutions - prototypes, for example, or short deliverables to test an idea or confirm an approach. Both are the basis of Agile and have been discussed in depth in Software Engineering Economics, Barry Boehm, Prentice Hall, 1981.
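The aleatory side - adding margin - can be sketched numerically: simulate the work package's natural variability, then take the margin as the gap between the duration you expect and the duration you can commit to at high confidence. This is a minimal illustration with a hypothetical triangular distribution, not a prescription for any particular tool.

```python
import random

random.seed(7)
# Hypothetical Monte Carlo durations (days) for a work package whose natural
# variability (aleatory uncertainty) cannot be reduced, only protected against.
durations = sorted(random.triangular(20, 45, 25) for _ in range(10_000))

p50 = durations[len(durations) // 2]        # the median "expected" duration
p80 = durations[int(0.8 * len(durations))]  # commitment point at 80% confidence
margin = p80 - p50                          # schedule margin protecting the date
print(f"P50 = {p50:.1f} days, P80 = {p80:.1f} days, margin = {margin:.1f} days")
```

The same percentile gap in a cost model gives the management reserve; epistemic uncertainty, by contrast, is bought down with information, not margin.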
Engineering economics is the application of economic techniques to the evaluation of design and engineering alternatives. Engineering economics assesses the appropriateness of a given project, estimates its value, and justifies the project (or product) from an engineering standpoint.
This involves the time value of money and cash-flow concepts - compound and continuous interest. It continues with economic practices and techniques used to evaluate and optimize decisions in the selection of strategies for project success.
When I hear I read that book and it's about counting lines of code, the reader has failed to comprehend the difference between principles and practices. The sections on Statistical Decision Theory are about the Expected Value of Perfect Information and how to make decisions with imperfect information.
Statistical Decision Theory is about making choices: identifying the values, uncertainties, and other issues relevant in a given decision, its rationality, and the resulting optimal decision. In Statistical Decision Theory, the underlying statistical processes and the resulting probabilistic outcomes require us to estimate in the presence of uncertainty.
Writing software for money - other people's money - requires us to estimate how much money, when we'll be done spending that money, and what will result from that spend.
This is the foundation of the Microeconomics of Software Development
If there is no scarcity of resources - time, cost, technical performance - then estimating is not necessary. Just start the work, spend the money, and you'll be done when you're done. If, however, those resources are scarce - and they always are - estimating is how we make decisions about allocating them.
Last week, I had a nice Skype call with a reader who was seeking my advice on becoming an author and speaker, and I gave him some pointers. I normally don’t schedule calls with random people asking for a favor, but this time I made an exception. I had a good reason.
No matter your life experience, your view of the world, whether you've had military experience or not, this is - or should be - an inspirational commencement speech.
This blog has been focused on improving program and project management processes for many years. Over that time I've run into several bunk ideas around projects, development, methods, and the process of managing other people's money. When that happens, the result is a post or two about the nonsense idea and the corrections to those ideas, not just from my experience but from the governance frameworks that guide our work.
This Blog is dedicated to the proposition that all information is not created equal. Much of it is endowed by its creators with certain undeniable wrongs. Misinformation is dangerous!!
There's a lot of crap floating around any business or technical field. Much of it gets passed around by well-meaning folks, but it is harmful regardless of the purity of the conveyer.
People who attempt to debunk myths, mistakes, and misinformation are often tireless in their efforts. They are also too often helpless against the avalanche of misinformation.
The Debunker Club is an experiment in professional responsibility. Anyone who's interested may join as long as they agree to the following:
When we hear a suggestion about a process that inverts the normal process based on a governance framework - say the Microeconomics of Software Development - we need to ask who benefits? How would that suggestion be tangibly beneficial to the recipient once the process is inverted?
Estimates, for example, are for the business. Why would the business no longer want an estimate of the cost, schedule, or technical performance of the provided capabilities?
In the world of spending money to produce value, the one that benefits should be - must be - the one paying for that value, and therefore has a compelling interest in the information needed to make decisions about how the money is spent.
When that relationship between paying and benefit is inverted, then the path to Cui Bono is inverted as well.
In the end, follow the money must be the basis of assessing the applicability of any suggestion. If it is suggested that decision making can be done in the absence of estimating the impacts of those decisions, ask who benefits. If it's not those paying for the value, then Cui Bono no longer applies.
After my lists of mindfulness books and happiness books, here you can find the 20 Best Creativity Books in the World.
This list is created from the books on GoodReads tagged with “creativity”, sorted using an algorithm that favors number of reviews, average rating, and recent availability.
There is a popular notion in the #NoEstimates paradigm that empirical data is the basis of forecasting the future performance of a development project. In principle this is true, but the concept is incomplete in the way it is used. Let's start with the data source used for this conjecture.
There are 12 samples in the example used by #NoEstimates, in this case stickies per week. From this time series an average is calculated for the future. This is the empirical data used to estimate in the No Estimates paradigm. The average is 18.1667, or just 18 stickies per week.
But we all have read, or should have read, Sam Savage's The Flaw of Averages. This is a very nice populist book - by populist I mean an easily accessible text with little or no mathematics in it, although Savage's underlying work and tool set are highly mathematical.
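Savage's point can be sketched in a few lines: a plan built from the most likely (or average) values of skewed inputs is not the most likely outcome of the plan. The task distributions below are hypothetical, chosen only to show the effect.

```python
import random

random.seed(2)
# Three hypothetical tasks, each most likely 10 days but with a tail out to 40.
tasks = [(8, 40, 10)] * 3
plan = sum(mode for _lo, _hi, mode in tasks)  # a plan built from most-likely values

def total_duration():
    """One simulated project: sample every task and sum the durations."""
    return sum(random.triangular(lo, hi, mode) for lo, hi, mode in tasks)

trials = 20_000
overruns = sum(total_duration() > plan for _ in range(trials)) / trials
print(f"The {plan}-day plan overruns in {overruns:.0%} of simulated projects")
```

Because the distributions are right-skewed, the 30-day "most likely" plan is blown almost every time, which is why planning to averages of small samples of yesterday's weather is so dangerous.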
There is a simple set of tools that can be applied for time series analysis, using past performance to forecast the future performance of the system that created the previous time series. The tool is R, and it is free for all platforms.
Here's the R code for performing a statistically sound forecast, estimating the range of values the past empirical stickies data can take on in the future.
Put the time series in a plain text file named Book1.txt, then run:

library(forecast)                 # provides forecast() for fitted models
Book1  <- read.table("Book1.txt") # load the past performance time series
SPTS   <- ts(Book1)               # convert the data to a time series object
SPFIT  <- arima(SPTS)             # fit a simple ARIMA model to the series
SPFCST <- forecast(SPFIT)         # build a forecast from the ARIMA fit
plot(SPFCST)                      # plot the forecast and its confidence bands
Here's that plot, showing the 80% and 90% confidence bands for the possible future outcomes from the past performance - the empirical data.
The 80% range is 27 to 10 and the 90% range is 30 to 5.
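Turning the 90% band into a relative range against the 18.1667 average is simple arithmetic; a quick check:

```python
average = 18.1667        # stickies per week, the mean of the 12-sample series
band_90 = (5, 30)        # 90% forecast interval read off the ARIMA plot

low_pct  = (band_90[0] - average) / average   # relative downside of the band
high_pct = (band_90[1] - average) / average   # relative upside of the band
print(f"{high_pct:+.0%} to {low_pct:+.0%}")
```

That works out to roughly +65% to -72% around the average, which is the range quoted below.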
So the killer question.
Would you bet your future on a probability of success with a +65% to -72% range in the cost, schedule, or technical performance of the outcomes?
I hope not. This is a flawed example, I know: too small a sample, no adjustment of the ARIMA factors, just a quick raw assessment of the data used in some quarters as a replacement for actually estimating future performance. But this assessment shows how empirical data COULD support making decisions about future outcomes in the presence of uncertainty using past time series, once the naive assumptions of sample size and wide variances are corrected.
If you hear you can make decisions without estimating, that's pretty much a violation of all established principles of Microeconomics and statistical forecasting. When the answer comes back we used empirical data, take your time series of empirical data, download R, install the needed packages, put the data in a file, apply the functions above, and see if you really want to commit to spending other people's money with a confidence range of +65% to -72% of performing like you did in the past. I sure hope not!!