In Twitter discussions and email exchanges there is a notion of populist books versus technical books used to address issues and problems encountered in our project management domain. My recent book Performance-Based Project Management® is a populist book. There are principles, practices, and processes in the book that can be put to use on real projects, but very few equations and numbers. It's mostly narrative about increasing the probability of project success. But the math to calculate that probability from other numbers, processes, and systems is not there. That's the realm of technical books and journal papers.
The content of the book was developed with the help of editors at the American Management Association, the publisher. The Acquisition Editor contacted me about writing a book for the customers of AMA. He explained up front that AMA is in the money-making business of selling books, and that although I may have many good ideas, even ideas that people might want to read about, it's an AMA book and I'd be getting lots of help developing those ideas into a book that will make money for AMA.
The distinction between a populist book and a technical book is the difference between a book that addresses a broad audience with a general approach to the topic and a deep-dive book focused on a narrow audience.
But one other distinction is that for most of the technical approaches, some form of calculation takes place to support the materials found in the populist material. One simple example is estimating. There are estimating articles and some books that lay out the principles of estimates. We have those in our domain in the form of guidelines and a few texts. But to calculate the Estimate To Complete in a statistically sound manner, technical knowledge and the underlying mathematics of non-linear, non-stationary, stochastic processes (Monte Carlo simulation of the project's work structure) are needed.
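As a minimal sketch of what that technical approach looks like, here is a Monte Carlo simulation of an Estimate To Complete. The task list, most-likely durations, and spreads are illustrative assumptions, not data from any real project.

```python
import random

random.seed(42)

# Hypothetical remaining tasks: (most-likely hours, +/- spread) for a
# triangular duration distribution -- illustrative numbers only.
remaining_tasks = [(40, 10), (80, 30), (25, 5), (60, 20)]

def simulate_etc(tasks, trials=10_000):
    """Monte Carlo the Estimate To Complete: sample each task's duration,
    total them, and return the sorted list of simulated totals."""
    totals = []
    for _ in range(trials):
        total = sum(random.triangular(m - s, m + s, m) for m, s in tasks)
        totals.append(total)
    return sorted(totals)

totals = simulate_etc(remaining_tasks)
p50 = totals[len(totals) // 2]          # median outcome
p80 = totals[int(len(totals) * 0.8)]    # 80% confidence level

print(f"ETC: 50% confidence {p50:.0f} h, 80% confidence {p80:.0f} h")
```

With enough trials, the percentiles of the simulated totals stand in for confidence levels: the 80% value is the ETC you can commit to with 80% confidence, under the assumed distributions.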
Two examples of populist versus technical
Two are from my past, two from my current work.
These two books are about the same topic: General Relativity and its description of the shape of our universe. One is a best-selling popularization of the topic, found in many home libraries of those interested in this fascinating topic. The one on the left is on my shelf from a graduate school course on General Relativity, along with Misner, Thorne, and Wheeler's Gravitation.
Dense is an understatement for the math and the results of the book on the left. So if you want to calculate something about a rapidly spinning Black Hole, you're going to need that book. The book on the right will talk about those Black Holes in non-mathematical terms, but no numbers come out of that description.
The book on the left is about probabilistic processes in everyday life that we misunderstand or are biased to misunderstand. The many cognitive biases we use to convince ourselves we are making the right decisions on projects are illustrated through nice charts and graphs.
We use the book on the left in our work with non-stationary stochastic processes of complex project cost and schedule modeling. Making these decisions is critical to quantifying how technical and economic risks may affect a system's cost. This book is a treatment of how probability methods are applied to model, measure, and manage risk, schedule, and cost engineering for advanced systems. Garvey shows how to construct models, do the calculations, and make decisions with these calculations.
Here's The Point - Finally
If you come across a suggestion that decisions can be made in the absence of knowing anything about the future numbers or about actually doing the math, put that suggestion in the class of populist descriptions of a complex topic.
If you can't calculate something, then you can't make a decision based on the evidence represented by numbers. If you can't decide based on the math, then the only way left is to decide on intuition, hunches, opinion, or some other seriously flawed non-analytical basis.
Just a reminder from Mr. Deming, stated in yesterday's post:
If it's not your money, there's likely an expectation that those providing the money are interested in the calculations needed to make those decisions.
Act accordingly.
All ideas require credible evidence to be tested, suspect ideas require that even more so - Deep Inelastic Scattering, thesis adviser, University of California, 1978
When the response to questions about the applicability of an idea is push back, with accusations that those asking the questions - in an attempt to determine the applicability and truth of the statement - are somehow afraid of that truth, it suggests there is little evidence as a test of those conjectures.
When there are proposals that ignore the principles of business, microeconomics, and control systems theory, and are based on well-known bad management practices with well-known and easy-to-apply corrective actions - there is no there, there.
So without a testable process, in a testable domain, with evidence-based assessment of applicability, outcomes, and benefits, any suggestion is opinion at best and blather at worst.
In a sufficiently complex project we need measures of progress to plan beyond burning down our list of same-sized stories - which, by the way, require non-trivial work to make same-sized and keep same-sized. And of course if this same-sizedness does not have a sufficiently small variance, all that effort is a waste.
But let's assume we're not working on a sufficiently small project where same-sized work efforts can be developed. Then we need measures of progress related to the Effectiveness of the deliverables and the Performance of those deliverables in producing that effectiveness for the customer.
Here's a recent webinar on this topic.
Measurement News Webinar from Glen Alleman

And of course we need to define in what domain this approach can be applied, in what domain it is too much, and in what domain it is actually not enough.

Paradigm of agile project management from Glen Alleman

Then the actual conversation about any approach to Increasing the Probability of Success for our work efforts can start, along with identifying the underlying Root Causes of any impediments to that goal that exist today and the corrective actions needed to remove them. Without knowing the root causes and corrective actions, any suggested solution has little value, as it is speculative at best and nonsense at worst.
Much has been written about the Estimating Problem, the optimism bias, the planning fallacy, and other related issues with estimating in the presence of Dilbert-esque management. The notion that the solution to the estimating problem is not to estimate, but to start work, measure the performance of the work, and use that to forecast completion dates and efforts is essentially falling into the trap Steve Martin did in LA Story.
He used yesterday's weather because he was too lazy to make tomorrow's forecast.
By the way, each of those issues has a direct and applicable solution. So next time you hear someone use them as the basis of a new idea, ask if they have tried the known-to-work solutions to the planning fallacy, estimating bias, optimism bias, and the myriad other project issues with known solutions.
All measuring performance to date does is measure yesterday's weather. This yesterday's weather paradigm has been well studied. If in fact your project is based on Climate, then yesterday's weather is likely a good indicator of tomorrow's weather.
The problem of course with the yesterday's weather approach is the same problem Steve Martin had in LA Story when he used a previously recorded weather forecast for the next day.
Today's weather turned out not to be like yesterday's weather.
Those positing that stories settle down to a rhythm assume - and we know what assume means: it makes an Ass out of U and Me - that the variances in the work efforts are settling down as well. That's a hugely naive approach without actual confirmation that the variances are small enough to not impact the past performance. When you have statistical processes from small-sample projects, in the absence of an actual reference class - in this case a self-reference class - you're also being hugely naive about the possible behaviours of stochastic processes.
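A small simulation illustrates why yesterday's weather fails for a non-stationary process. The throughput numbers below are invented for illustration: noise around a mean that drifts downward, which a last-sprint forecast can never see.

```python
import random

random.seed(1)

# Hypothetical sprint throughputs with a downward drift (non-stationary):
# noise around a mean that degrades as integration work accumulates.
throughput = [random.gauss(30 - 0.8 * sprint, 3) for sprint in range(20)]

# "Yesterday's weather": forecast each sprint as the previous sprint's
# actual, then look at the forecast errors.
errors = [throughput[i] - throughput[i - 1] for i in range(1, 20)]
bias = sum(errors) / len(errors)

print(f"mean forecast error: {bias:.2f} points per sprint")
```

For a stationary process the mean error would hover near zero; with drift, the naive forecast is systematically biased, and that bias compounds in any completion-date projection built on it.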
Then when you slice the work into same-sized efforts - this is actually the process used in the domains where we work: DOD, DOE, ERP - you're actually estimating future performance based on a reference class and calling it Not Estimating.
So when you hear examples of Bad Management - overcommitment of work, assigning a project manager to a project that is hundreds of times larger than anything that PM has ever experienced and expecting success, getting a credible estimate and cutting it in half, or any other Dilbert-style management process - and the proposed fix starts with dropping the core processes needed to increase the probability of success, be wary.
This approach is itself contrary to good project management principles, which are quite simple:
Principles and Practices of Performance-Based Project Management® from Glen Alleman
If we start with a solution to a problem of Bad Management before assuring that the Principles and Practices of Good Management are in place, we'll be paving the cow path, as we say in our enterprise, space, and defense domain. This means that the solution will not have actually fixed the problem. It will not have treated the root cause of the problem, just addressed the symptoms.
There is no substitute for Good Management.
And when you hear there is a smell of bad management and there is no enumeration of the root causes and the corrective actions for those root causes, remember Inigo Montoya's retort to Vizzini's statement:
You keep using that word. I do not think it means what you think it means.
That word is dysfunction, smell, root cause - all of which are used while the actual enumerated root causes, the assessment of the possible corrective actions, and the resulting removal of the symptoms are missing.
I speak about this approach from my hands-on experience working Performance Assessment and Root Cause Analysis on programs that are in the headlines.
With the plethora of opinions on estimating - some informed, many uninformed - here's my list of books and papers that inform our software estimating activities for Software Intensive Systems. These books range from hard-core engineering to populist texts.
...is not actually true after you have read the book. So please read the book and see how McConnell provides step-by-step actions for producing credible estimates.
Estimating software development starts with understanding what the software system is supposed to be doing and how we're able to measure that. This process is based on defining the needed capabilities, the Measures of Effectiveness, Measures of Performance, Key Performance Parameters, and Technical Performance Measures, all needed for the ultimate success of the project. Along with a Plan showing the increasing maturity of the delivered capabilities. If we don't have these in some form, it's going to be a disappointment for those paying for our efforts when they get to the end and the outcomes are not what they were expecting.
Capabilities are not Requirements. Requirements implement Capabilities. Capabilities are pretty much fixed while the Requirements evolve. Capabilities Based Planning is the basis of project management in many Software Intensive Systems.
Capabilities based planning (v2) from Glen Alleman

With Capabilities in hand, the development of the system moves to the Systems Engineering and Requirements process:
The project's capabilities must be defined to the level needed to start the project - failing to do this results in a Death March at worst, and in spending the customer's money to discover what should have been discovered before starting. With the capabilities in hand, the project needs to be managed in a way that will increase the probability of success.
So when you hear of some new approach to project management, ask if there is any connection to a domain and a context in that domain. Because there are many ideas about how to improve the probability of project success. But without a domain and context it'll be hard to assess if they are applicable to your specific situation. Here's one way to think about this domain dependency. From solo projects to national assets, the methods, processes, and tools are different, as is the value at risk.
Paradigm of agile project management from Glen Alleman
In the mathematics of physics, there are two essential types of values in all calculations: Scalars and Vectors.

Scalars are isolated values, with no outside context. Indeed, they remain the same regardless of any context. A common example would be mass. An object has a mass of 1 kilogram no matter where it is, or how much physical space it occupies. The context of the object cannot change the scalar value of its mass.

Vectors are contextual values, and can change depending on that context. An object has weight, dependent on both its mass value and its gravity context. An object with high mass may still have no weight in the corresponding gravity context.

But most specifically, vector values allow the calculation of change over time. Numbers (scalars) without context (vectors) are not metrics.
It is meaningless to say that the cost to operate the IT Service Desk has doubled within the last ten years, without also showing how the number of employees has tripled in the same time.

It is meaningless to say a self-selected 120 projects exceeded their estimated cost and duration without an assessment of the credibility of the original estimate and the determination of the Root Cause of that overage.

It is meaningless to say the end date and cost can be forecast without saying something about the underlying uncertainties in effort size, risk, inter-dependencies, changing requirements, defect rates, labor absorption rates, integration issues, performance issues, and the complexities of emergent behaviours once the system starts to come together and is applied to fulfill the needed capabilities.
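The Service Desk example can be made concrete. The figures below are assumed for illustration; the point is that the same scalar tells the opposite story once its context is attached.

```python
# Assumed figures for the Service Desk example -- illustrative only.
cost_then, cost_now = 1_000_000, 2_000_000  # annual operating cost, 10 years apart
staff_then, staff_now = 500, 1_500          # employees served

# Scalar view: cost "doubled" -- sounds like runaway spending.
scalar_growth = cost_now / cost_then

# Vector view: the cost in its context -- cost per employee served.
unit_cost_then = cost_then / staff_then
unit_cost_now = cost_now / staff_now

print(scalar_growth)                  # growth factor of the raw cost
print(unit_cost_then, unit_cost_now)  # unit cost actually fell
```

The raw cost doubled while the cost per employee served fell by a third: the scalar without its denominator is a number, not a metric.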
When we hear about small-sample forecasts of same-sized work activities, or selecting the next priority item to work on without considering the inter-dependencies of past work items or future work items - we're speaking about Scalar numbers, not Vector numbers.
Vectors state magnitude and direction. Open Loop control only states magnitude. Use it at your own risk.
Much of the noise around agile is based on platitudes and words in the absence of a domain and a context in that domain. Here's a possible anchoring process:
Product development kaizen (pdk) from Glen Alleman

And an example paper based on these principles.
Speaking at the Integrated Program Management Conference in Bethesda MD this week. The keynote speaker Monday was Katrina McFarland, Assistant Secretary of Defense (Acquisition) (ASD(A)), the principal adviser to the Secretary of Defense and Under Secretary of Defense for Acquisition.

During her talk she spoke of the role of Earned Value Management. Here's a mashup of her remarks...
EV is a thoughtful tool as the basis of a conversation for determining the value (BCWP) produced by the investment (BCWS). This conversation is an assessment of the efficacy of our budget.

We can determine the efficacy of our budget through:

These measures answer the question of what is the efficacy of our budget in delivering the outcomes of our project.
The value of the project outcomes must be traceable to a strategy for the business or mission. Once this strategy has been identified, the Measures of Effectiveness, Measures of Performance, and Technical Performance Measures can be assigned to the elements of the project. These are shown in the figure below.
This approach is scalable up and down the complexity of projects based on five immutable principles of project success.
Without credible answers to each of these questions, the project is on the path to failure on day one.
The most unsuccessful three years in the education of cost estimators appears to be fifth-grade arithmetic - Norman R. Augustine
Augustine is former Chairman and CEO of Martin Marietta. His seminal book Augustine's Laws describes the complexities and conundrums of today's business management and offers solutions. Anyone interested in learning how successful management of complex technology-based firms is done should read that book. As well, read McConnell's book and see if you can find where...

Because I sure can't find that proof, or any mention that estimates don't work, other than for those who failed to pay attention in the 5th grade arithmetic class.
Hugh MacLeod's art for Zappos provides the foundation for trust in that environment:
If I'm the head of HR, I'm responsible for filling the desks at my company with amazing employees. I can hold people to all the right standards. But ultimately I can't control what they do. This is why hiring for culture works. What Zappos does is radical because it trusts. It says "Go do the best job you can do for the customer, without policy". And leaves employees to come up with human solutions. Something it turns out they're quite good at, if given the chance.
Now let's take another domain, one I'm very familiar with: fault-tolerant process control systems. Software and support hardware applied to the emergency shutdown of exothermic chemical reactors - those that make the unleaded gasoline for our cars - nuclear reactors and conventionally fired power generation, gas turbine controls, and other must work properly machines. And a similar domain of DO-178C flight control systems, which must equally work without fail and provide all the needed capabilities on day one.

At Zappos, the HR Director describes a work environment where employees are free to do the best job they can for the customer. In the domains above, employees also work to do the best job for the customer they can, but flight safety, life safety, and equipment safety are also part of that best job. In other domains where we work, doing the best job for the customer means processing, with extremely low error rates, transactions worth hundreds of millions of dollars in the enterprise IT paradigm: medical insurance provider services, HHS enrollment, enterprise IT in a variety of domains.

Zappos can recover from an error; these other domains can't. Nonrecoverable errors mean serious loss of revenue, or even loss of life. I come from those domains, and they inform my view of the software development world - where software fail safe and fault tolerance are the basis of business success.

So when we hear about the freedom to fail early and fail often in the absence of a domain or context, care is needed. Without a domain and context, it is difficult to assess the credibility of any concept, let alone one that is untested outside of personal anecdote. It comes down to Trust alone or Trust But Verify. I could also guarantee that Zappos has some of the verify process. It is doubtful employees are left to do anything they wish for their customer. The simple reason is that there is a business governance process at any firm, no matter the size. Behaviour, even full trust behaviour, fits inside that governance process.
All the rhetoric around any idea needs actionable outcomes that can be tested in the marketplace, beyond the personal anecdotes of self-selected conversations.
The question asked by #NoEstimates is in the form of a statement.
On the surface this statement sounds interesting until the second sentence. The microeconomics of writing software for money is based on estimating future outcomes that result from current-day decisions. But let's pretend for a moment that micro econ is beyond consideration - this is never true, but let's pretend.

The next approach is to construct a small decision tree that can invert the question. Forget the exploring, since all business effort is a zero sum game, in that someone has to pay for everything we do: exploring, coding, testing, installing, training, even operating.
So let's start down the flow chart.
Is It Your Money?
In the crass world of capitalism, money talks, BS walks. While this may be abhorrent to some, it's the way the world works, and unless you've got your own bank, you're likely going to have to use other people's money to produce software - either for yourself or for someone else. Self-funded start-up? No problem, but even the best-known names in software today went on to raise more money to move the firm forward. Then self-funded became venture funded, private equity funded, and then publicly funded.

If you're writing software for money, and it's not your money, those providing the money have - or should have if they're savvy investors - a vested interest in knowing how much this will cost, when it will be done, and most importantly, what will be delivered during the work effort and at the end.
This requires estimating
Is There A Governance Policy Where You Work?
Governance of software development, either internal projects, external projects, or product development, is a subset of corporate governance.
If you work at a place that has no corporate governance, then estimating is probably a waste.
If however, you work somewhere that does have a corporate governance process - no matter how simple - and this is likely the case when there is a non-trivial amount of money at risk, then someone, somewhere in the building has an interest in knowing how much things will cost before you spend the money to do them or buy them.

This requires estimating
What's the Value at Risk for Your Project?
If the value at risk for a project is low - that is if you spend all the money and consume all the time and produce nothing and the management of your firm writes that off as a loss without much concern, then estimating probably doesn't add much value.
But if those providing you the money have an expectation that something of value will be returned and that something is needed for the business, then writing off the time and cost is not likely to be seen as favorable to you, the provider.
We trusted you because you said "trust me" and you didn't show up on or before the planned time, at or below the planned budget, with the needed capabilities - and you didn't want to estimate those up front and keep us informed about your new and updated Estimate To Complete and Estimate At Complete so we could take corrective actions to help you out - then we're going to suggest you look for work somewhere else.
On low-value projects, estimating the probability of success, of the cost of that success, and of the completion date of that success is not likely of much value.
But using the Value at Risk paradigm, the risk of loss of a specific asset (in this case the value produced by the project) is defined as a threshold loss value, such that the probability that the loss of value from the project over the given time horizon exceeds that threshold is held to some fixed level.
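A sketch of that definition in code, using simulated project outcomes. The planned value and spread are assumed for illustration; the 95% VaR is the shortfall exceeded in only 5% of the simulated futures.

```python
import random

random.seed(7)

# Hypothetical project: planned delivered value $2.0M, actual value uncertain.
planned = 2_000_000
outcomes = [random.gauss(planned, 400_000) for _ in range(10_000)]

# Loss is the shortfall of delivered value against the plan.
losses = sorted(planned - v for v in outcomes)

# 95% Value at Risk: the loss threshold exceeded only 5% of the time.
var_95 = losses[int(0.95 * len(losses))]

print(f"95% VaR: ${var_95:,.0f}")
```

A project whose 95% VaR is a rounding error in the budget can run on trust alone; one whose VaR threatens the business case needs the verify loop.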
As an aside, the notion of slicing does not reduce the Value at Risk. Slicing is an estimating normalization process where the exposure to risk is reduced to same-sized chunks. But the natural and event-based variability of the work is still present in the chunks, and the probability of impacting the outcome of the project - due to changes in demand, productivity, defects, rework, and unfavorable and even unanticipated interactions between the produced chunks - needs to be accounted for in some estimating process. As well, the sunk cost of breaking down the work into same-sized chunks needs to be accounted for.

In our space and defense world, there is the 44 Day Rule, where chunks of work are broken down into 44 days (2 financial months) with tangible deliverables. The agile community would consider this too long, but they don't work on National Asset billion-dollar software-intensive programs, so ignore that for the moment.

So yes, slicing is a good project management process. But using the definition of No Estimates in the opening, slicing is Estimating, done in every credible project management process, usually through the Work Breakdown Structure guide.
The Five Immutable Principles of Project Success
To increase the probability of project success, five immutable principles must be in place and have credible answers to their questions. Each requires some form of an estimate, since the outcomes from these principles are in the future. No amount of slicing and dicing is going to produce a non-statistical or non-probabilistic outcome. All slicing does - as mentioned before - is reduce the variance of the work demand, not the work productivity, the variance in that work productivity process, the rework due to defects, or any unidentified dependencies between those work products that will create uncertainty and therefore risk to showing up on time, on budget, and on specification.
The Devil Made Me Do It
Those of us seeking an evidence-based discussion about the issues around estimating - and there is an endless supply of real issues with real solutions - have pushed back on using Dilbert cartoons. But I just couldn't resist today's cartoon.
When we need to make a decision between options - Microeconomics and opportunity costs - about some outcome in the future, we need an estimate of the cost and benefit of that choice. To suggest that decisions can be made without estimates has little merit in the real world of spending other people's money.

No Estimates Needs to Come In Contact With Those Providing the Money
The first is the self-selection problem of statistics. This is the Standish problem. Send out a survey, tally the results from those that were returned. Don't publish how many surveys went out and how many came back.
These are both members of the cherry-picking process. The result is lots of exchanges of questions about the original conjecture that have no basis in evidence for the conjecture.
When you encounter such a conjecture, apply Sagan's BS detection kit:
When there is push back from hard questions, you'll know those making the claims have no evidence and are essentially BS'ing their constituents.
There is this notion in some circles that trust trumps all business management processes.
"Доверяй, но проверяй. Лучше перебдеть, чем недобдеть."
A rough translation is Trust, but Verify - better to be over-careful than not careful enough.

President Reagan reflected that proverb back to the Russians during arms treaty negotiations. So what does it mean to trust that people can think for themselves and decide if it applies to them... that estimates of the cost, performance, and schedule for the project are not needed?
The first question is: what's the value at risk? Trust alone is likely possible with low value at risk. In that case, not showing up on or before the needed time, at or below the needed cost, or with all the needed capabilities for the mission or business case fulfillment has much less impact and is therefore acceptable.
Trust but Verify

Trust alone: 6-week DB update with 3 developers.
Trust but verify: 18-month ERP integration with 87 developers whose performance is reported to the BoD on a quarterly basis.

Trust alone: Water filter install in the kitchen using the local handyman.
Trust but verify: Water filter install in the kitchen with my wife testing to see if it does what it said in the brochure.

Trust alone: Install the finch feeder on the pole attached to the back deck in front of the kitchen window overlooking the golf course.
Trust but verify: Design and build the 1,200-square-foot deck attached to the second floor on the back of the house using the architect's plans, and schedule the county inspection certificate so the deck can be used this summer.

Trust alone: Arrange for a ride in a glider at the county airport sometime Saturday afternoon.
Trust but verify: Plan departure from DIA and check for departure delays of the SWA flight DEN to DCA.
In the first instances - the trust alone cases - trust us, we'll be done in the 6-week window probably means that team doesn't need to do much estimating, other than to agree among themselves that the Promise made to the manager has a good chance of coming true.

In the second - the trust but verify cases - a $178M ERP integration project in a publicly traded firm, filing its 10-K and subject to FASB 86, has promised the shareholders, the insured, and the provider network that the new system will remove all the grief of the several dozen legacy apps - on or before the Go Live date announced at the Board Meeting and in the Press.

To assess the chance of that promise coming true, more than Trust is needed. Evidence of the probability of completing on or before the go live date and at or below the target budget is needed. That probability is developed with an estimating process and updated on a periodic basis - in this case every month, with a mid-month assessment of the month end's reportable data.
So next time you hear...
...think of the Value at Risk, the fiduciary responsibility to those funding your work to ask for and produce an answer to the question of how much, when, and what will be delivered. And even possibly the compliance responsibility - SOX, CMS, FAR/DFARS, ITIL - for knowing to some degree of confidence the Estimate To Complete and the Estimate At Complete for your project. Writing 6-week warehouse apps? Probably not much value. Spending hundreds of millions of stockholders' money and betting the company? Then knowing something like those numbers is likely needed.
Trust Alone is Open Loop. Trust but Verify is Closed Loop.
Control systems from Glen Alleman

Without knowing the Value At Risk it's difficult if not impossible to have a conversation about applying any method of managing the spend of other people's money. Here's a clip from another book that needs to be on the shelf of anyone accountable for spending money in the presence of a governance process: Practical Spreadsheet Risk Modeling. Don't do risk management of other people's money? Then you don't need this book or similar ones, and likely don't need to estimate the impact of decisions made using other people's money. Just keep going; your customer will tell you when to stop.
1024 - 2014
Thanks to Mr. Honner, a mathematics teacher at Brooklyn Technical High School. If you like mathematics and appreciate the contribution a good teacher can make to mathematical understanding - which is woefully lacking in our project management domain - sign up to get his blog posts.
The #NoEstimates movement appears to be based on a 27-year-old report† that provides examples of FORTRAN and PASCAL programs as the basis on which estimating is done.

A lot has happened since 1987. For a short critique of the Software Crisis report - which is referenced in the #NoEstimates argument - see "There is No Software Engineering Crisis."

Thousands of research and practicum books and papers on how to estimate software projects have been published. Maybe it's time to catch up with the 21st-century approach of estimating the time, cost, and capabilities needed to deliver value for those paying for our work. These approaches answer the mail in the 1987 report, along with the much-referenced NATO Software Crisis report published in 1968.

While estimates have always been needed to make decisions in the paradigm of the microeconomics of software development, the techniques, tools, and data have improved dramatically in the last 27 years. Let's acknowledge that and start taking advantage of the efforts to improve our lot in life as good stewards of other people's money. And when we hear #NoEstimates can be used to forecast completion times and costs at the end, test that idea with the activities in the Baloney Claims checklist.
† #NoEstimates is an approach to software development that arose from the observation that large amounts of time were spent over the years in estimating and improving those estimates, but we see no value from that investment. Indeed, according to scholars Conte, Dunsmore, and Shen, a good estimate is one that is within 25% of the actual cost, 75% of the time. - http://www.mystes.fi/agile2014/

As a small aside, that's not what the statement actually says in the context of statistical estimating. It says there is a 75% confidence that there will be an overage of no more than 25%, which needs to be covered with management reserve to protect the budget. Since all project work is probabilistic, uncertainty is both naturally occurring and event-based. Event-based uncertainty can be reduced by spending money. This is a core concept of Agile development: do small things to discover what will and won't work. Naturally occurring uncertainty can only be handled with margin. In this statement - remember, it's 27 years old - there is a likelihood that a management reserve of up to 25% will be needed 25% of the time a project estimate is produced. If you know that ahead of time, it won't be a disappointment when it occurs 25% of the time.
This is standard best management practice in mature organizations. In some domains, it's mandatory to have Management Reserve built from Monte Carlo simulations using reference classes of past performance.
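A sketch of how that reserve falls out of a simulation. The cost distribution below is an assumed lognormal, not a real reference class; the Management Reserve is simply the gap between the point estimate and the chosen confidence level.

```python
import random

random.seed(3)

# Assumed right-skewed cost-outcome distribution around a $1.0M point
# estimate -- illustrative parameters, not real reference class data.
point_estimate = 1_000_000
outcomes = sorted(random.lognormvariate(0.0, 0.25) * point_estimate
                  for _ in range(10_000))

# Budget at the 75th percentile; Management Reserve covers the gap between
# the point estimate and that confidence level.
p75 = outcomes[int(0.75 * len(outcomes))]
reserve = p75 - point_estimate

print(f"75% confidence budget: ${p75:,.0f}  reserve: ${reserve:,.0f}")
```

Knowing the reserve ahead of time is the point: the 25%-of-the-time overage is planned for, not a surprise.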