Software Development Blogs: Programming, Software Testing, Agile Project Management

Methods & Tools


Herding Cats - Glen Alleman
Performance-Based Project ManagementŹ Principles, Practices, and Processes to Increase Probability of Project Success

Estimating Accuracy Mathematics

Fri, 03/24/2017 - 01:55

In the estimating business, like many things in project management, there is confusion about principles, practices, and processes. And sometimes even outright misinformation. 

Here's an example used by the #NoEstimates advocates. It traces to a 1986 book, which contains a sentence that says more or less what the slide below says:

A good estimation approach should provide estimates that are within 25% of the actual results, 75% of the time

The book this statement comes from is Conte, S. D., H. E. Dunsmore, and V. Y. Shen, Software Engineering Metrics and Models, Menlo Park, CA: The Benjamin/Cummings Publishing Company, Inc., 1986. The statement appears on pages 172-175 of the book, open on my desk right now. Steve McConnell abstracted the original page content into those words. 

 


And the words on pages 172 to 175 speak about the Magnitude of Relative Error (MRE) and the Mean Magnitude of Relative Error (MMRE). The term within 25% refers to the Mean Relative Error; that is, on average, the estimate is within 25% of the actual value - the estimated value compared to the real value.

So if the actual value - after we are done - is $25,000, then any estimate between $18,750 and $31,250 is within 25% of that actual, and that's a good start.

In other words, if the error of our estimate is less than 25% of the actual outcome, 75% of the time, we're doing pretty well early in the project - possibly on day one. In our NASA Software Intensive System of Systems business, we need an 80% confidence basis of estimate in the proposal - a 20% MRE. Conte, Dunsmore, and Shen's number was a 75% confidence level in 1986.
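These metrics are easy to compute. Here is a minimal sketch of MRE, MMRE, and the PRED(0.25) quality criterion described by Conte, Dunsmore, and Shen; the (actual, estimate) pairs are illustrative numbers only, not data from the book:

```python
# Sketch of the accuracy metrics (hypothetical data).
#   MRE_i      = |actual_i - estimate_i| / actual_i
#   MMRE       = mean of the MRE values
#   PRED(0.25) = fraction of estimates whose MRE is <= 25%

def mre(actual, estimate):
    return abs(actual - estimate) / actual

def mmre(pairs):
    return sum(mre(a, e) for a, e in pairs) / len(pairs)

def pred(pairs, level=0.25):
    return sum(1 for a, e in pairs if mre(a, e) <= level) / len(pairs)

# (actual, estimate) pairs - illustrative numbers only
history = [(25_000, 20_000), (40_000, 44_000), (10_000, 16_000), (30_000, 27_000)]

print(f"MMRE       = {mmre(history):.2f}")   # → 0.25
print(f"PRED(0.25) = {pred(history):.2f}")   # → 0.75
```

This sample set happens to land exactly on the "within 25% of the actual results, 75% of the time" criterion: three of the four estimates have an MRE at or below 0.25.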

We use Monte Carlo Simulation tools and Method of Moments algorithms, drawing on very large historical databases - the holy grail of empirical forecasting - and apply analogous and parametric models for work that is new, to get these numbers.
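The idea behind those tools can be sketched in a few lines. This is not the tooling we actually use, just a minimal bootstrap-style Monte Carlo: resample historical productivity rates (hypothetical numbers) for each feature of the new scope, then read the effort at the desired confidence level off the simulated distribution:

```python
import random

random.seed(7)

# Hypothetical historical productivity samples (hours per story point)
history = [8.0, 9.5, 7.2, 11.0, 8.8, 10.1, 9.0, 12.3, 7.9, 9.6]

def simulate_effort(feature_points, trials=10_000):
    """Bootstrap total effort: resample a historical rate for each feature."""
    totals = []
    for _ in range(trials):
        totals.append(sum(pts * random.choice(history) for pts in feature_points))
    return sorted(totals)

# Analogous estimate for a new 120-point scope, split across ten features
totals = simulate_effort([8, 13, 5, 21, 13, 8, 5, 20, 13, 14])
p80 = totals[int(0.80 * len(totals))]   # effort at 80% confidence
print(f"80% confidence effort: {p80:.0f} hours")
```

Reading the 80th percentile, rather than the mean, is what turns the historical data into a basis of estimate at a stated confidence level.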

The statement quoted by the #NoEstimates advocates does NOT mean that every estimate is within 25% of the actual. It means the Mean Relative Error of the estimates is within 25%. They would know that if they read the book and stopped echoing someone else's poorly translated mathematics.

This is a serious error in understanding the principles of estimating, and this error is repeated throughout the #NoEstimates community. It's time to put it right. 

Please go buy Software Engineering Metrics and Models - it's cheap and packed full of the mathematics needed to actually perform credible estimating on software intensive systems. And download the paper that followed, "A Software Metrics Survey." While you're at it, buy Estimating Software Intensive System of Systems, and you too can start debunking the #NoEstimates hoax that decisions can be made in the presence of uncertainty without estimating the impact of those decisions.

The only way that can happen is if there is no uncertainty, the future is like the past, there is no risk - reducible or irreducible - and nothing changes. 
 

 

 

Related articles
  ‱ Risk Management is How Adults Manage Projects
  ‱ Information Technology Estimating Quality
  ‱ Herding Cats: The Fallacy of Wild Numbers
  ‱ Herding Cats: Quote of the Day
  ‱ Want To Learn How To Estimate?
  ‱ Two Books in the Spectrum of Software Development
  ‱ Essential Reading List for Managing Other People's Money
  ‱ What Happened to our Basic Math Skills?
  ‱ The Fallacy of the Planning Fallacy
Categories: Project Management


Quote of the Day

Tue, 03/21/2017 - 22:43

“The real value of computers is communication, not computation.”
- Natasha Kalatin

Categories: Project Management


Quote of the Day

Mon, 03/20/2017 - 15:44

You can sway a thousand men by appealing to their prejudices quicker than you can convince one man by logic. – Robert A. Heinlein, Revolt in 2100

Categories: Project Management


Where is the Adult Supervision on this Program?

Mon, 03/20/2017 - 01:34

Most programs I work on are in trouble in some form - otherwise they would not have hired us to help. 

One quote we use to describe these situations is

What's the difference between this program and the Boy Scouts? The Boy Scouts have adult supervision.

But today I sat through an out-brief from a government auditing agency on a $10B program, and one of my colleagues made this statement:

This program is like a house with 4 teenage boys, where the parents went on vacation and never returned.

When we encounter these situations, it is tempting to start providing solutions. This is a serious mistake unless we have done the Root Cause Analysis of the observed problem. It may turn out that what is being observed is a symptom of a root cause not understood by the people who hired us.

This is a rookie mistake in the process improvement business. I see it all the time. Some self-proclaimed thought leader pontificates on what is wrong with the software industry and makes blanket statements - for example, that current software development methods have not made much difference in the general success of software development (the software crisis). That is according to the much-debunked Standish Reports.

The charts showing software project failures, challenged or otherwise, never seem to identify a root cause of the failure or of the less-than-desired performance. The same is the case with those advocating that the Cone of Uncertainty doesn't match how the work proceeds on their projects. 

Why? Why are you seeing what you're seeing? What are the possible causes of these observations?

Like most voices pointing out obvious difficulties in writing software for money, without identifying the Root Cause, the proposed solutions have no basis for testing whether their fix will actually fix anything. 

Most of the time, when we come onboard to put a project back on track, we find missing processes, missing data produced by those processes, and, most of the time, people behaving badly. Now, you could make the case that it starts with the people, which is a noble conjecture and could possibly be true. 

But the naive notion that people can operate on a non-trivial project in the absence of a process is just that - naive. 

One place to start is Identifying Acquisition Framing Assumptions Through Structured Deliberation 

Related articles
  ‱ Estimating and Making Decisions in Presence of Uncertainty
  ‱ The Dysfunctional Approach to Using "5 Whys"
  ‱ Managing in Presence of Uncertainty
  ‱ Herding Cats: Cone of Uncertainty - Part Cinq (Updated)
  ‱ Five Estimating Pathologies and Their Corrective Actions
  ‱ Are Estimates Really The Smell of Dysfunction?
  ‱ Estimating Probabilistic Outcomes? Of Course We Can!
  ‱ Herding Cats: Software Development for the 21st Century
Categories: Project Management

Quote of the Day

Sun, 03/19/2017 - 01:55

Insight, untested and unsupported, is an insufficient guarantee of truth. - Bertrand Russell, Mysticism and Logic 1929

When we hear an extraordinary claim, demand extraordinary evidence.

Categories: Project Management

Operations, Finance, and Accounting for the Development of Software

Fri, 03/17/2017 - 21:21

It's common to hear that projects are overhead, and we just need to get the value to the customer as fast as possible. This seems to be a lament from agile developers. Anything getting in the way of their coding is seen as an impediment. 

Here's a little story.

Long ago (1978) in a land far away (Redondo Beach) I too wrote code, and continued to write code for a decade or so after that. This code was for a Missile Defense system - Cobra Dane and a few satellites that supported it. My boss was a crusty engineer with a long history of delivering products to the DOD that showed up on time and on budget and worked in the defense of the nation. When we young pups arrived to move the development processes into the next century (minicomputers, high-level languages, integrated development environments) from the previous generation of punching holes in pieces of paper - either card stock or rolls of paper tape - we thought we were clearly superior to the past generation and were proud to tell anyone who would listen how we had a lock on what to do best.

After tolerating our attitudes for a few months - and it was good work, which is why we were hired and continued to work there for many years - Fred had a little talk with us. It was clear many years later that he was a good development manager who waited for the appropriate time to coach and mentor us. In those days we got a paper paycheck and took it across the street to the Bank of America and deposited it every other Friday. The checks were handed out in envelopes by Fred personally. 

Take out your paycheck boys (very few women in those days, although my next boss - Carol - was one of the best) and look in the upper left-hand corner. It says Bank of America and the name of our company (TRW). Well, I just want to clear up a misconception here. That's not where that money comes from. The money on that check comes from the United States Air Force. The Bank of America is just holding it until you cash it. It's the United States Air Force that pays your salary and mine, and the United States Air Force wants us to do our work in the way they want, not the way you want or the way you think they want. So please consider that the next time you get some cockamamie idea that your way of doing things is better than the way the United States Air Force thinks you should do it. It's their money.

He liked to emphasize the full name of the customer, pointing out the blue suiters walking around Building O6 in Redondo Beach not as just customers, but as US Air Force customers. I work in that same domain 30 years later. I eat lunch with uniformed staff, shop at noon in stores full of uniformed personnel, and stay at hotels where uniformed personnel are eating. It's a unique environment, but one where Process is King, since the money for our work comes from a sovereign.

That was a long story, so here's the point.

The notion that process and overhead are a waste is true in some sense.

Too much process, too much overhead is a waste. But it's not your money so process and overhead are part of spending that money. And most of all, the amount of process and overhead is not likely your decision unless that decision has been assigned to you. This is how business works.

A Scrum of people works very well when doing Scrum, but operations, finance, and accounting are not the same as writing software for money. A Scrum of cost accountants, planners, finance, and operations people is called chaos. It works for coders, not so much for the others.

Projects are operational, financial, and accounting vehicles for managing and keeping track of that money. What I learned from Fred was this...

Without an understanding of where my paycheck came from, how the customer's money was turned into salaries, those salaries turned into products, and those products turned into revenue, we (all us young pups) were going to continue in our roles of being just labor. 

In those days, TRW sent a few of us back to school (MS, Systems Management, USC) to learn how to put to work the words in the book of one of the founders of our little firm, The Management of Innovative Technological Corporations, by Simon Ramo, the R in Thompson Ramo Wooldridge.

There don't seem to be many Freds in the world of the internet. They're still there at my clients, though. It's a Value at Risk thing, in my opinion. 

If you hear #NoProjects, #NoManagement, #NoEstimates, or #NoPlans, and the people saying it are at a firm making good margins on millions of dollars of revenue, good for them.

But ask how they submit the month-end financial report to the investors or stockholders. You may find they account for all that money in some form of a project - a finite, bounded set of work that converts cost into revenue, with what's left being profit. Those numbers are assigned to discrete buckets of cost and revenue, and they call that a project, since it's a handy way to keep all the money from getting mixed up.

Related articles
  ‱ Architecture-Centered ERP Systems in the Manufacturing Domain
  ‱ IT Risk Management
  ‱ Why Guessing is not Estimating and Estimating is not Guessing
Categories: Project Management

Quote of the Day

Fri, 03/17/2017 - 05:59

If you don’t pay appropriate attention to what has your attention, it will take more of your attention than it deserves. 
— David Allen, Consultant

Since all project work is uncertain, with both reducible and irreducible randomness, we need to pay attention to the root causes and outcomes that impact the probability of our project's success. To make decisions on how to handle the risk created by this uncertainty, we need to estimate both the drivers of this risk and the consequences of the risk. 


Risk Management is How Adults Manage Projects - Tim Lister

Related articles
  ‱ Architecture-Centered ERP Systems in the Manufacturing Domain
  ‱ IT Risk Management
  ‱ Managing in Presence of Uncertainty
  ‱ Why Guessing is not Estimating and Estimating is not Guessing
  ‱ Making Conjectures Without Testable Outcomes
  ‱ Deadlines Always Matter
  ‱ Managing Projects By The Numbers
  ‱ Strategy is Not the Same as Operational Effectiveness
Categories: Project Management

Quote of the Day

Thu, 03/16/2017 - 13:52

Ignorance more frequently begets confidence than does knowledge: it is those who know little, and not those who know much, who so positively assert that this or that problem will never be solved by science. - Charles Darwin, Introduction, The Descent of Man (1871)

Categories: Project Management

Case Studies versus Anecdotes

Wed, 03/15/2017 - 21:15

It's common in the software development domain, especially in agile, to provide anecdotes in support of some suggested change that worked well for the person conveying the anecdote, but with no verification that the suggestion works anywhere else.

Here's some background on how to conduct a case study to support those suggestions.


Related articles
  ‱ The Microeconomics of Decision Making in the Presence of Uncertainty
  ‱ Why We Need Governance
  ‱ Herding Cats: Software Development for the 21st Century
  ‱ Two Books in the Spectrum of Software Development
Categories: Project Management

Misunderstanding Making Decisions in the Presence of Uncertainty (Update Part 2)

Tue, 03/14/2017 - 17:12

There was a Tweet a few days ago from one of the founders of eXtreme Programming, that said...

What happens if you shift focus from "accurate estimation" to "reliably shipping by a date"? 

This quote shows a missing understanding of the processes for making decisions in the presence of uncertainty, and of the processes and events that create the uncertainty that impacts the reliability of making the date to ship value.

The answer is ... you can't shift focus from accurate estimation to reliably shipping by a date ...

Accurate and precise estimates (to predefined values, as shown in the target picture below) are needed before you can reliably ship products by a date, because you can't know that date with any needed level of confidence without making estimates about the reducible and irreducible uncertainties that impact that date.

So the answer to the question is:

In the presence of uncertainty, you can't reliably ship by a date without estimating the impact of those uncertainties on the probability of making the date.

Since uncertainty creates risk, and managing in the presence of uncertainty requires Risk Management, we can now answer the question:

  • If you want a reliable shipping date, you have to discover and handle the uncertainties in the work needed to produce the outcomes to be delivered on that date.
  ‱ You have to estimate the schedule, cost, and technical performance margins needed to protect that date from the aleatory uncertainties.
  ‱ You have to estimate the probabilistic occurrence of the epistemic uncertainties that will impact that date, and provide a Plan B, an intervention, or some corrective action to protect that date.
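The steps above can be sketched as a small simulation. All the numbers here are illustrative assumptions, not data from any program: task durations carry aleatory variability (modeled as triangular spreads), and one hypothetical epistemic risk event occurs with a 20% probability and adds 10 days. The output is the probability of making a given ship date:

```python
import random

random.seed(11)

def p_make_date(deadline_days, trials=20_000):
    """Estimate the probability of shipping by a deadline, given aleatory
    duration variability plus one epistemic risk event (numbers illustrative)."""
    made = 0
    for _ in range(trials):
        # Aleatory: each task's duration varies irreducibly (low, mode, high days)
        duration = sum(random.triangular(low, high, mode)
                       for low, mode, high in [(8, 10, 16), (4, 5, 9), (10, 14, 22)])
        # Epistemic: assume a 20% chance a risk event adds 10 days;
        # a Plan B would reduce this probability or its impact
        if random.random() < 0.20:
            duration += 10
        if duration <= deadline_days:
            made += 1
    return made / trials

print(f"P(ship by day 35) = {p_make_date(35):.2f}")
print(f"P(ship by day 45) = {p_make_date(45):.2f}")
```

The gap between the two probabilities is the margin question in miniature: committing to day 35 without estimating these uncertainties is a coin flip dressed up as a plan.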

Each of these uncertainties creates risk to meeting that reliable shipping date. And as we all know

Risk Management is How Adults Manage Projects - Tim Lister

Details of the Answer to the Question

First, let's establish a principle. All project work has uncertainty. Uncertainty comes from the lack of precision and accuracy about the possible values of a measurement of a project attribute.

There is naturally occurring variability from uncontrolled processes, and there is the probability of occurrence of a future event.

These uncertainties come in two forms: naturally occurring variances (aleatory uncertainty) and event-based probabilities (epistemic uncertainty).

The naturally occurring variability comes from uncontrolled and uncontrollable processes. This uncertainty is modeled as a statistical distribution from past performance or from an underlying statistical process model, usually stochastic (stationary or non-stationary). The probability of a random event comes from the absence of knowledge. This uncertainty is modeled as a probability of occurrence.

A formal definition of uncertainty in the project decision-making paradigm is ...

Situation where the current state of knowledge is such that (1) the order or nature of things is unknown, (2) the consequences, extent, or magnitude of circumstances, conditions, or events is unpredictable, and (3) credible probabilities to possible outcomes cannot be assigned. 

If your project has no uncertainty, there is no need to estimate. Then the planned ship date is deterministic. All outcomes are certain, occurring with 100% probability, and with 0% variance.

Turns out in the real world there is no such project.

When we say uncertainty, we speak about a future state of the system that is not fixed or determined. Uncertainty is related to three aspects of the management of projects:

  1. The external world - the activities of the project itself.
  2. Our knowledge of this world - the planned and actual behaviors of the project.
  3. Our perception of this world - the data and information we receive about these behaviors.

Let's revisit the two flavors of uncertainty: uncertainty that can be reduced (epistemic) and uncertainty that cannot be reduced (aleatory).

Aleatory uncertainties are unknowns that differ each time we assess them. They are values drawn from an underlying population of possible values. They are uncertainties we can't do anything about; they cannot be suppressed or removed. The drive from my house to my secret parking spot on the east side of Denver International Airport is shown as 47 minutes by Google Maps. If I ask Google for the duration at a specific time of day, 3 days from now, I'll get a different number. When I get on the farm road to I-25, and then onto I-25 itself, I may find a different time again. This time reflects the random variances of distance and traffic conditions. I need margin to protect me from being late to the parking spot.

The naturally occurring variance in the work effort for developing a software feature - even if we've built the feature before - is an irreducible uncertainty. The risk is created when we have not accounted for these natural variances in our management plan for the project. If we do not have a sufficient buffer to protect the plan from these naturally occurring variances, our project will be impacted in unfavorable ways.
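Sizing that buffer is a percentile calculation. Here is a minimal sketch using the drive-to-the-airport example, with the aleatory variability modeled as an assumed lognormal spread around the 47-minute typical trip (the distribution choice and spread are illustrative, not measured):

```python
import math
import random

random.seed(3)

# Aleatory variability in the drive to the airport: hypothetical lognormal
# model with a typical (median) trip of 47 minutes.
mu, sigma = math.log(47), 0.15
samples = sorted(random.lognormvariate(mu, sigma) for _ in range(10_000))

typical = samples[len(samples) // 2]        # ~the Google Maps number
p85 = samples[int(0.85 * len(samples))]     # plan-to-this value
margin = p85 - typical                      # buffer that protects the date

print(f"typical: {typical:.0f} min, 85th percentile: {p85:.0f} min, "
      f"margin needed: {margin:.0f} min")
```

Planning to the typical time means being late roughly half the time; planning to the 85th percentile is what the margin buys.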

The notion (as suggested in the quote) of shifting from accurate (whatever that means) ways of estimating to reliably shipping by a date is not physically possible, since the irreducible and reducible uncertainties are always present.

Dealing with aleatory (irreducible) uncertainty and the resulting risk requires that we have margin. Aleatory uncertainty is expressed as process variability: work effort variances, productivity variances, and quality-of-product and resulting rework variances. An aleatory risk is expressed in relation to a value - a percentage of that value. This is the motivation for the short work intervals found in agile development. 

Epistemic uncertainties are systematic, caused by things we know about in principle - the probability of something happening. This uncertainty is introduced by a probabilistic event, rather than a naturally occurring process. Epistemic uncertainty is introduced by an assumption about the world in which the system is embedded. This assumption can come from the lack of data - an ontological uncertainty. Epistemic uncertainties have probabilities of occurrence - the likelihood of a failure, for example. Epistemic uncertainty can also occur when there is a subjective evaluation of the system - a risk from a rare event or an event with little or no empirical data. It can occur from the incompleteness of knowledge - a major hazard or condition not identified, or a causal mechanism that remains undetected. And it can occur from undetected design errors, introduced by ontological uncertainties into the system behavior. 


Before completing this post, let's look quickly at precision and accuracy, as mentioned in the original quote. All estimates have precision and accuracy. Deciding how much precision and accuracy is needed for a credible estimate is critical to the success of the decision it supports. One starting point is the value at risk: by determining the value at risk, we can determine how much precision and accuracy is needed, and how much time and cost we should put into the estimating process.


Let's go back to the original quote.

What happens if you shift focus from "accurate estimation" to "reliably shipping by a date"? 

With our knowledge of epistemic and aleatory uncertainty, we now know we cannot reliably ship by a date without knowing the extent of the reducible and irreducible uncertainties, without protecting that date with margin or reserve for the irreducible uncertainties, and without specific actions, redundancies, or interventions for the reducible uncertainties. To size that margin and reserve, and to assess the performance of those redundancies, we need to know the following. 

For our credible estimate, we must have a desired and measurable:

  • Precision - how small is the variance of the estimate?
  • Accuracy - how close is the estimate to the actual value?
  • Bias - what impacts on precision and accuracy come from human judgments?
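These three attributes can be measured directly against a set of (actual, estimate) pairs. A minimal sketch, using illustrative numbers and treating accuracy as the mean absolute relative error, precision as the spread of the relative errors, and bias as the mean signed relative error (these are common operationalizations, not definitions from the post):

```python
import statistics

# Hypothetical (actual, estimate) pairs for one team
pairs = [(25_000, 20_000), (40_000, 44_000), (10_000, 16_000), (30_000, 27_000)]

rel_errors = [(e - a) / a for a, e in pairs]   # signed relative error

accuracy = statistics.mean(abs(r) for r in rel_errors)   # how close, on average
precision = statistics.stdev(rel_errors)                 # how tight the spread
bias = statistics.mean(rel_errors)                       # systematic over/under

print(f"accuracy (mean |error|): {accuracy:.2f}")
print(f"precision (std dev):     {precision:.2f}")
print(f"bias (mean signed err):  {bias:.2f}")
```

A positive bias here would indicate systematic overestimation; a large spread with small bias means the estimates are honest on average but unreliable for any single decision.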

So in the end, if we are to make decisions in the presence of uncertainty, we MUST make estimates to develop a reliable shipping date, while producing an accurate and precise estimate of the cost, schedule, and technical performance of the product shipped on that date.

So it comes down to this, no matter how many times some claim otherwise, so I'll shout it to make it clear to everyone ...

YOU CANNOT MAKE A DECISION IN THE PRESENCE OF UNCERTAINTY (reducible or irreducible) WITHOUT MAKING ESTIMATES

A Short List of Resources for Managing in the Presence of Uncertainty

Risk Management is essential for development and production programs. Information about key project cost, performance, and schedule attributes is often uncertain or unknown until late in the program. Risk issues that can be identified early in the program, which will potentially impact the program later, termed Known Unknowns, can be alleviated with good risk management. - Effective Risk Management, 2nd Edition, Edmund Conrow, AIAA, 2003

 

Papers on Risk Management

  1. “Quantifying Uncertainty in Early Lifecycle Cost Estimation (QUELCE),” Robert Ferguson, Dennis Goldenson, James McCurley, Robert Stoddard, David Zubrow, and Debra Anderson, Technical Report, CMU/SEI-2011-TR-026 ESC-TR-2011-026
  2. “The Development of Progress Plans Using a Performance–Based Expert Judgment Model to Assess Technical Performance and Risk,” Justin W. Eggstaff, Thomas A. Mazzuchi, and Shahram Sarkani, Systems Engineering, Volume 17, Issue 4, Winter 2014, Pages: 375–391
  3. “Using the Agile Methodology to Mitigate the Risks of Highly Adaptive Projects,” Dana Roberson and Mary Anne Herndon, 10th Annual CMMI Technology Conference And User Group, November 5 – 8, 2012, Denver, CO
  4. “Hybrid–Agile Software Development Anti–Patterns, Risks, and Recommendations,” Paul E. McMahon, Cross Talk: The Journal of Defense Software Engineering, July/August 2015, pp. 22–26.
  6. “Assessment of risks introduced to safety critical software by agile practices — A Software Engineer’s Perspective,” Janusz Górski Katarzyna Ɓukasiewicz, AGH University of Science and Technology, University in Kraków, Poland, Computer Science, Vol 13, No 4.
  7. “Ready & Fit: Understanding Agile Adoption Risk in DoD and Other Highly Regulated Settings,” Suzanne Miller and Mary Ann Lapham, 25th Annual Software Technology Conference, Salt Lake City, 8-10 April 2013.
  8. “Architecting Large Scale Agile Software Development: A Risk–Driven Approach,” Ipek Ozkaya, Michael Gagliardi, Robert L. Nord, CrossTalk: The Journal of Defense Software Engineering, May/June 2013.
  9. “Risk Management Method using Data from EVM in Software Development Projects,” Akihiro Hayashi and Nobuhiro Kataoka, International Conference on Computational Intelligence for Modelling, Control and Automation, Vienna, Austria, Dec. 10 to Dec. 12, 2008.
  10. “Analyse Changing Risk of Organizational Factors in Agile Project Management,” Shi Tong, Chen Jianbin, and Fang DeYing, The 1st International Conference on Information Science and Engineering (ICISE2009).
  11. “Modeling Negative User Stories is Risky Business,” Pankaj Kamthan and Nazlie Shahmir, 2016 IEEE 17th International Symposium on High Assurance Systems Engineering.
  12. “Project Risk Management Model Based on PRINCE2 and Scrum Frameworks,” Martin Tomanek, Jan Juricek, The International Journal of Software Engineering & Applications (IJSEA), January 2015, Volume 6, Number 1, ISSN: 0975-9018
  13. “How to identify risky IT projects and avoid them turning into black swans,” Magne JĂžrgensen, Ernst & Young: Nordic Advisory Learning Weekend, Riga, 2016.
  14. “A Methodology for Exposing Software Development Risk in Emergent System Properties,” Technical Report 11-101, April 21, 2001, Victor Basili, Lucas Layman, and Marvin Zelkowitz, Fraunhofer Center for Experimental Software Engineering, College Park, Maryland.
  15. “Outlining a Model Integrating Risk Management and Agile Software Development,” Jaana Nyfjord and Mira Kajko-Mattsson, 34th Euromicro Conference Software Engineering and Advanced Applications.
  16. “Towards a Contingency Theory of Enterprise Risk Management,” Anette Mikes Robert Kaplan, Working Paper 13–063 January 13, 2014, AAA 2014 Management Accounting Section (MAS) Meeting Paper
  17. “Agile Development and Software Architecture: Understanding Scale and Risk,” Robert L. Nord, IEEE Software Technology Conference, 2012, Salt Lake City, 23-26 April, 2012.
  18. “Using Risk to Balance Agile and Plan Driven Methods,” Barry Boehm and Richard Turner, IEEE Computer, June 2003.
  19. “Does Risk Management Contribute to IT Project Success? A Meta-Analysis of Empirical Evidence,” Karel de Bakker, Albert Boonstra, Hans Wortmann, International Journal of Project Management, 2010.
  20. “A Model for Risk Management in Agile Software Development,” Ville Ylimannela, Communications of Cloud Software.
  21. “Product Security Risk Management in Agile Product Management,” Antti VĂ€hĂ€-SipilĂ€, OWASP AppSec Research, 2010
  22. “A Probabilistic Software Risk Assessment and Estimation Model for Software Projects,” Chandan Kumar and Dilip Kumar Yadav, Eleventh International Multi-Conference on Information Processing-2015 (IMCIP-2015)
  23. “Risk: The Final Agile Frontier,” Troy Magennis, Agile 2015.
  24. “Risk Management and Reliable Forecasting using Un-Reliable Data,” Troy Magennis, Lean Kanban, Central Europe, 2014.
  25. “Management of risks, uncertainties and opportunities on projects: time for a fundamental shift,” Ali Jaafari, International Journal of Project Management 19 (2001) 89-101.
  26. “On Uncertainty, Ambiguity, and Complexity in Project Management,” Michael T. Pich, Christoph H. Loch, and Arnoud De Meyer, Management Science © 2002 INFORMS, Vol. 48, No. 8, August 2002 pp. 1008–1023
  27. “Risk Options and Cost of Delay,” Troy Magennis, LKNA 2014.
  28. “Transforming project risk management into project uncertainty management,” Stephen Ward and Chris Chapman, International Journal of Project Management 21 (2003) 97–105.
  29. “Risk-informed decision-making in the presence of epistemic uncertainty,” Didier Dubois, Dominique Guyonnet, International Journal of General Systems, Taylor & Francis, 2011, 40 (2), pp. 145-167.
  30. “A case study of risk management in agile systems development,” Sharon Coyle and Kieran Conboy, 17th European Conference on Information Systems (Newell S, Whitley EA, Pouloudi N, Wareham J, Mathiassen L eds.), 2567-2578, Verona, Italy, 2009
  31. “Risk management in agile methods: a study of DSDM in practice,” Sharon Coyle, 10th International Conference on eXtreme Programming and Agile Processes in Software Engineering, 2009.
  32. “Distinguishing Two Dimensions Of Uncertainty,” Craig R. Fox and GĂŒlden ÜlkĂŒmen, in Perspectives on Thinking, Judging, and Decision Making, Brun, W., Keren, G., Kirkeboen, G., & Montgomery, H. (Eds.), 2011.
  33. “Two Dimensions of Subjective Uncertainty: Clues from Natural Language,” Craig R. Fox and GĂŒlden ÜlkĂŒmen.
  34. “An Essay Towards Solving a Problem in the Doctrine of Chance,” By the late Rev. Mr. Bayes, communicated by Mr. Price, in a letter to John Canton, M. A. and F. R. S.
  35. “Playbook: Enterprise Risk Management for the U.S. Federal Government, in support of OMB Circular A-123.”
  36. “Joint Agency Cost Schedule Risk and Uncertainty Handbook,” Naval Center for Cost Analysis, 12 March 2014.
  37. “Quantitative Risk ‒ Phases 1 & 2: A013 ‒ Final Technical Report SERC-2013-TR-040-3,” Walt Bryzik and Gary Witus, November 12, 2013, Stevens Institute of Technology
  38. “Distinguishing Two Dimensions of Uncertainty,” Craig Fox and GĂŒlden Ülkumen, in Perspectives of Thinking, Judging, and Decision Making
  39. “Using Risk to Balance Agile and Plan Driven Methods,” Barry Boehm and Richard Turner, IEEE Computer, June 2003
  40. “Commonalities in Risk Management and Agile Process Models,” Jaana Nyfjord and Mira Kajko-Mattsson, International Conference on Software Engineering Advances(ICSEA 2007).
  41. “Software Risk Management in Practice: Shed Light on Your Software Product,” Jens Knodel, Matthias Naab, Eric Bouwers, Joost Visser, IEEE 22nd International Conference on Software Analysis, Evolution, and Reengineering (SANER), 2015
  42. “Software risk management,” Sergey M. Avdoshin and Elena Y. Pesotskaya, 7th Central and Eastern European Software Engineering Conference, 2011.
  43. “A New Perspective on GDSD Risk Management Agile Risk Management,” Venkateshwara Mudumba and One-Ki (Daniel) Lee, International Conference on Global Software Engineering, 2010
  44. “Using Risk Management to Balance Agile Methods: A Study of the Scrum Process,” Benjamin Gold and Clive Vassell, 2nd International Conference of Knowledge-Based Engineering and Innovation, November 5-6, 2015
  45. “Using Velocity, Acceleration, and Jerk to Manage Agile Schedule Risk,” Karen M. Bumbary, 2016 International Conference on Information Systems Engineering
  46. “The Risks of Agile Software Development Learning from Adopters,” Amany Elbanna and Suprateek Sarker, IEEE Software, September/October 2016
  47. “Software Delivery Risk Management: Application of Bayesian Networks in Agile Software Development,” Ieva Ancveire, Ilze Gailite, Made Gailite, and Janis Grabis, Information Technology and Management Science, 2015/18.
  48. “Lightweight Risk Management in Agile Projects,” Edzreena Edza Odzaly, Des Greer, Darryl Stewart, 26th Software Engineering Knowledge Engineering Conference (SEKE), November 2015.
  49. “A Method of Software Requirements Analysis Considering the Requirements Volatility from the Risk Management Point of View,” Yunarso Anang, Masakazu Takahashi, and Yoshimichi Watanabe, 22nd International Symposium on QFD, Boise, Idaho.
  50. “Analyse Changing Risk of Organizational Factors in Agile Project Management,” Shi Tong, Chen Jianbin, and Fang DeYing, The 1st International Conference on Information Science and Engineering (ICISE2009)
  51. “Outlining a Model Integrating Risk Management and Agile Software Development,” Jaana Nyfjord and Mira Kajko-Mattsson, 34th Euromicro Conference Software Engineering and Advanced Applications. 2009.
  52. “How Do Real Options Concepts Fit in Agile Requirements Engineering?,” Zornitza Racheva and Maya Daneva, Eighth ACIS International Conference on Software Engineering Research, Management and Applications, 2010.
  53. NASA Risk Informed Decision Making Handbook, NASA/SP-2010-576, Version 1.0 April, 2010.
  54. “Managing Risk Within A Decision Analysis Framework,” Homayoon Dezfuli, Robert Youngblood, Joshua Reinert, NASA Risk Management Page.
  55. “NASA Risk Management Handbook,” NASA/SP-2011-3422, Version 1.0, November 2011.
  56. “Risk Management For Software Projects In An Agile Environment – Adopting Key Lessons From The Automotive Industry,” Oana Iamandi, Marius Dan, and Sorin Popescu, Conference: MakeLearn and TIIM Joint International Conference 2015: Managing Intellectual Capital and Innovation for Sustainable and Inclusive Society, Bari, Italy, 2015
  57. “Role of Agile Methodology in Software Development,” Sonia Thakur and Amandeep Kaur, International Journal of Computer Science and Mobile Computing, Volume 2, Issue 10, October 2013, pp. 86-90
  58. “Agile Risk Management Workshop,” Alan Moran Agile Business Conference, 08.10.2014, London England
  59. “Embrace Risk! An Agile approach to risk management,” Institute for Agile Risk Management, 2014.
  60. “Risks in distributed agile development: A review,” Suprika Vasudeva Shrivastava and Urvashi Rathod, Procedia - Social and Behavioral Sciences 133 ( 2014 ) 417 – 424, ICTMS-2013
  61. “Risk and uncertainty in project management decision-making,” Karolina Koleczko, Public Infrastructure Bulletin, Vol. 1, Issue. 8 [2012], Art. 13.
  62. “Uncertainty and Project Management: Beyond the Critical Path Mentality,” A. De Meyer, C. Loch, And M. Pich, INSEAD Working Paper, 2001.
  63. “Proposal of Risk Management Metrics for Multiple Project Software Development,” Miguel Wanderleya, JĂșlio Menezes Jr., Cristine GusmĂŁoa, Filipe Limaa, Conference on ENTERprise Information Systems / International Conference on Project Management / Conference on Health and Social Care Information Systems and Technologies, CENTERIS / ProjMAN / HCist 2015 October 7-9, 2015.
  64. “Outlining a Model Integrating Risk Management and Agile Software Development,” Jaana Nyfjord and Mira Kajko-Mattsson, 34th Euromicro Conference Software Engineering and Advanced Applications, 2008.
  65. “The Impact Of Risk Checklists On Project Manager’s Risk Perception And Decision-Making Process,” Lei Li, Proceedings of the Southern Association for Information Systems Conference, Savannah, GA, USA March 8th–9th, 2013.
  66. “Identifying The Risks Associated With Agile Software Development: An Empirical Investigation.” Amany Elbanna, MCIS 2014 Proceedings. Paper 19.
  67. “Uncertainty, Risk, and Information Value in Software Requirements and Architecture,” Emmanuel Letier, David Stefan, and Earl T. Barr, ICSE ’14, May 31 – June 7, 2014, Hyderabad, India
  68. “Software project risk analysis using Bayesian networks with causality constraints,” Yong Hu, Xiangzhou Zhang, E. W. T. Ngai, Ruichu Cai, and Mei Liu, Decision Support Systems 56 (2013) 439–449.
  69. “Risk Based Scrum Method: A Conceptual Framework,” Nitin Uikey and Ugrasen Suman, 2015 2nd International Conference on “Computing for Sustainable Global Development, 11th - 13th March, 2015.
  70. “Implementation of Risk Management with SCRUM to Achieve CMMI Requirements,” Eman Talal Alharbi, M. Rizwan Jameel Qureshi, International Journal Computer Network and Information Security, 2014, 11, 20-25.
  71. “Risk, ambiguity, and the Savage axioms,” Daniel Ellsberg, The Quarterly Journal of Economic, Vol. 75, No. 4, pp. 643-669, Nov 1961.
  72. “Analytical Method for Probabilistic Cost and Schedule Risk Analysis: Final Report,” Prepared for NASA, 5 April 2013.
  73. “Risk, Ambiguity, and the Savage Axioms,” Daniel Ellsberg, August 1961, RAND Corporation, Report P-2173.
  74. “Dealing with Uncertainty Arising Out of Probabilistic Risk Assessment,” Kenneth Solomon, William Kastenberg, and Pamela Nelson, RAND Corporation, R-3045-ORNL, September 1983.
  75. “Using Risk to Balance Agile and Plan-Driven Methods,” Barry Boehm and Richard Turner, IEEE Computer, June 2003.
  76. “Epistemic Uncertainty Analysis: An Approach Using Expert Judgment and Evidential Credibility,” Patrick Hester, International Journal of Quality, Statistics, and Reliability, Volume 2012, Article ID 617481, 8 pages
  77. “Representation of Analysis Results Involving Aleatory and Epistemic Uncertainty,” J. C. Helton, J. D. Johnson, W. L. Oberkampf, C. J. Sallaberry, Sandia Report SAND2008-4379, 2008.
Books on Risk Management
  1. Probabilistic Risk Assessment and Management for Engineers and Scientists, 2nd Edition, Ernest J. Henley and Hiromitsu Kumamoto, IEEE Press, 2000.
  2. Agile Risk Management and Scrum, Alan Moran, Institute for Agile Risk Management, 2014.
  3. Managing the Unknown: A New Approach to Managing High Uncertainty and Risk in Projects 1st Edition, Christoph H. Loch, Arnoud DeMeyer, and Michael Pich, Wiley, 2006.
  4. Identifying and Managing Project Risk: Essential Tools for Failure-Proofing Your Project, Tom Kendrick, AMACOM, 3rd Edition, March 2015
  5. Integrated Cost and Schedule Control in Project Management, Second Edition 2nd Edition, Ursula Kuehn, Management Concepts, 2010.
  6. Effective Risk Management, 2nd Edition, Edmund Conrow, AIAA, 2003.
  7. Effective Opportunity Management for Project: Exploiting Positive Risk, David Hillson, Taylor & Francis, 2004.
  8. Project Risk Management: Process, Techniques, and Insights, 2nd Edition, Chris Chapman and Stephen Ward, John Wiley & Sons, 2003.
  9. Managing Project Risk and Uncertainty: A Constructively Simple Approach to Decision Making, Chris Chapman and Stephen Ward, John Wiley & Sons, 2002
  10. Technical Risk Management, Jack Michaels, Prentice Hall, 1996.
  11. Managing Risk: Methods for Software Systems Development, Elaine Hall, Software Engineering Institute, Addison Wesley, 1998.
  12. Software Engineering Risk Management: Finding your Path Through the Jungle, Version 1.0, Dale Karolak, IEEE Computer Society, 1998.
  13. Risk Happens: Managing Risk and Avoiding Failure in Business Projects, Mike Clayton, Marshall Cavendish, 2011.
  14. Waltzing with Bears: Managing Risk on Software Projects, Tom Demarco and Timothy Lister, Dorset House, 2003.
  15. Software Engineering Risk Management, Dale Karolak, IEEE Computer Society Press, 1996.
  16. Practical Project Risk Management: The ATOM Methodology, David Hillson, Management Concepts Press, 2012.
  17. Risk Management in Software Development Projects, John McManus, Routledge, 2003.
  18. Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June 2015, Office of the Deputy Assistant Secretary of Defense for Systems Engineering Washington, D.C.
  19. Project Risk Management: Process, techniques, and Insights, 2nd Edition, Chris Chapman and Stephan Ward, John Wiley & Sons, 2003.
  20. Technical Risk Management, Jack Michaels, Prentice Hall, 1996.
  21. Software Engineering Risk Management, Dale Walter Karolak, IEEE Computer Society, 1996.
  22. Software Engineering Risk Management: Finding Your Path Through the Jungle, Version 1.0, Dale Walter Karolak, IEEE Computer Society, 1998.
  23. Managing Risk: Methods for Software Systems Development, Elaine Hall, Addison Wesley, 1998.
  24. Risk Happens!: Managing Risk and Avoiding Failure in Business Projects, Mike Clayton, 2011.
  25. Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul Garvey, CRC Press, 2000.
  26. A Beginner's Guide to Uncertainty of Measurement, Stephanie Bell, National Physical Laboratory, 1999.
  27. Practical Risk Assessment for Project Management, Stephen Grey,
  28. Assessment and Control of Software Risks, Capers Jones, Prentice Hall, 1993.
  29. Distinguishing Two Dimensions of Uncertainty, Craig Fox and GĂŒlden Ülkumen, in Perspectives of Thinking, Judging, and Decision Making
Categories: Project Management

Quote of the Day

Tue, 03/14/2017 - 13:51

Magic, it must be remembered, is an art which demands collaboration between the artist and his public. ― E. M. Butler, The Myth of the Magus, 1948

When we hear of making decisions in the presence of uncertainty without the need to estimate the cost, outcome, or impact of those decisions, it is surely magic.


Deploying ERP with Agile

Mon, 03/13/2017 - 18:01

Here's a paper from a prior agile conference that is still applicable today.

[Figure: slide from the conference paper on deploying ERP with agile]


Quote of the Day

Sun, 03/12/2017 - 13:39

Trust a witness in all matters in which neither his self-interest, his passions, his prejudices, nor the love of the marvelous is strongly concerned. When they are involved, require corroborative evidence in exact proportion to the contravention of probability by the thing testified - Thomas Henry Huxley (1825-1895)



Planning before Scheduling

Fri, 03/10/2017 - 05:05

Planning is an unnatural process, it’s much more fun to get on with it. The real benefit of not planning is that failure comes as a complete surprise and is not preceded by months of worry.
‒ Sir John Harvey Jones

And of course ...

For which of you, intending to build a tower, sitteth not down first, and counteth the cost, whether he have sufficient to finish it? Lest haply, after he hath laid the foundation, and is not able to finish it, all that behold it begin to mock him, saying, This man began to build, and was not able to finish. ―Luke 14:28-30

The Plan describes where we are going, the various paths we can take to reach our destination, and the progress or performance assessment points along the way to assure we are on the right path. These assessment points measure the “maturity” of the product or service against the planned maturity. This is the only real measure of progress – not the passage of time or the consumption of money.

Without a Plan, the only purpose of the Schedule is to show the sequence of work and the dependencies between the work activities, and to record progress as the passage of time and the consumption of money.

The Plan - actually the Integrated Master Plan - is the basis of the Integrated Master Schedule.

[Figure: decomposition from the Integrated Master Plan to the Integrated Master Schedule]

This decomposition is not unique to the IMP/IMS paradigm. Without some form of decomposition of what “done” looks like, it is difficult to connect the work of the project to the outcomes of the project. This decomposition – which is hierarchical – provides the mechanism to increase cohesion and decrease coupling of the work effort. The notion of coupling and cohesion comes from the systems architecture world and has been shown to increase the robustness of systems. The project cost, schedule, and resulting deliverables are a system, subject to this same coupling and cohesion.
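The idea of measuring progress as product maturity rather than time or money spent can be sketched in a few lines. This is a hypothetical illustration, not the formal IMP/IMS schema; the event, accomplishment, and criteria names are invented:

```python
# Minimal sketch of an Integrated Master Plan hierarchy: Events contain
# Accomplishments, which contain Criteria. "Maturity" is the fraction of
# criteria met - not hours worked or dollars spent.
from dataclasses import dataclass, field


@dataclass
class Criterion:
    name: str
    met: bool = False


@dataclass
class Accomplishment:
    name: str
    criteria: list = field(default_factory=list)

    def maturity(self) -> float:
        # Fraction of this accomplishment's criteria that are satisfied.
        if not self.criteria:
            return 0.0
        return sum(c.met for c in self.criteria) / len(self.criteria)


@dataclass
class Event:
    name: str
    accomplishments: list = field(default_factory=list)

    def maturity(self) -> float:
        # Average maturity of the accomplishments rolled up to the event.
        if not self.accomplishments:
            return 0.0
        return sum(a.maturity() for a in self.accomplishments) / len(self.accomplishments)


# Hypothetical program event with invented criteria.
pdr = Event("Preliminary Design Review", [
    Accomplishment("Architecture baselined",
                   [Criterion("Interfaces defined", True),
                    Criterion("Trade studies closed", True)]),
    Accomplishment("Risks retired",
                   [Criterion("Top risk mitigated", False),
                    Criterion("Fallback plan approved", True)]),
])
print(f"{pdr.name} maturity: {pdr.maturity():.0%}")  # 75%
```

The roll-up makes the point concrete: two criteria done and two half-done gives 75% maturity toward the event, regardless of how much calendar or budget has been consumed.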

The mechanics of the Integrated Master Plan and Integrated Master Schedule look like this:

[Figure: mechanics of the Integrated Master Plan and Integrated Master Schedule]


The Fallacy of Wild Numbers

Fri, 03/10/2017 - 03:57

DeMarco made this post, which has been picked up by the agile community to mean estimating is a waste.

My early metrics book, Controlling Software Projects: Management, Measurement, and Estimation (Prentice Hall/Yourdon Press, 1982), played a role in the way many budding software engineers quantified work and planned their projects. [
] The book’s most quoted line is its first sentence: “You can’t control what you can’t measure.” This line contains a real truth, but I’ve become increasingly uncomfortable with my use of it.

Implicit in the quote (and indeed in the book’s title) is that control is an important aspect, maybe the most important, of any software project. But it isn’t. Many projects have proceeded without much control but managed to produce wonderful products such as Google Earth or Wikipedia.

To understand control’s real role, you need to distinguish between two drastically different kinds of projects:

  • Project A will eventually cost about a million dollars and produce value of around $1.1 million.

  • Project B will eventually cost about a million dollars and produce value of more than $50 million.

What’s immediately apparent is that control is really important for Project A but almost not at all important for Project B. This leads us to the odd conclusion that strict control is something that matters a lot on relatively useless projects and much less on useful projects. It suggests that the more you focus on control, the more likely you’re working on a project that’s striving to deliver something of relatively minor value.

Let's start with some logic assessment in the context of an actual business.

  ‱ If I'm investing $1,000,000 (long form for effect) and only getting back $1,100,000 - that is, a $100,000 return on a $1,000,000 investment - that's a 10% return on investment. Assuming the investment is just labor, a standard burdened rate gets me about 10 staff for a year's work. The natural variances of that work - productivity, staff turnover, delays and disruptions, and other event-based and naturally occurring variances - have a high chance of wiping out or greatly reducing my $100,000 return.
  ‱ If I'm investing $1,000,000 and getting $50,000,000 back, that's a 4,900% return on my investment. The implication is that there is no need to measure the project's performance and, by extension, no need to estimate. This, of course, ignores the need to know how much of that $50M (minus the $1M) I am willing to lose. This is the value at risk discussion.
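The arithmetic behind those two bullets can be checked in a few lines. A simple sketch; the dollar figures come from DeMarco's Project A and Project B example, and note that a $1M cost returning $50M is a 49x net gain:

```python
def roi(cost: float, value: float) -> float:
    """Simple return on investment: net gain divided by cost."""
    return (value - cost) / cost

# DeMarco's two projects: same cost, very different value.
project_a = roi(1_000_000, 1_100_000)   # net $100k on $1M
project_b = roi(1_000_000, 50_000_000)  # net $49M on $1M

# The value-at-risk question the post raises: how much of the hoped-for
# return are we willing to lose before we need to know we're off plan?
print(f"Project A ROI: {project_a:.0%}")  # 10%
print(f"Project B ROI: {project_b:.0%}")  # 4900%
```

Project A's thin 10% margin is exactly why its variances must be measured and controlled; Project B's margin is enormous, but the investor still needs to know how much of it is at risk.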

Let's look at a few companies' SEC 10-K filings to see what their Return on Equity and Return on Investment are ...

The 50-to-1 return on a project is a fallacy of exaggeration - stretching the truth, overstatement - which occurs when a point is made by saying something that would be true, but the truth has been distorted in some way.

Having been through a few startups - one went public, one failed, and one got bought - and being married to a person who has gone through six startups, I have some familiarity with how startups work.

A 50-to-1 ROI is unheard of. There is mention of Google Maps as a 50:1 return. Google Maps makes money by monetizing the Pins. The 2015 10-K shows $74.5B total revenue for Google. The total cost of revenue is 37%, and R&D expense as a percentage of revenue is 15%. The details of individual products are in the balance sheet, but there is no 50:1 ROI on Maps or any other product used as an example.

As well, again from hands-on experience with startups, those investing need to know when they'll get their money back. If the investor is not external, the internal CFO needs that same information. DeMarco has made several other off-the-wall comments - one in IEEE Computer - that appear misinformed about how projects and business work in the 21st century. For example, in "Software Engineering: An Idea Whose Time Has Come and Gone?", where the original Project A and Project B example comes from, he goes on with an analogy of parenting teenagers, where he says ...

Now apply “You can’t control what you can’t measure” to the teenager. Most things that really matter—honor, dignity, discipline, personality, grace under pressure, values, ethics, resourcefulness, loyalty, humor, kindness—aren’t measurable.

I don't know where he grew up, but where I grew up - the Texas Panhandle - those attributes were certainly measurable. So I get a bit skeptical when a thought leader makes statements that are not logical. The measures of these attributes are exemplified in the Boy Scouts, on sports teams, in the church pews, in volunteer activities, and further on in adulthood, in the leadership activities of the military and business management.

Don't fall for the Strawman Fallacy, in which an argument is exaggerated, misrepresented, or completely fabricated. This kind of dishonesty undermines honest, rational debate.

This is common in the #NoEstimates community.

  • We can build a ticketing system in 24hrs.
  • All estimates are evil.
  • I can make decisions in the presence of uncertainty without estimating the impact of those decisions.
  • I really don't mean NO when I say NO Estimates, I mean YES, but it means NO, so give me $1,000 and I'll let you hear me say that in person.


12 Principles of Agile with and without Estimates

Thu, 03/09/2017 - 06:11

It's popular to speak about the Agile Manifesto and the 12 Principles of Agile. When we hear that the next big thing in agile is Not Estimating, let's look at how those 12 Principles can be applied without those estimates.

For each of the 12 Principles of Agile, here is how estimates help implement the principle, and what is missing without them.

1. Satisfy customer with continuous delivery of value

How estimates help: Value cannot be determined without knowing the cost to deliver that value and the time frame over which that cost is expended. This is basic managerial finance when spending other people's money. Business strategy is based on delivering value at the planned time, for the planned cost.

Without estimates: Without estimating the delivered value against the estimated cost and time to deliver that value, the balance sheet is wholly uninformed about the breakeven date for the expenditure.

2. Welcome change to improve the value of products

How estimates help: When change arrives, the first question is: what is the cost and benefit of this change? Since software development usually operates in the presence of uncertainty, an estimate of the cost and benefit of the change needs to be made.

Without estimates: Without estimating the impact of the requested change, those paying for the change are exposed to un-validated risk. Will this change impact our return on investment? Will this change cause technical, cost, or schedule impacts we don't learn about until it's too late?

3. Deliver working products often

How estimates help: How often? Every month, every two weeks, every day? Estimating the impact, absorption rate, produced monetary value, or disruption costs is part of the decision-making process.

4. Work together with customers

How estimates help: Always a good idea, no matter the development method. Those customers may have a fiduciary obligation to know the Estimate at Completion and the Estimate to Complete as you spend their money in the presence of uncertainty.

Without estimates: When those customers ask "When do you think we'll be able to start using the software?" or "What's your estimate of the features we can deliver before the trade show?" or "Our board is interested in the sales forecast for the feature you're developing" - what are you going to say? "Oh, we don't estimate our work, our time, or how much money you need to give us. We're a No Estimates shop; you'll just have to trust that we show up on time, don't spend too much of your money, and that the marketing and sales folks get what they need to reach breakeven as planned."

5. Build products with motivated individuals

How estimates help: Yep, can't go wrong here.

6. Promote sustained development rhythms

How estimates help: Sustained rhythms need some insight into what can be sustained in the presence of uncertainty - uncertainty that creates risk, and that results from the aleatory processes of the staff, tools, environments, and externalities.

Without estimates: If you aren't observing (empirically) and modeling with that data, confidence that the rhythm can be sustained has no basis in fact. It's the old FedEx ad, where the guy on the phone says "Sure, I can do that" several times, then hangs up and asks himself, "How can I do that?"

7. Measure progress with working products

How estimates help: If you are measuring only after the fact, you are executing the project with Open Loop Control. What will it cost to get this working product on the date we need it? Asking what variances from planned performance are appearing, and what effort, cost, or changes are needed to get back on plan, is Closed Loop Control.

Without estimates: There is only Open Loop Control.

8. Use face-to-face communication whenever possible

How estimates help: Yes, good practice. But face-to-face is just a mechanism. Answering questions, face to face, about direction and outcomes in the future in the presence of uncertainty is Closed Loop Control.

9. Technical excellence and good design improve agility

How estimates help: Technically compliant products are good products. But there is epistemic and aleatory uncertainty in all products. What are the upper and lower control limits for these measures? What are the impacts on the product when the actual measures go outside those limits? Setting these upper and lower limits, and knowing the epistemic and aleatory uncertainties that create risk to staying inside the bands, is all done through estimates.

Without estimates: There can be no Closed Loop Control in the presence of uncertainty.

10. Simplicity is essential

How estimates help: This is a platitude; how simple is essential is the actual question. This is a Systems Engineering question, and we need to remember H. L. Mencken's quote: "For every complex problem there is an answer that is clear, simple, and wrong." Deciding how simple requires some estimates, since the system has not yet been built.

Without estimates: Assessing the parameters of how simple requires estimating the impact of the various components of the system on the complexity of the system. This is a systems engineering function, addressable in many cases by the Design Structure Matrix paradigm and the probabilistic and statistical interactions of those components.

11. The best architecture and requirements emerge from self-organizing teams

How estimates help: This is actually an untested claim. System architecture in non-trivial systems can be guided by architecture reference designs - TOGAF, DoDAF, Zachman, and other formal frameworks. Again, this is a Systems Engineering process.

Without estimates: The interactions, both collaborative and interfering, cannot be assessed until the product or service is complete.

12. Regularly reflect and make adjustments on improving performance

How estimates help: Making adjustments to the technical and programmatic processes in the presence of uncertainty requires estimating the impacts of those adjustments.

Without estimates: When the processes are stochastic, not estimating the impacts of a decision leaves the decision makers in the dark.
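The Closed Loop Control idea above - compare actuals to an estimate, then re-estimate the work remaining - can be sketched with the standard earned value identities. The dollar figures below are hypothetical; the formulas (CPI, ETC, EAC) are the common EVM ones:

```python
def closed_loop_forecast(bac: float, ev: float, ac: float):
    """Re-estimate completion cost from performance to date.

    bac: Budget At Completion, ev: Earned Value, ac: Actual Cost.
    Uses the standard EVM identities: CPI = EV/AC, ETC = (BAC-EV)/CPI,
    EAC = AC + ETC.
    """
    cpi = ev / ac           # cost performance index: value earned per dollar spent
    etc = (bac - ev) / cpi  # estimate to complete the remaining work
    eac = ac + etc          # estimate at completion
    return cpi, etc, eac

# Hypothetical status: $400k of value earned for $500k spent on a $1M budget.
cpi, etc, eac = closed_loop_forecast(bac=1_000_000, ev=400_000, ac=500_000)
print(f"CPI={cpi:.2f}, ETC=${etc:,.0f}, EAC=${eac:,.0f}")
# At CPI=0.80, the $1M job forecasts to $1.25M - the estimate drives the correction.
```

The point is the feedback loop: the variance between planned and actual performance produces a new estimate (the EAC), which in turn drives the corrective action. Without the estimate there is no loop to close.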

Hopefully it's clear that estimating is part of making decisions in the agile paradigm, just as it is in any paradigm where uncertainty exists.

To suggest that decisions can be made in the presence of uncertainty without estimating the impact of those decisions ignores the basic principles of the microeconomics of decision making.
