
Software Development Blogs: Programming, Software Testing, Agile Project Management

Methods & Tools


Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Project Success

Quote of the Day

Fri, 02/24/2017 - 21:50

Simplicity is key, because it is tied up with being fundamental - Harvey Friedman

Now the question becomes - how simple is simple enough? When we hear a phrase like this without units of measure for how simple, or how to reach simple, it is only a platitude - there are no actionable outcomes. How can we measure simplicity? How can we measure the coupling and cohesion of all the parts that make up a design, process, or system, to confirm the result is the simplest one? How can we learn to ignore the platitudes of those claiming simple systems are best when they provide no units of measure for system, simple, or best?
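
As a hedged illustration that "simple" can at least be given units of measure, here is a minimal Python sketch that computes afferent/efferent coupling and instability for a small, entirely hypothetical module dependency map (the module names and dependencies are invented). Cohesion would need a companion measure such as LCOM, but even this much turns "simple" from a platitude into something measurable.

```python
# A minimal sketch: put numbers on coupling using Robert Martin's package
# metrics. The dependency map below is illustrative, not any real system.
from collections import defaultdict

deps = {                      # module -> modules it depends on (efferent)
    "ui":      {"billing", "auth"},
    "billing": {"db"},
    "auth":    {"db"},
    "db":      set(),
}

afferent = defaultdict(set)   # module -> modules that depend on it
for mod, targets in deps.items():
    for t in targets:
        afferent[t].add(mod)

for mod in deps:
    ce = len(deps[mod])                      # efferent coupling
    ca = len(afferent[mod])                  # afferent coupling
    instability = ce / (ca + ce) if (ca + ce) else 0.0
    print(f"{mod:8s} Ca={ca} Ce={ce} I={instability:.2f}")
```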

So remember

Explanations exist; they have existed for all time; there is always a well-known solution to every human problem - neat, plausible, and wrong. - H.L. Mencken

Categories: Project Management

Dunning-Kruger and Modern Software Project Management

Thu, 02/23/2017 - 20:33

In "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," David Dunning and Justin Kruger state that the less skilled or competent you are, the more confident you are that you're actually very good at what you do. Their central finding is not only do such people reach the erroneous conclusion and make unfortunate choices, but their incompetence robs them of the ability to realize it.

This, of course, is true of everyone in some way. We think we have a great sense of humor when we don't. We rate ourselves higher than others in a variety of skills.

Lake Wobegon - where all the children are above average

It turns out, though, that less competent people overestimate themselves more than others do.

The reason is the absence of a quality called metacognition, the ability to step back and see our own cognitive process in perspective. Good singers know when they've hit a sour note, good directors know when a scene in a play isn't working, and intelligently self-aware people know when they're out of their depth. 

When I hear unsubstantiated claims, usually from sole proprietors working on de minimis projects, I think of Dunning-Kruger. I've vowed to ignore them and move on. If something works in their domain, any advice they provide is usually limited to that domain and their experiences in it. But the chant that certain processes and methods fix dysfunction continues, getting louder when the advocates are asked to show the evidence that their idea has a basis in principle. This is the confirmation bias supporting their misunderstanding of the principles on which they are making their claims.

 

 

Categories: Project Management

Quote of the Day

Tue, 02/21/2017 - 05:49

"I am not much given to regret, so I puzzled over this one a while. Should have taken much more statistics in college, I think." - Max Levchin, PayPal co-founder, Slide founder

Perhaps anyone conjecturing that decisions can be made in the presence of uncertainty without estimates may want to call their high school Probability and Statistics teacher and find out what they missed.

Categories: Project Management

Quote of the Day

Fri, 02/17/2017 - 22:58

The real world is fraught with risk.
Forecasting with empirical data - the #NoEstimates approach to estimating - ignores this fact. In the NoEstimates version of forecasting there is no place for uncertainty, which is why its advocates assert that forecasting is not estimating. In this version of estimating without calling it estimating, past behaviour represents future behaviour, with no adjustments for the reducible and irreducible uncertainties that create risk to the project's success.
When we try to plan, knowing we cannot predict the future precisely, the mismatch between the plan and the real world creates confusion. This confusion creates biases and distortions, resulting in padded estimates and undue optimism, as well as misuse and even dysfunction of management in the presence of uncertainty.
There needs to be an approach to planning and forecasting that is connected with reality, an approach that acknowledges uncertainty from the beginning and continues to manage in the presence of that uncertainty throughout the life of the project.
This approach mandates estimating all our work that is not de minimis. This work is driven by uncertainty - reducible and irreducible. A sketch contrasting a forecast that ignores uncertainty with one that carries it forward appears after the briefing below.
This uncertainty creates risk and Risk Management is How Adults Manage Projects - Tim Lister.

Practical Risk Assessment for Project Management, Stephen Grey, Wiley Series in Software Engineering Practice, 1995. Nothing has changed since these words. We still live, operate, and manage in the presence of uncertainty. Here's how we manage in our software intensive system of systems domain. Your domain may be different, but the principles of managing in the presence of uncertainty are the same.

Here's the now familiar briefing on how we manage in the presence of uncertainty in our Software Intensive System of Systems domain.

Managing in the presence of uncertainty from Glen Alleman
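
To make the forecasting point above concrete, here is a minimal Python sketch - with an invented throughput history, backlog size, and variation factor - contrasting a single-point forecast from average past throughput with a Monte Carlo forecast that carries uncertainty forward and reports confidence levels instead of one number.

```python
# A minimal sketch (not the #NoEstimates method itself) contrasting a naive
# forecast from past throughput with one that carries uncertainty forward.
import numpy as np

rng = np.random.default_rng(1)
throughput = np.array([6, 4, 7, 5, 3, 6, 5, 4])   # stories finished per week (history)
backlog = 60                                      # stories remaining

# Naive: divide backlog by the average - a single point, no uncertainty.
print("naive weeks:", backlog / throughput.mean())

# Monte Carlo: resample history and apply an irreducible-variation factor,
# then read confidence levels off the resulting distribution.
weeks = []
for _ in range(10_000):
    done, w = 0.0, 0
    while done < backlog:
        done += rng.choice(throughput) * rng.triangular(0.8, 1.0, 1.1)
        w += 1
    weeks.append(w)
print("50/80/95% confidence weeks:", np.percentile(weeks, [50, 80, 95]))
```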

 

Categories: Project Management

Decision Making On Software Development Projects

Fri, 02/17/2017 - 21:38

Decision Making on Software Development Projects
Is Both Simple and Complex at the Same Time

All projects operate in the presence of uncertainty.

Real-world decision making is performed under uncertainty.

Decision makers must make decisions which best incorporate these uncertainties.

These uncertainties come in two forms - reducible (I can do something about it) and irreducible (I can't do anything about it).

Irreducible (Aleatory) uncertainty is the natural variability of the processes and technology on the project. These processes are stochastic, may be nonstationary, and vary by chance according to their underlying statistical distributions.

Aleatory uncertainties are modeled as random variables described by statistical distributions (the Triangle distribution is a common choice when the actual distribution is not known).
In the presence of aleatory uncertainty, decision makers make assumptions about the distribution's descriptive statistics - at a minimum the Mean and Variance, along with the Most Likely value (the Mode). The shape of the curve is needed as well.

Irreducible uncertainty can only be dealt with through margin: cost margin, schedule margin, and technical margin.
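
Here is a minimal sketch, in Python, of sizing schedule margin against aleatory uncertainty: each task duration is modeled as a triangular random variable, and the margin is read off as the gap between an 80% confidence completion and the sum of most likely durations. The three tasks and their ranges are hypothetical.

```python
# A minimal sketch of handling aleatory (irreducible) uncertainty with
# schedule margin. Task durations and ranges are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
tasks = [          # (low, most likely, high) durations in days
    (8, 10, 15),
    (4,  5,  9),
    (12, 15, 24),
]

samples = sum(rng.triangular(lo, ml, hi, size=20_000) for lo, ml, hi in tasks)
most_likely_total = sum(ml for _, ml, _ in tasks)
p80 = np.percentile(samples, 80)

print(f"most likely total : {most_likely_total} days")
print(f"80% confidence    : {p80:.1f} days")
print(f"schedule margin   : {p80 - most_likely_total:.1f} days")
```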

Reducible (Epistemic) uncertainty is subjective, with the subjectivity coming from lack of knowledge.

This lack of knowledge comes from the probabilistic nondeterministic behavior of the system or the environment.

Reducible uncertainty is addressed with redundancy, experiments, prototypes, models, measures, and empirical data that reveal knowledge about the underlying probabilities of the process.

All Uncertainty Creates Risk.

Reducible risk requires estimating the probability distribution of the occurrence.

Irreducible risk requires estimating the statistical distribution of the naturally occurring processes.
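
And a companion sketch for the reducible side, again with an invented risk register: each epistemic risk carries an estimated probability of occurrence and a cost impact, and sampling the register yields the cost-risk exposure the handling plans and management reserve can be sized against.

```python
# A minimal sketch of epistemic (reducible) risk exposure. The register
# entries - probabilities and impacts - are hypothetical.
import numpy as np

rng = np.random.default_rng(11)
risk_register = [   # (probability of occurrence, cost impact in $K)
    (0.30, 120),
    (0.10, 400),
    (0.25,  80),
]

trials = 20_000
total_cost = np.zeros(trials)
for p, impact in risk_register:
    occurs = rng.random(trials) < p          # Bernoulli draw per trial
    total_cost += occurs * impact

print("expected cost exposure ($K):", total_cost.mean())
print("80th percentile exposure ($K):", np.percentile(total_cost, 80))
```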

Risk Management is How Adults Manage Projects - Tim Lister.

Risk management requires estimating.

Adult management of projects requires estimating.

Not Estimating means not managing as an Adult.

  1. J. C. Helton and D. E. Burmaster, "Treatment of aleatory and epistemic uncertainty in performance assessments for complex systems," Reliability Engineering and System Safety, vol. 54, no. 2-3, pp. 91-94, 1996.
  2. W. L. Oberkampf, "Uncertainty quantification using evidence theory," Proceedings of the Advanced Simulation & Computing Workshop, Albuquerque, NM, USA, 2005.
  3. J. C. Helton, "Treatment of uncertainty in performance assessments for complex systems," Risk Analysis, vol. 14, no. 4, pp. 483-511, 1994.
  4. S. N. Rai, D. Krewski, and S. Bartlett, "A general framework for the analysis of uncertainty and variability in risk assessment," Human and Ecological Risk Assessment, vol. 2, no. 4, pp. 972-989, 1996.
  5. W. D. Rowe, "Understanding uncertainty," Risk Analysis, vol. 14, no. 5, pp. 743-750, 1994.
  6. D. Dubois and H. Prade, Possibility Theory: An Approach to Computerized Processing of Uncertainty, Plenum Press, New York, NY, USA, 1988.
  7. A. Tversky and D. Kahneman, "Judgment under uncertainty: heuristics and biases," Science, vol. 185, no. 4157, pp. 1124-1131, 1974.
  8. Enrico Zio and Nicola Pedroni, Methods for Representing Uncertainty: A Literature Review, Foundation for an Industrial Safety Culture, Toulouse, France.
  9. Richard Stutzke, Estimating Software-Intensive Systems, Addison-Wesley.
Categories: Project Management

Quote of the Day

Fri, 02/17/2017 - 21:32

People behave the way they are managed
- A Wise old Navy Captain as told by LCMDR Nicholas Pisano

Categories: Project Management

Fifty Great Project Management Blogs

Thu, 02/16/2017 - 19:06

On Line PM Course has listed the 50 greatest project management blogs, in alphabetical order.

Categories: Project Management

Quote of the Day

Sat, 02/04/2017 - 23:14

Every systematic development of any subject ought to begin with a definition, so that everyone may understand what the discussion is about.
Marcus Tullius Cicero (106 BC - 43 BC), De Officiis, Book 1, Moral Goodness

Categories: Project Management

Cone of Uncertainty - Part Cinq (Updated)

Sat, 02/04/2017 - 16:41

The notion of the Cone of Uncertainty has been around for a while, starting with Barry Boehm's work in Software Engineering Economics, Prentice-Hall, 1981 [8]. The poster below is from Steve McConnell's site and makes several things clear.

  • The Cone is a project management framework describing the uncertainty of estimates of cost, schedule, and technical performance parameters. Estimates made on the left side of the cone have a lower probability of being precise and accurate than estimates made on the right side of the cone. One reason is the level of uncertainty - aleatory and epistemic - early in the project, which creates risk to the success of the project. Other uncertainties that create risk include:
    • Unrealistic performance expectations with missing Measures of Effectiveness and Measures of Performance
    • Inadequate assessment of risks and unmitigated exposure to those risks without proper handling plans
    • Unanticipated technical issues without alternative plans and solutions to maintain effectiveness
  • Since all project work contains uncertainty, reducing this uncertainty - which reduces risk - is the role of the project team and its management: the team itself, the Project or Program Manager, or, on larger programs, the Risk Management owner.

Here's a simple definition of the Cone of Uncertainty: 

The Cone of Uncertainty describes the evolution of the measure of uncertainty during a project. For project success, uncertainty must not only decrease over time, its impact on the project's outcome must also diminish. This is done by active risk management, through probabilistic decision-making. At the beginning of a project, comparatively little is known about the product or work results, so estimates are needed but are subject to a large level of uncertainty. As more research and development is done, more information is learned about the project, and the uncertainty decreases, reaching zero when all risk has been mitigated or transferred - usually by the end of the project.

So the question is - how much variance reduction needs to take place in the project attributes (risk, effectiveness, performance, cost, schedule - shown below), and at what points in time, to increase the probability of project success? This is the basis of Closed Loop Project Control. Estimates of the needed reduction of uncertainty, estimates of the possible reduction of uncertainty, and estimates of the effectiveness of these reduction efforts are the basis of the Closed Loop Project Control System.

This is the paradigm of the Cone of Uncertainty - it's a planned development compliance engineering tool, not an after-the-fact data collection tool.

The Cone is NOT the result of the project's past performance. The Cone IS the planned boundaries (upper and lower limits) of the needed reduction in uncertainty (or other performance metrics) as the project proceeds. When actual measures of cost, schedule, and technical performance are outside the planned cone of uncertainty, corrective actions must be taken to move those uncertainties back inside the cone if the project is going to meet its cost, schedule, and technical performance goals.

If your project's uncertainties are outside the planned boundaries at the time when they should be inside those boundaries, then you are reducing the probability of project success.

The measures modeled in the Cone of Uncertainty are the quantitative basis of a control process that establishes the goals for the performance measures. Capturing the actual performance, comparing it to the planned performance, and checking compliance with the upper and lower control limits provides guidance for making adjustments that keep the variables performing inside their acceptable limits.
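
A minimal sketch of that control process, with illustrative milestone names, bounds, and actuals (none of them from a real program), looks like this in Python:

```python
# A minimal sketch of the Cone of Uncertainty used as a steering target:
# planned upper/lower variance bounds at each milestone, with actuals
# compared against them to trigger corrective action.
planned_cone = {          # milestone -> (LCL, UCL) allowed cost variance, %
    "SRR": (-40, 40),
    "PDR": (-25, 25),
    "CDR": (-10, 10),
    "TRR": ( -5,  5),
}
actual_variance = {"SRR": 22, "PDR": 18, "CDR": 14, "TRR": 6}

for milestone, (lcl, ucl) in planned_cone.items():
    actual = actual_variance[milestone]
    status = "inside cone" if lcl <= actual <= ucl else "OUT OF BOUNDS - corrective action"
    print(f"{milestone}: actual {actual:+d}% vs planned [{lcl}, {ucl}] -> {status}")
```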

The Benefits of the Use of the Cone of Uncertainty 

The planned value, the upper and lower control limits, and the measures of actual values form a Closed Loop Control System - a measurement-based feedback process to improve the effectiveness and efficiency of the project management processes by [1]:

  • Analyzing trends that help focus on problem areas at the earliest point in time - when the variable under control starts misbehaving, intervention can be taken. No need to wait until the end to find out you're not going to make it.
  • Providing early insight into error-prone products that can then be corrected earlier, and thereby at lower cost - when the trends head toward the UCL or LCL, intervention can take place.
  • Avoiding or minimizing cost overruns and schedule slips by detecting them early enough in the project to implement corrective actions - by observing trends toward breaches of the UCL and LCL.
  • Performing better technical planning, and making adjustments to resources based on discrepancies between planned and actual progress.


A critical success factor for all project work is Risk Management. And risk management includes the management of all kinds of risks - risks from all sources of uncertainty, including technical risk, cost risk, schedule risk, and management risk. Each of these uncertainties, and the risks they produce, can take on a range of values described by probability and statistical distribution functions. Knowing what ranges are possible and knowing what ranges are acceptable is a critical project success factor.

We need to know the Upper Control Limit (UCL) and Lower Control Limit (LCL) of the ranges of all the variables that will impact the success of our project, and we need to know these ranges as a function of time. With this paradigm we have logically connected project management processes with control system processes: when the variances created by uncertainty go outside the UCL and LCL, corrective action is needed. Here's a work-in-progress paper, "Is there an Underlying Theory of Project Management," that addresses some of the issues with control of project activities.

Here are some examples of planned variances and the management of actual variances to make sure the project stays on plan.

A product weight as a function of the program's increasing maturity. In this case, the projected base weight is planned, and the planned weights of each of the major subsystems are laid out as a function of time. Tolerance bands for the projected base weight provide management with actionable information about the progression of the program. If the vehicle gets overweight, money and time are needed to correct the undesirable variance. This is a closed loop control system for managing the program with a Technical Performance Measure (TPM). There can be cost and schedule performance measures as well.


Below is another example of a weight-reduction attribute with error bands. In this example (an actual vehicle, like the example above), the weight must be reduced as the program proceeds from left to right. We have a target weight at Test Readiness Review of 23 kg. A 25 kg vehicle was sold in the proposal, and we need a target weight that has a safety margin, so 23 kg is our target.

As the program proceeds, there are UCL and LCL bands that follow the planned weight. The orange dots are the actual weights from a variety of sources - a design model (3D CATIA CAD system), a detailed design model, a bench-scale model that can be measured, a non-flying prototype, and then the first flight article. As the program progresses, the weight measurement for each of these models, through to the final article, is compared to the planned weight. We need to keep these values inside the error bands of the NEEDED weight reduction if we are to stay on plan.

This is the critical concept in successful project management

We must have a Plan for the critical attributes of the deliverable items - Mission Effectiveness, Technical Performance, Key Performance Parameters. If these are not compliant, the project becomes subject to one of the Root Causes of program performance shortfall. We must have a burndown or burnup plan for producing the end item deliverables that match those parameters over the course of the program. Of course, we have a wide range of possible outcomes for each item in the beginning. As the program proceeds, the variances measured on those items move toward compliance with the target number - in this case, weight.
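
Here is a minimal sketch of that tolerance-band check, with invented numbers standing in for the planned burndown and the measured weights (not the actual vehicle data above):

```python
# A minimal sketch of a weight Technical Performance Measure: a planned
# burndown to a 23 kg target with tolerance bands, and measured weights from
# successive models checked against the band. All numbers are illustrative.
plan = [                 # (review, planned kg, tolerance +/- kg)
    ("Design model (CAD)",   26.0, 1.5),
    ("Detailed design",      25.0, 1.0),
    ("Bench-scale model",    24.0, 0.8),
    ("Non-flying prototype", 23.5, 0.5),
    ("First flight article", 23.0, 0.3),
]
measured = [26.4, 25.3, 24.9, 23.6, 23.1]   # the "orange dots"

for (review, target, tol), actual in zip(plan, measured):
    ok = abs(actual - target) <= tol
    print(f"{review:22s} plan {target:4.1f} +/- {tol} kg, measured {actual:4.1f} kg "
          f"-> {'on plan' if ok else 'variance: find root cause'}")
```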


Here's another example of the Cone of Uncertainty, in this case, the uncertainty is the temperature of an oven being designed by an engineering team. The UCL and LCL are defined BEFORE the project starts. These are used to inform the designer of the progress of the project as it proceeds. Staying inside the control limits is the Planned progress path to the final goal - in this case, temperature.

The Cone of Uncertainty provides the signaling boundaries of the Closed Loop Control system used to manage the project to success.


It turns out the cone can also be a flat band with Upper and Lower Control Limits on the variable being developed - a design-to variable, in this example a Measure of Performance. In this case, the Measure of Performance needs to stay within the upper and lower limits as the project progresses through its gates. If this variable is out of bounds, the project will have to pay in some way to get it back to Green.

A Measure of Performance characterizes physical or functional attributes relating to the system operation, measured or estimated under specific conditions. Measures of Performance are (1) attributes that assure the system has the capability and capacity to perform, (2) assessments of the system to assure it meets the design requirements needed to satisfy the Measures of Effectiveness, and (3) the basis of corrective actions that return actual performance to planned performance when actual performance goes outside the upper and lower control limits. Again, this is simple statistical process control, using feedback from past performance to take corrective actions that control future outcomes - feedforward. In the probabilistic and statistical program management paradigm, feedforward control uses past performance with models of future behavior (for example a Monte Carlo model) to determine what corrective actions are needed to Keep The Program Green.
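
As a hedged sketch of the feedforward idea (not any actual program's model), the fragment below fits a simple linear trend to past Measure of Performance observations, simulates plausible futures, and flags when the probability of breaching the upper control limit exceeds a chosen threshold. The data, limits, and threshold are all hypothetical.

```python
# A minimal sketch of feedforward control: project the observed trend
# forward with its residual noise and estimate the chance of breaching
# the upper control limit before it happens.
import numpy as np

rng = np.random.default_rng(3)
weeks = np.arange(8)
mop = np.array([100, 101, 103, 102, 105, 106, 108, 109.0])   # observed MoP
ucl = 115.0                                                   # upper control limit

slope, intercept = np.polyfit(weeks, mop, 1)        # simple linear trend
resid_sigma = np.std(mop - (slope * weeks + intercept))

horizon = 6                                          # weeks ahead
future = (slope * (weeks[-1] + horizon) + intercept
          + rng.normal(0, resid_sigma * np.sqrt(horizon), size=20_000))
p_breach = (future > ucl).mean()

print(f"probability of breaching UCL in {horizon} weeks: {p_breach:.0%}")
print("corrective action needed" if p_breach > 0.2 else "keep monitoring")
```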


Another cone style is the cone of confidence in a delivery date. This actual case is a Low Earth Orbit vehicle launch date. As the program moves from left to right, we need to assure that the launch date moves from a low-confidence date to a date that has a chance of being correct. The BLUE bars are the probabilistic ranges of the current estimated date. As the program moves forward, those ranges must be reduced if we're going to show up as needed. The planned date and a date with margin are the build-to dates. As the program proceeds, the confidence in the date must increase and move toward the need date. A sketch of this narrowing follows the list below.

  • The probabilistic completion times change as the program matures.
  • The efforts that produce these improvements must be defined and managed.
  • The error bands of the assessment points must include the risk mitigation activities as well.
  • The planned activities show how the error band narrows over time:
    • This is the basis of a risk-tolerant plan.
    • The probabilistic intervals become more reliable as risk mitigation and the maturity assessments add confidence to the planned launch date.
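
A minimal sketch of that narrowing, with invented assessment points and sigmas (not the actual launch data), follows:

```python
# A minimal sketch of the launch-date cone: at each assessment point the
# remaining-duration uncertainty (sigma) shrinks as risk mitigation retires
# variance, so the 10th-90th percentile band narrows toward the need date.
import numpy as np

rng = np.random.default_rng(5)
need_date = 120                     # days from program start
assessments = [                     # (day of assessment, remaining sigma in days)
    (0, 30), (30, 20), (60, 12), (90, 5),
]

for today, sigma in assessments:
    remaining = need_date - today
    finish = today + rng.normal(remaining, sigma, size=20_000)
    p10, p90 = np.percentile(finish, [10, 90])
    print(f"day {today:3d}: launch estimate day {p10:5.0f} to {p90:5.0f} "
          f"(band {p90 - p10:4.0f} days)")
```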

Just a reminder again - the Cone of Uncertainty is a DESIRED path, NOT the result of an unmanaged project outcome.

Risk Management, as shown below, is how Adults Manage Projects


Wrap Up On the Misunderstanding of the Purpose and Value of the Cone of Uncertainty

When you hear... 

I have data that shows that uncertainty (or any other needed attribute) doesn't reduce and therefore the COU is a FAKE ... OR ... I see data on my projects where the variance is getting worse as we move forward, instead of narrowing as the Planned COU tells us it should be to meet our goals ...

...then that project is out of control, starting with a missing steering target. That means it's running Open Loop Control and will be late, over budget, and likely not performing to the needed effectiveness and performance parameters. And when you see these out-of-control situations, go find the Root Cause and generate the Corrective Action.

This data is an observation of a project not being managed as Tim Lister suggests - Risk Management is How Adults Manage Projects. 

And if these observations are taking place without corrective actions addressing the Root Causes of the performance shortfall, then management is behaving badly. They're just observers of the train wreck that is going to happen real soon.

The Engineering Reason for the Cone of Uncertainty Model and the Value it Provides to Decision Makers

The Cone of Uncertainty is NOT an output from the project's behaviour - by then it's too late.
The Cone of Uncertainty is a Steering Target Input to the Management Framework for increasing the probability of the project's success.
This is the Programmatic Management of the project in support of the Technical Management of the project. The process is an engineering discipline: Systems Engineering, Risk Engineering, and Safety and Mission Assurance Engineering are typical roles where we work.
To suggest otherwise is to invert the paradigm and remove any value from post-facto observations of the project's performance. At that point it's too late - the horse has left the barn and there's no getting him back.
Defining the planned and needed variance levels at planned points in the project is the basis of the closed loop control system needed to increase the probability of success.
When variances outside the planned variance appear, the Root Cause of those variances must be found and corrective action taken.

Here's an example from a Galorath presentation, using the framework of the Cone of Uncertainty and actual project cones, showing how to put this all together. Repeating again, the Cone of Uncertainty is the framework for the planned reduction of the uncertainty in the critical performance measures of the project.

If your project is not reducing the uncertainty as planned for these critical performance measures - cost, schedule, and technical performance - then it's headed for trouble and you may not even know it.


Resources

[1] Systems Engineering Measurement Primer, INCOSE.

[2] System Analysis, Design, and Development: Concepts, Principles, and Practices, Charles Wasson, John Wiley & Sons.

[3] SMC Systems Engineering Primer & Handbook: Concepts, Processes, and Techniques, Space & Missile Systems Center, U.S. Air Force.

[4] Defense Acquisition Guidebook, Chapter 4, Systems Engineering, 15 May 2013.

[5] Program Managers Tool Kit, 16th Edition, Defense Acquisition University.

[6] "Open Loop / Closed Loop Project Controls"

[7] "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty'," Pongtip Aroonvatanaporn, Chatchai Sinthop, and Barry Boehm, ASE '10, September 20-24, 2010, Antwerp, Belgium.

[8] Boehm, B. Software Engineering Economics, Prentice-Hall, 1981.

[9] Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B. K., Horowitz, E., Madachy, R., Reifer, D. J., and Steece, B. Software Cost Estimation with COCOMO II, Prentice-Hall, 2000.

[10] Boehm, B., Egyed, A., Port, D., Shah, A., Kwan, J., and Madachy, R. "Using the WinWin Spiral Model: A Case Study," IEEE Computer, Volume 31, Number 7, July 1998, pp. 33-44.

[11] Cohn, M. Agile Estimating and Planning, Prentice-Hall, 2005.

[12] DeMarco, T. Controlling Software Projects: Management, Measurement, and Estimation, Yourdon Press, 1982.

[13] Fleming, Q. W. and Koppelman, J. M. Earned Value Project Management, 2nd edition, Project Management Institute, 2000.

[14] Galorath, D. and Evans, M. Software Sizing, Estimation, and Risk Management, Auerbach, 2006.

[15] Jorgensen, M. and Boehm, B. "Software Development Effort Estimation: Formal Models or Expert Judgment?" IEEE Software, March-April 2009, pp. 14-19.

[16] Jorgensen, M. and Shepperd, M. "A Systematic Review of Software Development Cost Estimation Studies," IEEE Trans. Software Eng., vol. 33, no. 1, 2007, pp. 33-53.

[17] Krebs, W., Kroll, P., and Richard, E. "Un-assessments - reflections by the team, for the team," Agile 2008 Conference.

[18] McConnell, S. Software Project Survival Guide, Microsoft Press, 1998.

[19] Nguyen, V., Deeds-Rubin, S., Tan, T., and Boehm, B. "A SLOC Counting Standard," COCOMO II Forum 2007.

[20] Putnam, L. and Fitzsimmons, A. "Estimating Software Costs, Parts 1, 2 and 3," Datamation, September through December 1979.

[21] Stutzke, R. D. Estimating Software-Intensive Systems, Pearson Education, Inc., 2005.


 

Categories: Project Management

Quote of the Day

Sun, 01/29/2017 - 18:43

Jerry, just remember, it's not a lie if you believe it - George Costanza

 

Categories: Project Management

Quote of the Day

Sat, 01/28/2017 - 18:12

The measure of who we are is how we react to something that doesn't go our way -  Gregg Popovich, Head coach of the San Antonio Spurs

Categories: Project Management

Quote of the Day

Fri, 01/27/2017 - 17:40

Never let the tool control the hand that uses it -  Lt Gen Hans H. Driessnack (ret), in Advanced Project Management, Best Practices on Implementation, page 40, Harold Kerzner, 2004

Categories: Project Management

Quote of the Day

Thu, 01/26/2017 - 04:49


In a time of universal deceit - telling the truth is a revolutionary act.
War is peace. Freedom is slavery. Ignorance is strength.

Political language... is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind. 
If liberty means anything at all, it means the right to tell people what they do not want to hear.

Categories: Project Management

Quote of the Day

Wed, 01/25/2017 - 03:06

On a long enough timeline the survival rate of everyone drops to zero

Categories: Project Management

Quote of the Day

Sun, 01/22/2017 - 06:31

It is the first responsibility of every citizen to question authority - Benjamin Franklin

Think of this when you hear someone say protesting is a waste of time and tell you to just be quiet and accept the situation you are in.

Categories: Project Management

Quote of the Day

Mon, 01/16/2017 - 16:57

We must learn to live together as brothers or perish together as fools - Martin Luther King, Jr.

Categories: Project Management

Quote of the Day

Thu, 01/12/2017 - 05:22

Overheard on Twitter

The Start of a Project Is the Worst Time to Estimate Its Duration or Cost

This is only the case if those you've hired know nothing about what capabilities are needed to produce value from the project, what Features are needed to produce that Value, when those Value-producing Features are needed to meet the time-cost-of-value payback process, what risks there are to meeting those value-producing outcomes, and how the work effort to produce that value is to be measured (physical percent complete) to increase the probability of success for your project. As the project progresses this understanding will, of course, improve with feedback, working product, and learning.

If those you've hired don't have some sense of these needs, to some level of confidence, you've hired the wrong people.
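
For the "physical percent complete" measure mentioned above, here is a minimal sketch - the work packages and budgeted values are invented for illustration - in which credit is earned only when a work package's objective completion criteria are met, weighted by its budgeted value.

```python
# A minimal sketch of physical percent complete: earned value is taken only
# for work packages with objective criteria met, not for effort spent or
# time elapsed. All names and numbers are hypothetical.
work_packages = [   # (name, budgeted value in $K, objective criteria met?)
    ("Ingest service",          80, True),
    ("Rules engine",           150, True),
    ("Reporting UI",           120, False),
    ("Deployment automation",   50, False),
]

budget_total = sum(bv for _, bv, _ in work_packages)
earned = sum(bv for _, bv, done in work_packages if done)
print(f"physical % complete = {earned / budget_total:.0%} "
      f"({earned} of {budget_total} $K earned)")
```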


From Estimating and Reporting Agile Projects with the SRDR

Categories: Project Management

Estimating the Risk

Mon, 01/09/2017 - 19:19

Risk is everywhere on projects. This risk comes from two types of uncertainty. Aleatory uncertainty is the naturally occurring variance in the underlying processes; it is handled with cost, schedule, and technical performance margins. Epistemic uncertainty comes from probabilistic processes and can be addressed with handling responses.

The idea of risk and its management and handling is a critical success factor for all software development.

One of the most rigorous theorems of economics [1] proves that the existing means of production yield greater economic performance only through greater uncertainty - that is, through greater risk. While it is futile to try to eliminate risk, and questionable to try to minimize it, it is essential that the risks taken be the right risks...
We must be able to choose rationally among risk-taking courses of action, rather than plunge into uncertainty on the basis of hunch, hearsay, or incomplete experience, no matter how meticulously quantified.
- Peter Drucker (1975), Management (quoted in Principles of Software Engineering Management, Chapter 6, Tom Gilb, 1988).

Managing in the presence of risk - and the uncertainty that creates that risk - requires that we make risk-informed decisions. These decisions are informed by the probabilistic and statistical outcomes of those decisions in the future. In order to make risk-informed decisions, we must estimate the outcomes and the impacts of those outcomes on future activities (cost, schedule, and technical performance of products and services). Without these estimates, there is no risk management. And as Tim Lister reminds us:


Risk Management is how Adults Manage Projects. Be an adult: make estimates of the future outcomes of your risk-informed decisions.
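
As a hedged illustration of choosing rationally among risk-taking courses of action - not a prescription - the sketch below compares the estimated cost distributions of two hypothetical options instead of comparing single-point guesses. Both options and all of their parameters are invented.

```python
# A minimal sketch of risk-informed choice: estimate each option's cost
# distribution and compare expected cost and an 80th-percentile exposure.
import numpy as np

rng = np.random.default_rng(13)
trials = 20_000

# Option A: proven approach - higher base cost, little variance.
cost_a = rng.triangular(90, 100, 120, size=trials)

# Option B: new approach - cheaper if it works, plus a 25% chance of rework.
cost_b = (rng.triangular(60, 70, 100, size=trials)
          + (rng.random(trials) < 0.25) * rng.triangular(30, 50, 90, size=trials))

for name, cost in [("A (proven)", cost_a), ("B (novel) ", cost_b)]:
    print(f"{name}: mean {cost.mean():6.1f}  80th pct {np.percentile(cost, 80):6.1f}")
```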

[1] Control or Economic Law, Eugen von Boehm-Bawerk, 2010 edition.

[2] "How Much Risk is Too Much Risk," Tim Lister, Boston SPIN

[3] "Risk Management is How Adults Manage Projects," Susanne Madsen 

[4] "Risk Management and Agile Software Development," Glen B Alleman

Categories: Project Management

Quote of the Day

Mon, 01/09/2017 - 00:36

It is very tempting to rely on what you are experiencing - Marlene Cimons

Categories: Project Management

Risk Management

Sun, 01/08/2017 - 20:27

Risk Management is How Adults Manage Projects - Tim Lister

Risk Management requires making estimates of many things: the uncertainties that create the risk - reducible (Epistemic) and irreducible (Aleatory) - the impacts from those risks, the efficacy of the corrective actions, the residual reducible uncertainty, and any changes in the irreducible uncertainties.

Just like risk management, estimating is how adults manage projects. No risk management, no adult management. No estimates, no adult management. Simple as that.

Categories: Project Management