
Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Success - "Trust, but verify"

Populist versus Technical View of Problems

Thu, 11/20/2014 - 04:17

In recent Twitter discussions and email exchanges there has been a notion of populist books versus technical books used to address issues and problems encountered in our project management domains. My recent book, Performance-Based Project Management®, is a populist book. There are principles, practices, and processes in the book that can be put to use on real projects, but very few equations and numbers. It's mostly narrative about increasing the probability of project success. The machinery to calculate that probability from other numbers, processes, and systems is not there. That's the realm of technical books and journal papers.

The content of the book was developed with the help of editors at the American Management Association, the publisher. The Acquisition Editor contacted me about writing a book for the customers of AMA. He explained up front that AMA is in the money-making business of selling books, and that although I may have many good ideas, even ideas people might want to read about, it's an AMA book and I'd be getting lots of help developing those ideas into a book that will make money for AMA.

The distinction between a populist book and a technical book is the difference between a book that addresses a broad audience with a general approach to the topic and a deep-dive book focused on a narrow audience.

One other distinction is that behind most technical approaches, some form of calculation supports the material found in the populist treatment. One simple example is estimating. There are articles and some books that lay out the principles of estimating; we have those in our domain in the form of guidelines and a few texts. But to calculate the Estimate To Complete in a statistically sound manner, technical knowledge of the underlying mathematics of non-linear, non-stationary, stochastic processes is needed, typically applied through Monte Carlo Simulation of the project's work structure.
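As an illustration only (not from the book), here is a minimal sketch of what such a calculation looks like: a Monte Carlo simulation over a handful of hypothetical remaining work packages, each with a low / most-likely / high duration, reporting the Estimate To Complete at two confidence levels.

    # Minimal Monte Carlo Estimate To Complete (ETC) sketch; illustration only.
    # The work packages and their (low, most likely, high) durations in days are hypothetical.
    import random

    remaining_work = {
        "design":      (10, 15, 25),
        "integration": (20, 30, 55),
        "test":        (15, 20, 40),
    }

    def simulate_etc(trials=10_000):
        totals = sorted(
            sum(random.triangular(low, high, mode)
                for (low, mode, high) in remaining_work.values())
            for _ in range(trials))
        # Report percentiles, not a single-point number.
        return totals[int(0.50 * trials)], totals[int(0.80 * trials)]

    p50, p80 = simulate_etc()
    print(f"ETC: {p50:.0f} days at 50% confidence, {p80:.0f} days at 80% confidence")

A credible ETC also has to deal with correlations between work packages, non-stationary productivity, and event-based risks; that is exactly the material the technical books and papers cover and the populist treatments leave out.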

Two examples of populist versus technical

Two from my past, two from my current work.

[Image: two books on general relativity]

These two books are about the same topic: general relativity and its description of the shape of our universe. One is a best-selling popularization of the topic, found in many home libraries of those interested in this fascinating subject. The one on the left is on my shelf from a graduate school course on General Relativity, along with Misner, Thorne, and Wheeler's Gravitation.

Dense is an understatement for the math and the results of the book on the left. So if you want to calculate something about a rapidly spinning Black Hole, you're going to need that book. The book on the right will talk about those Black Holes in non-mathematical terms, but no numbers come out of that description.

[Image: two books on probability and risk]

The book on the right is about probabilistic processes in everyday life that we misunderstand or are biased to misunderstand. The many cognitive biases we use to convince ourselves we are making the right decisions on projects are illustrated through nice charts and graphs.

We use the book on the left in our work with the non-stationary stochastic processes of complex project cost and schedule modeling, where quantifying how technical and economic risk may affect a system's cost is critical. It is a treatment of how probability methods are applied to model, measure, and manage risk, schedule, and cost engineering for advanced systems. Garvey shows how to construct the models, do the calculations, and make decisions with those calculations.

Here's The Point - Finally

If you come across a suggestion that decisions can be made in the absence of knowing anything about the future numbers or about actually doing the math, put that suggestion in the class of populist descriptions of a complex topic.

If you can't calculate something, then you can't make a decision based on the evidence represented by numbers. If you can't decide based on the math, then the only way left is to decide on intuition, hunches, opinion, or some other seriously flawed non-analytical basis.

Just a reminder from Mr. Deming, stated in yesterday's post:

[Image: Deming quote]

If it's not your money, there's likely an expectation that those providing the money are interested in the calculations needed to make those decisions.

Act accordingly.

Related articles:
  • Estimating Guidance
  • When the Solution to Bad Management is a Bad Solution
  • Measures of Program Performance
  • Should I Be Estimating My Work?
  • Decision Making Without Estimates?
  • Trust but Verify
  • Anecdotal Evidence is not Actually Evidence
  • Why Trust is Hard
  • Baloney Claims: Pseudo-science and the Art of Software Methods
Categories: Project Management

Quote of the Day

Wed, 11/19/2014 - 19:39

All ideas require credible evidence to be tested, suspect ideas require that even more so - Deep Inelastic Scattering, thesis adviser, University of California, 1978

[Image: Carl Sagan]

When the response to questions about the applicability of an idea is pushback, with accusations that those asking the questions in an attempt to determine the applicability and truth of the statement are somehow afraid of that truth, it suggests there is little evidence behind those conjectures.

When there are proposals that ignore the principles of business, microeconomics, and control systems theory, and are based on well-known bad management practices with well-known and easy-to-apply corrective actions - there is no there there.

[Image: Deming quote]

So without a testable process, in a testable domain, with evidence-based assessment of applicability, outcomes, and benefits, any suggestion is opinion at best and blather at worst.

Related articles Trust but Verify Assessing Value Produced By Investments Should I Be Estimating My Work?
Categories: Project Management

Measures of Program Performance

Mon, 11/17/2014 - 14:29

In a sufficiently complex project we need measures of progress that go beyond burning down our list of same-sized stories, which, by the way, require non-trivial work to make and keep same-sized. And of course, if this same-sizedness does not have a sufficiently small variance, all that effort is a waste.

But if we're not working on a sufficiently small project where same-sized work efforts can be developed, we need measures of progress related to the Effectiveness of the deliverables and the Performance of those deliverables in producing that effectiveness for the customer.

Here's a recent webinar on this topic.

Measurement News Webinar from Glen Alleman

And of course we need to define in what domain this approach can be applied, in what domain it is too much, and in what domain it is actually not enough.

Paradigm of agile project management from Glen Alleman

Then the actual conversation about any approach to Increasing the Probability of Success for our work efforts can start, along with identifying the underlying Root Causes of any impediments to that goal that exist today and the corrective actions needed to remove them. Without knowing the root cause and the corrective actions, any suggested solution has little value; it is speculative at best and nonsense at worst.
Categories: Project Management

When the Solution to Bad Management is a Bad Solution

Mon, 11/17/2014 - 01:40

[Image: Steve Martin in L.A. Story]

Much has been written about the Estimating Problem, the optimism bias, the planning fallacy, and other related issues with estimating in the presence of Dilbert-esque management. The notion that the solution to the estimating problem is not to estimate, but to start work, measure the performance of that work, and use it to forecast completion dates and efforts, is essentially falling into the trap Steve Martin did in L.A. Story:

Using yesterday's weather because he was too lazy to make tomorrow's forecast.

By the way, each of those issues has a direct and applicable solution. So next time you hear someone use them as the basis of a new idea, ask if they have tried the known-to-work solutions to the planning fallacy, estimating bias, optimism bias, and the myriad other project issues with known solutions.

All measuring performance to date does is measure yesterday's weather. This yesterday's weather paradigm has been well studied. If in fact your project is based on Climate then yesterday's weather is likely a good indicator of tomorrow's weather.

The problem of course with the yesterday's weather approach, is the same problem Steve Martin had in LA Story when he used a previously recorded weather forecast for the next day. 

Today's weather turned out not to be like yesterday's weather.

Those posting that stories settle down to a rhythm assume - and we know what assume means, making an Ass out of U and Me - that the variances in the work efforts are settling down as well. That's a hugely naive approach without actual confirmation that the variances are small enough not to undermine a forecast built from past performance. When you have statistical processes looking like the chart below, from small-sample projects in the absence of an actual reference class - in this case a self-reference class - you're also being hugely naive about the possible behaviours of stochastic processes.

[Chart: items completed per sprint]
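As an illustration of the point (with hypothetical throughput numbers, not data from the chart), a small bootstrap shows how wide the forecast range is when "yesterday's weather" is only a handful of highly variable sprints:

    # Illustration only: forecasting from a small, noisy sample of sprint throughputs.
    # The observed throughputs and backlog size are hypothetical.
    import random

    observed_throughput = [6, 11, 4, 9, 14, 7]   # stories completed in the last six sprints
    backlog = 120                                 # stories remaining

    def sprints_to_finish(trials=10_000):
        results = []
        for _ in range(trials):
            done, sprints = 0, 0
            while done < backlog:
                done += random.choice(observed_throughput)  # resample yesterday's weather
                sprints += 1
            results.append(sprints)
        results.sort()
        return results[int(0.10 * trials)], results[int(0.90 * trials)]

    low, high = sprints_to_finish()
    print(f"80% interval: {low} to {high} sprints")

Even this simple resampling gives a wide interval, and it silently assumes the process is stationary - that future sprints will be drawn from the same distribution as the past six, with no new dependencies, defects, or scope.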

Then when you slice the work into same-sized efforts - this is actually the process used in the domains where we work: DOD, DOE, ERP - you're estimating future performance based on a reference class and calling it Not Estimating.

So when you hear examples of Bad Management - over-commitment of work, assigning a project manager to a project that is hundreds of times larger than anything that PM has ever experienced and expecting success, getting a credible estimate and cutting it in half, or any other Dilbert-style management process - and the proposed fix starts with dropping the core process needed to increase the probability of success...

This approach is itself contrary to good project management principles, which are quite simple:

Principles and Practices of Performance-Based Project Management® from Glen Alleman

If we start with a solution to a problem of Bad Management before assuring that the Principles and Practices of Good Management are in place, we'll be paving the cow path, as we say in our enterprise, space, and defense domain. This means the solution will not actually have fixed the problem. It will not have treated the root cause, just addressed the symptoms.

There is no substitute for Good Management.

And when you hear there is a smell of bad management, yet there is no enumeration of the root causes and the corrective actions for those root causes, remember Inigo Montoya's retort to Vizzini:

You keep using that word. I do not think it means what you think it means.

That word is dysfunction, or smell, or root cause - all used while missing the actual enumerated root causes, an assessment of the possible corrective actions, and the resulting removal of the symptoms.

I speak about this approach from my hands-on experience working Performance Assessment and Root Cause Analysis on programs that are in the headlines.

Related articles:
  • Should I Be Estimating My Work?
  • Estimating Guidance
  • Assessing Value Produced By Investments
  • Basis of #NoEstimates are 27 Year Old Reports
Categories: Project Management

Veterans Day

Tue, 11/11/2014 - 17:04

[Image: Veterans Day 2009]

Categories: Project Management

Estimating Guidance

Sun, 11/09/2014 - 23:45

With the plethora of opinions on estimating - some informed, many uninformed - here's my list of books and papers that inform our software estimating activities for Software Intensive Systems. These books range from hard-core engineering to populist texts.

  • Estimating Software-Intensive Systems: Projects, Products, and Processes, Richard D. Stutzke, Addison Wesley. For nontrivial projects, this is the starting point. Stutzke provides an in-depth assessment of the processes and techniques for estimating complex Systems of Systems based on software.
  • Software Estimation: Demystifying the Black Art, Steve McConnell, Microsoft Press. McConnell provides an in-depth look at the problems of estimating software projects and solutions for each problem. The notion that

[Image: quoted claim]

is not actually true after you have read the book. So please read the book and see how McConnell provides step-by-step actions for producing credible estimates.

  • Software Cost Estimation with COCOMO II, Barry Boehm, et al, Prentice Hall - this is the basis of most parametric estimating tools today. COCOMO II can be tuned for a wide variety of domains. Download the tool, try it out, and see what it can do for you (a sketch of the basic effort equation appears after this list).
  • Software Engineering Economics, Barry Boehm, Prentice Hall - writing software for money, especially other people's money, is a Microeconomics problem. This class of problems treats decisions about the impact of future outcomes as opportunity cost decisions. Since these decisions have future impacts, estimates are needed for both the cost and the impact. Writing software for money requires making estimates.
  • A Discipline for Software Engineering, Watts Humphrey, Addison Wesley - this was one of the original texts on how to build software products and the processes needed to assure success. Agile came along later, but many of the principles found in Humphrey's book are still applicable. The reason is that agile is mostly about coding and the development of incremental results; above that level, discovering requirements derived from capabilities is still a core process.
  • Forecasting and Simulating Software Development Projects, Troy Magennis - this book is the agile version of several of the books above. Monte Carlo Simulation is used and development of the needed capabilities follows the approach found in many enterprise domains.
  • Software Sizing and Estimating, Charles Symons, John Wiley - this is a handbook for estimating. Function Points are not as broadly used as they once were, but again the principles are still applicable.
  • Economics of Iterative Software Development: Steering Toward Better Business Results, Walker Royce, Kurt Bittner, and Mike Perrow, Addison Wesley - software development is Microeconomics. This is an immutable principle. This book explains how to apply this principle in an iterative domain.
  • Facts and Fallacies of Software Engineering, Robert Glass, Addison Wesley - a must-read to counter the fallacy that decisions can be made in the absence of estimating the outcomes of those decisions.
  • The Incremental Commitment Spiral Model: Principles and Practices for Successful Systems and Software, Barry Boehm, Jo Ann Lane, Supannika Koolmanojwong, and Richard Turner, Addison Wesley - the current Software Intensive Systems world is moving toward this approach for government systems.
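For a feel of what a parametric model like COCOMO II calculates, here is a sketch of the shape of its effort equation. The constants and inputs below are nominal, illustrative values; real use requires the cost drivers, scale factors, and local calibration described in the book.

    # Shape of the COCOMO II effort equation (post-architecture model), illustration only:
    #   PM = A * Size^E * product(effort multipliers),  E = B + 0.01 * sum(scale factors)
    # A and B here are the commonly cited COCOMO II.2000 calibration values; the inputs are hypothetical.
    def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
        E = B + 0.01 * sum(scale_factors)
        pm = A * (ksloc ** E)
        for em in effort_multipliers:
            pm *= em
        return pm  # person-months

    # A hypothetical 50 KSLOC project with roughly nominal ratings.
    effort = cocomo_ii_effort(50, scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                                  effort_multipliers=[1.00, 1.17, 0.87])
    print(f"{effort:.0f} person-months")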

Estimating software development starts with understanding what the software system is supposed to be doing and how we're able to measure that. This process is based on defining the needed capabilities, the Measures of Effectiveness, Measures of Performance, Key Performance Parameters, and Technical Performance Measures, all needed for the ultimate success of the project, along with a Plan showing the increasing maturity of the delivered capabilities. If we don't have these in some form, it's going to be a disappointment for those paying for our efforts when they get to the end and the outcomes are not what they were expecting.

Capabilities are not Requirements. Requirements implement Capabilities. Capabilities are pretty much fixed while the Requirements evolve. Capabilities Based Planning is the basis of project management in many Software Intensive Systems.
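To make the distinction concrete, here is a small sketch (not the author's tooling; all names and measures are hypothetical) of the traceability this implies: a Capability carries its Measure of Effectiveness and stays fixed, while the Requirements that implement it, each with a Measure of Performance, are allowed to evolve.

    # Sketch of Capabilities-to-Requirements traceability; all names and measures are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        text: str
        measure_of_performance: str          # e.g. throughput, latency, error rate

    @dataclass
    class Capability:
        name: str
        measure_of_effectiveness: str        # the mission/business outcome it must produce
        requirements: list = field(default_factory=list)   # these evolve over the project

    claims = Capability(
        name="Process insurance claims without manual rework",
        measure_of_effectiveness="reduce the claim backlog 30% in the first quarter")
    claims.requirements.append(Requirement(
        text="Validate each claim against the provider network",
        measure_of_performance="500 claims per hour with < 0.1% rejects"))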

Capabilities based planning (v2) from Glen Alleman

With Capabilities in hand, the development of the system moves to the Systems Engineering and Requirements process:

The project's capabilities need to be defined to a level sufficient to start the project - failing to do this results in a Death March at worst, or at best in spending the customer's money to discover what should have been discovered before starting. With the capabilities in hand, the project needs to be managed in a way that will increase the probability of success.

So when you hear of some new approach to project management, ask if there is any connection to a domain and a context in that domain. There are many ideas about how to improve the probability of project success, but without a domain and context it'll be hard to assess whether they are applicable to your specific situation. Here's one way to think about this domain dependency: from solo projects to national assets, the methods, processes, and tools are different, as is the value at risk.

Paradigm of agile project management from Glen Alleman

Related articles:
  • How to Estimate Software Development
Categories: Project Management

Vectors and Scalars

Sat, 11/08/2014 - 11:55

In the mathematics of physics, there are two essential types of values in all calculations - Scalars and Vectors.

Scalars are isolated values, with no outside context. Indeed, they remain the same regardless of any context. A common example would be mass: an object has a mass of 1 kilogram no matter where it is or how much physical space it occupies. The context of the object cannot change the scalar value of its mass. Project examples of scalars include:

  • The number of stories produced in the last iteration.
  • The number of database rows selected on average for a transaction.
  • The number of defects found in this release to production.

Vectors are contexted values, and can change depending on that context. An object has weight, dependent on both its mass and the gravity context it sits in. An object with high mass may still have no weight in the corresponding gravity context. Project examples of vectors include:

  • The change in the productivity of value as a function of time and cost as the project moves through its maturation process.
  • The estimated productivity needed to complete the project on or before a need date, at or below the planned budget to achieve the needed Return on Investment on the need date, so the break even date can be announced to those paying for our work.

But most specifically, vector values allow the calculation of change over time. Numbers (scalars) without context (vectors) are not metrics.

It is meaningless to say that the cost to operate the IT Service Desk has doubled within the last ten years without also showing that the number of employees it serves has tripled in the same time.
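With illustrative numbers: if the Service Desk cost went from $1.0M to $2.0M (doubled) while the employee population it serves went from 1,000 to 3,000 (tripled), the contexted measure, cost per employee served, actually fell from $1,000 to about $667. The raw scalar and the vectored measure move in opposite directions.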

It is meaningless to say a self-selected set of 120 projects exceeded their estimated cost and duration without an assessment of the credibility of the original estimates and a determination of the Root Cause of the overage.

It is meaningless to say the end date and cost can be forecast without saying something about the underlying uncertainties in effort size, risk, inter-dependencies, changing requirements, defect rates, labor absorption rates, integration issues, performance issues, and the complexities of emergent behaviours once the system starts to come together and is applied to fulfill the needed capabilities.

When we hear about small-sample forecasts of same-sized work activities, or about selecting the next priority item to work on without considering the inter-dependencies with past or future work items, we're speaking about Scalar numbers, not Vector numbers.

Vectors state magnitude and direction. Open Loop control only states magnitude. Use it at your own risk.

Categories: Project Management

Product Development Kaizen

Fri, 11/07/2014 - 19:41

Much of the noise around agile is based on platitudes and words in the absence of a domain and a context in that domain. Here's a possible anchoring process:

Product development kaizen (pdk) from Glen Alleman

And an example paper based on these principles:

[Image: example paper]
So when you hear about the next big thing, think of Carl's suggestion, especially if that next big thing violates the principles of business and economics.
Categories: Project Management

Assessing Value Produced By Investments

Tue, 11/04/2014 - 14:28

Speaking at the Integrated Program Management Conference in Bethesda MD this week. The keynote speaker Monday was Katrina McFarland, Assistant Secretary of Defense (Acquisition)(ASD(A)), the principal adviser to the Secretary of Defense and Under Secretary of Defense for Acquisition. 

During her talk she spoke of the role of Earned Value Management. Here's a mashup of her remarks...

EV is a thoughtful tool as the basis of a conversation for determining the value (BCWP) produced by the investment (BCWS). This conversation is an assessment of the efficacy of our budget. 

We can determine the efficacy of our budget through:

  • Measures of Effectiveness of the deliverables in accomplishing the mission or fulfilling the technical and operational needs of the business.
  • Measures of Performance of these deliverables to perform the needed functions to produce the needed effectiveness
  • Technical Performance Measures of these functions to perform at the needed technical level.

These measures answer the question of what is the efficacy of our budget in delivering the outcomes of our project.
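A minimal sketch of the arithmetic behind that conversation, with hypothetical numbers, shows how the earned value (BCWP) is compared against the plan (BCWS) and the actual spend (ACWP):

    # Standard Earned Value indices; the dollar figures are hypothetical.
    bcws = 1_200_000   # Budgeted Cost of Work Scheduled (planned value to date)
    bcwp = 1_050_000   # Budgeted Cost of Work Performed (earned value to date)
    acwp = 1_300_000   # Actual Cost of Work Performed (what we spent to earn it)

    spi = bcwp / bcws  # 0.875 -> earning value slower than planned
    cpi = bcwp / acwp  # ~0.81 -> each dollar spent earns about 81 cents of planned value
    print(f"SPI = {spi:.2f}, CPI = {cpi:.2f}")

These indices say nothing by themselves about whether the deliverables are effective; that is why the Measures of Effectiveness, Measures of Performance, and Technical Performance Measures above have to ride along with the earned value numbers.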

The value of the project outcomes must be traceable to a strategy for the business or mission. Once this strategy has been identified, the Measures of Effectiveness, Measures of Performance, and Technical Performance Measures can be assigned to the elements of the project. These are shown in the figure below.

[Figure: tracing the measures from strategy to the elements of the project]

This approach is scalable up and down the complexity of projects based on five immutable principles of project success.

[Figure: the Five Immutable Principles of project success]

Without credible answers to each of these questions, the project is on the path to failure on day one.

Categories: Project Management

Quote of the Day

Mon, 11/03/2014 - 05:27

Never attribute to malice that which is adequately explained by stupidity - Hanlon's Razor

 

Categories: Project Management

Quote of the Day

Thu, 10/30/2014 - 22:50

The most unsuccessful three years in the education of cost estimators appears to be fifth-grade arithmetic - Norman R. Augustine

Opening line in the Welcome section of Software Estimation: Demystifying the Black Art, Steve McConnell.

Augustine is the former Chairman and CEO of Martin Marietta. His seminal book Augustine's Laws describes the complexities and conundrums of today's business management and offers solutions. Anyone interested in learning how successful management of complex technology-based firms is done should read that book. As well, read McConnell's book and see if you can find where

[Image: quoted claim]

Because I sure can't find that proof or any mention that estimates don't work, other than for those who failed to pay attention in the 5th grade arithmetic class.

Categories: Project Management

Why Trust is Hard

Thu, 10/30/2014 - 14:06

[Image: Hugh MacLeod artwork for Zappos]

Hugh MacLeod's art for Zappos provides the foundation for trust in that environment:

If I'm the head of HR, I'm responsible for filling the desks at my company with amazing employees. I can hold people to all the right standards. But ultimately I can't control what they do. This is why hiring for culture works. What Zappos does is radical because it trusts. It says "Go do the best job you can do for the customer, without policy". And leaves employees to come up with human solutions. Something it turns out they're quite good at, if given the chance.

Now let's take another domain, one I'm very familiar with - fault-tolerant process control systems: software and support hardware applied to the emergency shutdown of exothermic chemical reactors - those that make the unleaded gasoline for our cars - nuclear reactors and conventional fired power generation, gas turbine controls, and other must-work-properly machines. And a similar domain of DO-178C flight control systems, which must equally work without fail and provide all the needed capabilities on day one.

At Zappos, the HR Director describes a work environment where employees are free to do the best job they can for the customer. In the domains above, employees also work to do the best job for the customer they can, but flight safety, life safety, and equipment safety are also part of that best job. In other domains where we work, doing the best job for the customer means processing transactions worth hundreds of millions of dollars with extremely low error rates in the enterprise IT paradigm: medical insurance provider services, HHS enrollment, enterprise IT in a variety of domains.

Zappos can recover from an error; the other domains can't. A nonrecoverable error means serious loss of revenue, or even loss of life. I come from those domains; they inform my view of the software development world, where software fail-safe and fault tolerance are the basis of business success.

So when we hear about the freedom to fail early and fail often in the absence of a domain or context, care is needed. Without a domain and context, it is difficult to assess the credibility of any concept, let alone one that is untested outside of personal anecdote. It comes down to Trust alone or Trust but Verify. I can also guarantee that Zappos has some of the verify process in place. It is doubtful employees are left to do anything they wish for their customer, for the simple reason that there is a business governance process at any firm, no matter the size. Behaviour, even full-trust behaviour, fits inside that governance process.
Categories: Project Management

Quote of the Day

Wed, 10/29/2014 - 16:56

Vision without Execution is Hallucination - Jeffrey E. Garten, The Mind of the CEO

All the rhetoric around any idea needs actionable outcomes that can be tested in the market place, beyond the personal anecdotes of self-selected conversations.

 

Categories: Project Management

Quote of the Day

Wed, 10/29/2014 - 15:56

The Sky's the limit when you don't know what you don't know.

Categories: Project Management

Should I Be Estimating My Work?

Mon, 10/27/2014 - 15:47

The question asked by #NoEstimates is in the form of a statement. 

[Image: the #NoEstimates statement]

On the surface this statement sounds interesting, until the second sentence. The microeconomics of writing software for money is based on estimating future outcomes that result from current-day decisions. But let's pretend for a moment that microeconomics is beyond consideration - this is never true, but let's pretend.

The next approach is to construct a small decision tree that can invert the question. Forget the exploring, since all business effort is a zero-sum game, in that someone has to pay for everything we do: exploring, coding, testing, installing, training, even operating.

[Flow chart: deciding whether estimates are needed]

 So let's start down the flow chart.

Is It Your Money?

In the crass world of capitalism, money talks, BS walks. While this may be abhorrent to some, it's the way the world works, and unless you've got your own bank, you're likely going to have to use other people's money to produce software - either for yourself or for someone else. Self-funded start-up? No problem, but even the best-known names in software today went on to raise more money to move the firm forward. Then self-funded became venture funded, private equity funded, and then publicly funded.

If you're writing software for money, and it's not your money, those providing the money have - or should have, if they're savvy investors - a vested interest in knowing how much it will cost, when it will be done, and most importantly, what will be delivered during the work effort and at the end.

This requires estimating

Is There A Governance Policy Where You Work?

Governance of software development - whether internal projects, external projects, or product development - is a subset of corporate governance.

If you work at a place that has no corporate governance, then estimating is probably a waste.

If, however, you work somewhere that does have a corporate governance process - no matter how simple - and this is likely the case when there is a non-trivial amount of money at risk, then someone, somewhere in the building has an interest in knowing how much things will cost before you spend the money to do them or buy them.

This requires estimating

What's the Value at Risk for Your Project?

If the value at risk for a project is low - that is, if you could spend all the money, consume all the time, and produce nothing, and the management of your firm would write that off as a loss without much concern - then estimating probably doesn't add much value.

But if those providing you the money have an expectation that something of value will be returned and that something is needed for the business, then writing off the time and cost is not likely to be seen as favorable to you the provider. 

We trusted you because you said "trust me" and you didn't show up on or before the planned time, at or below the planned budget, with the needed capabilities - and you didn't want to estimate those up front and keep us informed about your new and updated Estimate To Complete and Estimate At Complete so we could take corrective actions to help you out - then we're going to suggest you look for work somewhere else.

On low-value projects, estimating the probability of success, the probability of the cost of that success, and the probability of the completion date of that success is not likely of much value.

But in the Value at Risk paradigm, the risk of loss on a specific asset (in this case the value produced by the project) is expressed as a threshold loss value together with the probability that the loss on the project's value over a given time horizon exceeds that threshold.
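As an illustration of that paradigm (all numbers hypothetical), a simple simulation produces the threshold loss that is exceeded only 5% of the time; if that figure is material to whoever is funding the work, estimating and re-estimating is part of the job.

    # Illustration only: project Value at Risk as the 95th percentile of simulated loss.
    # The budget, expected value, and loss distributions are hypothetical.
    import random

    def simulate_loss():
        overrun   = max(0.0, random.gauss(0.10, 0.20))   # cost overrun as a fraction of a $2M budget
        shortfall = max(0.0, random.gauss(0.15, 0.25))   # shortfall as a fraction of $5M expected value
        return 2_000_000 * overrun + 5_000_000 * shortfall

    losses = sorted(simulate_loss() for _ in range(10_000))
    var_95 = losses[int(0.95 * len(losses))]
    print(f"95% Value at Risk: ${var_95:,.0f}")

A low number says estimating probably isn't worth much; a large one says those providing the money have a legitimate claim on the estimates.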

As an aside, the notion of slicing does not reduce the Value at Risk. Slicing is an estimating normalization process where the exposure to risk is divided into same-sized chunks. But the natural and event-based variability of the work is still present in the chunks, and the probability of impacting the outcome of the project due to changes in demand, productivity, defects, rework, and unfavorable or even unanticipated interactions between the produced chunks needs to be accounted for in some estimating process. As well, the sunk cost of breaking the work down into same-sized chunks needs to be accounted for.

In our space and defense world, there is the 44 Day Rule, where chunks of work are broken down into 44 days (2 financial months) with tangible deliverables. The agile community would consider this too long, but they don't work on National Asset, billion-dollar, software-intensive programs, so ignore that for the moment.

So yes, slicing is a good project management process. But using the definition of No Estimates in the opening, slicing is Estimating, and it is done in every credible project management process, usually guided by the Work Breakdown Structure.

The Five Immutable Principles of Project Success

To increase the probability of project success, five immutable principles must be in place, each with a credible answer to its question. Each requires some form of estimate, since the outcomes from these principles are in the future. No amount of slicing and dicing is going to produce a non-statistical or non-probabilistic outcome. All slicing does - as mentioned before - is reduce the variance of the work demand; it does nothing about the work productivity, the variance in that productivity process, the rework due to defects, or any unidentified dependencies between those work products, all of which create uncertainty and therefore risk to showing up on time, on budget, and on specification.

[Figure: the Five Immutable Principles of project success]

The Devil Made Me Do It

Those of us seeking an evidence-based discussion about the issues around estimating - and there is an endless supply of real issues with real solutions - have pushed back on using Dilbert cartoons. But I just couldn't resist today's cartoon.

When we need to make a decision between options - Microeconomics and opportunity cost - about some outcome in the future, we need an estimate of the cost and benefit of that choice. To suggest that decisions can be made without estimates has little merit in the real world of spending other people's money.

[Image: Dilbert cartoon]

Related articles:
  • No Estimates Needs to Come In Contact With Those Providing the Money
Categories: Project Management

Anecdotal Evidence is not Actually Evidence

Sun, 10/26/2014 - 16:25

When we hear "I know a CEO that uses this method and she's happy with the outcomes," we're hearing several core fallacies wrapped into one.

The first is the self-selection problem of statistics. This is the Standish problem. Send out a survey, tally the results from those that were returned. Don't publish how many surveys went out and how many came back.

The next is the anecdotal sample: "I know a guy that..." in support of the suggestion that by knowing someone who supports your conjecture, your conjecture is somehow supported.

These are both members of the cherry-picking family. The result is lots of exchanges of questions about the original conjecture, with no basis in evidence for the conjecture.

When you encounter such a conjecture, apply Sagan's BS detection kit:

  • Seek independent confirmation of alleged facts.
  • Encourage an open debate about the issue and the available evidence.
  • In our domain and most others there are no authorities. At most, there are experts.
  • Come up with a variety of competing hypotheses explaining a given outcome. Considering many different explanations will lower the risk of confirmation bias.
  • Quantify whenever possible, allowing for easier comparisons between hypotheses' relative explanatory power.
  • Every step in an argument must be logically sound; a single weak link can doom the entire chain.
  • When the evidence is inconclusive, use Occam's Razor to discriminate between hypotheses.
  • Pay attention to falsifiability. Science does not concern itself with unfalsifiable propositions.

When there is pushback against hard questions, you'll know those making the claims have no evidence and are essentially BS'ing their constituents.

Categories: Project Management

Trust but Verify

Sat, 10/25/2014 - 18:28

There is this notion in some circles that trust trumps all business management processes.

The Russian proverb is

Доверяй, но проверяй. Лучше перебдеть, чем недобдеть.

Whose butchered translation is: Trust but verify; don't rely on chance.

President Reagan used that proverb, reflected back to the Russians, during the INF Treaty negotiations. So what does it mean to trust that people can think for themselves and decide if it applies to them ... that estimates of the cost, performance, and schedule for the project are not needed?

The first question is: what's the value at risk? Trust alone is likely workable when the value at risk is low. In that case the impact of not showing up on or before the needed time, at or below the needed cost, or without all the needed capabilities for the mission or business case is much smaller and therefore acceptable.

Trust | Trust but Verify
6 week DB update with 3 developers | 18 month ERP integration with 87 developers whose performance is reported to the BoD on a quarterly basis
Water filter install in the kitchen using the local handyman | Water filter install in the kitchen with the wife testing to see if it does what it said in the brochure
Install the finch feeder on the pole attached to the back deck in front of the kitchen window overlooking the golf course | Design and build the 1,200 square foot deck attached to the second floor on the back of the house using the architect's plans, and schedule the county inspection certificate so it can be used this summer
Arrange for a ride in a glider at the county airport sometime Saturday afternoon | Plan the departure from DIA and check for departure delays of the SWA flight DEN to DCA

In the first instances (left column), "trust us, we'll be done in the 6 week window" probably means that team doesn't need to do much estimating other than to agree among themselves that the promise made to the manager has a good chance of coming true.

The second (right column) is a $178M ERP integration project in a publicly traded firm, filing its 10-K and subject to FASB 86, having promised the shareholders, the insured, and the provider network that the new system will remove all the grief of the several dozen legacy apps - and will do so on or before the Go Live date announced at the Board Meeting and in the press. Does that promise have a good chance of coming true?

To assess that chance, more than Trust is needed. Evidence of the probability of completing on or before the Go Live date and at or below the target budget is needed. That probability is developed with an estimating process and updated on a periodic basis - in this case every month, with a mid-month assessment of the month-end's reportable data.

So next time you hear...

[Image: quoted claim]

...think of the Value at Risk and the fiduciary responsibility to those funding your work to ask for and produce an answer to the question of how much, when, and what will be delivered. And possibly even the compliance responsibility - SOX, CMS, FAR/DFARS, ITIL - for knowing to some degree of confidence the Estimate To Complete and the Estimate At Complete for your project. Writing 6 week warehouse apps? Probably not much value. Spending hundreds of millions of stockholders' money and betting the company? Knowing something like those numbers is likely needed.

Trust Alone is Open Loop. Trust but Verify is Closed Loop.

Control systems from Glen Alleman

Without knowing the Value at Risk it's difficult, if not impossible, to have a conversation about applying any method of managing the spend of other people's money. Here's a clip from another book that needs to be on the shelf of anyone accountable for spending money in the presence of a governance process: Practical Spreadsheet Risk Modeling. Don't do risk management of other people's money? Then you don't need this book or similar ones, and likely don't need to estimate the impact of decisions made using other people's money. Just keep going; your customer will tell you when to stop.

[Image: excerpt from Practical Spreadsheet Risk Modeling]
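To illustrate the open-loop versus closed-loop distinction above with a toy example (all numbers hypothetical): an open-loop forecast keeps reporting the original 12-month plan no matter what; a closed-loop forecast re-computes the Estimate At Complete each month from measured performance, which is the "verify" part.

    # Toy open-loop vs closed-loop project forecast; the plan and productivity numbers are hypothetical.
    import random

    scope, planned_rate = 120.0, 10.0          # plan: 120 units of work at 10 per month = 12 months

    def closed_loop_forecast(months=6):
        done = 0.0
        for month in range(1, months + 1):
            done += max(0.1, random.gauss(8.0, 1.5))     # measured progress this month (verify)
            rate_to_date = done / month
            eac = month + (scope - done) / rate_to_date  # re-forecast Estimate At Complete
            print(f"month {month}: EAC = {eac:.1f} months (open loop still says {scope/planned_rate:.0f})")

    closed_loop_forecast()

The open-loop report never changes until the money or the calendar runs out; the closed-loop report starts signalling the overrun within a couple of months, while there is still time for corrective action.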
Categories: Project Management

Happy Permutation Day

Fri, 10/24/2014 - 15:05

10/24/2014

1024 - 2014

Thanks to Mr. Honner, a mathematics teacher at Brooklyn Technical High School. If you like mathematics and appreciate the contribution a good teacher can make to mathematical understanding - which is woefully lacking in our project management domain - sign up to get his blog posts.

Categories: Project Management

Basis of #NoEstimates are 27 Year Old Reports

Thu, 10/23/2014 - 17:52

The #NoEstimates movement appears to be based on a 27 year old report† that provides examples of FORTRAN and PASCAL programs as the basis on which estimating is done.

[Image: excerpt from the 1987 report]

 

A lot has happened since 1987. For a short critique of the Software Crisis report - which is referenced in the #NoEstimates argument - see "There is No Software Engineering Crisis."

  • Parametric modeling tools - the structure of software projects is constrained by the external structure of their components. These can be parameterized for estimating purposes.
  • COCOMO - an example of a parametric estimating tool. There are several others.
  • Reference Class Forecasting models, developed as the result of overruns and disasters in many areas, including software development. Now we know better than to succumb to all the biases discovered in the past.
  • Monte Carlo Simulation using Reference Class Forecasting - there are simple and cheap tools, http://www.riskamp.com/ for example. For $125.00 all the handwaving around forecasting cost, schedule, and other project variables can be modeled with ease.
  • Object Oriented Programming - old news, but no more debugging of FORTRAN "Unnamed COMMON" overwriting floating point numbers!!
  • Component Based Software, where we can buy parts and assemble them.
  • SOA and CORBA (TIBCO), where ETL and the Enterprise Bus are part of the Enterprise Architecture. Stop writing application code and start writing scripts to integrate. BTW, the example of having developers write database apps for what is essentially a warehousing app has missed the COTS, component-based solutions bus.
  • FPA - a bit long in the tooth, but the idea is still valid.
  • Databases of Reference Classes.
  • The Web and Web components, same as above.
  • CORBA, same as above.
  • All the web based languages.
  • All the runtime interpretive languages with built-in debuggers, rather than compiled code with stop-dead runtime debugging.
  • ERP and COTS products and components, with out-of-the-box functions that remove the need to write any code. Configure the system, sure. Write some scripting code, ABAP e.g., but no coding in the developer sense.
  • Software as a Service, where you can buy the solution. That was unheard of in 1986 and 1987.
  • DevOps, another unheard-of idea back then.
  • Open Source and Reuse.

Thousands of research and practicum books and papers on how to estimate software projects have been published. Maybe it's time to catch up with the 21st century approach of estimating the time, cost, and capabilities needed to deliver value for those paying for our work. These approaches answer the mail in the 1987 report, along with the much-referenced NATO Software Crisis report published in 1986.

While estimates have always been needed to make decisions in the Microeconomics paradigm of software development, the techniques, tools, and data have improved dramatically in the last 27 years. Let's acknowledge that and start taking advantage of these efforts to improve our lot in life as good stewards of other people's money. And when we hear that #NoEstimates can be used to forecast completion times and costs at the end, test that idea with the activities in the Baloney Claims checklist.

------------

† "#NoEstimates is an approach to software development that arose from the observation that large amounts of time were spent over the years in estimating and improving those estimates, but we see no value from that investment. Indeed, according to scholars Conte, Dunsmore and Shen [1] a good estimate is one that is within 25% of the actual cost, 75% of the time." - from http://www.mystes.fi/agile2014/

As a small aside, that's not what the statement actually says in the context of statistical estimating. It says that 75% of the time the estimate will be within 25% of the actual, so the remaining cases need to be covered with management reserve to protect the budget. Since all project work is probabilistic, uncertainty is both naturally occurring and event-based. Event-based uncertainty can be reduced by spending money; this is a core concept of agile development - do small things to discover what will and won't work. Naturally occurring uncertainty can only be handled with margin. In this statement - remember it's 27 years old - there is a likelihood that a management reserve on the order of 25% will be needed about 25% of the time an estimate is produced. If you know that ahead of time, it won't be a disappointment when it occurs.

This is standard best management practice in mature organizations. In some domains, it's mandatory to have Management Reserve built from Monte Carlo Simulations using Reference Classes of past performance.
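To see what that criterion looks like numerically (with a hypothetical error distribution, not data from the report), a short simulation checks the within-25%-of-actual-75%-of-the-time test and sizes the management reserve that covers the remaining cases:

    # Illustration only: the Conte/Dunsmore/Shen "good estimate" criterion, PRED(0.25) >= 75%,
    # checked against a hypothetical lognormal spread of actuals around a point estimate.
    import random

    estimate = 100.0
    actuals = [estimate * random.lognormvariate(0.0, 0.20) for _ in range(10_000)]

    within_25 = sum(abs(a - estimate) / a <= 0.25 for a in actuals) / len(actuals)
    overruns = sorted(max(0.0, a - estimate) for a in actuals)
    reserve_85 = overruns[int(0.85 * len(overruns))]
    print(f"within 25% of actual: {within_25:.0%}; reserve covering 85% of overruns: {reserve_85:.1f}")

With this particular spread the estimator passes the 75% test, and the simulation also shows roughly how much management reserve protects the budget against the cases that land outside it.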

Related articles:
  • How to Estimate Software Development
  • Software Requirements Are Vague
  • How Not To Make Decisions Using Bad Estimates
  • #NoEstimates? #NoProjects? #NoManagers? #NoJustNo
Categories: Project Management