Herding Cats - Glen Alleman
Performance-Based Project Management® Principles, Practices, and Processes to Increase Probability of Success

The Flaw of Empirical Data Used to Make Decisions About the Future

Sun, 04/26/2015 - 19:20

It's popular in the agile world, and even more popular in the No Estimates paradigm, to use the term empirical data as a substitute for estimating future outcomes. And here's my favorite meme that further confuses the conversation:

Probabilistic forecasting will outperform estimation every time

This of course is "not only not right, it is not even wrong."† Probabilistic forecasting IS estimating. Estimating is about the past, present, and future. Forecasting is estimating about the future. I'll spare the embarrassment by not naming the #NoEstimates person posting this.

First, a definition. Empirical means originating in or based on observation or experience. But we should all know that the data needs to properly represent two sides of the problem: the past and the future.

Let's look at some flawed logic in this empirical data paradigm:

  • The past - we took 18 samples from the start of the project until now, calculated the average value, and we'll use that as a representative number for the future.
  • The future - is the past a proper representation - statistically - of the future?
    • It's taken 45 minutes from the driveway to the airport garage the last 5 times I left for the remote site on a Monday afternoon.
    • What's the probability it will take 45 minutes today?

One more technical detail.

  • The flow or Kanban style processes depend on a critical concept - each random variable that is always present in our project must be independent and identically distributed (IID).
  • This means each random variable has the same probability distribution as the others, and each is independent of the others.
  • This CAN be the case in some situations, but when we are developing software in an emergent environment - not on a production line - it is unlikely. (A quick check of these assumptions is sketched below.)
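
Here's a minimal R sketch of what testing those assumptions might look like. The weekly throughput numbers are made up for illustration; the Ljung-Box test (base R) probes independence, and comparing the halves of the series gives a crude look at stationarity:

# Hypothetical past throughput - samples per week, illustrative only
throughput <- c(12, 15, 9, 22, 14, 8, 25, 11, 16, 7, 24, 13, 18, 10, 21, 9)

# Ljung-Box test: a small p-value says the samples are NOT independent,
# so averaging them and projecting that average forward is unsound
Box.test(throughput, lag = 5, type = "Ljung-Box")

# Crude stationarity check: do mean and variance drift between halves?
h <- length(throughput) %/% 2
c(mean(throughput[1:h]), mean(throughput[(h + 1):length(throughput)]))
c(var(throughput[1:h]),  var(throughput[(h + 1):length(throughput)]))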

So Now Some Issues Of Using Just Empirical Data

The future is emergent in most development work. If this were a production line - and software development is not production - then past performance would be a good indicator of future performance. So let's ask some questions before using this past empirical data:

  • Has the data in the past been properly assessed for variance, stability (stationarity), and independence?
  • If it has, what are the statistical parameters - especially independence? The notion of INVEST in agile cannot be assumed without a test.
  • Is the future going to be stable, stationary, independent, and represented by the past?
  • What's the uncertainty in the future events?
  • What was the uncertainty in the past that influenced the statistics but was not recognized or represented?
  • What are the irreducible uncertainties in the future - the naturally occurring variances that will need margin?
  • What are the reducible uncertainties in the future that must be bought down or covered by management reserve?

Don't have the answers to these and working a non-trivial project? Then our empirical data is not worth much, because it doesn't actually represent the future. We might as well guess - and stop using the term empirical as a substitute for not knowing much of anything about the future.

With those answers we can build a credible model of the future, with interdependencies between the work and probability distribution functions for the statistical behaviors of the work elements, and start asking the killer question:

What's the probability of completing on or before the need date for the work we are producing?

This answer only tells us the probability, not the exact date. So here's the most important point.

  • When we have a model, we can test if there is an acceptable probability of success.
  • That's all we can do, model, test, assess, model some more.

All decisions about future outcomes in the presence of uncertainty need estimates that are placed in the model and assessed for their applicability.

This is called Closed Loop Statistical Process Control, and that's how non-trivial projects are managed. If the value at risk is low, no one cares whether you estimate or not.

† Which, by the way, is the situation with most #NoEstimates conjectures, starting with the willful ignorance of the Microeconomics of decision making as an opportunity cost process. What will it cost us if we decide among multiple choices in the presence of an uncertain future? That question can't be answered without making an estimate of that opportunity cost.

 


Want To Learn How To Estimate? Part Troisième

Sat, 04/25/2015 - 22:46

If you work in a domain, as I do, where you need to answer the question when will we be providing that needed capability, to produce the planned value, for the planned amount of money, then estimating is going to be part of answering those questions.

If you work where those paying for the work have little or no interest in asking these questions or knowing these answers, or have full confidence you'll not overrun the cost and schedule and will deliver the needed capabilities as planned, then maybe estimates are not needed.

It would be interesting to hear from those actually paying for those outcomes, to see what they need to make decisions in the presence of uncertainty.

Here's some more guidance for getting started with estimating software efforts.

And some tools to help out

So you see a trend here? There are nearly unlimited resources on how to estimate software development projects, how to manage in the presence of uncertainty, how to elicit requirements, and how to plan and schedule software projects.

So if you hear we're bad at estimates, that's likely the actual experience of the person making that statement, because that person hasn't yet learned how to estimate. Or when we hear estimates are a waste, it's likely it's not their money, so to them estimates take away from some other activity they see as more important. Why would they care if the project overruns its budget, is late, or doesn't produce the needed value? Or my favorite, estimates are the smell of dysfunction, offered with no domain, root cause, or corrective action - because that's actually hard work, and it's much easier to point out bad management than to suggest good management.

Estimating is hard. Anything of value is hard. All the easy problems have been solved. 

But if we are ever to get a handle on the root causes of software project failure modes, we do need to seek out those root causes. And that means much more than just asking the 5 Whys. That's one of many steps in RCA and far from the complete solution to removing the symptoms of our problems. So start there, but never stop there.

Here's an approach we use.

Unanticipated cost growth is a symptom of failing to properly estimate in the first place, failing to update those estimates as the project progresses, and failing to deal with the underlying risks that drive that cost growth. The same goes for lateness and less than acceptable delivered capabilities. Once the estimate has been established in some credible form and adjusted as the project progresses, you of course have to execute to the plan, or change the plan. It's a closed loop system (sketched in code after the list):

  • Target
  • Execute
  • Assess performance
  • Determine error signal
  • Take corrective actions
    • Change target
    • Change execution processes
  • Repeat until complete
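
Here's a minimal sketch of that loop in R, with made-up numbers, just to show the shape of closed loop control: the error signal is the gap between planned and measured progress, and the corrective action adjusts the execution rate each period.

# Illustrative closed loop: 100 units of work planned across 10 periods
target  <- 100
periods <- 10
planned <- seq(target / periods, target, length.out = periods)  # steering targets
actual  <- 0
rate    <- 8                          # measured rate, units per period

for (p in seq_len(periods)) {
  actual <- actual + rate             # execute, then assess performance
  error  <- planned[p] - actual       # determine the error signal
  if (error > 0) {                    # take corrective action: replan the rate
    rate <- rate + error / (periods - p + 1)
  }
  cat(sprintf("period %2d: planned %5.1f  actual %5.1f  error %5.1f\n",
              p, planned[p], actual, error))
}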

The answer to the root causes that create the symptoms we so quickly label as smells of dysfunction is NOT to stop doing something, but to learn what the actual cause is. Stopping before this knowledge is acquired leaves the symptom in place. And that doesn't help anyone.


Complex, Complexity, Complicated

Fri, 04/24/2015 - 23:41

In the agile community it is popular to use the terms complex, complexity, and complicated interchangeably - and many times wrongly. These terms are often overloaded with an agenda, used to push a process or even a method.

First some definitions

  • Complex - consisting of many different and connected parts; not easy to analyze or understand; complicated or intricate. When a system or problem is considered complex, analytical approaches, like dividing it into parts to make the problem tractable, are not sufficient, because it is the interactions of the parts that make the system complex, and without these interconnections the system no longer functions.
  • Complex System - a functional whole, consisting of interdependent and variable parts. Unlike conventional systems, the parts need not have fixed relationships, fixed behaviors, or fixed quantities, and their individual functions may be undefined in traditional terms.
  • Complicated - containing a number of hidden parts, which can be revealed separately because they do not interact. When components do mutually interact, that interaction creates nonlinear behaviors of the system. In principle any system can be complex; the number of parts or components is irrelevant to the definition of complexity. There can be complexity - nonlinear behavior - in small systems or large systems.
  • Complexity - there is no standard definition, but complexity is a view of systems that suggests simple causes can produce complex effects. The term is generally used to characterize a system with many parts whose interactions with each other occur in multiple ways. Complexity can occur in a variety of forms:
    • Complex behaviour
    • Complex mechanisms
    • Complex situations
    • Complex systems
    • Complex data
  • Complexity Theory - states that critically interacting components self-organize to form potentially evolving structures exhibiting a hierarchy of emergent system properties. This theory takes the view that systems are best regarded as wholes, and studied as such, rejecting the traditional emphasis on simplification and reduction as inadequate techniques on which to base this sort of scientific work.

One more item we need: the types of complexity.

  • Type 1 - fixed systems, where the structure doesn't change as a function of time.
  • Type 2 - systems where time causes changes. This can be repetitive cycles or change with time.
  • Type 3 - moves beyond repetitive systems into organic ones, where change is extensive and non-cyclic in nature.
  • Type 4 - self-organizing systems, where we can combine the internal constraints of closed systems, like machines, with the creative evolution of open systems, like people.

And Now To The Point

When we hear complex, complexity, complex systems, or complex adaptive systems, pause to ask: what kind of complex are you talking about? What Type of complex system? To what system are you applying the term complex? Have you classified that system in a way that actually matches a real system?

It is common for the terms complex, complicated, and complexity to be used interchangeably. And software development is classified - or mis-classified - as one, two, or all three. It is also common to toss these terms around with no actual understanding of their meaning or application.

We need to move beyond buzz words. Words like Systems Thinking. Building software is part of a system. There are interacting parts that, when assembled, produce an outcome. Hopefully a desired outcome. In the case of software, the interacting parts are more than just the parts. Software has emergent properties: a Type 4 system, built from Type 1, 2, and 3 systems. With changes in time and uncertainty, modeling these systems requires stochastic processes. These processes depend on estimating behaviors as a starting point.

The understanding that software development is an uncertain (stochastic) process is well known, starting in the 1980s [1] with COCOMO. Later models, like the Cone of Uncertainty, made it clear that these uncertainties themselves evolve with time. The current predictive models based on stochastic processes include Monte Carlo Simulation of networks of activities, Real Options, and Bayesian Networks. Each is directly applicable to modeling software development projects; a minimal sketch of the first appears below the reference.

[1] Software Engineering Economics, Barry Boehm, Prentice-Hall, 1981.
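
As an illustration only - not a model from the post - here is a minimal Monte Carlo Simulation in R of a three-activity network, each activity's duration drawn from an assumed triangular distribution (minimum, most likely, maximum):

# Inverse-CDF sampling from a triangular distribution (a = min, m = mode, b = max)
rtriangle <- function(n, a, m, b) {
  u <- runif(n)
  f <- (m - a) / (b - a)
  ifelse(u < f,
         a + sqrt(u * (b - a) * (m - a)),
         b - sqrt((1 - u) * (b - a) * (b - m)))
}

set.seed(1)
n  <- 10000
d1 <- rtriangle(n, 5, 8, 15)     # design, durations in days (assumed)
d2 <- rtriangle(n, 10, 14, 30)   # build
d3 <- rtriangle(n, 4, 6, 12)     # integrate and test
total <- d1 + d2 + d3            # activities in sequence

quantile(total, c(0.50, 0.80))   # median vs. 80% confidence total duration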


Build a Risk Adjusted Project Plan in 6 Steps

Thu, 04/23/2015 - 17:35

When we hear about project planning and scheduling, we think about tasks being planned, organized in the proper order, durations assigned to the work, and resources committed to perform the work.

This is all well and good, but without a risk adjusted view of the planned work, it's going to be disappointing at best. There are some key root causes of most project failure. Let's start here.

[Figure: key root causes of project failure]

Each of these has been shown, through research on failed programs, to contribute to cost and schedule impacts: unrealistic expectations of the project's deliverables, technical issues, naive cost and schedule estimating, and less than acceptable risk mitigation planning.

Project Management in Six Steps

[Figure: project management in six steps]

Here's how to address the cost and schedule estimating:

Develop a schedule. Whatever your feelings are about Gantt charts, sticky notes, or any of the handwaving processes you've learned to use, you need a sequence of the work, the dependencies, and the planned durations. Without something like that, you have no idea what work is needed to complete the project. Here's a straightforward Master Schedule for some flight avionics on a small vehicle. It is all software, and it has to complete as planned, otherwise the users can't do their job as planned. And since they're the ones paying for our work, they have an expectation of us showing up, near our budget, with the needed capabilities. Not the minimum, the NEEDED.

[Figure: Master Schedule for the flight avionics]

Using a Monte Carlo tool (RiskyProject), here is a run of the probabilities of cost, duration, and completion dates. All project work is probabilistic; any notion that a deterministic plan can be successful is going to result in disappointment.

We usually call our planning sessions done when we can get the risk adjusted Integrated Master Schedule to show a completion date on or before the need date at an 80% confidence level.

[Figure: probability distribution of completion dates]

With a resource loaded schedule - or some external time phased cost model - we can now show the probability of completing on or before the need date, and at or below the planned budget. The chart below informs everyone what the chances of success are for the cost and schedule aspects of the project. Technically it has to work; the customer gets to say that. The Fit for Use and Fit for Purpose measures if we're in an IT Governance paradigm; the Measures of Effectiveness and Measures of Performance if we're in other paradigms. Those measures can be modeled as well, but I'm just focusing on cost and schedule here.

[Figure: cost and schedule confidence chart]

With this information we can produce the needed margin and management reserve to protect the delivery date and budget (a small sketch of that calculation follows). Showing up late and over budget with a product that works is usually not good business.
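
A sketch, with assumed numbers standing in for real Monte Carlo output, of how that margin falls out of the simulation: the margin protecting the date is the gap between the 80% confidence value and the value we'd otherwise be tempted to promise.

# Assumed Monte Carlo output: simulated completion durations in days
set.seed(7)
durations <- rlnorm(10000, meanlog = log(120), sdlog = 0.15)

p50 <- quantile(durations, 0.50)   # the "most likely" date we'd be tempted to promise
p80 <- quantile(durations, 0.80)   # the date we can commit to at 80% confidence
cat(sprintf("P50 = %.0f days, P80 = %.0f days, margin = %.0f days\n",
            p50, p80, p80 - p50))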

Do You Need all This?

What's the Value At Risk?

  • A $10,000 warehouse database app update - certainly not. Just do it.
  • A website for gaming - probably not. Just start and let the market tell you where to go. Try to forecast periodically what features are coming and how much they'll cost, so those paying get a sense of value returned for their investment.
  • Product development, with a planned launch date, planned sell price, planned cost (so we can make money), and planned features - you probably need some visibility into what's going to be happening in the future.
  • Enterprise IT, say an ERP system, worth tens of millions? Better have a plan.

Don't know the value at risk? Don't have a clear vision of what done looks like in units of measure meaningful to the decision makers? You've got bigger problems - this approach won't help.


The Reason We Plan, Schedule, Measure, and Correct

Wed, 04/22/2015 - 15:56

The notion that planning is of little value seems to be lost on those not accountable for showing up on or before the need date, at or below the needed cost, and with the planned capabilities needed to fulfill the business case or successfully accomplish the mission.

Yogi Berra says it best in our project management domain. And it bears repeating when someone says let's get started and we'll let the requirements emerge. Or my favorite: let's get started and we'll use our performance numbers to forecast future performance, we don't need no stink'in estimates.

Yogi says ...If you don't know where you are going, you'll end up someplace else.

This sentiment is also found in Alice in Wonderland:

"Would you tell me, please, which way I ought to go from here?"
"That depends a good deal on where you want to get to," said the Cat.
"I don't much care where--" said Alice.
"Then it doesn't matter which way you go," said the Cat.
"--so long as I get SOMEWHERE," Alice added as an explanation.
"Oh, you're sure to do that," said the Cat, "if you only walk long enough."
(Alice's Adventures in Wonderland, Chapter 6)

This is often misquoted as If you don't know where you're going, any road will get you there - which is in fact technically not possible, and not from Alice.

So What To Do?

We need a plan to deliver the value that is being exchanged for the cost of that value. We can't determine the resulting value - the benefit - until we first know the cost to produce that value, and then when that value will be arriving.

  • Arriving late and over budget diminishes the value for a higher cost, since arriving late means we've had to pay more in labor as people continue to work on producing the value.
  • Arriving with less than the needed capabilities diminishes the value for the same cost.

Both these conditions are basic Managerial Finance 101 concepts based on Return on Investment:

ROI = (Value - Cost) / Cost
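
A quick worked example in R, with made-up numbers, shows how lateness erodes the return:

# Made-up numbers: $1.5M of planned value for $1.0M of planned cost
value <- 1.5e6
cost  <- 1.0e6
(value - cost) / cost              # ROI = 0.50, a 50% return

# Arriving 3 months late at an assumed $100K/month labor burn raises the cost
late_cost <- cost + 3 * 1e5
(value - late_cost) / late_cost    # ROI ~ 0.15 - the value is diminished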

The first thing some will say is but value can't be monetized. Ask the CFO of your firm what she thinks about monetizing the outcomes of your work on the balance sheet. Better yet, don't embarrass yourself: read Essentials of Managerial Finance, Brigham and Weston. Mine is the 11th Edition; it looks like it's up to the 14th Edition.

As well, once it is established that both cost and value are random variables about measurements in the future, you'll need to estimate them to some degree of confidence if you're going to make decisions. These decisions are actually opportunity cost decisions about future outcomes. This is the basis of the Microeconomics of software development.

So when you hear we can make decisions about outcomes in the future in the presence of uncertainty - the basis of project work - without estimating, don't believe a word of it. Instead read Weston, and you too will have the foundational skills to know better.

Because the closed loop management processes we need on project and product development require that we make decisions in the presence of uncertainty, there is simply no way to do that without estimating all the variances when we Plan, Do, Check, Act. Each is a random variable, with random outcomes. Each requires some assessment of what will happen if I do this. And that notion of let's just try it reminds me of my favorite picture of open loop, no estimates, no measurement, no corrective action management.

Sailing Over The Edge

 


How to Avoid the "Yesterday's Weather" Estimating Problem

Tue, 04/21/2015 - 15:49

One suggestion from the #NoEstimates community is the use of empirical data of past performance. This is many times called yesterday's weather. First, let's make sure we're not using just the averages from yesterday's weather. Even adding the variance to that small sample of past performance can lead to very naive outcomes.

We need to do some actual statistics on that time series. A simple set of R commands will produce the chart below from the time series of past performance data.

[Figure: statistical forecast from the past-performance time series]

 But that doesn't really help without some more work.

  • Is the future really like the past? Are the work products and the actual work performed in the past replicated in the future? If so, this sounds like a simple project: just turn out features that all look alike.
  • Are there any interdependencies that grow in complexity as the project moves forward? This is the integration and test problem, then the system of systems integration and test problem. Again, simple projects don't usually have this problem. More complex projects do.
  • What about those pesky emerging requirements? This is a favorite idea of agile (and correctly so), but simple past performance is not going to forecast the needed performance in the presence of emerging requirements.
  • Then there are all the externalities of project work - where are those captured in the sample of past performance?
  • All big projects have little projects inside them is a common phrase. Except that collection of little projects needs to be integrated, tuned, tested, verified, and validated, so that when all the parts are assembled they actually do what the customer wants.

Getting Out of the Yesterday's Weather Dilemma

Let's use the chart below to talk about some sources of estimating NOT based on simple small samples of yesterday's weather. This is a Master Plan for a non-trivial project to integrate a half dozen or so legacy enterprise systems with a new health insurance ERP system for an integrated payer/provider solution:

[Figure: Capabilities Flow master plan]

  • Reference Class Forecasting for each class of work product.
    • As the project moves left to right in time, the classes of product and the related work likely change.
    • Reference classes for each of these movements through increasing maturity, and increasing complexity from integration interactions, need to be used to estimate not only the current work but the next round of work.
    • In the chart above, work on the left is planned with some level of confidence, because it's work in hand. Work on the right is in the future, so a coarser estimate is all that is needed for the moment.
    • This is the planning package notion used in space and defense: only plan in detail what you understand in detail.
  • Interdependencies Modeling in MCS
    • On any non-trivial project there are interdependencies
    • The notion of INVEST needs to be tested:
      • Independent - not usually the case on enterprise projects.
      • Negotiable - usually not, since the ERP system provides the core capability to do business. It would be illogical to have half the procurement system: we can issue purchase orders and receive goods, but we can't pay for them until we get the Accounts Payable system. We need both at the same time. Not negotiable.
      • Valuable - yep. Why are we doing this if it's not valuable to the business? This is a strawman used by low business maturity projects.
      • Estimable - to a good approximation is what the advice tells us. The term good needs a unit of measure.
      • Small - a domain dependent measure. Small to an enterprise IT project may be huge to a sole contributor game developer.
      • Testable - yep, and verifiable, and validatable, and secure, and robust, and fault tolerant, and meets all performance requirements.
  • Margin - protects dates, cost, and technical performance from irreducible uncertainty. Irreducible means nothing can be done about the uncertainties; it's not the lack of knowledge found in reducible (Epistemic) uncertainty. Irreducible uncertainty is Aleatory: the natural randomness in the underlying processes. When we are estimating in the presence of aleatory uncertainty, we must account for it. This is why using the average of a time series to make a decision about possible future outcomes will always lead to disappointment.
    • First, we should use the Most Likely value of the time series, not the Average of the time series.
    • The Most Likely value - the Mode - is the number that occurs most often of all the values that have occurred in the past. This should make complete sense when we ask what value will appear next? Why, the value that has appeared most often in the past.
    • The average of the two numbers 1 and 99 is 50. The average of the two numbers 49 and 51 is also 50. Be careful with averages in the absence of knowing the variance. (A small sketch after this list shows the difference.)
  • Risk retirement - Epistemic uncertainty creates risks that can be retired. This means spending money and time. So when we're looking at past performance in an attempt to estimate future performance (yesterday's weather), we must determine what kinds of uncertainty there are in the future and what kinds we encountered in the past.
    • Were they, and are they, reducible or irreducible?
    • Did the performance in the past contain irreducible uncertainties, baked into the numbers, that we did not recognize?
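
Here's a small R sketch - the data is invented for illustration - of why the mode and the variance matter when the averages look identical:

# Most frequently occurring value in a sample (the Mode)
mode_of <- function(x) {
  ux <- unique(x)
  ux[which.max(tabulate(match(x, ux)))]
}

a <- c(49, 51, 50, 49, 51, 50, 50, 49, 51, 50)  # tight around 50
b <- c(1, 99, 2, 98, 50, 1, 99, 50, 1, 99)      # wildly dispersed

c(mean(a), mean(b))          # both averages are 50
c(mode_of(a), mode_of(b))    # the most likely values are 50 and 1
c(var(a), var(b))            # the variances differ enormously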

This brings up a critical issue with all estimates. Did the numbers produced from past performance meet the expected values, or were they just the numbers we observed? Taking the observed numbers and using them to forecast the future is an Open Loop control system. What SHOULD the numbers have been to meet our goals? What SHOULD the goal have been? If we don't know that, there is no baseline to compare past performance against to see if it will be able to meet the future goal.

I'll say this again - THIS IS OPEN LOOP control, NOT CLOSED LOOP. No amount of dancing around will get past this; it's a simple control systems principle, found here: Open and Closed Loop Project Controls.

  • Measures of physical percent complete to forecast future performance with cost, schedule, and technical performance measures - once we have the notion of Closed Loop Control, have constructed a steering target, and can capture actuals against plan, we need to define measures that are meaningful to the decision makers. Agile does a good job of forcing working product to appear often. The assessment of Physical Percent Complete, though, needs to define what that working software is supposed to do in support of the business plan.
  • Measures of Effectiveness - one very good measure is Effectiveness. Does the software provide an effective solution to the problem? This begs the questions: what is the problem, and what would an effective solution look like were it to show up?
    • MOEs are operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment, under a specific set of conditions.
  • Measures of Performance - the companion of Measures of Effectiveness.
    • MOPs characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
  • Along with these two measures are Technical Performance Measures.
    • TPMs are attributes that determine how well a system or system element is satisfying, or is expected to satisfy, a technical requirement or goal.
  • And finally there are Key Performance Parameters.
    • KPPs represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.

The connections between these measures are shown below.

[Figure: connections between MOEs, MOPs, TPMs, and KPPs]

With these measures and tools for making estimates of the future - forecasts - using statistical methods, we can use yesterday's weather, tomorrow's models and related reference classes, and the desired MOEs, MOPs, KPPs, and TPMs to construct a credible estimate of what needs to happen, then measure what is happening, close the loop with an error signal, and take corrective action to stay on track toward our goal.

This all sounds simple in principle, but in practice of course it's not. It's hard work. But when the value at risk is outside the tolerance range the customer is willing to risk on their investment, we need tools and processes to actually control the project.


Doing the Math

Tue, 04/21/2015 - 15:09

In the business of building software intensive systems, estimating, performance forecasting, and management - closed loop control in the presence of uncertainty for all variables - is the foundation needed for increasing the probability of success.

This means math is involved in planning, estimating, measuring,  analysis, and corrective actions to Keep the Program Green.

When we have past performance data, here's one approach...

And the details of the math are in the conference paper.


Approximating for Improved Understanding

Mon, 04/20/2015 - 22:05

The world of projects, project management, and the products or services produced by those projects is uncertain. It's never certain. Seeking certainty is not only naive, it's simply not possible.

Making decisions in the presence of this uncertainty is part of our job as project managers, engineers, and developers, on behalf of those paying for our work.

It's also the job of the business, whose money is being spent on the projects to produce tangible value in exchange for that money.

From the introduction of the book...

Science and engineering, our modern ways of understanding and altering the world, are said to be about accuracy and precision. Yet we best master the complexity of our world by cultivating insight rather than precision. We need insight because our minds are but a small part of the world. An insight unifies fragments of knowledge into a compact picture that fits in our minds. But precision can overflow our mental registers, washing away the understanding brought by insight. This book shows you how to build insight and understanding first, so that you do not drown in complexity.

So what does this mean for our project world?

  • The future is uncertain. It is always uncertain. It can't be anything but uncertain. Assuming certainty is a waste of time. Managing in the presence of uncertainty is unavoidable. To do this we must estimate. This is unavoidable. To suggest otherwise willfully ignores the basis of all management practices.
  • This uncertainty creates risk to our project: risk to the cost, the schedule, and the delivered capabilities of the project or product development. To manage with a closed loop process, estimates are needed. This is unavoidable as well.
  • Uncertainty is either reducible or irreducible.
    • Reducible uncertainty can be reduced with new information. We can buy down this uncertainty.
    • Irreducible uncertainty - the natural variation in what we do - can only be handled with margin.

In both these conditions we need to get organized in order to address the underlying uncertainties. We need to put structure in place in some manner. Decomposing the work is a common way in the project domain: from a Work Breakdown Structure to simple sticky notes on the wall, breaking problems down into smaller parts is a known, successful way to address a problem.

With this decomposition, now comes the hard part. Making decisions in the presence of this uncertainty.

Probabilistic Reasoning

Reasoning about things that are uncertain is done with probability and statistics. Probability is a degree of belief. 

I believe we have an 80% probability of completing on or before the due date for the migration of SQL Server 2008 to SQL Server 2012.

Why do we have this belief? Is it based on our knowledge from past experience? Is this knowledge sufficient to establish that 80% confidence?

  • Do we have some external model of the work effort needed to perform the task?
  • Is there a parametric model of similar work that can be applied to this work?
  • Could we decompose the work into smaller chunks that could then be used to model the larger set of tasks?
  • Could I conduct some experiments to improve my knowledge?
  • Could I build a model from intuition that could be used to test the limits of my confidence?

The answers to each of these inform our belief. One of them - decomposing the work and modeling the chunks - is sketched below.
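
A minimal sketch in R, where the chunks and their duration ranges are assumptions, not real data: the degree of belief behind a statement like "80% probability" is the fraction of simulated outcomes that finish on or before the need date.

# Decompose the migration into assumed chunks with uncertain durations (days)
set.seed(11)
n      <- 10000
schema <- runif(n,  5, 12)          # schema conversion
code   <- runif(n,  8, 20)          # stored procedure rework
test   <- runif(n,  6, 15)          # regression testing
total  <- schema + code + test

mean(total <= 37)                   # P(complete on or before day 37) - about 0.8 here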

Chaos, Complexity, Complex, Structured?

A well known agile thought leader made a statement today

I support total chaos in every domain

This is unlikely to result in sound business decisions in the presence of uncertainty. Although there may be domains where chaos produces usable results, when we need some degree of confidence that the money being spent will produce the needed capabilities, on or before the need date, at or below the budget needed to be profitable, and with the collection of all the capabilities needed to accomplish the mission or meet the business case, we're going to need to know how to manage our work to achieve those outcomes.

So let's assume - with a high degree of confidence - that we need to manage in the presence of uncertainty, and that we have little interest in encouraging chaos. Here's one approach.

So In The End

Since all the world's project work is a set of statistical processes producing probabilistic outcomes, which in turn create risk to any expected results when not addressed properly, the notion that decisions can be made under these conditions without estimates can only be explained by willful ignorance of the basic facts of the physics of project work.


Estimates

Mon, 04/20/2015 - 22:00

Estimation and measurement of project attributes are critical success factors for designing, building, modifying, and operating products and services.†

Good estimates are the key to project success. Estimates provide information to the decision makers to assess adherence to performance specifications and plans, make decisions, revise designs and plans, and improve future estimates and processes.

We use estimates and measurements to evaluate the feasibility and affordability of products being built, choose between alternative designs, assess risk, and support business decisions. Engineers compare estimates of technical baselines to observed performance to decide if the product meets its functional and performance requirements. Managers use these to control processes and detect compliance problems. Process managers use capability baselines to improve production processes.

Developers, engineers, and planners estimate the resources needed to develop, maintain, enhance, and deploy products. Project planners use estimates for staffing and facilities. Planners and managers use estimates of resources to determine project cost and schedule and to prepare budgets and plans.

Managers compare estimates - cost and schedule baselines - and actual values to determine deviations from plan and understand the root causes of those deviations needed to take corrective actions. Estimates of product, project, and process characteristics provide baselines to assess progress during the project. 

Bad estimates affect all participants in the project or product development process. Incomplete and inaccurate estimates mean inadequate time and money are available for increasing the probability of project success.

The Nature of Estimation

The verb estimate means to produce a statement of the approximate value of some quantity that describes or characterizes an object. The noun estimate refers to the value produced by the verb. The object can be an artifact - software, hardware, documents - or an activity - planning, development, testing, or process.

We make estimates because we cannot directly measure the value of that quantity because:

  • The object is inaccessible
  • The object does not exist yet
  • The Measurement would be too expensive

Reasons to Estimate and Measure Size, Cost and Schedule

  • Evaluate feasibility of requirements.
  • Analyze alternative designs and implementations.
  • Determine required capacity and speed of the produced results.
  • Evaluate performance - accuracy, speed, reliability, availability, and the other ...ilities.
  • Identify and assess technical risks.
  • Provide technical baselines for tracking and guiding.

Reasons to Estimate Effort, Cost, and Schedule

  • Determine project feasibility in terms of cost and schedule.
  • Identify and assess risks.
  • Negotiate achievable commitments.
  • Prepare realistic plans and budgets.
  • Evaluate business value - cost versus benefit.
  • Provide cost and schedule baselines for tracking and guiding.

Reasons to Estimate Capability and Performance

  • Predict resource consumption and efficiency.
  • Establish norms for expected performance.
  • Identify opportunities for improvement.

There are many sources of data for making estimates, some reliable, some not. Human subject matter expert estimates have been shown to be the least reliable, accurate, and precise, due to the biases involved in the human process of developing the estimate. Estimates based on past performance, while useful, must be adjusted for the statistical behaviors of the past and the uncertainty of the future.

If the estimate is misused in any way, this is not the fault of the estimate - both noun and verb - but simply bad management. Fix that first, then apply proper estimating processes.

If your project or product development effort does none of these activities or has no need for information on which to make a decision, then estimating is likely a waste of time.

But before deciding estimates are the smell of dysfunction, with NO root cause identified for corrective action, check with those paying your salary first, to see what they have to say about your desire to spend their money in the presence of uncertainty without an estimate.

† This post is extracted from Estimating Software Intensive Systems: Projects, Products and Processes, Dr. Richard Stutzke, Addison Wesley. This book is a mandatory read for anyone working in a software domain on any project that is mission critical. This means if you need to show up on or before the need date, at or below your planned cost, with the needed capabilities - the Key Performance Parameters without which the project will get cancelled - then you're going to need to estimate all the parameters of your project. If your project doesn't need to show up on time, stay on budget, or can provide less than the needed capabilities, there is no need to estimate. Just spend your customer's money; she'll tell you when to stop.


Root Cause Analysis

Mon, 04/20/2015 - 16:07

Root Cause Analysis is a means to answer why we keep seeing the same problems over and over again. When we treat only the symptoms, the root cause remains.

In Lean there is a supporting process: the 5S's. 5S is a workplace organization method that uses a list of five words - seiri, seiton, seiso, seiketsu, and shitsuke. This list describes how to organize a work place for efficiency and effectiveness by identifying and storing the items used, maintaining the areas and items, and sustaining the new order. The decision making process usually comes from a dialogue about standardization, which builds understanding among the employees of how they should do their work.

At one client we are installing Microsoft Team Foundation Server for development, release management, and test management. The current process relies on the heroics of many on the team every Thursday night to get the release out the door.

We started the improvement of the development, test, and release process with Root Cause Analysis. In this domain cyber and data security are paramount, so when there is a cyber or data security issue, RCA is the core process used to address it.

The results of the RCA have shown that the work place is chaotic at times, code is poorly managed, testing struggles on deadlines, and the configuration of the release base is inconsistent. It was clear we were missing tools, but the human factors were also a source of the problem - the symptoms being latent defects and a break/fix paradigm.

There are many ways to ask and answer the 5 Whys and apply the 5 S's, but until that is done, the actual causes are determined, and the work place is cleaned up, the symptoms will continue to manifest in undesirable ways.

If we're going to start down the path of 5 Whys and NOT actually determine the Root Cause and develop a corrective action plan, then that is in itself a waste. 


Economics of Software Development

Sun, 04/19/2015 - 16:21

Economics is called the Dismal Science. Economics is the branch of knowledge concerned with the production, consumption, and transfer of wealth. It is generally about the behaviors of humans and markets as they pursue desired ends with scarce means.

How does economics apply to software development? We're not a market, and we don't create wealth, at least not directly; we create products and services that may create wealth. Microeconomics is a branch of economics that studies the behavior of individuals and their decision making on the allocation of limited resources. It's the scarcity of resources that is the basis of Microeconomics, and software development certainly operates in the presence of scarce resources. Microeconomics is closest to what we need to make decisions in the presence of uncertainty. The general economics processes are of little interest here, so starting with big picture economics books is not much use.

Software economics is a subset of Engineering Economics. A key aspect of all Microeconomics applied to engineering problems is the application of Statistical Decision Theory - making decisions in the presence of uncertainty. Uncertainty comes in two types:

  • Aleatory uncertainty - the naturally occurring variances in the underlying processes.
  • Epistemic uncertainty - the lack of information about a probabilistic event in the future.

Aleatory uncertainty can be addressed by adding margin to our work - time and money. Epistemic uncertainty is different: the missing information has economic value to our decision making processes. That is, there is economic value in buying down uncertainty in decision problems about the future.

This missing information can be bought down with simple solutions - prototypes, for example, or short deliverables to test an idea or confirm an approach. Both are the basis of Agile, and both have been discussed in depth in Software Engineering Economics, Barry Boehm, Prentice Hall, 1981.

Engineering economics is the application of economic techniques to the evaluation of design and engineering alternatives. It assesses the appropriateness of a given project, estimates its value, and justifies the project (or product) from an engineering standpoint.

This involves the time value of money and cash-flow concepts - compound and continuous interest. It continues with the economic practices and techniques used to evaluate and optimize decisions on the selection of strategies for project success.

When I hear I read that book and it's about counting lines of code, the reader has failed to comprehend the difference between principles and practices. The sections on Statistical Decision Theory are about the Expected Value of Perfect Information and how to make decisions with imperfect information.

Statistical Decision Theory is about making choices: identifying the values, uncertainties, and other issues relevant to a given decision, its rationality, and the resulting optimal decision. In Statistical Decision Theory, the underlying statistical processes and the resulting probabilistic outcomes require us to estimate in the presence of uncertainty. The Expected Value of Perfect Information makes this concrete, as sketched below.
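
A minimal R sketch of the Expected Value of Perfect Information for a two-option decision - the payoffs and probabilities are invented for illustration:

# Two options, two equally likely future states, payoffs in $K (made up)
p      <- c(0.5, 0.5)                    # probabilities of states s1 and s2
payoff <- rbind(build = c(400, -100),    # build in-house: great in s1, bad in s2
                buy   = c(150,  150))    # buy a package: the same either way

best_now   <- max(payoff %*% p)               # commit now: best expected payoff (150)
best_later <- sum(p * apply(payoff, 2, max))  # with perfect info: per-state best (275)
best_later - best_now                         # EVPI = 125: the most the information is worth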

Writing software for money - other people's money - requires us to estimate how much money, when we'll be done spending that money, and what will result from that spend.

This is the foundation of the Microeconomics of Software Development

If there is no scarcity of resources - time, cost, technical performance - then estimating is not necessary. Just start the work, spend the money, and you'll be done when you're done. If, however, resources are scarce, as they are on any non-trivial project, then estimating is unavoidable.


For the Upcoming Graduates

Fri, 04/17/2015 - 05:36

No matter your life experience, your view of the world, whether you've had military experience or not, this is - or should be - an inspirational commencement speech. 

 


Debunking

Wed, 04/15/2015 - 04:21

This blog has been focused on improving program and project management processes for many years. Over that time I've run into several bunk ideas around projects, development, methods, and the processes of managing other people's money. When that happens, the result is a post or two about the nonsense idea and the corrections to those ideas, not just from my experience but from the governance frameworks that guide our work.

A post on Tony DaSilva's blog about the Debunkers Club struck a chord. I've edited that blog's content to fit my domain, with full attribution.

This Blog is dedicated to the proposition that all information is not created equal. Much of it is endowed by its creators with certain undeniable wrongs. Misinformation is dangerous!!

There's a lot of crap floating around any business or technical field. Much of it gets passed around by well-meaning folks, but it is harmful regardless of the purity of the conveyor.

People who attempt to debunk myths, mistakes, and misinformation are often tireless in their efforts. They are also too often helpless against the avalanche of misinformation.

The Debunker Club is an experiment in professional responsibility. Anyone who's interested may join as long as they agree to the following:

  1. I would like to see less misinformation in the project management field. This includes planning, estimating, risk, execution, performance management, and development methods.
  2. I will invest some of my time in learning and seeking the truth, from sources like peer-reviewed scientific research or translations of that research.
  3. I will politely, but actively, provide feedback to those who transmit misinformation.
  4. I will be open to counter feedback, listening to understand opposing viewpoints based on facts, example, and evidence outside personal opinion. I will provide counter-evidence and argument when warranted.

Cui Bono

Tue, 04/14/2015 - 00:57

When we hear a suggestion about a process that inverts the normal process based on a governance framework - say the Microeconomics of Software Development - we need to ask who benefits. How would that suggestion be tangibly beneficial to the recipient once the process is inverted?

Estimates, for example, are for the business. Why would the business no longer want an estimate of the cost, schedule, or technical performance of the provided capabilities?

In the world of spending money to produce value, the one that benefits should be - must be - the one paying for that value, who therefore has a compelling interest in the information needed to make decisions about how the money is spent.

When that relationship between paying and benefiting is inverted, the path to Cui Bono is inverted as well.

In the end, follow the money must be the basis of assessing the applicability of any suggestion. If it is suggested that decision making can be done in the absence of estimating the impacts of those decisions, ask who benefits. If it's not those paying for the value, then Cui Bono no longer holds.


The Flaw of Averages and Not Estimating

Mon, 04/13/2015 - 16:04

There is a popular notion in the #NoEstimates paradigm that empirical data is the basis of forecasting the future performance of a development project. In principle this is true, but the concept is incomplete in the way it is used. Let's start with the data source used for this conjecture.

There are 12 samples in the example used by #NoEstimates - in this case, stickies per week. From this time series an average is calculated for the future. This is the empirical data used to estimate in the No Estimates paradigm. The average is 18.1667, or just 18 stickies per week.

[Figure: the 12 samples of stickies per week]

 

But we all have read - or should have read - Sam Savage's The Flaw of Averages. This is a very nice populist book. By populist I mean an easily accessible text with little or no mathematics in the book, although Savage's work is highly mathematical, with his tool set.

There is a simple set of tools that can be applied for time series analysis, using past performance to forecast the future performance of the system that created the previous time series. The tool is R, and it is free for all platforms.

Here's the R code for performing a statistically sound forecast to estimate the range of values the past empirical stickies could take on in the future.

Put the time series in a plain text file named Book1.txt (one value per line), then in R:

> library(forecast)                            # provides forecast() for ARIMA fits
> Book1 = scan("Book1.txt")                    # read the raw weekly samples
> SPTS = ts(Book1)                             # convert the data to a time series
> SPFIT = arima(SPTS)                          # fit a simple (default order) ARIMA model
> SPFCST = forecast(SPFIT, level = c(80, 90))  # build a forecast with 80% and 90% bands
> plot(SPFCST)                                 # plot the results

Here's that plot, showing the 80% and 90% confidence bands for the possible future outcomes derived from the past performance - the empirical data.

The 80% range is 27 to 10 and the 90% range is 30 to 5.

[Figure: ARIMA forecast with 80% and 90% confidence bands]

So, the killer question:

Would you bet your future on a probability of success with a +65% to -72% range on the cost, schedule, or technical performance of the outcomes?

I hope not. This is a flawed example, I know: too small a sample, no adjustment of the ARIMA factors, just a quick raw assessment of the data used in some quarters as a replacement for actually estimating future performance. But this assessment shows how empirical data COULD support making decisions about future outcomes in the presence of uncertainty using a past time series, once the naive assumptions of sample size and wide variance are corrected.

The End

If you hear you can make decisions without estimating, that's pretty much a violation of all established principles of Microeconomics and statistical forecasting. When the answer comes back we used empirical data, take your time series of empirical data, download R, install the needed packages, put the data in a file, apply the functions above, and see if you really want to commit to spending other people's money with a confidence range of +65% to -72% of performing like you did in the past. I sure hope not!!


Who Builds a House without Drawings?

Mon, 04/13/2015 - 05:46

This month's issue of Communications of the ACM has a Viewpoint article titled "Who Builds a House without Drawing Blueprints?" in which two ideas are presented:

  • It is a good idea to think about what we are about to do before we do it.
  • If we're going to write a good program, we need to think above the code level.

The example from the last bullet is that there are many coding methods - test driven development, agile programming, and others...

If the only sorting algorithm we know is a bubble sort, no coding method will produce code that sorts in O(n log n) time.
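
To see that point in running code, here's a small R sketch (illustrative only) comparing a hand-rolled bubble sort with R's built-in sort(), which uses an O(n log n) algorithm - no development method rescues the O(n^2) algorithm:

# Classic bubble sort: repeatedly swap adjacent out-of-order pairs
bubble_sort <- function(x) {
  n <- length(x)
  for (i in seq_len(n - 1)) {
    for (j in seq_len(n - i)) {
      if (x[j] > x[j + 1]) {
        tmp <- x[j]; x[j] <- x[j + 1]; x[j + 1] <- tmp
      }
    }
  }
  x
}

set.seed(42)
for (n in c(1000, 2000, 4000)) {   # doubling n roughly quadruples bubble sort's time
  v <- runif(n)
  cat(sprintf("n = %4d  bubble: %6.2fs  sort(): %6.4fs\n", n,
              system.time(bubble_sort(v))["elapsed"],
              system.time(sort(v))["elapsed"]))
}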

Not only do we need some sense of what capabilities the software must deliver in exchange for its cost, but also: do those capabilities meet the needs? What are the Measures of Effectiveness and Measures of Performance the software must fulfill? In what order must they be fulfilled? What supporting documentation is needed for the resulting product or service in order to maintain it over its life cycle?

If we do not start with a specification, every line of code we write is a patch.†

This notion brings up several other gaps in our quest to build software that fulfills the needs of those paying. There are several conjectures floating around that willfully ignore the basic principles of providing solutions acceptable to the business. Since the business operates on the principles of the Microeconomics of decision making, let's look at developing software from the point of view of those paying for our work. It is conjectured that ...

  • Good code is its own documentation.
  • We can develop code just by sitting down and doing it. Our mob of coders can come up with the best solution as they go.
  • We don't need to estimate the final cost and schedule; we'll just use some short term, highly variable empirical data to show us the average progress and project that.
  • All elements of the software can be sliced to a standard size, and we'll use Kanban to forecast future outcomes.
  • We're bad at estimating and our managers misuse those numbers, so the solution is to Not Estimate, and that will fix the root cause of those symptoms of Bad Management.

There are answers to each of these in the literature on the immutable principles of project management, but I came across a dialog that illustrates the naïvety around spending other people's money to develop software without knowing how much, what, and when.

Here's a conversation - following Galileo Galilei's Dialogue Concerning the Two Chief World Systems - between Salviati, who argues for the principles of celestial mechanics, and Simplicio, a dedicated follower who claims those principles have no value for him, as he sees them as an example of dysfunction.

I'll co-opt the actual social media conversation and use those words through Salviati and Simplicio as the actors. The two people on social media are both fully qualified to be Salviati. Galileo used Simplicio as a double entendre to make his point, so neither is Simplicio here:

  • Simplicio - My first-born is a novice software developer but is really bad at math, especially statistics, and at those pesky estimating requests asked by the managers he works for. He's thinking he needs to find a job where they let him develop code, where there is #NoMath needed to make those annoying estimates.
  • Salviati - Maybe you can tell him you're not suggesting he not learn math, but simply that he reduce his dependence on math in his work, since it is hard and he's not very good at it.
  • Simplicio - Yes, lots of developers struggle with answering estimate questions based on statistics and other known and tested approaches. I'm suggesting he find some alternative to having to make estimates, since he's so bad at them.
  • Salviati - I'll agree for the moment, since he doesn't appear to be capable of learning the needed math. Perhaps he should seek other ways of answering the questions asked of him by those paying his salary. Ways in which he can apply #NoMath to answering those questions needed by the business people to make decisions.
  • Simplicio - Maybe he can just pick the most important thing to work on first, do that, then go back and start the next most important thing, and do that until he is done. Then maybe those paying him will stop asking when will you be done and how much will it cost, and, oh yes, all that code you developed will meet the needed capabilities I'm paying you to develop, right?
  • Salviati - Again, this might be a possible solution to your son's dilemma. After all, we're not all good at using statistics and other approaches to estimate those numbers needed to make business decisions. Since we really like to just start coding, maybe the idea of #NoMath is a good one and he can just be an excellent coder. Those paying for his work really only want it to work on the needed day, for the expected cost, and provide the needed capabilities - all within the confidence levels needed to fulfill their business case so they can stay in business.
  • Simplicio - He heard of this idea on the internet: collect old data and use it for projecting the new data. That'd of course not be the same as analyzing the future risks, the changing sizes of work, and all the other probabilistic outcomes. Yeah, that's the work: add up all the past estimates, find the average, and use that.
  • Salviati - That might be useful for him, but make sure you caution him that those numbers from the past may not represent the numbers in the future if he doesn't assess what capabilities are needed in the future and what the structure of the solution is for those capabilities. And while he's at it, make sure the uncertainties in the future are the same as the uncertainties in the past; otherwise those past numbers are pretty much worthless for making decisions about the future.
  • Simplicio - Sure, but at his work, his managers abuse those numbers, taking them as point values and ignoring the probabilistic ranges he places on them. His supervisor - the one with the pointy hair - simply doesn't recognize that all project work is probabilistic and wants his developers to just do it.
  • Salviati - Maybe your son can ask his supervisor's boss - the one that provides the money for his work - Five Whys as to why he even needs an estimate. Maybe that person will be happy to have your son spend his money with no need to know how much it will cost in the end, or when he'll be done, or really what will be done when the money and time run out.
  • Simplicio - Yes, that's the solution. All those books, classes, and papers he should have read, all those tools he could have used, really don't matter any more. He can go back and tell the person paying for the work that he can produce the result without using any math whatsoever. Just take whatever he is producing, one slice at a time, and eventually he'll get what he needs to fulfill his business case, hopefully before time and money run out.

† Viewpoint: Who Builds a House without Drawing Blueprints?, Leslie Lamport, CACM, Vol. 58, No. 4, pp. 38-41.

Categories: Project Management

Top Project Management Blogs of 2015

Fri, 04/10/2015 - 15:20


The list, in alphabetical order, includes this blog. Thanks!

 

Categories: Project Management

The Microeconomics of a Project Driven Organization

Thu, 04/09/2015 - 16:30

The notion that we can ignore - many times willfully - the microeconomics of decision making is common in some development domains. Any project driven paradigm has many elements, each interacting with the others in random ways, in nonlinear ways, in ways we may not be able to even understand when the maturity of the organization has not yet developed to the level needed to manage in the presence of uncertainty.

[Figure: the project driven organization]

So When We Say Project What Do We Mean?

The term project has an official meaning in many domains. Work that has a finite duration is a good start. But then what is finite? Work that makes a change to an external condition. But what does change mean, and what is external? In most definitions, operations and maintenance are not usually budgeted as projects. There are accounting rules that describe projects as well. Once we land on an operational definition of the project, here's a notional picture of the range of projects.

[Figure: the notional range of projects]

When we hear a suggestion about any process for project management, we need to first establish a domain and a context in that domain to test the idea.

My favorite questionable conjecture is that we can make decisions about the spending of other people's money without estimating the outcomes of those decisions. Making decisions about an uncertain future is the basis of Microeconomics.

One framework for making decisions in the presence of uncertainty is Organizational Governance. Without establishing a governance framework - ranging from one like that shown below to no governance at all, just DO IT - it's difficult to have a meaningful conversation about the applicability of any project management process.

So when we hear a new and possibly counterintuitive suggestion, start by asking In What Governance Model Do You Think This Idea Might Be Applicable?

[Figure: a notional governance architecture]

 

Related articles

  • Decision Analysis and Software Project Management
  • Incremental Delivery of Features May Not Be Desirable
Categories: Project Management

The Microeconomics of Decision Making in the Presence of Uncertainty - Re-Deux

Thu, 04/09/2015 - 14:59

Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources.

All engineering is constrained optimization. How do we take the resources we've been given and deliver the best outcomes? That's what microeconomics is. Unlike models of mechanical engineering or classical physics, the models of microeconomics are never precise. They are probabilistic, driven by the underlying statistical processes of the two primary actors - suppliers and consumers.

Let's look at both in light of the allocation of limited resources paradigm.

  • Supplier = development resources - these are limited in both time and capacity for work, and likely in talent as well, and they produce latent defects, which cost time and money to remove.
  • Consumer = those paying for the development resources have limited time and money. Limited money is obvious: they have a budget. Limited time, since the time value of money is part of the Return on Capital equation used by the business. Committing capital (not real capital, since software development is usually carried on the books as an expense) needs a time when that capital investment will start to return value.

In both cases time, money, and capacity for productive value are limited (scarce) and compete with each other and with the needs of both the supplier and the consumer. In addition, since the elasticity of labor costs is limited by the market, we can't simply buy cheaper to make up for time and capacity. It's done, of course, but always to the detriment of quality and actual productivity.

So cost is inelastic, time is inelastic, capacity for work is inelastic, and the other attributes of the developed product are constrained. The market need is likewise constrained. Business needs are rarely elastic - oh, we really didn't need to pay people from the timekeeping system; let's just collect the time sheets, and we'll run payroll when that feature gets implemented.

Enough Knowing, Let's Have Some Doing

With the principles of Microeconomics applied to software development, there is one KILLER issue that, if willfully ignored, ends the conversation for any business person trying to operate in the presence of limited resources - time, money, and capacity for work.

The decisions being made about these limited resources are being made in the presence of uncertainty. This uncertainty - as mentioned - is based on random processes. Random processes produce imprecise data: data drawn from random variables - random variables with variances, instability, and nonlinear stochastic behavior.

Quick Diversion Into Random Variables

There are many mathematical definitions of random variables, but for this post let's use a simple one.

  • A variable is an attribute of a system or project that can take on multiple values. If the value of this variable is fixed - for example, the number of people on the project - it can be known by counting them and writing that number down. When someone asks, you can count and say 16.
  • When the values of the variable are random, the variable can take on a range of values just like the non-random variable, but we don't know exactly what those values will be when we want to use that variable to answer a question. If the variable is a random variable and someone asks what will be the cost of this project when it is done, you'll have to provide a range of values and the confidence for each of the numbers in that range.

A simple example - silly but illustrative - would be HR wanting to buy special shoes for the development team, with the company logo on them. If we could not for some reason (it doesn't matter why) measure the shoe size of all the males on our project, we could estimate how many shoes of what size would be needed from the statistical distribution of male shoe sizes for a large population of male coders.

[Figure: distribution of male shoe sizes]

This would get us close to how many shoes of what size we need to order. This is a notional example, so please don't place an order for actual shoes. But the underlying probability distribution of the values the random variable can take on can tell us about the people working on the project.

Since all the variables on any project are random variables, we can't know their exact values at any one time. But we can know about their possible ranges and the probabilities of any specific value when asked to produce that value for making a decision.
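Here's a hedged sketch of that shoe example in base R. The mean of 10.5 and standard deviation of 1.2 for male shoe sizes are assumed, notional parameters, not survey data.

    # Notional sketch: size an order of logo shoes for 25 male developers
    # without measuring anyone, using an assumed size distribution.
    set.seed(42)

    mean_size <- 10.5  # assumed mean male shoe size (illustrative)
    sd_size   <- 1.2   # assumed standard deviation (illustrative)
    n_devs    <- 25

    # Draw a simulated team and round to the nearest half size
    sizes <- round(rnorm(n_devs, mean_size, sd_size) * 2) / 2

    # Tabulate how many shoes of each size to order
    print(table(sizes))

Rerun it without set.seed() and the counts change - the order quantity is an estimate drawn from a probability distribution, not a point value.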

The variability of the population values and its analysis should be seen not as a way of making precise predictions about the project outcomes, but as a way of ensuring that all relevant outcomes produced by these variables have been considered, that they have been evaluated appropriately, and that we have a reasonable sense of what will happen for the multitude of values produced by a specific variable. It provides a way of structuring our thinking about the problem.

Making Decisions In The Presence of Random Variables

To make a decision - a choice among several options - means making an opportunity cost decision based on random data. And if there is only one option, then the decision is either take it or don't.

This means the factors that go into that decision are themselves random variables. Labor, productivity, defects, capacity, quality, usability, functionality, produced business capability, time. Each is a random variable, interacting in nonlinear ways with the other random variables.

To make a choice in the presence of this paradigm, we must make estimates not only of the behavior of the variables, but also of the behaviors of the outcomes.

In other words

To develop software in the presence of limited resources driven by uncertain processes for each resource (time, money, capacity, technical outcomes), we must ESTIMATE the behaviors of these variables that inform our decision.

It's that simple and it's that complex. Anyone conjecturing decisions can be made in the absence of estimates of the future outcomes of that decision is willfully ignoring the Microeconomics of business decision making in the software development domain.
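What does estimating those behaviors look like in practice? Here is a Monte Carlo sketch in base R. The three tasks, their min / most-likely / max durations, and the 5,000 trials are all assumptions chosen for illustration, not a prescription.

    # Project duration as the sum of random variables, each described by a
    # triangular distribution (min, most likely, max) in days.
    set.seed(1)

    # Inverse-CDF sampler for a triangular distribution (base R has none built in)
    rtri <- function(n, lo, mode, hi) {
      u <- runif(n)
      f <- (mode - lo) / (hi - lo)
      ifelse(u < f,
             lo + sqrt(u * (hi - lo) * (mode - lo)),
             hi - sqrt((1 - u) * (hi - lo) * (hi - mode)))
    }

    trials <- 5000
    duration <- rtri(trials, 10, 15, 30) +  # task A
                rtri(trials, 20, 25, 60) +  # task B
                rtri(trials,  5,  8, 20)    # task C

    # The answer is a distribution, not a point value
    print(quantile(duration, c(0.50, 0.80, 0.95)))

If the business needs an 80% confidence commitment, the number to commit to is the 80th percentile of that distribution, not the sum of the most-likely values.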

For those interested in further exploring the core principles of the software development business beyond this willful ignorance, here's a starting point.

These are the tip of the big pile of books, papers, and journal articles on estimating software systems.

A Final Thought on Empirical Data

Making choices in the presence of uncertainty can be informed by several means:

  • We have data from the past
  • We have a model of the system that can be simulated
  • We have¬†reference classes from which we can extract similar information

This is empirical data. But there are several critically important questions that must be answered if we are not going to be disappointed with our empirical data outcomes:

  • Is the past representative of the future?
  • Is the sample of data from the past sufficient to make sound forecasts of the future? The number of samples needed greatly influences the confidence intervals on the estimates of the future.

Calculating the number of samples needed for a specific level of confidence requires some statistics. But here's a place to start. Suffice it to say, those conjecturing estimates based on past performance (the number of story points in the past) will need to produce the confidence calculation before any non-trivial decisions should be made on their data. Without those calculations, the use of past performance will be very sporty when spending other people's money.
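Here is one hedged place to start, in base R. The coefficient of variation and the tolerable error are assumed values; substitute the statistics from your own past performance.

    # How many past samples before the sample mean is worth using?
    # Classic sample-size formula for estimating a mean: n = (z * sigma / E)^2,
    # with sigma and E expressed relative to the mean.
    cv    <- 0.45          # assumed coefficient of variation of throughput
    E_rel <- 0.10          # we want the mean known to within +/- 10%
    z     <- qnorm(0.975)  # 95% confidence

    n_needed <- ceiling((z * cv / E_rel)^2)
    cat(sprintf("Samples needed: %d\n", n_needed))

With these assumed numbers that's roughly 78 samples - far more than the handful of sprints usually offered as empirical evidence.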

Thanks to Richard Askew for suggesting the addition of the random variable background.

Categories: Project Management

Capability Maturity Levels and Implications on Software Estimating

Mon, 04/06/2015 - 14:52

An estimate is the most knowledgeable statement you can make at a particular point in time regarding cost/effort, schedule, staffing, risk, and the ...ilities of the product or service. [1]

[Figure: CMMI for Estimates]

 

Immature versus Mature Software Organizations [3]

Setting sensible goals for improving the software development processes requires understanding the difference between immature and mature organizations. In an immature organization, processes are generally improvised by practitioners and their management during the course of the project. Even if a process has been specified, it is not rigorously followed or enforced.

Immature organizations are reactive, with managers focused on solving immediate crises. Schedules and budgets are routinely exceeded because they are not based on realistic estimates. When hard deadlines are imposed, product functionality and quality are often compromised to meet the schedule.

In immature organizations, there is no objective basis for judging product quality or for solving product or process problems. The result is product quality is difficult to predict. Activities intended to enhance quality, such as reviews and testing, are often curtailed or eliminated when projects fall behind schedule.

Mature organizations possess an organization-wide ability to manage development and maintenance processes. The process is accurately communicated to existing staff and new employees, and work activities are carried out according to the planned process. The mandated processes are usable and consistent with the way the work actually gets done. These defined processes are updated when necessary, and improvements are developed through controlled pilot tests and/or cost-benefit analyses. Roles and responsibilities within the defined process are clear throughout the project and across the organization.

Let's look at the landscape of maturity in estimating the work, for those providing the funding for that work.

1. Initial

Projects are small, short, and while important to the customer, not likely critical to the success of the business in terms of cost and schedule. 

  • Informal or no estimating
  • When there are estimates, they are manual, without any processes, and likely considered¬†guesses

The result of this level of maturity is poor forecasting of the cost and schedule of the planned work. And surprises for those paying for the work.

2. Managed

Projects may be small, short, and possibly important. Some form of estimating, either from past experience or from decomposition of the planned work, is used to make linear projections of future cost, schedule, and technical performance.

This past performance is usually not adjusted for the variances of the past - it is just an average. As well, the linear average usually doesn't consider changes in the demand for work, technical differences in the work, and other uncertainties in the future of that work.

This is the Flaw of Averages approach to estimating. As well, the effort needed to decompose the work into same-sized chunks is the basis of all good estimating processes. In the space and defense business, the 44-day rule is used to bound the duration of work. This answers the question: how long are you willing to wait before you find out you're late? For us, the answer is no more than one accounting period. In practice, project status - physical percent complete - is taken every Thursday afternoon.
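A small base R sketch shows why the plain average misleads. The right-skewed lognormal task durations below are assumed, chosen only to illustrate the Flaw of Averages.

    # Plan on the average of a skewed distribution and the tail will bite.
    set.seed(7)
    tasks <- rlnorm(10000, meanlog = log(10), sdlog = 0.6)  # right-skewed durations

    avg <- mean(tasks)
    cat(sprintf("Average duration:       %.1f days\n", avg))
    cat(sprintf("Chance of exceeding it: %.0f%%\n", 100 * mean(tasks > avg)))
    cat(sprintf("80th percentile:        %.1f days\n", quantile(tasks, 0.80)))

Here the average is about 12 days, it is exceeded almost 40% of the time, and an 80% confidence commitment needs closer to 17 days. The average hides the tail that margin and management reserve must cover.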

3. Defined

There is an estimating process, using recorded past performance and statistical adjustments of that past performance. Reference Classes are used to model future performance from the past. Parametric estimates can be used with those reference classes or other estimating processes. Function Points are common in enterprise IT projects, where interfaces to legacy systems, database topology, user interfaces, and transactions are the basis of the business processes.
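As a hedged illustration of a parametric estimate in base R - the coefficients below are in the spirit of COCOMO-style models, but the specific values are assumptions for illustration, not a calibrated model:

    # Parametric estimating sketch: effort as a nonlinear function of size,
    # calibrated in practice against a reference class of past projects.
    #   effort_pm = a * size_ksloc^b
    a <- 2.94  # illustrative productivity coefficient (assumed)
    b <- 1.10  # illustrative scale exponent; > 1 means diseconomy of scale

    size_ksloc <- c(10, 25, 50)     # candidate project sizes
    effort_pm  <- a * size_ksloc^b  # effort in person-months

    print(data.frame(size_ksloc, effort_pm = round(effort_pm, 1)))

Doubling the size more than doubles the effort, which is one more reason a linear projection from past averages understates larger work.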

The notion that we've never done this before, so how can we estimate, begs the question: do you have the right development team? This is a past performance issue. Why hire a team that has no understanding of the problem and then ask them to estimate the cost of the solution? You wouldn't hire a plumber to install a water system if she hadn't done this before in some way.

4. Quantitatively Managed

Measures, metrics, and architecture assessments - Design Structure Matrix is one we use - are used to construct a model of the future. External databases are referenced to compare internal estimates with external experience.

5. Optimized 

There is an estimating organization that supports development, starting with the proposal and continuing through project close-out. As well, there is a risk management organization helping inform the estimates about possible undesirable outcomes in the future.

Resources

[1] Improving Estimate Maturity for More Successful Projects, SEER/Tracer Presentation, March 2010.

[2] Software Engineering Information Repository, Search Results on Estimates

[3] The Capability Maturity Model for Software

Categories: Project Management