
Software Development Blogs: Programming, Software Testing, Agile Project Management

Methods & Tools

Subscribe to Methods & Tools
if you are not afraid to read more than one page to be a smarter software developer, software tester or project manager!

Project Management

Thinking About #NoEstimates?

I have a new article up on agileconnection.com called The Case for #NoEstimates.

The idea is to produce value instead of spending time estimating. We have a vigorous “debate” going on in the comments. I have client work today, so I will be slow to answer comments. I will answer as soon as I have time to compose thoughtful replies!

This column is the follow-on to How Do Your Estimates Provide Value?

If you would like to learn to estimate better or recover from “incorrect” estimates (an oxymoron if I ever heard one), see Predicting the Unpredictable. (All estimates are guesses. If they are ever correct, it’s because we got lucky.)

Categories: Project Management

Want To Learn How To Estimate? Part Deux

Herding Cats - Glen Alleman - Thu, 04/23/2015 - 19:51

If you work in a domain, as I do, that needs to answer when we will provide the needed capability to produce the planned value for the planned amount of money, then estimating is going to be part of answering those questions. If you work where those paying for the work have no interest in knowing the answers, or have confidence you'll not overrun the cost and schedule and will deliver the needed capabilities as planned, then maybe estimates aren't needed. It would be interesting to hear from those actually paying for those outcomes to learn what they need to make decisions in the presence of uncertainty.

Here's some guidance for getting started with estimating software efforts.

And some tools to help out.

Do you see a trend here? There are nearly unlimited resources on how to estimate software development projects, how to manage in the presence of uncertainty, how to elicit requirements, and how to plan and schedule software projects.

So if you hear we're bad at estimates, that's likely the actual experience of the person making that statement, because that person hasn't yet learned how to estimate. When we hear estimates are a waste, it's likely not their money, so to them estimating takes away from some other activity they see as more important. Why would they care if the project overruns its budget, is late, or doesn't produce the needed value? Or my favorite, estimates are the smell of dysfunction, offered with no domain, root cause, or corrective action suggested - because that's actually hard work, and it's much easier to point out bad management than to suggest good management.

Estimating is hard. Anything of value is hard. All the easy problems have been solved. 

But if we are ever to get a handle on the root causes of software project failure modes, we do need to seek out those root causes. And that means much more than just asking the 5 Whys. That's one of many steps in RCA and far from a complete solution for removing the symptoms of our problems. So start there, but never stop there.

Here's an approach we use.

Unanticipated cost growth is a symptom of failing to properly estimate in the first place, failing to update those estimates as the project progresses, and failing to deal with the underlying risks that drive that cost growth. The same goes for lateness and less-than-acceptable delivered capabilities. Once the estimate has been established in some credible form, and adjusted as the project progresses, you of course have to execute to the plan, or change the plan. It's a closed loop system (sketched in R after the list):

  • Target
  • Execute
  • Assess performance
  • Determine error signal
  • Take corrective actions
    • Change target
    • Change execution processes
  • Repeat until complete
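
Here's a minimal sketch of that loop in R - hypothetical numbers, a deliberately crude controller - just to make the steps concrete:

set.seed(42)
plan_rate <- 12                       # planned units of work per period (assumed)
rate      <- plan_rate                # current execution target
remaining <- 120                      # total planned work
for (period in 1:10) {
  actual    <- rnorm(1, mean = rate, sd = 3)  # Execute: throughput varies around plan
  remaining <- max(remaining - actual, 0)
  baseline  <- 120 - period * plan_rate       # Assess performance against the plan
  error     <- remaining - baseline           # Determine the error signal
  if (abs(error) > 5 && period < 10)          # Take corrective action:
    rate <- remaining / (10 - period)         #   change the execution target
  cat(sprintf("period %2d: remaining %5.1f error %+5.1f rate %4.1f\n",
              period, remaining, error, rate))
}

The corrective action - the new rate - is itself an estimate of what future performance must be. That's the point: you can't close the loop without one.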

The answer to the root causes that create the symptoms we so quickly label as smells of dysfunction is NOT to stop doing something, but to learn what the actual cause is. Stopping before this knowledge is acquired leaves the symptom in place. And that doesn't help anyone.

Related articles Who Builds a House without Drawings? Decision Analysis and Software Project Management Herding Cats: Five Estimating Pathologies and Their Corrective Actions Capability Maturity Levels and Implications on Software Estimating Economics of Software Development Qui Bono Want To Learn How To Estimate? Calculating Value from Software Projects - Estimating is a Risk Reduction Process Root Cause Analysis
Categories: Project Management

Build a Risk Adjusted Project Plan in 6 Steps

Herding Cats - Glen Alleman - Thu, 04/23/2015 - 17:35

When we hear about project planning and scheduling, we think about tasks being planned, organized in the proper order, durations assigned to the work, resources committed to perform the work. 

This is all well and good, but without a risk adjusted view of the planned work, it's going to be disappointing at best. There are some key root causes of most project failure. Let's start here.

[Figure: key root causes of project failure]

Each of these has been shown, through research on failed programs, to contribute to cost and schedule impacts: unrealistic expectations of the project's deliverables, technical issues, naive cost and schedule estimating, and less-than-acceptable risk mitigation planning.

Project Management in Six Steps

[Figure: project management in six steps]

Here's how to address the cost and schedule estimating.

Develop a schedule. Whatever your feelings about Gantt charts, sticky notes, or any of the handwaving processes you've learned to use, you need a sequence of the work, the dependencies, and the planned durations. Without something like that, you have no idea what work is needed to complete the project. Here's a straightforward Master Schedule for some flight avionics on a small vehicle. It's all software, and it has to complete as planned, otherwise the users can't do their job as planned. And since they're the ones paying for our work, they have an expectation of us showing up on time, near our budget, with the needed capabilities. Not the minimum, the NEEDED.

[Figure: master schedule for the flight avionics software]

Using a Monte Carlo tool (RiskyProject), here is a run for the probabilities of cost, duration, and completion dates. All project work is probabilistic; any notion that a deterministic plan can be successful is going to result in disappointment.

We usually call our planning sessions done when the risk adjusted Integrated Master Schedule shows a completion date on or before the need date at an 80% confidence level.

[Figure: probability distribution of completion dates]
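
Here's a minimal Monte Carlo sketch of the same idea in plain R - the three-point duration estimates are invented, and a tool like RiskyProject does far more than this:

set.seed(7)
n <- 10000
tri <- function(n, a, m, b) {         # sample a triangular distribution from a
  u <- runif(n)                       # (min, most likely, max) three-point estimate
  f <- (m - a) / (b - a)
  ifelse(u < f, a + sqrt(u * (b - a) * (m - a)),
                b - sqrt((1 - u) * (b - a) * (b - m)))
}
# three sequential work packages, durations in days (assumed values)
total <- tri(n, 8, 10, 15) + tri(n, 15, 20, 35) + tri(n, 4, 5, 9)
quantile(total, c(0.50, 0.80))        # P50 and P80 completion durations
mean(total <= 45)                     # confidence of finishing within 45 days

The 80th percentile of the simulated totals is the risk adjusted commitment; note that it sits well above the deterministic sum of the most-likely durations (35 days).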

With a resource loaded schedule - or some external time phased cost model - we can now show the probability of completing on or before the need date, and at or below the planned budget. The chart below informs everyone what the chances are for success on the cost and schedule aspects of the project. Technically it has to work; the customer gets to say that. The Fit for Use and Fit for Purpose measures apply if we're in an IT Governance paradigm; the Measures of Effectiveness and Measures of Performance if we're in other paradigms. Those measures can be modeled as well, but I'm just focusing on cost and schedule here.

[Figure: probability of completing on or before the need date, at or below the planned budget]

With this information we can produce the needed margin and management reserve to protect the delivery date and budget. Showing up late and over budget for a product that works is usually not good business. 

Do You Need all This?

What's the Value At Risk?

  • A $10,000 warehouse database app update - certainly not. Just do it.
  • A website for gaming? Probably not. Just start and let the market tell you where to go. Try to forecast periodically what features will be built and how much they'll cost, so those paying get a sense of the value returned for their investment.
  • Product development, with a planned launch date, planned sell price, planned cost (so we can make money), and planned features? You probably need some visibility into what's going to be happening in the future.
  • Enterprise IT, say an ERP system worth tens of millions? Better have a plan.

Don't know the value at risk? Don't have a clear vision of what done looks like in units of measure meaningful to the decision makers? You've got bigger problems. This approach won't help.

Related articles Calculating Value from Software Projects - Estimating is a Risk Reduction Process Incremental Delivery of Features May Not Be Desirable Decision Analysis and Software Project Management How to Avoid the "Yesterday's Weather" Estimating Problem Hope is not a Strategy Critical Success Factors of IT Forecasting
Categories: Project Management

The Reason We Plan, Schedule, Measure, and Correct

Herding Cats - Glen Alleman - Wed, 04/22/2015 - 15:56

The notion that planning is of little value seems to be lost on those not accountable for showing up on or before the need date, at or below the needed cost, and with the planned capabilities needed to fulfill the business case or successfully accomplish the mission.

Yogi says it best in our project management domain. And it bears repeating when someone says let's get started and we'll let the requirements emerge. Or my favorite: let's get started and we'll use our performance numbers to forecast future performance, we don't need no stink'in estimates.

Yogi says ...If you don't know where you are going, you'll end up someplace else.

A similar exchange appears in Alice's Adventures in Wonderland:

"Would you tell me, please, which way I ought to go from here?"
"That depends a good deal on where you want to get to," said the Cat.
"I don't much care where--" said Alice.
"Then it doesn't matter which way you go," said the Cat.
"--so long as I get SOMEWHERE," Alice added as an explanation.
"Oh, you're sure to do that," said the Cat, "if you only walk long enough."
(Alice's Adventures in Wonderland, Chapter 6)

This is often misquoted as If you don't know where you're going, any road will get you there - which is in fact technically not possible, and is not from Alice.

So What To Do?

We need a plan to deliver the value that is being exchanged for the cost of producing that value. We can't determine the resulting value - the benefit - until we first know the cost to produce that value, and then we have to know when that value will be arriving.

  • Arriving late and over budget diminishes the value while raising the cost, since arriving late means we've had to pay more in labor - people continue working to produce the value. That extra cost diminishes the return.
  • Arriving with less than the needed capabilities diminishes the value for the same cost.

Both of these conditions are basic Managerial Finance 101 concepts, based on Return on Investment.

ROI = (Value - Cost) / Cost
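
With made-up numbers, the late-delivery effect on this formula looks like:

roi <- function(value, cost) (value - cost) / cost
roi(value = 500000, cost = 250000)   # on time: ROI = 1.00
roi(value = 500000, cost = 350000)   # late, extra labor: ROI ~ 0.43

Same delivered value, higher cost, diminished return - exactly the two bullets above.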

The first thing some will say is but value can't be monetized. Ask the CFO of your firm what she thinks about monetizing the outcomes of your work on the balance sheet. Better yet, don't embarrass yourself; read Essentials of Managerial Finance, Brigham and Weston. Mine is the 11th edition; it looks like it's up to the 14th.

As well, once it is established that both cost and value are random variables describing measurements in the future, you'll need to estimate them to some degree of confidence if you're going to make decisions. These decisions are actual opportunity cost decisions about future outcomes. This is the basis of the microeconomics of software development.

So when you hear we can make decisions about outcomes in the future in the presence of uncertainty - the basis of project work - without estimating: don't believe a word of it. Instead read Weston, and you too will have the foundational skills to know better.

Because the closed loop management processes we need on project and product development require that we make decisions in the presence of uncertainty. There is simply no way to do that without estimating all the variances when we Plan, Do, Check, Act. Each is a random variable, with random outcomes. Each requires some assessment of what will happen if I do this. And that notion of let's just try it reminds me of my favorite picture of open loop, no estimates, no measurement, no corrective action management:

[Image: Sailing Over The Edge]

Categories: Project Management

Software Development Linkopedia April 2015

From the Editor of Methods & Tools - Wed, 04/22/2015 - 15:20
Here is our monthly selection of knowledge on programming, software testing and project management. This month you will find some interesting information and opinions about honest ignorance, canonical data models, global teams, recruiting developers, self-organization, requirements management, SOLID programming, developer training, retrospectives and Test-Driven Development (TDD). Blog: The Power of Not Knowing Blog: Why You Should Avoid a Canonical Data Model Blog: Managing global teams - Lessons learned Blog: The Sprint Burndown is dead, long live Confidence Smileys Blog: How to write a job post for development positions Blog: Why Self-Organizing is So Hard Article: Order Your ...

How to Avoid the "Yesterday's Weather" Estimating Problem

Herding Cats - Glen Alleman - Tue, 04/21/2015 - 15:49

One suggestion from the #NoEstimates community is the use of empirical data of past performance. This is many times called yesterday's weather. First let's make sure we're not using just the averages from yesterday's weather. Even adding the variance to that small sample of past performance can lead to very naive outcomes.

We need to do some actual statistics on that time series. A simple R set of commands will produce the chart below from the time series of past performance data.

[Figure: forecast of the time series with confidence bands]
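
Here's a hedged sketch of that kind of analysis, assuming the forecast package and an invented weekly throughput series; auto.arima() picks the model orders rather than hand-tuning them:

library(forecast)                 # provides auto.arima() and forecast()
throughput <- ts(c(15, 22, 14, 18, 21, 12, 19, 24, 16, 20, 17, 20))
fit <- auto.arima(throughput)     # fit an ARIMA model to the past performance
plot(forecast(fit, h = 4))        # forecast 4 periods ahead with 80%/95% bands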

But that doesn't really help without some more work.

  • Is the future really like the past? Are the work products and the actual work performed in the past replicated in the future? If so, this sounds like a simple project: just turn out features that all look alike.
  • Are there any interdependencies that grow in complexity as the project moves forward? This is the integration and test problem, then the system-of-systems integration and test problem. Again, simple projects don't usually have this problem. More complex projects do.
  • What about those pesky emerging requirements? This is a favorite idea of agile (and correctly so), but simple past performance is not going to forecast the needed performance in the presence of emerging requirements.
  • Then there are all the externalities of project work. Where are those captured in the sample of past performance?
  • All big projects have little projects inside them is a common phrase. Except that collection of little projects needs to be integrated, tuned, tested, verified, and validated so that when all the parts are assembled they actually do what the customer wants.

Getting Out of the Yesterday's Weather Dilemma

Let's use the chart below to speak about some sources of estimating NOT based on simple small samples of yesterday's weather. This is a Master Plan for a non-trivial project to integrate a half dozen or so legacy enterprise systems with a new health insurance ERP system for an integrated payer/provider solution:

[Figure: capabilities flow across the Master Plan]

  • Reference Class Forecasting for each class of work product.
    • As the project moves left to right in time, the classes of product and the related work likely change.
    • Reference classes for each of these movements through increasing maturity, and increasing complexity from integration interactions, need to be used to estimate not only the current work but the next round of work.
    • In the chart above, work on the left is planned with some level of confidence, because it's work in hand. Work on the right is in the future, so a coarser estimate is all that is needed for the moment.
    • This is the planning package notion used in space and defense. Only plan in detail what you understand in detail.
  • Interdependencies Modeling in Monte Carlo Simulation (MCS).
    • On any non-trivial project there are interdependencies.
    • The notion of INVEST needs to be tested:
      • Independent - not usually the case on enterprise projects.
      • Negotiable - usually not, since the ERP system provides the core capability to do business. It would be illogical to have half the procurement system: we can issue purchase orders and receive goods, but we can't pay for them until we get the Accounts Payable system. We need both at the same time. Not negotiable.
      • Valuable - yep, why are we doing this if it's not valuable to the business? This is a strawman used by low business maturity projects.
      • Estimable - to a good approximation is what the advice tells us. The term good needs a unit of measure.
      • Small - is a domain dependent measure. Small to an enterprise IT project may be huge to a solo game developer.
      • Testable - yep, and verifiable, and validatable, and secure, and robust, and fault tolerant, and meets all performance requirements.
  • Margin - protects dates, cost, and technical performance from irreducible uncertainty. Irreducible means nothing can be done about the uncertainty; it's not the lack of knowledge found in reducible, epistemic uncertainty. Irreducible uncertainty is aleatory - the natural randomness in the underlying processes. When we are estimating in the presence of aleatory uncertainty, we must account for it. This is why using the average of a time series for making a decision about possible future outcomes will always lead to disappointment.
    • First, we should always use the Most Likely value of the time series, not the Average of the time series.
    • The Most Likely value - the Mode - is the number that occurs most often among all the values that have occurred in the past. This should make complete sense when we consider what value will appear next: why, the value that has appeared most often in the past.
    • The average of the two numbers 1 and 99 is 50. The average of the two numbers 49 and 51 is also 50. Be careful with averages in the absence of knowing the variance (see the short R sketch after this list).
  • Risk retirement - epistemic uncertainty creates risks that can be retired. This means spending money and time. So when we're looking at past performance in an attempt to estimate future performance (yesterday's weather), we must determine what kind of uncertainties there are in the future and what kind of uncertainties we encountered in the past.
    • Were they, and are they, reducible or irreducible?
    • Did the performance in the past contain irreducible uncertainties baked into the numbers that we did not recognize?
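
Here's a short R sketch of the mean-versus-mode point, using an invented sample built to average about 18 per week:

x <- c(12, 14, 14, 14, 15, 16, 18, 22, 22, 31, 14, 26)   # stickies per week (made up)
mode_of <- function(v) as.numeric(names(which.max(table(v))))
mean(x)       # 18.17 - the average, pulled up by the 26 and 31
mode_of(x)    # 14    - the value that has occurred most often
var(x)        # the variance the average silently hides

The average says to plan on 18 per week; the mode says the most likely week delivers 14. A plan built on the average alone will be optimistic most weeks.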

This brings up a critical issue with all estimates. Did the numbers produced from past performance meet the expected values, or were they just the numbers we observed? Taking the observed numbers and using them to forecast the future is an Open Loop control system. What SHOULD the numbers have been to meet our goals? What SHOULD the goal have been? If we didn't know that, then there is no baseline to compare past performance against to see if it will be able to meet the future goal.

I'll say this again - THIS IS OPEN LOOP control, NOT CLOSED LOOP. No amount of dancing around will get over this; it's a simple control systems principle, found here: Open and Closed Loop Project Controls.

  • Measures of physical percent complete to forecast future performance with cost, schedule, and technical performance measures - once we have the notion of Closed Loop Control, have constructed a steering target, and can capture actuals against plan, we need to define measures that are meaningful to the decision makers. Agile does a good job of forcing working product to appear often. The assessment of Physical Percent Complete, though, needs to define what that working software is supposed to do in support of the business plan.
  • Measures of Effectiveness - one very good measure is Effectiveness. Does the software provide an effective solution to the problem? This begs the question or questions: what is the problem, and what would an effective solution look like were it to show up?
    • MOEs are operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
  • Measures of Performance - the companion of Measures of Effectiveness.
    • MOPs characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
  • Along with these two measures are Technical Performance Measures.
    • TPMs are attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal.
  • And finally there are Key Performance Parameters.
    • KPPs represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.

The connections between these measures are shown below.

[Figure: connections between MOEs, MOPs, TPMs, and KPPs]

With these measures, and tools for making estimates of the future - forecasts - using statistical methods, we can use yesterday's weather, tomorrow's models and related reference classes, and the desired MOEs, MOPs, KPPs, and TPMs to construct a credible estimate of what needs to happen, then measure what is happening, close the loop with an error signal, and take corrective action to stay on track toward our goal.

This all sounds simple in principle, but in practice of course it's not. It's hard work. But when you assess the value at risk to be outside the tolerance range the customer is willing to risk of their investment, we need tools and processes to actually control the project.

Related articles Hope is not a Strategy Incremental Delivery of Features May Not Be Desirable
Categories: Project Management

Doing the Math

Herding Cats - Glen Alleman - Tue, 04/21/2015 - 15:09

In the business of building software intensive systems - estimating, performance forecasting, and management - closed loop control in the presence of uncertainty for all variables is the foundation needed for increasing the probability of success.

This means math is involved in planning, estimating, measuring, analysis, and corrective actions to Keep the Program Green.

When we have past performance data, here's one approach...

And the details of the math are in the conference paper.

  Related articles Hope is not a Strategy How to Avoid the "Yesterday's Weather" Estimating Problem Critical Success Factors of IT Forecasting
Categories: Project Management

Thinking About Estimation

I have an article up on agileconnection.com. It’s called How Do Your Estimates Provide Value?

I’ve said before that We Need Planning; Do We Need Estimation? Sometimes we need estimates. Sometimes we don’t. That’s why I wrote Predicting the Unpredictable: Pragmatic Approaches for Estimating Cost or Schedule.

I’m not judging your estimates. I want you to consider how you use estimates.

BTW, if you have an article you would like to write for agileconnection.com, email it to me. I would love to provide you a place for your agile writing.

Categories: Project Management

ScrumMaster – Full Time or Not?

Mike Cohn's Blog - Tue, 04/21/2015 - 15:00

The following was originally published in Mike Cohn's monthly newsletter. If you like what you're reading, sign up to have this content delivered to your inbox weeks before it's posted on the blog, here.

I’ve been in some debates recently about whether the ScrumMaster should be full time. Many of the debates have been frustrating because they devolved into whether a team was better off with a full-time ScrumMaster or not.

I’ll be very clear on the issue: Of course, absolutely, positively, no doubt about it a team is better off with a full-time ScrumMaster.

But, a team is also better off with a full-time, 100 percent dedicated barista. Yes, that’s right: Your team would be more productive, quality would be higher, and you’d have more satisfied customers, if you had a full-time barista on your team.

What would a full-time barista do? Most of the time, the barista would probably just sit there waiting for someone to need coffee. But whenever someone was thirsty or under-caffeinated, the barista could spring into action.

The barista could probably track metrics to predict what time of day team members were most likely to want drinks, and have their drinks prepared for them in advance.

Is all this economically justified? I doubt it. But I am 100 percent sure a team would be more productive if they didn’t have to pour their own coffee. Is a team more productive when it has a full-time ScrumMaster? Absolutely. Is it always economically justified? No.

What I found baffling while debating this issue was that teams who could not justify a full-time ScrumMaster were not really being left a viable Scrum option. Those taking the “100 percent or nothing” approach were saying that if you don’t have a dedicated ScrumMaster, don’t do Scrum. That’s wrong.

A dedicated ScrumMaster is great, but it is not economically justifiable in all cases. When it’s not, that should not rule out the use of Scrum.

And a note: I am not saying that one of the duties of the ScrumMaster is to fetch coffee for the team. It’s just an exaggeration of a role that would make any team more productive.

Approximating for Improved Understanding

Herding Cats - Glen Alleman - Mon, 04/20/2015 - 22:05

The world of projects, project management, and the products or services produced by those projects is uncertain. It's never certain. Seeking certainty is not only naive, it's simply not possible.

Making decisions in the presence of this uncertainty is part of our job as project managers, engineers, and developers, on behalf of those paying for our work.

It's also the job of the business, whose money is being spent on the projects to produce tangible value in exchange for that money.

From the introduction of the book (The Art of Insight in Science and Engineering, Sanjoy Mahajan)...

Science and engineering, our modern ways of understanding and altering the world, are said to be about accuracy and precision. Yet we best master the complexity of our world by cultivating insight rather than precision. We need insight because our minds are but a small part of the world. An insight unifies fragments of knowledge into a compact picture that fits in our minds. But precision can overflow our mental registers, washing away the understanding brought by insight. This book shows you how to build insight and understanding first, so that you do not drown in complexity.

So what does this mean for our project world?

  • The future is uncertain. It is always uncertain; it can't be anything but uncertain. Assuming certainty is a waste of time. Managing in the presence of uncertainty is unavoidable. To do this we must estimate. This is unavoidable. To suggest otherwise willfully ignores the basis of all management practices.
  • This uncertainty creates risk to our project: cost risk, schedule risk, and risk to the delivered capabilities of the project or product development. To manage with a closed loop process, estimates are needed. This is unavoidable as well.
  • Uncertainty is either reducible or irreducible.
    • Reducible uncertainty can be reduced with new information. We can buy down this uncertainty.
    • Irreducible uncertainty - the natural variation in what we do - can only be handled with margin (a small sketch of sizing such margin follows this list).
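
Here's a minimal sketch of sizing that margin, assuming a hypothetical history of durations for similar tasks:

past   <- c(9, 11, 10, 14, 10, 12, 19, 10, 11, 13)   # days per similar task (assumed)
p80    <- quantile(past, 0.80)   # commitment level covering 80% of past outcomes
margin <- p80 - median(past)     # schedule margin protecting the median plan
p80; margin

The variation in past never goes away - that's what irreducible means - so the protection comes from the quantile gap, not from better information.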

In both these conditions we need to get organized in order to address the underlying uncertainties. We need to put structure in place in some manner. Decomposing the work is a common way in the project domain: from a Work Breakdown Structure to simple sticky notes on the wall, breaking problems down into smaller parts is a known, successful way to address a problem.

With this decomposition, now comes the hard part. Making decisions in the presence of this uncertainty.

Probabilistic Reasoning

Reasoning about things that are uncertain is done with probability and statistics. Probability is a degree of belief. 

I believe we have an 80% probability of completing on or before the due date for the migration from SQL Server 2008 to SQL Server 2012.

Why do we have this belief? Is it based on our knowledge from past experience? Is this knowledge sufficient to establish that 80% confidence?

  • Do we have some external model of the work effort needed to perform the task?
  • Is there a parametric model of similar work that can be applied to this work?
  • Could we decompose the work into smaller chunks that could then be used to model the larger set of tasks?
  • Could I conduct some experiments to improve my knowledge?
  • Could I build a model from intuition that could be used to test the limits of my confidence?

The answers to each of these inform our belief.

Chaos, Complexity, Complex, Structured?

A well-known agile thought leader made a statement today:

I support total chaos in every domain

This is unlikely to result in sound business decisions in the presence of uncertainty. There may be domains where chaos produces usable results. But when we need some degree of confidence that the money being spent will produce the needed capabilities, on or before the need date, at or below the budget needed to be profitable, and with the collection of all the capabilities needed to accomplish the mission or meet the business case, we're going to need to know how to manage our work to achieve those outcomes.

So let's assume - with a high degree of confidence - that we need to manage in the presence of uncertainty, but that we have little interest in encouraging chaos. Here's one approach.

So In The End

Since all the world's a set of statistical processes, producing probabilistic outcomes, which in turn create risk to any expected results when not addressed properly, the notion that decisions can be made in this condition without estimates can only be explained by willful ignorance of the basic facts of the physics of project work.

  Related articles The Difference Between Accuracy and Precision Making Decisions in the Presence of Uncertainty Managing in Presence of Uncertainty Herding Cats: Risk Management is How Adults Manage Projects Herding Cats: Decision Analysis and Software Project Management Five Estimating Pathologies and Their Corrective Actions
Categories: Project Management

Estimates

Herding Cats - Glen Alleman - Mon, 04/20/2015 - 22:00

Estimation and measurement of project attributes are critical success factors for designing, building, modifying, and operating products and services.†

Good estimates are the key to project success. Estimates provide information to the decision makers to assess adherence to performance specifications and plans, make decisions, revise designs and plans, and improve future estimates and processes.

We use estimates and measurements to evaluate the feasibility and affordability of products being built, choose between alternative designs, assess risk, and support business decisions. Engineers compare estimates of technical baselines to observed performance to decide if the product meets its functional and performance requirements. These are used by management to control processes and detect compliance problems. Process managers use capability baselines to improve production processes.

Developers, engineers, and planners estimate the resources needed to develop, maintain, enhance, and deploy products. Project planners use estimates for staffing and facilities. Planners and managers use resource estimates to determine project cost and schedule and to prepare budgets and plans.

Managers compare estimates - cost and schedule baselines - to actual values to determine deviations from plan and understand the root causes of those deviations needed to take corrective actions. Estimates of product, project, and process characteristics provide baselines to assess progress during the project.

Bad estimates affect all participants in the project or product development process. Incomplete and inaccurate estimates mean inadequate time and money are available for increasing the probability of project success.

The Nature of Estimation

The verb estimate means to produce a statement of the approximate value of some quantity that describes or characterizes an object. The noun estimate refers to the value produced by the verb. The object can be an artifact - software, hardware, documents - or an activity - planning, development, testing, or process.

We make estimates because we cannot directly measure the value of that quantity because:

  • The object is inaccessible
  • The object does not exist yet
  • The measurement would be too expensive

Reasons to Estimate and Measure Size, Cost and Schedule

  • Evaluate feasibility of requirements.
  • Analyze alternative designs and implementations.
  • Determine required capacity and speed of produced results.
  • Evaluate performance - accuracy, speed, reliability, availability, and other ...ilities.
  • Identify and assess technical risks.
  • Provide technical baselines for tracking and guiding.

Reasons to Estimate Effort, Cost, and Schedule

  • Determine project feasibility in terms of cost and schedule.
  • Identify and assess risks.
  • Negotiate achievable commitments.
  • Prepare realistic plans and budgets.
  • Evaluate business value - cost versus benefit.
  • Provide cost and schedule baselines for tracking and guiding.

Reasons to Estimate Capability and Performance

  • Predict resource consumption and efficiency.
  • Establish norms for expected performance.
  • Identify opportunities for improvement.

There are many sources of data for making estimates, some reliable, some not. Human subject-matter-expert based estimates have been shown to be the least reliable, accurate, and precise, due to the biases involved in the human processes of developing the estimate. Estimates based on past performance, while useful, must be adjusted for the statistical behaviors of the past and the uncertainty of the future.

If the estimate is misused in any way, this is not the fault of the estimate - both noun and verb - but simply bad management. Fix that first, then apply proper estimating processes.

If your project or product development effort does none of these activities or has no need for information on which to make a decision, then estimating is likely a waste of time.

But before deciding estimates are the smell of dysfunction, with NO root cause identified for corrective action, check with those paying your salary first, to see what they have to say about your desire to spend their money in the presence of uncertainty without an estimate.

† This post is extracted from Estimating Software-Intensive Systems: Projects, Products and Processes, Dr. Richard Stutzke, Addison Wesley. This book is a mandatory read for anyone working in a software domain on any mission critical project. If you need to show up on or before the need date, at or below your planned cost, with the needed capabilities - the Key Performance Parameters, without which the project will get cancelled - then you're going to need to estimate all the parameters of your project. If your project doesn't need to show up on time, stay on budget, or can provide less than the needed capabilities, there's no need to estimate. Just spend your customer's money; she'll tell you when to stop.

Related articles Capability Maturity Levels and Implications on Software Estimating Incremental Delivery of Features May Not Be Desirable Capabilities Based Planning First Then Requirements
Categories: Project Management

Root Cause Analysis

Herding Cats - Glen Alleman - Mon, 04/20/2015 - 16:07

Root Cause Analysis is a means to answer why we keep seeing the same problems over and over again. When we treat only the symptoms, the root cause remains.

In Lean there is a supporting process, 5S. 5S is a workplace organization method that uses a list of five Japanese words: seiri, seiton, seiso, seiketsu, and shitsuke. The list describes how to organize a workplace for efficiency and effectiveness by identifying and storing the items used, maintaining the area and items, and sustaining the new order. The decision-making process usually comes from a dialogue about standardization, which builds understanding among the employees of how they should do their work.

At one client we are installing Microsoft Team Foundation Server for development, release management, and test management. The current process relies on the heroics of many on the team every Thursday night to get the release out the door.

We started the improvement of the development, test, and release process with Root Cause Analysis. In this domain cyber and data security are paramount, so when there is a cyber or data security issue, RCA is the core process for addressing it.

The results of the RCA have shown that the workplace is chaotic at times, code is poorly managed, testing struggles at deadline, and the configuration of the release base is inconsistent. It was clear we were missing tools, but the human factors were also a source of the problem - the symptoms of latent defects and a break/fix paradigm.

There are many ways to ask and answer the 5 Whys and apply the 5S method, but until that is done, the actual causes determined, and the workplace cleaned up, the symptoms will continue to manifest in undesirable ways.

If we're going to start down the path of the 5 Whys and NOT actually determine the root cause and develop a corrective action plan, then that is in itself a waste.

Related articles Five Estimating Pathologies and Their Corrective Actions Economics of Software Development
Categories: Project Management

Economics of Software Development

Herding Cats - Glen Alleman - Sun, 04/19/2015 - 16:21

Economics is called the Dismal Science. Economics is the branch of knowledge concerned with the production, consumption, and transfer of wealth. Economics is generally about the behaviors of humans and markets that arise, given the scarcity of means, to achieve certain ends.

How does economics apply to software development? We're not a market, and we don't create wealth, at least not directly; we create products and services that may create wealth. Microeconomics is a branch of economics that studies the behavior of individuals and their decision making on the allocation of limited resources. It's the scarcity of resources that is the basis of microeconomics. Software development certainly operates in the presence of scarce resources, so microeconomics is closer to what we need to make decisions in the presence of uncertainty. The general economics processes are of little interest, so starting with Big Picture Econ books is not much use.

Software economics is a subset of Engineering Economics. A key aspect of all microeconomics applied to engineering problems is the application of Statistical Decision Theory - making decisions in the presence of uncertainty. Uncertainty comes in two types:

  • Aleatory uncertainty - the naturally occurring variances in the underlying processes.
  • Epistemic uncertainty - the lack of information about a probabilistic event in the future.

Aleatory uncertainty can be addressed by adding margin to our work - time and money. Epistemic uncertainty, and the missing information behind it, has economic value to our decision-making processes. That is, there is economic value in decision problems in the presence of uncertainty.
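
Here's a minimal sketch of that economic value - the classic Expected Value of Perfect Information calculation - with assumed probabilities and payoffs for a two-option build decision:

p_high <- 0.6                          # epistemic: we don't know future demand
payoff <- rbind(build_big   = c(high = 500, low = -200),  # payoffs in $K (assumed)
                build_small = c(high = 150, low =  100))
probs  <- c(high = p_high, low = 1 - p_high)
ev_each   <- payoff %*% probs          # expected value of each option
best_now  <- max(ev_each)              # best decision with imperfect information
best_info <- sum(apply(payoff, 2, max) * probs)  # deciding after learning demand
best_info - best_now                   # EVPI: the most that buying down the
                                       # uncertainty can be worth

With these numbers the expected values are 220 and 130, so we'd build big; deciding with perfect information is worth 340, so spending up to 120 on a prototype or market test to remove the uncertainty is economically rational.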

This missing information can be bought down with simple solutions - prototypes, for example, or short deliverables to test an idea or confirm an approach. Both are the basis of Agile, and have been discussed in depth in Software Engineering Economics, Barry Boehm, Prentice Hall, 1981.

Engineering economics is the application of economic techniques to the evaluation of design and engineering alternatives. Engineering economics assesses the appropriateness of a given project, estimates of its value, and justification of the project (or product) from an engineering standpoint.

This involves the time value of money and cash-flow concepts - compound and continuous interest. It continues with the economic practices and techniques used to evaluate and optimize decisions on the selection of strategies for project success.

When I hear I read that book and it's about counting lines of code, the reader has failed to comprehend the difference between principles and practices. The sections on Statistical Decision Theory are about the Expected Value of Perfect Information and how to make decisions with imperfect information.

Statistical Decision Theory is about making choices: identifying the values, uncertainties, and other issues relevant to a given decision, its rationality, and the resulting optimal decision. In Statistical Decision Theory, the underlying statistical processes and the resulting probabilistic outcomes require us to estimate in the presence of uncertainty.

Writing software for money, other people's money, requires us to estimate how much money, when we'll be done spending that money and what will result from that spend.

This is the foundation of the Microeconomics of Software Development

If there is no scarcity of resources - time, cost, technical performance - then estimating is not necessary. Just start the work, spend the money, and you'll be done when you're done. If, however, those resources are scarce, estimating how to allocate them is unavoidable.

Related articles Five Estimating Pathologies and Their Corrective Actions Critical Success Factors of IT Forecasting
Categories: Project Management

No.

NOOP.NL - Jurgen Appelo - Fri, 04/17/2015 - 23:01
Say No.

Last week, I had a nice Skype call with a reader who was seeking my advice on becoming an author and speaker, and I gave him some pointers. I normally don’t schedule calls with random people asking for a favor, but this time I made an exception. I had a good reason.

The post No. appeared first on NOOP.NL.

Categories: Project Management

For the Upcoming Graduates

Herding Cats - Glen Alleman - Fri, 04/17/2015 - 05:36

No matter your life experience, your view of the world, whether you've had military experience or not, this is - or should be - an inspirational commencement speech. 

 

Categories: Project Management

Debunking

Herding Cats - Glen Alleman - Wed, 04/15/2015 - 04:21

This blog has been focused on improving program and project management processes for many years. Over that time I've run into several bunk ideas around projects, development, methods, and the process of managing other people's money. When that happens, the result is a post or two about the nonsense idea and the corrections to it, not just from my experience but from the governance frameworks that guide our work.

A post on Tony DaSilva's blog about the Debunkers Club struck a chord. I've edited that blog's content to fit my domain, with full attribution.

This Blog is dedicated to the proposition that all information is not created equal. Much of it is endowed by its creators with certain undeniable wrongs. Misinformation is dangerous!!

There's a lot of crap floating around any business or technical field. Much of it gets passed around by well-meaning folks, but it is harmful regardless of the purity of the conveyor.

People who attempt to debunk myths, mistakes, and misinformation are often tireless in their efforts. They are also too often helpless against the avalanche of misinformation.

The Debunker Club is an experiment in professional responsibility. Anyone who's interested may join as long as they agree to the following:

  1. I would like to see less misinformation in the project management field. This includes planning, estimating, risk, execution, performance management, and development methods.
  2. I will invest some of my time in learning and seeking the truth, from sources like peer-reviewed scientific research or translations of that research.
  3. I will politely, but actively, provide feedback to those who transmit misinformation.
  4. I will be open to counter-feedback, listening to understand opposing viewpoints based on facts, examples, and evidence outside personal opinion. I will provide counter-evidence and argument when warranted.
Related articles Debunker Club Works to Dispel the Corrupted Cone of Learning Five Estimating Pathologies and Their Corrective Actions Critical Success Factors of IT Forecasting Calculating Value from Software Projects - Estimating is a Risk Reduction Process
Categories: Project Management

Give the Power to the Programmers

From the Editor of Methods & Tools - Tue, 04/14/2015 - 14:22
Delegation of power to programmers is a smart idea. It is provably and measurably smarter than leaving the power with managers to design the developers' work environment, and with IT architects to design the technology that we should program. Devolving the power to create a better working environment, and to design the technology for our stakeholders, is better because developers are closer to the action, are more informed in practical detail, and can rapidly and frequently test and measure that their ideas really work. Video producer: http://33degree.org/

Qui Bono

Herding Cats - Glen Alleman - Tue, 04/14/2015 - 00:57

When we hear a suggestion about a process that inverts the normal process based on a governance framework - say, the microeconomics of software development - we need to ask: who benefits? How would that suggestion be tangibly beneficial to the recipient that is now inverted?

Estimates, for example, are for the business. Why would the business no longer want an estimate of the cost, schedule, or technical performance of the provided capabilities?

In the world of spending money to produce value, the one that benefits should be, must be, the one paying for that value, and therefore has a compelling interest in the information needed to make decisions about how the money is spent.

When that relationship between paying and benefit is inverted, then the path to Qui Bono is inverted as well.

In the end, follow the money must be the basis of assessing the applicability of any suggestion. If it is suggested that decision making can be done in the absence of estimating the impacts of those decisions, ask who benefits. If it's not those paying for the value, then Qui Bono no longer applies.

Categories: Project Management

Best Creativity Books

NOOP.NL - Jurgen Appelo - Mon, 04/13/2015 - 20:34

After my lists of mindfulness books and happiness books, here you can find the 20 Best Creativity Books in the World.

This list is created from the books on GoodReads tagged with “creativity”, sorted using an algorithm that favors number of reviews, average rating, and recent availability.

The post Best Creativity Books appeared first on NOOP.NL.

Categories: Project Management

The Flaw of Averages and Not Estimating

Herding Cats - Glen Alleman - Mon, 04/13/2015 - 16:04

There is a popular notion in the #NoEstimates paradigm that empirical data is the basis of forecasting the future performance of a development project. In principle this is true, but the concept is not complete in the way it is used. Let's start with the data source used for this conjecture.

There are 12 samples in the example used by #NoEstimates - in this case, stickies per week. From this time series an average is calculated for the future. This is the empirical data used to estimate in the No Estimates paradigm. The average is 18.1667, or just 18 stickies per week.

[Figure: the 12 weekly sticky-count samples]

But we all have read, or should have read, Sam Savage's The Flaw of Averages. This is a very nice populist book - by populist I mean an easily accessible text with little or no mathematics in it - although Savage's own work is highly mathematical, with his tool set.

There is a simple set of tools that can be applied to time series analysis, using past performance to forecast the future performance of the system that created the previous time series. The tool is R, and it is free for all platforms.

Here's the R code for performing a statistically sound forecast to estimate the range of values the past empirical stickies can take on in the future.

Put the time series in a plain text file named Book1.txt, one value per line, then in R:

> library(forecast) - load the package that provides forecast()
> Book1=read.table("Book1.txt")[[1]] - read the raw samples
> SPTS=ts(Book1) - apply the Time Series function in R to convert this data to a time series
> SPFIT=arima(SPTS) - apply the simple (default order) ARIMA function to the time series
> SPFCST=forecast(SPFIT, level=c(80,90)) - build a forecast with 80% and 90% bands from the ARIMA outcome
> plot(SPFCST) - plot the results

Here's that plot, showing the 80% and 90% confidence bands for the possible future outcomes from the past performance - empirical data from the past.

The 80% range is 27 to 10 and the 90% range is 30 to 5.

[Figure: ARIMA forecast with 80% and 90% confidence bands]
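
The arithmetic behind the range quoted below, relative to the 18-per-week average:

(30 - 18) / 18   # upper 90% bound: about +0.67
( 5 - 18) / 18   # lower 90% bound: about -0.72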

So the killer question.

Would you bet your future on a probability of success with a +65% to -72% range on the cost, schedule, or technical performance of the outcomes?

I hope not. This is a flawed example, I know: too small a sample, no adjustment of the ARIMA factors, just a quick raw assessment of the data used in some quarters as a replacement for actually estimating future performance. But this assessment shows how empirical data COULD support making decisions about future outcomes in the presence of uncertainty using past time series, once the naive assumptions of small sample size and wide variances are corrected.

The End

If you hear you can make decisions without estimating, that's pretty much a violation of all established principles of microeconomics and statistical forecasting. When the answer comes back we used empirical data, take your time series of empirical data, download R, install the needed packages, put the data in a file, apply the functions above, and see if you really want to commit to spending other people's money with a confidence range of +65% to -72% of performing like you did in the past. I sure hope not!!

Related articles Flaw of Averages Estimating Probabilistic Outcomes? Of Course We Can! Critical Success Factors of IT Forecasting Herding Cats: Empirical Data Used to Estimate Future Performance Some More Background on Probability, Needed for Estimating Forecast, Automatic Routines vs. Experience Five Estimating Pathologies and Their Corrective Actions
Categories: Project Management