
Software Development Blogs: Programming, Software Testing, Agile Project Management


Project Management

Managing Projects By The Numbers

Herding Cats - Glen Alleman - Sun, 03/15/2015 - 19:42

When we hear we don't need deadlines; we don't need estimates; we can slice work small enough to have each and every activity be the same size, with no variance; we can use small samples with ±30% variance to forecast future outcomes in the absence of any uncertainties; we're Bad at Estimates; and the plethora of other reasons for not doing the work needed to be proper stewards of our customer's money - we're really saying we weren't paying attention in the High School Statistics Class.

All project activities are probabilistic, driven by the underlying uncertainties of the individual random processes. When coupled together - only trivial projects have independent work activities - cost, schedule, and technical activities drive each other in non-linear, non-deterministic ways. Managing in the presence of this uncertainty means risk reduction work or margin. Both are needed.

When it's not our money, we are obligated to be stewards of the money from those paying for the production of the value. This is a core principle of all business operations.

The notion that those on the cost side of the balance sheet have an equal voice to those on the revenue side when it comes to managing the firm's money to produce products or services seems to be lost on those not accountable for managing the money.

Categories: Project Management

Pi Day

Herding Cats - Glen Alleman - Sat, 03/14/2015 - 14:30

March 14, 2015

3/14/15

at

9:26:53 AM

π = 3.141592653

Categories: Project Management

Estimating Probabilistic Outcomes? Of Course We Can!

Herding Cats - Glen Alleman - Fri, 03/13/2015 - 16:30

I've been at a client site in Sacramento, CA, off and on since last December. Driving back to the hotel tonight, I heard a radio show on the current Uniform California Earthquake Rupture Forecast (Version 3).


A former Colorado neighbor is an earthquake researcher, and we've had discussions about probabilistic estimation of complex systems and the complexity sciences.

The notion that we can't predict - to some level of confidence - outcomes in the future is of course simply not correct. Earthquake prediction is not technically possible in the popular sense; it's a complex probabilistic process.

Making forecasts - estimates of future outcomes - for software development projects is much less complex. The processes used to make these estimates range from past performance time series to multi-dimensional parametric models. Several tools are available for these parametric models. Steve McConnell provided an original one I used a decade or so ago. Steve provides some background on making estimates, where he speaks of the 10 deadly sins:

  • Confusing targets with estimates - the bug-a-boo of all #NoEstimates advocates. It's simple - DON'T DO THIS.
  • Saying yes when you mean no - no quantitative data and guessing means bad estimates - DON'T DO THIS.
  • Committing too early - use the cone of uncertainty.
  • Assuming underestimating has no impact on the project - DON'T DO THIS.
  • Estimating in the Impossible Zone - an estimate in the impossible zone has essentially no probability of coming true.
  • Overestimating savings from the use of new tools - DON'T DO THIS.
  • Using only one estimating technique - DON'T DO THIS.
  • Not using estimating software - DON'T DO THIS.
  • Not including risk factors - the primary sin of the simple small samples of stories or story points used to linearly forecast future performance. DON'T DO THIS.
  • Providing off-the-cuff estimates - this is called guessing. DON'T DO THIS.

When you need to estimate - as you do in any non-trivial project - make sure you're not committing any of the 10 sins Steve mentions.

So Why The Earthquakes and SW Estimates?

One process is a very complex and emerging science. The other is a well-developed mathematical process.

There is so much misinformation about estimating software development, it's hard to know where to start. From outright wrong math, to misuse of mathematical concepts, to failure to acknowledge that estimates aren't for developers, they're for those paying the developers.

Categories: Project Management

Will You Offer This for Free?

NOOP.NL - Jurgen Appelo - Fri, 03/13/2015 - 12:54

I have done many things for free.

All chapters of my new #Workout book have been available for free since I started writing them, because this helped me get early and fast feedback. I have done countless free talks for local communities, because it allowed me to learn about their needs and stories. And I have offered plenty of participants free seats in my workshops when they offered me some other value in return.

What I don’t do is give away stuff for free just because people ask.

The post Will You Offer This for Free? appeared first on NOOP.NL.

Categories: Project Management

Empirical Data Used to Estimate Future Performance Re-Deux

Herding Cats - Glen Alleman - Fri, 03/13/2015 - 05:05

Came across this puzzling tweet today

... real empirical data & using probability to forecast are worlds apart. I don't buy "estimation uses data" argument.

This always reminds me of Wolfgang Pauli's remark to a colleague who showed him a paper from an author who wanted Pauli to comment on it...

Das ist nicht nur nicht richtig, es ist nicht einmal falsch!
It is not only not right, it is not even wrong!

So Let's Look at a Simple Forecasting Process

First, forecasting is about the future. Estimates are about the past, present, and future. So estimates of future cost, schedule, and technical performance can be called forecasts.

A project's past performance data is a time series: data gathered from things that happened in the past, at intervals of time. These intervals should be evenly spaced. They don't have to be, but uneven spacing makes the analysis more complex. The example below is done with R, a statistical programming language found at www.r-project.org. R is used in a wide variety of domains. I was introduced to R through our son's work in cellular biology, when he pointed out I'd get a D in the BioStats class he taught. "Stop making linear projections, unadjusted for the variances of the past, and most of all unadjusted for variances created from uncertainty in the future. Come on Dad, get with the program of making risk-informed decisions." Here's a good reference on how to do this at a much broader scale: the Risk Informed Decision Making Handbook.

Below is an R plot from historical data of a project cost parameter, forecasting the possible values of this parameter into the future using ARIMA (Autoregressive Integrated Moving Average). ARIMA is built into R, which can be downloaded for free. R and its statistical analysis capabilities are used in our domain to develop estimates. Using past performance - in the example below, a cost index - we can forecast the range, and the confidence bounds on that range, for future cost index values.
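The post's chart was produced with R's ARIMA fitting. To show the shape of the idea without R, here is a minimal sketch in Python that fits a simple AR(1) model (the "AR" part of ARIMA) to a made-up cost-index series and projects a forecast with widening confidence bounds. The data and the model order are illustrative assumptions only; a real analysis would use R's arima() or a full ARIMA package with proper order selection.

```python
# Fit x[t] = c + phi * x[t-1] + e by least squares, then forecast
# `steps` ahead with approximate 95% confidence bounds that widen
# as the forecast horizon grows - the behavior the R chart shows.

def ar1_forecast(series, steps):
    """Return [(point, lower, upper), ...] for `steps` future periods."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    c = my - phi * mx
    # Residual variance drives the widening of the confidence bounds.
    resid = [b - (c + phi * a) for a, b in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)
    forecasts, last = [], series[-1]
    for h in range(steps):
        last = c + phi * last
        var = s2 * sum(phi ** (2 * k) for k in range(h + 1))
        half = 1.96 * var ** 0.5          # ~95% interval half-width
        forecasts.append((last, last - half, last + half))
    return forecasts

# Hypothetical monthly cost-index observations (illustrative only).
cost_index = [1.02, 0.98, 1.01, 0.97, 0.95, 0.96, 0.93, 0.94, 0.92, 0.91]
for point, lo, hi in ar1_forecast(cost_index, 3):
    print(f"forecast {point:.3f}  (95% bounds {lo:.3f} .. {hi:.3f})")
```

The key property, as in the R plot, is that the interval around each forecast grows with the horizon: the further out you look, the less confident the estimate.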

[Chart: ARIMA forecast of the cost index, with confidence bounds]

The chart above is from the paper Earned Value Management Meets Big Data.

So when you hear about empirical data being used for forecasting the future, or you hear the statement in the opening of this post, ask several questions:
  • Do you have any experience forecasting future outcomes from past performance that is mathematically credible?
  • Did you adjust your forecast for past variances?
  • Did you adjust your forecast for future uncertainty?

No? Then it's unlikely your number will have any chance of being correct.

We Have No Empirical Data, Now What?

Here's a continuation of the Tweet stream

Have you ever been asked to estimate something and haven't got any empirical data. This happens all the time. New teams are put together in new domains and asked to estimate, which really means commit. I don't see too many managers gathering real data about their projects and using them to forecast lead times.

Here's the way to solve this non-problem, problem.

This is a short list, just from my office book shelf. The office library has dozens of other books and the files have many dozens of recent papers on estimating software in the absence of empirical data. Google will find you 100's more.
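One standard technique from that estimating literature, usable when there is no historical data at all, is the three-point (PERT) estimate: experts supply optimistic, most-likely, and pessimistic values, combined as a Beta-distribution approximation. The numbers below are hypothetical; the formula itself is the classic one.

```python
# Three-point (PERT) estimate: combine expert-judgment bounds into an
# expected value and a standard deviation, no empirical data required.

def pert(optimistic, most_likely, pessimistic):
    """Return (expected value, standard deviation) per the PERT formula."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical expert judgment for a feature, in days.
e, sd = pert(10, 15, 32)
print(f"expected: {e:.0f} days, std dev: {sd:.1f} days")
```

The point is not the specific formula but that an estimate with a stated uncertainty can always be produced from expert judgment, then refined as real data arrives.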

So here's the final outcome. Whenever we hear reasons we can't estimate, they're simply not true, never were true, never will be true.

Categories: Project Management

Managing by (mis)quoting Deming

Herding Cats - Glen Alleman - Wed, 03/11/2015 - 16:21

It is very popular to quote a person who made a significant contribution to some field. Deming is popularly used in the Agile domain for his quotes. I use them often, sometimes out of context.

A classic recent out of context quote is...

"Management by numerical goal is an attempt to manage without knowledge of what to do, and in fact is usually management by fear." -Deming

A counter quote is...

As Deming said: "A numerical goal without a method is nonsense." and then he said "Where there is fear you do not get honest figures."

So cherry picking quotes is very common, when a narrow view of a topic needs to be supported.

Here's a short story about measurement and numbers, from - Andrea Gabor, The Capitalist Philosophers (New York: Times Business, 2000), p. 143.

In the dense fog of a dark night in October, 1707, Great Britain lost nearly an entire fleet of ships. This was not from a battle at sea. Admiral Clowdisley Shovell simply miscalculated his position in the Atlantic and his flagship smashed into the rocks of the Scilly Isles, a tail of islands off the southwest coast of England. The rest of the fleet, following blindly behind, went aground as well, piling onto the rocks, one  after another. Four warships and 2,000 lives were lost.

For such a proud nation of seafarers, this tragic loss was embarrassing. To be fair to Admiral Shovell, it was not actually surprising. The concept of latitude and longitude had been around since the first century B.C. But in 1700 no one had devised an accurate way to measure longitude, meaning that nobody ever knew for sure how far east or west they had traveled. Seamen like Shovell estimated their progress by guessing their average speed or by dropping a log over the side of the boat and timing how long it took to float from bow to stern. Forced to rely on such crude measurements, the admiral can be forgiven his massive misjudgment. What caused the disaster was not the admiral's ignorance, but his inability to measure something that he already knew to be critically important - in this case, longitude.

One small reminder for all those quoting anyone out of context; here's my contribution:

Over 150 years ago the Irish mathematician and physicist Lord Kelvin reminded us: "When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind...."

So whenever you hear a quote, even my quotes, go test the context and domain of that quote to see if the poster has done his homework, followed the thread of the speaker or writer of the quote, and connected that quote in some meaningful way to the problem at hand. If not, it's a nice platitude.

Categories: Project Management

Software Development Linkopedia March 2015

From the Editor of Methods & Tools - Wed, 03/11/2015 - 15:46
Here is our monthly selection of knowledge on programming, software testing and project management. This month you will find some interesting information and opinions about software development culture, project estimation, checking the health of your project team, the costs and benefits of unit testing, product backlog management, mobile architecture, test coverage and user experience. Blog: Culture Change: Reinventing Organizations Blog: Why you suck at estimating - a lesson in psychology Blog: Squad Health Check model - visualizing what to improve Blog: Selective Unit Testing - Costs and Benefits Article: The Significance of Release Retrospectives Article: Mobile ...

Quote of the Day

Herding Cats - Glen Alleman - Wed, 03/11/2015 - 14:44

When you have mastered numbers, you will in fact no longer be reading numbers, any more than you read words when reading books. You will be reading meanings.
- W. E. B. Du Bois, American sociologist, historian, civil rights activist

As project managers, numbers and the information they convey are the glue that holds together the process of delivering value to the customer.

These numbers are plans, estimates, measures of Effectiveness, Performance, Risk, and Value. Their units of measure must always be meaningful to the decision makers. This is a fundamental issue when the units are useful only to a specific group.

Story Points or Stories are not units of measure found on the balance sheet. The primary units of measure for the business decision maker are Dollars and Time. All deliverables have a value measured in some form in Dollars, and a time when that value will be available to start accruing.

When we speak in units of measures not meaningful to the decision makers, we obfuscate conversation, and become frustrated when those we should be informing don't understand what we are trying to convey. 

When we hear you don't understand what I am telling you, it is not the obligation of the listener to understand in the presence of meaningless units. It is the obligation of the speaker. Classroom teachers know this, flight instructors know this, baseball coaches know this. Any mentor, coach, advisor, or consultant knows this.

Categories: Project Management

Open Loop Thinking v. Close Loop Thinking

Herding Cats - Glen Alleman - Wed, 03/11/2015 - 06:12

Inspired by the book Getting Things Done (GTD), my training and experience in control systems, Systems Engineering, and process improvement, I see two classes of thinkers when it comes to decision making.

The Open Loop thinker asserts they are efficient in that they only do what they need to in order to get what they want. The classic is why estimate our work, let's just start coding, estimates are a waste of time. The Open Loop process really says a waste of MY time, forget about those paying me, or those accountable for balancing the budget in exchange for the value produced by my work.

The Closed Loop thinker considers themselves responsible, and they do more than they need to for the greater good. In the development case, the greater good is the good of the organization providing their salary. That salary comes, of course, from customers who pay for goods or services at a competitive price. The open looper sees the immediate result and, upon achieving the desired result, wanders off to find something else of interest. The closed looper sees not only the result, but the effect on others, the need to repeat the result again in the future, and everything along the way to that result that could be improved.

The GTD paradigm introduces the notion of Closing the Loops. The idea of closing the loop means you're not DONE until you've returned your environment to a stable state. This means the following conditions must be present:

  • We know what DONE looks like in units of measure meaningful to the decision makers, not just those providing the solution.
  • We have a steering target that guides us along the path to DONE.
  • We have measures of actual progress to plan that produce error signals we can use to take corrective actions to stay on the path toward DONE.
  • We have corrective action plans to correct our path to assure, to some level of confidence, we are going to Land Soft on or before the day the needed capabilities are NEEDED, for the cost needed to meet the business objectives.
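The four conditions above are exactly the elements of a feedback controller: a steering target, a measurement, an error signal, and a corrective action. Here is a minimal sketch with hypothetical numbers - planned percent complete per period as the steering target, measured actuals, and a tolerance band that triggers corrective action.

```python
# Closed-loop steering: compare planned progress (the target) against
# actual progress each period, produce an error signal, and decide on
# a corrective action. All figures are made up for illustration.

def close_the_loop(planned, actuals, tolerance=0.05):
    """Yield (period, error, action) for each measurement period."""
    for period, (plan, actual) in enumerate(zip(planned, actuals), 1):
        error = plan - actual                 # the error signal
        if abs(error) <= tolerance:
            action = "on plan - no correction needed"
        elif error > 0:
            action = "behind plan - apply corrective action"
        else:
            action = "ahead of plan - harvest margin"
        yield period, error, action

planned = [0.10, 0.25, 0.45, 0.70, 1.00]   # steering target toward DONE
actuals = [0.09, 0.20, 0.35, 0.60, 0.90]   # measured actual progress
for period, error, action in close_the_loop(planned, actuals):
    print(f"period {period}: error {error:+.2f} -> {action}")
```

Open Loop decision making skips the comparison step entirely: there is no error signal, so there is nothing to correct against.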

This of course is the definition of Closed Loop control. Open Loop control has no steering target, no corrective action process - the Closed part of Closed Loop control - and no mechanism to forecast the estimate to complete. Closing the loop requires not only measuring past performance, but establishing the Target performance needed to show up on time, on budget, with the needed capabilities.

This target performance defines what the performance needs to be at each point in time to meet the planned delivery, at the planned cost, with the planned capabilities. Along the way, actual measures of performance close the loop by generating an error signal used to take corrective actions to stay on plan.

Categories: Project Management

Managing in the Presence of Uncertainty

Herding Cats - Glen Alleman - Wed, 03/11/2015 - 04:46

All project work is driven by underlying uncertainty from people, processes, technology, and tools. Let me restate for clarity: Certainty is an Illusion. If you're looking for certainty, you're not going to find it in the project domain. You're not going to find it anywhere in the real world.

So the big question is: how can we manage in the presence of uncertainty? There is only one way. OK, there are two ways.

The first way is to ignore the uncertainties and pretend that what happened in the past is going to happen in the future, and that what might happen in the future can be controlled when it happens, or we can just ignore it and keep moving on. We can also pretend that small samples of what happened in the past are sufficient to forecast the Happy Path to what is going to happen in the future. Both of these are naive from a management perspective, and they are mathematically wrong.

Now the second way. Since all project work is driven by underlying uncertainty, the result is a probabilistic outcome of anything we do, like the picture below. Each activity is governed by its probability distribution. When these activities are interconnected, or even when they are serial, the result is a new probability distribution for the STOP date and cost. As an aside, these distributions are not multiplied - there is no multiply operator between probability distributions. The numbers produced by the distributions, the possible values that can result, are represented by integral or differential equations. The same holds for the simple linear series of work so popular in some agile communities trying to apply Little's Law - which, by the way, requires each random variable to be independent and identically distributed (I.I.D.), which in any non-trivial project is simply not the case.

[Figure: interconnected project activities, each governed by its own probability distribution]

 

The actual schedule for the notional example above, with some defined probability distributions and the application of a Monte Carlo simulator, produces this probabilistic outcome for a deliverable. The tool that produces this outcome is RiskyProject from Intaver. There are several products like this, but this is my favorite. It starts with the Risk Register for both reducible and irreducible risks, with all the probability settings, including residual probability after mitigation; connects these risks to the work in the Integrated Master Schedule; and produces charts like the one below, showing the probability distribution function and the cumulative distribution function for a specific deliverable. For example, this shows a 79% chance of completing on or before March 13, 2018 for the item in the schedule.

[Chart: probability distribution and cumulative distribution of the completion date for a deliverable]
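A toy version of what a tool like RiskyProject does can be sketched in a few lines: give each serial activity a probability distribution, simulate the project many times, and read confidence levels off the resulting completion-date distribution. The three activities, their triangular distributions, and the deadline below are all made-up numbers for illustration.

```python
# Toy Monte Carlo schedule simulation: three serial activities, each
# with a triangular duration distribution (low, most likely, high) in
# days. The total duration is itself a distribution, from which we can
# read the probability of finishing by a deadline.
import random

random.seed(7)
activities = [(8, 10, 15), (4, 6, 12), (10, 14, 22)]   # (low, mode, high)

trials = 20_000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in activities)
    for _ in range(trials)
)

deadline = 36.0
confidence = sum(t <= deadline for t in totals) / trials
p80 = totals[int(0.80 * trials)]           # 80th-percentile finish date
print(f"P(finish <= {deadline} days) = {confidence:.0%}")
print(f"80% confident of finishing within {p80:.1f} days")
```

Note that the total is not the sum of the most-likely values (30 days here): the skewed distributions push the probable completion later, which is exactly why a point estimate without a confidence level misleads.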

So, since we shouldn't - and actually can't - ignore the underlying probabilistic behaviour of our project's elements, what can we do? Well, we can manage in the presence of these uncertainties.

The basis of all decision making in the presence of uncertainty is estimating. If you have no uncertainty, making decisions is easy, in fact trivial. If you have a project in which decisions can be made without the help of estimates, it is likely that project is itself easy and possibly even trivial. Those types of projects have little or no uncertainty.

Assuming your project is not trivial, here's one way to manage in the presence of uncertainty.

Here's some more background from a recent conference on managing cost and schedule in the presence of uncertainty.
Categories: Project Management

How Full to Fill a Sprint

Mike Cohn's Blog - Tue, 03/10/2015 - 15:00

An important consideration in commitment-driven planning is how full to fill the sprint. To answer that, we need to understand that a sprint includes three types of time.

The first type of time is corporate overhead. Corporate overhead includes everything required to be a good citizen in today’s corporate world. That includes time spent in all-company meetings, reading emails, going to HR sensitivity training classes, and even attending the various meetings for a team’s agile process.

The second type of time is plannable time. This is the time that team members can plan to use on the real work of the sprint. A team member does not, however, want to fill the sprint too full. Team members need to leave room for some unplanned time, which is the third type of time.

Unplanned time goes toward three things:

  1. Emergencies. This is something like the server going down and your team being asked to fix it.
  2. Tasks that get bigger than expected. For example, in sprint planning you think a task will take 8 hours. Later, after spending 6 hours on it, you realize you still have 6 to go.
  3. Tasks that are not identified in sprint planning. No matter how hard you try during sprint planning, you aren’t going to think of everything.
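The three types of time reduce to simple arithmetic. The figures below - team size, focus hours, overhead hours, and a 20% unplanned buffer - are hypothetical, not recommendations from the post.

```python
# Sprint capacity as the three types of time: total hours, minus
# corporate overhead, minus an unplanned-time buffer, leaves the
# plannable time. All numbers are hypothetical.

team_size = 5
sprint_hours = team_size * 10 * 6          # 10 working days, 6 focus hrs/day
corporate_overhead = team_size * 8         # meetings, email, HR training
unplanned_buffer = 0.20 * sprint_hours     # reserve for the 3 categories above

plannable = sprint_hours - corporate_overhead - unplanned_buffer
print(f"sprint capacity: {sprint_hours} h")
print(f"plannable time:  {plannable:.0f} h")
```

A team with heavy overhead or poor estimating accuracy would shrink `plannable` by raising the overhead or buffer terms, which is the comparison the next paragraphs walk through.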

Graphically, I think of the sprint as a rectangle, or anything with area, as you can see in the figure here.

The first thing we put into our sprint is corporate overhead. Next, the team loads the sprint with plannable time - but not so full that the slightest problem causes the sprint to overflow. They leave room at the top of the rectangle for unplanned time: emergencies, tasks that get bigger, and tasks they don't think of.

Determining how full a team should fill a sprint is really a question of how much plannable time they have. Consider two teams. The first is part of a large organization with lots of corporate overhead.

This team is always responsible for second-tier tech support on the existing application while trying to move forward on a next-generation product. This team may not have much plannable time at all.

The second team is two programmers in a garage working on a startup’s first product. Corporate overhead for them is who is going to order lunch for the day. They might have much more plannable time than the first team.

Similarly, consider an amazingly agile team that sucks at estimating their work. They are horrible at it. They think something will take two hours and it takes 10. And they rush through planning, so they fail to identify many tasks they could have identified if they’d tried harder.

Because they’re so bad at estimating, this team might need to leave a lot of unplanned time in their sprint. Their plannable time would be less than that of an equivalent team that could estimate better.

This means a team’s amount of plannable time cannot really be used as a measure of how hard that team works or of how good they are.

If you are an experienced agile team, you’ll find that your amount of plannable time is also the vertical intercept on your sprint burndown chart. This is sometimes called a team’s capacity. This is distinct from velocity, which should always be measured in the units used on a team’s product backlog items, usually story points.

Criteria for a "Good" Estimate

Herding Cats - Glen Alleman - Tue, 03/10/2015 - 14:51

There are a large number of statements about estimating that are flawed from the start: from Dilbert's manager's use of estimates, to naive use of statistics, to suggesting the future cannot be modeled, projects are too unstable, estimates are of no value, to my favorite - making estimates for our work takes away from doing our work. None of these notions - and they are certainly notional (existing only in theory or as a suggestion or idea) - has connected the dots between spending money to produce value and the need of those providing the money to know how much, when, and what.

Let's begin with what a good estimating process needs to look like:

  • A good estimating process produces good estimates for all the quantities we need without exceeding the resources allocated for the estimate. Estimating is a process, just like designing code is a process, or developing a marketing campaign for the product we just released is a process. This means when you hear we can't spend more time on the estimate than the estimate is worth, remember the speaker is restating the obvious.
  • The primary requirement for the estimate is to provide a value for some quantity with a known and appropriate level of accuracy. All estimates by their nature have errors.
  • The accuracy of an estimate depends on two things:
    • The inherent accuracy of the estimating technique or model.
    • The errors in the input parameters.
  • A good estimate lets us assess the amount of risk associated with a bad estimate. If we underestimate the effort needed to patch the SQL Server over the weekend, what is the cost of not being able to start the production server on Monday morning?
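The two accuracy components - model accuracy and input-parameter error - combine in a predictable way. A minimal sketch, assuming independent error sources that combine in quadrature and wholly hypothetical figures for size and productivity:

```python
# Error propagation for an estimate: a model multiplies an uncertain
# input (size) by an uncertain productivity factor, so the relative
# errors combine roughly in quadrature when independent.

def combined_rel_error(model_rel_err, input_rel_err):
    """Combine two independent relative errors in quadrature."""
    return (model_rel_err ** 2 + input_rel_err ** 2) ** 0.5

size_kloc, size_err = 50, 0.20           # estimated size, ±20% (hypothetical)
days_per_kloc, model_err = 4, 0.10       # model productivity, ±10% (hypothetical)

estimate = size_kloc * days_per_kloc
rel = combined_rel_error(model_err, size_err)
print(f"estimate: {estimate} days ±{rel:.0%}")
```

The practical consequence: improving the model alone cannot make the estimate better than the quality of its inputs, and vice versa.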

First A Bit About Ordinal and Cardinal Values in General

  • Cardinal numbers describe how many: there were 8 rabbits in the garden this morning.
  • Ordinal numbers describe position or relation: there were a lot more rabbits in the garden this morning than there were yesterday.

When we use Ordinal numbers, we can determine the relative relation between items in our estimate: the effort to develop Function A is larger than Function B. But that doesn't tell us how long it will take to do either. Only after we know how long one takes can we tell the relative effort of the other.

If we know something about one of the Functions - that is, we had a calibration of the estimate for that Function - we now have a Cardinal number. We estimate that Function A will take 16 days ±3 days, and Function B is only ½ that effort, so we estimate Function B will take 8 days ±1½ days, all things being equal. This last part is the source of most estimating errors. We make assumptions that turn out not to be true.
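The Function A / Function B example above is just scaling relative sizes by one calibrated anchor. A minimal sketch of that calibration step, using the same 16 ±3 day anchor (function names and the spread-scales-proportionally assumption are illustrative):

```python
# Turn Ordinal (relative) sizes into Cardinal estimates by calibrating
# against one known item. The uncertainty is assumed to scale in
# proportion to the estimate, as in the ±3 / ±1.5 example above.

def calibrate(relative_sizes, anchor_name, anchor_days, anchor_spread):
    """Scale relative sizes by one calibrated item -> {name: (days, spread)}."""
    scale = anchor_days / relative_sizes[anchor_name]
    spread_ratio = anchor_spread / anchor_days
    return {
        name: (size * scale, size * scale * spread_ratio)
        for name, size in relative_sizes.items()
    }

relative = {"Function A": 2.0, "Function B": 1.0}      # Ordinal: A is 2x B
estimates = calibrate(relative, "Function A", 16, 3)   # A calibrated: 16 ±3 days
for name, (days, spread) in estimates.items():
    print(f"{name}: {days:.0f} days ±{spread:.1f}")
```

Until the anchor is calibrated, the relative sizes carry no Cardinal information at all - which is the post's core complaint about uncalibrated points.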

In our domain, there are four root causes of program performance issues. Unrealistic cost and schedule estimates are in that list. But all four have estimating errors at their root - either risks, direct estimates, or expectations that were improperly estimated to be possible.

[Figure: four root causes of program performance issues]

Measures are the Raw Material of Good Estimating

When we use uncalibrated numbers in our estimate, we can make decisions based on relative measures. I like Chocolate ice cream more than I like Vanilla. That's a decision based on an Ordinal measure. I estimate I can only play 9 holes of golf today, because it will take me 2 hours and I have to leave for a meeting in 3 hours with a 30 minute drive. These numbers are Cardinal. 

So Why Do We Estimate? Let Me Count the Ways

Estimates are needed to make decisions in the presence of uncertainty. This is a fundamental principle of making choices in the presence of uncertainty. If it's Chocolate or Vanilla we don't have much uncertainty - unless we never tasted them before. 

No matter how many times it is conjectured that decisions can be made without estimates, no matter how much this is repeated, and no matter how desperately those making the conjecture want it to be true, it is simply not true when there is uncertainty about the future.

Since all project work is based on uncertainty (even production line work has uncertainty), the notion of making a decision without estimating the impact of that decision is mathematically impossible.


So why do we estimate? There are three simple answers:

  • To control the project. Without a target to steer toward and an estimate of the time, cost, and probability of success of the efforts toward that target, we don't know if the target can be achieved. Once we're underway, new information is available to take corrective actions on the variances between planned progress and actual progress. This is the basis of Closed Loop control. Without a steering target, the estimated progress needed to reach the target as planned, measures of actual progress to plan, the difference between planned and actual, and the corrective actions to get back on plan - we're going to be late, over budget, and our gadget is likely not to work.
  • To have the business understand the potential cost, duration, and probability of success. It's not our money. We may hear statements about the waste of estimates, but it's not our money.
  • To estimate future performance. If we're working on a project that has a duration beyond our ability to see all the work in detail, we need an estimate of how long, how much, and what can be done to produce the needed value for those paying for our work.

Estimates are measurements about the future

Any measurement is the result of applying a procedure. An operational definition puts communicable meaning into a concept. - W. Edwards Deming

Estimates (measurements of future outcomes) must address assumptions of scope, units of measure (Cardinal values), and the conditions of the measures (estimates).

So if we want to understand something about the future in order to inform our decision making, we need to estimate - in some way - what are the possible outcomes we will see in the future and what happens if we pick one of those outcomes compared to another. This is the foundation of the Microeconomics of writing software for money. To attempt to make decisions in the absence of an estimate of the impact of that decision ignores - likely with intent - the foundation of all business decision making.

Categories: Project Management

Best Mindfulness Books

NOOP.NL - Jurgen Appelo - Mon, 03/09/2015 - 13:20

I have developed an interest in mindfulness and meditation, from a scientific perspective. For this reason, I did a bit of research on GoodReads to discover which are the best mindfulness books. Unsurprisingly, I found a lot of religious works among these books. Those were not what I was looking for. In my not-so-humble opinion, religion is spirituality for the ignorant mind, while enlightenment is spirituality for the curious mind.

The post Best Mindfulness Books appeared first on NOOP.NL.

Categories: Project Management

Fibonacci Numbers, Agile, and the Actual Mathematics

Herding Cats - Glen Alleman - Mon, 03/09/2015 - 05:24

In Agile, Fibonacci numbers are the latest rage. But it's just that: the latest rage. The reason given for using the Fibonacci sequence is to reflect the inherent uncertainty in estimating larger items - the larger the number on the card (in the planning poker paradigm), the larger the assumed uncertainty.

There is, of course, no evidence that a large number in the Fibonacci series is any more uncertain than a large number in a Geometric series of 1, 2, 4, 8, 16, 32, ... compared to Fibonacci's series 1, 1, 2, 3, 5, 8, 13, 21, ...

The Geometric series is below.

[Figure: squares of doubling area, illustrating the Geometric series]

 

But there is a very interesting relationship between the Fibonacci and Geometric series. The connection is φ (Phi). Phi is an irrational number connected with the classical problem of dividing a line segment in mean and extreme ratio.

φ = (1 + √5) / 2 ≈ 1.6180339887

Phi is the limiting ratio of adjacent terms of the Fibonacci series, playing the same role that the constant ratio plays in a Geometric series. 

Both the Fibonacci series and the Geometric series can be used to assess the differences between items - difficulty, cost, duration - or any Ordinal comparison. The Fibonacci series is just a clever approach with no actual differentiation from the Geometric.
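This relationship is easy to check numerically - the ratio of adjacent Fibonacci terms converges to Phi, while the Geometric series keeps a constant ratio of 2:

```python
# Check the claim numerically: adjacent-term ratios of the Fibonacci series
# converge to Phi, while the Geometric series has a constant ratio of 2.
def fibonacci(n):
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq

fib = fibonacci(15)                # 1, 1, 2, 3, 5, 8, 13, 21, ...
geo = [2 ** i for i in range(15)]  # 1, 2, 4, 8, 16, 32, ...

fib_ratios = [fib[i + 1] / fib[i] for i in range(len(fib) - 1)]
geo_ratios = [geo[i + 1] / geo[i] for i in range(len(geo) - 1)]

phi = (1 + 5 ** 0.5) / 2           # 1.6180339887...
print(abs(fib_ratios[-1] - phi) < 1e-3)   # True - ratios approach Phi
print(all(r == 2.0 for r in geo_ratios))  # True - constant ratio of 2
```

Either series separates adjacent values by a roughly constant multiplier, which is all the separation does.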

Now for the Point

If agile is using Fibonacci numbers or Geometric numbers as RELATIVE comparisons of work, complexity, or anything else, this is seriously flawed, since these numbers are not calibrated - they are relative. Relative is not Bad per se, but it is also not Good, because the business is very unlikely to carry Fibonacci numbers of Story Points in the general ledger. 

What needs to happen is to estimate work in Cardinal units - measures that have been calibrated in a credible manner. Sampling 20 or so past sprints without considering their variance, changes in risk, and future changes in all the variables that will impact the project is naive at best and bad math at worst.
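As a rough illustration - using made-up velocity numbers - even a 20-sprint sample leaves a wide interval around the mean once variance is accounted for:

```python
import statistics

# Made-up velocities for 20 past sprints (hypothetical data): even this
# sample leaves a wide interval around the mean once variance is considered.
velocities = [21, 34, 18, 27, 40, 22, 31, 19, 25, 36,
              28, 17, 33, 24, 30, 20, 38, 26, 23, 29]

mean = statistics.mean(velocities)
stdev = statistics.stdev(velocities)   # sample standard deviation
n = len(velocities)

# Approximate 95% confidence interval for the mean (t = 2.09 for 19 dof)
margin = 2.09 * stdev / n ** 0.5
print(f"mean velocity {mean:.1f} +/- {margin:.1f} points per sprint")
```

And this interval only captures sampling error - it says nothing about changes in risk or the other variables that will move future sprints away from past ones.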

So while it may sound cool to be tossing around Fibonacci numbers in those planning sessions, you might as well make up any progression series and use it, because the outcome is the same. This notion is simply a marketing ploy that somehow entered the vocabulary of agilists. Both the Fibonacci series and the Geometric series provide a simple way to separate adjacent Ordinal measures so they don't appear related.

 

Categories: Project Management

I Think You'll Find It's a Bit More Complicated Than That

Herding Cats - Glen Alleman - Mon, 03/09/2015 - 00:57

When we encounter simple answers to complex problems, we need not only to be skeptical, but also to think twice about the credibility of the person posing the solution. A recent example:

The cost of software is not directly proportional to the value it produces. Knowing cost is potentially useless information.

The first sentence is likely the case. The Value of any one feature or capability is not necessarily related to its cost, since cost in software development is nearly 100% correlated with the cost of the labor needed to produce the feature.

But certainly the cost of developing all the capabilities - and the cost of individual capabilities when their interactions are considered - must be related to their value, or the principles of the Microeconomics of Software Development would no longer hold.

Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources. Those limited resources include (but are not limited to) Time and Money.

So without knowing the cost or time it takes to produce an outcome, the simple decision making process of spending other people's money based on the Return on that Investment gets a divide by zero error:

ROI = (Value - Cost) / Cost
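The formula, with hypothetical numbers, and its failure mode when cost is unknown (treated as zero):

```python
# The ROI formula above, with hypothetical figures. When cost is unknown
# and treated as zero, the computation fails - there is no denominator.
def roi(value, cost):
    """Return on Investment: (Value - Cost) / Cost."""
    if cost == 0:
        raise ZeroDivisionError("ROI is undefined when cost is unknown")
    return (value - cost) / cost

print(roi(150_000, 100_000))  # 0.5 - a 50% return on the investment
```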

Since all elements of a project are driven by statistical processes, the outcomes are always probabilistic. The delivered capabilities are what the customer bought. Cost and Schedule are needed to produce those capabilities. The success of the project in providing the needed capabilities depends on knowing the Key Performance Parameters, the Measures of Effectiveness, the Measures of Performance, and the Technical Performance Measures of those capabilities and the technical and operational requirements that implement them.

The cost and schedule to fulfill all these probabilistic outcomes is itself probabilistic. It is literally impossible to determine these outcomes in a deterministic manner when each is a statistical process without estimating. The Cost and Schedule elements are also probabilistic, further requiring estimates.
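A minimal Monte Carlo sketch of this idea, with entirely hypothetical cost and schedule ranges, showing why the answer is a distribution with confidence levels rather than a single number:

```python
import random

# Monte Carlo sketch with hypothetical ranges: cost ($K) and schedule
# (months) are each modeled as triangular distributions (low, high,
# most-likely), so the project outcome is probabilistic by construction.
random.seed(7)  # fixed seed for repeatability

def simulate(trials=10_000):
    samples = []
    for _ in range(trials):
        cost = random.triangular(80, 160, 100)  # low, high, mode in $K
        schedule = random.triangular(4, 9, 6)   # low, high, mode in months
        samples.append((cost, schedule))
    return samples

results = simulate()
costs = sorted(c for c, _ in results)
scheds = sorted(s for _, s in results)
p80_cost = costs[int(0.8 * len(costs))]
p80_sched = scheds[int(0.8 * len(scheds))]
print(f"80% confidence: cost <= ${p80_cost:.0f}K, schedule <= {p80_sched:.1f} months")
```

A real model would also couple cost and schedule - they are rarely independent - but even this sketch shows that "the" cost does not exist; only a cost at a stated confidence does.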


The notion that you can determine the Value of something without knowing its Cost is nonsense. Anyone suggesting that is the case has little understanding of business, the microeconomics of software development, or how the business world treats expenditures of other people's money.

Here's some background to help in that understanding:

And for any suggestion that cost is not important in determining value, please read:

Between this last book, the books above, and all the papers, articles, and training provided about how to manage other people's money when producing value from software systems, you'll hopefully come to realize that the notions that we don't need to know the cost, can't know the cost, are poor at making estimates, and should simply start coding and see what comes out are not only seriously misinformed, but misinformed with intentional ignorance.

If your project is not using other people's money, if your project has low value at risk, if your project is of low importance to those paying, then maybe, just maybe, they don't really care how much you spend, when you'll be done, or what will result. But that doesn't sound very fulfilling where I live.

And a final BTW: using a small sample of past performance without adjusting for variances, irreducible (aleatory) uncertainty, and reducible (epistemic) uncertainty - all three create risks - is simply driving in the dark with the lights off. A simple High School statistics class will show how Not to make that type of forecast. Once you enter the domain of spending other people's money - at least a non-trivial amount of money - much better statistical forecasting will be needed if you have any hope of showing up before the need date, below the planned budget, and with the required capabilities to earn back the cost in exchange for delivered value. To learn how to use probability and statistics for estimating software intensive systems, buy and read the book to the left.

Related articles Start with Problem, Than Suggest The Solution Estimating is Risk Management Software Engineering is a Verb
Categories: Project Management

Quote of the Day

Herding Cats - Glen Alleman - Sat, 03/07/2015 - 20:25

Unless you tackle a problem that's already been solved, which is boring, or one whose solution is clear from the beginning, mostly you're stuck 
- Yitang Zhang, from an interview about his solution to the "Bounded Gaps Between Primes" problem

Categories: Project Management

Why We Need Governance

Herding Cats - Glen Alleman - Sat, 03/07/2015 - 17:20

Governance (in the software development domain) is based on specifying the decision rights and accountabilities needed to elicit desired behaviours in the development and sustainment of products and services.

Business Governance

The set of decisions that defines expectations, grants power, and verifies performance.

Information Technology Governance

Firms with superior software development governance have 25% higher profits than firms with poor governance, given the same strategic objectives. 
These top performers have custom designed product development governance for their strategies. 
Just as corporate governance aims to ensure quality decisions about all corporate assets, software development governance links decisions with company objectives and monitors performance and accountability. 

Program Governance

Program Governance is the framework that ensures projects are conceived and executed in accordance with best project management practices, within the wider framework of organizational governance processes.
Effective program governance ensures projects deliver their expected value.
An appropriate governance framework ensures all expenditures are appropriate for the risks being managed.
A program governance approach is not about micromanagement; it is about setting terms of reference and an operating framework, defining boundaries, and ensuring planning and execution are carried out in a way that assures all projects deliver the planned benefits.

The Framework for Program Governance

  • Connect project performance measures with business performance measures through policies, practices, procedures, processes, and tools.
  • Measure and manage the spend for the value produced from project work, including software development, infrastructure, customer support, testing, quality assurance, validation and other support functions.
  • Assure accountability of organizations and individuals in their participation in product development and sustainment processes through performance reporting and variance analysis against planned performance of cost, schedule, and technical outcomes.
  • Increase the maturity of product development, release, and sustainment processes to transform the organization to increase the effectiveness of all work activities.

Putting Program Governance to Work

  • Align the processes of software development, testing, quality assurance, release management, and operations with the business needs.
  • Provide predictable, consistent processes that meet customer expectations.
  • Enable efficient and effective delivery of products and services.
  • Enable measurable, improvable processes that can be tuned for accurate delivery and overall effectiveness of product or service offerings.
Related articles I Think You'll Find It's a Bit More Complicated Than That What is a Team?
Categories: Project Management

What is a Team?

Herding Cats - Glen Alleman - Thu, 03/05/2015 - 21:46

There is a recurring discussion in many domains, especially software development, and especially agile development, about what a team is. From my experience in the military, then leading project managers, then leading program management offices, getting the definition of a team right up front is a critical success factor.

My favorite definition comes from Jon Katzenbach:

A team is a small number of people with complementary skills who are committed to a common purpose, performance goals, and approach for which they are mutually accountable. (Katzenbach and Smith, 1993)

Each portion is critical to the success of the team.

  • A small number - too many people get confused, have overlapping roles, and can't focus on a single outcome.
  • Complementary skills - duplicate skills, roles, and responsibilities are a waste.
  • Committed to a common purpose - why are we here, what are we building, what does done look like in tangible units of measure meaningful to the decision makers?
  • Mutually accountable - this is the killer concept. If we don't hold each other mutually accountable, we can't have a shared outcome, and our efforts are for naught.

So when you hear about Mob Programming or even Pair Programming, ask if those ideas meet the Katzenbach test of a team.

One of the best examples of TEAM is in this. And I suspect that most working in small teams or even large commercial organizations don't think of teams in this manner. Our Space and Defense domain usually does.

Categories: Project Management

The Use, Misuse, and Abuse of Complexity and Complex

Herding Cats - Glen Alleman - Wed, 03/04/2015 - 16:54

Our world is complex and becoming more complex all the time. We are connected and in turn driven by a complex web of interacting technology and processes. These interacting technologies and processes are implemented by information and communication technologies that are themselves complex. It is difficult to apply such a broad topic as complexity to the equally broad topic of developing software systems or even broader topic of engineered systems.

Measuring complexity in engineered systems is a highly varying concept.1

These complex systems many times create complexity. But care is needed in tossing around words like complex and complexity. If these systems are in fact Engineered, rather than simply left to emerge on their own, we can apply some principles to control the unwanted complexity of these complex systems.

First some definitions. These are not the touchy-feely definitions found in places like Cynefin, where the units of measure of complex, complexity, and chaos are nowhere to be found. Cynefin was developed in the context of management and organizational strategy by David Snowden. We're interested in the system complexity of things, the people who build them, and the environments where they are deployed. But this also means measuring complex and complexity in units meaningful to the decision makers. These units must somehow be connected to the cost, schedule, and probability of success for those paying for the work.

Complexity has turned out to be very difficult to define. The dozens of definitions that have been offered all fall short in one respect or another, classifying something as complex which we intuitively would see as simple, or denying an obviously complex phenomenon the label of complexity. Moreover, these definitions are either only applicable to a very restricted domain, such as computer algorithms or genomes, or so vague as to be almost meaningless. (From Principia Cybernetica) 

Some more background about complex systems and their complexity4

  • A System is a set of interacting components - whether human-made, naturally-occurring, or a combination of both.
  • By "interact" we mean the exchange of physical force, energy, mass flow, or information, such that one component can change the state of another component. For software systems, this is the exchange of information or state knowledge, or impacting the outcome of another component.
  • The technologies or natures of these systems may be mechanical, chemical, electronic, biological, informational (software), or combinations of these or others.
  • The behavior of a system can include "emergent" aspects that are not a characteristic of any individual component, but arise from their combination.
  • Emergent properties can be valuable (e.g., delivery of new services) or undesired (e.g., dangerous or unstable).
  • The behavior of a system is often not easily predicted from the behavior of its individual components, and may also be more complex.
  • The complexity of human-engineered systems is growing, in response to demands for increased sophistication, in response to market or government competition, and enabled by technologies.
  • It has become relatively easy to construct systems which cannot be so readily understood. 

What is complexity then?

The words chaos and complexity have been used to mean disorder and complications for centuries. Only in the last thirty years have they been used to refer to mathematical and scientific bodies of knowledge.6

Often something is called complex when we can't fully understand its structure or behavior. It is uncertain, unpredictable, complicated, or just difficult to understand. Complexity is often described as the inability of a human mind to grasp the whole of a complex problem and predict the outcome - what is called Subjective Complexity.5

The complex and complexity I'm speaking about are for Engineered Systems - products and services used by organizations, but engineered for their use. Their use can be considered complex, can even create complexity, and is many times emergent. But the Cynefin approach to complex is ethereal, without the principled basis found in engineering and, more importantly, Systems Engineering.3 This appears to be why agilists toss around the terms found in Cynefin, since engineering of the software is not a core principle of agile development; rather, emergent design and architecture is the basis of the Agile Manifesto.

Here's an example of the engineering side of complex systems, from Dr. Sheard's presentation "Twelve Roles and Three Types of Systems Engineering," NASA Goddard Space Flight Center, Systems Engineering Seminar, Complexity and Systems Engineering, April 5, 2011.


Before applying these definitions to problems for developing software, there is more work to do.

In complex systems there are entities that participate in the system2

  • The technical system being designed and built.
  • The socio-technical systems that are building the systems - the project team or production team.
  • The technological Environment into which the system will be inserted when the system is complete and deployed.
  • The socio-political system related to the technological environment. This is generally the interaction of the system stakeholders with the resulting system.
  • The subjective human experience when thinking about, designing, or using the system, called Cognition.

Cynefin does not make these distinctions, instead separating the system into Complex, Complicated, Chaotic, and Obvious, without distinguishing which engineered portion of the system each applies to.

So when we hear about Complex Adaptive Systems in the absence of a domain and the mathematics of such a system, care is needed. It is likely no actionable information will be available in units of measure meaningful to the decision makers to help them make decisions - just a set of words.

References

1 "Complexity Types: from Science to Systems Engineering," Sarah Sheard and Ali Mostashari, Proceedings of the 21st Annual International Symposium of the International Council on Systems Engineering

2 "Systems Engineering in Complexity Context," Sarah Sheard, Proceedings of the 23rd Annual International Symposium of the International Council on Systems Engineering

3 Systems Engineering Principles and Practices, 2nd Edition, Alexander Kossiakoff, William N. Sweet, Samuel J. Seymour, and Steven M. Biemer, John Wiley & Sons.

4 The Challenge of Complex Systems, INCOSE Crossroads of America Chapter.

5 "On Systems architects and systems architecting: some thoughts on explaining the art and science of system architecting," H. G. Sillitto, Proceedings of INCOSE IS 2009, Singapore, 20-23 July. 

6 Practical Applications of Complexity Theory for Systems Engineers, Sarah Sheard, Proceedings of the 15th Annual International Symposium of the International Council on Systems Engineering

Related articles Risk Management is How Adults Manage Projects Start with Problem, Than Suggest The Solution I Think You'll Find It's a Bit More Complicated Than That
Categories: Project Management

Failure is not an Option

Herding Cats - Glen Alleman - Wed, 03/04/2015 - 04:30

There is a popular notion in the agile world, and among some business gurus, that failure is encouraged as part of the learning process. What is not stated is when and where this failure can take place.

The picture to the left is the flight of the last Titan IV launch vehicle. I was outside the SCIF, but got to see everything up to the 2nd stage separation.

The Martin Company's launch vehicle built a five-decade legacy that goes back to the earliest rockets designed and built in the United States. The Intercontinental Ballistic Missile (ICBM) program; Project Gemini, NASA's 2nd human spaceflight program; the Mars Viking landers; the Voyager deep space probes; communications and reconnaissance satellites - all of these programs and more relied on the Titan for a safe and dependable launch. 

The final version flew when the program was retired after delivering a National Reconnaissance Office payload to orbit on October 19, 2005. A total of 368 Titans were flown, with capabilities ranging from Earth reconnaissance and military and civil communications to human and robotic exploration.

In this domain, failure is not an option. Many would correctly say failures were found before use. And that is correct: Design, Development, Test, and Evaluation (DDT&E) is the basis of assuring the system works when commanded to do so.

In domains without a needed capability that must perform on demand, fail fast and fail often may be applicable. 

Choose the domain before suggesting a process idea is applicable.

Related articles Software Engineering is a Verb Self-Organization Quote of the Day
Categories: Project Management