
Software Development Blogs: Programming, Software Testing, Agile Project Management


Project Management

What Happened to our Basic Math Skills?

Herding Cats - Glen Alleman - Wed, 07/01/2015 - 15:29

Making decisions in the presence of uncertainty about the future outcomes resulting from those decisions is an important topic in the project management, product development, and engineering domains. The first question in this domain is...

If the future is not identical to the past, how can we make a decision in the presence of this future uncertainty?

The answer is we need some means of taking what we know about the past and the present and turning it into information about the future. This information can come from measurements of actual activities - cost, duration of work, risks, dependencies, performance and effectiveness measures - and from models and simulations of past and future activities, reference classes, and parametric models.

If the future is identical to the past and the present, then all this data can show us a simple straight line projection from the past to the future.

But there are some questions:

  • Is the future like the past? Have we just assumed this? Or have we actually developed an understanding of the future by looking into what could possibly change from the past?
  • If there is no change, can that future be sustained long enough for our actions to have a beneficial impact?
  • If we discover the future may not be like the past, what is the statistical behavior of this future, how can we discover this behavior, and how will these changes impact our decision making processes?

The answers to these and many other questions can be found in the mathematics of probability and statistics. Here are some popular misconceptions of mathematical concepts.

Modeling is the Key to Decision Making

"All models are wrong, some are useful," George Box and Norman R. Draper (1987). Empirical Model-Building and Response Surfaces, p. 424, Wiley. ISBN 0471810339. 

  • This book is about process control systems and the statistical process models used to design and operate the control systems in chemical plants. (This is a domain I have worked in and developed software for).
  • This quote has been wildly misquoted - not only used out of context, but also applied completely outside the domain to which it belongs.
  • All models are wrong means every model is wrong because it is a simplification of reality. That is the definition of a model.
  • Some models, in the "hard" sciences, are only a little wrong. They ignore things like friction or the gravitational effect of tiny bodies. Other models are a lot wrong - they ignore bigger things. In the social sciences, big things are ignored.
  • Statistical models are descriptions of systems using mathematical language. In many cases we can add a certain layer of abstraction to enable an inferential procedure.
  • It is almost impossible for a single model to describe perfectly a real world phenomenon, given our own subjective view of the world, since our sensory system is not perfect.
  • But - and this is the critical misinterpretation of Box's quote - successful statistical inference does happen, because there is a certain degree of consistency in the world that we can exploit.
  • So our almost always wrong models do prove useful.

We can't possibly estimate activities in the future if we don't already know what they are

We actually do this all the time. More importantly, there are simple step-by-step methods for making credible estimates about unknown - BUT KNOWABLE - outcomes.
This notion of unknown but knowable is critical. If we really can't know - if it is unknowable - then the work is not a project. It is pure research. So move on, unless you're a PhD researcher.

Here's a little dialog showing how to estimate most anything in the software development world. (A small sketch of the bracketing idea behind this dialog follows it.)
With your knowledge and experience in the domain and a reasonable understanding of what the customer wants (no units of measure for reasonable, by the way - sorry), let's ask some questions.

I have no pre-defined expectation of the duration. That is, I have no anchor to start from. If I did, and didn't have a credible estimate, I'd be a Dilbert manager - and I'm not.

  • Me - now that you know a little bit about my needed feature, can you develop this in less than 6 months?
  • You - of course I can, I'm not a complete moron.
  • Me - good, I knew I was right to hire you. How about developing this feature in a week?
  • You - are you out of your mind? I'd have to be a complete moron to sign up for that.
  • Me - good, still confirms I hired the right person for the job. How about getting it done in 4 months?
  • You - well, that still seems like too long, but I guess it'll be more than enough time if we run into problems or it turns out you don't really know what you want and change your mind.
  • Me - thanks for the confidence in my ability. How about 6 weeks for this puppy?
  • You - aw, come on, now you're making me cranky. I don't know anyone - except someone who has done this already - who can do it in 6 weeks. That's a real stretch for me. A real risk of failure, and I don't want that. You hired me to be successful, and now you're setting me up for failure.
  • Me - good, just checking. How about 2½ months - about 10 weeks?
  • You - yeah, that still sounds pretty easy, with some margin. I'll go for that.
  • Me - nice, I like the way you think. How about 7 weeks?
  • You - boy, you're a pushy one, aren't you? That's a stretch, but I've got some sense of what you want. It's possible, but I can't really commit to being done in that time. It'll be risky, but I'll try.
  • Me - good, let's go with 8½ weeks for now, and we'll update the estimate after a few weeks of you actually producing output I can look at.
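One way to read this dialog is as a bracketing exercise: each answer is a rough, subjective confidence of finishing within the proposed duration, and the questions narrow the interval until a commitment point appears. Here is a minimal sketch in Python, with entirely hypothetical confidence values standing in for the answers above:

```python
# Hypothetical reading of the dialog: each proposed duration (in weeks) maps to a
# subjective "can I commit to that?" confidence taken from the answers above.
subjective_confidence = {
    1: 0.01,   # "I'd have to be a complete moron to sign up for that"
    6: 0.20,   # "a real stretch ... a real risk of failure"
    7: 0.45,   # "possible, but I can't really commit"
    10: 0.80,  # "still sounds pretty easy, with some margin"
    16: 0.90,  # 4 months: "more than enough time"
    26: 0.99,  # 6 months: "of course I can"
}

commit_threshold = 0.70  # assumed confidence level we want before committing

committable = sorted(weeks for weeks, p in subjective_confidence.items()
                     if p >= commit_threshold)
print(f"Shortest duration we can commit to: about {committable[0]} weeks")
# The commitment point sits between 7 and 10 weeks - hence the 8 1/2 week working estimate.
```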

Microeconomics of Decision Making

Making decisions about the future in the presence of uncertainty can be addressed by microeconomics principles. Microeconomics is a branch of economics that studies the behavior of individuals and organizations in making decisions about the allocation of limited resources. Projects have limited resources; businesses have limited resources. All human endeavors have limited resources - time, money, talent, capacity for work, skills - and other unknowns.

The microeconomics of decision making involves several variables

  • Opportunity cost - the value of what we give up by taking an action. If we decide between A and B and choose B, what is the value of A that we're giving up?
  • Marginal cost analysis - the impact of small changes in the "how much" decision.
  • Sunk cost - costs that have already been incurred and cannot be recovered.
  • Present Value - the value today of a future cost or benefit.

Formally, defining this choice problem is simple: there is a state space S, whose elements are called states of nature and represent all the possible realizations of uncertainty; there is an outcome space X, whose elements represent the possible results of any conceivable decision; and there is a preference relation ≽ over the mappings from S to X. †

This of course provides little in the way of guidance for making a decision on a project. But the point here is that making decisions in the presence of uncertainty is a well-developed discipline. Conjecturing it can't be done simply ignores this discipline.
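As a toy illustration of that discipline - a minimal sketch with made-up numbers, not an example from the post - here is the expected-value comparison of two options, including the opportunity cost of the option we give up:

```python
# Two hypothetical options, each with probabilistic net benefits.
options = {
    "A": [(0.6, 500_000), (0.4, -100_000)],   # (probability, net benefit)
    "B": [(0.9, 200_000), (0.1, -50_000)],
}

def expected_value(outcomes):
    return sum(p * v for p, v in outcomes)

evs = {name: expected_value(outcomes) for name, outcomes in options.items()}
best = max(evs, key=evs.get)
opportunity_cost = max(v for name, v in evs.items() if name != best)

print(f"Expected values: {evs}")
print(f"Choose {best}; opportunity cost (value of the forgone option): {opportunity_cost:,.0f}")
```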

The Valuation of Project Deliverables

It's been conjectured that focusing on value is the basis of good software development efforts. The suggestion that this value is independent of cost is misinformed. Valuation - producing the Value used to compare choices - is the process of determining the economic value of an asset, be it a created product, a service, or a process. Value is defined as the net worth, or the difference between the benefits produced by the asset and the costs to develop or acquire the asset, all adjusted appropriately for probabilistic risk, at some point in time.

This valuation has several difficulties:

  • Costs and benefits might occur at different points in time and need to be adjusted, or discounted, to account for the time value of money - the fundamental principle that money is worth more today than in the future under ordinary economic conditions.
  • Not all determinants of value are known at the time of the valuation, since there is uncertainty inherent in all project and business environments.
  • Intangible benefits like learning, growth or emergent opportunities, and embedded flexibility are the primary sources of value in the presence of uncertainty.

The valuation of the outcomes of software projects depends on the analysis of these underlying costs and benefits. A prerequisite for cost-benefit analysis is the identification of the relevant value and the cost drivers needed to produce that value. Both cost and value are probabilistic, driven by uncertainty - both reducible and irreducible.
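A minimal sketch of that cost-benefit arithmetic, with assumed cash flows, discount rate, and probability of realizing the benefit (none of these numbers come from the post):

```python
discount_rate = 0.10            # assumed annual time value of money
p_benefit_realized = 0.80       # assumed probability the benefit actually materializes

costs = [200_000, 50_000, 50_000]      # year 0, 1, 2: costs to develop and operate
benefits = [0, 250_000, 300_000]       # year 0, 1, 2: expected benefits

def present_value(cash_flows, rate):
    # Discount each year's cash flow back to today.
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

value = (p_benefit_realized * present_value(benefits, discount_rate)
         - present_value(costs, discount_rate))
print(f"Risk-adjusted net present value: {value:,.0f}")
```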

Modeling Uncertainty

In addition to the measurable benefits and costs of the software project, the valuation process must consider uncertainty. Uncertainty arises from different sources. Natural (aleatory) uncertainty is irreducible; it relates to variations in the environment and process variables. Dealing with irreducible uncertainty requires margin for cost, schedule, and the performance of the outcomes - for both value and cost.

Event-based (epistemic) uncertainty is reducible. That is, we can buy down this uncertainty with our actions: we can pay money to find things out, and we can pay money to improve the value delivered from the cost we invest to produce that value.

Parameter uncertainty relates to the estimation of parameters (e.g., the reliability of the average number of defects). Model uncertainty relates to the validity of specific models used (e.g., the suitability of a certain distribution to model the defects). There is a straightforward taxonomy of uncertainty for software engineering that includes additional sources such as scope error and assumption error. The standard approach of handling uncertainty is by defining probability distributions for the underlying quantities, allowing the application of a standard calculus. Other approaches based on fuzzy measures or Bayesian networks consider different types of prior knowledge. ‡
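For the aleatory side, one common approach (a sketch, not the only method) is to assign a probability distribution to the quantity of interest and set margin from its percentiles. The three-point numbers below are assumed for illustration:

```python
import random

# Assumed three-point estimate for a work item's duration, in weeks.
optimistic, most_likely, pessimistic = 6.0, 9.0, 16.0

samples = sorted(random.triangular(optimistic, pessimistic, most_likely)
                 for _ in range(10_000))
p50 = samples[len(samples) // 2]
p80 = samples[int(len(samples) * 0.80)]

print(f"Median duration ~{p50:.1f} weeks, 80th percentile ~{p80:.1f} weeks")
print(f"Schedule margin for an 80% confidence commitment: ~{p80 - p50:.1f} weeks")
```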

The Final Point Once Again

The conjecture that we can make informed decisions about choices in an uncertain future in the absence of estimates of the impacts of those choices has no basis in the mathematics of decision making.

This conjecture is simply not true. Any attempt to show this can be done has yet to materialize in any testable manner. This is where the basic math skills come into play. There is no math that supports this conjecture. Therefore there is no way to test this conjecture. It's personal opinion uninformed by any mathematics.

Proceed with caution when you hear this.

† Decision Theory Under Uncertainty, Johanna Etner, Meglena Jeleva, Jean-Marc Tallon, Centre d'Economie de la Sorbonne, 2009.64

‡ Kitchenham and Linkman, "Estimates, Uncertainty and Risk," IEEE Software, 69-74 (May 1997); and Srivastava and Mock, "Belief Functions in Business Decisions," Studies in Fuzziness and Soft Computing, Vol. 88.

Related articles: Information Technology Estimating Quality; Everything I Learned About PM Came From an Elementary School Teacher; Carl Sagan's BS Detector; Eyes Wide Shut - A View of No Estimates
Categories: Project Management

Debian Size Claims - New Lecture Posted

10x Software Development - Steve McConnell - Tue, 06/30/2015 - 19:17

In this week's lecture (https://cxlearn.com) I demonstrate how to use some of the size information we've discussed in other lectures by diving into the Wikipedia claims about the sizes of various versions of Debian.  The point of this week's lecture is to show how to apply critical thinking to size information presented by an authoritative source (Wikipedia), and how to arrive at a confident conclusion that that information is not credible. Practicing software professionals should be able to look at size claims like the Debian size claims and, based on general knowledge, immediately think, "That seems far from credible." Yet, few professionals actually do that. My hope is that working through public examples like this in the lecture series will help software professionals improve their instincts and judgment, which can then be applied to projects in their own organizations. 

Lectures posted so far include:  

0.0 Understanding Software Projects - Intro
     0.1 Introduction - My Background
     0.2 Reading the News
     0.3 Definitions and Notations 

1.0 The Software Lifecycle Model - Intro
     1.1 Variations in Iteration 
     1.2 Lifecycle Model - Defect Removal
     1.3 Lifecycle Model Applied to Common Methodologies
     1.4 Lifecycle Model - Selecting an Iteration Approach  

2.0 Software Size
     2.05 Size - Comments on Lines of Code
     2.1 Size - Staff Sizes 
     2.2 Size - Schedule Basics 
     2.3 Size - Debian Size Claims (New)

Check out the lectures at http://cxlearn.com!

Understanding Software Projects - Steve McConnell

 

Succeeding with Geographically Distributed Scrum Teams - New White Paper

10x Software Development - Steve McConnell - Tue, 06/30/2015 - 19:02

We have a new white paper, "Succeeding with Geographically Distributed Scrum Teams." To quote the white paper itself: 

When organizations adopt Agile throughout the enterprise, they typically apply it to both large and small projects. The gap is that most Agile methodologies, such as Scrum and XP, are team-level workflow approaches. These approaches can be highly effective at the team level, but they do not address large project architecture, project management, requirements, and project planning needs. Our clients find that succeeding with Scrum on a large, geographically distributed team requires adopting additional practices to ensure the necessary coordination, communication, integration, and architectural work. This white paper discusses common considerations for success with geographically distributed Scrum.

Check it out!

Prioritize and Optimize Over a Slightly Longer Horizon

Mike Cohn's Blog - Tue, 06/30/2015 - 15:00

A lot of agile literature stresses that product owners must prioritize the delivery of value. I'm not going to argue with that. But I am going to argue that product owners need to optimize over a slightly longer horizon than a single sprint.

A product owner's goal is to maximize the amount of value delivered over the life of a product. If the product owner shows up at each sprint planning meeting focused on maximizing the value delivered in only that one sprint, the product owner will never choose to invest in the system's future.

The product owner will instead always choose to deliver the feature that delivers the most value in the immediate future regardless of the long-term benefit. Such a product owner is like the pleasure-seeking student who parties every night during the semester but fails, and has to repeat the course during the summer.

A product owner with a short time horizon may have the team work on features A1, B1, C1 and D1 in four different parts of the application this sprint because those are the highest valued features.

A product owner with a more appropriate, longer view of the product will instead have the team work on A1, A2, A3 and A4 in the first sprint because there are synergies to working on all the features in area A at the same time.

Agile is still about focusing on value. And it's still about making sure we deliver value in the short term. We just don't want to become shortsighted about it.

Making Decisions In The Presence of Uncertainty

Herding Cats - Glen Alleman - Tue, 06/30/2015 - 03:57

Decision making is hard. Decision making is easy when we know what to do. When we don't know what to do, there are conflicting choices that must be balanced in the presence of uncertainty for each of those choices. The bigger issue is that the important choices are usually the ones where we know the least about the outcomes and about the cost and schedule to achieve those outcomes.

Decision science evolved to cope with decision making in the presence of uncertainty. This approach goes back to Bernoulli in the early 1700s, but remained an academic subject into the 20th century, because there was no satisfactory way to deal with the complexity of real life. Just after World War II, the fields of systems analysis and operations research began to develop. With the help of computers, it became possible to analyze problems of great complexity in the presence of uncertainty.

In 1938, Chester Barnard, author of The Functions of the Executive, imported the term "decision making" from the lexicon of public administration into the business world. This term replaced narrower descriptions such as "resource allocation" and "policy making."

Decision analysis functions at four different levels

  • Philosophy - uncertainty is a consequence of our incomplete knowledge of the world. In some cases, uncertainty can be partially or completely resolved before decisions are made and resources committed. In many important cases, complete information is not available or is too expensive (in time, money, or other resources) to obtain.
  • Decision framework - decision analysis provides concepts and language to help the decision-maker become aware of the adequacy or inadequacy of the decision basis.
  • Decision-making process - provides a step-by-step procedure that has proved practical in tackling even the most complex problems in an efficient and orderly way.
  • Decision-making methodology - provides a number of specific tools that are sometimes indispensable in analyzing a decision problem. These tools include procedures for eliciting and constructing influence diagrams, probability trees, and decision trees; procedures for encoding probability functions and utility curves; and a methodology for evaluating these trees and obtaining information useful to further refine the analysis. (A small decision-tree sketch follows the insight questions below.)

Each level focuses on different aspects of the problem of making decisions. And it is decision making that we're after.  The purpose of the analysis is not to obtain a set of numbers describing decision alternatives. It is to provide the decision-maker the insight needed to choose between alternatives. These insights typically have three elements:

  • What is important to making the decision?
  • Why is it important?
  • How important is it?
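As an illustration of the decision-tree tool named in the methodology level above - a minimal sketch with hypothetical payoffs, not an example from the post - rolling back a two-branch tree shows whether paying to reduce uncertainty (a prototype) beats committing now:

```python
p_success = 0.6                 # assumed probability the risky feature works
win, lose = 400_000, -200_000   # assumed payoffs if it works / if it fails
prototype_cost = 30_000         # assumed cost of buying information first

# Branch 1: commit now and accept the uncertainty.
build_directly = p_success * win + (1 - p_success) * lose

# Branch 2: prototype first; assume the prototype reliably tells us whether to build.
prototype_first = p_success * win - prototype_cost

print(f"Expected value, build directly:  {build_directly:,.0f}")
print(f"Expected value, prototype first: {prototype_first:,.0f}")
print("Prototype first" if prototype_first > build_directly else "Build directly")
```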

Now To The Problem at Hand

It has been conjectured ...

No Estimates

The key here and the critical unanswered question is how can a decision about an outcome in the future, in the presence of that uncertain future, be made in the absence of estimating the attributes going into that decision?

That is, if we have less than acceptable knowledge about a future outcome, how can we make a decision about the choices involved in that outcome?

Dealing with Uncertainty

All project work operates in the presence of uncertainty. The underlying statistical processes create probabilistic outcomes for future activities. These activities may be probabilistic events, or the naturally occurring variances of the processes that make up the project. 

Clarity of discussion through the language of probability is one of the bases of decision analysis. The reality of uncertainty must be confronted and described, and the mathematics of probability is the natural language to describe uncertainty.

When we don't have that clarity of language - when redefined or misused mathematical terms enter the conversation - agreeing on the ways (and there are many ways) of making decisions in the presence of an uncertain future becomes bogged down in approaches that can't be tested in any credible manner. What remains is personal opinion, small-sample anecdotes, and attempts to solve complex problems with simplistic approaches.

For every complex problem there is an answer that is clear, simple, and wrong. H. L. Mencken

Related articles: Eyes Wide Shut - A View of No Estimates; Carl Sagan's BS Detector; Systems Thinking, System Engineering, and Systems Management
Categories: Project Management

Software Development Conferences Forecast June 2015

From the Editor of Methods & Tools - Mon, 06/29/2015 - 14:17
Here is a list of software development related conferences and events on Agile project management ( Scrum, Lean, Kanban), software testing and software quality, software architecture, programming (Java, .NET, JavaScript, Ruby, Python, PHP) and databases (NoSQL, MySQL, etc.) that will take place in the coming weeks and that have media partnerships with the Methods & […]

Information Technology Estimating Quality

Herding Cats - Glen Alleman - Mon, 06/29/2015 - 01:07

The estimator's charter is not to state what the developers should do, but rather to provide a reasonable projection of what they will do. - Tom DeMarco

Here are a few resource materials for estimating cost, schedule, and technical outcomes on software intensive systems. In the meeting about managing risk in the presence of uncertainty below, it became clear we need to integrate estimating with risk, technical performance measures, measures of effectiveness, measures of performance, cost, and schedule.

Some Recent Resources

There are 1,117 other papers and articles in the Software Cost Estimating folder on my server. These are just a very small sample of how to make estimates.

The notion that we can make decisions in the presence of uncertainty in the absence of estimating the outcomes of those decisions (#NoEstimates) can only work if there are no opportunity costs at risk in the future. That is, there is nothing at risk in making a choice between multiple outcomes.

Related articles: Humpty Dumpty and #NoEstimates; Climbing Mountains Requires Good Estimates; Carl Sagan's BS Detector
Categories: Project Management

What Creates Trust in Your Organization?

I published my most recent newsletter, Creating Trustworthy Estimates, this past week. I also noted on Twitter that one person said his estimates created trust in his organization. (He was responding to a #noestimate post that I had retweeted.)

Sometimes, estimates do create trust. They provide a comfortable feeling to many people that you have an idea of what size this beast is. That's why I offer solutions for a gross estimate in Predicting the Unpredictable. I have nothing against gross estimates.

I don't like gross estimates (or even detailed estimates) as a way to evaluate projects in the project portfolio because estimates are guesses. Estimates are not a great way to understand and discuss the value of a project. They might be one piece of the valuation discussion, but if you use them as the only way to value a project, you are missing the value discussion you need to have. See Why Cost is the Wrong Question for Evaluating Projects in Your Project Portfolio.

I have not found that only estimates create trust. I have found that delivering the product  (or interim product) creates more trust.

Way back, when I was a software developer, I had a difficult machine vision project. Back then, we invented as we went. We had some in-house libraries, but we had to develop new solutions for each customer.

I had an estimate of 8 weeks for that project. I prototyped and tried a gazillion things. Finally, at 6 weeks, I had a working prototype. I showed it to my managers and other interested people. I finished the project and we shipped it.

Many years later, when I was a consultant, I encountered one of those managers. He said to me, "We held our breath for 6 weeks until you showed us a prototype. You had gone dark and we were worried. We had no idea if you would finish."

By that time, I had managed people like me. I asked them for visual updates on their status each week or two. I had learned from my experiences.

I asked that manager why they held their breath. I always used an engineering notebook. I could have explained my status at any time to anyone who wanted it. He replied, "We so desperately wanted your estimate to be true. We were so afraid it wasn't. We had no idea what to do. When you showed us a working prototype, that's when we started to believe you could finish the project."

They trusted my initial estimate. It's a good thing they didn't ask for updated estimates each week. I remember that project as a series of highs and lows.

That's the problem with invention/innovation. You can keep track of your progress. You can determine ways to make progress. And, with the highs, you meet or beat your estimate. With the lows, you extend your estimate. I remember that at the beginning of week 5 I was sure I was not going to meet my date. Then, I discovered a way to make the project work. I remember my surprise that it was something "that easy." It wasn't easy. I had tracked my experiments in my notebook. There wasn't much more I could do.

Since then, I asked my managers, "When do you want to know my project is in trouble? As soon as I think I'm not going to meet my date; after I do some experiments; or the last possible moment?" I create trust when I ask that question because it shows I'm taking their concerns seriously.

After that project, here is what I did to create trust:

  1. Created a first draft estimate.
  2. Tracked my work so I could show visible progress and what didn't work.
  3. Delivered often. That is why I like inch-pebbles. Yes, after that project, I often had one- or two-day deliverables.
  4. If I thought I wasn't going to make it, used the questions above to decide when to say, "I'm in trouble."
  5. Delivered a working product.

Estimates can be useful. They can show you the risks. And, I'm sure that only having estimates is insufficient for building trust. If you want to learn more about estimation, see Predicting the Unpredictable: Pragmatic Approaches to Estimating Cost or Schedule.

Categories: Project Management

Everything I Learned About PM Came From an Elementary School Teacher

Herding Cats - Glen Alleman - Thu, 06/25/2015 - 18:08

Our daughter is an elementary school teacher in Austin, Texas - a nice school, the Number 2 school in Texas.

While visiting this week, we were talking about a new book a group of us are working on. While showing her the TOC, she said, "Dad, we do all that stuff (minus the finance side) every day, week, month, semester, and year. It's not that hard. That's what we've been trained to do." OK, but talent, dedication, skill, and a gift for teaching help.

Here's how an elementary school teacher sees her job as the Project Manager of 20 young clients.

  • Plan before starting anything. It's going to go wrong, so know that up front and be able to recognize the train wreck is coming and get out of the way.
    • The plan is a strategy for the successful completion of the project.
    • Without the plan, you don't know how to assess progress in terms meaningful to the decision makers. Measures of cost and schedule are measures of effectiveness. Measures of stories produced and features delivered aren't measures of capabilities produced.
    • A Capabilities Based Plan is that measure. What capabilities does the customer need to accomplish the business case or fulfill a mission?
    • In education, Bloom's Taxonomy with TLOs and ELOs defines the capabilities the student will possess at the end of the course.
  • Have a notion of what done looks like, so when you get there, you can stop and move on.
    • Done is defined as possessing a capability to accomplish something.
    • Write this down in units of Effectiveness and Performance.
  • Have your Plan B always ready to go and then start thinking of Plan C when Plan B is under way. No Plan A ever lasts too long in the presence of chaos.
    • Risk management is how adults manage projects - Tim Lister
    • Adult supervision is the role of the teacher. Many times adult supervision is also the role of the project manager.
  • Make sure you've got all the right resources lined up and ready to spring into action when things go wrong. Classroom aides, class leaders, parents, staff all ready to go when the plan goes in the ditch.
    • Resource planning is a critical success factor for all projects.
  • Know what can go wrong before you start, steer away from trouble and trouble will stay away.
    • Risk planning is planning. Planning is strategy.
    • Apply good risk management to all activities on the project. Perform some formal sequence of risk management. Pick one. My favorite is the SEI Continuous Risk Management process.
  • Separate the troublemakers from the mainstream. You know who they are on day one.
    • Any good project manager can see trouble coming.
    • Isolate the troubled parts. Assign them to separate teams. Have them fix the problem so the rest of the project isn't impacted by them.
  • Show up early, prepare for the work, clean up afterward, so you can start "clean" again the next day. No less than 100% complete at the end of each period of performance. If not, you'll pay dearly for it later.
    • Being prepared is the major attribute of project success.
    • This means planning.
    • Letting things emerge is nice on small, non-trivial projects with low value at risk.
  • Always ask "is this your best work?" and "did you put your name on it?" Otherwise you're creating re-work.
    • Set the highest quality standards possible.
  • No crying when it doesn't work. Redo it and get back on schedule; recess time is schedule margin - you get to stay in and finish your planned work.
    • No whining - everyone put your "big boy" pants on and do the work needed to get the job done.
  • Take a break, go outside and play, think about what you're going to do next hour. Come back and do it.
    • Have retrospectives.
    • Look back for opportunities for improvement
    • Do Root Cause Analysis to find out the "real" why things didn't work
    • Have fun while still working hard

Is This Your Best Work

Related articles: Who's Budget is it Anyway?; Systems Thinking, System Engineering, and Systems Management; Myth's Abound
Categories: Project Management

Predicting the Unpredictable is Available

I'm happy to announce that Predicting the Unpredictable: Pragmatic Approaches to Estimating Cost or Schedule is done and available. It's available in electronic and print formats. If you need a little help explaining your estimates or how to use estimation (even #noestimate), read this book.

 

Categories: Project Management

Success and Failure (Get Your Free Celebration Grid Poster!)

NOOP.NL - Jurgen Appelo - Tue, 06/23/2015 - 22:28
Success and Failure v1.00 - Poster

Success and Failure – The Celebration Grid


Categories: Project Management

You Don’t Need a Complicated Story Hierarchy

Mike Cohn's Blog - Tue, 06/23/2015 - 15:00

Consultants and tool vendors seem to have a penchant for making things complicated. It seems the more complicated we make things, the more our clients need us. And that sells tools and services, I suppose.

On the other hand, I find unnecessary complexity extremely frustrating. It's like the novel I read this week by a first-time author. It was good, but it had too many minor characters who complicated the plot and made the book hard to follow.

The same thing happens when people introduce complicated hierarchies or taxonomies for user stories like this:

You don't need this. When teams are forced to use complicated taxonomies for their stories, they spend time worrying about whether a particular story is an epic, a saga or merely a headline. That discussion is like the minor character who walks into the novel and needlessly complicates the plot.

But, Mike -- I can hear you asking -- you've written about epics and themes before.

Yes, but those are labels. A story is a story so my recommended story taxonomy is this:

A story is a story is a story.

Some stories are big and they can be labeled as epics. I've used the analogy of movies before. All movies are movies, but some movies are romantic comedies - that's a label, just like epic is.

Similarly, theme refers to a group of related stories, but it does not have to work within a hierarchy. Again using movies, I could have a group of spy movies that would include the James Bond movies and the Austin Powers movies. But a group of comedy movies would include Austin Powers but not James Bond.

So, again, theme and epic are labels, not an implied hierarchy. Don't make things more complicated than they need to be. I haven't come across any reasons to have fancy story hierarchies or taxonomies.

Selecting an Iteration Approach - New Lecture Posted

10x Software Development - Steve McConnell - Tue, 06/23/2015 - 11:17

In this week's lecture (https://cxlearn.com) I explain how the lifecycle model can be used to show the incredibly large number of variations in approaches to software projects, especially including numerous variations in kinds of iteration. I identify approaches that work if you need predictability, if you need flexibility, if you need to attack uncertainty in requirements (i.e., unknown requirements), and if you need to attack uncertainty in architecture (i.e., technical risk).  

The overarching message is that there are lots of different ways to organize the activities on a software project, and the way you organize the activities significantly affects what a project will accomplish. 

Lectures posted so far include:  

0.0 Understanding Software Projects - Intro
     0.1 Introduction - My Background
     0.2 Reading the News
     0.3 Definitions and Notations 

1.0 The Software Lifecycle Model - Intro
     1.1 Variations in Iteration 
     1.2 Lifecycle Model - Defect Removal
     1.3 Lifecycle Model Applied to Common Methodologies
     1.4 Lifecycle Model - Selecting an Iteration Approach 

2.0 Software Size
     2.05 Size - Comments on Lines of Code
     2.1 Size - Staff Sizes 
     2.2 Size - Schedule Basics 

Check out the lectures at http://cxlearn.com!

Understanding Software Projects - Steve McConnell

 


Software Gardening, Entropy, Hybrid Requirements, Lean UX in Methods & Tools Summer 2015 issue

From the Editor of Methods & Tools - Mon, 06/22/2015 - 14:29
Methods & Tools - the free e-magazine for software developers, testers and project managers - has just published its Summer 2015 issue that discusses Software Gardening, Software Entropy, Hybrid Requirements, Lean UX and agileMantis. * Software Gardening - Yet another crappy analogy or a reality? * Entropy for Measuring Software Maturity * The READ, RATT, […]

Management, Humanity and Expectations

There's a Twitter discussion of what people "should" do in certain situations. One of the participants believes that people "should" want to learn on their own time and work more than 40 hours per week. I believe in learning. I don't believe in expecting people to work more than 40 hours/week. My experience is that when you ask people to work more than 40 hours, they get stupid. See Management Myth 15: I Need People to Work Overtime. If you want people to learn, read Management Myth #9: We Have No Time for Training.

One participant also said that people should leave their emotional baggage (my word) at home. Work supposedly isn't for emotions. Well, I don't understand how we can have people who work without their emotions. Emotions are how we explain how we feel about things. I want people to advocate for what they feel is useful and good. I want to know when they feel something is bad and damaging. I want that, as a manager. See Management Myth #4: I Don't Need One-on-Ones.

People are emotional. Let's assume they are adults and can harness their emotions. If not, we can provide feedback about the situation. But, ignoring their emotions? That never works. It's incongruent and can make the situation worse.

I have a problem with "shoulds" for other people. I cannot know what is going on in other people's lives. Nor do I want to know all the details as a manager. I need to know enough to use my judgment as a manager to help the people and teams proceed.

When managers build trust with people, those people can share what is relevant about the way they can perform at work with their manager, and maybe with their team. If they have a personal situation that requires time off, depending on the team, the person might have to talk to the team before the manager. (I know some agile teams like this.) The team might manage the situation without management help or interference.

If you are in a leadership position, don't impose your "shoulds" on other people. You cannot know what is happening in other people's lives.

You can ask for the results you want. You want people to learn more? Provide time during the week for everyone to learn together. You want people to work through a personal crisis? Provide support.

Don't expect automatons at work. Expect humans and you'll get way more than you could imagine.

Categories: Project Management

Climbing Mountains Requires Good Estimates

Herding Cats - Glen Alleman - Mon, 06/22/2015 - 06:01

There was an interesting post on the #NoEstimates thread that triggered memories of our hiking and climbing days with our children (now grown and gone) and our neighbor who has summited many of the highest peaks around the world.

The quote was, "Getting better at estimates is like using time to plan the Everest climb instead of climbing smaller mountains for practice."

A couple background ideas:

  • The picture above is Longs Peak. We can see Longs Peak from our back deck in Niwot, Colorado. It's one of 53 14,000-foot mountains in Colorado - Fourteeners. Longs is one of 4 along the Front Range.

In our neighborhood are several semi-pro mountain climbers. People move to Colorado for the outdoor life, skiing, mountain and road biking, hiking, and climbing. 

Now to the tweet suggesting that getting better at estimating can be replaced by doing (climbing) smaller projects. It turns out estimates are needed for those smaller mountains; estimates are needed for all hiking and climbing. But first...

  • No one is going to climb Everest - and live to tell about it - without first having summited many other high peaks.
  • Anyone interested in the trials and tribulations of Everest should start with Jon Krakauer's Into Thin Air: A Personal Account of the Mt. Everest Disaster.
  • Before attempting - and attempting is the operative word here - any significant peak, several things have to be in place.

Let's start with those Things.

No matter how prepared you are, you need a plan. Practice on lower peaks is necessary but far from sufficient for success. Each summit requires planning in depth. For Longs Peak you need a Plan A, a Plan B, and possibly a Plan C. Most of all, you need strong estimating skills and the accompanying experience to determine when to invoke each plan. People die on Longs because they foolishly think they can beat the odds and press on rather than invoking Plan B.

So the suggestion that you can summit something big, like any of the Seven Summits, without both deep experience and deep planning likely means you will not be heard from again.

So the OP is likely speaking from not having summited much of anything - hard to tell, since no experience resume is attached.

The estimating part is basic. Can we make it to the Keyhole on Longs Peak before the afternoon storms come in? On Everest, can we make it to the Hillary Step before 1:00 PM? No? Turn back - you're going to die if you continue.

Can we make it to the delivery date at the pace we're on now, AND with the emerging situation for the remaining work, AND for the cost we're trying to keep, AND with the capabilities the customer needs? Remember, the use of past performance is fine If and Only If the future is something like the past, or we know something about how the future is different from the past.
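That question is, at its core, simple arithmetic. A minimal sketch with assumed numbers (remaining scope, observed pace, time left), standing in for whatever measures a real project would use:

```python
remaining_features = 40
average_weekly_throughput = 4.0    # observed pace so far (assumed)
weeks_remaining = 9

forecast_weeks = remaining_features / average_weekly_throughput
print(f"At the current pace, finishing takes ~{forecast_weeks:.0f} weeks "
      f"against {weeks_remaining} weeks remaining")
if forecast_weeks > weeks_remaining:
    print("Turn back: invoke Plan B - re-plan scope, resources, or the date.")
```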

When the future is not like the past, we need a Plan B. And that plan has to have estimates of our future capabilities, our cost expenditure rate, and our ability to produce the needed capabilities.

ALL PLANNING IN THE PRESENCE OF UNCERTAINTY REQUIRES - MANDATES ACTUALLY - ESTIMATING. 

Ask any hiker, climber, development manager, business person. Time to stop managing by platitudes and start managing by the principles of good management.

Related articles: There is No Such Thing as Free; The Fallacy of the Planning Fallacy; Systems Thinking, System Engineering, and Systems Management; Myth's Abound; Eyes Wide Shut - A View of No Estimates
Categories: Project Management

The Art of Systems Architecting

Herding Cats - Glen Alleman - Sun, 06/21/2015 - 23:36

The Art of Systems Architecting† is a book that changed the way I look at the development of software intensive systems. As a manager of software in the system of systems domain, this book created a clear and concise vision of how to assemble all the pieces of the system into a single cohesive framework.

One of the 12 principles of the Agile Manifesto is The best architectures, requirements, and designs emerge from self-organizing teams. The self-organizing team part is certainly good. But good architectures don't emerge, unless it's the Ball of Mud architecture. Good architecture is a combination of science, engineering, and art. Hence the title of the book.

Systems architecting borrows from other architectures, but the basic attributes are the same: † 

  • The architect is principally the agent of the client, not the builder. The architect must act in the best interests of the client.
  • The architect works jointly with the client and the builder on the problem and the definition of the solution. Systems requirements - in the form of needed capabilities, their Measures of Effectiveness, Measures of Performance (MOP), Key Performance Parameters (KPP), and Technical Performance Measures (TPM) - are an input. The client will provide the requirements, but the architect is expected to jointly help the client determine the requirements.
  • The architect's product is an architectural representation. A set of abstracted designs of the system.
  • The product of the architect is not just a physical representation of the system. Cost estimates are part of any feasible deliverables as well. Knowing the value of some built item requires we also know the cost. The system architecture must cover physical structure, system behavior, cost, performance, delivery schedule, and other elements needed to clarify the client's priorities.
  • The initial architecture is a Vision of the future outcome of the work effort. This description is a set of specific models. These include the needed capabilities, the motives of the outcome, beliefs, and unstated assumptions. These distinctions are critical when creating standards for the architecture. TOGAF and DoDAF are examples of architecture standards.

Why Do We Care About This?

When we hear of some new and possibly different approach to anything, we need to ask: what is the paradigm this idea fits into? If it is truly new, what paradigm does it replace? How does that replacement maintain the information from the old paradigm that was needed for success, what parts of the old paradigm are replaced for the better, and how can we be assured that it is actually better?

One answer starts with the architecture of the paradigm. In the case of managing projects this is the programmatic architecture - the Principles, Practices, and Processes of the Programmatic Architecture.

Five Immutable Principles of project success can be found in...

5 Immutable Principles

With these principles we can apply Five Practices guided by these Principles


With the Principles and Practices in place, Processes can be defined for the specific needs of the domain.


So with the Principles, Practices, and Processes in place, we can now ask

When it is suggested that a new approach be taken, where does that approach fit in the Principles, Practices, and Processes that are in place now? If there is no place, how does this new suggestion fulfill the needs of the business that are in place? If those needs aren't fulfilled, does the business acknowledge that those needs are no longer needed?

If not, the chances of this new idea actually being accepted by the business are slim to none.

Related articles: Systems Thinking, System Engineering, and Systems Management; Who's Budget is it Anyway?; Eyes Wide Shut - A View of No Estimates
Categories: Project Management

Systems Thinking, System Engineering, and Systems Management

Herding Cats - Glen Alleman - Sun, 06/21/2015 - 16:46

There are several paradigms for Systems Thinking, ranging from psychobabble to hard-core Systems Engineering. A group of colleagues is starting a book with the working title Increasing The Probability of Project Success, and several of the chapters are based on Systems Thinking.

But first, some background on Systems Theory, Systems Thinking, and Systems Engineering.

Systems Theory is the interdisciplinary study of systems in general, with the goal of elucidating principles that can be applied to all types of systems at all nesting levels in all fields of research.

Systems Engineering is an interdisciplinary field of engineering that focuses on how to design and manage complex engineering systems over their life cycles. Systems Management (MSSM, USC, 1980) is an umbrella discipline encompassing systems engineering, managerial finance, contract management, program management, human factors, and operations research, in military, defense, space, and other complex systems disciplines.

Here are two book references that inform our thought processes.

This book is the basis of thinking about systems. It's a manufacturing and Industrial Engineering paradigm. Software Intensive Systems fit in here as well, since interfaces between system components define the complexity aspects of all systems of systems.

This book opens with an Einstein quote: In the brain, thinking is doing. As engineers - yes, software engineering is alive and well in many domains - no matter how much thinking we do, we can plan, prepare, and predict, but action occurs through doing.

So when we hear any suggestion, ask: how can this be put to work in some measurable way to assess the effectiveness and performance of the outcomes?

This is the companion mapping-processes book. Systems Thinking is the process of understanding how systems influence one another within a world of systems, and has been defined as an approach to problem solving that views our "problems" as parts of an overall system, rather than reacting to a specific part or outcome.

There are many kinds of systems. Hard systems, software systems, evolutionary systems. It is popular to mix these, but that creates confusion and removes the ability to connect concepts with actionable outcomes. 

Cynefin is one of those popular approaches that has no units of measure for complex, complicated, chaotic, and obvious - just soft, self-referencing words.

So in our engineering paradigm this approach is not very useful.

Along with these approaches are some other seminal works.

In The End

Everything is a system. Interactions between components are where the action is and where the problems come from. Any non-trivial system has interactions that must be managed as system interactions. This means modeling these interactions, estimating the impacts of these interactions, and defining the behaviors of these interactions before, during, and after their development.

This means recognizing the criteria for a mature and effective method of managing in the presence of uncertainty.

  • Recognition by clients and providers of the need to architect the system.
  • Acceptance of a discipline to perform those functions using known methods.
  • Recognition of the separation of value judgements and technical decisions between client, architect, and builder.
  • Recognition that the architecture is an art as well as a science, in particular the development and use of nonanalytical techniques.
  • Effective utilization of an educated professional staff engaged in the process of systems level architecting.
Related articles: Eyes Wide Shut - A View of No Estimates; Who's Budget is it Anyway?; The Dysfunctional Approach to Using "5 Whys"
Categories: Project Management

Monte Carlo Simulation of Project Performance

Herding Cats - Glen Alleman - Thu, 06/18/2015 - 15:32

Project work is random. Most everything in the world is random - the weather, commuter traffic, the productivity of writing and testing code. Few things actually take as long as they are planned. Cost is less random, but there are variances in the cost of labor and the availability of labor. Mechanical devices have variances as well.

The exact fit of a water pump on a Toyota Camry is not the same for each pump. There is a tolerance in the mounting holes and in the volume of water pumped. This is a variance in the technical performance.

Managing in the presence of these uncertainties is part of good project management. But there are two distinct paradigms of managing in the presence of these uncertainties.

  1. We have empirical data of the variances. We have samples of the hole positions and sizes of the water pump mounting plate for the last 10,000 pumps that were installed. We have samples of how long it took to write a piece of code and the attributes of the code that are correlated to that duration. We have empirical measures.
  2. We have a theoretical model of the water pump in the form of a 3D CAD model, with the materials modeled for expansion, drilling errors of the holes, and other static and dynamic variances. We model the duration of work using a Probability Distribution Function and a Three Point estimate of the Most Likely, Pessimistic, and Optimistic durations. These can be derived from past performance, but we don't have enough actual data to produce the PDF with a low enough Sample Error for our needs.

In the first case we have empirical data. In the second case we don't. There are two approaches to modeling what the system will do in terms of cost and schedule outcomes.

Bootstrapping the Empirical Data

With samples of past performance and the proper statistical assessment of those samples, we can re-sample them to produce a model of future performance. This bootstrap resampling shares the principle of the second method - Monte Carlo Simulation - but with several important differences.

  • The researcher - and we are researching what the possible outcomes might be from our model - does not know, nor have any control over, the Probability Distribution Function that generated the past samples. You take what you get.
  • As well, we don't have any understanding of why those samples appear as they do. They're just there. We get what we get.
  • This last piece is critical because it prevents us from defining what performance must be in place to meet some future goal. We can't tell what performance we need because we have no model of the needed performance, just samples from the past.
  • This results from the statistical condition that there is a PDF for the process that is unobserved. All we have are a few samples of this process.
  • With these few samples, we're going to resample them to produce a modeled outcome. This resampling locks in the behavior of the future using the samples from the past, which may or may not actually represent the true underlying behavior. This may be all we can do because we don't have any theoretical model of the process.

This bootstrapping method is quick, easy, and produces a quick and easy result. But it has issues that must be acknowledged.

  • There is a fundamental assumption that the past empirical samples represent the future. That is, the samples contained in the bootstrapped list and their resampling are also contained in all the future samples.
  • Said in a more formal way
    • If the sample of data we have from the past is a reasonable representation of the underlying population of all samples from the work process, then the distribution of parameter estimates produced from the bootstrap model on a series of resampled data sets will provide a good approximation of the distribution of those statistics in the population.
    • With this sample data and its parameters (statistical moments) we can make a good approximation of the future.
  • There are some important statistical assumptions, though, that must be considered, starting with the assumption that future samples are statistically identical to the past samples:
    • Nothing is going to change in the future
    • The past and the future are identical statistically
    • In the project domain that is very unlikely
  • With all these conditions - a small project, with few if any interdependencies, and a static work process with little variance - bootstrapping is a nice, quick-and-dirty approach to forecasting (estimating the future) based on the past. (A small sketch follows this list.)
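A minimal sketch of that quick-and-dirty bootstrap, with assumed past durations; note how it bakes in the assumption that the remaining tasks behave like the sampled past ones:

```python
import random

past_task_durations = [3, 5, 4, 7, 6, 4, 5, 8, 3, 6]   # days, observed (assumed)
remaining_tasks = 12

# Resample the past to generate many possible futures for the remaining work.
totals = sorted(sum(random.choice(past_task_durations) for _ in range(remaining_tasks))
                for _ in range(10_000))
p80 = totals[int(len(totals) * 0.80)]
print(f"80% of the resampled futures finish the remaining work within {p80} days")
```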

Monte Carlo Simulation

This approach is more general and removes many of the restrictions to the statistical confidence of bootstrapping.

Just as a reminder, in principle both the parametric and the non-parametric bootstrap are special cases of Monte Carlo simulations used for a very specific purpose: estimate some characteristics of the sampling distribution. But like all principles, in practice there are larger differences when modeling project behaviors.

In the more general approach of Monte Carlo Simulation, the algorithm repeatedly creates random data in some way, performs some modeling with that random data, and collects some result. Examples include:

  • The duration of a set of independent tasks.
  • The probabilistic completion date of a series of tasks connected in a network (schedule), each with a different Probability Distribution Function evolving as the project moves into the future.
  • A probabilistic cost correlated with the probabilistic schedule model. This is called the Joint Confidence Level. Both cost and schedule are random variables with time-evolving changes in their respective PDFs.

In practice, when we hear Monte Carlo simulation we are talking about a theoretical investigation - e.g., creating random data with no empirical content, or from reference classes, used to investigate whether an estimator can represent known characteristics of this random data - while the (parametric) bootstrap refers to an empirical estimation and is not necessarily a model of the underlying processes, just a small sample of observations independent from the actual processes that generated that data.

The key advantage of MCS is that we don't necessarily need past empirical data. MCS can use such data to advantage if we have it, but we don't need it for the Monte Carlo Simulation algorithm to work.

This approach can be used to estimate some outcome, as in the bootstrap, but also to theoretically investigate some general characteristic of a statistical estimator (cost, schedule, technical performance) that is difficult to derive from empirical data.

MCS removes the roadblock heard in many critiques of estimating - we don't have any past data on which to estimate. No problem: build a model of the work and the dependencies between that work, assign statistical parameters to the individual or collected PDFs, and run the MCS to see what comes out.
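A minimal sketch of that idea - a hypothetical four-activity network with assumed three-point estimates and triangular PDFs, not the Risky Project model shown later in this post:

```python
import random

# activity: ((optimistic, most_likely, pessimistic) in days, [predecessors])
# Listed in dependency order, so each predecessor's finish is computed before it is needed.
activities = {
    "design":    ((5, 8, 15),   []),
    "build_a":   ((10, 14, 25), ["design"]),
    "build_b":   ((8, 12, 20),  ["design"]),
    "integrate": ((4, 6, 12),   ["build_a", "build_b"]),
}

def simulate_finish():
    finish = {}
    for name, ((lo, ml, hi), preds) in activities.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, ml)
    return finish["integrate"]

runs = sorted(simulate_finish() for _ in range(10_000))
for pct in (0.50, 0.80, 0.95):
    print(f"P{int(pct * 100)} completion: {runs[int(len(runs) * pct)]:.0f} days")
```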

This approach has several critical advantages:

  • The first is a restatement - we don't need empirical data, although it will add value to the modeling process.
    • This is the primary purpose of Reference Classes
    • They are the raw material for defining possible future behaviors from the past.
  • We can make judgments about what the future will be like - or, most importantly, what the future MUST be like to meet our goals - run the simulation, and determine if our planned work will produce the desired result.

So Here's the Killer Difference

Bootstrapping models make several key assumptions, which may not be true in general. So they must be tested before accepting any of the outcomes.

  • The future is like the past.
  • The statistical parameters are static - they don't evolve with time. That is the future is like the past, an unlikely prospect on any non-trivial project.
  • The sampled data is identical to the population data both in the past and in the future.

Monte Carlo Simulation models provide key value that bootstrapping can't.

  • Different Probability Distribution Functions can be assigned to work as it progresses through time.
  • The shape of that PDF can be defined from past performance, or defined from the needed performance. This is a CRITICAL capability of MCS.

The critical difference between Bootstrapping and Monte Carlo Simulation is that MCS can show what the future performance has to be to stay on schedule (within variance), on cost, and have the technical performance meet the needs of the stakeholder.

When the process of defining the needed behavior of the work is done, a closed loop control system is put in place. This needed performance is the steering target. Measures of actual performance compared to needed performance generate the error signals for taking corrective actions. Just measuring past performance and assuming the future will be the same is open loop control. Any non-trivial project management method needs a closed loop control system.
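A minimal sketch of that error signal, with assumed numbers - the needed performance would come from the model of the plan, the actuals from measurement:

```python
needed_weekly_progress = 5.0            # units/week required to stay on plan (assumed)
actual_weekly_progress = [4, 5, 3, 4]   # measured performance so far (assumed)

average_actual = sum(actual_weekly_progress) / len(actual_weekly_progress)
error = needed_weekly_progress - average_actual

print(f"Error signal: {error:+.1f} units/week against the needed performance")
if error > 0:
    print("Corrective action needed: change dependencies, resources, scope, or the plan.")
```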

Bootstrapping can only show what the future will be like if it is like the past, not what it must be like. In Bootstrapping this future MUST be like the past. In MCS we can tune the PDFs to show what performance has to be in order to manage to that plan. Bootstrapping is reporting yesterday's weather as tomorrow's weather - just like Steve Martin in LA Story. If tomorrow's weather turns out not to be like yesterday's weather, you're going to get wet.

MCS can forecast tomorrow's weather by assigning PDFs to future activities that are different from past activities; then we can make any needed changes in that future model to alter the forecast to meet our needs. This is in fact how weather forecasts are made - with much more sophisticated models, of course - here at the National Center for Atmospheric Research in Boulder, CO.

This forecasting (estimating the future state) of possible outcomes, and the alteration of those outcomes through management actions - changing dependencies, adding or removing resources, providing alternatives to the plan (on-ramps and off-ramps of technology, for example), buying down risk, applying management reserve, assessing the impacts of rescoping the project, and so on - is what project management is all about.

Bootstrapping is necessary but far from sufficient for any non-trivial project to show up on or before the need date (with schedule reserve), at or below the budgeted cost (with cost reserve), and have the product or service provide the needed capabilities (technical performance reserve).

Here's an example of a probabilistic forecast of project performance from an MCS tool (Risky Project). This picture shows the probabilities for cost, finish date, and duration. It is built on time-evolving PDFs assigned to each activity in a network of dependent tasks, which models the work stream needed to complete as planned.

When that future work stream is changed - to meet new requirements, to respond to unfavorable past performance and the needed corrective actions, or to reflect changes in any or all of the underlying random variables - the MCS can show us the expected impact on key parameters of the project so management intervention can take place, since Project Management is a verb.


The connection between the Bootstrap and Monte Carlo simulation of a statistic is simple.

Both are based on repetitive sampling and then direct examination of the results.

But there are significant differences between the methods (hence the difference in names and algorithms). Bootstrapping uses the original, initial sample as the population from which to resample. Monte Carlo Simulation uses a data generation process, with known values of the parameters of the Probability Distribution Function. The common algorithm for MCS is Lurie-Goldberg. Monte Carlo is used to test that the results of the estimators produce desired outcomes on the project - and if not, to allow the modeler and her management to change those estimators and then manage to the changed plan.

Bootstrap can be used to estimate the variability of a statistic and the shape of its sampling distribution from past data and then, assuming the future is like the past, to make forecasts of throughput, completion, and other project variables.

In the end, the primary difference (and again the reason for the different names) is that Bootstrapping is based on unknown distributions - sampling and assessing the shape of the distribution in Bootstrapping adds no value to the outcomes - while Monte Carlo is based on known or defined distributions, usually from Reference Classes.

Related articles: Do The Math; Complex, Complexity, Complicated; The Fallacy of the Planning Fallacy
Categories: Project Management