Software Development Blogs: Programming, Software Testing, Agile, Project Management


Herding Cats - Glen Alleman
Resources about Performance-Based Project Management® using Principles, Practices, and Processes to Increase the Probability of Success

A Reminder of the Pseudo-Science of #NoEstimates

Fri, 12/02/2016 - 02:09

When you hear...

#NoEstimates is a hashtag for the topic of exploring alternatives for making decisions in software development. That is, ways to make decisions with "No Estimates"

Think about the conjecture. How would you assess a decision in the presence of uncertainty without making an estimate of the outcome of that decision? There is no deterministic data available about the future, even though you may have deterministic data from the past. Making such decisions without estimates would mean the future is just like the past: there is no uncertainty about the future, either reducible (epistemic) or irreducible (aleatory). No conditions that were in place in the past will change in the future. Nothing is going to emerge that you have not accounted for. Nothing is going to change in any attribute, process, or people doing the work.

Such a process defies the principles of the microeconomics of decision making. It defies the principles of managerial finance in the presence of uncertainty. It defies the principles of closed-loop control of stochastic, non-stationary systems.

It simply defies the principles of logic.
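To make the point concrete, choosing between two options under uncertainty requires an estimate (even an implicit one) of each option's outcome distribution. A minimal Monte Carlo sketch in Python - all the costs and spreads here are invented for illustration - shows that the option with the lower most-likely cost can still have the higher expected cost once its uncertainty is accounted for:

```python
import random

def simulate_cost(most_likely, spread, trials=100_000):
    """Estimate expected cost via Monte Carlo, drawing from a
    triangular distribution (a common choice for cost uncertainty):
    low = most_likely - spread, high = most_likely + 2 * spread."""
    draws = [random.triangular(most_likely - spread,
                               most_likely + 2 * spread,
                               most_likely)
             for _ in range(trials)]
    return sum(draws) / trials

random.seed(42)
# Hypothetical options: B has a lower most-likely cost but far more uncertainty.
option_a = simulate_cost(most_likely=100, spread=10)
option_b = simulate_cost(most_likely=95, spread=40)
print(f"Expected cost A: {option_a:.1f}, B: {option_b:.1f}")
```

Despite B's lower most-likely cost (95 vs. 100), its long tail gives it the higher expected cost - a decision you cannot see without estimating the distributions.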

Pseudo–science and the art of software methods from Glen Alleman
Categories: Project Management

Estimates, Forecasts, Projections

Sun, 11/27/2016 - 15:33

The terms of statistics and probability are misused in many domains - politics being one. The #NoEstimates advocates are among the most prolific abusers of these terms. Here are the mathematical definitions: not the Wikipedia definitions, not the self-made definitions used to support their conjectures.

Estimates

  • An Estimate is a value inferred for a population of values based on data collected from a sample of data from that population. The estimate can also be produced parametrically or through a simulation (Monte Carlo is common, but Method of Moments is another we use). 
    • Estimates can be about the past, present, or future.
    • We can estimate the number of clams in the Pleistocene era that are in the shale formations near our house.
    • We can estimate the number of people sitting in Folsom Field for last night's game against Utah. The Buffs won and are now the PAC-12 South Champs.
    • We can estimate the total cost, total duration, and the probability that all the Features will be delivered on the program we are working for the US Government. Or ANY software project for that matter.
  • Estimates have precision and accuracy.
    • These values are estimates as well.
    • The estimated completion cost for this program is $357,000,000 with an accuracy of $200,000 and a precision of $300,000.
    • Another way to speak about the estimated cost is This program will cost $357,000,000 or less with 80% confidence.
  • An estimate is a statistic about a whole population of possible values from a previous reference period or a model that can generate possible values given the conditions of the model. 

An estimate is the calculated approximation of a result.
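The "80% confidence" phrasing above can be computed directly from a population of sampled or simulated values: it is simply a percentile. A hedged sketch in Python - the lognormal cost model and its spread are invented for illustration, not taken from any real program:

```python
import random
import statistics

def confidence_estimate(samples, confidence=0.80):
    """Return the value the outcome will not exceed with the given
    confidence, i.e. the corresponding percentile of the samples."""
    ordered = sorted(samples)
    index = int(confidence * (len(ordered) - 1))
    return ordered[index]

random.seed(1)
# Hypothetical cost model: lognormal spread around a $350M program.
samples = [random.lognormvariate(0, 0.25) * 350 for _ in range(10_000)]
p80 = confidence_estimate(samples)
print(f"Estimated cost: ${statistics.median(samples):.0f}M most likely, "
      f"${p80:.0f}M or less with 80% confidence")
```

The same percentile mechanics apply whether the samples come from reference-class data, a parametric model, or a Monte Carlo simulation.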

Forecasts

  • Forecasts speculate future values for a population of possible values with a certain level of confidence, based on the current and past values as an expectation (prediction) of what will happen:
    • This is the basis of weather forecasting.
    • If you listen carefully to the weather forecast it says there is a 30% chance of snow next week over the forecast area.
    • We live at the mouth of a canyon at 5,095', and if there is a 30% chance of snow in Boulder (8 miles south), there is a much lower chance in our neighborhood.
    • Business forecasts, weather forecasts, traffic forecasts are typical. These forecasts usually come from models of the process being forecast. NOAA and NCAR are in our town, so lots of forecasting going on. Weather as well as climate.
    • It is not typical to forecast the cost of a project or the delivery date. Those are estimated values.
    • For example a financial statement presents, to the best of the responsible party's knowledge and belief, an entity's expected financial position, results of operations, and cash flows. [2]
  • In a forecast, the assumptions represent expectations of actual future events.
    • Sales forecasts
    • Revenue growth
    • Weather forecasts
    • Forecasts of cattle prices in the spring

A forecast is a prediction of some outcome in the future. Forecasts are based on estimating the processes that produce the forecast. The underlying statistical models (circulation, thermal models) of weather forecasting are estimates of the compressible fluid flow of gases and moisture in the atmosphere (greatly over-simplified).

Projections/Prediction

  • Projections indicate what future values  may exist for a population of values if the assumed patterns of change were to occur. Projections are not a prediction that the population will change in that manner.
    • Projected revenue for GE aircraft engine sales in 2017 was an article in this week's Aviation Week & Space Technology. 
    • A projection simply indicates a future value for the population if the set of underlying assumptions occurs.

A prediction says something about the future.

Project cost, schedule, and technical performance Estimates

All projects contain uncertainty. Uncertainty comes in two forms: aleatory (irreducible) and epistemic (reducible). If we're going to make decisions in the presence of these uncertainties, we need to estimate their values, the range of those values, the stability of the underlying processes that generate them, how they interact with all the elements of the project, and what impact these ranges will have on the probability of our project's success.

Project decisions in the presence of uncertainty cannot be made without estimates. Anyone claiming otherwise does not understand statistics and the probability of outcomes on projects.

As well, anyone who claims estimates are a waste, not needed, misused by management, or any other dysfunction, is doing them wrong. As we said at Rocky Flats: Don't Do Stupid Things On Purpose. When you do hear those phrases, you'll know the speakers are Doing Stupid Things on Purpose.

And when you hear we don't need estimates, we need a budget, remember:

In a world of limited funds, as a project manager, Product Owner, or even sole contributor, you’re constantly deciding how to get the most return for your investment. The more accurate your estimate of project cost is, the better able you will be to manage your project’s budget.

Another example of not understanding the probability and statistics of projects and the businesses that fund them is, there are two estimates needed for all projects that operate in the presence of uncertainty:

  • Estimate at Completion (EAC)
    • EAC is the expected cost of the project when it is complete.
    • This can be calculated bottom-up from the past performance and future projections of performance for the project's work - which, for the future portion, will be an estimate.
  • Estimate to Complete
    • ETC is the expected cost to complete the project.
  • The ETC is used to calculate the EAC
    • EAC = Actual Costs to Date (AC) + Estimated Cost to Complete (ETC).
    • EAC = Actual performance to date / Some Index of Performance.

This last formula is universal and can be used no matter the software development method.

  • Agile has such a formula - it's called the Burn Down Chart. We're burning down story points at some rate. If we continue at this rate, we will be done by this estimated date.
  • Same for traditional projects. We're spending at a current rate - the run rate. If we keep spending at this rate, the project will cost that much.
  • Earned Value Management provides the same EAC and can also provide an Estimated Completion Date (ECD).
  • Earned Schedule provides a better ECD.
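The EAC arithmetic described above can be sketched in a few lines. This is a simplified illustration, not a full earned-value implementation; the dollar amounts, CPI, and velocity figures are hypothetical, and EAC = BAC / CPI is used as one common form of "actual performance to date / some index of performance":

```python
def eac_bottom_up(actual_cost, etc):
    """EAC = Actual Cost to date (AC) + Estimate to Complete (ETC)."""
    return actual_cost + etc

def eac_performance_index(budget_at_completion, cpi):
    """EAC = BAC / CPI: extrapolate total cost assuming the current
    cost performance index (earned value / actual cost) continues."""
    return budget_at_completion / cpi

def burndown_forecast(remaining_points, velocity_per_sprint):
    """Burn-down version of the same idea: sprints left at current rate."""
    return remaining_points / velocity_per_sprint

# Hypothetical project: $2.0M spent so far, $1.5M estimated remaining.
print(eac_bottom_up(2_000_000, 1_500_000))
# $5M budget, CPI of 0.8 (earning $0.80 of value per $1 spent).
print(eac_performance_index(5_000_000, 0.8))
# 120 story points remaining at 30 points per sprint.
print(burndown_forecast(120, 30))
```

All three are the same structure: divide or extend current performance to project a completion value - an estimate in every case.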

Wrapup

Nothing in progression can rest on its original plan - Thomas Monson [5]

All project work is driven by uncertainty. Uncertainty is modeled by random variables. These variables can represent aleatory uncertainty or epistemic uncertainty. This uncertainty creates risk, and as stated here often:

Risk Management is How Adults Manage Projects - Tim Lister

So if you hear the conjecture that decisions can be made in the presence of uncertainty without estimates, you'll now know that is a complete load of crap. Run away.

If this topic interests you, here's a Bibliography of materials for estimating and many other topics in agile software development, updated all the time. Please read and use these when you hear unsubstantiated claims about estimating in the presence of uncertainty. Making estimates is our business, and this resource has served us well over the decades.

Other Resources

  1. Australian Bureau of Statistics
  2. Financial Forecasts and Projections, AT §301.06, AICPA, 2015
  3. Earned Value Management in EIA-748-C
  4. Earned Schedule uses the same values as Earned Value to produce an estimated completion date, www.earnedschedule.com. I started using ES at Rocky Flats to explain to the steel workers that their productivity, as measured in Budgeted Cost for Work Performed (BCWP or EV), meant they were late. ES told them how late and provided the date of the projected completion of the planned work.
  5. Project Management Analytics: A Data-Driven Approach to Making Rational and Effective Project Decisions, Harjit Singh, Pearson FT Press; 1st Edition, November 12, 2015.

 

Related articles: Why Guessing is not Estimating and Estimating is not Guessing · Critical Success Factors of IT Forecasting · Eyes Wide Shut - A View of No Estimates · IT Risk Management · Architecture-Centered ERP Systems in the Manufacturing Domain
Categories: Project Management

Quote of the Day

Sat, 11/26/2016 - 22:35

When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind. -- Lord Kelvin, Popular Lectures and Addresses, 1889.

Without numbers, without some means of assessing the situation, the outcomes, the impacts, any conjecture is of little use. 

Categories: Project Management

Risk Management is How Adults Manage Projects

Tue, 11/15/2016 - 17:33

Risk Management is How Adults Manage Projects - Tim Lister

Here's how we manage risk on our software intensive system of systems using Agile. 

Risk management requires estimating the reducible and irreducible uncertainties that create risk. If you're not estimating these uncertainties and their impacts, you're not managing as an adult.

Related articles: IT Risk Management · Risk Management is How Adults Manage Projects · Estimating is Risk Management · Mr. Franklin's Advice · Late Start = Late Finish
Categories: Project Management

The Myth of the Scrum Master and Actual Life Experiences

Tue, 11/15/2016 - 00:11

Just listened to Tony Richards' "The Capital Role of Scrum" on the Scrum Master Toolbox Podcast (yes, I know, this is Vasco's podcast, and it does have value when he sticks to Scrum topics). Vasco describes the Scrum Master as the Scrum Mom.

This brings to mind a concept we (my wife and I) came to a bit late in our child-raising experience: we encountered Parenting with Love and Logic in middle school. In this paradigm, there are 3 types of parents.

  • The drill sergeant parent
  • The helicopter parent
  • The consultant parent

In the Scrum paradigm

  • Drill Sergeant - enforces compliance with rules using the policies and procedures of the firm's software development lifecycle (SDLC). Remember Jack Welch's quote: bureaucracy protects the organization from the incompetent. When everyone is competent, less bureaucracy is needed.
  • Helicopter - rescues the team when they get in trouble. This transfers the accountability for getting things done to the Scrum Master, rather than the Scrum Team.

It's the Consultant that serves as the best parenting style. For an Agile (Scrum) Team, the parenting actions in Love and Logic have direct applicability. I'm not suggesting the Scrum Master or Scrum Coach is the parent of the team; rather, the paradigm of parenting is applicable. Vasco may not have realized that, but parenting is very close to managing the actions of others for a beneficial outcome, both for those performing the work and those paying for the work.

  • Provide messages of personal worth and strength - a team is defined as a small group of qualified individuals who hold each other accountable for a shared outcome. The SM needs to message that idea at all times. Determine if the team is behaving as a team, and when they are not, consult with them to determine why not and what can be changed to get back to the team processes. Here's one of the best talks about what a Team does when it is working properly.
  • Very seldom mentions responsibilities - if the team is acting like a team, then they have a shared accountability for the outcomes. This is self-defining the responsibilities. Having agreement on this accountability for a shared outcome means having a process to reveal the outcome. Product Roadmap, Release Plan, backlogs, and Big Visible Charts, again, are ways to broadcast the results of the shared outcome. Alistair Cockburn calls these Information Radiators.
  • Demonstrates how to take care of self and be responsible - the SM behaves as a consultant advisor.
  • Shares personal feelings about own performance and responsibilities - communication flows all ways (meaning not top-down, not bottom-up, not dominated by the vocal few).
  • Provides and helps team explore alternatives and then allows the team to make their own decision - making those decisions is the basis of Scrum. Along with the accountability of the team for those decisions.
  • Provides “time frames” in which the team may complete responsibilities - all software development is time-based. Those paying for the work hopefully understand the time value of money. Time is the only thing a team can't get more of. Self-managing in the presence of uncertainty means the team must manage the time aspects of their work.
  • Models doing a good job, finishing, cleaning up, feeling good about it - the SM walks the walk of being a consultative guide. 
  • Often asks, “Who owns the problem?” and helps the team explore solutions to the problem - guides the team to the solution through Socratic interaction. This means the SM needs to have some sense of what the solutions might be. Having little understanding of the product domain means not being able to ask the right questions. Without that skill and experience, the Team can easily get in trouble.
  • Uses lots of actions, but very few words - big visible charts, directed question, artifacts of the teams work, self-created outcomes that demonstrate success as a team for that shared outcome speak much louder than words. 
  • Allows the team to experience life’s natural consequences and allows them to serve as their own teacher - the notion of fail fast and fail often is misunderstood in the business world by the teams. It is often taken as we don't need to know what done looks like and we can have it emerge as we go. In Love and Logic, the paradigm means failures as a young child have much lower consequences than failures as a young adult. Learn to see what failures will occur and avoid them. Falling out of a chair at age 3 is much less critical than falling off the side of a mountain at age 16 with no protection - helmet or belaying ropes. Make mistakes early on; the cost of mistakes later can be life-threatening.

Summary

Scrum teams must act as teams in the Jon Katzenbach notion. The Scrum Master must act as the consultant parent for that team. The term Scrum Coach has two aspects: the parenting coach (consultant) and the coach found on sports teams. Agilists often forget this. The sports coach is not a player. The sports coach may have played and knows the game. But the agile coach, like the sports coach, has insight into how to improve the performance of the team that the team members themselves do not have. This is evidenced by the Super Bowl win of the Broncos and the World Series win of the Chicago Cubs.

Categories: Project Management

Little Book of Bad Excuses

Mon, 11/14/2016 - 16:13

Long ago there were a set of small books from the Software Program Managers Network and Norm Brown's work on Industrial Strength Management Strategies, which was absorbed by an organization which is no longer around. I have all the Little Books and they contain gems of wisdom that need restating in the presence of the current approaches to software development and hype around processes conjectured to fix problems.

The Software Program Managers Network produced books. I have 7 of them.

  • Little Books of Configuration Management
  • Project Breathalyzer
  • Little Book of Bad Excuses
  • Little Book of Testing Volumes I and II
  • Condensed Guide to Software Acquisition Practices
  • Little Yellow Book of Software Management Questions

These books are built on the Nine Best Practices for developing software-intensive systems and the short briefing that goes with the paper.

Let's start with formal risk management. There was a twitter post yesterday asking about the connection between Agile development and Risk Management. Agile is a participant in risk management but it is not risk management in and of itself.

From the Bad Excuses book, here's a list for Risk Management; the Project Breathalyzer shows where these items live in a larger context:

  1. What are the top ten risks as determined by the customer, technical, and program management?
  2. How are these risks identified?
  3. How are these risks resolved?
  4. How much money and time has been set aside for risk mitigation?
  5. What risks would be classified as showstoppers, and how were these derived?
  6. How many risks are in the Risk Register? How recently has the Risk Register been updated?
  7. How many risks have been added in the last six months?
  8. Can a risk be named that was mitigated in the last six months?
  9. What risks are expected to be mitigated or resolved in the next six months?
  10. Are risks assessed and prioritized in terms of their likelihood of occurrence and the potential impact to the program?
  11. Are as many viewpoints as possible involved in the risk assessment process?
  12. What percentage of the risks impact the final delivery of the system?
  13. To date how many risks have been closed out?
  14. How are identified risks made visible to all project participants?

No matter the development method - agile or traditional - risk management is how adults manage projects (Tim Lister). No risk management, no adults at the table.
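Questions 1 and 10 in the list above (the top ten risks, assessed by likelihood of occurrence and potential impact) imply a simple data structure. A minimal sketch, assuming exposure = likelihood × impact as the ranking rule; the risk names and numbers are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: float   # probability of occurrence, 0..1
    impact: float       # cost impact if it occurs, in dollars
    mitigated: bool = False

def top_risks(register, n=10):
    """Rank open risks by exposure = likelihood x impact (question 10)."""
    open_risks = [r for r in register if not r.mitigated]
    return sorted(open_risks, key=lambda r: r.likelihood * r.impact,
                  reverse=True)[:n]

# Hypothetical risk register entries.
register = [
    Risk("Key vendor slips delivery", 0.30, 500_000),
    Risk("Requirements churn on UI", 0.60, 120_000),
    Risk("Data migration corrupts records", 0.10, 2_000_000),
]
for r in top_risks(register, 3):
    print(f"{r.name}: exposure ${r.likelihood * r.impact:,.0f}")
```

Note that the low-likelihood, high-impact risk ranks first - exactly the kind of showstopper question 5 asks about.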

Categories: Project Management


How to Lie With Statistics

Sun, 11/13/2016 - 15:49

The book How To Lie With Statistics by Darrell Huff tells us how to make the numbers look the way we want them to look, without actually changing the numbers. One common way is to adjust the coordinate scales - for instance, by not showing the full y-axis. This approach makes it look like the voter turnout is dramatically different between 2008 and 2016. Which it is, as a percentage.

[Chart: voter turnout, 2008 vs. 2016, drawn with a truncated y-axis]

The Republican vote was lower than the Democratic vote, but the scale makes it look much different
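The truncation trick can be quantified: the ratio of the drawn bar heights depends on where the y-axis starts. A small sketch with hypothetical turnout percentages (not the actual election figures):

```python
def visual_ratio(a, b, axis_start=0.0):
    """Ratio of two bar heights as drawn when the y-axis starts at
    axis_start. A truncated axis exaggerates the drawn difference."""
    return (a - axis_start) / (b - axis_start)

turnout_2008, turnout_2016 = 58.2, 55.5   # hypothetical percentages
# Honest axis from zero: the bars look nearly the same (~1.05x).
print(visual_ratio(turnout_2008, turnout_2016))
# Axis truncated to start at 54: one bar now towers ~2.8x over the other.
print(visual_ratio(turnout_2008, turnout_2016, axis_start=54))
```

Same numbers, very different pictures - which is Huff's whole point.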

[Chart: Republican vs. Democratic vote totals, again with a truncated scale]

Of course, there's my favorite, where the y-axis has no scale - supposedly normalized - but normalized in heterogeneous units - Cats versus Kumquats. In this case as well, the Ideal has no basis in fact, since both the estimate and the actual are random variables subject to the uncertainties of project management - aleatory and epistemic. The missing error bands hide the Root Cause of the non-ideal actuals. Each dot (diamond and triangle) needs a confidence band from the original estimating process. Was that estimate an 80% confidence estimate, a 60% confidence estimate, or a wild-ass guess? Without this knowledge, the single-point results are worthless for determining what the numbers could have been, why they are the way they are, and what we could possibly do to make them better.

Without knowing why those projects did not follow the ideal - meaning the actuals matched the estimate - the chart is just a bunch of data with no information for taking corrective actions to improve project performance.

[Chart: estimates vs. actuals scattered around an "Ideal" line, with no error bands]

So first go buy How To Lie with Statistics (you can find a downloadable version at Archive.org), then download How to Lie with Statistical Graphs.

Along with

  • Statistics, A Very Short Introduction, David J. Hand
  • Principles of Statistics, M. G. Bulmer
  • Flaws and Fallacies in Statistical Thinking, Stephen K. Campbell
  • Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie with Statistics, Gary Smith

You can then start to push back on graphs, charts, assumptions, and conclusions derived from bad statistics.

After that, go buy Apollo Root Cause Analysis: Effective Solutions to Everyday Problems Every Time by Dean Gano and learn that without knowing the CAUSE of the numbers you see in a graph, you have no way of taking any action to make them better. You're just a bystander to possibly bad statistics.

Categories: Project Management

The Agile Canon

Sat, 11/12/2016 - 18:30

The paper Agile Base Patterns in the Agile Canon, Daniel R Greening, 2016 49th Hawaii International Conference on System Sciences is an important contribution to the discussion of agile at scale in organizations beyond 5 developers at the table with their customer.

The Agile Canon is composed of 5 elements:

  1. Measure Economic Progress
    • Plans don't guarantee creative success - creative efforts operate in an economy - as system where people manage limited resources to maximize return and growth
    • Forces on economic progress
      • Economics - participants without well-defined economic guidance wander aimlessly. They don't know what they value. They don't know their costs
      • Measurement - lagging measures applied to current decisions can fail perversely
      • When measurement drives rewards, perceived value is gamed. Creativity is improved with rewards of mastery, autonomy, and purpose [1]
    • Measure economic progress with well-chosen, evolving metrics
      • Identify desired outcomes
      • Identify relevant metrics
      • Create a forecasting discipline
      • Embrace objectivity
      • Evolve
  2. Proactively Experiment to Improve
    • Not improving fast enough
    • Forces on proactive improvement
      • Complacency - passive observation 
      • Loss of control creates risk of failure
      • Quest for control (in manufacturing sense) makes innovation harder [2]
      • Non-creative work is easier
      • Uncertainty creates confusion
    • Proactively experiment to improve
      • Run adaptive improvement experiments
      • Before changing anything assess different options and explore possible results
      • Experiments can be evolutionary or revolutionary
      • Establish a hypothesis
      • Innovation causes variability
      • Kaizen emphasis on small improvements
      • Variation accompanies chaos and complex adaptive systems
      • Two solutions to all this
        • Compensate for metric variations by including learning metrics
        • Compensate for cost variation by including risk reduction metrics
    • Teams that apply experiment techniques can become hyper-productive [3]
  3. Limit Work-In-Progress
    • When going too slow, more detailed plans makes it worse
    • Forces on WIP 
      • Inventory - fungible assets help increase productivity, but increase costs
      • Congestion - as randomly timed requests increase system utilization, the delay before a request is started increases exponentially
      • Cognition - the most limited resource for creative people is time and attention
    • Limit WIP to improve value flow
      • Cognition and backlogs = a clear mind helps prioritize work
        • Focus on most profitable work
        • Establish a zero-backlog approach to planning - Scrum creates a Sprint Backlog. Highly effective Scrum teams have seven items in the Sprint Backlog
        • Create a fractally structured Product Backlog - seven small backlog items, followed by seven bigger ones. A fractally structured backlog limits the amount of planning effort invested early in a large project, reducing planning, decreasing sunk-cost bias, and encouraging rapid adaptation to new information
      • Collaborative Focus
        • Swarm on top most items
        • Goal is completion - shippable product
        • Communication delays are a form of WIP
      • Value Stream Optimization
        • Visibly track active work by category
        • Limit WIP in each category
        • Organize using VSM [4]
  4. Embrace Collective Responsibility
    • Forces on Collective Responsibility
      • Readily claim responsibility for success, but refrain from claiming responsibility for failure
        • Deny the problem
        • Blame others
        • Blame circumstances
        • Feel obligation to keep doing our job
    • Help People embrace collective responsibility
      • Autonomy
      • Understanding
      • Agency
    • Organizational culture largely determines if teams and individuals embrace and sustain collective responsibility
  5. Solve Systemic Problems
    • Forces on systemic problems
      • Operating with many actors, dysfunctions of others limits agility
      • Competing for attention from dependencies creates queues that increase latency
    • Collaboratively analyze and mitigate systemic dysfunction
      • Root Cause Analysis [5]
      • Static analysis - dependency mapping
      • Dynamic analysis - analyze flow

[1] D. Pink, Drive: The surprising truth about what motivates us (2011). 

[2] R. Ashkenas, “It’s Time to Rethink Continuous Improvement,” HBR blog http://j.mp/hbrci (2012). 

[3] C.R. Jakobsen et al, “Scrum and CMMI – Going from Good to Great: Are you ready-ready to be done-done?” Agile Conference 2009, IEEE. 

[4] G. Alleman, "Product & Process Development Kaizen for Software Development, Project, and Program Management," LPPDE, Denver, Colorado, April 2008.

[5] D.R. Greening, “Agile Pattern: Social Cause Mapping,” http://senexrex.com/cause-mapping/ (2015). 
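The "Congestion" force under Limit Work-In-Progress - delay growing rapidly as randomly timed requests push up utilization - is the classic single-server queueing result. A hedged sketch using the M/M/1 mean-time-in-system formula W = S / (1 - rho), a standard simplification rather than anything from Greening's paper:

```python
def mm1_wait(utilization, service_time=1.0):
    """Mean time in system for an M/M/1 queue: W = S / (1 - rho).
    Delay grows without bound as utilization approaches 100%."""
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return service_time / (1 - utilization)

# Doubling from 50% to near-full utilization multiplies delay tenfold.
for rho in (0.5, 0.8, 0.9, 0.95):
    print(f"utilization {rho:.0%}: mean time in system {mm1_wait(rho):.1f}x")
```

This is why WIP limits work: keeping utilization of people and queues below saturation keeps flow time bounded.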

Categories: Project Management

Veterans Day

Fri, 11/11/2016 - 15:41

Veterans-day1

For all of us who have served, are serving, and those who gave their lives in service of our country - honor them today

Categories: Project Management

Book of the Month

Thu, 11/10/2016 - 19:04

It's been a busy month for reading. I've been on the road, so I try to focus on reading rather than working while on the plane. Here are three books underway that are related to the programs we work on.

Practical Guide to Distributed Scrum

This book contains processes for improving the performance of Scrum teams when they are distributed.

Two of my clients are in this situation. Mainly because the cost of living near the office is prohibitive and travel distances are the worst in Metro DC.

The book shows how to develop User Stories with a distributed team, engage in effective release planning, manage cultural and language differences, resolve dependencies, and use remote software processes.

Logically Fallacious

It seems many of the debates over ideas we get into are based on logical fallacies.

Here's a nice book on how this happens and how to address the issues when they come up.

Agile!

I've saved the best for last.

This is a MUST READ book for anyone working with agile or thinking about it.

With the Logically Fallacious book in hand, Agile! can be read in parallel.

There is so much crap out there around Agile that this book is mandatory reading.

From the nonsense of #NoEstimates to simply bad advice, Bertrand Meyer calls it out, along with all the good things about agile.

 

Categories: Project Management

Estimating is a Learned Skill

Wed, 11/09/2016 - 05:25

Estimating is a learned skill, used for everything from everyday life to the management of projects. When I left for the airport this morning to catch my flight to a customer site, I estimated, given the conditions, how much time I needed to get to my favorite parking spot at DIA. When I landed in Boston, I asked the taxi driver how long it would take to get back to the airport on Wednesday at 3:00 PM. He knew the answer: from my office in the North End to the SWA terminal, between 7 and 12 minutes.

The same estimating process applies to the multi-billion dollar projects we work on, and to the Scrum development processes on those projects.
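That process, decompose the work, estimate each piece as a range of durations, then simulate to get a confidence level, can be sketched in a few lines. The task names and duration ranges below are invented for illustration; on a real program they would be calibrated against past performance:

```python
# A minimal estimating sketch: sum triangular-distributed task
# durations over many trials, then read off a confidence level.
# Tasks and ranges here are hypothetical, not from any real program.
import random

# Each task: (low, most_likely, high) duration in days
tasks = {
    "design": (3, 5, 9),
    "build":  (8, 12, 20),
    "test":   (4, 6, 11),
}

def simulate(tasks, trials=10_000, seed=1):
    """Monte Carlo of total duration; returns sorted trial totals."""
    random.seed(seed)
    totals = []
    for _ in range(trials):
        # random.triangular takes (low, high, mode)
        totals.append(sum(random.triangular(lo, hi, ml)
                          for lo, ml, hi in tasks.values()))
    totals.sort()
    return totals

totals = simulate(tasks)
p80 = totals[int(0.80 * len(totals))]  # 80th-percentile completion time
print(f"80% confident the work finishes within {p80:.1f} days")
```

The 80th percentile answers "when will we be done?" with a confidence level rather than a single-point guess, which is the point of estimating as a learned, repeatable process.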

Here are some materials that provide the tools and processes needed to learn how to estimate. Google will find the ones with no URL provided.

So when you hear we can't estimate, you'll know better. And when you hear estimates are a waste, you'll realize that person must work on a de minimis project, where those paying have no need to know how much it will cost, when it will be done, and what capabilities they'll get for that time and money before the time and money run out.

The primary purpose of software estimation is not to predict a project’s outcome; it is to determine whether a project’s targets are realistic enough to allow the project to be controlled to meet them ‒ Steve McConnell

  • “Believing is Seeing: Confirmation Bias Studies in Software Engineering,” Magne Jørgensen and Efi Papatheocharous, 41st Euromicro Conference on Software Engineering and Advanced Applications (SEAA).
  • “Numerical anchors and their strong effects on software development effort estimates,” Erik Løhrea, and Magne Jørgensen, Simula Research Laboratory, Oslo.
  • “Review on Traditional and Agile Cost Estimation Success Factor in Software Development Project,” Zulkefli Mansor, Saadiah Yahya, Noor Habibah Hj Arshad, International Journal on New Computer Architectures and Their Applications (IJNCAA) 1(3): 942–952.
  • “Release Planning & Buffered MoSCoW Rules,” Dr. Eduardo Miranda Institute for Software Research ASSE 2013 ‐ 14th Argentine Symposium on Software Engineering / 42 JAIIO (Argentine Conference on Informatics) September 16th, 2013, Cordoba, Argentina
  • “Fixed price without fixed specification,” Magne Jørgensen, Simula Research Laboratory, 15 March 2016.
  • “The Use of Precision of Software Development Effort Estimates to Communicate Uncertainty,” Magne Jørgensen, Software Quality Days. The Future of Systems-and Software Development. Springer International Publishing, 2016.
  • “The Influence of Selection Bias on Effort Overruns in Software Development Projects,” Magne Jørgensen, Simula Research Laboratory & Institute of Informatics, University of Oslo.
  • “Software effort estimation terminology: The tower of Babel,” Stein Grimstad, Magne Jørgensen, Kjetil Moløkken-Østvold, Information and Software Technology 48 (2006) 302–310.
  • “Planning and Executing Time-Bound Projects,” Eduardo Miranda, IEEE Computer, March 2002, pp. 73 ‒ 78.
  • “When 90% Confidence Intervals are 50% Certain: On the Credibility of Credible Intervals,” Karl Halvor Teigen and Magne Jørgensen, Applied Cognitive Psychology, 19: 455–475 (2005).
  • “Software quality measurement,” Magne Jørgensen, Advances in Engineering Software 30 (1999) 907–912.
  • “Group Processes in Software Effort Estimation,” Kjetil Moløkken-østvold and Magne Jørgensen, Empirical Software Engineering, 9, 315–334, 2004.
  • “Story Point Estimating,” Richard Carlson, ALEA, Agile and Lean Associates, 2013.
  • “Project Estimation in the Norwegian Software Industry – A Summary,” Kjetil Moløkken, Magne Jørgensen, Sinan S. Tanilkan, Hans Gallis, Anette C. Lien, and Siw E. Hove, Simula Research Laboratory.
  • “Software Estimation using a Combination of Techniques,” Klaus Nielsen, PM Virtual Library, 2013
  • “An Effort Prediction Interval Approach Based on the Empirical Distribution of Previous Estimation Accuracy,” Magne Jørgensen and D. I. K. Sjøberg, Simula Research Laboratory, Norway.
  • “Better Sure than Safe? Overconfidence in Judgment Based Software Development Effort Prediction Intervals,” Magne Jørgensen, Karl Halvor Teigen, and Kjetil Moløkken-Østvold, Journal of Systems and Software, February 2004
  • “The Impact of Irrelevant and Misleading Information on Software Development Effort Estimates: A Randomized Controlled Field Experiment,” Magne Jørgensen and Stein Grimstad, IEEE Transactions of Software Engineering, Volume 37, Issue 5, September ‒ October 2011.
  • “The Heisenberg Uncertainty Principle and Its Application to Software,” P. A. Laplante, ACM SIGSOFT Software Engineering Notes, Vol. 15 No. 5 Oct 1990, Page 21.
  • “Experience With the Accuracy of Software Maintenance Task Effort Prediction Models,” Magne Jørgensen, IEEE Transactions On Software Engineering, Vol. 21, No. 8, August 1995.
  • “Conducting Realistic Experiments in Software Engineering,” Dag I. K. Sjøberg, Bente Anda, Erik Arisholm, Tore Dybå, Magne Jørgensen, Amela Karahasanović, Espen F. Koren and Marek Vokáč, International Symposium on Empirical Software Engineering, 2002.
  • “Forecasting of software development work effort: Evidence on expert judgement and formal models,” Magne Jørgensen, International Journal of Forecasting 23(3) pp. 449-462, July 2004.
  • “A Systematic Review of Software Development Cost Estimation Studies,” Magne Jørgensen and Martin Shepperd, IEEE Transactions On Software Engineering, Vol. 33, No. 1, January 2007.
  • “Towards a Fuzzy based Framework for Effort Estimation in Agile Software Development,” Atef Tayh Raslan, Nagy Ramadan Darwish, and Hesham Ahmed Hefny, (IJCSIS) International Journal of Computer Science and Information Security, Vol. 13, No. 1, 2015,
  • “Evaluation of Model Evaluation Criterion for Software Development Effort Estimation,” S. K. Pillai, and M. K. Jeyakumar, International Journal of Electrical, Computer, Energetic, Electronic and Communication Engineering, Vol: 9, No: 1, 2015.
  • “Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation: A DACS State-of-the-Art Report,” August 2007.
  • “Software Effort Estimation with Ridge Regression and Evolutionary Attribute Selection,” Efi Papatheocharous , Harris Papadopoulos and Andreas S. Andreou, 3rd Artificial Intelligence Techniques in Software Engineering Workshop, 7 October, 2010, Larnaca, Cyprus
  • “The Business of Software Estimation Is Not Evil: Reconciling agile approaches and project estimates,” Phillip G. Armour, Communications of the ACM, January 2014, Vol. 57, No. 1.
  • “Analysis of Empirical Software Effort Estimation Models,” Saleem Basha and Dhavachelvan P, (IJCSIS) International Journal of Computer Science and Information Security, Vol. 7, No. 3, 2010,
  • “Empirical Estimation of Hybrid Model: A Controlled Case Study,” Sadaf Un Nisa and M. Rizwan Jameel Qureshi, I.J., Information Technology and Computer Science, 2012, 8, 43–50.
  • “Identification of inaccurate effort estimates in agile software development,” Florian Raith, Ingo Richter, Robert Lindermeier, and Gudrun Klinker, 2013 20th Asia-Pacific Software Engineering Conference (APSEC)
  • “Efficient Indicators to Evaluate the Status of Software Development Effort Estimation inside the Organizations,” Elham Khatibi and Roliana Ibrahim, International Journal of Managing Information Technology (IJMIT) Vol. 4, No. 3, August 2012
  • “Modern Project Management: A New Forecasting Model to Ensure Project Success,” Iman Attarzadeh and Ow Siew Hock, International Conference on Future Computer and Communication, 2009.
  • “Using public domain metrics to estimate software development effort,” Ross Jeffery, Melanie Ruhe, and Isabella Wieczorek, Proceedings. Seventh International Software Metrics Symposium, 2001.
  • “What We Do and Don’t Know about Software Development Effort Estimation,” Magne Jørgensen, IEEE Software, March / April 2014.
  • “A review of studies on expert estimation of software development effort,” Magne Jørgensen, The Journal of Systems and Software 70 (2004) 37–60.
  • “How to Avoid Impact from Irrelevant and Misleading Information when Estimating Software Development Effort,” Magne Jørgensen & Stein Grimstad Simula Research Laboratory.
  • “Avoiding Irrelevant and Misleading Information When Estimating Development Effort,” Bente Anda , Hege Dreiem , Dag I. K. Sjøberg1, and Magne Jørgensen, IEEE Software, Volume 25, Issues 3, May-June, 2008.
  • “Prediction of project outcome: The Application of Statistical Methods to Earned Value Management and Earned Schedule Performance Indexes,” Walt Lipke, Ofer Zwikael, Kym Henderson, and Frank Anbari, International Journal of Project Management, 27, pp. 400-407, 2009
  • “The ROI of Agile VS. Traditional Methods? An Analysis of XP, TDD, Pair Programming, and Scrum (Using Real Options),” Dr. David Rico, http://davidfrico.com/rico08b.pdf
  • “Exploring the ‘Planning Fallacy’: Why People Underestimate Their Task Completion Times,” Roger Buehler, Dale Griffin, and Michael Ross, Journal of Personality and Social Psychology, Vol 67(3), Sep 1994, 366–381.
  • “Estimates, Uncertainty, and Risk,” Barbara Kitchenham and Stephen Linkman, University of Keele, IEEE Software, May / June, 1997,
  • “Software Project Scheduling under Uncertainties,” Intaver Institute Inc.
  • “A Comparison of Software Project Overruns—Flexible versus Sequential Development Models,” Kjetil Moløkken-Østvold and Magne Jørgensen, IEEE Transactions on Software Engineering, Volume 31, Issue 9, September 2005.
  • “Cost Estimating Issues for MAIS Programs Using an Agile Approach for SW Development,” Richard Mabe, 22 September 2015, DoD Agile Meeting: Enhancing Adoption of Agile Software Development in DoD, September 2015, PARCA OSD.
  • “An Empirical Investigation on Effort Estimation in Agile Global Software Development,” Ricardo Britto, Emilia Mendes, and Jurgen Borstler, 2015 IEEE 10th International Conference on Global Software Engineering
  • “Planning, Estimating, and Monitoring Progress in Agile Systems Development Environments,” Suzette S. Johnson, STC 2010.
  • “Improving Subjective Estimates Using Paired Comparisons,” Eduardo Miranda, IEEE Software, January/February, 2001.
  • “Using Performance Indices to Evaluate the Estimate at Completion,” David Christensen, Journal of Cost Analysis and Management, Spring 17–24.
  • “Reliability Improvement of Major Defense Acquisition Program Cost Estimates—Mapping DoDAF to COSMO,” Ricardo Valerdi, Matthew Dabkowski, and Indrajeet Dixit, Systems Engineering, Volume 18, Issue 4, 2015
  • “Fallacies and biases when adding effort estimates.” Magne Jørgensen, https://www.simula.no/file/simulasimula2762pdf/download
  • “Communication of software cost estimates,” Magne Jørgensen, https://simula.no/file/simulasimula2498pdf/download
  • “Relative Estimation of Software Development Effort: It Matters With What and How You Compare,” Magne Jørgensen, IEEE Software(2013): 74-79.
  • “Reasons for Software Effort Estimation Error: Impact of Respondent Role, Information Collection Approach, and Data Analysis Method Magne,” Jørgensen and Kjetil Moløkken-Østvold, IEEE Transactions On Software Engineering, Vol. 30, No. 12, December 2004.
  • “Use Case Points: An estimation approach,” Gautam Banerjee, https://gl/QcPmYd
  • “Software Cost Estimation Methods: A Review,” Vahid Khatibi and Dayang N. A. Jawawi, Journal of Emerging Trends in Computing and Information Sciences, Volume 2 No. 1.
  • “Software cost estimation,” Chapter 26, Software Engineering, 9th Edition, Ian Sommerville, 2010.
  • “Estimating Development Time and Effort of Software Projects by using a Neuro-Fuzzy Approach,” Venus Marza and Mohammad Teshnehlab, in Advanced Technologies, INTECH Open, October 2009.
  • “Function Points, Use Case Points, Story Points: Observations From a Case Study,” Joe Schofield, Alan W. Armentrout, and Regina M. Trujillo, CrossTalk: The Journal of Defense Software Engineering, May–June 2013.
  • “Advanced Topics in Agile Estimating,” Mike Cohn, Mountain Goat Software
  • “Schedule Assessment Guide: Best Practices for Schedule Assessment,” GAO-16-89G.
  • Agile Estimating and Planning, Mike Cohn, Prentice Hall, 2006
  • “A Bayesian Software Estimating Model Using a Generalized g-Prior Approach,” Sunita Chulani and Bert Steece, Technical Report, USC-CSE-98515.
  • “A Model for Software Development Effort and Cost Estimation,” Krishnakumar Pillai and V.S. Sukumaran Nair, IEEE Transactions on Software Engineering, Vol. 23, No. 8, August 1997.
  • “An Alternative to the Rayleigh Curve Model for Software Development Effort,” F. N. Parr, IEEE Transactions On Software Engineering, Vol. SE–6, No. 3, May 1980.
  • Fifty Quick Ideas to Improve Your User Stories, Gojko Adzix and David Evans, http://leanpub.com/50quickideas
  • “The Use of Agile Surveillance Points: An Alternative to Milestone Reviews,” Richard “Dick” Carlson, http://a2zalea.com/wp–content/uploads/2014/02/Agile–Surveillance–Points_20140113.pdf
  • “A Planning Poker Tool for Supporting Collaborative Estimation in Distributed Agile Development,” Fabio Calefato and Filippo Lanubile, ICSEA 2011, The Sixth International Conference on Software Engineering Advances.
  • “An Effort Estimation Model for Agile Software Development,” Shahid Ziauddin, Kamal Tipu, Shahrukh Zia, Advances in Computer Science and its Applications (ACSA) 314 Vol. 2, No. 1, 2012.
  • “Successful Solutions Through Agile Project Management,” ESI International White Paper, 2010.
  • “Sprint Planning Optimization in Agile Data Warehouse Design,” Matteo Golfarelli, Stefano Rizzi, and Elisa Turricchia, LNCS 7448, pp. 30–41, 2012.
  • “Effort Estimation in Global Software Development: A Systematic Literature Review,” Ricardo Britto, Vitor Freitas, Emilia Mendes, and Muhammad Usman, IEEE 9th International Conference on Global Software Engineering, 2014.
  • “An evaluation of the paired comparisons method for software sizing,” Eduardo Miranda, Proceedings of the 2000 International Conference on Software Engineering.
  • “Protecting Software Development Projects Against Underestimation,” Eduardo Miranda and Alain Abran, Project Management Journal, Volume 39, Issue 3, Pages 75-85, September, 2008.
  • “Sizing User Stories Using Paired Comparisons,” Eduardo Miranda, Pierre Bourque, and Alain Abran, Information and Software Technology, Volume 51, Issue 9, September 2009, Pages 1327–1337.
  • “Effort Estimation in Agile Software Development using Story Points,” Evita Coelho and Anirban Basu, International Journal of Applied Information Systems (IJAIS), Volume 3, Number 7, August 2012.
  • “A Model for Estimating Agile Project Schedule Acceleration,” Dan Ingold, Barry Boehm, Supannika Koolmanojwong, and Jo Ann Lane, Center for Systems and Software Engineering, University of Southern California, Los Angeles, 2013.
  • “Cost Estimation in Agile Development Projects,” Siobhan Keaveney and Kieran Conboy, International Conference on Information Systems Development (ISD2011) Prato, Italy.
  • IT Project Estimation: A Practical Guide to Costing Software, Paul Coombs, Cambridge University Press, 2003
  • “Replanning, Reprogramming, and Single Point Adjustments,” NAVY CEVM (Center for Earned Value Management), July 2013.
  • “Software Cost Estimating for Iterative / Incremental Development Programs – Agile Cost Estimating,” NASA CAS, July 2014.
  • “Distinguishing Two Dimensions of Uncertainty,” Craig Fox and Gülden Ülkümen, in Perspectives on Thinking, Judging, and Decision Making, Brun, W., Keren, G., Kirkeboen, G., & Montgomery, H. (Eds.), Oslo, Norway: Universitetsforlaget, 2011.
  • “Could Social Factors Influence the Effort Software Estimation?” Valentina Lenarduzzi, 7th International Workshop on Social Software Engineering (SSE), At Bergamo (Italy), September 2015.
  • “Object-Oriented Software Cost Estimation Methodologies Compared,” D. Gregory Foley & Brenda K. Wetzel, Society of Cost Estimating and Analysis – International Society of Parametric Analysts, 22 December 2011, pp 41-63.
  • “Fix Your Estimating Bad Habits,” Ted M. Young, http://slideshare.net/tedyoung/fix-you-some-bad-estimation-habits
  • “How to Estimate an Agile Project,” Saunders Learning Center, http://www.slideshare.net/FloydSaunders/how-to-estimate-an-agile-project
  • Software Cost Estimation Metrics Manual for Defense Systems, Bradford Clark and Richard Madachy (editors), 2015.
  • “Metrics for Agile Projects: Finding the Right Tools for the Job,” ESI International, https://www.projectsmart.co.uk/white–papers/metrics–for–agile–projects.pdf
  • “The Sprint Planning Meeting,” Richard “Dick” Carlson, http://www.a2zalea.com/wp–content/uploads/2014/02/SprintPlanningMeeting_20140118.pdf
  • “Software Development Estimation Biases: The Role of Interdependence,” Magne Jørgensen and Stein Grimstad, IEEE Transactions on Software Engineering, Vol. 38, No. 3, May/June 2012.
  • “Managing Projects of Chaotic and Unpredictable Behavior,” Richard “Dick” Carlson, http://www.a2zalea.com/wp–content/uploads/2014/02/Managing–Projects–of–Chaotic–and–Unpredictable–Behavior_20140219.pdf
  • “Practical Guidelines for Expert-Judgement-Based Software Effort Estimation,” Magne Jørgensen, IEEE Software, May/June 2005.
  • “How do you estimate on an Agile project?,” eBook, ThoughtWorks, https://gl/ES5M3c
  • “Using the COSMIC Method to Estimate Agile User Stories,” Jean-Marc Desharnais, Luigi Buglione, Bugra Kocatürk, Proceedings of the 12th International Conference on Product Focused Software Development and Process Improvement.
  • “On the problem of the software cost function,” J. J. Dolado, Information and Software Technology 43 (2001) 61–72.
  • Software Project Effort Estimation: Foundations and Best Practice Guidelines for Success, 2014th Edition, Adam Trendowicz and Ross Jeffery, Springer.
  • “Unit effects in project estimation: It matters whether you estimate in work-hours or workdays,” Magne Jørgensen Journal of Systems and Software(2015).
  • “Estimating Software Development Effort based on Use Cases – Experiences from Industry,” Bente Anda , Hege Dreiem , Dag I. K. Sjøberg1, and Magne Jørgensen, Proceedings of the 4th International Conference on The Unified Modeling Language, Modeling Languages, Concepts, and Tools, Pages 487-502
  • “A Neuro-Fuzzy Model with SEER-SEM for Software Effort Estimation,” Wei Lin Du, Danny Ho, Luiz Fernando Capretz, 25th International Forum on COCOMO and Systems/Software Cost Modeling, Los Angeles, CA, 2010.
  • “A Program Manager's Guide For Software Cost Estimating,” Andrew L. Dobbs, Naval Postgraduate School, December 2002.
  • “An Engineering Context For Software Engineering,” Richard D. Riehle, September 2008, Naval Postgraduate School.
  • “Application of Real Options theory to software engineering for strategic decision making in software related capital investments,” Albert O. Olagbemiro, Monterey, California. Naval Postgraduate School, 2008.
  • “Next Generation Software Estimating Framework: 25 Years and Thousands of Projects Later,” Michael A. Ross, Journal of Cost Analysis and Parametrics, Volume 1, 2008 - Issue 2.
  • “A Probabilistic Method for Predicting Software Code Growth,” Michael A. Ross, Journal of Cost Analysis and Parametrics, Volume 4, 2011 - Issue 2
  • “Application of selected software cost estimating models to a tactical communications switching system: tentative analysis of model applicability to an ongoing development program,” William B. Collins, Naval Postgraduate School
  • “An examination of project management and control requirements and alternatives at FNOC,” Charlotte Ruth Gross, Naval Postgraduate School.
  • “Software cost estimation through Bayesian inference of software size,” In Kyoung Park, Naval Postgraduate School.
  • “Using the agile development methodology and applying best practice project management processes,” Gary R. King, Naval Postgraduate School.
  • “Calibrating Function Points Using Neuro-Fuzzy Technique,” Vivian Xia Danny Ho Luiz F. Capretz, 21st International Forum on Systems, Software and COCOMO Cost Modeling, Washington, 2006.
  • Practical Software Project Estimation: A Toolkit for Estimating Software Development Effort & Duration, Peter Hill, International Software Benchmarking Standards Group.
  • Software Estimation Best Practices, Tools & Techniques: A Complete Guide for Software Project Estimators, Murali K. Chemuturi, J. Ross Publishing, August 2009.
  • Software Project Cost and Schedule Estimating: Best Practices, William H. Roetzheim, Prentice Hall.
  • Estimating the Scope of Software Projects Using Statistics, Louis Newstrom, Louis Newstrom Publisher, December 4, 2015.
  • “Organizational Structure Impacts Flight Software Cost Risk,” Jairus M. Hihn , Karen Lum, and Erik Monson, Journal of Cost Analysis and Parametrics, Volume 2, 2009 - Issue 1.
  • “Estimate of the appropriate Sprint length in agile development by conducting simulation,” Ryushi Shiohama, Hironori Washizaki, Shin Kuboaki, Kazunori Sakamoto, and Yoshiaki Fukazawa, 2012 Agile Conference, 13-17 August 2012, Dallas Texas
  • “Advancement of decision making in Agile Projects by applying Logistic Regression on Estimates,” Lakshminarayana Kompella, 2013 IEEE 8th International Conference on Global Software Engineering Workshops.
  • “Estimating in Actual Time,” Moses M. Hohman, IEEE Proceedings of the Agile Development Conference (ADC’05), Denver, Colorado 24-29 July, 2005
  • “Cost Estimation In Agile Development Projects,” Siobhan Keaveney and Kieran Conboy, ECIS 2006 Proceedings
  • “Coping with the Cone of Uncertainty: An Empirical Study of the SAIV Process Model,” Da Yang, Barry Boehm , Ye Yang, Qing Wang, and Mingshu Li, ICSP 2007, LNCS 4470, pp. 37–48, 2007
  • “Combining Estimates with Planning Poker – An Empirical Study,” Kjetil Moløkken-Østvold and Nils Christian Haugen, Proceedings of the 2007 Australian Software Engineering Conference (ASWEC'07).
  • “A Case Study Research on Software Cost Estimation Using Experts’ Estimates, Wideband Delphi, and Planning Poker Technique,” Taghi Javdani Gandomani , Koh Tieng Wei, and Abdulelah Khaled Binhamid, International Journal of Software Engineering and Its Applications, 8, No. 11 (2014), pp. 173-182.
  • “Algorithmic Based and Non-Algorithmic Based Approaches to Estimate the Software Effort,” WanJiang Han , TianBo Lu , XiaoYan Zhang , LiXin Jiang and Weijian Li, International Journal of Multimedia and Ubiquitous Engineering, 10, No. 4 (2015), pp. 141-154.
  • “Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty’” Pongtip Aroonvatanaporn, Chatchai Sinthop and Barry Boehm, Center for Systems and Software Engineering University of Southern California Los Angeles, CA 90089, ASE’10, September 20–24, 2010, Antwerp, Belgium, 2010.
  • “Integrated Approach of Software Project Size Estimation,” Brajesh Kumar Singh, Akash Punhani, and A. K. Misra, International Journal of Software Engineering and Its Applications 10, No. 2 (2016), pp. 45-64.
  • “Investigating the Effect of Using Methodology on Development Effort in Software Projects,” Vahid B. Khatibi, Dayang N. A. Jawawi, and Elham Khatibi, International Journal of Software Engineering and Its Applications 6, No. 2, April, 2012.
  • “Data-Driven Decision Making as a Tool to Improve Software Development Productivity,” Mary Erin Brown, Walden University, 2013
  • “Applying Agile Practices to Space-Based Software Systems,” Arlene Minkiewicz, Software Technology Conference, Long Beach, CA 31 March – 3 April, 2014
  • “Estimating the Effort Overhead in Global Software Development,” Ansgar Lamersdorf, Jurgen Munch, Alicia Fernandez-del Viso Torre, Carlos Rebate Sanchez, and Dieter Rombach, 2010 5th IEEE International Conference on Global Software Engineering
  • “A Proposed Framework for Software Effort Estimation Using the Combinational Approach of Fuzzy Logic and Neural Networks,” Pawandeep Kaur and Rupinder Singh, International Journal of Hybrid Information Technology 8, No. 10 (2015), pp. 73-80.
  • “Software Estimating Rules of Thumb,” Capers Jones, http://compaid.com/caiinternet/ezine/capers-rules.pdf
  • “Why Are Estimates Always Wrong: Estimation Bias and Strategic Misestimation,” Daniel D. Galorath, http://iceaaonline.com/ready/wp-content/uploads/2015/06/RI03-Paper-Galorath-Estimates-Always-Wrong.pdf
  • “Using planning poker for combining expert estimates in software projects,” K. Moløkken-Østvold, N. C. Haugen, and H. C. Benestad, Journal of Systems and Software, vol. 81, issue 12 (2008) pp. 2106–2117.
  • “Effort Distribution to Estimate Cost in Small to Medium Software Development Project with Use Case Points,” Putu Linda Primandari and Sholiq, The Third Information Systems International Conference, 2015
  • “Estimation of IT-Projects Highlights of a Workshop,” Manfred Bundschuh, Metrics News, Vol. 4, Nr. 2, December 1999, pp. 29 – 37, https://itmpi.org/Portals/10/PDF/bundschuh-est.pdf
  • “Curbing Optimism Bias and Strategic Misrepresentation in Planning: Reference Class Forecasting in Practice,” Bent Flyvbjerg, European Planning Studies 16, No. 1, January 2008.
  • “A Neuro-Fuzzy Model for Function Point Calibration,” Wei Xia, Danny Ho, and Luiz Fernando Capretz, WSEAS, Transactions On Information Science & Applications, Issue 1, Volume 5, January 2008.
  • “How Does Project Size Affect Cost Estimation Error? Statistical Artifacts and Methodological Challenges,” International Journal of Project Management 30 (2012): 751-862, https://simula.no/file/simulasimula742pdf/download
  • “Does the Use of Fibonacci Numbers in Planning Poker Affect Effort Estimates?” Ritesh Tamrakar and Magne Jørgensen, 16th International Conference on Evaluation & Assessment in Software Engineering, 2012.
  • “Using inferred probabilities to measure the accuracy of imprecise forecasts,” Paul Lehner, Avra Michelson, Leonard Adelman, and Anna Goodman, Judgment and Decision Making, Vol. 7, No. 6, November 2012, pp. 728–740.
  • “Software Development Effort Estimation: Why it fails and how to improve it,” Magne Jørgensen, Simula Research Laboratory & University of Oslo, https://simula.no/file/simulasimula1688pdf/download
  • “Contrasting Ideal and Realistic Conditions As a Means to Improve Judgment-Based Software Development Effort Estimation,” Magne Jørgensen, Information and Software Technology 53 (2011): 1382–1390.
  • “Software Effort Estimation as Collaborative Planning Activity,” Kristin Børte, https://simula.no/file/simulasimula1226pdf/download
  • “Human judgment in planning and estimation of software projects,” https://simula.no/file/simulasimula886pdf/download
  • “Guideline for Sizing Agile Projects with COSMIC,” Sylvie Trudel and Luigi Buglione, http://cosmic-sizing.org/publications/guideline-for-sizing-agile-projects-with-cosmic
  • “The COSMIC Functional Size Measurement Method, Version 3.0.1, Guideline for the use of COSMIC FSM to manage Agile projects, VERSION 1.0,” September 2011, http://cosmic-sizing.org/cosmic-method-v3-0-1-agile-projects-guideline-v1-0/
  • “Using the COSMIC Method to Evaluate the Quality of the Documentation of Agile User Stories,” Jean-Marc Desharnais, Buğra Kocatürk, and Alain Abran, Proceedings of the 12th International Conference on Product Focused Software Development and Process Improvement, Pages 68–73.
  • “An Empirical Study of Using Planning Poker for User Story Estimation,” Nils C. Haugen, Proceedings of AGILE 2006 Conference (AGILE’06).
  • “A Framework for the Analysis of Software Cost Estimation Accuracy,” Stein Grimstad and Magne Jørgensen, ISESE'06, September 21–22, 2006.
  • “Software Effort Estimation: Unstructured Group Discussion as a Method to Reduce Individual Biases,” Kjetil Moløkken and Magne Jørgensen, Incremental and Component-Based Software Development October 2003, University of Oslo.
  • “A Case Study on Agile Estimating and Planning using Scrum,” V. Mahnic, Electronics And Electrical Engineering, No. 5(111).
  • “A Collective Study of PCA and Neural Network based on COCOMO for Software Cost Estimation,” Rina M. Waghmode, L.V. Patil, and S.D Joshi, International Journal of Computer Applications (0975 – 8887) Volume 74– No. 16, July 2013.
  • “iUCP: Estimating Interactive-Software Project Size with Enhanced Use-Case Points,” Nuno Jardim Nunes, Larry Constantine, and Rick Kazman, IEEE Software, Issue No. 04 - July/August (2011 vol. 28)
  • “Estimating Software Project Effort Using Analogies,” Martin Shepperd and Chris Schofield, IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, VOL. 23, NO. 12, NOVEMBER 1997.
  • “Software Engineering Economics,” Barry W. Boehm, Software Information Systems Division, TRW Defense Systems Group, Redondo Beach, CA 90278
  • “Adapting, Correcting, and Perfecting Software Estimates: A Maintenance Metaphor,” Tarek K. Abdel-Hamid, IEEE Computer, March 1993.
  • “Estimating software projects,” R. Agarwal, Manish Kumar t, Yogesh, S. Mallick, RM. Bharadwaj, D. Anantwar, ACM Software Engineering Notes Vol 26 No. 4 July 2001, Pg. 6.
  • “Cost estimation in agile development projects,” Siobhan Keaveney and Kieran Conboy, Proceedings of the Fourteenth European Conference on Information Systems, ECIS 2006, Göteborg, Sweden, 2006.
  • “Managing Uncertainty in Agile Release Planning,” K. McDaid, D. Greer, F. Keenan, P. Prior, P. Taylor, G. Coleman, Proceedings of the Eighteenth International Conference on Software Engineering & Knowledge Engineering (SEKE'2006).
  • “Research Challenges of Agile Estimation,” Rashmi Popli, Dr. Naresh Chauhan, International Journal of IT & Knowledge Management, Volume 7 • Number 1 • December 2013 pp. 108-111 (ISSN 0973-4414),
  • “Agile Software Development in Large Organizations,” Mikael Lindvall, Dirk Muthig, Aldo Dagnino, Christina Wallin, Michael Stupperich, David Kiefer, John May, and Tuomo Kähkönen, IEEE Computer, December 2004.
  • “Adoption of Team Estimation in a Specialist Organizational Environment,” Tor Erlend Fægri, Lecture Notes in Business Information Processing, June 2010.
  • “The Relationship between Customer Collaboration and Software Project Overruns,” Kjetil Moløkken-Østvold and Kristian Marius Furulund, IEEE Agile Conference, 13-17 August, 2007.
  • “Allowing for Task Uncertainties and Dependencies in Agile Release Planning,” Kevin Logue, Kevin McDaid, and Des Greer, Proceedings Software Measurement European Forum (SMEF).
  • “Fundamental uncertainties in projects and the scope of project management,” Roger Atkinson , Lynn Crawford, and Stephen Ward, International Journal of Project Management 24 (2006) 687–698.
  • “Improving estimation accuracy by using Case Based Reasoning and a combined estimation approach,” Srinivasa Gopal and Meenakshi D’Souza, Proceedings of ISEC '12, Feb. 22-25, 2012.
  • “Effort Estimation in Agile Software Development: A Systematic Literature Review,” Muhammad Usman , Emilia Mendes , Francila Weidt , and Ricardo Britto, Proceedings of the 10th International Conference on Predictive Models in Software Engineering, 2014, Pages 82-91
  • “Incremental effort prediction models in Agile Development using Radial Basis Functions,” Raimund Moser, Witold Pedrycz, and Giancarlo Succi, SEKE 2007.
  • “Applying Combined Efforts of Resource Capability of Project Teams for Planning and Managing Contingency Reserves for Software and Information Engineering Projects,” Peter H. Chang, GSTF Journal on Computing (JoC) Vol. 2 No. 3, October 2012.
  • Evidence-Based Software Engineering and Systematic Reviews, Barbara Ann Kitchenham, David Budgen and Pearl Brereton, November 5, 2015
  • “The Signal and the Noise in Cost Estimating,” Christian B. Smart, Ph.D., 2016 International Training Symposium, Bristol England, 2016.
  • Estimating Software Intensive Systems: Project, Products, and Processes, Richard Stutzke, Addison Wesley.
  • Estimating Software Costs: Bringing Realism to Estimating, 2nd Edition, Capers Jones, McGraw Hill
  • Software Estimation: Demystifying the Black Art, Steve McConnell, Microsoft Press.
  • Software Metrics: A Rigorous and Practical Approach, 3rd Edition, Norman Fenton and James Bieman, CRC Press.
  • Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul R. Garvey, CRC Press.
  • Forecasting and Simulating Software Development Projects: Effective Modeling of Kanban & Scrum Projects using Monte-Carlo Simulation, Troy Magennis, CreateSpace Independent Publishing Platform, October 25, 2011.
  • “Effort estimation for Agile Software Development Projects,” Andreas Schmietendorf, Martin Kunz, Reiner Dumke, Proceedings 5th Software Measurement European Forum, Milan 2008.
  • “The QUELCE Method: Using Change Drivers To Estimate Program Costs,” Sarah Sheard, April 2016, Software Engineering Institute.
  • “Software Cost and Schedule Estimating: A Process Improvement Initiative,” Robert Park, Wolfhart Goethert, J. Todd Webb, Special Report CMU/SEI-94-SR-3 May 1994.
  • “Organizational Considerations for the Estimating Process,” Bob Ferguson, Software Engineering Institute, November, 2004.
  • “A Parametric Analysis of Project Management Performance to Enhance Software Development Process,” N. R. Shashikumar, T. R. Gopalakrishnan Nair, Suma V, IEEE International Conference on Advanced Research in Engineering and Technology (ICARET - 2013)
  • “Checklists and Criteria for Evaluating the Cost and Schedule Estimating Capabilities of Software Organizations,” Robert E. Park, CMU/SEI-95-SR-005
  • “A Manager's Checklist for Validating Software Cost and Schedule Estimates,” Robert E. Park, Special Report CMU/SEI-95-SR-004 January 1995.
  • “ACE: Accurate Confident Estimating,” Team Software Process (TSP) Symposium, November 4, 2014, Pittsburgh, PA, SEI Carnegie Mellon University.
  • How to Lie with Statistics, Darrell Huff, W. W. Norton, 1954
  • “A Simulation and Evaluation of Earned Value Metrics to Forecast the Project Duration,” Mario Vanhoucke and Stephan Vandevoorde, The Journal of the Operational Research Society, 58, No. 10 (Oct., 2007), pp. 1361-1374
  • “Avoid Software Project Horror Stories: Check the Reality Value of the Estimate First!”, Harold van Heeringen, ICEAA 2014.
  • COSMIC: Guideline for Sizing Business Software, Version 3, http://www.etsmtl.ca/Unites-de-recherche/GELOG/accueil
  • “Factors Affecting Duration And Effort Estimation Errors In Software Development Projects,” Ofer Morgenshtern, Tzvi Raz, and Dov Dvir, Working Paper No 8/2005, Henry Crown Institute of Business Research, Israel. http://recanati-bs.tau.ac.il/Eng/?CategoryID=444&ArticleID=747
  • “An Empirical Validation of Software Cost Estimation Models,” Chris F. Kemerer, Research Contributions, Management of Computing, Communications of the ACM , May 1987, Volume 30, Number 5.
  • “A Decision Support System To Choose Optimal Release Cycle Length In Incremental Software Development Environments,” Avnish Chandra Suman, Saraswati Mishra, and Abhinav Anand, International Journal of Software Engineering & Applications (IJSEA), 7, No.5, September 2016.
  • “Protecting Software Development Projects Against Underestimation,” Eduardo Miranda, Alain Abran, École de technologie supérieure – Université du Québec, http://mse.isri.cmu.edu/software-engineering/documents/faculty-publications/miranda/mirandaprotectingprojectsagainstunderestimations.pdf
  • “Improving Subjective Estimates Using Paired Comparisons,” Eduardo Miranda, IEEE Software, January/February 2001.
  • “Improving Estimations in Agile Projects: Issues and Avenues,” Luigi Buglione, Alain Abran, Software Measurement European Forum – SMEF2007, Rome (Italy), May 8-11, 2007.
  • “Estimation of Software Development Effort from Requirements Based Complexity,” Ashish Sharma , Dharmender Singh Kushwaha, 2nd International Conference on Computer, Communication, Control and Information Technology (C3IT 2012), February 25 - 26, 2012
  • “Estimating the Test Volume and Effort for Testing and Verification & Validation,” Alain Abran, Juan Garbajosa, and Laila Cheikhi, First Annual ESA Workshop on Spacecraft Data Systems and Software - SDSS 2005, ESTEC, Noordwijk, European Space Agency, Netherlands, 17-20 October 2005.
  • “A General Empirical Solution to the Macro Software Sizing and Estimating Problem,” Lawrence H. Putnam, IEEE Transactions On Software Engineering, VOL. SE-4, NO. 4, JULY 1978.
  • “A Comparison of Software Cost, Duration, and Quality for Waterfall vs. Iterative and Incremental Development: A Systematic Review,” Susan M. Mitchell and Carolyn B. Seaman, Third International Symposium on Empirical Software Engineering and Measurement, 2009.
  • Software Sizing and Estimating: MK II FPA, Charles R. Symons, John Wiley and Sons, 1991
  • “A Review of Surveys on Software Effort Estimation,” Kjetil Moløkken and Magne Jørgensen, Proceedings of the International Symposium on Empirical Software Engineering, ISESE '03.
  • “Accurate Estimates Without Local Data?” Tim Menzies, Steve Williams, Oussama Elrawas, Daniel Baker, Barry Boehm, Jairus Hihn, Karen Lum, and Ray Madachy, Software Process Improvement And Practice, (2009).
  • “An Assessment and Comparison of Common Software Cost Estimation Modeling Techniques,” Lionel C. Briand, Khaled El Emam, Dagmar Surmann, Isabella Wieczorek, and Katrina D. Maxwell, Proceedings of the 21st international conference on Software engineering, Pg. 313-322
  • “The Probable Lowest-Cost Alternative According to Borda,” Neal D. Hulkower, Journal of Cost Analysis and Parametrics, 3:2, 29-36
  • “An Efficient Approach for Agile Web Based Project Estimation: AgileMOW,” Ratnesh Litoriya and Abhay Kothari, Journal of Software Engineering and Applications, 2013, 6, 297-303.
  • “Corad Agile Method for Agile Software Cost Estimation,” Govind Singh Rajput and Ratnesh Litoriya, http://dx.doi.org/10.4236/oalib.1100579
  • “A Baseline Model for Software Effort Estimation,” Peter A. Whigham, Caitlin A. Owen, and Stephen G. MacDonell, ACM Transaction on Software Engineering Methodology, 24, 3, Article 20 (May 2015).
  • Agile Product Management: Agile Estimating & Planning Your Sprint with Scrum and Release Planning 21 Steps, Paul Vii, Create Space, 2016
  • “Core Estimating Concepts,” William Roetzheim, CrossTalk: The Journal of Defense Software Engineering—January/February 2013.
  • “A Practical Approach to Size Estimation of Embedded Software Components,” Kenneth Lind and Rogardt Heldal, IEEE Transactions On Software Engineering, Vol. 38, No. 5, September/October 2012.
  • “A Probabilistic Model for Predicting Software Development Effort,” Parag C. Pendharkar, Girish H. Subramanian, and James A. Rodger, IEEE Transactions On Software Engineering, Vol. 31, No. 7, July 2005.
  • “A Pattern Language for Estimating,” Dmitry Nikelshpur, PLoP '11 Proceedings of the 18th Conference on Pattern Languages of Programs, Article No. 17.
  • “Do Estimators Learn? On the Effect of a Positively Skewed Distribution of Effort Data on Software Portfolio Productivity,” Hennie Huijgens and Frank Vogelezang, 7th International Workshop on Emerging Trends in Software Metrics, 2016.
  • “The Inaccurate Conception: Some thoughts on the accuracy of estimates,” Phillip G. Armour, Communications Of The ACM, March 2008/Vol. 51, No. 3
  • “Understanding Software Project Estimates,” Katherine Baxter, Cross Talk The Journal of Defense Software Engineering, March/April 2009,
  • “Validation Methods for Calibrating Software Effort Models,” Tim Menzies, Dan Port, Zhihao Chen, and Jairus Hihn, May 15–21, 2005, http://menzies.us/pdf/04coconut.pdf
  • “Requirements Instability in Cost Estimation,” Abiha Batool, Sabika Batool, and Mohammad Ayub Latif, https://www.academia.edu/4493828/Requirements_Instability_in_Cost_Estimation
  • “Negative Results for Software Effort Estimation,” Tim Menzies, Ye Yang, George Mathew, Barry Boehm, Jairus Hihn, EMSE 2016.
  • “Creating Requirements-Based Estimates Before Requirements Are Complete,” Carol A. Dekkers, Cross Talk The Journal of Defense Software Engineering, April 2005.
  • “Rational Cost Estimation of Dedicated Software Systems,” Beata Czarnacka-Chrobot, Journal of Software Engineering and Applications, 2012, 5, 262-269.
  • “Summarization of Software Cost Estimation,” Xiaotie Qina and Miao Fang, Advances in Control Engineering and Information Science, 2011
  • “Software Project Development Cost Estimation,” Barbara Kitchenham, The Journal of Systems and Software 3, 267-278 (1985).
  • “Cost Estimation in Agile Software Development Projects,” Michael Lang, Kieran Conboy and Siobhán Keaveney, International Conference on Information Systems Development (ISD2011) Prato, Italy.
  • “Project Estimating and Scheduling,” Terry Boult, University of Colorado, Colorado Springs, CS 330 Software Engineering.
  • “Practical Guidelines for Expert-Judgment-Based Software Effort Estimation,” Magne Jørgensen, IEEE Software, May/June 2005.
  • “Predicting Software Projects Cost Estimation Based on Mining Historical Data,” Hassan Najadat, Izzat Alsmadi, and Yazan Shboul, ISRN Software Engineering, Volume 2012, Article ID 823437.
  • “Models for Improving Software System Size Estimates during Development,” William W. Agresti, William M. Evanco, William M. Thomas, Journal of Software Engineering & Applications, 2010, 3: 1-10.
  • “Requirements Engineering for Agile Methods,” Alberto Sillitti and Giancarlo Succi, in Engineering and Managing Software Requirements, pp. 306-326, Springer, 2005.
  • “A Method for Improving Developers’ Software Size Estimates,” Lawrence H. Putnam, Douglas T. Putnam, and Donald M. Beckett, Cross Talk The Journal of Defense Software Engineering, April 2005.
  • “PERT, CPM, and Agile Project Management,” Robert C. Martin 5 October 2003, http://www.codejournal.com/public/cj/previews/PERTCPMAGILE.pdf
  • “Reliability and accuracy of the estimation process Wideband Delphi vs. Wisdom of Crowds,” Marek Grzegorz Stochel, 35th IEEE Annual Computer Software and Applications Conference, 2011
  • “Predicting development effort from user stories,” P. Abrahamsson, I. Fronza, R. Moser, J. Vlasenko, and W. Pedrycz, International Symposium on Empirical Software Engineering and Measurement, 2011
  • “Effort prediction in iterative software development processes - incremental versus global prediction models,” Pekka Abrahamsson, Raimund Moser, Witold Pedrycz, Alberto Sillitti, Giancarlo Succi, First International Symposium on Empirical Software Engineering and Measurement, 2007.
  • “Planning Poker or How to avoid analysis paralysis while release planning,” James Grenning, https://wingman-sw.com/papers/PlanningPoker-v1.1.pdf
  • “Agile Estimation using CAEA: A Comparative Study of Agile Projects,” Shilpa Bhalerao , Maya Ingle, 2009 International Conference on Computer Engineering and Applications IPCSIT, Vol.2 (2011).
  • “A Bayesian approach to improve estimate at completion in earned value management,” Franco Caron, Fabrizio Ruggeri, and Alessandro Merli, Project Management Institute Journal, Vol. 44, No. 1, pp. 3-16. 2013.
  • “An Empirical Approach for Estimation of the Software Development Effort,” Amit Kumar Jakhar and Kumar Rajnish, International Journal of Multimedia and Ubiquitous Engineering, 10, No. 2 (2015), pp. 97-110.
  • “Forecasting of Software Development Work Effort: Evidence on Expert Judgment and Formal Models,” Magne Jørgensen, International Journal of Forecasting, 2007
  • Software Project Effort Estimation Foundations and Best Practice Guidelines for Success, Adam Trendowicz and Ross Jeffery, Springer 2014.
  • “Agile Release Planning: Dealing with Uncertainty in Development Time and Business Value,” Kevin Logue and Kevin McDaid, 15th Annual IEEE International Conference and Workshop on the Engineering of Computer Based Systems, March 31 – April 4, 2008.
  • “Why Are Estimates Always Wrong: Estimation Bias and Strategic Misestimation,” Daniel D. Galorath, AIAA SPACE 2015 Conference and Exposition Pasadena, California, 2015.
  • “Estimation of Project Size Using User Stories,” Murad Ali, Zubair A Shaikh , Eaman Ali, International Conference on Recent Advances in Computer Systems (RACS 2015).
  • “A Survey of Agile Software Estimation Methods,” Hala Hamad Osman and Mohamed Elhafiz Musa, International Journal of Computer Science and Telecommunications, Volume 7, Issue 3, March 2016.
  • “Why Can’t People Estimate: Estimation Bias and Mitigation,” Dan Galorath, IEEE Software Technology Conference, October 12-15, 2015, Hilton Hotel, Long Beach, California.
  • “Cost-Effective Supervised Learning Models for Software Effort Estimation in Agile Environments,” Kayhan Moharreri, Alhad Vinayak Sapre, Jayashree Ramanathan, and Rajiv Ramnath, 40th Annual IEEE Computer Software and Applications Conference (COMPSAC), 2016
  • “Project Duration Forecasting Using Earned Value Method and Time Series,” Khandare Manish A., Vyas Gayatri S., International Journal of Engineering and Innovative Technology (IJEIT) Volume 1, Issue 4, April 2012.
  • “Integrating Risk Assessment and Actual Performance for Probabilistic Project Cost Forecasting: A Second Moment Bayesian Model,” Byung-Cheol Kim, IEEE Transactions On Engineering Management, Vol. 62, No. 2, May 2015.
  • “A study of project selection and feature weighting for analogy based software cost estimation,” Y.F. Li , M. Xie, and T.N. Goh, The Journal of Systems and Software 82 (2009) 241–252.
  • “Complementing Measurements and Real Options Concepts to Support InterSprint Decision-Making in Agile Projects,” Zornitza Racheva , Maya Daneva, Luigi Buglione, 34th Euromicro Conference Software Engineering and Advanced Applications
  • “Software Cost Estimation and Sizing Methods: Issues and Guidelines,” Shari Lawrence Pfleeger, Felicia Wu, and Rosalind Lewis, RAND Corporation, Project Air Force, 2005.
  • “Anchoring and Adjustment in Software Estimation,” Jorge Aranda and Steve Easterbrook, ESEC-FSE’05, September 5–9, 2005, Lisbon, Portugal.
  • “Cycle Time Analytics: Making decisions using lead time and cycle time to avoid needing estimates for every story,” Troy Magennis, LKCE 2013, Modern Management Methods.
  • “Probabilistic Forecasting Decision Making: When Do You Want it?” Larry Maccherone, http://www.hanssamios.com/dokuwiki/_media/larry-maccherone-probabilistic-decision-making.pdf
  • “Software Project Planning for Robustness and Completion Time in the Presence of Uncertainty using Multi Objective Search Based Software Engineering,” Stefan Gueorguiev, Mark Harman, and Giuliano Antoniol, GECCO’09, July 8–12, 2009, Montréal, Québec, Canada.
  • “Empirical Validation of Neural Network Models for Agile Software Effort Estimation based on Story Points,” Aditi Panda, Shashank Mouli Satapathy, and Santanu Kumar Rath, 3rd International Conference on Recent Trends in Computing 2015 (ICRTC-2015).
  • “When 90% Confidence Intervals are 50% Certain: On the Credibility of Credible Intervals,” Karl Halvor Teigen and Magne Jørgensen, Applied Cognitive Psychology, 19: 455–475 (2005)
  • “Scaling Agile Estimation Methods with a Parametric Cost Model,” Carl Friedrich Kreß, Oliver Hummel, Mahmudul Huq, ICSEA 2014 : The Ninth International Conference on Software Engineering Advances, 2014
  • “Expert Estimation and Historical Data: An Empirical Study,” Gabriela Robiolo, Silvana Santos, and Bibiana Rossi, ICSEA 2013 : The Eighth International Conference on Software Engineering Advances
  • “Agile Monitoring Using The Line Of Balance,” Eduardo Miranda, Institute for Software Research – Carnegie-Mellon University, and Pierre Bourque, École de technologie supérieure – Université du Québec.
  • “Managerial Decision Making Under Risk and Uncertainty,” Ari Riabacke, IAENG International Journal of Computer Science, 32:4, IJCS_32_4_12
  • “Simple Method Proposal for Cost Estimation from Work Breakdown Structure,” SĂŠrgio Sequeira and Eurico Lopes, Conference on ENTERprise Information Systems / International Conference on Project Management/ Conference on Health and Social Care Information Systems and Technologies, CENTERIS / ProjMAN / HCist 2015 October 7-9, 2015.
  • “From Nobel Prize to Project Management: Getting Risks Right,” Bent Flyvbjerg, Aalborg University, Denmark, Project Management Journal, vol. 37, no. 3, August 2006, pp. 5-15.
  • “The Uncertainty Principle in Software Engineering,” Hadar Ziv and Debra Richardson, ICSE ’97, 19th International Conference on Software Engineering, Boston MA, 17-23 May 1997.
  • “Analyzing Software Effort Estimation using k means Clustered Regression Approach,” Geeta Nagpal, Moin Uddin, and Arvinder Kaur, ACM SIGSOFT Software Engineering Notes, January 2013 Volume 38 Number 1.
  • “Assuring Software Cost Estimates: Is it an Oxymoron?,” Jairus Hihn and Grant Tregre, Goddard Space Flight Center, 2013 46th Hawaii International Conference on System Sciences.
  • “How Does NASA Estimate Software Cost? Summary Findings and Recommendations,” Jairus Hihn, Lisa VanderAar, Manuel Maldonado, Pedro Martinez, Grant Tregre, NASA Cost Symposium, OCE Software Working Group, August 27-29, 2013.
  • “Calibrating Software Cost Models Using Bayesian Analysis,” Sunita Chulani, Barry Boehm, Bert Steece, USC-CSE 1998
  • “Software Project and Quality Modelling Using Bayesian Networks,” Norman Fenton, Peter Hearty, Martin Neil, and Łukasz Radliński, Elsevier, November 2013.
  • “Software Project Level Estimation Model Framework based on Bayesian Belief Networks,” Hao Wang, Fei Peng, and Chao Zhang, 2006 Sixth International Conference on Quality Software (QSIC'06), 27-28 October, 2006.
  • “Using Bayesian Belief Networks to Model Software Project Management Antipatterns,” Dimitrios Settas, Stamatia Bibi, Panagiotis Sfetsos, Ioannis Stamelos, and Vassilis Gerogiannis, Software Engineering Research, Management and Applications, ACIS International Conference on (2006), Seattle, Washington, Aug. 9, 2006 to Aug. 11, 2006.
  • “A Survey Of Bayesian Net Models For Software Development Effort Prediction,” Lukasz Radlinski, International Journal Of Software Engineering And Computing, Vol. 2, No. 2, July-December 2010
  • “Ten Unmyths of Project Estimation: Reconsidering some commonly accepted project management practices,” Phillip Armour, Communications of the ACM, November 2002, Vol. 45, No. 11.
  • “Using Earned Value Data to Forecast the Duration and Cost of DoD Space Programs,” Capt. Shedrick Bridgeforth Air Force Cost Analysis Agency (AFCAA).
  • “Enterprise Agility—What Is It and What Fuels It?,” Rick Dove, in Utility Agility - What Is It and What Fuels It? - Part 2, 10/24/2009.
  • “Scrum Metrics for Hyperproductive Teams: How They Fly like Fighter Aircraft,” Scott Downey and Jeff Sutherland, HICSS '13 Proceedings of the 2013 46th Hawaii International Conference on System Sciences, Pages 4870-4878, IEEE Computer Society
  • Software Project Estimation: The Fundamentals for Providing High Quality Information to Decision Makers, Alain Abran, Wiley-IEEE, 6 April 2016
  • Practical Software Project Estimation 3rd Edition, Peter Hill, McGraw-Hill Education, 2010
  • “Engaging Software Estimation Education using LEGOs: A Case Study,” Linda Laird and Ye Yang, IEEE/ACM 38th IEEE International Conference on Software Engineering Companion, 2016
  • “Software Estimation – A Fuzzy Approach,” Nonika Bajaj, Alok Tyagi and Rakesh Agarwal, ACM SIGSOFT Software Engineering Notes, Page 1 May 2006 Volume 31 Number 3.
  • “Limits to Software Estimation,” J. P. Lewis, ACM SIGSOFT Software Engineering Notes Vol. 26 No. 4, July 2001, Page 54.
  • “Recent Advances in Software Estimation Techniques,” Richard E. Fairley, ICSE '92: Proceedings of the 14th International Conference on Software Engineering, May 1992.
  • “Software Estimation Using the SLIM Tool,” Nikki Panlilio-Yap, Proceedings of the 1992 conference of the Centre for Advanced Studies on Collaborative research - Volume 1, CASCON '92.
  • “An Approach for Software Cost Estimation,” Violeta Bozhikova and Mariana Stoeva, International Conference on Computer Systems and Technologies - CompSysTech’10.
  • “Estimating Software -Intensive Projects in the Absence of Historical Data,” Aldo Dagnino, ICSE '13: Proceedings of the 2013 International Conference on Software Engineering, May 2013.
  • “A Framework for Software Project Estimation Based on COSMIC, DSM and Rework Characterization,” Sharareh Afsharian, Marco Giacomobono, and Paola Inverardi, BIPI’08, May 13, 2008.
  • “Software Intensive Systems Cost and Schedule Estimation Technical Report SERC-2013-TR-032-2,” Stevens Institute of Technology, Systems Engineering Research Center, 31 June 2013.
  • “Improved Size And Effort Estimation Models For Software Maintenance,” Vu Nguyen, University of Southern California, December 2010.
  • “Probabilistic Estimation of Software Project Duration,” Andy M. Connor, New Zealand Journal of Applied Computing & Information Technology, 11(1), 11-22, 2007.
  • “Application of Sizing Estimation Techniques for Business Critical Software Project Management,” Parvez Mahmood Khan and M.M. Sufyan Beg, International Journal of Soft Computing And Software Engineering (JSCSE), 3, No. 6, 2013
  • “Double Whammy – How ICT Projects are Fooled by Randomness and Screwed by Political Intent,” Alexander Budzier and Bent Flyvbjerg, Saïd Business School working papers, August 2011.
  • “Software Cost Estimation Framework for Service-Oriented Architecture Systems using Divide-and-Conquer Approach,” Zheng Li and Jacky Keung, Proceedings of the 5th International Symposium on Service-Oriented System Engineering (SOSE 2010), pp. 47-54, Nanjing, China, June 4-5, 2010.
  • “Investigating Effort Prediction Of Software Projects On The ISBSG Dataset,” Sanaa Elyassami and Ali Idri, International Journal of Artificial Intelligence & Applications (IJAIA), Vol. 3, No. 2, March 2012.
  • “Resource Estimation in Software Engineering,” Lionel C. Briand, Encyclopedia of Software Engineering, John Wiley and Sons, 2001.
  • Software Measurement and Estimation: A Practical Approach, Linda M. Laird and M. Carol Brennan, Wiley-IEEE Computer Society Press, June 2006
  • CECS 543/643 Advanced Software Engineering, Dar-Biau Liu, California State University, Long Beach, Spring 2012
  • “Software Cost Estimation Review,” Alphonce Omondo Ongere, Helsinki Metropolia University of Applied Sciences, 30 May 2013.
  • “Software estimation process: a comparison of the estimation practice between Norway and Spain,” Paul Salaberria, 1 December 2014, Universitetet i Bergen.
  • “Comparison of available Methods to Estimate Effort, Performance and Cost with the Proposed Method,” M. Pauline, Dr. P. Aruna, Dr. B. Shadaksharappa, International Journal of Engineering Inventions, Volume 2, Issue 9 (May 2013), pp. 55-68
  • “Applying Fuzzy ID3 Decision Tree for Software Effort Estimation,” Ali Idri and Sanaa Elyassam, IJCSI International Journal of Computer Science Issues, Vol. 8, Issue 4, No 1, July 2011.
  • “Software Effort Estimation with Ridge Regression and Evolutionary Attribute Selection,” Efi Papatheocharous, Harris Papadopoulos and Andreas S. Andreou, 3rd Artificial Intelligence Techniques in Software Engineering Workshop, 7 October, 2010.
  • “A Comparison of Different Project Duration Forecasting Methods Using Earned Value Metrics,” Stephan Vandevoorde and Mario Vanhoucke, International Journal of Project Management, 24 (2006), 289-302
  • “Quantifying IT Forecast Quality,” J. L. Eveleens and C. Verhoef, Science of Computer Programming, Volume 74, Issues 11–12, November 2009, Pages 934-988.
  • “Predictive Modeling: Principles and Practices,” Rick Hefner, Dean Caccavo, Philip Paul, and Rasheed Baqai, NDIA Systems Engineering Conference, pp. 20-23 October 2008.
  • “Modeling, Simulation & Data Mining: Answering Tough Cost, Date & Staff Forecasts Questions,” Troy Magennis and Larry Maccherone, Agile 2012, 13-17 August 2012, Dallas, Texas.
  • How to Measure Anything, Douglas Hubbard, John Wiley and Sons, 2014.
  • All About Agile: Agile Management Made Easy!, Kelly Waters, CreateSpace Independent Publishing Platform, March 19, 2012.
  • “Enhance Accuracy In Software Cost And Schedule Estimation By Using 'Uncertainty Analysis And Assessment’ In The System Modeling Process,” Kardile Vilas Vasantrao, International Journal of Research & Innovation in Computer Engineering, Vol 1, Issue 1, (6-18), August 2011.
  • “Estimating Perspectives,” Richard D. Stutzke, 20th International COCOMO and Software Cost Modeling Forum, Los Angeles, 25-28 October 2005.
  • “Introduction to Systems Cost Uncertainty Analysis: An Engineering Systems Perspective,” Paul R. Garvey, National Institute of Aerospace (NIA) Systems Analysis & Concepts Directorate NASA Langley Research Center, 2 May 2006.
  • “Cost and Schedule Uncertainty Analysis of Growth in Support of JCL,” Darren Elliot and Charles Hunt, 2014 NASA Cost Symposium, 13 August 2014.
  • “Measurement of Software Size: Contributions of COSMIC to Estimation Improvements,” Alain Abran, Charles Symons, Christof Ebert, Frank Vogelezang, and Hassan Soubra, ICEAA International Training Symposium, Bristol England, 2016.
  • “What Does a Mature Cost Engineering Organization Look Like?” Dale Shermon, International Cost Estimating and Analysis Association (ICEAA) 2016 18th to 20th October 2016.
  • “A Hybrid Model for Estimating Software Project Effort from Use Case Points,” Mohammad Azzeh and Ali Bou Nassif, Applied Soft Computing journal, Elsevier
  • “A Deep Learning Model for Estimating Story Points,” Morakot Choetkiertikul, Hoa Khanh Dam, Truyen Tran, Trang Pham, Aditya Ghose, and Tim Menzies,
  • “A Hybrid Intelligent Model for Software Cost Estimation,” Wei Lin Du, Luiz Fernando Capretz, Ali Bou Nassif, Danny Ho, Journal of Computer Science, 9(11):1506-1513, 2013
  • “An Empirical Analysis of Task Allocation in Scrum-based Agile Programming,” Jun Lin, Han Yu, Zhiqi Shen
  • “Agile Planning & Metrics That Matter,” Sally Elatta, Agile Transformation for Government.
  • Introduction to Uncertainty Quantification, J. Sullivan, Springer, 2016
  • Generalized Estimating Equations, 2nd Edition, James W. Hardin, Chapman and Hall, 2012.
  • Software Estimation Best Practices, Tools & Techniques: A Complete Guide for Software Project Estimators, Murali K. Chemuturi
  • “#NoEstimates, But #YesMeasurements: Why Shouldn’t agile teams waste their time and effort in estimating,” Pekka Forselius, ISBSG IT Confidence Conference, 2016
  • “Agile Benchmarks: What Can You Conclude?” Don Reifer, ISBSG IT Confidence Conference, 2016.
  • “Improve Estimation Maturity using Functional Size Measurement and Industry Data,” Drs. Harold van Heeringen, ISBSG IT Confidence Conference, 2016
  • “Why Can’t People Estimate: Estimation Bias and Mitigation,” Dan Galorath, ISBSG IT Confidence Conference, 2015
  • “Why Can’t People Estimate: Estimation Bias and Strategic Mis-Estimation,” Daniel D. Galorath, ISBSG IT Confidence Conference, 2014
  • “Estimation ‒ Next Level,” Ton Dekkers, ISBSG IT Confidence Conference, 2013.
  • “Are We Really That Bad? A Look At Software Estimation Accuracy,” Peter R. Hill, ISBSG IT Confidence Conference, 2013.
  • Hybrid Approach For Estimating Software Project Development Effort: Designing Software Cost Estimation Model, Ketema Kifle, LAP LAMBERT Academic Publishing, June 2, 2016.
  • A Hedonic Approach to Estimating Software Cost Using Ordinary Least Squares Regression and Nominal Attribute Variables, Marc Ellis, BiblioScholar, 2012.
  • “The Evaluation of Well-known Effort Estimation Models based on Predictive Accuracy Indicators,” Khalid Khan, School of Computing Blekinge Institute of Technology, Sweden
  • Uncertainty Quantification: Theory, Implementation, and Applications, Ralph Smith, SIAM-Society for Industrial and Applied Mathematics, December 2, 2013.
  • “A Knowledge and Analytics-Based Framework and Model for Forecasting Program Schedule Performance,” Kevin T. Knudsen and Mark Blackburn, Complex Adaptive Systems, Los Angeles, CA, 2016
  • “Managing Project Uncertainty: From Variation to Chaos,” Arnoud de Meyer, Christoph Loch, and Michael Pich, IEEE Engineering Management Review 43(3), December 2002
  • “Project Uncertainty and Management Style,” C. H. Loch, M. T. Pich, and A. De Meyer, 2000/31/TM/CIMSO 10
  • “Effects of Feature Complexity on Software Estimates ‒ An Exploratory Study,” Ana Magazinius and Richard Berntsson Svensson, 40th Euromicro Conference on Software Engineering and Advanced Applications, 2014
  • “Project Management Under Risk: Using the Real Options Approach to Evaluate Flexibility in R&D,” Arnd Huchzermeier and Christoph Loch, INSEAD
Categories: Project Management

Why Johnny Can't Estimate?

Fri, 11/04/2016 - 16:00

I work in a domain where engineered systems are developed for complex software-intensive systems of systems. These systems are engineered with a variety of development methods, ranging from traditional to agile and combinations in between. They are developed for external and internal customers. Some are products in their own right; some are embedded in larger missions.

In all cases, we start our work with ...

Risk Management is How Adults Manage Projects - Tim Lister

Since all risk comes from uncertainty - reducible (Epistemic) and irreducible (Aleatory) - estimating is a foundation of all we do. In this domain there is no discussion of the conjecture that estimates are a waste, that estimates can't be done, that estimates are evil, or that estimates must be stopped immediately. That would be like saying risk management is a waste, control system engineering is a waste, thermal analysis of the computer system is a waste, or that assessment of reliability, repairability, survivability - all the ...ilities - is a waste.

In our domain of engineered systems there is a broad range of problems, complex issues, and approaches to solving them. Here is a familiar example of that range...

 

Paradigm of agile project management from Glen Alleman

To be informed about how to estimate across this broad range of domains, problems, and impacts, education, experience, and skill are needed. This usually starts with education. Mine is in hard science and systems engineering. Some colleagues have engineering or finance educations. During this education we have all encountered texts that lay out the principles and processes needed to do our job and deliver the needed capabilities for the needed cost on the needed date to accomplish the needed mission. Here are a few that serve us well.

Estimating SW Costs

This book provides a clear, complete understanding of how to estimate software costs, schedule, and quality using real-world information. 

This includes planning for and executing the project with phase- and activity-level cost estimates. How to estimate regression, component, integration, and stress tests for the software. How to compensate for inaccuracies in data collection, calculation, and analysis.

How to test the design principles and operational characteristics of the product using prototyping. How to handle configuration change, research, quality, and documentation costs.

SW Project Estimation

Software projects are often late and over budget, leading to major problems for customers. There is a serious issue in estimating realistic software project budgets and schedules. Generic models cannot be a reliable source of estimates for complex software projects.

This book presents a number of examples using data collected over years from various organizations that build software. It presents an overview of the International Software Benchmarking Standards Group, which collects data on software projects. This data collection is based on ISO standards for measuring the functional size of software.

Dr. Abran shows how to build estimation models from the data of an organization using statistically sound processes, and how to focus on the quality of the estimation models.

SW Estimating

Often referred to as a black art because of its complexity and uncertainty, software estimation is not as difficult or puzzling as people think.

Generating accurate and precise estimates is straightforward.

This book shows how to estimate schedule and cost, and the functionality that can be delivered within a time frame; how to avoid common estimating mistakes; the estimation techniques needed for specific project activities; and how to apply estimation techniques to any type of project, from small to large.

Estimating SW Intensive Systems

Many software projects fail because their leaders don't know how to estimate, schedule, or measure them accurately. Proven tools and techniques exist for every facet of software estimation. This book brings them together in a real-world guidebook to help software managers, engineers, and customers immediately improve their estimates - and drive continuous improvement over time.

 


Software engineering has become procedural and controlled. Agile, like more traditional development methods, is a highly procedural process.

The estimating of the development process still needs maturing. This book provides a concise guide for estimating software development efforts. It shows why accurate estimates are needed, what different estimating methods can be used, and how to analyze risks to make appropriate contingency allowances for the uncertainties encountered on all projects, not just software development projects.

Agile Estimating and Planning

This book is a practical guide to estimating and planning agile projects. 

The book speaks to why conventional planning fails and why agile planning works. It covers:

  • How to estimate feature size using story points and ideal days, and how to use each measure to make decisions.
  • How and when to reestimate.
  • How to prioritize features using financial and technical approaches.
  • How to split large features into smaller features.
  • How to plan iterations and predict the team's initial rate of progress.
  • How to schedule projects that have unusually high uncertainty or schedule-related risk.
  • How to estimate projects that will be worked by multiple teams.
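The iteration-planning mechanics described here can be sketched in a few lines of Python. This is a hedged illustration, not code from the book: the backlog size, the observed velocities, and the best/expected/worst framing are all invented for the example.

```python
import math
import statistics

def forecast_iterations(backlog_points, observed_velocities):
    """Forecast (best, expected, worst) iteration counts to finish a backlog,
    using the max, mean, and min of observed velocities (points/iteration)."""
    best = math.ceil(backlog_points / max(observed_velocities))
    expected = math.ceil(backlog_points / statistics.mean(observed_velocities))
    worst = math.ceil(backlog_points / min(observed_velocities))
    return best, expected, worst

# Invented data: a 190-point backlog and five completed iterations.
velocities = [18, 22, 15, 20, 19]
print(forecast_iterations(190, velocities))  # (9, 11, 13)
```

Even this toy version makes the point of the book's planning chapters: a forecast is a range driven by observed data, not a single number.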

These books are just a small sample of the resources available for estimating. When you hear someone say it's too hard, it can't be done, they've never seen it done, it's a waste, or any of the variety of other reasons - ask: have you read any of these books and found them wanting for your needs?

Categories: Project Management

Without a Root Cause Analysis, No Suggested Fix Can Be Effective

Fri, 10/28/2016 - 16:12

In a recent blog post titled Precision it is suggested that "Precision (or the implication thereof) is perhaps the root problem of most, if not all dysfunction related to estimation."

Yes, projects have uncertainty. Everything has uncertainty; projects are no exception. Estimates have precision and accuracy, but the reasons for these unfavorable outcomes are not stated in the post.

But as the second paragraph of the post says, "The number of people walking around with the natural assumption that someone is to blame whenever an estimate turns out to be wrong is just sad. When you're pushed to deliver a low estimate to secure a bid, and then yelled at for not being able to build the product in time afterwards — you just can't win." This is another example of Bad Management and Doing Stupid Things on Purpose.

Using the chart from the post, showing the number of projects that went over their estimated effort, let's look closer at a process to sort out the conjectures made in the post about estimating.

First, without finding the root cause of the estimating gaps, the quest for a solution is an open loop problem. Yes, there is data showing large variances of actuals versus estimated values.

The question "Why is this the case?" has no answer in the post.

And by the way, this is not the same naive and simple-minded 5 Whys used by #NoEstimates advocates. The actual approach to asking why is shown later in this post. This is like observing a number of people who are overweight and then claiming "being overweight" is the problem for these people. You need to find the Root Cause.

At times I work for the Institute for Defense Analyses, which produces Root Cause Analyses for software-intensive systems of systems. Here's an example: Expeditionary Combat Support System: Root Cause Analysis.

When you find the Root Cause for those projects reported to be "overweight," you may also find the corrective action for the inaccuracies and imprecision of the estimates. Or it may be that there were technical issues on the program that caused the overages.

Research has shown there are four "major" causes of cost and schedule growth in our domain. Estimating is only one of those causes.

[Chart: the four major causes of cost and schedule growth]

Without determining which root cause is the source of the unfavorable performance of the projects shown in the first chart, no suggestion for corrections can be made. 

Start with Root Cause Analysis and only then suggest the reason for the problem. Here's the process used in our domain: http://www.apollorootcause.com/about/apollo-root-cause-analysis-method. Buy the Reality Charting tool. Buy Apollo Root Cause Analysis: Effective Solutions to Everyday Problems Every Time. Download Seven Steps to Effective Problem-Solving And Strategies for Personal Success. Then this approach may be useful in your domain as well.

Then when you hear that estimates are the smell of dysfunction, you'll know that cannot be correct without knowing the Root Cause of that dysfunction. More importantly, you'll know that those suggesting a fix to a problem without a Root Cause for that problem are just treating the symptoms, and the problem will recur over and over - just like the recurring problems with project cost and schedule overruns.

 

Categories: Project Management

Acting Like an Adult in the Presence of Uncertainty

Thu, 10/27/2016 - 18:41

Risk Management is how Adults Manage Projects - Tim Lister

There are two kinds of uncertainty on all projects, no matter the domain, including software development projects: reducible (Epistemic) and irreducible (Aleatory).

[Chart: the two kinds of uncertainty - reducible (Epistemic) and irreducible (Aleatory)]

Both of these drivers of risk will impact the probability of success of projects.

[Chart: how these drivers of risk impact the probability of project success]

Decision-Making in the Presence of Uncertainty

Decision-making is the process of choosing among alternatives. The nature and context of decisions vary and have a variety of criteria to be considered in making a choice. The alternatives can involve a large number of uncertainties about the factors that influence the ultimate outcomes. For high-consequence decisions with potential outcomes that can have a large impact on valued things, the decision-making process should involve consideration of uncertainties about the outcomes. [1]

For customers paying for the development of software, these things include cost, schedule (the time value of money), market timing, and performance of the product, to name a few.

A decision is a resolution on an issue or a problem of interest being considered. The process of making a decision involves three elements: values, alternatives, and facts (Buede, 2009). [1]

  • Values are criteria used to assess the utility of the outcome. Depending on the context, values can be objective or subjective, and quantitative or qualitative. The values capture the needs, objectives, and preferences of the stakeholders, which are the people and organizations that have an interest and can influence or be influenced by a decision.
  • Alternatives may be given or can be the result of a creative process to generate potential solutions to the problem of interest. The facts are everything known about the alternatives and the context or environment in which the alternatives will be deployed or applied.
  • Facts are the relevant data, information, and knowledge used in the process of assessing the alternatives against the values in order to make the decision. The decision makers use available facts, history, experience, and judgment to select an alternative.
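The three elements above can be reduced to a toy weighted-scoring sketch: values become weighted criteria, and the facts supply each alternative's scores against them. All names, weights, and scores below are invented for illustration; this is not the method from [1].

```python
# Values -> weighted criteria (weights sum to 1); Facts -> scores per alternative.
# Everything here is invented for illustration.
values = {"cost": 0.40, "schedule": 0.35, "performance": 0.25}

alternatives = {
    "build":  {"cost": 3, "schedule": 2, "performance": 5},
    "buy":    {"cost": 4, "schedule": 5, "performance": 3},
    "extend": {"cost": 5, "schedule": 3, "performance": 2},
}

def utility(scores):
    """Weighted sum of one alternative's scores against the value weights."""
    return sum(values[c] * scores[c] for c in values)

best = max(alternatives, key=lambda a: utility(alternatives[a]))
print(best, round(utility(alternatives[best]), 2))  # buy 4.1
```

A real decision analysis would add uncertainty to the scores themselves, which is exactly where estimating enters the process.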

In our software development world, Values, Alternatives, and Facts are always part of the decision-making process, no matter if we're spending our customer's money or our own.

If we're going to make risk-informed decisions in the presence of uncertainty, then we're going to have to estimate the probability of occurrence and the statistical range of values of the risk drivers, and the probability and statistical range of the consequences.
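A minimal sketch of what estimating to a probability of occurrence can look like in practice: Monte Carlo sampling over task durations, then reading a confidence level off the resulting distribution. The tasks and their triangular (min, most likely, max) bounds below are invented placeholders for a real basis of estimate, which would also model correlation between risk drivers.

```python
import random

# Invented tasks: (min, most likely, max) duration in days.
tasks = [(5, 8, 15), (3, 4, 9), (10, 12, 20)]

def simulate(trials=20_000, seed=42):
    """Sample total project duration by summing one triangular draw per task."""
    rng = random.Random(seed)
    totals = [
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(trials)
    ]
    return sorted(totals)

totals = simulate()
p80 = totals[int(0.80 * len(totals))]  # 80% of trials finish by this duration
print(f"P80 total duration: {p80:.1f} days")
```

Note that the P80 value lands well above the sum of the "most likely" durations (24 days): committing to the most-likely number alone would be committing to a low-confidence date.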

If you've chosen to not estimate, then Tim Lister suggests we're not acting like adults when spending other people's money.

 [1] "Considering Risk and Resilience in Decision Making," Wilfredo Torres-Pomales, Langley Research Center, Hampton, Virginia, NASA/TM-2015-218777

Categories: Project Management

Agile Project Management Methods for Corporate IT Projects

Wed, 10/26/2016 - 15:29

When Agile development moves beyond a single scrum team and into the Enterprise IT domain, several considerations must be addressed. For Scrum (and most other agile methods), focus on the team is paramount. The behaviours of the team should follow Katzenbach's definition of a team as...

...a small group of people with complementary skills who are committed to a common purpose, performance goals and approach for which they are mutually accountable - Katzenbach, J. R. and Smith, D.K. (1993), The Wisdom of Teams: Creating the High-performance Organisation, Harvard Business School, Boston

This definition for single scrum teams needs to come in contact with Corporate Governance, managerial finance, and decision making in the presence of uncertainty when spending the firm's money - all of which have impacts beyond the team and its natural desire to control its own destiny. Governance is about decision rights. IT Governance: How Top Performers Manage Decision Rights for Superior Results is a place to start.

In a recent exchange on Twitter - in response to my post noting that some teams consider the corporate money their own - it was suggested that we imagine a world where Ownership has faith and trust in the workers to spend the money appropriately, and that this seeming lack of trust in the team's ability to consider external governance as part of their behaviour "...speaks to a lack of trust that the people they have hired are capable of making proper decisions."

Governance is the basis of business management in some form for all organisations beyond that small group of individuals Katzenbach speaks of. The title of this Blog - Herding Cats - speaks to the issue of organisational governance. Herding cats is "an idiomatic saying that refers to an attempt to control or organize a class of entities which are uncontrollable or chaotic." (Warren Bennis, Managing People is Like Herding Cats (1997)). 

When Agile encounters governance, here are some thoughts from a book chapter on how to assure the benefits of both paradigms are delivered.


PM Chapter on Agile IT Project Management Methods from Glen Alleman
Categories: Project Management

Quote of the Day

Tue, 10/25/2016 - 15:07

"I am always doing that which I can not do, in order that I may learn how to do it." - Pablo Picasso

When I'm asked to help a client with something new, it's usually close to the domain I'm currently working in. But sometimes it's outside my core experiences. When this happens it's an opportunity to expand my core competencies. Program Planning and Controls applied in new domains is one example. 

This learning process starts with first principles, of course, so at the very bottom of any new knowledge are similar principles. This is the case for most anything we do. Go back to first principles and determine by what principle this new knowledge has validity. When you find that, you'll have a rock to stand on in the exploration of new knowledge. If you're exploring this new domain with no basis for recognizing good ideas from bad ideas, your contribution to the client's needs is likely to be less than they are expecting for their money.

When you hear we're exploring, ask if that exploration has a hypothesis being tested, a way to recognize that the exploration is actually moving toward some goal, and a way to measure progress toward that goal. If not, then those exploring aren't actually exploring; they are just wandering around spending their client's money in the hope they will find something of interest they can call discovery.

We have a bumper sticker here in Boulder

Those who wander aren't always lost.

This is not true: if you're wandering around looking for the answer, you're lost. If you're a Boulder hippy up in Ward, no problem. But those paying us have a reason to spend that money. They need something we can provide.

Related articles Overarching Paradigm of Project Success Who Builds a House without Drawings? Essential Reading List for Managing Other People's Money
Categories: Project Management

Fair and Balanced in the Absence of Principles?

Mon, 10/17/2016 - 02:40

We hear that fair and balanced is a desirable approach to problems. It turns out this is a false balance when the issue under discussion doesn't address an underlying principle.

One side can be wrong

It is seductive to state we're exploring new ways to do things ... in the absence of the underlying principles that would guide the explorer to a possible new way of doing something. In the absence of any principles, any conjecture should be rejected. Without this approach, any conjecture - any unsubstantiated opinion - can be treated as equal to principles and evidence-based processes. This is not a good way to improve processes.

Categories: Project Management

Why We Need Estimates

Sun, 10/16/2016 - 00:02

The ability to generate reliable cost and schedule estimates is a critical success factor necessary to support business projects.

Without this ability, the business is at risk of cost overruns, missed deadlines, and performance shortfalls—all recurring problems that project assessments too often reveal. Furthermore, cost increases often mean that the firm cannot fund as many projects as intended or deliver them when promised.

Categories: Project Management

Project Management, Performance Measures, and Statistical Decision Making (Part Deux)

Wed, 10/12/2016 - 21:08

There is a current rash of suggestions on how to improve the performance of software projects. Each of these, while well meaning, is missing the means to confirm its credibility.

They many times end up being personal anecdotes from observations and local practices that may or may not be applicable outside those anecdotes, and more importantly may not be statistically sound in principle, let alone in practice.

I work in the Software Intensive System of Systems domains in Aerospace, Defense, Enterprise IT (both commercial and government) applying Agile, Earned Value Management, Productive Statistical Estimating (both parametric and Monte Carlo), Risk Management, and Root Cause Analysis with a variety of capabilities. In this domain, we are guided by credible results using principles, processes, and procedures to increase the probability of program success. References below. 

The growth of cost and schedule is not unique to commercial development. One of my work colleagues is the former NASA Cost Director. The chart below is from one of our presentations at an International Cost Estimating and Analysis Association meeting on the same topic. And there are many other examples (see references).

[Chart: cost and schedule growth data from the International Cost Estimating and Analysis Association presentation]

One case of 12 projects from a large contractor of Software Intensive Systems (SIS) shows similar variances.

[Chart: cost and schedule variances for the 12 SIS projects]

More research at www.ida.org and www.acq.osd.mil/parca/ has shown there are four core root causes of this unfavorable growth in cost and schedule and shortfalls in technical capabilities. 

[Chart: the four core root causes of cost and schedule growth and technical shortfalls]

Some might say these domains are unrelated to yours. I'd suggest the principles for project success on non-trivial software efforts are universal. Your project may be different in practice, but the principles are the same. These principles are:

[Chart: the principles of project success]

So let's look at an example 

Here's a typical graph showing a core problem in the software development domain. 

These are projects that started with an estimate at completion, and as they were executed they didn't turn out as planned - and most importantly, they didn't turn out as needed.


Figure 1 - Planned Estimates versus Actual Performance from [1]

Some critical questions left unanswered by charts like Figure 1 are:

  1. What's the credibility of the initial estimates?
    • Are they risk adjusted?
    • Is there Management Reserve for in scope but unplanned activities?
    • What's the confidence in this estimate?
    • What are the Cost Element Relationships that drive risk?
  2. What processes are in place to execute according to plan?
    • Are there measures of physical percent complete to provide the feedback needed to take corrective actions?
    • Is there a risk retirement plan to ensure Risks don't turn into Issues and delay the project?
  3. What irreducible uncertainties and the resulting risks were Not handled?
    • Is there schedule, cost, and technical margin in the plan?
    • Do you know the margin burndown rate, and is it measured to assure you're on plan for the burndown of margin?
  4. Was the scope controlled? Was the project properly funded and staffed? Were defects too high, causing rework? And what of the myriad other operational and developmental issues that could have caused the actual result to not match the initial estimate? What are these, and what corrective actions from the root causes were not in place?
    • Is there a change control process in place of some kind?
    • Does that process prevent impacting the plans without authorization?
  5. What irreducible uncertainties were not considered for cost, schedule, and technical margins?
    • Is there Management Reserve for this?
    • Is that MR managed in accordance with the governance processes for the project or the firm?

Each of these questions, and others, is needed to determine whether samples like those in Figure 1 have root causes not identified by the author of the chart.

Without determining why the sampled values are what they are, the chart is missing half of the information needed to decide on corrective actions and to address the unfavorable answers to the questions above.

A Few References and Resources 

  1. Schedule Estimation and Uncertainty Surrounding the Cone of Uncertainty, Todd Little, IEEE Software, May/June 2006
  2. "Sources of Weapons Systems Growth: Analysis of 35 Major Defense Acquisition Programs"
  3. GAO cost estimating and assessment guide
  4. Parametric Cost Estimating Handbook
  5. NASA Cost Estimating Handbook
  6. USAF Cost Risk Uncertainty Handbook
  7. Department of Energy
  8. OMB A-11 Part 7
  9. Defense Acquisition University Cost Estimating
Categories: Project Management