10 deadly sins of task estimation in IT

Short description

The art and science of estimation in IT involve both target-setting and estimating, and avoiding common mistakes is essential for accurate estimates. The “deadly sins” include confusing targets with estimates, saying “yes” when really meaning “no,” committing to early estimates in the cone of uncertainty, assuming underestimation doesn’t affect outcomes, estimating in the “Impossible Zone,” overestimating the usefulness of new tools, using only one estimation method, not using estimation software, ignoring the impact of risks, and giving impromptu estimates. Breaking larger estimates into smaller ones leads to better results. More deadly sins and rules of thumb exist beyond the ten highlighted here.


The Art and Science of Estimation in IT:

  • The science of estimation is well developed and well supported by software tools.

  • The art of estimation is mostly based on rules of thumb, and those rules still need refinement.

This article is based on materials by Steve McConnell of Construx; some of those materials are slide images.


Deadly sins

Sin #1 Confusing targets with estimates

Distinguish between the two:

  • Setting targets is a key part of the art of estimation.

  • When you are asked for an estimate, determine whether you should really estimate or instead figure out how to achieve the target.

  • It is best thought of as an iterative process of aligning target and estimate.

Sin #2 Saying “yes” when you really mean “no”

Why do developers say “yes”?

It is very difficult to make a vigorous, plausible, and job-risking defense of an estimate that is derived by no quantitative method, supported by little data, and certified chiefly by the hunches of the managers.

Fred Brooks (1975)

Features of communication within companies:

  • Software developers tend to be introverted and relatively young

  • Marketing and sales professionals tend to be more extroverted and organizationally senior to the developers they negotiate with

  • Under that pressure, it is very easy to slip into saying “yes”

Sin #3 Committing to Early Estimates in the Cone of Uncertainty

You can make commitments only on the basis of estimates made at later stages, when you understand the essence of the work.

The most accurate estimates come late.
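The narrowing of the cone can be sketched numerically. The multipliers below are the commonly cited values from McConnell’s published Cone of Uncertainty; the 20-week figure and the exact phase names are illustrative assumptions:

```python
# Sketch of the Cone of Uncertainty: the same nominal estimate implies
# very different ranges depending on project phase.
# Multipliers are the commonly cited McConnell values; treat them as
# illustrative, not exact.
CONE = [
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("UI design complete",          0.80, 1.25),
    ("Detailed design complete",    0.90, 1.10),
]

def estimate_range(nominal_weeks, phase):
    """Return the (low, high) range implied by the cone at a given phase."""
    for name, lo, hi in CONE:
        if name == phase:
            return nominal_weeks * lo, nominal_weeks * hi
    raise ValueError(f"unknown phase: {phase}")

low, high = estimate_range(20, "Initial concept")
print(f"20-week estimate at initial concept: {low:.0f}-{high:.0f} weeks")
# A commitment made here spans 5-80 weeks; wait until the cone narrows.
```

Committing at “Initial concept” means committing to a 16x-wide range; by “Requirements complete” the same nominal estimate spans only about 2x.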

Sin #4 Assuming that underestimation does not affect project results

Planning errors affect a project non-linearly: over time, errors and defects accumulate, and risks begin to materialize more and more often.

Sin #5 Estimating in the “Impossible Zone”


  • Suppose you drive up a hill for 1 mile at 30 miles per hour.

  • How fast do you need to drive down the hill so that the average speed over the entire trip is 60 miles per hour?

A variation on the theme of sin

A commonly accepted definition of an estimate is: “An estimate is the most optimistic prediction that has a non-zero probability of coming true.”… Accepting this definition leads inevitably toward a method called “what’s-the-earliest-date-by-which-you-can’t-prove-you-won’t-be-finished” estimating.

Tom DeMarco (1982)

Estimates are probabilistic statements.

What happens when you take an estimate at face value and compress it? There is no such thing as a “single-point estimate” that is correct or even meaningful. All estimates involve at least implicit probabilities (even if the estimator is unaware of them).

Schedule compression and the “Impossible Zone”.

There is a trade-off between cost and schedule.

All researchers find some trade-off between schedule compression and cost; no one has found that the trade-off does not exist. The maximum feasible schedule compression is assumed to be about 25%.

An earlier post on this topic: https://t.me/junior_pm/17

Don’t create estimates in the “impossible zone.”

Let’s return to the question. What is the solution to the riddle?
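The arithmetic can be checked in a few lines; the numbers come straight from the riddle:

```python
# The riddle from Sin #5, worked out: no downhill speed can rescue the average.
trip_miles = 2.0          # 1 mile up + 1 mile down
target_avg_mph = 60.0
uphill_mph = 30.0

time_budget_h = trip_miles / target_avg_mph   # total time allowed: 2/60 h = 2 min
uphill_time_h = 1.0 / uphill_mph              # time already spent: 1/30 h = 2 min
remaining_h = time_budget_h - uphill_time_h

print(f"time budget: {time_budget_h * 60:.0f} min, already used: {uphill_time_h * 60:.0f} min")
print(f"time left for the downhill mile: {remaining_h * 60:.0f} min")
# Zero minutes remain, so the required downhill speed is infinite:
# the target sits in the "Impossible Zone", just like an over-compressed schedule.
```

Averaging 60 mph over 2 miles allows 2 minutes in total, and the uphill mile at 30 mph has already used all of them, so no finite downhill speed makes the target achievable.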

Sin #6 Overestimating the usefulness of new tools

The beneficial effect of new tools or methods is usually overstated. There are also problems:

  • A price must be paid for learning on first use

  • Maximum efficiency is not achieved on first use

  • The first use is often accompanied by errors

  • Early claims of effectiveness are often based on expert use – sometimes by the developers or authors who invented the tool or method!

  • The payoff, when it does appear, is smaller than expected

  • New tools and methods increase risks

A safe assumption is that a new tool or method will initially reduce productivity.

Sin #7 Using only one estimation method

Use several methods:

  • It is difficult to be confident in estimates produced by only one method, and single-method estimates feed Brooks’ problem of “gutless estimating.”

  • Leading organizations use several estimation techniques.

  • Produce estimates in several different ways and look for convergence or spread among them.

Sin #8 Not using estimation software

Use estimation software:

  • Tools are the best support for the science of estimation.

  • Estimates created with tools can be more reliable than estimates created by hand.

  • Good estimation tools: https://github.com/FocusedObjective/FocusedObjective.Resources

Sin #9 Not including the impact of risks in the estimate

Accounting for risks in the estimate:

  • Software development projects are inherently volatile and risky.

  • Risk exposure (RE) is the expected value of the project’s budget overrun.

  • The RE estimate is where “buffer planning” begins.
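As a sketch of how RE turns into a first buffer figure: RE is the sum, over all identified risks, of probability times size of loss. The risks and numbers below are invented for illustration:

```python
# Risk exposure (RE) as a starting point for buffer planning:
# RE = sum over risks of (probability of occurrence x size of loss).
# The risk list below is hypothetical, for illustration only.
risks = [
    ("Key developer leaves",    0.10, 8.0),   # probability, loss in weeks
    ("Third-party API is late", 0.30, 4.0),
    ("Requirements churn",      0.50, 3.0),
]

risk_exposure = sum(p * loss for _, p, loss in risks)
print(f"Expected overrun (RE): {risk_exposure:.1f} weeks")
# RE = 0.1*8 + 0.3*4 + 0.5*3 = 3.5 weeks -> a first cut at the schedule buffer.
```

The point is not the exact figure but that the buffer is derived from named risks rather than added as an arbitrary percentage.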

Sin #10 Giving impromptu estimates

Approach estimation as a mini-project.

Define a standardized estimation procedure.

Elements of a standardized procedure:

  • A clear description of the estimate’s uncertainty.

  • The use of multiple estimation approaches.

  • A re-estimation plan at predetermined project milestones.

  • A defined moment when “estimates” become “commitments”.

Break larger estimates into smaller estimates.

  • Break systems into modules.

  • Break large tasks into small tasks.

  • Use the statistical property called the “law of large numbers”: high and low values tend to balance each other out.
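A quick simulation illustrates the balancing effect. The 100-hour job and the ±50% per-task error are assumptions chosen purely for illustration:

```python
import random

# Sketch: why many small estimates beat one big one.
# Each task's actual effort is its estimate plus independent noise;
# by the law of large numbers, the highs and lows tend to cancel.
random.seed(42)

def simulate(n_tasks, trials=10_000):
    """Average relative error of total effort when a 100-hour job is split into n tasks."""
    per_task = 100.0 / n_tasks
    errors = []
    for _ in range(trials):
        # each task comes in anywhere from 50% under to 50% over its estimate
        actual = sum(per_task * random.uniform(0.5, 1.5) for _ in range(n_tasks))
        errors.append(abs(actual - 100.0) / 100.0)
    return sum(errors) / trials

print(f"1 task:   average error {simulate(1):.1%}")
print(f"25 tasks: average error {simulate(25):.1%}")
```

With one lump estimate the average miss is around a quarter of the budget; split into 25 independent tasks, the misses largely cancel and the total error shrinks several-fold.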


  • Bad estimates (or targets) are the norm.

  • Good estimates are possible!

  • The deadly sins and empirical rules presented here are only the tip of the iceberg.

Not deadly sins, but still sins

Sins #20 – #11

Sin #20

Estimating “it” in order to make a plan before anyone knows what “it” is.

Sin #19

Believing that the most accurate estimates come from the people with the loudest voices.

Sin #18

Believing that all estimates can be converted into one another: translating money into calendar time and story points into days.

Sin #17

Building plans for a new project from the initial plan of a past project, ignoring that project’s actual deadlines and the experience gained.

Sin #16

Assuming that the sales department judges software projects better than developers do.

Sin #15

Giving an estimate that ignores the realities of work:

  • attending meetings…

  • switching between projects…

  • supporting key customers…

  • holidays…

  • illness…

  • emergencies…

Sin #14

Presenting estimates with a high degree of precision (“67.5 hours”) that are backed by only a low degree of accuracy (“±2 months”).

Sin #13

Believing that estimation tools (such as Monte Carlo simulation) cannot match the computational power of a manager armed with a pen and a napkin.
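For the record, the napkin-defying computation is small. A minimal Monte Carlo schedule sketch, with hypothetical (best, likely, worst) task triples in days:

```python
import random

# Minimal Monte Carlo schedule estimate: draw each task's duration from a
# triangular distribution over (best, likely, worst), sum, and repeat.
# The task triples are hypothetical examples.
tasks = [(2, 4, 9), (1, 2, 5), (3, 5, 12), (2, 3, 8)]  # (best, likely, worst) days

random.seed(7)
totals = sorted(
    sum(random.triangular(best, worst, likely) for best, likely, worst in tasks)
    for _ in range(10_000)
)

p50 = totals[len(totals) // 2]
p85 = totals[int(len(totals) * 0.85)]
print(f"50% confidence: finish within {p50:.1f} days")
print(f"85% confidence: finish within {p85:.1f} days")
# The spread between p50 and p85 is exactly the information a
# single-point napkin estimate throws away.
```

The output is a probability distribution of completion dates rather than a single number, which is the whole argument against the pen-and-napkin system.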

Sin #12

Thinking: “The sooner we fall behind schedule, the more time we will have to catch up.”

Sin #11

Arguing that developers can simply be taught to estimate better, and treating that as an after-the-fact fix.
