And yet, although all companies claim that they could improve in this area, most lament that they do not recycle the experience from past projects at all. Doing so requires a systematic approach, which this article analyses methodically.
It starts with a table from the PMBOK (the Project Management Body of Knowledge, from pmi.org). The table below has always seemed to me a little theoretical, but I have changed my mind. Its original figures, derived from the construction industry, have been embedded in the PMBOK ever since.
Type of Estimate | Range of Precision | When to Use
Order of Magnitude (conceptual, ‘scientific’ guess) | -50% / +50% | Enough initial information is available for authorizing the project and assessing feasibility
Preliminary Estimate (design, intermediate) | -25% / +25% | Information about human resources, materials, expenses, and overhead is available and the objective is identified
Definitive Estimate (finalized, bottom-up, baseline) | -10% / +10% | Detailed work breakdown structure information can be used
The table shows the increasing precision of the estimates as the project progresses: from an order of magnitude or conceptual estimate, through a budgetary or design estimate, to a definitive or grassroots engineering estimate.
There is something in this analysis that is crucially significant. At each step, we can be very clear about the information that becomes available:
1) initially, when only very early-stage information is available, an ‘order of magnitude’ estimate provides a conceptual approximation or ‘wild guess’;
2) once further information on resources, materials, suppliers, expenses and overheads has been collected, and the objectives have been defined, a design or ‘preliminary’ estimate can be prepared;
3) a ‘definitive’ estimate, which becomes the baseline budget for the project, is built upon a bottom-up analysis of each work package using the most detailed information available.
The PMBOK suggests that the ‘order of magnitude’ estimate varies within a range of -50% / +50%, the ‘preliminary estimate’ within -25% / +25%, and the ‘definitive estimate’ within -10% / +10%.
The assumption is that the estimate at each stage is given with about 95% certainty, i.e. two standard deviations from the mean. Thus, for a small project worth about 1 million euros, this gives a 95% confidence interval of between:
- 500K and 1,500K euros for an order of magnitude estimate
- 750K and 1,250K euros for a preliminary estimate
- 900K and 1,100K euros for a definitive estimate
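For concreteness, these intervals can be reproduced with a short calculation (a sketch only; it assumes, as stated above, that each PMBOK range corresponds to two standard deviations around the mean):

```python
# Sketch: derive the 95% intervals above, assuming each PMBOK range
# equals two standard deviations around the mean.
MEAN = 1_000_000  # small project worth 1 million euros

ranges = {
    "order of magnitude": 0.50,
    "preliminary": 0.25,
    "definitive": 0.10,
}

for name, pct in ranges.items():
    low = MEAN * (1 - pct)
    high = MEAN * (1 + pct)
    sigma = MEAN * pct / 2  # two sigma span the stated percentage
    print(f"{name}: {low:,.0f}..{high:,.0f} euros (sigma = {sigma:,.0f})")
```

The definitive estimate's sigma of 50,000 euros is the figure used in the competitive comparison below.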
The competitive challenge is to narrow that definitive estimate range of -10% / +10%. This is the estimate that is used for the budget baseline, for example in competitive bidding, and it defines the development cost of the final product.
The only credible way to reduce this range is to improve the quality of the data that is used for the estimate. Inevitably this implies strengthening the estimate by using more complete and better historical data.
Unless an improvement in precision is implemented, in simple mathematical terms the damage could be significant:
With a -10% / +10% range of estimate on a contract worth 1,000,000 euros, one standard deviation is 50,000 euros, so 1,050,000 euros lies one sigma above the mean: you would have an 84% chance of staying within this figure and a 16% chance of exceeding it. If a competitor had reduced their range to -5% / +5% by using more reliable source data for their estimate, their sigma would be 25,000 euros, putting 1,050,000 two standard deviations from their mean and reducing their chance of exceeding it to a mere 2.5%.
If your definitive range were -10% / +10% and the competitor’s were -5% / +5%, and you were to provision up to three standard deviations from the mean, you would need to provision about 150,000 (3 × 50,000), whilst the competitor would need only half as much (3 × 25,000 = 75,000). Your competitor would therefore be able to outbid you very easily.
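These two comparisons can be checked with the standard normal CDF, built from `math.erf` in the Python standard library (a sketch; the sigmas of 50,000 and 25,000 follow from reading each range as a 95% interval):

```python
from math import erf, sqrt

def norm_cdf(x: float, mean: float, sigma: float) -> float:
    """Cumulative probability of a normal distribution (stdlib only)."""
    return 0.5 * (1.0 + erf((x - mean) / (sigma * sqrt(2.0))))

mean, threshold = 1_000_000, 1_050_000
sigma_you, sigma_comp = 50_000, 25_000  # -10%/+10% vs -5%/+5% at two sigma

p_you = 1.0 - norm_cdf(threshold, mean, sigma_you)    # one sigma above the mean
p_comp = 1.0 - norm_cdf(threshold, mean, sigma_comp)  # two sigma above the mean

print(f"chance of exceeding {threshold:,}: {p_you:.1%} vs {p_comp:.1%}")
print(f"3-sigma provision: {3 * sigma_you:,} vs {3 * sigma_comp:,} euros")
```

Running this gives roughly 16% against 2.3% (the conventional rule of thumb rounds the latter to 2.5%), and provisions of 150,000 against 75,000 euros.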
In fact, it is worse than that, because your inability to forecast precisely means that you are more likely both to underbid and to overbid. On more than 10% of bids you may bid too low and lose an average of 50,000 in possible revenues, and on a further 10%, when you bid too high, you may simply fail to win the business.
One million euros is a small project, and even ten such projects per year makes a small company of, say, about 60 employees. If your estimates are imprecise, you could be losing about €500,000 per year on project overruns and €1,600,000 in lost business. To compensate, you will bid low as a matter of routine, and very few of your projects will be profitable.
It gets worse. If the market is extremely price sensitive (price elastic), you may not win any bids at all, because your competition is able to bid lower while preserving a safety margin. If the market is price insensitive (price inelastic), margins may be higher, but you may still lose out because of the variability of your own estimates and the need to cover the risks.
How to improve the precision of estimates:
(1) Understand the price sensitivity of the marketplace. Does a point increase in price have an equivalent effect on demand, or are buyers more sensitive to other factors?
(2) Analyse the relationship between the estimates and actual results (time, resources and costs) and evaluate underlying factors.
(3) Analyse the variability of the estimates and include the predictability of the risks (risks that occur compared with risks analysed) to assess the impact of risk on the estimates.
(4) Analyse the sources of variability of the estimates, with the aim of covering 95% of the risks.
(5) Identify which risks recur from project to project and seek to eliminate them; in other words, apply a process perspective.
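As a sketch of steps (2) to (4), the following compares estimates against actual results across past projects and derives a 95% range from your own history (all figures are invented for illustration):

```python
from statistics import mean, stdev

# Hypothetical history: (estimated cost, actual cost) in K euros per project.
history = [(950, 1010), (1200, 1180), (800, 920),
           (1100, 1250), (600, 640), (1500, 1440)]

# Ratio of actual to estimate: 1.0 would be a perfect estimate.
ratios = [actual / estimate for estimate, actual in history]

bias = mean(ratios)     # systematic tendency to under- or over-estimate
spread = stdev(ratios)  # variability of the estimating process itself

# A two-sigma (95%) range derived from your own historical data:
print(f"bias: {bias:.2f}, 95% range: -{2 * spread:.0%} / +{2 * spread:.0%}")
```

With this invented data the process underestimates by about 6% on average, and its two-sigma range is roughly ±15%: correcting the bias and attacking the largest sources of spread is what narrows the range.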
The aim is to reduce your variability so that your previous 1-sigma limit becomes 2 sigma, 2 sigma becomes 3 sigma, and so on. Thus under 1% of projects will now fall outside the -10% / +10% range, whereas before it would have been about 5%, and only 5% will fall outside the -5% / +5% range, where previously it would have been 32%. This is the significant benefit of applying the experience of previous projects.
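The tail percentages quoted above can be verified directly (a sketch using the stdlib error function; "outside" means beyond ± z standard deviations of a normal distribution):

```python
from math import erf, sqrt

def outside(z: float) -> float:
    """Probability of falling outside +/- z standard deviations."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))

# -10%/+10% moves from 2 sigma to 3 sigma; -5%/+5% from 1 sigma to 2 sigma.
print(f"outside -10%/+10%: before {outside(2):.1%}, after {outside(3):.1%}")
print(f"outside  -5%/+5%: before {outside(1):.1%}, after {outside(2):.1%}")
```

This prints roughly 4.6% falling to 0.3% for the -10% / +10% range, and 31.7% falling to 4.6% for the -5% / +5% range.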
This approach could also be used for agile development projects (i.e. projects where the scope is prioritised and the set of target requirements is allowed to evolve during the project). A single sigma shift would give a greater degree of precision in the number of high-priority and medium-priority functions that can be delivered within a timebox (i.e. before a deadline).
Is it possible to halve the range of a definitive estimate? If you can go from -50% / +50% to -25% / +25% thanks to the creation of a comprehensive Work Breakdown Structure, and from -25% / +25% to -10% / +10% thanks to a detailed Project Plan, then it is surely possible to halve the range again based on reliable, analysed historical data.