Putnam model

The Putnam model is an empirical software effort estimation model.[1] The original paper by Lawrence H. Putnam, published in 1978, is seen as pioneering work in the field of software process modelling.[2] As a group, empirical models work by collecting software project data (for example, effort and size) and fitting a curve to the data. Future effort estimates are made by providing a size estimate and calculating the associated effort using the equation that fits the original data (usually with some error).

Created by Lawrence Putnam, Sr., the Putnam model describes the time and effort required to finish a software project of a specified size. SLIM (Software LIfecycle Management) is the name given by Putnam to the proprietary suite of tools his company QSM, Inc. has developed based on his model. It is one of the earliest of these types of models developed, and is among the most widely used. Closely related software parametric models are the Constructive Cost Model (COCOMO), Parametric Review of Information for Costing and Evaluation – Software (PRICE-S), and Software Evaluation and Estimation of Resources – Software Estimating Model (SEER-SEM).

The software equation

While managing R&D projects for the Army and later at GE, Putnam noticed that software staffing profiles followed the well-known Rayleigh distribution.[3]

Putnam used his observations about productivity levels to derive the software equation:

$$\frac{B^{1/3} \cdot \text{Size}}{\text{Productivity}} = \text{Effort}^{1/3} \cdot \text{Time}^{4/3}$$

where:

Size is the estimated software size at project completion, typically measured in source lines of code;
B is a special skills factor, a function of system size;[4] [5]
Productivity is the process productivity of the developing organization (obtained by calibration, as described below);
Effort is the total effort applied to the project, in person-years;
Time is the total schedule of the project, in years.

In practical use, when making an estimate for a software task, the software equation is solved for effort:

$$\text{Effort} = \left[\frac{\text{Size}}{\text{Productivity} \cdot \text{Time}^{4/3}}\right]^{3} \cdot B$$

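As a rough illustrative sketch (not the SLIM tooling itself), the solved-for-effort form translates directly into a few lines of Python. The size, schedule, and process productivity below are hypothetical example inputs; the B value of 0.34 for a 40 KSLOC system follows the size-based factors quoted in reference [5].

    def putnam_effort(size, process_productivity, time_years, b):
        """Effort in person-years from the software equation solved for effort:
        Effort = [Size / (Productivity * Time^(4/3))]^3 * B
        """
        return (size / (process_productivity * time_years ** (4.0 / 3.0))) ** 3 * b

    # Hypothetical example: a 40 KSLOC project, an assumed process productivity
    # of 5000, a 2-year schedule, and B = 0.34 for 40 KSLOC (reference [5]).
    print(putnam_effort(40_000, 5_000, 2.0, 0.34))
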
An estimated software size at project completion and an organizational process productivity are used. Plotting effort as a function of time yields the Time-Effort Curve. The points along the curve represent the estimated total effort to complete the project at some time. One of the distinguishing features of the Putnam model is that total effort decreases as the time to complete the project is extended. In other parametric models this behaviour is normally represented with a schedule relaxation parameter.
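
Evaluating the same hypothetical inputs over a range of schedules makes this trade-off visible: as the allowed time grows, the computed effort drops steeply.

    # Same hypothetical 40 KSLOC project, assumed process productivity of 5000,
    # and B = 0.34; effort = [size / (pp * t^(4/3))]^3 * B.
    size, pp, b = 40_000, 5_000, 0.34
    for t in (1.5, 2.0, 2.5, 3.0):
        effort = (size / (pp * t ** (4.0 / 3.0))) ** 3 * b
        print(f"{t:.1f} years -> {effort:.1f} person-years")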

This estimating method is fairly sensitive to uncertainty in both size and process productivity. Putnam advocates obtaining process productivity by calibration:

$$\text{Process Productivity} = \frac{\text{Size}}{\left[\frac{\text{Effort}}{B}\right]^{1/3} \cdot \text{Time}^{4/3}}$$
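
A minimal calibration sketch, again with made-up project figures: given the size, total effort, and duration of one completed project (and B from the size-based factors in reference [5]), the equation yields that organization's process productivity.

    def process_productivity(size, effort_person_years, time_years, b):
        """Calibrated process productivity from a completed project:
        Productivity = Size / ((Effort / B)^(1/3) * Time^(4/3))
        """
        return size / ((effort_person_years / b) ** (1.0 / 3.0)
                       * time_years ** (4.0 / 3.0))

    # Hypothetical past project: 30 KSLOC delivered in 1.8 years with 12
    # person-years of effort; B = 0.28 for 30 KSLOC (reference [5]).
    print(process_productivity(30_000, 12.0, 1.8, 0.28))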

Putnam makes a sharp distinction between 'conventional productivity', defined as size divided by effort, and process productivity.

One of the key advantages of this model is the simplicity with which it is calibrated. Most software organizations, regardless of maturity level, can easily collect size, effort and duration (time) for past projects. Process productivity, being exponential in nature, is typically converted to a linear productivity index that an organization can use to track its own changes in productivity and apply in future effort estimates.[6]
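
The published productivity parameter table maps process productivity values onto this integer index.[6] The sketch below is only a stand-in for the idea, using an invented logarithmic scale (the base and ratio are made-up constants, not the published table) so that multiplicative gains in process productivity show up as additive steps in a small index.

    import math

    def productivity_index(pp, base_pp=750.0, ratio=1.27):
        """Illustrative only: compress an exponential-scale process productivity
        into a small linear index by taking a logarithm. base_pp and ratio are
        invented example constants, not the published table from reference [6]."""
        return 1 + round(math.log(pp / base_pp) / math.log(ratio))

    # e.g. productivity_index(3900) and productivity_index(3900 * 1.27)
    # differ by exactly one step on this invented scale.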

References

  1. Putnam, Lawrence H.; Myers, Ware. Five Core Metrics: The Intelligence Behind Successful Software Management. Dorset House Publishing, September 2003. ISBN 0-932633-55-2.
  2. Putnam, Lawrence H. "A General Empirical Solution to the Macro Software Sizing and Estimating Problem." IEEE Transactions on Software Engineering, vol. SE-4, no. 4, pp. 345–361, 1978.
  3. "Focus on Lawrence Putnam: A CAI State of the Practice Interview." Computer Aid, Inc., September 2006.
  4. US Government. "Putnam Special Skills Factor Table." Data & Analysis Center for Software, August 20, 1997.
  5. Putnam, Lawrence H.; Myers, Ware. Measures for Excellence: Reliable Software on Time, Within Budget. Prentice Hall, October 1991. ISBN 978-0-13-567694-3. p. 234: "The special skills factor, B, is a function of system size: .16 for 5-15 KSLOC, .18 for 20 KSLOC, .28 for 30 KSLOC, .34 for 40 KSLOC, .37 for 50 KSLOC and .39 for > 70 KSLOC."
  6. US Government. "Putnam Productivity Parameter Table." Data & Analysis Center for Software, August 20, 1997.