A procedural code-oriented approach

The previous approach would be useful at the design stage of a project and where a procedural programming language is not the primary vehicle for development. However, how could you estimate the effort to develop an individual software module using a procedural language? An approach might be based on the following steps.

1. Envisage the number and type of programs in the final system

This is easiest where the system is of a conventional and well understood nature. Most information systems are built from a small set of system operations, such as insert, amend, update, display, delete, print. The same principle should equally apply to embedded systems, albeit with a different set of primitive functions.

2. Estimate the SLOC of each identified program

The estimator must have a particular implementation language in mind for this step.

One way to judge the number of instructions likely to be in a program is to draw up a program structure diagram and to visualize how many instructions would be needed to implement each identified procedure. The estimator might look at existing programs that have a similar functional description to help in this process.

Where programs for an information system are similar (for instance, they are data validation programs) then the number of data item types processed by each program is likely to be the major influence on size.

3. Estimate the work content, taking into account complexity and technical difficulty

The practice is to multiply the SLOC estimate by a factor for complexity and technical difficulty. This factor will depend largely on the subjective judgement of the estimator. For example, the requirement to meet particular highly constrained performance targets can greatly increase programming effort.

A weighting can be given when there is uncertainty, for example about a new technique used in a particular module, but this should not be excessive. Where there is a large amount of uncertainty then specific measures should be taken to reduce this by such means as the use of exploratory prototypes.

4. Calculate the work-days effort

Historical data can be used to provide ratios to convert weighted SLOC to effort. These conversion factors are often based on the productivity of a 'standard programmer' of about 15-18 months of experience. In installations where the rate of turnover is lower and the average programmer experience is higher this might be reflected in the conversion rate employed.
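The steps above can be sketched as a simple calculation. The productivity rate and complexity weighting below are illustrative assumptions, not values prescribed by the text; an installation would calibrate them from its own historical data.

```python
# Sketch of the procedural code-oriented estimate (steps 2-4).
# The productivity rate and the complexity factor used in the
# example are illustrative assumptions, not prescribed values.

def estimate_work_days(sloc, complexity_factor, sloc_per_day):
    """Convert an SLOC estimate into work-days of effort.

    sloc              -- estimated source lines of code (step 2)
    complexity_factor -- weighting for complexity and technical
                         difficulty; 1.0 = nominal (step 3)
    sloc_per_day      -- productivity of a 'standard programmer',
                         taken from historical data (step 4)
    """
    weighted_sloc = sloc * complexity_factor
    return weighted_sloc / sloc_per_day

# e.g. a 2000-SLOC program, a 20% uplift for tight performance
# targets, and an assumed 40 weighted SLOC per work-day:
effort = estimate_work_days(2000, 1.2, 40)
print(round(effort, 1))  # 60.0 work-days
```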

Note that the steps above can be used to derive an estimate of lines of code, which can then serve as an input to one of the COCOMO models described next.

Function point analysis Mark II is also based on the idea that the number of data item types processed influences program size.

See Chapter 4 for a discussion of prototypes.

Exercise 5.8 Draw up an outline program structure diagram for a program to do the processing described in Exercise 5.7, which sets up Customer records. For each box on your diagram, estimate the number of lines of code needed to implement the routine in a third generation language such as Cobol.

5.12 COCOMO: a parametric model

Boehm's COCOMO (Constructive COst MOdel) is often referred to in the literature on software project management, particularly in connection with software estimating. The term COCOMO really refers to a group of models.

Boehm originally based his models in the late 1970s on a study of 63 projects. Of these only seven were business systems, so the models could be used with applications other than information systems. The basic model was built around the equation

effort = c × size^k

where effort is measured in pm, the number of 'person-months' consisting of units of 152 working hours, size is measured in kdsi, thousands of delivered source code instructions, and c and k are constants.

Because there is now a newer COCOMO II, the older version is now referred to as COCOMO 81.

Boehm originally used mm (for man-months) when he wrote Software Engineering Economics.

Generally, information systems were regarded as organic while real-time systems were embedded.

The first step was to derive an estimate of the system size in terms of kdsi. The constants, c and k (see Table 5.10), depended on whether the system could be classified, in Boehm's terms, as 'organic', 'semi-detached' or 'embedded'. These related to the technical nature of the system and the development environment.

• Organic mode - this would typically be the case when relatively small teams developed software in a highly familiar in-house environment and when the system being developed was small and the interface requirements were flexible.

• Embedded mode - this meant the product being developed had to operate within very tight constraints and changes to the system were very costly.

• Semi-detached mode - this combined elements of the organic and the embedded modes or had characteristics that came between the two.

Table 5.10 COCOMO constants

System type      c     k
Organic          2.4   1.05
Semi-detached    3.0   1.12
Embedded         3.6   1.20
The exponent value k, when it is greater than 1, means that larger projects are seen as requiring disproportionately more effort than smaller ones. This reflected Boehm's finding that larger projects tended to be less productive than smaller ones because they needed more effort for management and co-ordination.
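The basic model can be sketched as follows. The (c, k) pairs are Boehm's published constants for the three development modes, as in Table 5.10.

```python
# Basic COCOMO 81: effort = c * size^k, with size in kdsi and
# effort in person-months. The (c, k) pairs are Boehm's published
# constants for the three development modes (Table 5.10).

COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo(kdsi, mode="organic"):
    """Estimated effort in person-months for a system of the given
    size (thousands of delivered source instructions) and mode."""
    c, k = COCOMO_CONSTANTS[mode]
    return c * kdsi ** k

# e.g. a 10-kdsi organic-mode system:
print(round(basic_cocomo(10, "organic"), 1))  # 26.9 person-months
```

Because k is greater than 1 in every mode, doubling the size more than doubles the estimated effort, which is exactly the diseconomy of scale described above.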

Exercise 5.9 Apply the basic COCOMO model to the lines of code figures in Table 5.1 to generate estimated work-months of effort, assuming an organic mode. Compare the calculated figures with the actuals.

As well as the intermediate model, a further, detailed, COCOMO model attempts to allocate effort to individual project phases.

Boehm in fact found this, by itself, to be a poor predictor of the effort required and so went on to develop the intermediate version of COCOMO, which took into account 15 cost drivers. In the intermediate model, a nominal effort estimate (pm_nom) is derived in a similar way to the basic model.

The nominal estimate is then adjusted by a development effort multiplier (dem):

pm_est = pm_nom × dem

where dem is calculated by taking into account multipliers based on the effort drivers in Table 5.11.

Table 5.11 COCOMO 81 intermediate cost drivers

Driver type           Cost driver
Product attributes    required software reliability
                      database size
                      product complexity
Computer attributes   execution time constraints
                      main storage constraints
                      virtual machine volatility (degree to which the operating system changes)
                      computer turnaround time
Personnel attributes  analyst capability
                      application experience
                      programmer capability
                      virtual machine (i.e. operating system) experience
                      programming language experience
Project attributes    use of modern programming practices
                      use of software tools
                      required development schedule

These multipliers take into account such influences on productivity as Boehm's suggestion that having a programming team fully conversant with the programming language to be used could reduce the effort required to implement the project by up to 20% compared to a team with a very low or initially nonexistent familiarity with the programming language. In fact, the biggest influence on productivity according to Boehm is the capability of the implementation team.

As an example of the approach, an organization might decide to use the following multipliers for assessing the effect of analyst capability (ACAP).

Very low 1.46   Low 1.19   Nominal 1.00   High 0.80   Very high 0.71

If the analysts involved in a project, taken as a whole, generally possess above average talent and productivity then the estimator might rate the ACAP as high and use a multiplier of 0.8, effectively reducing the nominal estimate by 20%.

The overall dem is calculated by multiplying the multipliers selected for each cost driver in Table 5.11 to create a single combined multiplier.
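This combination step can be sketched as below. The multiplier value for analyst capability follows the illustrative ACAP figures given above; treating every other driver as nominal (1.0) is an assumption made for the example.

```python
# Intermediate COCOMO 81: pm_est = pm_nom * dem, where dem is the
# product of the multipliers selected for each cost driver.
import math

def development_effort_multiplier(multipliers):
    """dem = product of the selected cost-driver multipliers.
    Drivers rated nominal contribute 1.0 and can be omitted."""
    return math.prod(multipliers)

# e.g. high analyst capability (0.80, from the illustrative ACAP
# table above) with every other driver rated nominal:
dem = development_effort_multiplier([0.80])
pm_nom = 10.0                 # nominal estimate in person-months
pm_est = pm_nom * dem
print(pm_est)  # 8.0
```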

Exercise 5.10 At IOE, most of the systems that are developed are technically similar, so the product, computer and project attributes listed in Table 5.11 do not change from one project to another and are given a unit multiplier of 1.0. Only the personnel attributes differ, and the following table is used by the organization to take this into account.

                                   Very low  Low   Nominal  High  Very high
ACAP (analyst capability)          1.46      1.19  1.00     0.86  0.71
AEXP (application experience)      1.29      1.13  1.00     0.91  0.82
PCAP (programmer capability)       1.42      1.17  1.00     0.86  0.70
VEXP (virtual machine experience)  1.21      1.10  1.00     0.90  -
LEXP (language experience)         1.14      1.07  1.00     0.95  -
On the new IOE group maintenance accounts project, the analyst is regarded as being of exceptionally high quality. The programmers are of high quality but have little experience of the particular application area and are going to use a programming language that is new to them. They are, however, familiar with the operating system environment and thus can be rated as high on VEXP.

What would be the dem for this project? If the nominal estimate for this project was four person-months, what would be the final estimate?

The detailed COCOMO II Model Definition Manual has been published by the Centre for Software Engineering, University of Southern California.

A new family of models, COCOMO II, is currently (1999) being refined by Barry Boehm and his co-workers. This approach uses various multipliers and exponents, the values of which have been set initially by experts. However, a database containing the performance details of executed projects is being built up and periodically analysed so that the expert judgements can be progressively replaced by values derived from actual projects. The new models take into account that there is now a wider range of process models in common use for software development projects than in the late 1970s and early 1980s. As we noted earlier, estimates are required at different stages in the system life cycle and COCOMO II has been designed to accommodate this by having models for three different stages.

Application composition Where the external features of the system that the users will experience are designed. Prototyping will typically be employed to do this. With small applications that can be built using high-productivity application-building tools, development can stop at this point.

Early design Where the fundamental software structures are designed. With larger, more demanding systems, where, for example, there will be large volumes of transactions and performance is important, careful attention will need to be paid to the architecture to be adopted.

Post architecture Where the software structures undergo final construction, modification and tuning to create a system that will perform as required.

To estimate the effort for application composition, the counting of object points, which were described earlier, is recommended by the developers of COCOMO II.

At the early design stage, FPs are recommended as the way of gauging a basic system size. An FP count might be converted to a SLOC equivalent by multiplying the FPs by a factor for the programming language that is to be used.
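The FP-to-SLOC conversion can be sketched as below. The per-language factors shown are illustrative figures of the kind published in 'backfiring' tables; they are assumptions for the example, not values given in the text.

```python
# Converting an FP count to an SLOC equivalent by multiplying by a
# factor for the implementation language. The factors below are
# illustrative assumptions of the kind found in published
# 'backfiring' tables, not values from the text.

SLOC_PER_FP = {
    "cobol": 105,   # illustrative
    "c":     128,   # illustrative
    "java":  53,    # illustrative
}

def fp_to_sloc(function_points, language):
    """SLOC equivalent of an FP count for the given language."""
    return function_points * SLOC_PER_FP[language.lower()]

print(fp_to_sloc(100, "java"))  # 5300
```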

The following model can then be used to calculate an estimate of person-months. (Note that where COCOMO 81 used 'man-months', COCOMO II uses 'person-months'.)

pm = A × size^sf × em1 × em2 × ... × emn

where pm is the effort in 'person-months', A is a constant (in 1998 it was 2.45), size is measured in SLOC (which might have been derived from an FP count as explained above), and sf is the exponent scale factor.

The scale factor is derived thus:

sf = 1.01 + 0.01 × Σ(exponent driver ratings)
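The two formulas can be combined in a short sketch. A = 2.45 is the 1998 value quoted above; the driver ratings used in the example are made up for illustration.

```python
# COCOMO II early design sketch:
#   sf = 1.01 + 0.01 * sum(exponent driver ratings)
#   pm = A * size^sf * em1 * em2 * ... * emn
import math

A = 2.45  # the constant's 1998 value, as quoted in the text

def scale_factor(driver_ratings):
    """sf from the five exponent driver ratings (each 0..5)."""
    return 1.01 + 0.01 * sum(driver_ratings)

def cocomo2_effort(size, driver_ratings, effort_multipliers=()):
    """Person-months; size in SLOC as quoted in the text.
    With no effort multipliers supplied, math.prod(()) is 1."""
    return A * size ** scale_factor(driver_ratings) * math.prod(effort_multipliers)

# e.g. hypothetical ratings of 2, 1, 3, 0 and 2 for the five drivers:
print(round(scale_factor([2, 1, 3, 0, 2]), 2))  # 1.09
```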

Exercise 5.11 What is the maximum value that the scale factor (sf) can have, given that there are five exponent drivers and the maximum rating for an individual driver is five and the minimum is zero?

The qualities that govern the exponent drivers used to calculate the scale factor are listed below. Note that the less each quality is applicable, the bigger the value given to the exponent driver. The fact that these factors are used to calculate an exponent implies that the lack of these qualities will increase the effort required disproportionately more on larger projects.

• Precedentedness This quality refers to the degree to which there are precedents, similar cases in the past, for the project that is being planned. The greater the novelty of the new system, the more uncertainty there is and the higher the value given to the exponent driver.

• Development flexibility This is the degree to which the requirements can be met in a number of different ways. The less flexibility there is, the higher the value of the exponent driver.

• Architecture/risk resolution This relates to the degree of uncertainty there is about the requirements. If they are not firmly fixed and are liable to change then this would lead to a high value being given to this exponent driver.

• Team cohesion This reflects the degree to which there is a large dispersed team (perhaps in several countries) as opposed to there being a small tightly knit team.

• Process maturity The chapter on software quality explains the process maturity model. The more structured and organized the way the software is produced, the lower uncertainty and the lower the rating will be for this exponent driver.

Exercise 5.12 A new project has 'average' novelty for the software house that is going to execute it and is thus given a 3 rating on this account for precedentedness. Development flexibility is high to the extent that this generates a zero rating, but requirements might change radically and so the risk resolution exponent is rated at 4. The team is very cohesive and this generates a rating of 1, but the software house as a whole tends to be very informal in its standards and procedures and the process maturity driver has therefore been given a value of 4.

What would be the scale factor, sf, that would be applicable in this case?

In the COCOMO II model the effort multipliers, em, are similar in nature to the development effort multipliers, dem, used in the original COCOMO. There are seven of these multipliers that are relevant to early design and sixteen that can be used at the post architecture stage. Table 5.12 lists the effort multipliers for early design and Table 5.13 for post architecture. As with COCOMO 81, each of these multipliers might, for a particular application, be given a rating of very low, low, nominal, high or very high. Each rating for each effort multiplier has a value associated with it. A value greater than 1 means that development effort is increased, while a value less than 1 causes effort to be decreased. The nominal rating means that the multiplier has no effect on the estimate, that is, it is 1. The intention is that these and other values used in COCOMO II will be modified and refined over time as details of actual projects are added to the database.

Table 5.12 COCOMO II Early Design effort multipliers

Effort modifier
Product reliability and complexity
Required reusability
Platform difficulty
Personnel capability
Personnel experience
Facilities available
Schedule pressure

Table 5.13 COCOMO II Post Architecture effort multipliers

Modifier type         Effort modifier
Product attributes    Required software reliability
                      Database size
                      Documentation match to life-cycle needs
                      Product complexity
                      Required reusability
Platform attributes   Execution time constraint
                      Main storage constraint
                      Platform volatility
Personnel attributes  Analyst capabilities
                      Application experience
                      Programmer capabilities
                      Platform experience
                      Programming language experience
                      Personnel continuity
Project attributes    Use of software tools (TOOL)
                      Multisite development (SITE)
