CCO        80,000    40,000    12,000     3,000    65,000
Totals    355,000   202,000    38,000    24,000   277,400

All this code counting stuff may appear confusing when summarized in a couple of pages. However, over the first year of the project, these analyses and definitions were highly scrutinized and well understood. They provided a useful perspective for discussing several of the engineering trade-offs being evaluated. After the first year, the SLOC counts were very stable and correlated well with the schedule estimating analyses performed throughout the project life cycle. On one hand, the CCPDS-R code counting process is a good example of why SLOC is a problematic metric for measuring software size. On the other hand, CCPDS-R is an example of a complex system in which SLOC metrics worked very effectively.

This section on software size is a good example of the issues associated with transitioning to component-based development. While projects can and must deal with heterogeneous measurements of size, there is no industry-accepted approach. Consequently, project managers need to analyze such important metrics definitions carefully.

D.8.2 Subsystem Process Improvements

One of my main themes in this book is that real process improvements should be evident in subsequent project performance. Because it comprised three separate projects, CCPDS-R provides a perfect case study for illustrating this trend. Overall, the Common Subsystem subsidized much of the groundwork for the PDS and STRATCOM subsystems: the process definition, the tools, and the reusable architecture primitives. With each successive subsystem, productivity and quality improved significantly. This is the expectation for a mature software process such as the one developed and evolved on CCPDS-R. It is always difficult to compare productivities across projects, but the CCPDS-R subsystems had consistent measures of human-generated SLOC and homogeneous processes, teams, and techniques. This consistent metrics approach produced a comparable set of measures. The normalized unit of measure chosen to compare productivities was the cost per SLOC. The absolute costs are irrelevant; the relative costs among subsystems are not. The PDS Subsystem was delivered at 40% of the cost per SLOC of the Common Subsystem, and the STRATCOM Subsystem at 33%. This is one of the real indicators of a level 3 or level 4 process.
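As a hedged illustration of this normalization, the sketch below computes cost per SLOC for each subsystem and expresses it relative to the Common Subsystem baseline. The cost and size figures are invented for the example; only the resulting ratios (roughly 40% and 33%) come from the project data.

    # Minimal sketch of the cost-per-SLOC comparison described above.
    # All cost and size figures are hypothetical; only the resulting
    # ratios (PDS ~40%, STRATCOM ~33% of Common) match the text.

    subsystems = {
        # name: (total cost in arbitrary units, human-generated SLOC)
        "Common":   (300_000, 100_000),
        "PDS":      (120_000, 100_000),
        "STRATCOM": ( 99_000, 100_000),
    }

    baseline = subsystems["Common"][0] / subsystems["Common"][1]

    for name, (cost, sloc) in subsystems.items():
        cost_per_sloc = cost / sloc
        print(f"{name:9} {cost_per_sloc:5.2f} cost/SLOC "
              f"= {100 * cost_per_sloc / baseline:3.0f}% of Common")

Because the absolute costs cancel out of the ratio, any consistent cost unit (staff-hours, dollars) yields the same relative comparison.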

Table D-13 summarizes the SCO traffic across all CSCIs at month 58. By this time, the Common Subsystem was well beyond its FQT and had processed quite a few SCOs in a maintenance mode to accommodate engineering change proposals. The PDS and STRATCOM subsystems were well into their test phases. For completeness, the table provides entries for support, test, and operating system/vendor. (Tracking of commercial product change orders was similar to SCO tracking.) Support included code generation tools, configuration management tools, metrics tools, and standalone test drivers; test included software drivers used for requirements verification.

Table D-13 shows that the values of the modularity metric (average scrap per change) and the adaptability metric (average rework per change) were generally much better in the subsequent subsystems (PDS and STRATCOM) than they were in the Common Subsystem. The one exception was the SCG CSCI, a special communications capability needed in the STRATCOM Subsystem that did not have a counterpart in the other subsystems and was uniquely complex.
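Both metrics in Table D-13 reduce to simple averages over the SCO records for each CSCI. The sketch below shows one way to compute them; the SCO record fields and the sample data are assumptions for illustration, not the project's actual schema or values.

    # Hedged sketch of the two change metrics named above, computed from
    # a hypothetical list of SCO records. Field names are illustrative.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class SCO:
        csci: str          # CSCI against which the change was written
        scrap_sloc: int    # SLOC broken/scrapped by the change
        rework_hrs: float  # effort to analyze, redesign, recode, retest

    scos = [
        SCO("CCO", 120, 14.0), SCO("CCO",  80, 10.5),
        SCO("SCG", 450, 40.0), SCO("SCG", 390, 35.0),
    ]

    by_csci = defaultdict(list)
    for sco in scos:
        by_csci[sco.csci].append(sco)

    for csci, changes in sorted(by_csci.items()):
        n = len(changes)
        modularity = sum(c.scrap_sloc for c in changes) / n    # avg scrap/change
        adaptability = sum(c.rework_hrs for c in changes) / n  # avg rework/change
        print(f"{csci}: {n} SCOs, modularity = {modularity:.0f} SLOC/change, "
              f"adaptability = {adaptability:.1f} hrs/change")

Lower averages indicate a design in which changes stay localized (modularity) and are cheap to resolve (adaptability), which is why the later subsystems scored better.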

CCPDS-R demonstrated the true indicator of a mature process, as described in Section E.2. With each subsequent subsystem, performance, as measured by quality, productivity, or time to market, improved. CCPDS-R was subjected to numerous SEI software capability evaluations over its lifetime, and the project's process maturity contributed to a level 3 or higher assessment. These performance improvements were not due solely to a mature process. Stakeholder teamwork and project investments in architecture middleware and process automation were probably equally important to overall project success.

D.8.3 SCO Resolution Profile

The average change cost evolved over time to a fairly constant value of 16 hours per change. This effort included analysis, redesign, recoding, and retesting of the resolution. The profile of changes shown in Figure D-16 provides another interesting perspective.
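As a worked example of that figure, the average cost is just the per-change activity efforts summed and averaged across changes. In the sketch below the per-activity split and the sample changes are assumed; only the roughly 16-hour average reflects the reported figure.

    # Sketch of the average change-cost computation. The activity split
    # (analysis, redesign, recode, retest) per SCO is hypothetical.

    changes = [
        # (analysis, redesign, recode, retest) hours per SCO
        (4.0, 4.5, 3.0, 5.0),
        (3.5, 4.0, 3.5, 4.5),
        (5.0, 4.0, 2.5, 5.5),
    ]

    per_change_totals = [sum(c) for c in changes]
    average = sum(per_change_totals) / len(per_change_totals)
    print(f"average resolution cost: {average:.1f} hours per change")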

Table D-13. CCPDS-R subsystem changes by CSCI