The CCPDS-R acquisition included two distinct phases: a concept definition (CD) phase and a full-scale development (FSD) phase. Five major bidders competed for the CD phase, and two firm-fixed-price contracts of about $2 million each were awarded. The winning contractors also invested their own discretionary resources to differentiate themselves with the best-value FSD phase proposal. Figure D-1 summarizes the overall acquisition process and the products of each phase.
[Figure D-1. The CCPDS-R acquisition process. Competitive design phase (inception): firm fixed price; products included the software development plan and the software engineering exercise; milestones were ISRR (initial system requirements review) and ISDR (initial system design review). Full-scale development phase (elaboration, construction, transition): fixed price plus award fee; products included 2167A software documentation and six software configuration items; major milestones were SRR (software requirements review), IPDR (interim preliminary design review), PDR (preliminary design review), CDR (critical design review), EOC (early operational capability, the beta delivery), and FQT (final qualification test).]
The CD phase was very similar in intent to the inception phase. The primary products were a system specification (a vision document), an FSD phase proposal (a business case, including the technical approach and a fixed-price-incentive and award-fee cost proposal), and a software development plan. The CD phase also included a system design review, technical interchange meetings with the government stakeholders (customer and user), and several contract-deliverable documents. These events and products enabled the FSD source selection to be based on demonstrated performance of the contractor-proposed team as well as the FSD proposal.
From a software perspective, there was one additional source selection criterion included in the FSD proposal activities: a software engineering exercise. This was a unique but very effective approach for assessing the abilities of the two competing contractors to perform software development. The Air Force was extremely concerned with the overall software risk of this project: Recent projects had demonstrated dismal software development performance. The Air Force acquisition authorities had also been frustrated by previous situations in which a contractor's crack proposal team was not the team committed to perform after contract award, and in which proposals exaggerated approaches or capabilities beyond what the contractor could deliver.
CCPDS-R was also a very large software development activity and was one of the first projects to use the Ada programming language. There was serious concern that the Ada development environments, contractor processes, and contractor training programs might not be mature enough to use on a full-scale development effort. The purpose of the software engineering exercise was to demonstrate that the contractor's proposed software process, Ada environment, and software team were in place, were mature, and were demonstrable.
The software engineering exercise occurred immediately after the FSD proposals were submitted. The customer provided both bidders with a simple two-page specification of a "missile warning simulator." This simulator had some of the same fundamental requirements as the CCPDS-R full-scale system, including a distributed architecture, a flexible user interface, and the basic processing scenarios of a simple CCPDS-R missile warning thread. The exercise requirements included the following:
• Use the proposed software team.
• Use the proposed software development techniques and tools.
• Use the FSD-proposed software development plan.
• Conduct a mock design review with the customer 23 days after receipt of the specification.
The software engineering exercise would provide objective evidence of the credibility of each contractor's proposed software development approach.
The results produced by TRW's CCPDS-R team were impressive. They demonstrated to the customer that the team was prepared, credible, and competent at conducting the proposed software approach. Approximately 12 staff-months were expended in the effort (12 people full-time for 23 days).
A detailed plan was established that included an activity network, responsibility assignments, and expected results for tracking progress. The plan included two architecture iterations and all the milestones and artifacts proposed in the software development plan. The exercise produced the following results:
• Four primary use cases were elaborated and demonstrated.
• A software architecture skeleton was designed, prototyped, and documented, including two executable, distributed processes; five concurrent tasks (separate threads of control); eight components; and 72 component-to-component interfaces.
• A total of 4,163 source lines of prototype components were developed and executed. Several thousand lines of reusable components were also integrated into the demonstration.
• Three milestones were conducted and more than 30 action items resolved.
• Production of 11 documents (corresponding to the proposed artifacts) demonstrated the automation inherent in the documentation tools.
• The Digital Equipment Corporation VAX/VMS tools, Rational R1000 environment, LaTeX documentation templates, and several custom-developed tools were used.
• Several needed improvements to the process and the tools were identified. The concept of evolving the plan, requirements, process, design, and environment at each major milestone was considered potentially risky but was implemented with rigorous change management.
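The "software architecture skeleton" above is worth making concrete: it is the set of executable tasks and component-to-component interfaces, built and demonstrated before the components behind those interfaces are fully implemented. A minimal sketch of the idea (all names here are illustrative, not taken from CCPDS-R) might wire a single missile warning processing thread through stub components:

```python
# Illustrative sketch (not CCPDS-R code): concurrent tasks exercise the
# component interfaces end to end while the components remain stubs.
import queue
import threading

class MissileWarningThread:
    """One end-to-end processing scenario, wired through stub components."""

    def __init__(self):
        self.sensor_q = queue.Queue()   # sensor -> correlator interface
        self.display_q = queue.Queue()  # correlator -> display interface

    def sensor_task(self):
        # Stub component: emits canned events instead of real radar data.
        for event in ("launch-detected", "track-update"):
            self.sensor_q.put(event)
        self.sensor_q.put(None)  # end-of-stream marker

    def correlator_task(self):
        # Stub component: passes events through; real logic comes later.
        while (event := self.sensor_q.get()) is not None:
            self.display_q.put(f"correlated:{event}")
        self.display_q.put(None)

    def run(self):
        # Run the tasks concurrently and collect what reaches the display.
        tasks = [threading.Thread(target=t)
                 for t in (self.sensor_task, self.correlator_task)]
        for t in tasks:
            t.start()
        results = []
        while (msg := self.display_q.get()) is not None:
            results.append(msg)
        for t in tasks:
            t.join()
        return results

print(MissileWarningThread().run())
```

The point of such a skeleton is that the interfaces and concurrency structure are executable and demonstrable early; component internals can then evolve behind stable interfaces under change management.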
This exercise proved to be a discriminating factor in the CCPDS-R contract award. TRW had proposed an architecture-first, demonstration-based approach and had demonstrated its operational concept successfully under realistic, albeit small-scale and accelerated, conditions. Despite submitting a bid more than 20% higher than its competitor's, TRW was selected as the best value and lowest risk, due in large part to its successful performance on the software engineering exercise and its ability to demonstrate a much more credible, lower risk process under realistic conditions.
The software engineering exercise served the same purpose as an SEI Software Capability Evaluation (Appendix E). Each bidder's proposal provided a software development plan—the "say what you do" part of an organizational process. The exercise demonstrated that the proposing organization could perform as advertised.