1. Are the software work products produced according to the project's defined software process?
a The software project manager is responsible for compliance with the software development plan. Any deviations from plan or standards (or both) are reviewed periodically through status assessments and are accommodated as appropriate in subsequent iterations or product baselines.
2. Is consistency maintained across software work products (e.g., is the documentation tracing allocated requirements through software requirements, design, code, and test cases maintained)?
a The CCB provides continuous attention to change management traceability. Release descriptions are a mechanism for assessing consistency and completeness of the work products of a major milestone. Traceability among the engineering sets (use case models, design models, source code, and executable components) is maintained by the environment. The extent to which such information is summarized or detailed to ensure completeness depends on the scale of the project and the stakeholder concerns (for example, safety), and is captured in release descriptions.
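The traceability described above — linking use case models through design, code, and test artifacts — can be sketched as a simple data structure. This is a hypothetical illustration; the class and artifact names are invented for the example and are not part of any specific environment:

```python
# Minimal sketch of a traceability link across the engineering sets.
# Names (TraceLink, UC-12, etc.) are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TraceLink:
    use_case: str
    design_models: list = field(default_factory=list)
    source_files: list = field(default_factory=list)
    test_cases: list = field(default_factory=list)

    def is_complete(self):
        # A link is "complete" when every downstream engineering set
        # (design, code, test) is represented for the use case.
        return all([self.design_models, self.source_files, self.test_cases])

link = TraceLink("UC-12 Process Payment",
                 design_models=["PaymentController"],
                 source_files=["payment.cpp"],
                 test_cases=["TC-45"])
print(link.is_complete())  # True
```

A release description would then summarize, for each use case, whether such links are complete, at whatever level of detail the project's scale and stakeholder concerns demand.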
3. Does the project follow a written organizational policy for performing the software engineering activities (e.g., a policy which requires the use of appropriate methods and tools for building and maintaining software products)?
a The organizational policy mandates specific activities and a standard environment so that methods and tools are standardized across projects; many of the methods and tools, however, are left open to project-specific selection.
4. Are adequate resources provided for performing the software engineering tasks (e.g., funding, skilled individuals, and appropriate tools)?
▲ The adequacy of software engineering resources is not specified by policy. A good benchmark is that about 50% of a project's effort should be allocated to software engineering tasks: 10% in requirements, 15% in design, and 25% in component implementation. The determination of adequate resources and individuals is project-specific and should be scrutinized by the PRA.
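The 50% benchmark above can be checked with simple arithmetic. The total effort figure below is an assumed example value, not a prescription:

```python
# Illustrative check of the benchmark: ~50% of total project effort on
# software engineering, split 10/15/25 across requirements, design,
# and component implementation. total_effort_months is assumed.
total_effort_months = 100

allocation = {"requirements": 0.10, "design": 0.15, "implementation": 0.25}
engineering = {k: v * total_effort_months for k, v in allocation.items()}
print(engineering)                # {'requirements': 10.0, 'design': 15.0, 'implementation': 25.0}
print(sum(engineering.values()))  # 50.0 -- half the project's effort
```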
5. Are measurements used to determine the functionality and quality of the software products (e.g., numbers, types, and severity of defects identified)?
▲ The explicit purpose of the metrics required and reported in status assessments is to assess progress and quality.
6. Are the activities and work products for engineering software subjected to SQA reviews and audits (e.g., is required testing performed, are allocated requirements traced through the software requirements, design, code, and test cases)?
a All the engineering sets (technical artifacts) are evolved and updated at each major milestone. SCOs, CCBs, and release descriptions force continuous attention to traceability.
Intergroup Coordination, Level 3
This set of questions is specifically supported by a focus on architecture. Intergroup coordination is associated with software architecture because the architecture encompasses both the intercomponent interfaces and the human-to-human interfaces.
1. On the project, do the software engineering group and other engineering groups collaborate with the customer to establish the system requirements?
▲ The project vision statement and the release specifications are the responsibility of the software architecture group. They are negotiated with the customer and are evolved at each iteration.
2. Do the engineering groups agree to the commitments as represented in the overall project plan?
a The software development plans, release specifications, and WBS define the commitments and plans.
3. Do the engineering groups identify, track, and resolve intergroup issues (e.g., incompatible schedules, technical risks, or system-level problems)?
a Demonstrations are the mechanism for productive and tangible engineering coordination at an architectural level. CCBs provide intergroup resolution on the level of SCOs. Proper scheduling of architecture demonstrations enables integration issues, and other important intergroup issues, to be resolved as early in the life cycle as possible.
4. Is there a written organizational policy that guides the establishment of interdisciplinary engineering teams?
▲ CCBs, PRAs, and demonstration teams are established interdisciplinary engineering teams.
5. Do the support tools used by different engineering groups enable effective communication and coordination (e.g., compatible word processing systems, database systems, and problem tracking systems)?
a Standard work breakdown structures, standard environments, and the SCO database enable the various engineering groups to coordinate within a common framework. Within a project, the artifacts developed by all teams should use common notations, methods, and tools.
6. Are measures used to determine the status of the intergroup coordination activities (e.g., effort expended by the software engineering group to support other groups)?
a Tracking the efforts of the architecture team provides insight into the stability of the architecture. Stability is a good indicator of effective intergroup coordination. Because the architecture team is separate, with explicit WBS elements, tracking can be achieved more easily through the defined management and quality metrics reported in the periodic status assessments.
7. Are the activities for intergroup coordination reviewed with the project manager on both a periodic and event-driven basis?
a A good architecture-first approach plans the first iterations to expose any significant issues in intergroup coordination. Periodic status assessments and major milestone events provide tangible and objective insight into intergroup coordination through observation of architecture metrics.
Peer Reviews, Level 3
The process framework does not specifically call for peer reviews in the classic sense. However, there are several mechanisms whose purpose is exactly that of classic peer reviews. These mechanisms include demonstrations (global integration peer reviews), CCBs (change management peer reviews), status assessments (management peer reviews), and conventional peer reviews (code walkthroughs, inspections), as incorporated by project software development plans.
1. Are peer reviews planned?
▲ CCBs, status assessments, and demonstrations should be planned and followed through in a systematic way.
2. Are actions associated with defects that are identified during peer reviews tracked until they are removed?
a All defects, independent of source, are tracked via SCOs, and the metrics are reported in the status assessments.
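The SCO-based tracking described here can be sketched as a small example. The record fields and counts below are invented for illustration; an actual SCO database would carry far more state:

```python
# Hypothetical sketch: every defect, whatever its source, becomes an SCO
# whose state is tracked to closure; simple counts by state feed the
# metrics reported in status assessments.
from collections import Counter

scos = [
    {"id": 101, "source": "demonstration", "state": "closed"},
    {"id": 102, "source": "peer review",   "state": "open"},
    {"id": 103, "source": "test",          "state": "closed"},
]

by_state = Counter(s["state"] for s in scos)
print(by_state["open"], "open,", by_state["closed"], "closed")  # 1 open, 2 closed
```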
3. Does the project follow a written organizational policy for performing peer reviews?
a Organizational policy should require CCBs, demonstrations, and status assessments. It should also specify that other forms of peer reviews be defined in the project's software development plan.
4. Do participants of peer reviews receive the training required to perform their roles?
a Training is an organization-specific issue.
5. Are measurements used to determine the status of peer review activities (e.g., number of peer reviews performed, effort expended on peer reviews, and number of work products reviewed compared to the plan)?
a CCBs provide extensive change management metrics. Release descriptions require the same ROI metrics to be collected for demonstrations. The SEPA periodically assesses the ROI of organizational trend analyses from status assessment data.
6. Are peer review activities and work products subject to SQA review and audit (e.g., planned reviews are conducted and follow-up actions are tracked)?
a Software project managers, CCBs, and PRAs provide continuous follow-through.
Quantitative Process Management, Level 4
1. Does the project follow a documented plan for conducting quantitative process management?
a An appendix to the organizational policy should define the plan for quantitative process improvement. Status assessments evaluate collected metrics in the context of organizational norms maintained by the SEPA.
2. Is the performance of the project's defined software process controlled quantitatively (e.g., through the use of quantitative analytic methods)?
▲ The data are collected and reported to the SEPA through status assessments. These metrics are fed back into the planning of each subsequent iteration.
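Quantitative control of this kind is often implemented as control limits around an organizational norm. The following is a minimal sketch; the metric (rework hours per SCO) and the historical values are assumed for illustration, not prescribed by the framework:

```python
# Assumed organizational norm: rework hours per SCO from past iterations.
# A value outside mean +/- 3 sigma flags a special cause of variation
# worth investigating before planning the next iteration.
import statistics

history = [12, 9, 14, 11, 10, 13, 12, 11]
mean = statistics.mean(history)
sigma = statistics.stdev(history)
upper, lower = mean + 3 * sigma, mean - 3 * sigma

current = 19  # this iteration's observed value
in_control = lower <= current <= upper  # 19 falls outside the limits here
print(f"limits: [{lower:.1f}, {upper:.1f}], in control: {in_control}")
```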
3. Is the process capability of the organization's standard software process known in quantitative terms?
▲ An appendix to the organizational policy should define the current process assessment and the plan for process improvement in quantitative terms.
4. Does the project follow a written organizational policy for measuring and controlling the performance of the project's defined software process (e.g., projects plan for how to identify, specify, and control special causes of variation)?
a The software development plan should define a metrics program for measuring and controlling the software process. It should also require that this process (and its control mechanisms) be evolved and improved as the project progresses.
5. Are adequate resources provided for quantitative process management activities (e.g., funding, software support tools, and organizational measurement program)?
a The adequacy of process management resources is not specified by policy. A good benchmark is that a team about the size of the square root of the number of active projects is sufficient.
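The square-root benchmark is easy to evaluate; the project counts below are arbitrary examples:

```python
# Rule-of-thumb sizing for a process management team:
# roughly sqrt(number of active projects) people.
import math

for projects in (4, 16, 25, 50):
    print(projects, "projects ->", round(math.sqrt(projects)), "people")
```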
6. Are measurements used to determine the status of the quantitative process management activities (e.g., cost of quantitative process management activities and accomplishment of milestones for quantitative process management activities)?
a An appendix to the organizational policy should address the ROI of the SEPA activities.
7. Are the activities for quantitative process management reviewed with the project manager on both a periodic and event-driven basis?
a Status assessments and major milestones provide periodic and event-driven reviews of quantitative process management data.