Figure 6-3. Life-cycle evolution of the artifact sets

The elaboration-phase prototype involves subsets of development in all four sets and specifically assesses whether the interfaces and collaborations among components are consistent and complete within the context of the system's primary requirements and scenarios. Although there is generally a broad understanding of component interfaces, there is usually not much depth in implementation for custom components. (However, commercial or other existing components may be fully elaborated.) A portion of all four sets must be evolved to some level of completion before an architecture baseline can be established. This evolution requires sufficient assessment of the design set, implementation set, and deployment set artifacts against the critical use cases of the requirements set to suggest that the project can proceed predictably with well-understood risks.
The main focus of the construction phase is design and implementation. The main focus early in this phase should be the depth of the design artifacts. Later in construction, the emphasis is on realizing the design in source code and individually tested components. This phase should drive the requirements, design, and implementation sets almost to completion. Substantial work is also done on the deployment set, at least to test one or a few instances of the programmed system through a mechanism such as an alpha or beta release.
The main focus of the transition phase is on achieving consistency and completeness of the deployment set in the context of the other sets. Residual defects are resolved, and feedback from alpha, beta, and system testing is incorporated.
As development proceeds, each of the parts evolves in more detail. When the system is complete, all four sets are fully elaborated and consistent with one another. In contrast to conventional practice, you do not specify the requirements, then do the design, and so forth. Instead, you evolve the entire system; decisions about the deployment may affect requirements, not just the other way around. The key emphasis here is to break the conventional mold, in which the default interpretation is that one set precedes another. Instead, one state of the entire system evolves into a more elaborate state of the system, usually involving evolution in each of the parts. During the transition phase, traceability between the requirements set and the deployment set is extremely important. The evolving requirements set captures a mature and precise representation of the stakeholders' acceptance criteria, and the deployment set represents the actual end-user product. Therefore, during the transition phase, completeness and consistency between these two sets are important. Traceability among the other sets is necessary only to the extent that it aids the engineering or management activities.
Conventional software testing followed the same document-driven approach that was applied to software development. Development teams built requirements documents, top-level design documents, and detailed design documents before constructing any source files or executables. Similarly, test teams built system test plan documents, system test procedure documents, integration test plan documents, unit test plan documents, and unit test procedure documents before building any test drivers, stubs, or instrumentation. This document-driven approach caused the same problems for the test activities that it did for the development activities.
One of the truly discriminating tenets of a modern process is to use exactly the same sets, notations, and artifacts for the products of test activities as are used for product development. In essence, we are simply identifying the test infrastructure necessary to execute the test process as a required subset of the end product. By doing this, we have forced several engineering disciplines into the process.
• The test artifacts must be developed concurrently with the product from inception through deployment. Thus, testing is a full-life-cycle activity, not a late-life-cycle activity.
• The test artifacts are communicated, engineered, and developed within the same artifact sets as the developed product.
• The test artifacts are implemented in programmable and repeatable formats (as software programs).
• The test artifacts are documented in the same way that the product is documented.
• Developers of the test artifacts use the same tools, techniques, and training as the software engineers developing the product.
These disciplines allow for significant levels of homogenization across project workflows, which are described in Chapter 8. Everyone works within the notations and techniques of the four sets used for engineering artifacts, rather than with separate sequences of design and test documents. Interpersonal communications, stakeholder reviews, and engineering analyses can be performed with fewer distinct formats, fewer ad hoc notations, less ambiguity, and higher efficiency.
Testing is only one aspect of the assessment workflow. Other aspects include inspection, analysis, and demonstration. Testing refers to the explicit evaluation through execution of deployment set components under a controlled scenario with an expected and objective outcome. The success of a test can be determined by comparing the expected outcome to the actual outcome with well-defined mathematical precision. Tests are exactly the forms of assessment that can be automated.
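This definition of a test can be sketched in a few lines: execute a component under a controlled, repeatable scenario and compare the expected outcome to the actual outcome objectively. The component and values below are hypothetical, chosen only to illustrate the point:

```python
def moving_average(samples, window):
    """Product component under test: a simple smoothing filter."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

def test_moving_average():
    # Controlled scenario: a fixed, repeatable input.
    actual = moving_average([2.0, 4.0, 6.0, 8.0], window=2)
    # Expected outcome, compared with well-defined precision.
    expected = [3.0, 5.0, 7.0]
    assert len(actual) == len(expected)
    assert all(abs(a - e) < 1e-9 for a, e in zip(actual, expected))

test_moving_average()  # raises AssertionError on any mismatch
```

Because both the scenario and the expected outcome are expressed in executable form, the comparison requires no human judgment, which is what makes this form of assessment automatable.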
Although the test artifact subsets are highly project-specific, the following example clarifies the relationship between test artifacts and the other artifact sets. Consider a project to perform seismic data processing for the purpose of oil exploration. This system has three fundamental subsystems: (1) a sensor subsystem that captures raw seismic data in real time and delivers these data to (2) a technical operations subsystem that converts raw data into an organized database and manages queries to this database from (3) a display subsystem that allows workstation operators to examine seismic data in human-readable form. Such a system would result in the following test artifacts:
• Management set. The release specifications and release descriptions capture the objectives, evaluation criteria, and results of an intermediate milestone. These artifacts are the test plans and test results negotiated among internal project teams. The software change orders capture test results (defects, testability changes, requirements ambiguities, enhancements) and the closure criteria associated with making a discrete change to a baseline.
• Requirements set. The system-level use cases capture the operational concept for the system and the acceptance test case descriptions, including the expected behavior of the system and its quality attributes. The entire requirements set is a test artifact because it is the basis of all assessment activities across the life cycle.
• Design set. A test model for nondeliverable components needed to test the product baselines is captured in the design set. These components include such design set artifacts as a seismic event simulation for creating realistic sensor data; a "virtual operator" that can support unattended, after-hours test cases; specific instrumentation suites for early demonstration of resource usage, transaction rates, or response times; and use case test drivers and component stand-alone test drivers.
• Implementation set. Self-documenting source code representations for test components and test drivers provide the equivalent of test procedures and test scripts. These source files may also include human-readable data files representing certain statically defined data sets that are explicit test source files. Output files from test drivers provide the equivalent of test reports.
• Deployment set. Executable versions of test components, test drivers, and data files are provided.
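A design-set test component and its implementation-set test driver for a system like this might look like the following sketch. Every name here (the simulator, the `ingest` function, the amplitude threshold) is invented for illustration and is not part of any actual seismic system's design:

```python
import random

def simulate_seismic_events(num_events, seed=42):
    """Design-set test component: generates realistic-looking raw
    sensor records so tests need no live sensor subsystem."""
    rng = random.Random(seed)  # fixed seed => repeatable runs
    return [{"timestamp": round(i * 0.5 + rng.random(), 3),
             "amplitude": round(rng.uniform(0.1, 9.9), 2)}
            for i in range(num_events)]

def ingest(raw_records, threshold=5.0):
    """Stand-in for the technical operations subsystem: catalogs
    only events strong enough to be worth querying later."""
    return [r for r in raw_records if r["amplitude"] >= threshold]

def test_ingest_filters_weak_events():
    """Implementation-set test driver: the executable equivalent
    of a written test procedure."""
    raw = simulate_seismic_events(100)
    catalog = ingest(raw)
    assert all(r["amplitude"] >= 5.0 for r in catalog)
    assert len(catalog) <= len(raw)

test_ingest_filters_weak_events()
```

The simulator plays the role of the seismic event simulation named above: because it is a program with a fixed seed rather than a document, the same "realistic sensor data" can be regenerated identically in every regression run.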
For any release, all the test artifacts and product artifacts are maintained using the same baseline version identifier. They are created, changed, and obsolesced as a consistent unit. Because test artifacts are captured using the same notations, methods, and tools, the approach to testing is consistent with design and development. This approach forces the evolving test artifacts to be maintained so that regression testing can be automated easily.
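How a shared baseline identifier makes regression runs mechanical can be sketched as follows. The directory layout and the baseline name are assumptions for illustration, not a prescribed convention:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

def run_regression(baseline, root):
    """Run every test driver filed under this baseline's directory.
    Because test artifacts carry the product's baseline identifier,
    the matching suite is located mechanically, with no manual
    selection of which tests go with which release."""
    failures = []
    for driver in sorted(Path(root, baseline, "tests").glob("test_*.py")):
        if subprocess.run([sys.executable, str(driver)]).returncode != 0:
            failures.append(driver.name)
    return failures

# Demonstration with a throwaway baseline tree (names hypothetical).
with tempfile.TemporaryDirectory() as root:
    tests = Path(root, "R4.2", "tests")
    tests.mkdir(parents=True)
    (tests / "test_ok.py").write_text("assert 1 + 1 == 2\n")
    (tests / "test_bad.py").write_text("raise SystemExit(1)\n")
    print(run_regression("R4.2", root))  # -> ['test_bad.py']
```

The point of the sketch is the lookup: the regression harness never decides which tests are current; the baseline identifier decides for it, which is what keeps test and product artifacts evolving as a consistent unit.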