
Management Set

Planning Artifacts

1. Work breakdown structure

2. Business case

3. Release specifications

4. Software development plan

Operational Artifacts

5. Release descriptions

6. Status assessments

7. Software change order database

8. Deployment documents

9. Environment


Figure 6-1. Overview of the artifact sets

6.1.1 The Management Set

The management set captures the artifacts associated with process planning and execution. These artifacts use ad hoc notations, including text, graphics, or whatever representation is required to capture the "contracts" among project personnel (project management, architects, developers, testers, marketers, administrators), among stakeholders (funding authority, user, software project manager, organization manager, regulatory agency), and between project personnel and stakeholders. Specific artifacts included in this set are the work breakdown structure (activity breakdown and financial tracking mechanism), the business case (cost, schedule, profit expectations), the release specifications (scope, plan, objectives for release baselines), the software development plan (project process instance), the release descriptions (results of release baselines), the status assessments (periodic snapshots of project progress), the software change orders (descriptions of discrete baseline changes), the deployment documents (cutover plan, training course, sales rollout kit), and the environment (hardware and software tools, process automation, documentation, training collateral necessary to support the execution of the process described in the software development plan and the production of the engineering artifacts).
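To make one of these artifacts concrete, the sketch below shows how a single software change order might be represented as a record in the change order database. It is illustrative only; the field names, states, and values are assumptions, not a schema taken from this chapter.

```java
// Illustrative sketch of one software change order (SCO) record in the change
// order database. Field names, states, and values are assumptions, not a
// prescribed schema.
import java.time.LocalDate;

public class SoftwareChangeOrder {
    enum State { PROPOSED, APPROVED, IN_PROGRESS, RESOLVED, CLOSED }

    private final String id;                 // hypothetical identifier, e.g. "SCO-0142"
    private final String title;              // one-line description of the discrete change
    private final String affectedBaseline;   // release baseline the change applies to
    private final double estimatedEffortHours;
    private State state = State.PROPOSED;
    private LocalDate closed;

    public SoftwareChangeOrder(String id, String title,
                               String affectedBaseline, double estimatedEffortHours) {
        this.id = id;
        this.title = title;
        this.affectedBaseline = affectedBaseline;
        this.estimatedEffortHours = estimatedEffortHours;
    }

    // State transitions are what make the database useful for the status
    // assessments and trend analyses discussed in this chapter.
    public void transitionTo(State next) {
        state = next;
        if (next == State.CLOSED) closed = LocalDate.now();
    }

    public String summary() {
        return String.format("%s [%s] %.1f h against %s: %s%s",
                id, state, estimatedEffortHours, affectedBaseline, title,
                closed == null ? "" : " (closed " + closed + ")");
    }

    public static void main(String[] args) {
        SoftwareChangeOrder sco = new SoftwareChangeOrder(
                "SCO-0142", "Correct alarm latency under peak load", "Release 2.1", 16.0);
        sco.transitionTo(State.APPROVED);
        System.out.println(sco.summary());
    }
}
```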

Management set artifacts are evaluated, assessed, and measured through a combination of the following:

• Relevant stakeholder review

• Analysis of changes between the current version of the artifact and previous versions (management trends and project performance changes in terms of cost, schedule, and quality)

• Major milestone demonstrations of the balance among all artifacts and, in particular, the accuracy of the business case and vision artifacts

6.1.2 The Engineering Sets

The engineering sets consist of the requirements set, the design set, the implementation set, and the deployment set. The primary mechanism for evaluating the evolving quality of each artifact set is the transitioning of information from set to set, thereby maintaining a balance of understanding among the requirements, design, implementation, and deployment artifacts. Each of these components of the system description evolves over time.

Requirements Set

Structured text is used for the vision statement, which documents the project scope that supports the contract between the funding authority and the project team. Ad hoc formats may also be used for supplementary specifications (such as regulatory requirements) and user mockups or other prototypes that capture requirements. UML notation is used for engineering representations of requirements models (use case models, domain models). The requirements set is the primary engineering context for evaluating the other three engineering artifact sets and is the basis for test cases.

Requirements artifacts are evaluated, assessed, and measured through a combination of the following:

• Analysis of consistency with the release specifications of the management set

• Analysis of consistency between the vision and the requirements models

• Mapping against the design, implementation, and deployment sets to evaluate the consistency and completeness and the semantic balance between information in the different sets

• Analysis of changes between the current version of requirements artifacts and previous versions (scrap, rework, and defect elimination trends)

• Subjective review of other dimensions of quality

Design Set

UML notation is used to engineer the design models for the solution. The design set contains varying levels of abstraction that represent the components of the solution space (their identities, attributes, static relationships, dynamic interactions). The design models include enough structural and behavioral information to ascertain a bill of materials (quantity and specification of primitive parts and materials, labor, and other direct costs). Design model information can be straightforwardly and, in many cases, automatically translated into a subset of the implementation and deployment set artifacts. Specific design set artifacts include the design model, the test model, and the software architecture description (an extract of information from the design model that is pertinent to describing an architecture).
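To illustrate, in a deliberately simplified way, how design set information maps into the implementation set, the sketch below shows the kind of source skeleton a translation from a UML component might produce. The component name (SensorMonitor), its required interface (AlarmService), and the operation are hypothetical examples, not artifacts from any particular design model.

```java
// Hypothetical skeleton produced by translating a UML design component into source.
// "SensorMonitor" depends on a required interface "AlarmService"; the names,
// attribute, and operation are illustrative assumptions.

interface AlarmService {
    void raiseAlarm(String source, String description);
}

public class SensorMonitor {
    private final AlarmService alarmService;   // dependency relationship from the design model
    private final double limit;                // attribute identified in the design model

    public SensorMonitor(AlarmService alarmService, double limit) {
        this.alarmService = alarmService;
        this.limit = limit;
    }

    // The operation signature comes from the design model; detailed behavior is
    // elaborated in the implementation set and verified by component tests.
    public void checkReading(double reading) {
        if (reading > limit) {
            alarmService.raiseAlarm("SensorMonitor",
                    "reading " + reading + " exceeds limit " + limit);
        }
    }
}
```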

The design set is evaluated, assessed, and measured through a combination of the following:

• Analysis of the internal consistency and quality of the design model

• Analysis of consistency with the requirements models

• Translation into implementation and deployment sets and notations (for example, traceability, source code generation, compilation, linking) to evaluate the consistency and completeness and the semantic balance between information in the sets

• Analysis of changes between the current version of the design model and previous versions (scrap, rework, and defect elimination trends)

• Subjective review of other dimensions of quality

Because the level of automated analysis available on design models is currently limited, human analysis must be relied on. This situation should change over the next few years with the maturity of design model analysis tools that support metrics collection, complexity analysis, style analysis, heuristic analysis, and consistency analysis.

Implementation Set

The implementation set includes source code (programming language notations) that represents the tangible implementations of components (their form, interface, and dependency relationships) and any executables necessary for stand-alone testing of components. These executables are the primitive parts needed to construct the end product, including custom components, application programming interfaces (APIs) of commercial components, and APIs of reusable or legacy components in a programming language source (such as Ada 95, C++, Visual Basic, Java, or Assembly). Implementation set artifacts can also be translated (compiled and linked) into a subset of the deployment set (end-target executables). Specific artifacts include self-documenting product source code baselines and associated files (compilation scripts, configuration management infrastructure, data files), self-documenting test source code baselines and associated files (input test data files, test result files), stand-alone component executables, and component test driver executables.
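As a minimal sketch of the "component test driver executables" mentioned above, the example below bundles a trivial component with a driver that compares expected results against actual results automatically. The component (a simple checksum) and its test data are invented for illustration.

```java
// Minimal sketch of a stand-alone component test driver.
// The component under test (a trivial checksum) and the expected values are hypothetical.
public class ChecksumTestDriver {

    // Component under test: an additive checksum over a byte array, kept to 16 bits.
    static int checksum(byte[] data) {
        int sum = 0;
        for (byte b : data) {
            sum = (sum + (b & 0xFF)) & 0xFFFF;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Expected results are held alongside the inputs so the driver can
        // compare them automatically on every baseline.
        byte[][] inputs   = { {}, {1, 2, 3}, {(byte) 0xFF, (byte) 0xFF} };
        int[]    expected = { 0,  6,         510 };

        int failures = 0;
        for (int i = 0; i < inputs.length; i++) {
            int actual = checksum(inputs[i]);
            if (actual != expected[i]) {
                System.out.printf("FAIL case %d: expected %d, actual %d%n",
                        i, expected[i], actual);
                failures++;
            }
        }
        System.out.println(failures == 0 ? "All checksum cases passed"
                                         : failures + " case(s) failed");
        // A nonzero exit status lets build scripts detect the failure.
        if (failures > 0) System.exit(1);
    }
}
```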

Implementation sets are human-readable formats that are evaluated, assessed, and measured through a combination of the following:

• Analysis of consistency with the design models

• Translation into deployment set notations (for example, compilation and linking) to evaluate the consistency and completeness among artifact sets

• Assessment of component source or executable files against relevant evaluation criteria through inspection, analysis, demonstration, or testing

• Execution of stand-alone component test cases that automatically compare expected results with actual results

• Analysis of changes between the current version of the implementation set and previous versions (scrap, rework, and defect elimination trends)

• Subjective review of other dimensions of quality

Deployment Set

The deployment set includes user deliverables and machine language notations, executable software, and the build scripts, installation scripts, and executable target-specific data necessary to use the product in its target environment. These machine language notations represent the product components in the target form intended for distribution to users. Deployment set information can be installed, executed against scenarios of use (tested), and dynamically reconfigured to support the features required in the end product. Specific artifacts include executable baselines and associated run-time files, and the user manual.

Deployment sets are evaluated, assessed, and measured through a combination of the following:

• Testing against the usage scenarios and quality attributes defined in the requirements set to evaluate the consistency and completeness and the semantic balance between information in the two sets

• Testing the partitioning, replication, and allocation strategies in mapping components of the implementation set to physical resources of the deployment system (platform type, number, network topology)

• Testing against the defined usage scenarios in the user manual such as installation, user-oriented dynamic reconfiguration, mainstream usage, and anomaly management

• Analysis of changes between the current version of the deployment set and previous versions (defect elimination trends, performance changes)

• Subjective review of other dimensions of quality

The rationale for selecting the management, requirements, design, implementation, and deployment sets was not scientific. The goal was to optimize presentation of the process activities, artifacts, and objectives. Some of the rationale that resulted in this conceptual framework is described next. Although there are several minor exceptions to these generalizations, they are useful in understanding the overall artifact sets.

Each artifact set uses different notation(s) to capture the relevant artifacts. Management set notations (ad hoc text, graphics, use case notation) capture the plans, process, objectives, and acceptance criteria. Requirements notations (structured text and UML models) capture the engineering context and the operational concept. Design notations (in UML) capture the engineering blueprints (architectural design, component design). Implementation notations (software languages) capture the building blocks of the solution in human-readable formats. Deployment notations (executables and data files) capture the solution in machine-readable formats.

Each artifact set is the predominant development focus of one phase of the life cycle; the other sets take on check and balance roles. As illustrated in Figure 6-2, each phase has a predominant focus: Requirements are the focus of the inception phase; design, the elaboration phase; implementation, the construction phase; and deployment, the transition phase. The management artifacts also evolve, but at a fairly constant level across the life cycle.

Figure 6-2. Life-cycle focus on artifact sets

Most of today's software development tools map closely to one of the five artifact sets.

1. Management: scheduling, workflow, defect tracking, change management, documentation, spreadsheet, resource management, and presentation tools

2. Requirements: requirements management tools

3. Design: visual modeling tools

4. Implementation: compiler/debugger tools, code analysis tools, test coverage analysis tools, and test management tools

5. Deployment: test coverage and test automation tools, network management tools, commercial components (operating systems, GUIs, DBMSs, networks, middleware), and installation tools

Allocation of responsibilities among project teams is straightforward and aligns with the process workflows presented in Chapter 8.

Implementation Set versus Deployment Set

The separation of the implementation set (source code) from the deployment set (executable code) is important because there are very different concerns with each set. The structure of the information delivered to the user (and typically the test organization) is very different from the structure of the source code information. Engineering decisions that have an impact on the quality of the deployment set but are relatively incomprehensible in the design and implementation sets include the following:

• Dynamically reconfigurable parameters (buffer sizes, color palettes, number of servers, number of simultaneous clients, data files, run-time parameters)

• Effects of compiler/link optimizations (such as space optimization versus speed optimization)

• Performance under certain allocation strategies (centralized versus distributed, primary and shadow threads, dynamic load balancing, hot backup versus checkpoint/rollback)

• Virtual machine constraints (file descriptors, garbage collection, heap size, maximum record size, disk file rotations)

• Process-level concurrency issues (deadlock and race conditions)

• Platform-specific differences in performance or behavior

Much of this configuration information is important engineering source data that should be captured either in the implementation set (if it is embedded within source code) or in the deployment set (if it is embedded within data files, configuration files, installation scripts, or other target-specific components). In dynamically reconfigurable systems or portable components, it is usually better to separate the source code implementation concerns from the target environment concerns (for reasons of performance, dynamic adaptability, or source code change management). With this approach, the implementation can be decoupled from the actual platform type and from the number and topology of the underlying computing infrastructure, which includes operating systems, middleware, networks, and DBMSs.
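A minimal sketch of this decoupling, assuming a simple properties file as the target-specific data file, is shown below; the file name and property keys are hypothetical. Run-time parameters such as buffer sizes and server counts live in the deployment set, so they can be changed per target environment without touching the implementation set.

```java
// Illustrative sketch: reading dynamically reconfigurable parameters from a
// deployment-set data file instead of embedding them in the implementation set.
// The file name "deployment.properties" and the property keys are hypothetical.
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class DeploymentConfig {
    private final int bufferSize;
    private final int serverCount;
    private final int maxClients;

    private DeploymentConfig(int bufferSize, int serverCount, int maxClients) {
        this.bufferSize = bufferSize;
        this.serverCount = serverCount;
        this.maxClients = maxClients;
    }

    // Load target-specific values at startup; defaults keep a development build
    // usable even when no deployment file is installed.
    static DeploymentConfig load(String path) {
        Properties p = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            p.load(in);
        } catch (IOException e) {
            System.err.println("No deployment file found; using development defaults");
        }
        return new DeploymentConfig(
                Integer.parseInt(p.getProperty("buffer.size", "4096")),
                Integer.parseInt(p.getProperty("server.count", "1")),
                Integer.parseInt(p.getProperty("max.clients", "8")));
    }

    public static void main(String[] args) {
        DeploymentConfig cfg = DeploymentConfig.load("deployment.properties");
        System.out.printf("buffers=%d servers=%d clients=%d%n",
                cfg.bufferSize, cfg.serverCount, cfg.maxClients);
    }
}
```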

As an example, consider the software architecture of a one million SLOC missile warning system (a project described in detail in the case study, Appendix D) with extreme requirements for fault tolerance and data processing performance. On this project, significantly different configurations of executables could be built from the same source sets.

• A version that includes only the primary thread of processing on a development host to do a subset of scenario tests

• A version that includes primary and backup processing threads on a development host, which could then exercise some of the logical reconfiguration scenarios

• Functionally equivalent versions of the two preceding configurations that could execute on the target processors to assess the required throughput and response time of the critical-thread scenarios on the candidate target configuration

• A version that could execute a primary thread of servers on one target processor, a shadow thread of servers on a separate backup target processor, a test/exercise thread on either target, and a suite of thread-independent user interface clients on user workstations. The latter, which could support a broad range of dynamic reconfigurations, was essentially the final target configuration.

Deployment of commercial products to customers can also span a broad range of test and deployment configurations. For example, middleware products provide high-performance, reliable object request brokers that are delivered on several platform implementations, including workstation operating systems, bare embedded processors, large mainframe operating systems, and several real-time operating systems. The product configurations support various compilers and languages as well as various implementations of network software. The heterogeneity of all the various target configurations results in the need for a highly sophisticated source code structure and a huge suite of different deployment artifacts.

6.1.3 Artifact Evolution over the Life Cycle

Each state of development represents a certain amount of precision in the final system description. Early in the life cycle, precision is low and the representation is generally high. Eventually, the precision of representation is high and everything is specified in full detail. At any point in the life cycle, the five sets will be in different states of completeness. However, they should be at compatible levels of detail and reasonably traceable to one another. Performing detailed traceability and consistency analyses early in the life cycle (when precision is low and changes are frequent) usually has a low return on investment. As development proceeds, the architecture stabilizes, and maintaining traceability linkage among artifact sets is worth the effort.
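As the architecture stabilizes and traceability becomes worth maintaining, a linkage entry can be as simple as the sketch below, which ties one requirement to the design, implementation, and test artifacts that realize it. The structure and names are assumptions for illustration, not a prescribed linkage format.

```java
// Illustrative traceability link among the artifact sets; the structure and
// field names are assumptions, not a prescribed format.
import java.util.List;

public record TraceLink(
        String requirementId,            // e.g., a use case or vision item identifier
        List<String> designElements,     // design model components that realize it
        List<String> implementationFiles,
        List<String> testCases) {

    public static void main(String[] args) {
        TraceLink link = new TraceLink(
                "UC-07: operator acknowledges alarm",       // hypothetical requirement
                List.of("AlarmController", "AlarmQueue"),   // hypothetical design elements
                List.of("AlarmController.java"),
                List.of("AlarmAckScenarioTest"));
        System.out.println(link);
    }
}
```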

Each phase of development focuses on a particular artifact set. At the end of each phase, the overall system state will have progressed on all sets, as illustrated in Figure 6-3.

The inception phase focuses mainly on critical requirements, usually with a secondary focus on an initial deployment view, little focus on implementation except perhaps choice of language and commercial components, and possibly some high-level focus on the design architecture but not on design detail.

During the elaboration phase, there is much greater depth in requirements, much more breadth in the design set, and further work on implementation and deployment issues such as performance trade-offs under primary scenarios and make/buy analyses. Elaboration phase activities include the generation of an executable architecture baseline.

Figure 6-3. Artifact set states at the end of each phase, across the engineering stage (inception and elaboration) and the production stage (construction and transition)
