A Systems Engineering Environment Case

A systems engineering environment (SEE) provides an automated information center in which a team can carry out the thirty elements of systems engineering. Developing such an environment is considerably more than an academic exercise. As an example, the Rome Laboratory of the Air Force has a program to structure an SEE known as Catalyst [A.3, A.4], which is being formulated as a comprehensive environment for a systems engineering team. The Navy has also been interested in systems engineering, having requested information and proposals from industry on the broad subject of the engineering of complex systems [A.5]. Subordinate areas of interest to the Navy under this program have included:

TABLE A.3 Comparison Matrix—SD-DSS

| Criteria | Subcriteria: Measures | Alternative 1 | Alternative 2 | Alternative 3 |
|---|---|---|---|---|
| Cost | RDT&E ($K) | 200 | 380 | 287 |
| Cost | Prod/Con ($K) | 1563 | 330 | 103 |
| Cost | O&M ($K) | 780 | 1500 | 1080 |
| Cost | Retire ($K) | 50 | 50 | 50 |
| Functionality | Data: collection/extraction | Subjective | Subjective | Subjective |
| Functionality | Data: data manipulation | Subjective | Subjective | Subjective |
| Functionality | Data: data dictionary | Subjective | Subjective | Subjective |
| Functionality | Data: security | Subjective | Subjective | Subjective |
| Functionality | Model: model creation | Subjective | Subjective | Subjective |
| Functionality | Model: sensitivity | Subjective | Subjective | Subjective |
| Functionality | Model: management | Subjective | Subjective | Subjective |
| Functionality | Dialog: consistency | Subjective | Subjective | Subjective |
| Functionality | Dialog: styles | Subjective | Subjective | Subjective |
| Functionality | Dialog: flexibility/adaptability | Subjective | Subjective | Subjective |
| Functionality | Dialog: help | Subjective | Subjective | Subjective |
| Functionality | Dialog: I/O devices supported | Subjective | Subjective | Subjective |
| System | RMA: availability (MTBF/(MTBF + MTTR)) | 0.92 | 0.98 | 0.9 |
| System | RMA: scheduled maintenance (mean time in hours) | 2 | 6 | 10 |
| System | Performance (seconds) | 1.2 | 0.76 | 1.9 |
| System | Capacity: saturation (no. of users) | 46 | 38 | 22 |
| System | Capacity: DBMS (defect records in millions) | 10 | 6 | 5 |
| System | Capacity: models (models in thousands) | 2500 | 2000 | 1500 |
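The availability figures in Table A.3 follow the standard inherent-availability relationship A = MTBF/(MTBF + MTTR). As a minimal sketch, the MTBF and MTTR values below are hypothetical, chosen only to reproduce the 0.92 figure reported for Alternative 1:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability: fraction of time the system is operational,
    given mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical values: MTBF = 230 h, MTTR = 20 h
print(round(availability(230.0, 20.0), 2))  # → 0.92
```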

TABLE A.4 Ideas on Calculating Evaluation Criteria—SD-DSS

| Criteria | Subcriteria: Measures | Ideas on Calculating Evaluation Criteria | Alternative 1 | Alternative 2 | Alternative 3 |
|---|---|---|---|---|---|
| Cost | RDT&E ($K) | Prefer lower RDT&E costs over life cycle | 200 | 380 | 287 |
| Cost | Prod/Con ($K) | Prefer lower prod/con costs over life cycle | 1563 | 330 | 103 |
| Cost | O&M ($K) | Prefer lower O&M costs over life cycle | 780 | 1500 | 1080 |
| Cost | Retire ($K) | Prefer lower retirement costs over life cycle | 50 | 50 | 50 |
| Functionality* | Data: collection/extraction | Prefer ability to handle more information | 0.114 | 0.481 | 0.405 |
| Functionality* | Data: data manipulation | Prefer robust ad hoc query & report gen. facility | 0.126 | 0.458 | 0.416 |
| Functionality* | Data: data dictionary | Prefer robust DD facility | 0.33 | 0.33 | 0.334 |
| Functionality* | Data: security | Prefer design with comprehensive security | 0.652 | 0.235 | 0.113 |
| Functionality* | Model: model creation | Prefer robust model creation facility | 0.126 | 0.458 | 0.416 |
| Functionality* | Model: sensitivity | Prefer sophisticated sensitivity analysis | 0.169 | 0.387 | 0.444 |
| Functionality* | Model: management | Prefer robust model management | 0.169 | 0.387 | 0.444 |
| Functionality* | Dialog: consistency | Prefer less variation in menu design | 0.5 | 0.25 | 0.25 |
| Functionality* | Dialog: styles | Prefer support for higher number of dialog styles | 0.126 | 0.458 | 0.416 |
| Functionality* | Dialog: flexibility/adaptability | Prefer open systems architecture design | 0.126 | 0.458 | 0.416 |
| Functionality* | Dialog: help | Prefer accurate and informative help | 0.169 | 0.387 | 0.444 |
| Functionality* | Dialog: I/O devices supported | Prefer higher number of different I/O devices supported | 0.139 | 0.435 | 0.426 |
| System | RMA: availability (MTBF/(MTBF + MTTR)) | Prefer higher system availability | 0.92 | 0.98 | 0.9 |
| System | RMA: scheduled maintenance (mean time in hours) | Prefer shorter maintenance time | 2 | 6 | 10 |
| System | Performance (seconds) | Prefer shorter end-user response time | 1.2 | 0.76 | 1.9 |
| System | Capacity: saturation (no. of users) | Prefer larger number of users | 46 | 38 | 22 |
| System | Capacity: DBMS (defect records in millions) | Prefer higher number of records | 10 | 6 | 5 |
| System | Capacity: models (models in thousands) | Prefer higher number of models | 2500 | 2000 | 1500 |

* Indicates pairwise comparisons performed.
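The starred Functionality rows carry relative priorities that sum to one across the three alternatives, as produced by AHP-style pairwise comparison (the Expert Choice approach of Figure A.3). As a sketch of the mechanics, the judgment matrix below is a hypothetical reconstruction that reproduces the Dialog: consistency row (0.5, 0.25, 0.25): Alternative 1 judged twice as consistent as Alternatives 2 and 3, which are judged equal.

```python
from math import prod

def ahp_priorities(matrix):
    """Priority vector from a reciprocal pairwise-comparison matrix,
    computed as the normalized geometric mean of each row."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical judgments for Dialog: consistency —
# Alt 1 is twice as preferred as Alt 2 and Alt 3; Alt 2 and Alt 3 are equal.
pairwise = [
    [1.0, 2.0, 2.0],
    [0.5, 1.0, 1.0],
    [0.5, 1.0, 1.0],
]
print([round(w, 3) for w in ahp_priorities(pairwise)])  # → [0.5, 0.25, 0.25]
```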


TABLE A.5 LCC Matrix—SD-DSS

| Design | Cost Category | FY93 | FY94 | FY95 | FY96 | FY97 | FY98 | FY99 | FY00 | FY01 | FY02 | Costs (K) | %LCC |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Alt 1 | 1. Research & development | $100 | $100 | | | | | | | | | $200 | 7.71 |
| | 2. Production & construction | $185 | $178 | $150 | $150 | $150 | $150 | $150 | $150 | $150 | $150 | $1,563 | 60.28 |
| | 3. Operations & maintenance | | | $180 | $120 | $80 | $80 | $80 | $80 | $80 | $80 | $780 | 30.08 |
| | 4. Retirement and disposal | | | | | | | | | | $50 | $50 | 1.93 |
| | Total (actual K$) | $285 | $278 | $330 | $270 | $230 | $230 | $230 | $230 | $230 | $280 | $2,593 | 100.00 |
| | Total (present K$) | $259 | $230 | $248 | $184 | $143 | $130 | $118 | $107 | $98 | $108 | $1,625 | |
| Alt 2 | 1. Research & development | $180 | $200 | | | | | | | | | $380 | 16.81 |
| | 2. Production & construction | $255 | $75 | | | | | | | | | $330 | 14.60 |
| | 3. Operations & maintenance | | | $240 | $180 | $180 | $180 | $180 | $180 | $180 | $180 | $1,500 | 66.37 |
| | 4. Retirement and disposal | | | | | | | | | | $50 | $50 | 2.21 |
| | Total (actual K$) | $435 | $275 | $240 | $180 | $180 | $180 | $180 | $180 | $180 | $230 | $2,260 | 100.00 |
| | Total (present K$) | $395 | $227 | $180 | $123 | $112 | $102 | $92 | $84 | $76 | $89 | $1,481 | |
| Alt 3 | 1. Research & development | $127 | $160 | | | | | | | | | $287 | 18.88 |
| | 2. Production & construction | $50 | $53 | | | | | | | | | $103 | 6.78 |
| | 3. Operations & maintenance | | | $240 | $120 | $120 | $120 | $120 | $120 | $120 | $120 | $1,080 | 71.05 |
| | 4. Retirement and disposal | | | | | | | | | | $50 | $50 | 3.29 |
| | Total (actual K$) | $177 | $213 | $240 | $120 | $120 | $120 | $120 | $120 | $120 | $170 | $1,520 | 100.00 |
| | Total (present K$) | $161 | $176 | $180 | $82 | $75 | $68 | $62 | $56 | $51 | $66 | $975 | |
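The "Total (present K$)" rows discount each year's actual cost back to program start. The figures in Table A.5 are consistent with a 10% discount rate — an inference from the numbers, not a rate stated in the source. A minimal sketch under that assumption:

```python
def present_value(costs_by_year, rate=0.10):
    """Discount a stream of year-end costs back to program start.

    costs_by_year[0] is the first fiscal year's cost, discounted one period;
    the 10% rate is inferred from Table A.5, not stated in the source.
    """
    return sum(c / (1.0 + rate) ** (t + 1) for t, c in enumerate(costs_by_year))

# Alternative 1 actual costs, FY93-FY02, in $K (from Table A.5)
alt1_actual = [285, 278, 330, 270, 230, 230, 230, 230, 230, 280]
print(round(present_value(alt1_actual)))  # → 1625, matching the table's total
```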


AVAIL — System availability
CAPACITY — Capacity
COL/ETR — Collect/extract functionality
CONSIST — Consistency of interface
CREATION — Model creation functionality
DATA — Data management functions
DATA MAN — Data manipulation functionality
DBMS — Number of records for software defects
DD — Data dictionary functionality
DIALOG — Dialog management functions
FLEX/ADP — Flexibility and adaptiveness to accommodate new technology
FUNCTION — Functionality
HELP — Help capabilities, diagnostics, training, etc.
I/O DEV — Input/output devices supported
MANAGE — Model management functionality
MODEL — Model management functions
MODELS — Number of decision support models
O&M — Operations and maintenance costs
PERFORM — Performance measured by response time in seconds
PROD/CON — Production and construction costs
RDT&E — Research, development, test & evaluation
RETIRE — Retirement and disposal costs
RMA — Reliability, maintainability, availability
SATUR — Saturation point in number of users
SCHED — Scheduled maintenance
SECURITY — Comprehensive data security protection
SENSITIVE — Sensitivity functionality
STYLES — Dialog styles supported
SYSTEM — System characteristics
TOT LCC — Total life cycle costs of the system

Figure A.3. Evaluating criteria using Expert Choice—SD-DSS.

Exhibit A.4: SD-DSS Risk Assessment

Program Risks

—Reduction of corporate fiscal resources
—Risk mitigation: business process improvement effort to reduce costs

• Cost/Schedule Risk: Medium

—Similar corporate programs realized 10% cost and schedule growth
—Risk mitigation: prototyping, user engineering during design, and utilization of corporate metrics program

• Administrative: Low

—Experienced and stable development team in place

Technical Risks

• Code & Unit Testing: Medium

—Development of three software modules (builds)
—Risk mitigation: maximal use of COTS SW, structured programming techniques, and early implementation of CASE tools

• Integration Testing: Medium

—Integration of three software modules (builds)
—Risk mitigation: CASE tools and early introduction of test plans

• Transition/Activation: Medium

—Uploading of software defect data from external sources
—Risk mitigation: early introduction of real defect data into the code & unit test and integration test phases

• System information capture

• System understanding, guidance, and synthesis

• System reengineering

• Integration

Thus, there is strong and continuing interest both in developing systems engineering itself and in automated environments that facilitate its practice. One approach to the formulation of an SEE follows. The objectives set forth for the SEE were:

• To enable a more effective and efficient design of large-scale systems

• To provide a facility/center with CASE expertise

• To provide tools for managing complex representations and data

• To allow different teams to simultaneously work on the various elements of system design

The high-level functional requirements of the SEE are shown in Figure A.4, with the key requirements under each in Exhibit A.5. Three architectures were set forth, as represented in Exhibit A.6. The mapping of the three architectures against the functional areas is shown in Table A.6.

Two different evaluation methods were employed to assess the merits of the three architectures. The evaluation matrices for these methods are shown in Table A.7. The methods differ in the weights assigned to the evaluation criteria; the second method led to the calculation of a "cost-benefit ratio." A 1-3-5-7-9 scale was utilized for the technical criteria. Results for the three alternatives are as follows:

Alternative 1: All Macintosh computers

• Technical score = 6.2 (Method 1); 5.7 (Method 2)

Alternative 2: All PC (DOS) computers

• Technical score = 7.0 (Method 1); 6.0 (Method 2)

Alternative 3: Macintosh + DOS computers

• Technical score = 5.9 (Method 1); 7.5 (Method 2)

By examining the two methods of evaluation, it was possible to see how the final evaluations changed as different procedures were followed. This examination of sensitivities provided a broader perspective on how final architectural alternatives might be evaluated to derive a preferred architecture.
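The two-method comparison above can be sketched as a simple weighted-sum evaluation. The criteria, weights, and 1-3-5-7-9 ratings below are hypothetical placeholders (Table A.7's actual matrices are not reproduced here); only the mechanics — identical ratings scored under two different weight vectors — reflect the approach described.

```python
# Hypothetical ratings on the 1-3-5-7-9 scale for three criteria (not from Table A.7)
ratings = {
    "Alt 1 (all Macintosh)": [7, 5, 5],
    "Alt 2 (all PC/DOS)":    [7, 7, 5],
    "Alt 3 (Mac + DOS)":     [5, 9, 5],
}

# Two hypothetical weighting schemes over the same three criteria
method_1 = [0.5, 0.3, 0.2]
method_2 = [0.2, 0.5, 0.3]

def weighted_score(scores, weights):
    """Weighted-sum technical score; weights are assumed to sum to 1."""
    return sum(s * w for s, w in zip(scores, weights))

for name, scores in ratings.items():
    s1 = weighted_score(scores, method_1)
    s2 = weighted_score(scores, method_2)
    print(f"{name}: Method 1 = {s1:.1f}, Method 2 = {s2:.1f}")
```

With these placeholder numbers the preferred alternative flips between the two methods (Alt 2 leads under Method 1, Alt 3 under Method 2), which is exactly the kind of sensitivity the case study set out to examine.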

Figure A.4. High-level functional requirements of an SEE.

Exhibit A.5: SEE Requirements

Requirements

• Needs/goals/objectives
• Mission engineering
• Requirements analysis
• Functional analysis
• Functional allocation
• Specification development

Performance Analysis

• Alternative evaluation
• System design/analysis
• Scheduling
• Life-cycle costing
• Technical performance measurement
• Program/decision analysis
• Risk analysis

Logistics and Integration

• Interface definition and control
• Integration logistics support
• Integration
• Configuration management

Test and Evaluation

• Quality assurance
• Requirements traceability

Operational Evaluation and Reengineering

• Operations
• Operational evaluation
—Prototyping
—Mathematical models
—Simulation models

Exhibit A.6: Alternative Architectures—SEE

Architecture 1

• Microsoft Office (Word, Excel, Powerpoint)
• SE tools:
—System Architect
—Oracle
—Expert Choice
—Microsoft Project
—SLIC
—RTM
—Extend

Architecture 2

• Microsoft Office (Word, Excel, Powerpoint)
• SE tools:
—CORE
—Design IDEF
—System Architect
—DBMS: Oracle
—Expert Choice
—@Risk
—Foresight
—RTM

Architecture 3

• Hybrid (Macs, PCs, and workstations)
• Microsoft Office
• SE tools:
—CORE
—RDD-100
—Design IDEF
—System Architect
—DBMS: Oracle
—Expert Choice
—@Risk
—NETSIM
—Foresight
—Microsoft Project
—SLIC
—Extend
—RTM
—GPSS/H
—MATLAB
—Demo
