FIGURE 20-7. Minimizing the costs of quality (COQ).

functional areas or plants. This is particularly important if we consider the life-cycle cost model discussed in Section 14.19. We showed that typical life-cycle costs are:

• Acquisition: 28 percent

• Operations and support: 60 percent

Since 60 percent of the life-cycle cost occurs after the product is put into service, then small increases in the R&D and acquisition areas could generate major cost savings in operation and support due to better design, higher quality, less maintenance, and so forth.
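As a rough illustration of this trade-off, the sketch below works through the arithmetic with hypothetical figures: only the 28 percent and 60 percent splits come from the section, while the total cost, the extra design investment, and the assumed savings rate are invented for illustration.

```python
# Hedged illustration of the life-cycle cost argument above.
# Only the 28% / 60% splits come from the text; the dollar total, the 2%
# extra investment, and the 5% savings factor are hypothetical assumptions.

total_lcc = 10_000_000          # assumed total life-cycle cost, $
acquisition = 0.28 * total_lcc  # 28 percent (from the text)
ops_support = 0.60 * total_lcc  # 60 percent (from the text)

# Suppose a modest extra design/quality investment during acquisition
# reduces downstream operations and support costs.
extra_design_investment = 0.02 * acquisition   # assumed 2% increase up front
ops_support_savings = 0.05 * ops_support       # assumed 5% reduction later

net_savings = ops_support_savings - extra_design_investment
print(f"Extra up-front investment:   ${extra_design_investment:,.0f}")
print(f"Operations/support savings:  ${ops_support_savings:,.0f}")
print(f"Net life-cycle savings:      ${net_savings:,.0f}")
```

Even with these conservative assumptions, the savings in operations and support dwarf the up-front outlay, which is the point of the argument above.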

20.9 THE SEVEN QUALITY CONTROL TOOLS³

Over the years, statistical methods have become prevalent throughout business, industry, and science. With the availability of advanced, automated systems that collect, tabulate, and analyze data, the practical application of these quantitative methods continues to grow.

More important than the quantitative methods themselves is their impact on the basic philosophy of business. The statistical point of view moves decision making out of the subjective, autocratic arena by providing a basis for objective decisions grounded in quantifiable facts. This change provides some very specific benefits:

• Improved process information

• Better communication

• Discussion based on facts

• Consensus for action

• Information for process changes

Statistical process control (SPC) takes advantage of the natural characteristics of any process. All business activities can be described as specific processes with known tolerances and measurable variances. The measurement of these variances and the resulting information provide the basis for continuous process improvement. The tools presented here provide both a graphical and measured representation of process data. The systematic application of these tools empowers business people to control products and processes to become world-class competitors.
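A minimal sketch of this idea, assuming some invented sample measurements: a process has a measurable mean and standard deviation, and control limits can be derived from them. The ±3-sigma limits shown are the usual control-chart convention; the data are not from the text.

```python
# Minimal sketch: estimate a process mean and standard deviation from sample
# data and derive 3-sigma control limits. The measurements are invented for
# illustration; +/- 3 sigma is the customary control-chart default.
import statistics

measurements = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.0]

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)      # sample standard deviation

ucl = mean + 3 * sigma                      # upper control limit
lcl = mean - 3 * sigma                      # lower control limit

print(f"mean={mean:.2f}  sigma={sigma:.3f}  UCL={ucl:.2f}  LCL={lcl:.2f}")

# A point outside [LCL, UCL] would signal a special (assignable) cause worth
# investigating, rather than the normal variation inherent in the process.
out_of_control = [x for x in measurements if not lcl <= x <= ucl]
print("out-of-control points:", out_of_control)
```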

The basic tools of statistical process control are data tables, Pareto analysis, cause-and-effect analysis, trend analysis, histograms, scatter diagrams, and process control charts. These basic tools provide for the efficient collection of data, identification of patterns in the data, and measurement of variability. Figure 20-8 shows the relationships among these seven tools and their use for the identification and analysis of improvement opportunities. We will review these tools and discuss their implementation and applications.
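To make one of these tools concrete, the sketch below performs a simple Pareto analysis on hypothetical defect counts; the categories and numbers are invented for illustration, not taken from the text.

```python
# Hedged sketch of a Pareto analysis: rank defect categories by count and
# compute the cumulative percentage each contributes. The categories and
# counts below are hypothetical.
defect_counts = {
    "wrong price": 34,
    "missing PO number": 21,
    "incorrect address": 9,
    "math error": 4,
    "other": 2,
}

total = sum(defect_counts.values())
cumulative = 0.0
for category, count in sorted(defect_counts.items(),
                              key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    print(f"{category:20s} {count:3d}  cumulative {cumulative:5.1f}%")

# Typically a few categories account for most of the defects (the 80/20
# rule), which tells the team where improvement effort will pay off first.
```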

3. This section is taken from H. K. Jackson and N. L. Frigon, Achieving the Competitive Edge (New York: John Wiley & Sons, Inc., 1996), Chapters 6 and 7. Reproduced by permission.

FIGURE 20-8. The seven quality control tools, grouped by their use for identification and analysis.

Data Tables

Data tables, or data arrays, provide a systematic method for collecting and displaying data. In most cases, data tables are forms designed for the purpose of collecting specific data. These tables are used most frequently where data are available from automated media. They provide a consistent, effective, and economical approach to gathering data, organizing them for analysis, and displaying them for preliminary review. Data tables sometimes take the form of manual check sheets where automated data are not necessary or available. Data tables and check sheets should be designed to minimize the need for complicated entries. Simple-to-understand, straightforward tables are a key to successful data gathering.
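As a simple sketch of such a check sheet, the code below tallies pass/fail inspection results by week, in the spirit of the invoice example discussed next; the weeks and results are hypothetical assumptions used only to show the structure of an attribute data table.

```python
# Hedged sketch of an attribute (pass/fail) check sheet. The inspection
# records below are invented for illustration.
from collections import Counter

# Each record: (week, passed_inspection)
inspections = [
    ("week 1", True), ("week 1", False), ("week 1", True),
    ("week 2", True), ("week 2", True),  ("week 2", False),
    ("week 3", False), ("week 3", True), ("week 3", True),
]

tally = Counter()
for week, passed in inspections:
    tally[(week, "pass" if passed else "fail")] += 1

weeks = sorted({week for week, _ in inspections})
print(f"{'':8s} {'pass':>5s} {'fail':>5s}")
for week in weeks:
    print(f"{week:8s} {tally[(week, 'pass')]:5d} {tally[(week, 'fail')]:5d}")
```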

Figure 20-9 is an example of an attribute (pass/fail) data table for the correctness of invoices. From this simple check sheet, several data points become apparent. The total
