Process Control and Optimization, VOLUME II
2.34 Statistical Process Control

D. H. F. LIU (1995)
J. F. TATERA (2005)

Software Package Suppliers: ABB Inc. (www.abb.com); Aspen Technology Inc. (www.aspentech.com); CyboSoft (www.cybosoft.com); Foxboro Co. (www.foxboro.com); Honeywell (www.acs.honeywell.com and www.honeywell.com/imc); JMP Software (www.jmp.com); Lifestar (www.qimacros.com); Minitab Inc. (www.minitab.com); Northwest Analytical (www.nwasoft.com); Oil Systems Inc.—PI (www.osisoft.com); Pavilion Technologies Inc. (www.pavtech.com); Rosemount Inc./Emerson Process Management (www.rosemount.com and www.emersonprocess.com); Siemens (www.sea.siemens.com/ia); Software Toolbox (www.softwaretoolbox.com); StatPoint LLC (www.statpoint.com); StatSoft (www.statsoft.com); Stochos Inc. (www.stochos.com); Wonderware (www.wonderware.com); Yokogawa (www.yokogawa.com/us); Zontec Inc. (www.zontec-spc.com)

INTRODUCTION

The goal of statistical process control (SPC) is to detect whether a process has undergone a statistical abnormality, that is, a shift from its normal statistical behavior. The use of statistical techniques to detect variations in product quality and consistency dates back to Walter Shewhart's work at Bell Laboratories in the 1920s. His work resulted in the development of statistical quality charts (Shewhart charts), which are still used for analyzing patterns in product variability. In the 1940s and 1950s, W. Edwards Deming's work in statistical quality control (SQC) methodology evolved into a 14-point management program for quality improvement. His approach emphasized the application of statistical principles to control the production process.

As with most other applications of statistics, the different statistical methods operate under certain assumptions. These methods are subject to significant abuse and misuse when the assumptions are not understood and the methods are applied to data sets that they should not be used on. Most of these assumptions have to do with the nature of the data set (homogeneity, randomness, distribution shape, data collection techniques, etc.). A number of different charting techniques are available, which allows the user to select the one(s) most appropriate for the data set. Before going into the specifics, some general comments are appropriate.

SPC and Process Control

Just as a proportional-integral-derivative (PID) controller needs to be tuned to properly control individual loops, so an SPC process can be thought of as a means of tuning the overall process. SPC might not be considered by process control engineers as part of process control, but it is a powerful statistical tool to help verify whether a process is performing properly and to help troubleshoot the process if a correction or improvement is desired. We usually think of control in terms of a direct correction applied to the process in response to a change in the controlled variable. This is the case with both the open-loop control configuration (where a human manually initiates the correction) and the closed-loop configuration (where an automatic controller initiates it). One might think of SPC as an extreme form of open-loop control. SPC uses statistical process monitoring techniques to determine the status of the process, a combination of statistical and troubleshooting techniques to identify the cause(s)

© 2006 by Béla Lipták


Control Theory

of upsets and the process area(s) requiring improvement or adjustment, and verification techniques to ensure that the adjustment(s) have resulted in the desired change.

Because of the open-loop decision-making nature of SPC, many believe that SPC charting should be done manually. This view is based at least partially on the assumption that a human observer will pay closer attention during periods of upsets and therefore will be better prepared for the troubleshooting and problem-solving phases of the process. Those sharing this view prefer to keep the observer actively involved in the SPC process. In the name of productivity, and to make better use of powerful computing tools, many organizations have implemented forms of automated SPC charting and monitoring. When automated, the SPC calculations and charting are usually implemented in a plant information database and may be displayed in that system, in a process control computer, or in both. Most of those who have automated this first phase are still performing the nonmonitoring portions of the SPC process manually. A few of the more aggressive automation types are attempting to implement diagnostic and process adjustment activities with artificial intelligence or other computer models.

Continuous Processes

It needs to be emphasized that not all statistical techniques are valid for all data sets. SPC tools that were developed for discrete manufacturing need to be adapted and/or interpreted differently when applied to a continuous process environment. Many statistical techniques involve assumptions that are not valid for all data sets. These may include issues such as the randomness of the data, the shape of the distribution, and data collection techniques. It should also be noted that choosing an appropriate chart(s) for the data set and/or problem can greatly simplify the SPC process. Different charting techniques can offer a different view of the same data set.

SPC techniques are the heart and soul of today's Six Sigma quality programs. Six Sigma is a very popular and powerful process improvement tool, which is implemented by a team. The team is usually led by operations people and includes process control engineers as well as operations, quality, or middle-management personnel. It is normally targeted at measuring the process capability of both manufacturing and nonmanufacturing processes and then focuses on improving those processes.

What Is Statistical Control?

All processes exhibit variation. There is no cause for adjusting the total process unless the variation exceeds statistically acceptable limits. If the process is within acceptable statistical limits, changes would only introduce instability. Just as a PID controller needs to be tuned to properly control a loop, so the SPC process can be thought of as a procedure to help determine the tuning parameters for the total process. In other words, with SPC the control is usually on a process scale and not for individual loops.

What does it mean for a process to be in statistical control? A process is considered in control if it is experiencing only normal upsets and variation that have often occurred in the past. A process is out of statistical control when the change is sustained and it brings the controlled variable outside the limits of its normal range of variation. In that case one should try to identify the cause of the change in order to decide how to respond to it. Understanding the types of variation allows one to predict the performance of a stable process. If an "in-control" process is making large quantities of out-of-specification product, the control limits need to be changed. Process control limits are not design specifications but are values that describe past performance of the process. Therefore, in order to change the SPC limits, one needs to change the process.


SPC TOOLS AND TECHNIQUES

SPC problem-solving techniques include:

• Analysis of processes for stability and the effects of process modifications (control charts and capability indices)
• Defining problems and setting priorities (Pareto charts)
• Identifying causes of good and bad performance (cause-and-effect, or fishbone, diagrams)
• Quantifying relationships between process or product variables and other variables (scatter plots or other correlation tools)

Control Charts

The most widely used SPC tools are control charts. On these charts, the measurements of product samples are plotted to show their centering (XBar chart) and dispersion (R chart) values. The centering value is the average of the samples. The dispersion value measures the spread of the samples around that average. These charts are used to distinguish random, uncontrollable variations in measured parameters from variations that are controllable and should be corrected. To control product quality, the aim is to keep the product variation within the random pattern.

Figure 2.34a shows a centering and a dispersion chart. Both charts have similar components: centerline, upper limit, lower limit, and sample size. The centerline, also called the average, is shown with a solid line. The average of all the samples should be at this value. The upper and lower limits are shown by dotted lines. If the process is performing as intended, the averages of samples of size n should be less than the upper limit and greater than the lower limit. Variations within the limits are considered to be normal. Variations outside the limits are considered to be abnormal.

[FIG. 2.34a shows two stacked charts plotted over a sample size (n): a centering chart with upper control limit (UCL), average (AVG), and lower control limit (LCL) lines, and a dispersion chart with upper range limit (URL), average, and lower range limit (LRL) lines.]

FIG. 2.34a
Shewhart chart for a subgroup sampling method. The centering (XBar) chart helps determine whether the centering of the process is stable. The dispersion (R) chart helps determine whether the spread of the process is stable.

TABLE 2.34b
Control Chart Types for a Single Point Sampling Method

Chart Family       Chart Type        Description
Centering Chart    Individual        Uses individual point values to provide a trend.
                   Geometric Mean    Uses the geometric mean to provide exponential smoothing.
                   Moving Average    Uses the moving average to remove noise and indicate an overall trend.
Dispersion Chart   Range (R chart)   Commonly called an R chart. Plots the range, which is the difference between the highest and lowest sample in the range. The number of samples for calculating the range is configured by the user.

Courtesy of Rosemount Inc.

Charts and Tables

Tables 2.34b, 2.34c, and 2.34d show various control strategies for both the single point sampling method and the subgroup sampling method. Also described are the points to be stored for the various types of centering charts. Tables 2.34e, 2.34f, and 2.34g give the equations for single point centering chart limits and

TABLE 2.34c
Single Point Centering Chart Point Values

Centering Chart Type   Point to Be Stored
Individual             New value
Geometric Mean         (last point value)(1 − W) + (new value)(W), where W is the specified weight of a new point
Moving Average         Average of the new individual and the last n individuals, where n is the number of samples for the range

Courtesy of Rosemount Inc.
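The point-to-be-stored rules in Table 2.34c can be sketched in Python; this is an illustration only, and the function names are not taken from any vendor package.

```python
# Point values stored for the three single point centering charts of
# Table 2.34c. Function names are illustrative.

def individual_point(new_value):
    """Individual chart: the new measurement is stored as-is."""
    return new_value

def geometric_mean_point(last_point, new_value, weight):
    """Geometric Mean chart: (last point value)(1 - W) + (new value)(W),
    i.e., exponential smoothing with weight W."""
    return last_point * (1.0 - weight) + new_value * weight

def moving_average_point(recent_values, n):
    """Moving Average chart: average of the newest n individual values."""
    window = recent_values[-n:]
    return sum(window) / len(window)
```

With W = 0.2, a last stored point of 10.0 and a new reading of 20.0 give a stored value of 12.0, so a single outlier moves the smoothed trend only one fifth of the way toward itself.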

TABLE 2.34d
Control Chart Types for a Subgroup Sampling Method

Chart Family       Chart Type                     Description
Centering Chart    Average (XBar)                 Commonly called an XBar chart. Plots the average of data in the specified range. All subgroup samples are saved until the specified size is met. The average is then calculated, and the value written to disk.
                   Median                         Plots the median value in the same manner as the average chart plots the average value.
Dispersion Chart   Range (R chart)                Commonly called an R chart. Plots the range, which is the difference between the highest and lowest sample in the subgroup. Samples are collected until the subgroup is complete, then sorted in order of size. The average or median is determined for the centering chart, and then the range is calculated.
                   Range with Sigma               Plots the R values in the same manner as in the range chart, but the limits are calculated using a deviation value (sigma) that was entered during configuration.
                   Root Mean Squared              Used for subgroups of 12 to 25 points. Uses the root mean square (RMS) to calculate the limits and centerline.
                   Root Mean Squared with Sigma   Plots points in the same manner as the RMS chart, but the limits and the centerline are calculated using a deviation value (sigma) that was entered during configuration.
                   Standard Deviation             Plots the standard deviation of the subgroup points.

Courtesy of Rosemount Inc.


TABLE 2.34e
Equations for Single Point Centering Chart Limits

Centering Chart Type: Individual ("Limit Calculation Type" Range)
  UCL = AVG X + 3(AVG R)/1.128
  LCL = AVG X − 3(AVG R)/1.128
  Parameters: R = range value; X = subgroup average

Centering Chart Type: Individual ("Limit Calculation Type" Standard Deviation)
  UCL = AVG X + 3 sqrt( Σ(i=1..n) (Xi − AVG X)² / (n − 1) )
  LCL = AVG X − 3 sqrt( Σ(i=1..n) (Xi − AVG X)² / (n − 1) )
  Parameters: n = number of sample values; Xi = sample value; X = subgroup average

Centering Chart Type: Geometric Mean
  UCL = AVG X + 3 sqrt(W/(2 − W)) (AVG R)/1.128
  LCL = AVG X − 3 sqrt(W/(2 − W)) (AVG R)/1.128
  Parameters: R = range value; W = weight of the point; X = subgroup average

Centering Chart Type: Moving Average
  UCL = AVG X + (1.88)(R)
  LCL = AVG X − (1.88)(R)
  Parameters: R = range value; X = subgroup average

Courtesy of Rosemount Inc.
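The Individual chart limits of Table 2.34e ("Limit Calculation Type" Range) can be sketched as follows. This is an illustration that assumes the common practice of forming the average range from a moving range of two successive points, which is why the d2 constant is 1.128.

```python
# Individuals (X) chart limits per Table 2.34e: sigma is estimated as
# (AVG R)/1.128, where AVG R is the average moving range of successive
# points and 1.128 is the d2 constant for n = 2.

def individual_chart_limits(values):
    """Return (LCL, centerline, UCL) for an individuals chart."""
    avg_x = sum(values) / len(values)
    # Moving ranges of successive points
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_r = sum(moving_ranges) / len(moving_ranges)
    half_width = 3.0 * avg_r / 1.128
    return avg_x - half_width, avg_x, avg_x + half_width
```

For the readings [10.0, 10.4, 9.8, 10.1, 10.3], AVG X is 10.12 and AVG R is 0.375, so the limits sit roughly at 10.12 ± 1.00.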

single point dispersion chart values. Tables 2.34h, 2.34i, 2.34j, and 2.34k give the equations for subgroup centering chart values and chart limits.

Interpretation of Charts

The XBar chart (Table 2.34d), or Shewhart chart, is used in combination with rule-based checks. It can detect whether a process output has undergone a statistical abnormality, that is, a shift from its normal statistical behavior. On noisy processes, a simple time trend does not make this kind of deviation visually obvious, but simple rule checks can. Centering charts generated from single point measurements (Table 2.34b) indicate that a process is not in statistical control if any one of the following is true:

1. One or more points fall outside the control limits.
2. Seven or more consecutive points fall on the same side of the centerline.
3. Ten of 11 consecutive points fall on the same side of the centerline.
4. Three or more consecutive points fall on the same side of the centerline, and all are located closer to the control limit than to the centerline.

Centering charts generated from the subgroup sampling method indicate abnormal fluctuations when:

1. Any single subgroup value is more than three standard deviations away from the centerline (or set point).
2. Two consecutive subgroup values are more than two standard deviations away from the centerline, on the same side of the centerline.
3. Three consecutive subgroup values are more than one standard deviation away from the centerline, on the same side of the centerline.
4. Two out of three consecutive subgroup values are more than two standard deviations away from the centerline, with all three values on the same side of the centerline.
5. Five consecutive subgroup values are on the same side of the centerline.

Another process violation detection chart, called the moving range chart, differs from an XBar chart in that each datum is an actual scanned data point rather than a subgroup average of several samples.

TABLE 2.34f
Single Point Dispersion Chart Values

Dispersion Chart Type   Range Value Equation                           Parameters
Range                   R (for only two samples) = |Rmax − Rmin|       Rmin = minimum range value
                        R (for more than two samples) = Rmax − Rmin    Rmax = maximum range value

Courtesy of Rosemount Inc.

TABLE 2.34g
Equations for Single Point Dispersion Chart Limits

Dispersion Chart Type   Control Limit Equation   Parameters
Range                   UCL = (3.27)(AVG R)      R = average range value
                        LCL = (0)(AVG R)
                        CL = AVG R

Courtesy of Rosemount Inc.

TABLE 2.34h
Equations for Subgroup Centering Chart Values

Centering Chart Type   Equation                        Parameters
Average, X             X = (x1 + x2 + x3 + … + xn)/n   x1 … xn = individual sample values
                                                       X = subgroup average
                                                       n = number of values in the subgroup

Courtesy of Rosemount Inc.
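The first two single point run rules can be sketched as simple scans over the plotted points; this is an illustration with made-up function names, and the remaining rules follow the same pattern.

```python
# Run checks for the single point centering chart rules: rule 1 (any point
# outside the control limits) and rule 2 (seven or more consecutive points
# on the same side of the centerline). Function names are illustrative.

def any_point_outside(points, lcl, ucl):
    """Rule 1: one or more points fall outside the control limits."""
    return any(p < lcl or p > ucl for p in points)

def run_of_seven(points, centerline):
    """Rule 2: seven or more consecutive points on one side of the centerline."""
    run, last_side = 0, 0
    for p in points:
        side = (p > centerline) - (p < centerline)  # +1 above, -1 below, 0 on line
        if side != 0 and side == last_side:
            run += 1
        else:
            run = 1 if side != 0 else 0
        last_side = side
        if run >= 7:
            return True
    return False
```

A point landing exactly on the centerline breaks the run here; whether it should is a configuration choice in real charting packages.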

TABLE 2.34i
Equations for Subgroup Centering Chart Limits

Centering Chart Type   Control Limit Equation      Parameters
Average                UCL = AVG X + (A2)(AVG R)   n = number of values in the subgroup
                       LCL = AVG X − (A2)(AVG R)   R = range of the subgroup
                                                   A2 = value from Table 2.34j

Courtesy of Rosemount Inc.

TABLE 2.34j
Constants for Subgroup Centering Chart Limits

n     A2      n     A2
2     1.88    11    0.29
3     1.02    12    0.27
4     0.73    13    0.25
5     0.58    14    0.24
6     0.48    15    0.22
7     0.42    16    0.21
8     0.37    17    0.20
9     0.34    18    0.19
10    0.31    19    0.19
              20    0.18

Courtesy of Rosemount Inc.

TABLE 2.34k
Equation for Subgroup Dispersion Chart Value

Dispersion Chart Type   Equation       Parameters
Range                   Rmax − Rmin    R = range of the subgroup (values sorted highest to lowest)

Courtesy of Rosemount Inc.

The Purpose of the Charts

When utilizing SPC charts, one needs to remember their purpose and choose them accordingly. The purpose of a good chart is to give information about the data that is not self-evident. Graphs can make recognition of data patterns and features much easier than most tables or lists do. Shewhart's first principle for understanding data is that no data have meaning apart from their context. Shewhart also provided two rules for the presentation of data:

1. Data should always be presented in a way that preserves the evidence in the data for all the predictions that might be made from them.
2. Whenever average, range, or histogram charts are used to summarize data, the summary should not mislead the user into taking any action that the user would not take if the data were presented in a time series.

With all of the charting techniques that are available, one needs to be careful about which to use and when. These are powerful techniques that can be helpful but, if misapplied, can be misleading. Figures 2.34l and 2.34m depict histograms of a relatively large (500-data-point) data set that is in statistical control. Histograms tend to obscure (at a minimum) the sequential information in the data.

Using Charts

Utilizing an average versus an X-plane histogram could lead one to differing conclusions regarding a process or data set.

[FIG. 2.34l shows a histogram with counts (up to about 100) on the vertical axis and observed values 5 through 15 on the horizontal axis.]

FIG. 2.34l
Histogram of 500 observations in the "X-Plane." (Courtesy of SPC Press Inc.)
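The XBar limit equations of Table 2.34i, together with the A2 constants of Table 2.34j, can be sketched as follows; this is an illustration only, and just a few A2 values are reproduced.

```python
# XBar chart limits per Table 2.34i: UCL/LCL = AVG X +/- (A2)(AVG R),
# where A2 comes from Table 2.34j for the subgroup size n.

A2 = {2: 1.88, 3: 1.02, 4: 0.73, 5: 0.58}  # subset of Table 2.34j

def xbar_limits(subgroups):
    """Return (LCL, centerline, UCL) for an XBar chart of equal-size subgroups."""
    n = len(subgroups[0])
    averages = [sum(sg) / n for sg in subgroups]
    ranges = [max(sg) - min(sg) for sg in subgroups]
    avg_x = sum(averages) / len(averages)
    avg_r = sum(ranges) / len(ranges)
    return avg_x - A2[n] * avg_r, avg_x, avg_x + A2[n] * avg_r
```

Three subgroups of size 3 with AVG X = 10.1 and AVG R = 0.4 give limits of 10.1 ± (1.02)(0.4), that is 9.692 and 10.508.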


[FIG. 2.34m shows a histogram with counts (up to about 10) on the vertical axis and observed values 5 through 12 on the horizontal axis.]

FIG. 2.34m
Histogram of the data provided in Figure 2.34l. Data set in the "Average Plane." (Courtesy of SPC Press Inc.)

Averaged charts tend to hide the frequency and magnitude of outliers. If only a simple glance is made at these charts (which represent the same process, which is in control), it could lead one to think that the first depicts a process in tighter control (it appears to have a very smooth normal distribution shape), while the second appears to be only generally normal in shape. A quick glance could also lead one to think that the first indicates a process with a wider control limit (data range from 7 to 15) than the second (data range of 8 to 12). Also, the smooth-appearing normal distribution of the first, as compared to the more jagged-appearing normal distribution of the second, could mislead one to assume that the first represents a process that is in better control. This comparison of two very similar charts of the same data is meant to encourage the review of data in many different forms, using many types of charts. This can help one to see different features in the data and better understand one's process. Patterns within the data may contribute more to understanding the process than a macro statistical overview of the data would. This is especially true for distributions that exhibit highly nonnormal tendencies (single tails, bimodal, etc.).

PROCESS CAPABILITY

An unstable, out-of-statistical-control process cannot be evaluated for its capability. Process capability is the heart of the famed Six Sigma program. The Six Sigma program is intended to answer the following question: "Is the process capable of producing a product within the specifications to the statistical Six Sigma (standard deviations)?" If it is not, the Six Sigma program is to assist the user in identifying the weak points that are keeping the process from achieving this level of performance.

Upon achieving a state of statistical control, a process can be evaluated to determine whether it is capable of meeting the desired specification(s) or requirement(s). Usually a minimum of 20 to 25 subgroups is desirable to get a representative picture of the process. In order to characterize the capability of the process, its natural deviation is compared to the width of the specifications. This includes the evaluation of both the upper specification limit (USL) and the lower specification limit (LSL) independently, to account for the location of the average process reading relative to the specifications. If there is only one limiting specification (LSL or USL), then the capability only needs to be calculated on that side. The standard deviation (σ̂) is used to evaluate the common level of process variation and is calculated by

σ̂ = (AVG R)/d2          2.34(1)

where σ̂ indicates that the value is an estimate, d2 is a scaling factor based on n samples in the subgroup, and AVG R is the average range. The values of d2 for 5, 10, 15, 20, and 25 samples (n) are, respectively, 2.326, 3.078, 3.472, 3.735, and 3.931.

A common technique for reporting process capability is the use of a ratio called the Cp index. The Cp index is the ratio obtained by dividing the distance to specification by the distance to the common cause variation. The Cp index is calculated as follows:

Cp for the upper side: Cpu = (USL − X)/(3σ̂)          2.34(2)

Cp for the lower side: Cpl = (X − LSL)/(3σ̂)          2.34(3)

where X is the subgroup average.

The minimum of these two values will be the worst of the two cases and is reported as the Cpk index. A Cp value over 1.33 indicates a capable (acceptable, meaning that 99.73% of individual readings are within specification) process, while a Cp index of less than 1.0 indicates a "noncapable" process. A Cp index between 1.0 and 1.33 indicates a marginal process. Cpu values for four different process states are shown in Figure 2.34n. For the latter cases, the next step is to improve the process.
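Equations 2.34(1) through 2.34(3) can be sketched as follows; this is an illustration only, using just the d2 values quoted in the text, and the function name is not from the text.

```python
# Process capability per Equations 2.34(1)-(3): sigma-hat = (AVG R)/d2,
# Cpu = (USL - X)/(3 sigma-hat), Cpl = (X - LSL)/(3 sigma-hat); the worst
# (minimum) of Cpu and Cpl is reported as Cpk.

D2 = {5: 2.326, 10: 3.078, 15: 3.472, 20: 3.735, 25: 3.931}

def capability(avg_x, avg_r, n, usl=None, lsl=None):
    """Return (sigma_hat, cpk) from subgroup statistics and one or two spec limits."""
    sigma_hat = avg_r / D2[n]                              # Equation 2.34(1)
    indices = []
    if usl is not None:
        indices.append((usl - avg_x) / (3.0 * sigma_hat))  # Cpu, Equation 2.34(2)
    if lsl is not None:
        indices.append((avg_x - lsl) / (3.0 * sigma_hat))  # Cpl, Equation 2.34(3)
    return sigma_hat, min(indices)
```

With AVG X = 10.0, AVG R = 0.2326, n = 5, USL = 10.4, and LSL = 9.5, σ̂ is 0.1, Cpu is about 1.33, Cpl is about 1.67, and the reported Cpk is the marginal upper side, about 1.33.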

[FIG. 2.34n shows four distributions against a single specification limit (USL), illustrating Cpu = (USL − x̄)/3s for Cpu values of 0.67, 1.00, 1.33, and 2.00.]

FIG. 2.34n
Cpu values for four different process states: Cpu less than 1.0, a noncapable process; Cpu between 1.0 and 1.33, a marginal process; Cpu equal to 1.33 or greater, a capable process.

There are three general ways to increase the value of the Cp index:

1. Move the specifications or set points (if they have not been properly established).
2. Move the process average, if the specification has only one limit (unilateral), or center the process average if the process has two limits (bilateral).
3. Reduce the common cause variation.

Identifying Causes

The goal of all SPC techniques is first to determine that the process behavior is not normal and then to identify and diagnose the cause(s) of unusual (good or bad, but not normal) performance. One of the cause-and-effect tools used in SPC is the Ishikawa cause-and-effect diagram (also called the fishbone diagram). Figure 2.34o depicts an Ishikawa fishbone diagram. This diagram is actually a thought process flow diagram (a cause-and-effect checklist and map) that helps the user systematically determine the root causes of problems. It begins with the major potential causes and guides the user to work backwards through the listed causes to determine a root cause. The user must first identify all the potential causes that can affect the process/product problem being studied. This diagnostic process can be lengthy and can involve the use of designed experiments and many other techniques to study the process. The goal of this diagnostic process is to obtain a process that is in control. If the abnormal performance was bad, the goal is to eliminate its cause. If the abnormal performance was favorable, the goal is to incorporate it into the new normal process (process improvement).

Implementing SPC Concepts

The implementation of real-time SPC increases the demands on instrumentation, communication networks, and computer technology. Developments in microchip technology offer improved accuracy and stability for primary sensors. The use of digital transmitters eliminates the error contribution of

[FIG. 2.34o shows a fishbone diagram: branches for Materials, Process, People, and Machines, each carrying repeated "Why?" prompts, feeding from cause to effect at the problem statement. The example problem statement (Distillation SPC issue, 9/15/03) reads: "The product boiling point specification has been continuously trending above the centerline of the Centering XBar Chart for two days."]

FIG. 2.34o
A fishbone diagram is a powerful diagnostic tool to help identify and define problems. (Created with QI Macros for Excel SPC Software.)



analog-to-digital conversion. Microprocessor-based "smart" transmitters contribute to better accuracy and higher reliability through their improved rangeability. Automatic pressure and temperature compensation, remote calibration, and self-diagnostics also contribute to better data quality. The range of process variables that can be measured has also increased. Advances in online analytical instrumentation have made it possible to measure a variety of physical properties and chemical compositions that could not be directly detected in the past. (For more information, refer to Volume 1 of this handbook, titled Process Measurement and Analysis.)

Good quality data input (measurement) is useful for online SPC only if the control system network can transfer the process information rapidly enough for real-time data manipulation. The use of MAP (Manufacturing Automation Protocol) for the integration of different vendors' products into a common communications network facilitates streamlined data transmission. The MAP OSI (Open System Interconnection) model defines communication network functions in seven layers, allowing software programs to be utilized over different networks. Although most of these techniques predate the computer, the ability to interface a personal computer with the control system also increases the feasibility of adding real-time SPC to existing plants. Intelligence at the I/O interface level enables routine operating functions to be handled locally, while conserving higher-level processing capability for SPC-type applications.

It is not necessary to implement all phases of an SPC program in a real-time mode. One needs to automate only those portions of an SPC program that one feels comfortable with; the nonautomated SPC techniques can still be performed on a non-real-time basis. Many of those who have automated the data collection and monitoring phases are still performing the nonmonitoring portions of the SPC process manually. Others are implementing diagnostic and process adjustment activities with artificial intelligence or other computer models.

DATA STORAGE, COMPUTATION, AND DISPLAY

The ability to store statistical records of individual process data points is essential for generating and analyzing SPC control charts. Historical information is particularly useful for statistical analysis of cause-and-effect relationships using fishbone diagrams. Figure 2.34p shows a Pareto chart that enables diagnosis of the most likely causes of off-spec production runs at the component level by searching the historical database.

[FIG. 2.34p data, causes* ranked by quantity and percent: Causo 25 (23.1%), Carbo 23 (21.3%), Filaid 16 (14.8%), Preimo 15 (13.0%), Coaco 10 (9.3%), Alumo 9 (8.3%), Pocho 6 (5.6%), Prchlo 4 (3.7%). *The causes are specific to the process and the equipment used.]

FIG. 2.34p
A historical data search, presented in the form of a Pareto chart, can help identify the most likely causes of off-spec production runs.

The need to perform repeated statistical computations on real-time data points warrants a dedicated SPC computer for many chemical process industry applications. Figure 2.34q is a scatter plot showing the use of a linear regression algorithm to explore the relationship between a selected product property (particle size) and an input variable (mixing time). Knowing this relationship, the operator can adjust the problem variable (mixing time) to restore the product (particle size) to meet specifications.

Statistical calculations translate into practical control results through the interpretation of statistical control charts. Configurable CRT displays are prerequisites to maximizing the use of online SPC charting capabilities. Automatic SPC alarm notifications and summary displays can isolate the problem input variables. A computerized technique called online diagnostics, basically a mix of SPC and fault-diagnostic principles programmed into an expert system that operates online, enables process operators to anticipate and react to potential plant operating problems before they occur.
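The Pareto ranking behind a display like Figure 2.34p can be sketched as follows; the helper name is illustrative, and the counts are the quantities printed in the figure.

```python
# Pareto ordering of off-spec causes, as in Figure 2.34p: sort causes by
# quantity and report each as a percentage of the total tally.

def pareto_rows(counts):
    """Return (cause, quantity, percent-of-total) sorted by descending quantity."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    return [(cause, qty, round(100.0 * qty / total, 1)) for cause, qty in ranked]

CAUSES = {"Causo": 25, "Carbo": 23, "Filaid": 16, "Preimo": 15,
          "Coaco": 10, "Alumo": 9, "Pocho": 6, "Prchlo": 4}
```

pareto_rows(CAUSES) puts Causo first at 23.1% of the 108 logged occurrences, matching the figure's leading bar.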

[FIG. 2.34q shows a scatter plot of the uncontrolled process variable (particle size, 3.00 to 5.00) against the controlled variable (mixing time, 20 to 46), with a fitted regression line of slope 0.0248 and intercept 5.086.]

FIG 2.34q
Scatter plots can be used to investigate relationships between process variables and can indicate how much a controlled variable (mixing time) should be adjusted to bring an uncontrolled variable (particle size) within specifications.
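The regression line in a plot like Figure 2.34q can be reproduced with an ordinary least-squares fit. The sketch below uses made-up sample data, not the figure's actual points.

```python
# Ordinary least-squares fit of an uncontrolled variable (particle size)
# against a controlled variable (mixing time), as used for Figure 2.34q.

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Centered cross-products and sum of squares
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x
```

For mixing times [20, 28, 36, 44] with particle sizes [4.6, 4.4, 4.2, 4.0], the fit returns slope −0.025 and intercept 5.1, from which the operator can read off how much additional mixing time buys a given particle size reduction.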

Bibliography

Braverman, J. D., Fundamentals of Statistical Quality Control, Reston, VA: Reston Publishing, 1981.
Deming, W. E., Out of the Crisis, Cambridge, MA: MIT Center for Advanced Engineering Study, 1986.
Doherty, J., "Detecting Problems with SPC," Control, November 1990, pp. 70–73.
Grant, E. L., and Leavenworth, R. S., Statistical Quality Control, 6th ed., New York: McGraw-Hill, 1988.
Holmes, D. S., "Time Series Analysis Overcomes SPC Problems," Control, February 1991, pp. 36–38.
Scherkenbach, W. W., The Deming Route to Quality and Productivity: Roadmaps and Roadblocks, Rockville, MD: Mercury Press, 1986.
Shewhart, W. A., Economic Control of Quality of Manufactured Product, New York: Van Nostrand, 1931.
Wheeler, D. J., Advanced Topics in Statistical Process Control, Knoxville, TN: SPC Press, 1995.
Wheeler, D. J., Understanding Variation: The Key to Managing Chaos, Knoxville, TN: SPC Press, 2000.
Wheeler, D. J., and Chambers, D. S., Understanding Statistical Process Control, Knoxville, TN: Keith Press, 1986.
Wolske, B. K., "Implementing SPC Concepts on a Real-Time Basis," Chemical Processing, March 1988, pp. 52–56.