A Complete Guide to Statistical Process Control
Next Gen Measurement
Process control is commonly understood as ensuring a process operates effectively. Statistical Process Control, however, ensures the process constantly improves. This post details how to take the next step in process control.
Quality Control Methods
If a process error was bound to occur, would you rather know before it happens, or after the fact? If you chose ‘before’, you likely prefer having a warning because warnings enable an individual to develop a prevention plan. In most cases we cannot prevent every mistake, but abiding by statistical process control can reduce the frequency of errors in an industrial process.
Statistical Process Control is an industry-standard quality control method for monitoring, controlling, and improving a process through statistical analysis. It requires constant real-time measurements and a set of predetermined parameters against which a process controller can observe variation. The focus of statistical process control is on error prevention and continuous improvement.
What is SPC?
The key component of statistical process control is gathering quality data in the form of process measurements obtained in real time. The data gathered is then plotted on a graph with predetermined control limits. Control limits are another key component of statistical process control which determine the capability of a process. Understanding the capability of a process is the backbone of process success.
Control limits, or natural process limits, are horizontal lines drawn on a statistical process control chart. The limits are drawn at a distance of ±3 standard deviations of the plotted statistic from the statistic’s mean. The plotted outputs shown on the control chart create the process signature.
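As a rough sketch, the ±3 standard deviation limits can be computed from a window of plotted measurements. The sample values below are illustrative, not from a real process:

```python
import statistics

# Hypothetical readings of the plotted statistic (illustrative data only).
measurements = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.0]

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation

ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit
```

Points plotted outside `lcl` or `ucl` would be the out-of-control signals on the chart.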
The capability index, or Cpk, is a statistical tool used to measure accuracy (determined by the average of output results) and precision (determined by the spread of results). In other words, Cpk estimates how close a process is to a given target and how consistent it is around its average performance. The higher the Cpk value, the better the process. It is important that the capability estimate comes from a stable process, so that it is not distorted by outliers and variation.
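A minimal sketch of the Cpk calculation, assuming two-sided specification limits; the names `lsl` and `usl` (lower and upper specification limits) are illustrative:

```python
def cpk(mean: float, sigma: float, lsl: float, usl: float) -> float:
    """Capability index: distance from the process mean to the nearest
    specification limit, expressed in units of 3 standard deviations."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

# A centered process with limits 6 sigma away on each side has Cpk = 2.0.
print(cpk(mean=10.0, sigma=1.0, lsl=4.0, usl=16.0))  # → 2.0
```

Taking the minimum of the two sides is what makes Cpk penalize an off-center process even when its spread is small.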
The caveat is that an engineer cannot know the full capability of their process without plotted real-time measurements of that process. Essentially, process control is not credible unless it employs real-time measurements; anything else involves some level of guesswork or estimation.
What does SPC mean for Process Control Engineers?
Understanding if your process is capable is the first step to process control. An assessment of process capability will allow an engineer to statistically determine if the process will give the desired result in the way it is currently being run.
The process engineer is always checking how processes are functioning. In most situations, there is a reason for process parameters (i.e., flow rates or line pressures). Without fully understanding the Cpk, outlined above, it is difficult to determine how the process can best be adjusted or improved.
Process control without real time measurement data is fundamentally obstructed. For example, imagine a process that involves mixing cement to a specific density. Assume measurements are to be taken manually every 15 minutes. Anything that flows through the pipeline in between the measurements could diverge from the requisite density without being recorded. An individual might measure the process at the correct time and find that the density had become higher or lower than the control limits. In a case like this, the process will have to be started over.
Another possibility is that manually taken measurements display the density within the control limits, but the process goes astray in between measurements. Without real time measurements, a situation like this will never be detected.
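The difference real-time data makes can be sketched as a streaming check, where every arriving reading is compared against the control limits the moment it is measured; the function and density values below are hypothetical:

```python
def monitor(readings, lcl, ucl):
    """Flag each real-time reading the moment it arrives, instead of
    waiting for the next manual sample."""
    for i, value in enumerate(readings):
        in_control = lcl <= value <= ucl
        yield i, value, in_control

# A density excursion between manual samples is caught immediately.
stream = [2.39, 2.41, 2.55, 2.40]  # hypothetical density readings
flags = [ok for _, _, ok in monitor(stream, lcl=2.35, ucl=2.45)]
print(flags)  # → [True, True, False, True]
```

A 15-minute manual sampling schedule could miss the third reading entirely; the streaming check flags it as it occurs.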
Why use SPC within a process?
There are core dangers in abiding by a manual process control style: discovering errors after it is too late, or worse, never discovering them at all. It goes without saying that both scenarios are detrimental to a process.
The risks associated with manual process control are avertable by utilizing statistical process control. At the point a process even begins to vary, real-time measurement data will illustrate those changes. This gives process control engineers, metallurgists, and the like enhanced problem-solving abilities: they can enact the alterations necessary to maintain the process, or better stated, to head off an emerging problem.
The same goes for changes in a process that work in favor of the process goals. If a controller makes an adjustment to increase capability, the instantaneous feedback of that change will communicate whether the adjustment was favorable. So process controllers not only avoid error but can see first hand what changes positively affect the process. This is how continuous improvement (also known as kaizen) is achieved.
Expanding on the topic of statistical process control, we arrive at six sigma. Six sigma is a set of techniques and tools for process improvement. A six sigma process is one in which 99.99966% of outputs are statistically expected to be free of defects. A process can be characterized by a sigma rating indicating its fraction of defect-free outputs; specifically, how many standard deviations of a normal distribution separate the process mean from the nearest specification limit. Lean six sigma is a combination of six sigma and lean management that focuses on reducing waste. One cannot guarantee that 99.99966% of outputs are free of defects without diligently monitoring the entire process, so it is not possible to reach six sigma without statistical process control.
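As an illustration of how a sigma rating maps to a defect-free fraction, the sketch below uses the conventional 1.5-sigma long-term shift and a one-sided normal tail; this is one common convention, not the only way to state the figure:

```python
from statistics import NormalDist

def defect_free_fraction(sigma_level: float, shift: float = 1.5) -> float:
    """Long-term defect-free yield for a given sigma level, under the
    conventional 1.5-sigma shift assumption."""
    return NormalDist().cdf(sigma_level - shift)

# A six sigma process is expected to be ~99.99966% defect-free.
print(round(defect_free_fraction(6.0) * 100, 5))  # → 99.99966
```

The same function at `sigma_level=3.0` gives roughly 93.3%, which shows how steeply yield improves as the rating rises.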
Most modern processes are progressing toward automation and the removal of human error. If a process has a developed performance window with a high and low range, and it stays between the high and the low, the process is considered ideal. This is why statistical process control is a major factor in the progression toward automation and all the productivity it entails.
Applying statistical process control and the principles of six sigma (or lean six sigma) to the analysis of adverse performance reveals critical information about the process: whether the process is stable, whether a change in course is required, and whether improvement attempts have had the desired effect.
Statistical process control is the natural progression for industrial operations that want a more efficient process in which errors are mitigated using real-time measurement data. Implementing statistical process control requires closer monitoring of the process, but it also delivers positive results: it is the most efficient method of discovering flaws in a process, and those discoveries allow for the alterations necessary for improvement. Long-term application of statistical process control will raise Cpk values as accuracy and precision increase through constant improvement. After some longevity, a six sigma process can be reached, which is the epitome of all process goals.