SIX SIGMA BENCHMARKING OF PROCESS CAPABILITY ANALYSIS AND MAPPING OF PROCESS PARAMETERS

Process capability analysis (PCA) is a vital step in ascertaining the quality of the output from a production process. Particularly in batch and mass production of components with specified quality characteristics, PCA helps in deciding whether to accept a process and continue with it. In this paper, the application of PCA using process capability indices is demonstrated on field data and benchmarked against Six Sigma, as a motivation to improve towards global standards. Further, it is illustrated how the two important process parameters, namely the mean and the standard deviation, can be monitored with the help of the "what if" analysis feature of Excel. Finally, the paper shows how to determine the required improvement efforts using simulation, to act as a quick reference for decision makers. The global benchmarking of the process, in the form of its Six Sigma capability, is expected to give valuable insight towards process improvement.


INTRODUCTION
Quality control and improvement is the most important activity of organizations engaged in the manufacturing of products or the delivery of services. Monitoring quality using standard references, or metrics, is vital to any organization that cares about its customers, and ultimately helps the organization to capture the market.
The advent of new technologies, increased demand for high quality products, and quality based competition mandate close scrutiny and careful selection of processes. The overall cost depends on the judicious selection of the process, and thus process capability analysis (PCA) is considered a vital step in ensuring the quality of products. Increasing competition, the availability of low-cost suppliers, global supply chains, and information technology driven manufacturing have all created new paradigms in process decisions. Manufacturers are moving towards more outsourcing and trying to cut down the cost of production. In addition, these decisions are increasingly influenced by global standards and by benchmarking against best practices. Hence it is imperative to ascertain the quality of output from a production process before that process is identified for batch or mass production. Further, it is necessary to find out how the process can be improved to meet global standards so as to remain competitive in the market.

ORGANIZATION OF THE PAPER
In this paper, first a brief overview of PCA is provided and the numerical measures in the form of Process Capability Indices (PCI) are described. The mathematical formulae to calculate these PCIs, along with their interpretation, are also given. A description of the relevant concepts of benchmarking and Six Sigma is also provided for completeness as well as continuity. Later, PCA is performed using field data and benchmarked against Six Sigma standards. The two key process parameters, namely the process mean and the variance, are optimized to accomplish the required Six Sigma standards using Excel's "what if" analysis. Next, using simulated data, it is demonstrated how much improvement effort is needed to reach global standards. Finally, the process parameters are mapped with respect to the required level of Six Sigma (SS) performance, and it is illustrated how the two process parameters can be tracked simultaneously. This enables continuous monitoring and improvement and helps to set goals clearly.
All the calculations, scenario building, optimization, and simulation, including the development of charts, have been done using Excel 2013, which has powerful features to support the current research work.

BRIEF LITERATURE REVIEW
Process capability analysis (PCA) forms the initial step in establishing the acceptability of a production process to produce output as per the specified tolerance. The literature on process capability studies, besides being rich and diversified, has a long historical background. Different aspects of process capability have been covered in varying detail to satisfy the needs of practitioners and researchers. It is not the intention here to provide an exhaustive coverage of PCA, but a brief overview of the major aspects relevant to this paper is presented in the following sections.
Process capability refers to the ability of a process to produce output, namely a product or a service, according to the specifications suggested or prescribed by the designer or the customer. Because of the variations that occur in a process due to assignable as well as chance causes, a process will not always perform as expected, and hence the output quality can deviate from the preset standards. Process capability studies help to verify whether the processes adopted by the manufacturer or the service provider are capable of meeting the specifications. In addition, process capability assessment studies have several objectives (Summers, 2005):
1. To ascertain the extent to which the process will be able to meet the specifications.
2. To determine whether the process will be able to meet the future demand placed on it in terms of the specifications.
3. To help the industries to meet the customers' demands.
4. To enable improved decision making regarding product or process specifications, selection of production methods, selection of equipment, and thus improve the overall quality.
Besides the above, it is reported that process capability studies help in vendor certification, performance monitoring and comparison, and also in setting targets for continuous improvement.
Process capability indices are simple numerical measures to express the potential and performance capabilities of a process under different conditions. These indices essentially link the key parameters, such as the process mean and variance, to the design specifications of a quality characteristic. Thus they act as a bridge between the fixed or static tolerance values and the dynamic process values as observed over a period of time. A lucid paper by Kane (1986) explains the fundamentals of process capability indices along with numerical illustrations. The importance of sampling for the estimation of process capability indices has been vividly presented by Barnett (1990). A detailed explanation of the process capability indices is available in Porter and Oakland (1991). It is important to observe that the process capability measures are all basically sample based, and both the sample size chosen and the method of sampling need to be carefully considered. In addition, it is essential that the process be stable and in a state of statistical control before being taken up for capability assessment.
Assessment of process capability is commonly done using process capability indices, and Table 1 shows the different types of indices used in practice, along with the corresponding formulas. A Cp of 1.00 indicates that the process is judged capable. It is generally necessary to estimate the process standard deviation in order to estimate the Cp of the process. Due to sampling variation and machine setting limitations, Cp = 1.00 is not used as a minimally acceptable value; instead, a minimum acceptable value of Cp = 1.33 is used, which ensures an acceptable quantity within the specifications even as a shift in the process mean from the target value and a change in the process variance occur over a period of time. Since the Cp and Cpk indices do not take into account the departure from the target/nominal value, Chan, Cheng, and Spiring (1988) introduced another measure of process capability, called the Cpm index. This index takes into account the proximity to the target value as well as the process variation when assessing process performance. The Cpm index is also referred to as the "Taguchi capability index", as illustrated by Boyles (1991) and Balamurali and Kalyanasundaram (2002).

Table 1 : Commonly used process capability indices

Name | Index | Formula
Process Potential Index | Cp | (USL - LSL) / 6σ
Process Performance Index | Cpk | min[(USL - μ) / 3σ, (μ - LSL) / 3σ]
Taguchi Capability Index | Cpm | (USL - LSL) / [6 √(σ² + (μ - T)²)]

The commonly used process capability indices are all based on the assumption of normally distributed data; however, the case of non-normality is also discussed in the literature, for example, by Clements (1989), Somerville and Montgomery (1996), Rao and Xia (1999), and Hou and Wang (2012), to name a few. But typically across industries the assumption of normality is followed, and non-normality is rarely taken into account because of the complexity of the calculations and the long procedures involved. Further, it is interesting to note that process capability indices are categorized as first, second, and third generation indices, depending upon what process conditions are being explored; the categories also indicate their relative sensitivity in recognizing process changes. The process capability index Cp is considered the first generation index, and Cpk and Cpm are regarded as "second generation" indices. Pearn, Kotz, and Johnson (1992), while discussing the distributional and inferential properties of process capability indices, also propose a "third generation" process capability index and two new multivariate indices. These are claimed to possess better properties than the earlier indices. However, in most industries it is still the first and second generation indices that are commonly used for process assessment and monitoring. The interpretations to be made based on the values of the indices are shown in Table 2.

Table 2 : Interpretation of Cpk values

Value of Cpk | Interpretation
< 1 | Not at all capable
= 1 | Not capable
= 1.33 | Minimum requirement
= 1.67 | Promising
= 2 | Total confidence with process

The procedure used in PCA, as well as the recommended values, needs to be ascertained against a statistical basis and analysis. Montgomery (1986) comments that some of the industry practices do not satisfy the statistical tests. Thus serious doubts arise about the validity of such measures calculated using those industry-prescribed procedures. The recommended values according to Montgomery (1986) are given in Table 3.
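For readers who wish to compute these indices outside a spreadsheet, the formulas in Table 1 can be sketched in a few lines of Python. The numerical values in the example are hypothetical, chosen only to illustrate the calculation for the paper's specification of 3.4 ± 0.05 mm:

```python
from math import sqrt

def capability_indices(mean, sd, lsl, usl, target):
    """Compute the common first- and second-generation capability indices
    from the process mean, standard deviation, and specification limits."""
    cp = (usl - lsl) / (6 * sd)                       # process potential (ignores centering)
    cpk = min((usl - mean) / (3 * sd),                # penalizes an off-center mean
              (mean - lsl) / (3 * sd))
    # Taguchi index: penalizes departure of the mean from the target T
    cpm = (usl - lsl) / (6 * sqrt(sd**2 + (mean - target)**2))
    return cp, cpk, cpm

# Hypothetical off-center process against the 3.4 +/- 0.05 mm specification
cp, cpk, cpm = capability_indices(mean=3.41, sd=0.0125, lsl=3.35, usl=3.45, target=3.40)
```

Note that for a perfectly centered process (mean equal to the target) all three indices coincide, which is one way to sanity-check an implementation.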

Six Sigma and benchmarking of the process capability
Six Sigma (SS) needs no introduction, as it is now regarded as an intensive approach to improving process quality and meeting global standards.
The concept of Six Sigma is basically to produce error-free output. Sigma (σ) is a letter of the Greek alphabet used by statisticians to denote the standard deviation, which measures the variability in any process. Today a company's performance is measured by the sigma level of its business processes (Breyfogle, 1999). Traditionally, companies accepted three or four sigma performance levels as the norm, despite the fact that these processes created between 6,200 and 67,000 problems per million opportunities. The Six Sigma standard of 3.4 defects per million opportunities is a response to the increasing expectations of customers, who want their products to be free from defects.
SS is defined in many ways, as researchers, practitioners, and corporate people have given different perspectives about Six Sigma. Consequently, many definitions have been put forth to indicate what SS is all about. Some of the definitions of SS are as follows:
◊ According to Pyzdek (2003), SS is the application of the scientific method to the design and operation of management systems and business processes which enable employees to deliver the greatest value to customers and owners.
◊ Persico (1992) describes Six Sigma as a direct extension of total quality management, which, in turn, is based on the principles and teachings of W. Edwards Deming, the legendary quality guru.
In many industry and business environments, the Six Sigma culture is deployed through a systematic and uniform approach and a set of techniques for continuous quality improvement (Harry, 1998). A Six Sigma program leads to better decision making by developing a system that prompts everyone in the organization to collect, analyze, and display data in a consistent way (Maleyeff & Kaminsky, 2002), and hence is appreciated in industry.
As a good amount of literature is available about the technique and applications of SS, a detailed description of the SS technique is not attempted in this paper, but useful references are quoted to provide the necessary initial reading. Some of the useful resources suggested are Hahn, Doganaksoy, and Hoerl (2000), Hammer and Goding (2001), and Pande and Holpp (2002). Many authors have discussed the merits of SS by thoroughly reviewing the literature, and hence the following reviews should be adequate for understanding the growth and expansion of the literature pertaining to Six Sigma:
1. Six Sigma Literature: A Review and Agenda for Future Research (Brady & Allen, 2006)
2. Six Sigma: Literature review and key future research areas (Nonthaleerak & Hendry, 2006)
3. Six Sigma: A literature review (Oke, 2007)
4. The origin, history and definition of Six Sigma: a literature review (Prabhushankar, Devadasan, Shalij, & Thirunavukkarasu, 2008)
5. Six sigma: A literature review analysis (Cagnazzo & Taticchi, 2009)
6. Six Sigma: A literature review (Tjahjono et al., 2010)
All these reviews cover the concepts, theory, and applications of the SS technique in many diverse areas, which has prompted many companies to adopt the technique. Because the acceptable quantity is often expressed as a percent of output, the corresponding sigma levels are shown in Table 5. When the process sigma level is plotted against the percent acceptable quantity, the relationship takes the form shown in Figure 1. From this figure it is evident that as the quantity acceptable approaches 100%, the process sigma level reaches six, and a further increase is not necessary. Though a Six Sigma process still yields 3.4 defects per million, it is considered as good as zero defects. It is observed that, almost as a habit, companies continue to accept three or four sigma performance levels as the norm, despite the fact that these processes can have between 6,200 and 67,000 problems per million opportunities (Pyzdek, 2003).
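The conversion between DPMO and the process sigma level mentioned above can be sketched as follows, assuming the conventional 1.5-sigma long-term shift. This convention is widely used, but published conversion tables occasionally differ in the last digit, so the sketch is illustrative rather than authoritative:

```python
from statistics import NormalDist

_ND = NormalDist()  # standard normal distribution

def dpmo_to_sigma(dpmo):
    """Convert defects per million opportunities to the process sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    yield_fraction = 1 - dpmo / 1_000_000
    return _ND.inv_cdf(yield_fraction) + 1.5

def sigma_to_dpmo(sigma_level):
    """Inverse conversion: process sigma level back to DPMO."""
    return (1 - _ND.cdf(sigma_level - 1.5)) * 1_000_000
```

Under this convention, 3.4 DPMO corresponds to a sigma level of six, and a three sigma process to roughly 66,800 DPMO, in line with the figures quoted above.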
Considering this comment, in this paper the optimization attempt is towards reaching four sigma first and later moving towards Six Sigma. But essentially the global standards mandate that processes be benchmarked against Six Sigma only, as Six Sigma is considered the ultimate stamp of acceptance in a highly competitive environment. Continuous quality improvement of the products and/or services offered by a company is essential for survival in the market and for meeting the demands of the customers. Hence organizations are continuously searching for new techniques and tools to enable them to improve quality. Benchmarking is one such technique; it supports quality improvement by comparing the performance, or any other measurable attribute, with those who are doing it better. In essence, benchmarking involves comparing with the superior performer, identifying the gaps, and taking proper action to overcome those gaps, thereby improving quality. This process is not a one-time application but has to be used as an ongoing process. Since new benchmarks are regularly created, it is necessary that the spirit of benchmarking is maintained. Benchmarking has historically been used as a technique for comparing anything, a product, service, performance, output, or any measurable characteristic, with the superior performer or the "best in class", so as to find the gaps that prompt improvement. After the publication of the success story of Xerox Corporation of USA, which adopted the technique to defend against stiff competition from Japanese manufacturers in the copier market (Camp, 1989), the application of benchmarking has increased manifold. Though benchmarking exercises have been in existence for a long time, it has become customary in recent times to probe whether the subject under consideration has been benchmarked against the best in class (Elmuti & Kathawala, 2013). The term "benchmark" was included in the guidelines of the prestigious US Quality Award,
Malcolm Baldrige National Quality Award, in 1985, and benchmarking became a qualifying criterion to participate in the award process.
The literature related to benchmarking for quality improvement, covering concepts, models, and applications, is abundant, and has thus attracted the attention of several researchers who have provided a comprehensive picture of the growth and spread of benchmarking studies across the globe. For a comprehensive view of the benchmarking literature, some of the prominent review papers can be consulted: (Zairi & Youssef, 1995), (Kozak & Nield, 2001), (Scott, 2011), and (Dattakumar & Jagadeesh, 2003).
These papers also illustrate the various applications of benchmarking besides indicating the popularity of the topic of benchmarking and its applications.
The present paper, which is focused on improving process performance through benchmarking the process capability, develops a generic benchmark that can be statistically established and expressed using the main process parameters, namely the process mean and the process standard deviation. These two parameters are also the building blocks of process capability assessment.
In this context, the globally accepted ultimate performance level, namely Six Sigma, was selected as the "benchmark" to be used to ascertain the quality of the process. Any process that exhibits the Six Sigma standard of performance would obviously be considered the "best performer", and this paper uses Six Sigma to mean, essentially, the ultimate level of comparison for a given process to be considered on par with the global standard.

The problem on hand
The problem considered here pertains to a discrete manufacturing process adopted in a company that supplies components to major auto manufacturers in India. Though many different components are produced by the company, for the purpose of illustrating the methodology only one component, namely a threaded fastener, is considered here. The component has one critical dimension, the core diameter, which has a specification of 3.4 ± 0.05 millimeter and is thus considered "critical to quality" (CTQ). The data pertaining to a batch of 125 components is shown in Table 6: 125 core diameter values in millimeters, arranged in 25 subgroups of five components each.

PRELIMINARY ANALYSIS
Before the process output data was used to analyze the process capability, the preliminary analysis included (1) plotting the histogram, (2) testing the data for normality, and (3) plotting the control charts.
The histogram of the data collected is shown in Figure 2. Using the Kolmogorov-Smirnov test, the P-value is found to be 0.016, which indicates only marginal conformance with normality. Further, the normal probability plot was also drawn, and the data was found to be only approximately normally distributed. This is acceptable, as perfect normality is not expected in an industrial process, as is the case on hand.
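As an illustration of the kind of normality screening performed here, the Kolmogorov-Smirnov distance between a sample and a normal distribution fitted to it can be computed as below. This is only a rough sketch using simulated, hypothetical diameter data; it reports the test statistic rather than the P-value quoted above:

```python
import random
from statistics import NormalDist, mean, stdev

def ks_statistic(data):
    """Kolmogorov-Smirnov distance between the empirical CDF of the data
    and a normal distribution fitted to the sample: a rough screening
    check, not a substitute for a full test with a P-value."""
    fitted = NormalDist(mean(data), stdev(data))
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = fitted.cdf(x)
        # Compare against the empirical CDF on both sides of its step at x
        d = max(d, abs(f - i / n), abs(f - (i + 1) / n))
    return d

random.seed(1)
# Hypothetical batch of 125 core diameters, mimicking the sample size used
sample = [random.gauss(3.4, 0.02) for _ in range(125)]
d = ks_statistic(sample)
```

A small distance (well below the critical value for the chosen sample size) is consistent with approximate normality; large distances flag the need for the non-normal methods cited earlier.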

Figure 2. Histogram of core diameter values
The typical control charts, namely the x-bar chart and the R-chart, were plotted to ascertain the stability of the process. These two charts are shown in Figures 3 and 4 respectively. From these charts it is evident that the process is under statistical control and stable. Further, the charts do not exhibit any questionable patterns, and hence further analysis was carried out.
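The control limits behind such x-bar and R charts follow directly from the subgroup averages and ranges. A minimal sketch, assuming subgroups of size five and the standard chart constants A2 = 0.577, D3 = 0, D4 = 2.114 for that subgroup size:

```python
def xbar_r_limits(subgroups):
    """Centre lines and control limits for x-bar and R charts, assuming
    subgroups of size 5 (constants A2, D3, D4 are for n = 5).
    Returns (LCL, centre, UCL) tuples for each chart."""
    A2, D3, D4 = 0.577, 0.0, 2.114
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(xbars) / len(xbars)   # grand mean: centre line of x-bar chart
    rbar = sum(ranges) / len(ranges)    # mean range: centre line of R chart
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
        "R": (D3 * rbar, rbar, D4 * rbar),
    }
```

Feeding the 25 subgroups of Table 6 into such a function reproduces the limits drawn in Figures 3 and 4; the constants would need to change for other subgroup sizes.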

Process capability analysis
Considering the data available, the typical process capability assessment was made and the typical process capability indices were calculated. These are shown in Table 6. It is observed that the process spread exceeds the specification spread, and thus out-of-specification values are expected. Further, the process mean is not centered, and the process variance also needs to be reduced. The process capability indices clearly reveal that the process is not capable of meeting the requirements and is currently producing defects. This obviously demands improvement of the process through proper process control. In the next step, when the process is assessed for Six Sigma capability, it is observed that the process is currently performing at a sigma level of about 2.63, as shown in Table 7, which is inadequate. Considering a sigma value of at least 4.00, it was decided to find the values of the process mean and standard deviation needed to reach the desired result. Using a process sigma level of 4.00 as the threshold value, Excel's "what if" analysis is performed and the possible combinations of process mean and standard deviation are established. These values are shown in Table 8. Because the intention is to have a process sigma of at least 4.00, only the desirable combinations of process mean and standard deviation are selected and shown in Table 9. From Table 9 it is understood that if, by strict monitoring and proper centering, the process mean can be controlled within a distance of 0.01 mm from the target value of 3.4 mm, then the process standard deviation can range from 0.015 to 0.0225 mm to yield a process sigma of more than 4.00. Hence the process manager can decide which quality parameter can be more easily "fixed", the mean or the variance. By controlling the process standard deviation within a range of 0.015 to 0.0225 mm, and ensuring the process mean is within the range of 3.39 to 3.41 mm, the process manager should be able to reach a process sigma value greater than 4.00. This kind of trade-off enables better process control and leads to a substantial lowering of rejects, as observed in the DPMO column of Table 9, from a previous DPMO of 130,006 when the process sigma level was less than 3.00. The lowest DPMO value is 788, occurring for a process mean of 3.40 mm and a standard deviation of 0.01750 mm, with a corresponding sigma level of 4.66.
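The grid search that Excel's what-if table performs over candidate values of the mean and standard deviation can be sketched as follows. Here the out-of-specification fraction of a normal process is converted to a sigma level via the 1.5-shift convention; the exact spreadsheet model used in the paper may follow a slightly different convention, so the numbers produced are illustrative rather than a reproduction of Table 9:

```python
from statistics import NormalDist

LSL, USL = 3.35, 3.45   # specification: 3.4 +/- 0.05 mm
ND = NormalDist()

def process_sigma(mean, sd):
    """Sigma level and DPMO implied by a candidate (mean, sd) pair:
    two-sided out-of-spec fraction of a normal process, converted with
    the conventional 1.5-sigma shift."""
    out = ND.cdf((LSL - mean) / sd) + 1 - ND.cdf((USL - mean) / sd)
    dpmo = out * 1_000_000
    return ND.inv_cdf(1 - out) + 1.5, dpmo

# Sweep candidate means and standard deviations, keeping only the
# combinations that reach at least a 4.0 sigma level (the "feasible" set)
feasible = []
for m in [3.38 + 0.005 * i for i in range(9)]:         # 3.380 .. 3.420 mm
    for s in [0.0150 + 0.0025 * j for j in range(7)]:  # 0.0150 .. 0.0300 mm
        level, dpmo = process_sigma(m, s)
        if level >= 4.0:
            feasible.append((round(m, 3), round(s, 4), round(level, 2)))
```

As expected, the feasible combinations cluster around a well-centered mean with the smallest standard deviations, mirroring the trade-off discussed above.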

Reinforcing the analysis with simulation
Having found the desirable combinations of the process mean and standard deviation, it is now possible to work backward to find the desirable values of the individual variable, the core diameter, such that the process sigma level is at least 4.00. This can easily be done by generating a set of normally distributed values using the optimal values of the process mean and standard deviation given in Table 9. For example, for a combination of a process mean of 3.38 and a standard deviation of 0.02500, a desired set of core diameter values can be generated using Excel itself, to observe how the individual values would be distributed.
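The same backward step can be sketched outside Excel by drawing normally distributed values at a chosen operating point. The operating point below is one of the desirable combinations discussed above, and the generated values are hypothetical expected diameters, not actual process output:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical improved operating point and the 3.4 +/- 0.05 mm specification
MEAN, SD = 3.40, 0.0175
LSL, USL = 3.35, 3.45

# Simulated batch of expected core diameter values
batch = [random.gauss(MEAN, SD) for _ in range(10_000)]
in_spec = sum(LSL <= x <= USL for x in batch) / len(batch)
```

At this operating point nearly all simulated values fall within specification, consistent with the low DPMO figures discussed above; repeating the exercise at other (mean, standard deviation) pairs shows how quickly the in-spec fraction degrades.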
As these values are generated under hypothetical conditions, it is important to note that they are only the expected values of the core diameter and are not to be taken as the actual output of the process. Further, the actual output is influenced by several variables, and that pattern is not captured by the simple simulation model described here. For a better visualization of how changes in the process mean and standard deviation affect the process sigma level, the values from Table 9 are mapped onto the Six Sigma scale, as shown in Figure 5, with process sigma values along the vertical axis and the process mean along the horizontal axis. In this figure, the different values of the process mean are plotted against various values of the process standard deviation, leading to process sigma values ranging from 2 to 5.
As the desired target value of the process sigma level is 4 and above, only those combinations of mean and standard deviation need to be selected. This is shown as the shaded area at the top of the chart, which can be called the feasible region. Using Figure 5, it is now possible to identify the operating levels of the process which enable the desired sigma level of performance. For a given sigma level of performance, the process mean and standard deviation can be identified and used as process parameters. For example, if the process standard deviation is 0.0275 mm, the process mean, when set at 3.38 mm, would yield a 4 sigma level of performance; the sigma level will then decrease as the mean increases. This clearly indicates that if both the process mean and the standard deviation increase, the sigma level of the process decreases. Thus it is left to the process manager to decide what level of sigma is to be targeted, to set the process parameters accordingly as found from Figure 5, and then to aim to maintain them in the process. Thus Figure 5 can serve as a decision-making aid to enable the selection of process parameters for a desired sigma level of process performance.

CONCLUSION
Statistical process control in earlier times typically involved ensuring that the process is under statistical control and stable. The control charts served the purpose of assessment, and the process capability analysis carried out in addition completed the assessment. These two assessments helped process managers to control their processes to ensure better output and thus enabled smoother production. With the advent of process improvement, and with Six Sigma becoming a major development, it became necessary for process managers to continuously improve and also to assure minimum global standards. This requires a thorough understanding of the Six Sigma metrics, which are globally recognized and used as common measures of process quality. Both the Six Sigma level of the process and the DPMO have to be continuously monitored.
Today it is well known that customers expect "error free" output, and hence process managers have to redefine process performance as per the new standards set by the customers. In this context, process managers obviously look for benchmarks against which they can compare their processes, understand the gaps, and proceed towards improvement. Six Sigma is one such benchmark that is easy to understand and that helps convince customers when selecting benchmarking initiatives.
As Six Sigma is a commonly accepted benchmark for defining quality, it is quite logical and prudent to choose Six Sigma for the purpose of comparison. In this paper, such an attempt has been made to illustrate how global benchmarking of a process can be done using Six Sigma metrics and, further, how the "what if" analysis feature of Excel can give a clear picture of the desirable values of the process parameters. The inherent assumptions, such as normality of the output and unchanged process behavior, are also made here, and the usual limitation of basing the assessments on sample values also exists. However, the paper serves the important purpose of helping process managers to benchmark against the Six Sigma metrics and thus aim towards global standards. The charts and tables illustrated in this paper provide a convenient method of selecting the process parameters for a desired level of process performance. This trade-off provides a wide opportunity for setting the process parameters depending on the resources. The idea is to illustrate how process improvements can happen by controlling the process parameters, and to move toward a predictive model to enable improved results. The overall objective is to develop and demonstrate a decision-making model through the established techniques.

Figure 1. Process sigma level and acceptable output

Figure 3. Control chart for averages

Figure 5. Mapping of process sigma level (vertical axis) for process mean (horizontal axis) and process standard deviation along the curves

Table 4 : Conversion between DPMO and process sigma level

An important metric in the SS technique is Defects Per Million Opportunities (DPMO), which refers to the unacceptable fraction expressed per one million opportunities. A typical conversion between DPMO and the process sigma level is shown in Table 4 (Six Sigma Daily, 2016).