Six Sigma and S88 Unite for Batch Automation Productivity Improvement

Abstract

Six Sigma is a quality improvement methodology that can be applied to any type of process. It has been endorsed by, and heavily integrated into, several major chemical companies.

This paper outlines how the program is used to perform process studies aimed at improving batch productivity through automation in a manufacturing environment.

Introduction

Much has been written about Six Sigma, and many companies have adopted its methodologies to improve their internal processes and/or process operations. Motorola, long recognized as the originator of Six Sigma, claimed $2.2 billion in savings from 1987 to 1991. Texas Instruments claimed a yield increase from 84.3% to 99.8%. Allied Signal claimed a 68% reduction in process defects within 4 months of implementation. And General Electric, which adopted Six Sigma in 1995, claimed savings escalating from $170 million in 1996 to $1.5 billion in 1999, on expenses of $450 million across over 37,000 projects.

So what is Six Sigma? Six Sigma is a rigorous, focused application of proven quality principles and techniques used to measure, analyze, improve, and control processes, with the objective of reducing defects to six standard deviations (six sigma). Sigma, σ, is the Greek letter used by statisticians to denote the variability (standard deviation) in any process. Incorporating elements from the work of many quality pioneers, Six Sigma aims for virtually error-free business performance. A company's performance is measured by the sigma level of its business processes. Traditionally, companies accepted three or four sigma performance as the norm, despite the fact that such processes create between 6,200 and 67,000 problems per million opportunities. The Six Sigma standard of 3.4 problems or defects per million opportunities (DPMO), or 99.99966% yield, is a response to the increasing expectations of customers and the increased complexity of modern products and processes.
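
These sigma-to-DPMO figures follow directly from the normal distribution. As a minimal sketch (Python, assuming the conventional 1.5σ shift between short-term and long-term performance), the conversion can be reproduced as follows:

```python
# Sketch: converting a sigma level to defects per million opportunities (DPMO).
# Assumes the conventional 1.5-sigma long-term shift, so "six sigma" corresponds
# to a 4.5-sigma one-sided tail, i.e. the 3.4 DPMO quoted above.
from statistics import NormalDist

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """One-sided defect rate per million at a given short-term sigma level."""
    tail = 1.0 - NormalDist().cdf(sigma_level - shift)
    return tail * 1_000_000

for level in (3, 4, 6):
    print(f"{level} sigma -> {dpmo(level):,.1f} DPMO")
# 3 sigma -> ~66,807 DPMO, 4 sigma -> ~6,210 DPMO, 6 sigma -> ~3.4 DPMO
```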

But more than a quality program, Six Sigma is a disciplined approach to problem solving. Six Sigma employs a set of statistical "tools" designed to help measure, analyze, improve, and control processes. It is a statistical measurement of process capability and a way to compare different processes.

It would be a mistake to think that Six Sigma is about quality in the traditional sense. Traditional quality programs, which focus on internal requirements, have little to do with Six Sigma. Six Sigma is externally focused and is designed to improve productivity by concentrating on defect prevention, cycle time reduction, and cost savings. Unlike mindless cost-cutting programs, which reduce value and quality, Six Sigma identifies and eliminates costs that provide no value to customers.

Six Sigma Steps

The set of tools and methodology of the Six Sigma process includes 12 steps organized into four major categories of three steps each: Measure, Analyze, Improve, and Control, known as MAIC. Measure asks: what is the frequency of the defects? Analyze determines when and where the defects occur. Improve answers how the process can be fixed. And Control determines how the process can stay fixed.

The steps are:

Phase     Step                                                           Focus
Measure   1 - Select Product or Process CTQ Characteristic               y
              (Critical-to-Quality from the customer's perspective)
          2 - Define Performance Standards                               y
          3 - Validate the Measurement System                            y
Analyze   4 - Establish Process Capability (Zst)                         y
          5 - Define Performance Objectives                              y
          6 - Identify Variation Sources                                 x1...xn
Improve   7 - Screen Potential Causes for change in y and identify       Vital few xi
              the vital few xi from the trivial many
          8 - Discover Variable Relationships                            Vital few xi
          9 - Establish Operating Tolerances on the vital few xi         Vital few xi
Control   10 - Validate the Measurement System for the xi                Vital few xi
          11 - Confirm Ability to Control the vital few xi               Vital few xi
          12 - Implement Control Charting on the vital few xi            Vital few xi

where y is the controlled variable and the xs are the manipulated variables.

Some companies have added a step 0 called Define or Learn, and the process is then called "DMAIC". Others have added subsequent steps 13-15, called Translate or Leverage ("LMAICL"), to emphasize that a successful implementation can be applied to other, similar processes for similar benefits.

Within each of these steps, certain methodologies exist to determine key objectives.

For example, in step 1, Critical-to-Quality (CTQ) characteristics are determined using internal tools such as Quality Function Deployment (QFD) and Failure Modes and Effects Analysis (FMEA). These tools help determine which part of the process to attack and assist in determining and prioritizing projects for further study.

Defining performance standards, step 2, uses process mapping to determine the boundaries of the process, including inputs, activities, and outputs. Step 2 also determines the performance of the process via statistical metrics computed from process data. These statistical methods include analyzing the location, spread, and shape of continuous data, or the number of defects in discrete data, leading to the calculation of the six sigma metric (Z). With this metric, the accuracy and precision of the process can be measured, and with that comes the ability to determine whether problems are short-term or long-term and whether they lie in the basic process or in process control. This is necessary for creating the basic problem statement leading to approval for further study.

Validating the measurement systems, step 3, ensures that the problem is a control problem and not a measurement problem. Process variations can be attributed to measurement problems in the controlled variable, weak controllability, or variations in manipulated or disturbance variables. In any case, the source of the variations has to be identified in order to take corrective action. The idea is that if the process variation as measured is "p" and the measurement variation is "m", then the control variation "c" is "p - m", a positive number, which should be better than 75% of "p" before any improvement in process control can occur. If it is determined that the variation is due to the measurement, then the measurement system must be fixed before going any further.
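
As a minimal sketch of the screening rule just described, implemented exactly as stated above (the numbers are hypothetical):

```python
# Step-3 screening rule as stated in the text: with measured process
# variation p and measurement variation m, the control variation is taken
# as c = p - m, and c should exceed 75% of p before process-control
# improvements are pursued. Example figures are hypothetical.
def measurement_screen(p: float, m: float, threshold: float = 0.75) -> bool:
    """Return True if variation is dominated by the process, not the sensor."""
    c = p - m                      # control (true process) variation
    return c > threshold * p       # otherwise: fix the measurement system first

print(measurement_screen(12.0, 2.0))   # True: proceed with the control study
print(measurement_screen(12.0, 5.0))   # False: measurement problem, fix first
```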

Establishing the process capability, step 4, is a statistical exercise that sets the baseline for current process performance.

Defining performance objectives, step 5, targets opportunities and establishes improvement goals. Performance benchmarking and process baselining highlight this activity. This step establishes how much the process can be improved and thus determines the benefit number. At this point, a decision can be made whether to pursue funding of a project.

Identifying variation sources, step 6, begins to list the measured variables, x, that might be used to affect the controlled variable, y. Relationships and statistical evaluations separate those variables that have a great effect from those that do not. Often, this is a great revelation.

Screening potential causes, step 7, provides diagnostic methods to uncover the vital xs, lump together the trivial xs, and determine how much impact they have on the response of the controlled variable. This is done either through a design of experiments or through data mining of historical process data.

Discovering variable relationships, step 8, is a statistical analysis that obtains the transfer functions, y = f(x1...xn), determines the optimal levels for the vital xs, confirms the results against independent data, and estimates the noise of the trivial xs from the design of experiments.
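
As an illustration of what obtaining a transfer function can look like in practice, the sketch below assumes a linear form for y = f(x1...xn) and fits it by ordinary least squares; the data and the number of variables are invented for the example:

```python
# Sketch of step 8 under a simplifying assumption: the transfer function
# y = f(x1...xn) is approximated as linear and fitted by ordinary least
# squares. All data values are illustrative only.
import numpy as np

# Hypothetical designed-experiment data: rows are runs, columns the vital xs.
X = np.array([[1.0, 20.0], [1.2, 22.0], [0.9, 19.0], [1.1, 24.0], [1.3, 21.0]])
y = np.array([5.1, 5.9, 4.7, 6.0, 5.8])

A = np.column_stack([np.ones(len(X)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # fit y = b0 + b1*x1 + b2*x2
print("intercept and coefficients:", coef)

y_hat = A @ coef                                 # predictions for confirmation
print("residuals:", y - y_hat)
```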

Establishing operating tolerances, step 9, determines the range of operation for each x. Tolerances are calculated based on measurements of y and the xs. Relationships are determined for the xs that do not affect y, the xs that affect the mean, the xs that affect variability, and the xs that affect both mean and variability.

Validating the measurement systems, step 10, is the same as step 3 except that it is performed for the vital xs instead. If the measured variables have a lot of variation, the controlled variable is certain to vary as well. Often step 3 is delayed until the vital xs are determined, so that validation of all measured variables is performed around the same time. Some feel that all measurements must be validated at step 3, or steps 4-9 are worthless, and they modify the Six Sigma process accordingly.

Process capability, step 11, is basic Statistical Process Control (SPC). Root causes are identified through control charts, and targeted levels are maintained through detection and prevention. Control charts such as Xbar/R, EWMA (exponentially weighted moving average), and CUSUM are kept. This is the basic stage for the implementation of the batch automation.
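
As a sketch of one of the control charts named above, the following implements a standard EWMA chart; the smoothing constant, limits multiplier, and readings are assumptions for illustration:

```python
# Minimal EWMA control-chart sketch for step 11. The smoothing constant lam,
# limits multiplier L, and example data are assumptions; the limits follow
# the standard EWMA formula for an in-control mean mu0 and std. dev. sigma.
import math

def ewma_chart(data, mu0, sigma, lam=0.2, L=3.0):
    """Yield (ewma, lower_limit, upper_limit, in_control) per observation."""
    z = mu0
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        # limits widen toward the asymptotic +/- L*sigma*sqrt(lam/(2-lam))
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        yield z, mu0 - width, mu0 + width, (mu0 - width) <= z <= (mu0 + width)

readings = [243, 250, 239, 261, 247, 255, 270, 268, 275]   # hypothetical
for z, lo, hi, ok in ewma_chart(readings, mu0=243.2, sigma=10.0):
    print(f"ewma={z:6.1f}  limits=({lo:6.1f}, {hi:6.1f})  {'OK' if ok else 'ALARM'}")
```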

Implementing process controls, step 12, means setting up control mechanisms, monitoring process variables, and maintaining an "in control" process through control charts and procedures. The results are audited, benefits are documented, and an action plan is prepared to maintain the benefits.

Productivity Improvement Program (PIP)

Productivity comes in two flavors: potential productivity and actual productivity. Potential productivity is the known maximum possible value added per unit of input. Actual productivity is the current value added per unit of input. The difference between potential and actual productivity is the potential for improvement. Manufacturers improve productivity by focusing on improving production and its quality at the least cost and waste, i.e., producing products and services better, faster, and cheaper.

Six Sigma methodology is also promoted by certain manufacturers to their suppliers. To this end, automation suppliers enter into a Productivity Improvement Program (PIP) to assist in defining and prioritizing automation projects in accordance with internal investment criteria. The purpose of the program is to leverage local engineering resources with the automation suppliers' resources in order to generate increased productivity from the processes. The automation suppliers provide technical resources for process control, process measurement, data analysis, benefit and cost calculation, and external benchmarking in order to quantitatively justify automation projects. The manufacturer provides leadership in identifying opportunities for study and in justifying implementation after estimates of benefits and costs have been performed.

Opportunities for investigation are identified and provided to the automation suppliers. The automation supplier's PIP team members then perform a detailed study to quantify the benefits, which are used to justify an automation project that will realize and sustain them.

The first step in the benefit study definition is to identify whether an opportunity exists. Economic justification of advanced automation usually takes one of the following forms:

  • Estimates based upon experience
  • Estimates based upon analysis of baseline, or "before control" data
  • Estimates based upon previous post-project audits in comparison to a baseline

Estimates based upon experience or "rules of thumb" are not generally acceptable for justifying projects. They do not provide a reliable return on investment; the accuracy of such benefit estimates is ±70% or more. However, these kinds of estimates are useful in prioritizing opportunities for which a detailed study would be called for.

Engineers and operators generally have a qualitative feel for improvements but need objective, quantifiable evidence. Estimates based upon post-project audits against a baseline determine the actual benefits of an automation project; however, this means a project has already been implemented and was justified on some other criteria, so such audits are not a basis for justifying a project. Estimates based upon analysis of baseline data allow quantification of moving a process variable from one operating point to another through some type of automation. The assumption is that automation provides improved regulation of key variables and can push the process to an operating constraint. The amount the process can be moved, calculated by statistical methods, is the basis from which monetary benefits can be calculated.

In estimating economic benefits, the sources of the benefits or their objectives guide the analysis. The sources of monetary benefits from improved batch control include:

  • Increased Throughput, Increased Production, or Improved Yields
  • Reduced operating costs from raw materials, energy, and/or manpower
  • Reduction in off-specification/rerun (Quality)
  • Reduced Batch Cycle Times (Throughput/Production)
  • Loss Prevention (Improved Safety)
  • Environmental or Regulatory Requirements
  • Decreased start-up or shutdown, or recipe setup times (Throughput)
  • More consistent product (Quality)
  • Enhanced Operating flexibility (Resource Utilization)
  • Increased operator effectiveness

These benefits can be summarized by:

  • Benefits from operating better, usually against hard (equipment capability) or soft (specification) constraints
  • Benefits from compressing time
  • Benefits from avoiding unnecessary expenses such as environmental clean-up or product rerun
  • Benefits from more efficient resource utilization

The primary question to be answered is: "If the quality measurement could be controlled better, how much more money could be made?" Related questions include:

  • What parameters affect the time it takes to make a batch?
  • What problems result in bad product or lost production?
  • What are the operational problems, in order of importance?
  • Are there any process safety problems that could be mitigated?
  • Are there any tasks that would simplify the job if done differently?

A determination must be made whether the opportunity is a process problem, a measurement problem, a control problem, or an information problem. A process problem undergoes a different process called Design for Six Sigma (DFSS) and is not a MAIC process. If it is clear that quality cannot be sustained with the current operating equipment, a new process design is necessary. A measurement problem, as opposed to a control problem, is identified early in the MAIC process, at step 3.

In order to evaluate possible productivity improvements, two sets of data about the process are required. The first set consists of process design information, which allows the capability of the plant to be established. The second set of information consists of actual process, production and operating data for comparison with plant design documents in order to determine opportunities for improvement.

Benefit Analysis

Benefits from operating better can use the following basic equations:

  • Operating Profit ($/day) = Unit Upgrade ($/lb) * Unit Throughput (lb/day) - Operating Costs ($/day)
  • Benefit ($/day) = New Operating Profit - Baseline Operating Profit
  • Benefit ($/yr) = Benefit ($/day) * number of on-stream days per year
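
A direct transcription of these equations into code, with hypothetical example figures:

```python
# The benefit equations above as a small helper; all inputs are invented
# example figures, not data from the paper.
def operating_profit(upgrade_per_lb, throughput_lb_day, op_cost_per_day):
    return upgrade_per_lb * throughput_lb_day - op_cost_per_day

baseline = operating_profit(0.05, 400_000, 15_000)   # $/day before control
improved = operating_profit(0.05, 430_000, 15_500)   # $/day after control

benefit_per_day = improved - baseline
benefit_per_year = benefit_per_day * 330             # assumed on-stream days
print(f"benefit: ${benefit_per_day:,.0f}/day, ${benefit_per_year:,.0f}/yr")
```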

The benefits from operating better can be achieved through reducing the variation in quality of the final controlled variable (the composition of a key product). Reducing variability does not in itself achieve any benefits; what it allows is for a unit to be operated more closely to constraints without violating them.

To determine the operating points and variations of the key variables, data from the process must be collected and analyzed. Data analysis and engineering includes:

  • Data Reduction and statistical interpretation
  • Estimated Economic Benefits
  • Basic Advanced Control Designs
  • Budgetary Project Costs and hardware and software requirements
  • Preliminary Project Schedule

Data reduction includes the use of computing tools to determine whether the collected data should be used or whether individual data points should be excluded from the analysis. Data analysis provides an objective manner of communicating by collecting quantifiable facts about a problem. It establishes baseline information about a process or its outputs, measures the degree of a direction or change, and creates a common basis for decisions. Moreover, data analysis is used to quantify the impact of a proposed solution by determining statistical boundaries through which costs and benefits can be estimated. It can then be used to compare before and after pictures of the process to determine actual benefits. Thus, data represents history and is the basis for predicting the future with inferential statistics.
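
As a minimal sketch of one common data-reduction step (the k-sigma screening cutoff is an assumption for illustration, not a rule prescribed here):

```python
# Flag points far from the mean for review before analysis. The k-sigma
# cutoff (two sigma here) and the data are illustrative assumptions.
from statistics import mean, stdev

def flag_outliers(data, k=2.0):
    """Return (kept, flagged) lists using a k-sigma screen."""
    m, s = mean(data), stdev(data)
    kept = [x for x in data if abs(x - m) <= k * s]
    flagged = [x for x in data if abs(x - m) > k * s]
    return kept, flagged

times = [243, 251, 238, 247, 470, 244, 239, 250]   # hypothetical batch times
kept, flagged = flag_outliers(times)
print("kept:", kept, "flagged for review:", flagged)
```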

The amount of data or its frequency of measurement for historical collection depends upon the time constant of the process. The faster (shorter time constant) the process, the faster the data collection time must be so as not to miss normal excursions or variations.

Snapshots of data taken infrequently will often miss key outlying points. The process may have wandered outside some soft constraint and been brought back to normal. Unfortunately, this is the type of data most often kept in manual log charts. Other problems with logbooks make data gathered from this source problematic:

  • Often the data is not written with enough significant digits.
  • The data often looks totally unresponsive when compared to other readings.
  • The snapshots, often hourly, are not kept at regular intervals. Cases where operators fill in the hourly log sheet towards the end of the evening or midnight shift are not uncommon, especially if they have been performing maintenance or other duties.

Averages of data induce a filtering effect that tends to hide or smooth excursions, so outlying points are not captured. This is the data most often kept electronically by a data historian.
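
A tiny demonstration of this filtering effect, with invented readings:

```python
# The raw excursion to 320 disappears once values are averaged in blocks,
# as a historian storing block averages would do. Data are hypothetical.
raw = [240, 238, 320, 242, 239, 241, 237, 244]

block = 4
averaged = [sum(raw[i:i + block]) / block for i in range(0, len(raw), block)]

print("raw max:     ", max(raw))        # 320: the excursion is visible
print("averaged max:", max(averaged))   # 260: the excursion is smoothed away
```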

If the environment is a factor, then the analysis may have to be made for different time periods; examples are differing operations between day and night, or between seasons (winter/summer). Adjustments also have to be made if the operating objectives changed during the data collection period, such as a period of increased throughput versus a period of increased quality. What is desired is a time period long enough to be comfortably handled, with data collected at a frequency matching the process time constant for dynamic response. The data should be collected during normal operation. Unaccounted-for process moves or other unusual operation should be noted so that data within that time period can be discounted. The objective is to get data from representative operation. Of course, the biggest problems with the data are often that:

  • The last period of normal operation was several months ago
  • Snapshot data is stored for only a few days, while older historical data has often been averaged.

With batch processes, in addition to process measurements collected at periodic intervals, the most important data includes batch cycle times and the turnaround time from one batch to the next.

Once the data has been collected and massaged, statistical measurements can be performed. A basic set of analytical tools is required for the analysis. The main requirement is the ability to correlate data sets and to calculate the mean and variability (standard deviation), which are used to calculate the process performance z(lt) and process capability z(st).

Statistical analysis of the data is handled the same way in each case, i.e., the mean and standard deviation are calculated as:

mean: x̄ = (1/n) Σ xi
standard deviation: s = [ (1/(n−1)) Σ (xi − x̄)² ]^(1/2)

Many specialized statistical packages are available to calculate the mean and standard deviation from the data. Also, most spreadsheets and calculators can calculate these parameters automatically.
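
For instance, the same calculations with Python's standard library (stdev uses the sample, n−1 form shown above); the data values are hypothetical:

```python
# Mean and sample standard deviation, matching the formulas above.
from statistics import mean, stdev

data = [243, 251, 238, 247, 244, 239, 250]   # hypothetical batch times (min)
print(f"mean = {mean(data):.1f}, s = {stdev(data):.1f}")
```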

Benefit Estimation

The definition of the opportunity specifies the area in which the opportunity will be found: for example, production rate will be increased, batch time will be reduced, or some quality measure will be improved. This defines how the productivity improvement will be measured, and data for those measurements (rate, time, or quality) must be collected.

The basis for estimating benefits via analysis of "before-control" and "after-control" data is to calculate how much the mean of the critical controlled variable can be moved in a profitable direction. This is expressed as:

∆x̄ = |x̄c − x̄|

where the subscript "c" denotes after-control performance.

Figure 1. Before/After Control analysis [1]

The above example demonstrates how shifting the mean towards the constraint can increase profitability. The amount of profitability, then, becomes the subject of the economic calculations.

Martin et al. [2] list five rules to determine how much the mean can be shifted with automation when actual "after-control" data is not available: the same limit rule, the same percentage rule, the final percentage rule, the curvature rule, and the achievable operation rule.

These rules are certainly valid and often used. However, Six Sigma applies another set of statistical evaluations. First, the process performance is calculated. This metric, known as zlt (long-term z), assesses the current performance of the process. The same data also yield periods where the system was running well, whether due to more experienced operators, continuity of control, or some other reason; performance over those periods is known as the process capability, zst (short-term z). Establishing these indices can determine whether the focus should be on shifting the mean, reducing variation, or, commonly, both. The indices also help determine how much the mean and variation can be improved.
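
One common way to compute such z indices is as the distance from the mean to a specification limit, measured in standard deviations. The sketch below assumes a one-sided upper limit and borrows the long-term and short-term statistics from the example in the next section; the limit itself is hypothetical:

```python
# z indices as (USL - mean)/sigma, assuming a one-sided upper specification
# limit (USL). z_lt uses long-term (overall) statistics; z_st uses the
# best-period (short-term) statistics. The USL value is an assumption.
def z_score(usl, mu, sigma):
    return (usl - mu) / sigma

usl = 300.0                         # hypothetical upper limit on batch time
z_lt = z_score(usl, 243.2, 49.6)    # long-term performance
z_st = z_score(usl, 200.0, 40.4)    # short-term capability
print(f"z_lt = {z_lt:.2f}, z_st = {z_st:.2f}")   # gap shows improvement room
```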

An alternative is to benchmark against similar processes running elsewhere. If such a similar process can be established as a well-defined standard, then the current process can be expected to achieve that performance as well. This gap analysis can determine what the process is "entitled" to.

Batch processes can certainly benefit from this analysis. An actual batch example is used to illustrate the use of the Six Sigma calculations.

Example

The following is an example of batch reaction times collected over a period of time:

Figure 2. Batch Runs

The data can be represented by a histogram of this period of time:

Figure 3. Frequency Histogram

Utilizing statistical tools, the following is a summary of the batch reaction times:

Current performance:
  Mean (zlt)            243.2 minutes
  Standard deviation     49.6 minutes
  Minimum               155 minutes
  Maximum               470 minutes
  5th percentile        185 minutes

Capability:
  Mean (zst)            200.0 minutes
  Standard deviation     40.4 minutes

∆x̄ = 243.2 − 200.0 = 43.2 minutes

This gives a capacity increase of 43.2/243.2 = 17.8%, as more batches could be produced in the same overall time frame. Thus, the capacity increase from reducing batch cycle times alone can be valued by multiplying the current production (lb/yr) by the product profit margin ($/lb) and by 0.178. This does not even account for capacity increases that could be calculated for improved turnaround times and other improvements.
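
Reproducing that arithmetic, and the resulting annual benefit, under assumed production and margin figures:

```python
# Capacity arithmetic from the example above; the production volume and
# profit margin are invented figures for illustration.
mean_lt, mean_st = 243.2, 200.0                  # minutes, from the summary
delta = mean_lt - mean_st                        # 43.2 minutes per batch
capacity_gain = delta / mean_lt                  # 0.178, i.e. 17.8%

production_lb_yr = 50_000_000                    # assumed current production
margin_per_lb = 0.10                             # assumed profit margin, $/lb
print(f"gain = {capacity_gain:.1%}")
print(f"benefit = ${production_lb_yr * margin_per_lb * capacity_gain:,.0f}/yr")
```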

S88 Unites with Six Sigma

S88 models unite with Six Sigma methodologies by highlighting where savings in time can occur in batch processing. S88 modularizes a recipe procedure in terms of unit procedures, operations, and phases. The data analyzed in the previous section covers the batch as a whole. However, if data can be collected for each unit procedure, or for each operation within a unit procedure, the savings in processing time can be further refined and focused. For throughput, the key times are typically charging, reaction, and discharging.

S88 also provides the design and implementation methods to achieve the calculated benefits. Papers presented during previous World Batch Forums and at this one, particularly in the "Benchmarking for Success" session, highlight the benefits achieved by implementing batch automation through S88 concepts of modularity, adherence to specifications, maintainability of the batch architecture, and more.

The productivity improvement process begins with the definition of an opportunity. As discussed previously, the opportunity for improvement may come from several areas and may be economically driven, operationally driven, or both. For batch processes the opportunity (or opportunities) will fall into one or more of the following areas:

  • Throughput/Production
  • Quality
  • Energy Conservation
  • Safety/Environmental/Regulation

As seen from the above example, the most readily quantifiable justification is a throughput increase. In addition, a case can be made that implementation according to S88 methodologies not only increases throughput but also improves quality: as batch times become more consistent due to predictable and repeatable execution of recipes, quality improves with that consistency. The quality improvement can also be measured through Six Sigma calculations and thus add to the benefit numbers, further improving the justification for project execution.

Conclusion

Six Sigma has enforced the evaluation of automation productivity projects according to a single set of criteria, removing much of the arbitrariness from different units competing for automation dollars. Thus, Six Sigma establishes a shared understanding among engineers and managers of how projects can be funded and approved.

Before, automation projects were approved based upon individual benefit calculations, previous experience and reputation, and even political influence, and those calculations would differ depending upon management preferences for justification. Post audits were rare; they weren't seen as needed because the project was already done and the money spent. The control phase of the Six Sigma process ensures that post audits are performed to prove the benefits justified by the analysis. Now, all automation improvement projects have the same basis for justification, comparison, and approval.

Not only have significant benefits been generated from identifying and implementing Six Sigma projects; just as significant is the number of projects rejected as a result of Six Sigma evaluation. Several projects, after definition and a qualitative determination of merit, were found to lack the required ROI under Six Sigma scrutiny. These rejections become cost savings: even though engineers wanted to implement projects based upon perceived improvements, the actual dollars could not be justified. Those improvement dollars could then be diverted to projects with a true, calculated, and justified return.

Six Sigma's magic isn't statistical or high-tech razzle-dazzle. Six Sigma relies on tried-and-true methods that have been around for decades. In fact, Six Sigma discards a great deal of the complexity that characterized Total Quality Management (TQM): by one expert's count, there were over 400 TQM tools and techniques. Six Sigma takes a handful of proven methods and trains a small cadre of in-house technical leaders to a high level of proficiency in applying them.

References

1. Conlin, M., "Revealed at Last: The Secret of Jack Welch's Success", Forbes, Jan. 26, 1998.

2. Martin, G.D., Turpin, L.E., and Cline, R.P., "Estimating control function benefits", Hydrocarbon Processing, June 1991.
