
Welcome to the Adept Scientific Press Room

23 May 2003

Press Release: Extend Quality Control into the Supply Chain

Article from Control magazine, December 21, 2001
By Rich Merritt, Senior Technical Editor

Sharing QC data with suppliers and vendors can reduce surprises and provide valuable feedback for process improvements.

What a revolutionary idea! Acquire quality control (QC) data while you are making the product, and then pass that data to your customer as part of your service. If you make a commodity product, it would differentiate you from your competition.

Better yet, take a lesson from the auto industry and require companies supplying you with feedstock to provide QC data on incoming materials. This would eliminate all the cumbersome lab testing and online sampling you do now, because you would already know the composition of the feedstock, and you could feed it into your process control system ahead of time.

Of course these are great ideas. That’s why people in the pulp and paper industry, among others, have been doing it for years. If you haven’t heard of supply chain QC before, it’s because it’s not exactly widespread, and because today’s software technology is just now making it easy for everybody to pass feedstock data through the supply chain.

Following the Paper Trail

Control engineers in the pharmaceutical, chemical, nuclear, and wastewater industries are familiar with the paper trails required by various government agencies. Quality data is required in huge volumes for regulatory purposes, so companies grudgingly produce reams of printouts and enshrine billions of bytes of process trivia into databases that never see the light of day unless some quality problem comes up down the road.

Parameter Pipelines

Quality control (QC) information flows through a process from the supplier to the manufacturing plant to customers. For example, QC data is gathered during raw material manufacturing by a laboratory information management system (LIMS) and stored in a process information management system (PIMS) database. QC data can be accessed by customers via the Internet through a web server, or by direct transmission via a virtual private network (VPN) over the Internet.
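To make that flow concrete, here is a rough sketch in Python of a batch QC record being captured and serialised for publication on a web server or transfer across a VPN. The field names and values are invented for illustration and do not represent any particular LIMS or PIMS product.

```python
# A minimal sketch of a QC record moving from a LIMS export toward a
# customer-facing feed. All names, fields and values are illustrative only.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class QCRecord:
    batch_id: str
    material: str
    measurements: dict      # e.g. {"moisture_pct": 0.4, "softening_point_c": 110.5}
    sampled_at: str         # ISO 8601 timestamp from the lab system

def export_for_customer(record: QCRecord) -> str:
    """Serialise a QC record to JSON, ready to publish on a web server
    or push across a VPN link to the customer's plant."""
    return json.dumps(asdict(record), indent=2)

if __name__ == "__main__":
    rec = QCRecord(
        batch_id="B-20011221-07",
        material="carbon pitch",
        measurements={"moisture_pct": 0.4, "softening_point_c": 110.5},
        sampled_at=datetime(2001, 12, 21, 14, 30, tzinfo=timezone.utc).isoformat(),
    )
    print(export_for_customer(rec))
```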

When a shipment of, say, Pirelli’s Miracle Elixir turns out to be tainted, examining all its stored QC data reveals how the problem happened, so it can be fixed in the future, and to whom the product was sent, so it can be recalled in the present. While those are noble goals, it seems like all that data might be put to a more immediate use, such as ensuring better process control further down the supply chain.

They do it in the food business, because they have to. “Once a food product has been through a manufacturing process, little can be done to alter its quality,” says Pearl Adu-Amankwa of the Food Research Institute, Accra, Ghana. Therefore, careful quality control throughout the process ensures good results. “Raw material control and process control are interrelated. The factory must not be deprived of an essential raw material while it awaits quality control clearance. This means that the work of quality control must be integrated with the factory management plan,” he explains.

Adu-Amankwa went on to describe the meticulous way the food industry uses quality control measures to test raw materials, put data into the production process, and eventually produce food that’s fit for consumption. While the problems of food production are magnified by the perishable nature of its feedstocks, the lessons learned there apply everywhere.

Carol Jackson, director of corporate accounts at OSI Software, says the paper industry, like the food industry, has been doing this for years. “Paper producers send production and quality information to the converters of their product whether the converter is on-site or miles away,” explains Jackson. “Exchanging information happens quite often where the feed to the converter’s machinery requires both process and quality parameters for the proper setup and operation of the production line. They need to know the moisture and weight profile across the sheet of paper or board that they are cutting, coating, or gluing to maximise their productivity and minimise losses to scrap.”

The Pipeline Continues

At the manufacturing plant, feedstock arrives and is stored in raw material tanks. QC data about the feedstock arrives via the VPN. Operators examine QC information and use it to adjust the control system. QC data obtained during manufacturing is processed by the plant’s LIMS, put into a PIMS database, and made available to downstream customers via the Internet and web browsers. Other systems in the manufacturing plant make use of all the QC data, including ERP, material balance, control optimisation programs, and similar software.

Jackson described examples of pulp and paper companies that have been sharing data like this for 10 or 12 years using OSI’s software products. One company made a me-too commodity product that suddenly became very successful when the company made basis weight and moisture data available to its customers online. With this information, corrugated box plants could set up their machinery to run more efficiently.

Other examples of supply chain quality control are out there but difficult to find; perhaps because it takes two to tango, sometimes three. “We purchase granulated blast furnace slag from the steel industry as a raw material to make blended hydraulic cements,” says Jeremy Baldridge, production manager at Lone Star Industries, New Orleans. “We require our slags to be water-cooled and have a certain particle size, and we have specific requirements on total sulfur content, total alkalis, and silica content, to name a few.”

The steel mills supply this information to Lone Star, which uses it to control the process. “The chemical makeup of the slag affects the rate of reaction during hydration,” says Baldridge. “We control the reactivity by controlling the particle size during finish milling.”

Lone Star, in turn, supplies all this data to its customers. “Our customers require full chemical and physical data such as particle size; seven, 14, and 28-day compressive strengths for fully hydrated product; and a variety of chemical species concentrations,” says Baldridge.

The logic in these examples is straightforward: If any company measures quality control parameters during or after its production processes for better control or for regulatory purposes, and stores all that QC data somewhere, it could easily make it available to its customers and claim a competitive advantage. On the other hand, since customers know the data is available, why don’t they demand that it be supplied with the product?

“In most cases, companies are not tackling the supply chain quality issue with as much vigor as they should,” admonishes Cliff Yee, president of Northwest Analytical. “The software tools are available to do this, and Wall Street analysts list supply chain quality problems as the number one threat to the value of the stocks of most global companies. So why aren’t companies making supply chain quality a huge priority?”

Checking Incoming Feedstock

“A manufacturer monitors incoming quality for two reasons,” explains Paul LeMert, director of business programs for Wonderware’s eManufacturing Systems Group. “One is to ensure that the supplier is providing material that meets the specification laid down in the purchase agreement and second, more importantly, to correlate the incoming quality attributes against performance during the manufacturing process.”

Automotive manufacturers figured the first part out years ago. They’ve also figured out how to get their suppliers to do all the quality control work for them at little or no cost. Russ Agrusa, president of Iconics, says that some auto companies demand suppliers provide quality data on 100% of the incoming parts.

One such supplier is Ordnance Engineering Associates (OEA), Denver, a supplier of airbag systems. It has to develop a database of information on every aspect of the testing and manufacturing process for every component, and then maintain the database for the lifetime of a vehicle.

“The automotive guys don’t check all the incoming parts,” says Agrusa. “They do spot checks and compare it to the supplier’s data. If you have a good record with them, they pass your parts easily. If you don’t have a good record, you get checked more often.” Agrusa says much of incoming QC checking is similar to the way airlines are double-checking passengers these days. “If you fit the profile for a troublemaker, you get checked,” he explains.

Of course, the automotive people are looking at discrete parts, which usually have a simple pass/fail quality check. What about feedstocks? The same pass/fail checks don’t apply as well. Nevertheless, the quality of incoming feedstock directly affects further manufacturing processes, and you can bet that suppliers will be judged by it.

Dale Evely, consulting engineer in I&C at Southern Co., Birmingham, Ala., requires QC information from suppliers. “Our feedstocks are primarily fuel of various types, such as coal, oil, and gas,” he says. “We periodically sample and analyse the fuel to make sure it meets our specifications.”

“Statistically measuring incoming materials gives a manufacturer good data on the capability of a supplier to consistently produce material to exact specifications,” says LeMert. He says it’s a good practice to compare and contrast materials from various suppliers and determine how they affect a given process. LeMert recommends that all process manufacturers make the following determinations based on incoming feedstock quality:

Q: How does my process respond to variations in each key material attribute?

Q: How do variations affect my cycle times/run rate?

Q: How do variations affect my downtime?

These questions are important, LeMert says, when you have multiple suppliers and each supplies material within acceptable limits. What would be even better, of course, is if each of those suppliers provided QC information with each batch of feedstock. Presuming that you make the determinations listed above, you could automatically adjust your process to account for minor differences in feedstock.
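A back-of-the-envelope version of that first determination might look like the sketch below, which correlates a supplier-reported feedstock attribute with an observed process outcome across batches. The numbers and variable names are invented for illustration.

```python
# A minimal sketch of the kind of analysis LeMert describes: correlate an
# incoming feedstock attribute with a process outcome across batches.
# The data and names are made up for illustration.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# One row per batch: supplier-reported moisture (%) and the run rate we saw.
moisture_pct = [3.9, 4.1, 4.4, 4.0, 4.6, 4.2, 4.8]
run_rate_tph = [52.0, 51.5, 49.8, 51.9, 48.7, 50.9, 47.9]

r = pearson(moisture_pct, run_rate_tph)
print(f"correlation between feed moisture and run rate: {r:+.2f}")
```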

“It is very common in the refining and chemicals industries for suppliers to provide a certificate of analysis or product quality,” says Jim Christian, principal consultant at Honeywell Industry Solutions. “Nearly all products in these industries must meet octane, vapor pressure, and many other specifications. Ethylene produced in one chemical plant must meet a purity spec to be used in another chemical plant, and so on.”

Christian says quality data typically characterises properties of the product, such as purity, viscosity, density, chemical composition, energy content, and so on. And, Christian says, refineries use this data to control processes. “Many refineries use incoming quality data in advanced process control. Advanced applications such as soft sensors and inferential calculations use it to improve their quality estimators. Feed quality information can also be used to automatically change production modes.”
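As a rough illustration of the soft-sensor idea, the sketch below infers a product property from feed quality data and a process measurement using a simple linear model. The variables and coefficients are invented; in practice they would be fitted against historical lab results.

```python
# A minimal soft-sensor sketch: infer a product property from feed quality
# data plus a process measurement, using a linear model whose coefficients
# would normally be fitted to historical lab data. Everything here is
# illustrative only.

def estimate_product_viscosity(feed_density: float,
                               feed_sulfur_pct: float,
                               reactor_temp_c: float) -> float:
    """Inferential (soft-sensor) estimate of product viscosity, in cP."""
    # Hypothetical fitted coefficients: intercept plus linear terms.
    return 12.0 + 0.08 * feed_density - 3.5 * feed_sulfur_pct - 0.04 * reactor_temp_c

# When a new certificate of analysis arrives with the feedstock, the
# estimator can be evaluated well before the first lab result comes back.
print(estimate_product_viscosity(feed_density=870.0,
                                 feed_sulfur_pct=1.2,
                                 reactor_temp_c=355.0))
```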

Why aren’t more companies doing this? The excuse, one supposes, is that appropriate software has not been available until now.

Come and Get It

Customers of Koppers Industries can call up quality data over the Internet using a browser, put the data into charts and graphs, and manipulate the data any way they want using Northwest Analytical statistical software. When satisfied, they can output the data in tabular form, then cut and paste it into an Excel document for input into their process control system.

Regulated Solutions

“We’re drowning in data,” says Agrusa. “We have terabytes of data stored away describing manufacturing processes.” The problem, he says, is that little of it is regulated. “In the food and pharmaceutical industries, the FDA sets the rules. They say what data will be collected, how it will be collected, and how it will be presented. The rest of our industries are self-regulated.”

Jeffrey Johnson, senior project manager at Intellution, says FDA Regulation 21 CFR Part 11 is the critical rule for drug companies. “Pharmaceutical companies have historically created and retained batch records that document virtually every step of the production process,” he says. Although the various processes were thorough, they were far from foolproof, and the resulting paper documents were becoming unwieldy. Enter the FDA. “The FDA’s regulation mandates how companies are to create, store, and retrieve electronic records and the corresponding electronic signature,” explains Johnson.

But in addition to standardising the record-keeping aspect of quality data, 21 CFR 11 also makes life easier for quality people. “FDA-regulated businesses that adhere to 21 CFR 11 will avoid the sting of fines, penalties, and inspectional observations,” notes Johnson. In other words, supply your quality data correctly, and you don’t “fit the profile” anymore.

Alas, if you are not in an FDA-regulated industry, you don’t have 21 CFR 11 to guide or rule you. For years, companies that shared feedstock quality data almost always used the same software package, or they agreed on a format.

Dow Corning, Midland, Mich., uses the same software in 35 of its manufacturing plants worldwide. Using the same software makes it easier for the company to share quality data when the output of one process is used as the feedstock for another. The data is easily obtained because it is kept in the same database at each plant.

While the system has many other advantages, it greatly simplifies tasks for control engineers who are trying to use quality data for process control. One of Dow Corning’s research projects uses a combination of statistical and first-principle models to make correlations between process variables and product quality attributes. “This allows us to certify product quality by ensuring that our processes are well controlled, rather than by extensive laboratory quality testing,” says Barry MacGregor, manager of manufacturing systems.

Using such data, Dow Corning was able to satisfy the needs of a customer that required extensive finished product quality assurance testing prior to shipment. MacGregor explains that this laboratory testing was costing his company about $500,000 per year. “We went to our customer with a proposal: If the product stays within certain statistical quality control parameters during manufacturing, then by definition the product should meet the customer’s specification,” says MacGregor. “We agreed to provide all the QC data needed to prove this by giving the customer access to data in the PI System.”

The customer agreed that the QC data was sufficient, and Dow Corning was able to greatly reduce its laboratory quality assurance testing.
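The logic behind such an arrangement can be expressed in a few lines. The sketch below checks whether in-process measurements stay within the customer’s specification and computes a simple process capability index; the numbers, limits and acceptance rule are illustrative, not Dow Corning’s actual criteria.

```python
# A minimal sketch of the "stays within statistical control" idea described
# above: if in-process measurements are within spec and the process is capable
# against the customer's limits, release on data rather than repeat the lab
# test. All numbers and limits here are illustrative only.
from statistics import mean, stdev

def process_capability(samples, lsl, usl):
    """Return (Cpk, all_within_spec) for a list of in-process measurements."""
    mu, sigma = mean(samples), stdev(samples)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cpk, all(lsl <= x <= usl for x in samples)

batch = [99.7, 100.1, 99.9, 100.3, 99.8, 100.0, 100.2]   # e.g. assay, %
cpk, in_spec = process_capability(batch, lsl=99.0, usl=101.0)
print(f"Cpk = {cpk:.2f}, all within spec: {in_spec}")
# A common (illustrative) release rule: Cpk >= 1.33 and every sample in spec.
```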

Generating the Data

The easiest part of all is generating quality control data. Laboratory information management system (LIMS) software packages are available everywhere for extracting data from plant laboratories and making it available in a database or on the Internet. Statistical process control (SPC) and statistical quality control (SQC) software packages exist by the score. These examine the raw quality data from online analysers and instrumentation, grind it all up and present the results as control charts, histograms, X-bar charts, moving averages, and a host of other tools that make sense to quality control people.
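For example, the X-bar control limits that such packages chart can be computed directly from subgroup data. The sketch below uses invented measurements and the standard A2 factor for subgroups of five.

```python
# A minimal X-bar chart sketch: compute the centre line and control limits
# from subgroup means and ranges, as the SPC/SQC packages mentioned above do.
# The subgroup data are invented; A2 = 0.577 is the standard factor for
# subgroups of size 5.
from statistics import mean

subgroups = [
    [10.1, 10.3, 9.9, 10.0, 10.2],
    [10.4, 10.2, 10.1, 10.3, 10.0],
    [9.8, 10.0, 10.1, 9.9, 10.2],
    [10.2, 10.1, 10.3, 10.0, 10.1],
]

xbar_bar = mean(mean(g) for g in subgroups)          # grand average
r_bar = mean(max(g) - min(g) for g in subgroups)     # average range
A2 = 0.577                                           # X-bar chart factor, n = 5

ucl = xbar_bar + A2 * r_bar
lcl = xbar_bar - A2 * r_bar
print(f"centre line {xbar_bar:.3f}, UCL {ucl:.3f}, LCL {lcl:.3f}")
```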

We don’t want to get into the mechanisms for obtaining quality data, because that is an entire article unto itself. Suffice to say that virtually every process control system on the face of this Earth has the ability to load up an easily obtained software package that will capture this data for you and put it into whatever form you need to ship it on to your customer.

Likewise, every process control system has the ability to take quality data from another system in its own software family, plug it into its real-time control algorithms, and control the process using QC data from a feedstock supplier. That assumes, of course, that supplier and customer are running the same software.

The trick is to find a way to obtain the QC data you need in a standard form, so you can take quality data from anybody, plug it into your control system, and use it to run your plant. Similarly, you need a way to send quality data to your customer in a form they can use. And that’s the rub. There is no system available that will guarantee you can do that.

There is hope that such systems will be coming our way soon, based on OPC. “OPC is a key player in all this, because the OPC standards have created a standardized interface at the software level between applications,” says John Weber, president of Software Toolbox. “This makes it easier than ever before to get at the data.”

OLE for Process Control, or OPC, is a standard that makes it possible to connect software programs from different vendors. In a nutshell, instead of each software package requiring a custom driver to understand the output of another program, OPC programs conform to a standard way of defining attributes about data. This can include range information, data type, quality flags, date and time information (down to the millisecond), etc. Although nothing about Microsoft standards is as simple as it seems on the surface, OPC does make it a lot easier than previous methods for connecting, say, the output of a SPC/SQC program to the input of a process control system if both subscribe to OPC.
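To give a feel for what those attributes look like, the sketch below models a tagged value carrying a quality flag, a timestamp and range metadata. It is a plain illustrative structure, not the actual interface of any OPC toolkit.

```python
# A sketch of the kind of attributes OPC attaches to a value: quality flag,
# timestamp, data type and engineering-unit range. This is an illustrative
# structure only, not any vendor's OPC API.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TaggedValue:
    tag: str                 # e.g. "SQC.MoisturePct" (hypothetical tag name)
    value: float
    quality: str             # e.g. "Good", "Bad", "Uncertain"
    timestamp: datetime      # source timestamp, millisecond resolution
    eu_low: float            # engineering-unit range, low
    eu_high: float           # engineering-unit range, high

reading = TaggedValue(
    tag="SQC.MoisturePct",
    value=4.2,
    quality="Good",
    timestamp=datetime.now(timezone.utc),
    eu_low=0.0,
    eu_high=10.0,
)
print(reading)
```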

Another way is to make data available via the Internet or a private extranet. That way, the data is in XML, HTML, text, or some other universal format that can be manipulated easily and downloaded into a control system.
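Publishing batch results in XML, for instance, can be as simple as the following sketch; the element and attribute names are invented for illustration.

```python
# A minimal sketch of publishing batch QC data in XML, one of the "universal"
# formats mentioned above. Element and attribute names are invented.
import xml.etree.ElementTree as ET

def batch_to_xml(batch_id: str, results: dict) -> str:
    root = ET.Element("batch", id=batch_id)
    for name, (value, units) in results.items():
        test = ET.SubElement(root, "test", name=name, units=units)
        test.text = str(value)
    return ET.tostring(root, encoding="unicode")

print(batch_to_xml("B-1042", {
    "water_content": ("0.4", "pct"),
    "softening_point": ("110.5", "degC"),
    "viscosity": ("320.0", "cP"),
}))
```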

Koppers Industries, Pittsburgh, has such a system. Koppers makes carbon pitch, coal tar distillates, and phthalic anhydride and ships products to customers in rail cars. Customers typically ask for quality data on water content, density, softening point, flash/fire point, toluene insolubles, coking value, and viscosity. This data is derived from laboratory testing on the finished product and gathered by NWA Quality Analyst Web Server software from Northwest Analytical.

The Quality Analyst software creates a web page containing all the necessary data for each batch, and Koppers makes this available to its customers over a private extranet. Customers can browse the data using a standard commercial browser. Tushar Lovalekar in the IT department at Koppers says that customers can call up their batch and use NWA tools to analyse raw data and create control charts. “A customer can also create a tabular output on the screen, then cut and paste it into an Excel document,” says Lovalekar. “This lets them input it directly into their process control system.”

Since it takes several days for the rail cars to arrive, customers have adequate time to prepare. “Our customers find the forward view of what’s coming very helpful,” says Charles Kraynik, carbon materials product manager at Koppers. “It is most definitely a competitive advantage for us.”

Koppers got into this because a major customer called up and said they were buying feedstock from another company that made such information available, and they asked if Koppers could do it, too. “We looked at what they were doing, and decided that not only could we provide such data, we could do a much better job,” says Kraynik. Site analysis numbers indicate that some customers glance at the data occasionally, while others use the data extensively.

As NWA’s Yee points out, supplying data in this manner requires a forward-looking company. “The idea of ‘open kimono’ manufacturing, or allowing customers to see into the operation, is a scary idea. Eventually, we believe that companies will come to see such openness as a competitive advantage. The best manufacturers will be proud to show how well-run their processes are.”

I’ll Show You Mine

In many cases, marketing drives technology. So what might happen in the near future is that well-run companies could see sharing QC data as a competitive advantage. They would start marketing the fact that they show all their data, but their competitors (like you?) are not making quality data available, so they must have something to hide.

As we all know, there are some production processes out there that are better left under cover, because it’s a miracle that they last through the day or survive between OSHA and EPA inspections. Running SPC or QC on such a process would depress a regression analysis and put histograms into hysterics.

But if your plant is well controlled, and you are proud of the tight SPC and SQC numbers being produced, consider making it public. It could be a competitive advantage.

You may be driven to it anyway, so be prepared for the day that your marketing department comes calling and wants you to provide QC data to customers. Your response should be: “Send QC data? Sure, we can do that. I thought you’d never ask.”

Quality Analyst software is supplied and supported in the UK and Ireland by Adept Scientific plc, Amor Way, Letchworth, Herts. SG6 1ZA; telephone (01462) 480055, fax (01462) 480213, email quality@adeptscience.co.uk; or see Adept’s World Wide Web site http://www.adeptscience.co.uk/.

With offices in the UK, USA, Germany and throughout the Nordic region, Adept Scientific is one of the world’s leading suppliers of software and hardware products for research, scientific, engineering and technical applications on desktop computers.


