Making the Case for SPC | Quality Insight

August 18, 2023

Implementing statistical process control (SPC) takes energy and resources, so companies had better be able to show a financial benefit for having done it. But how do they show this benefit before implementation? And how do they do that without hearing the coconut-like sounds of heads hitting desks?

Managers speak the language of money, so companies need to phrase these potential savings in their favorite metric: “S-bar,” also known as a dollar sign.

A barrier one runs into during these discussions is that many managers have been trained by their own companies to equate quality with being in-spec. They believe they are being practical when they say this, when in fact this way of thinking costs businesses a lot of money.

Some put it this way, “If it is in-spec, it is good. If it is out of spec, it is bad. So why do I need to do SPC?”

How does one explain to someone that in-spec can mean unnecessary financial costs, and that SPC is a way to begin recovering those costs?

Taguchi’s Loss Function

One way is to explain the benefits using Taguchi’s Loss Function. This can convince battle-hardened front-line managers that “in-spec and out the door” is tossing money in the trash.

Introduce the Loss Function by starting with a thought experiment. Let’s say that someone is running a simple business making parts. If they start from the premise that in-spec = good, out of spec = bad, then what costs do the business and its customers incur when a part is pretty far outside a specification limit?

It should be easy to figure out what these costs are: they might have to scrap the part, losing all the expense put into it up to that point, or perform a lot of rework with the associated costs in time, personnel and material. Maybe they can call the customer to get approval to ship it on deviation, and they might have to do a custom setup to handle the out-of-spec material. The further a part is outside the spec, the more rapidly the costs of trying to make it work increase, and it probably doesn’t have to get very far out before the costs outweigh any benefit.

Now ask the manager if a part that is barely outside spec incurs the same costs. The answer is probably no, though it still incurs cost. Someone might be able to lightly rework the part, and they still might have to call the customer to send it on deviation, or tweak their setup a bit to handle it.

So there is a continuum of incurred costs: the further outside the spec a part is, the greater the loss associated with it.

Now consider a part that is barely in spec. Does it perform much differently than the part that was barely out of spec? Probably not, so customers will likely still have some lesser degree of difficulty getting it to work. There may also be long-term consequences to shipping a part that is barely inside the lower spec one day and barely inside the upper spec the next. As for internal costs, if the company charges by the part, it may be giving away material it bought by the pound when the part is on the high side; and because of measurement variability, a borderline in-spec part has some probability of being judged out of spec and sent for unnecessary rework.

The ideal is to make the part where the customer needs it: right on target, time after time. A part that is on target is the one with the lowest cost to the company and its customers.

The target is the point of minimum cost, and as we approach and pass the spec limits, the cost of the part increases. But what costs do we experience further inside the specification?

As the variability increases, so does the need for an infrastructure to catch out-of-specification parts. For example, as the variability approaches the spec limits, the sampling rate may need to increase, adding an ever-growing number of non-value-added activities to the inspection overhead. In-process part variation may drive up internal scrap rates or process costs as well.

Therefore, a part that is in-spec incurs cost on a continuum as well: as the variability increases, so do the costs, at an ever-increasing rate. If something is made out of spec it has no quality, or rather the characteristic of “un-quality,” because the customer is not getting the promised product. Poor quality starts at the spec limits, and quality gets better the closer one is to target.

This is the essence of the Taguchi Loss Function. Taguchi concluded that these costs can be modeled by a parabolic curve.

How does variation even within the specification affect the cost of the process? We can use the Taguchi Loss Function to give an answer. For a part measured at x, with target T, the loss is modeled as

L(x) = k(x - T)²

where k is the constant that makes the loss numbers match the loss from a particular process. (A $1 loss can be added to everything to indicate that the loss may not be zero even at target.) Note that the specifications do not even come into the calculation. Specifications are the allowable variation about a target that the customer can tolerate, and ideally they were generated by thinking through these very costs.
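
As a quick illustration of how k is set (the numbers here are hypothetical): suppose a part that lands at the spec limit, one unit from target, incurs a $5 loss in scrap, rework and handling. Then

k = $5 / (1 unit)² = $5 per unit², and L(T + 0.5) = 5 × 0.5² = $1.25

so a part only halfway to the spec limit already carries an estimated $1.25 of loss on this curve, even though it is comfortably in spec.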

Two processes can both be entirely in-spec and yet have different quality as measured by the total process loss: compare a process tightly centered on target with one whose output is spread uniformly across the full specification width. Perhaps surprisingly, the uniform distribution has the higher cost, even discounting the other hidden inspection costs that a uniform distribution might carry.
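
A minimal sketch of that comparison, using hypothetical numbers (target of 10, spec limits of 10 ± 1, and k chosen so that a part at a spec limit carries a $5 loss):

    # Compare the average Taguchi loss of two processes that are both "in spec".
    # All numbers are hypothetical: target T = 10, specs at 10 +/- 1, and k set so
    # that a part at a spec limit carries a $5 loss (k = 5 / 1**2 = 5).
    import numpy as np

    rng = np.random.default_rng(1)
    T, LSL, USL = 10.0, 9.0, 11.0
    k = 5.0 / (USL - T) ** 2

    def average_loss(parts):
        """Mean Taguchi loss L(x) = k * (x - T)**2 over a batch of measurements."""
        return np.mean(k * (parts - T) ** 2)

    # Process A: centered on target with modest spread (sigma is 1/8 of the spec width).
    centered = rng.normal(T, (USL - LSL) / 8, 100_000)
    # Process B: output spread uniformly across the full spec width.
    uniform = rng.uniform(LSL, USL, 100_000)

    print(f"Centered process: ${average_loss(centered):.2f} average loss per part")
    print(f"Uniform process:  ${average_loss(uniform):.2f} average loss per part")

With these numbers the centered process averages roughly $0.31 of loss per part and the uniform process about $1.67, even though both are shipping essentially nothing but in-spec parts.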

So the engineer or manager looking to sell their boss on investing the time and money into implementing SPC has a number of arguments:

  • SPC reduces process costs by ensuring that process adjustments only happen when they need to.
  • SPC reduces the variability of the product or service output, thus improving customers’ quality experience.
  • The Taguchi Loss Function shows that reducing variability around the customer’s target reduces costs.
  • The data generated from SPC can be used to capture further process cost reductions.

Saving Money

In management’s eyes, the purpose of SPC should be to save money. The Taguchi Loss Function is an easy way to communicate to managers the true cost of variation, costs that do not show up as a line item on any balance sheet, and it is useful for justifying the investment in implementing SPC.

SPC tells you when to leave the process alone and when to react. By using SPC, companies can minimize the variation of their process by identifying, reacting to, and eliminating extra sources of variation. Minimizing variation around the customer’s target makes companies and their customers more money, a result that is near and dear to those who hold the purse strings.
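
A minimal sketch of that decision rule, using an individuals control chart on hypothetical measurements; the limits follow the usual moving-range calculation (center line ± 2.66 × average moving range):

    # Decide "leave it alone" vs. "react" with an individuals (X) control chart.
    # Measurements are hypothetical; limits use the standard moving-range method.
    import numpy as np

    x = np.array([10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 12.0, 10.1, 9.9])

    center = x.mean()
    avg_moving_range = np.abs(np.diff(x)).mean()
    ucl = center + 2.66 * avg_moving_range   # upper control limit
    lcl = center - 2.66 * avg_moving_range   # lower control limit

    for i, value in enumerate(x, start=1):
        if value > ucl or value < lcl:
            print(f"Part {i}: {value} is outside [{lcl:.2f}, {ucl:.2f}] -- special cause, react")
        else:
            print(f"Part {i}: {value} is within limits -- common cause, leave the process alone")

Only the one point beyond the limits calls for investigation; reacting to the ordinary part-to-part wiggle would only add variation, and cost, of its own.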

SPC Deployment

How would the Taguchi Loss Function (TLF) help an SPC deployment in a real situation?

Trane is a global manufacturer of heaters, air conditioners and ventilation systems. In 2006, Trane implemented an integrated SPC system using WinSPC software in 26 plants across the globe.

Randy DeGier, Trane’s global operations quality element leader, says of the implementation, “Leadership doesn’t spend that amount of money unless you provide them with a payback, so we naturally had to provide that to them to get the purchase justified. We conservatively estimated a one-year payback on our investment.”

DeGier continues, “Now we are getting into it and getting questions about how we are measuring the benefits of SPC.” DeGier’s first attempt at quantifying savings was a Cost of Poor Quality (COPQ) model that “rolled up direct and indirect costs out of operations.” He planned on using this to determine the cost benefit of SPC.

“The problem is that while measuring direct costs is easy, you get into arguments about what project or process generated the savings. We tried to get WinSPC administrators at each plant to measure COPQ, but we still had the same issues.”

DeGier identified a different way to quantify the cost benefit of SPC implementation using the Taguchi Loss Function equation to estimate the costs saved due to reducing and controlling the variability.

“We are going to take an example process in the plant where we are going to do improvements anyway and use the TLF to establish the (loss) curve. On the longer term, it would be useful to the owner of the process for justifying process improvements.

“We are devoting more effort to design for reliability. We have always tried to do this, but never had a way to measure it.” Using the TLF, Trane will be able to quantify the loss due to variation from the target. “The closer you are to target, the longer the life of the product.”
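
The arithmetic behind that kind of estimate is straightforward: for the quadratic loss with constant k, target T, and a process with mean μ and standard deviation σ, the expected loss per part is

E[L] = k(σ² + (μ - T)²)

so for a process already centered on target, halving the standard deviation cuts the expected loss per part to a quarter. Multiplied across annual volume, that difference is the dollar figure the loss curve puts in front of management.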

Using SPC to track their process will allow Trane to detect when a process is subject to unusual sources of variability, so that they can correct it before it becomes a larger problem. As they improve their processes, Trane can use the TLF to measure the improvements in terms of dollars, and their customers will in turn see an improvement in quality and reliability.

About DataNet Quality Systems

DataNet Quality Systems empowers manufacturers to improve products, processes, and profitability through real-time statistical software solutions. The company’s vision is to deliver trusted and capable technology solutions that allow manufacturers to create the highest quality product for the lowest possible cost. DataNet’s flagship product, WinSPC, provides statistical decision-making at the point of production and delivers real-time, actionable information to where it is needed most. With over 2,500 customers worldwide and distributors across the globe, DataNet is dedicated to delivering a high level of customer service and support, shop-floor expertise, and training in the areas of Continuous Improvement, Six Sigma, and Lean Manufacturing.