Biography of Walter Shewhart
The fate of great ideas is rarely a happy one. Control charts are no exception; one might even say this is an idea with a particularly unlucky fate. Judge for yourself. Walter Shewhart, an employee of Bell Laboratories, noticed a seemingly obvious thing: everything in the world varies. You may object that Heraclitus, goodness knows how long ago, already said that "you cannot step into the same river twice." But Shewhart went further: he found that different systems vary differently, and that variability is, as it were, one of the most important, most fundamental properties of any system.
Variability could be used to tell systems apart the way fingerprints tell people apart, but only if the systems do not change significantly and remain relatively stable in time. So it turns out that fingerprints are a more stable characteristic of a person than variability is of a system. Reluctant as I am, we cannot avoid an attempt to define the term "system."
There are a great many such definitions. For our purposes we will choose something close to E. Deming's, for example: "A system is a set of elements (components, subsystems, call them what you like) working toward a common goal." From what Shewhart discovered it followed that no results make sense as isolated discrete values, since without taking variability into account they mean nothing by themselves.
Worse, they can easily mislead. Was this known before Shewhart? Of course, people understood that there was a "natural measurement error." But they did not associate it with the properties of the system; errors were treated as an annoying obstacle that keeps us from learning the "pure truth." It was Shewhart who showed that this is not so at all.
On the contrary, variation is precisely a "true property of any system." It would seem that this discovery should have radically changed the rules for presenting information in every area of human activity, or at least in scientific publications. But look through the scientific journals of the past 90 years: finding any indication of uncertainty will be a rare stroke of luck.
Only relatively recently has such practice been made standard in some fields of research involving measurement, in analytical chemistry for example. The adopted standards for medical publications are another example. Unfortunately, on closer inspection the uncertainty estimates very often look like parodies: they rest on the premise that the characteristic under study is normally distributed, which is usually far from reality and, moreover, is not even checked.
The logic is clear: if you are studying the "essence" of some very important and very subtle question, is it worth being distracted by secondary details, especially ones outside your specialty? The answer is clear too. So why does progress move so slowly? Why are we not ready for change even when there is no doubt of its expediency?
We are unlikely to find a simple answer to this simple question. I think one of the most important circumstances lies in the education system, in its inevitable inertia. But is that all? Consider the idea of the division of labor: Adam Smith was the first to publish it, and his book "An Inquiry into the Nature and Causes of the Wealth of Nations" was reprinted many times afterward.
Yet the first practical implementations of this idea date only to the turn of the nineteenth and twentieth centuries, to Henry Ford's work on the mass production of cars. What were people doing for the intervening century and a quarter? Why did they not take up the division of labor at once? After all, Smith's arguments were very convincing, and implementation did not require significant investment.
Or do people not want to live better? Perhaps ideas arrive ahead of their time so that people can get used to them, refine them, make peace with them? Many questions, and no answers at all. In the case of control charts only about 90 years have passed, not even a century; perhaps it is too early to worry? Shall we come back to the question in fifty years? After all, we do not just sit and wait for the next revolutionary idea to descend on us.
Everyone has their own affairs and worries. Even if we are among the first to read about a new idea, and even if we immediately grasp its greatness (which in itself is not very likely), we will hardly set about implementing it at once. Besides, novelty always frightens us: it sharply increases uncertainty, and with it risks that cannot all be measured right away.
So it turns out that years are lost before we are ready to implement new ideas. There is one more important circumstance: ideas never come alone. Behind them stretches a long train of inevitable consequences, and in our case that train is plain to see. Shewhart did not stop at stating the variability of the world. If variability is an immanent property of a system, then its changes, especially sharp ones, must carry information about events taking place in the system or around it. Why not use this information?
But how? The first thing that comes to mind is to evaluate variability quantitatively and to monitor it over time. It is most convenient, of course, to make the evaluation and monitoring visual, all the more so because computers were not yet in use at the time and all calculations had to be done practically by hand. So Shewhart proposed a simple plot: on the horizontal axis, time in one form or another, rarely "physical" time; on the vertical axis, some indicator important to our customer, to our management, or to ourselves.
Now we have a clear, simple graph that tells the history of the system through its successive, time-ordered results. The same data presented as a table lack this clarity; visualization, it turned out, greatly eases understanding of the current situation. But to make the chart easy to read, it still lacked a few additional lines that could normalize it.
The first that comes to mind is a line parallel to the horizontal axis that would, in some sense, average the data sequence. The simplest choice is the arithmetic mean of some initial set of data. Having drawn such a line, Shewhart discovered that it has unusual statistical properties. Classical statistics assumes that as the number of observations grows, the average tends asymptotically to its mathematical expectation, that is, to an unknown "true" value.
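As a sketch, the center line of such a chart is just the arithmetic mean of an initial ("baseline") segment of the time-ordered data. The function name and the sample values below are illustrative, not taken from the source:

```python
# Center line for a Shewhart-style chart: the arithmetic mean
# of an initial baseline segment of the time-ordered observations.
def center_line(observations, baseline=None):
    """Return the mean of the first `baseline` observations
    (all of them if `baseline` is None)."""
    segment = observations if baseline is None else observations[:baseline]
    return sum(segment) / len(segment)

measurements = [9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.2, 10.0]
cl = center_line(measurements, baseline=8)
print(cl)  # 10.0
```

Nothing statistical is claimed here: the line is simply a constant computed from the data at hand.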
But the average on the chart has no "true" value and does not tend anywhere. So the standard model had to be abandoned and the analysis of the chart built on other grounds. The average need not be interpreted statistically at all; other interpretations are known, the mechanical one for example, in which the average simply balances the "masses" concentrated at the measurement results.
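The mechanical reading is easy to verify: if each measurement is treated as a unit "mass" placed at its value, the mean is the balance point, so the signed deviations about it cancel. A minimal check, with illustrative values:

```python
data = [9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.2, 10.0]
mean = sum(data) / len(data)

# The "torques" about the mean cancel: the sum of signed deviations
# is zero (up to floating-point rounding), with no statistical model assumed.
total_deviation = sum(x - mean for x in data)
print(abs(total_deviation) < 1e-9)  # True
```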
What remains unclear is why Shewhart did not change the terminology and notation.
He has involuntarily misled many people who have turned, and still turn, to his work over the past 90 years. Meanwhile, simply abandoning the word "average" and its notation, and likewise, as we shall see later, the "root-mean-square error," would have removed many mistakes and eliminated needless debate. No one forbids treating control charts as a tool for testing statistical hypotheses; it is just that Shewhart had something else in mind.
It is curious that E. Deming, Shewhart's younger contemporary, friend, and follower, who expanded, generalized, and widely disseminated Shewhart's teaching, also felt no need to change the terminology and notation. The next natural step is to add boundaries showing the scale of the system's variability. But how can this be done if all is not well with statistical models?
Indeed, in practice we never have information about the distribution laws of the random quantities we deal with, and there is no point in imposing clearly unrealistic models on reality. Shewhart found a replacement for the statistical model in operational definitions. The American physicist Percy Bridgman had just published his famous book laying the foundations of the philosophical doctrine of operationalism.
Like everything in this world, operationalism is not beyond reproach, but it nevertheless seems the lesser evil compared with the premises of statistical models. So, we need boundaries that mark out a band within which our system is subject only to its natural variability. We may then assume that the system is stable, that is, that we can describe its behavior with an accuracy that depends on its variability.
The smaller the variability, the more accurate the description of the system. It remains only to draw the boundaries of the band that contains the uncertainty due to variability. Shewhart noticed that if you use the statistical formula for the root-mean-square error and lay off three such "errors" above and below the average, you get a band just right for distinguishing a stable system from an unstable one.
It is this band that is drawn on our chart. It is important to stress that Shewhart attached no statistical meaning to the root-mean-square errors, the sigmas. They are not estimates of random variables at all; they are determinate constants, merely calculated in a way that suggests a statistical nature. And a point falling outside the boundary of the band is an operational definition of the system's departure from a stable state.
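A minimal sketch of this band, taking the "root-mean-square error" to be the ordinary root-mean-square deviation of a baseline segment (Shewhart's own estimators differed in detail, and all names and values here are illustrative):

```python
import math

def shewhart_limits(observations):
    """Mean plus/minus three root-mean-square deviations.
    The limits are determinate constants computed from the data,
    not statistical estimates tied to any distribution."""
    n = len(observations)
    mean = sum(observations) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in observations) / n)
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(observations, lower, upper):
    """Operational definition of instability: indices of points
    that fall outside the band."""
    return [i for i, x in enumerate(observations) if not lower <= x <= upper]

baseline = [9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.2, 10.0]
lo, cl, hi = shewhart_limits(baseline)
new_points = [10.1, 9.9, 11.5]  # 11.5 lies far outside the band
print(out_of_control(new_points, lo, hi))  # [2]
```

The exit of point 11.5 beyond the upper boundary is, in exactly the operational sense described above, the signal that the system has left its stable state.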