Failure Mapping

Patterns and Relationships

"Nature uses only the longest threads to weave her patterns, so that each small piece of her fabric reveals the organization of the entire tapestry." Richard P. Feynman


Every day, the TV business news features two or three individuals who purportedly can tell the audience why the stock market went up or down that day. Some of these individuals are regulars, or have appeared several times, but many are on for the first and only time, earning their fifteen minutes of fame by having been in the stock-advice business for a long while. The appearance is probably a plum for them: free advertising, perhaps some name recognition, and ultimately a few new customers. The implied belief is that if these individuals can explain what just happened to the market, they understand it well enough to recognize impending events before they occur, and their customers will prosper from that foresight.

If you watch these shows closely, you will soon realize that these so-called experts simply identify some event or combination of events that forms a reasonable hypothesis for the cause of the change, and they describe their hypothesis in a very compelling manner. In some instances, the event identified as the cause reverses itself the very next day, yet the general stock market does not. In other words, there is no well-defined relationship between the recognized pattern and the following event.

Ask yourself: if you were able to identify a variety of patterns that had direct relationships with specific changes in the stock market, how would you use the information? Would you be on TV trying to convince the audience how smart you are? For me, the answer to the first question is that I would use that knowledge to make a ton of money for myself. The answer to the second question is that, rather than sitting in a TV studio, I would be on a yacht, voyaging around the world and phoning in a few critical choices to my brokers to keep my pile of money growing. Obviously, the more people I let in on my secret, the more I would dilute my own results.

In other words, I don’t believe these experts really know the causes and effects. In the financial world, there are no absolute relationships. There are recognizable patterns, but the element of human emotion tends to weaken the tie between any recognizable pattern and a specific event.

One might ask: if the financial example does not meet the gold standard as an acceptable form of “patterns and relationships,” then what does? There are myriad instances where a pattern of circumstances or events is always followed by a specific event, and the relationship is so certain that it can be viewed as cause-and-effect. Several examples are:

  • Corrosion of a pressure-retaining device resulting in a failure.
  • Repetitive fatigue cycles leading to failure (see the sketch after this list).
  • Exposure to loads beyond the design limits, leading to failure.
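
The fatigue case is well enough characterized that the pattern-to-failure relationship can even be computed. The sketch below is a minimal illustration using Miner's linear damage rule, a standard engineering model rather than anything prescribed by this book; the load levels and cycle counts are invented for illustration.

```python
# Minimal sketch of the fatigue pattern: repeated load cycles accumulate
# damage until failure. Uses Miner's linear damage rule, a standard model;
# the load levels and cycles-to-failure values below are invented examples.

def miner_damage(load_history):
    """Sum damage fractions n_i / N_i over all load levels.

    load_history: list of (cycles_applied, cycles_to_failure) pairs.
    Failure is predicted when the total reaches 1.0.
    """
    return sum(n / N for n, N in load_history)

# Hypothetical duty cycle: (cycles applied, cycles to failure at that load)
history = [
    (50_000, 1_000_000),   # low-stress cycles
    (20_000, 200_000),     # medium-stress cycles
    (5_000, 10_000),       # high-stress cycles
]

damage = miner_damage(history)
print(f"Accumulated damage: {damage:.2f}")  # 0.05 + 0.10 + 0.50 = 0.65
if damage >= 1.0:
    print("Failure predicted: the pattern has run to completion.")
else:
    print(f"Roughly {1 - damage:.0%} of fatigue life remains.")
```

The particular numbers do not matter; the point is that the pattern is deterministic: accumulate the cycles, and the failure follows.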

In an environment in which not all failures are studied in a scientific manner, it is possible for patterns to be concocted and relationships to be imagined. A good comparison is with gremlins causing problems, or angels providing assistance when needed. Although those are extreme examples, many people do get into the habit of relating failures to circumstances that may have little if any bearing on the failure.

In his book “How We Know What Isn’t So,” Thomas Gilovich provides a number of examples of everyday situations in which people see patterns that lead them to believe in relationships that are totally unfounded. An interesting example has to do with previously barren couples who adopt a child, after which the wife becomes pregnant. According to the non-scientific system of beliefs, the fact that they have adopted a child allows them to “relax,” letting nature take its course and leading to a pregnancy.

In this situation, so long as both members of the couple are healthy and capable of conceiving, they will either have a child or they will not. The statistical likelihood of conceiving remains the same independent of the adoption. If the adoptive parents later conceive, that fact tends to support the biased belief system and adds one point in favor of the notion. If they do not conceive, the situation merely confirms the logic behind the adoption. Thus, no points are scored against the unscientific notion.

In an industrial plant, it is not uncommon to look for a relationship between poor performance and the individuals who are on duty when a problem occurs. With heavy mobile equipment like locomotives, it is not uncommon to look for a relationship between weather and failure frequency. Although those factors may occasionally appear in a pattern leading up to a failure, more frequently they do not.

With poor operators on duty, it is most likely that a number of other elements making up a pattern leading to a failure already exist, and the presence of the poor operator is the final element. One or more of the poor operator’s marginal practices evidently resulted in conditions that the already weakened systems could no longer tolerate.

Most locomotives are built to make it through any kind of weather. Extremely hot, cold, or wet weather tends to identify the units that have existing weaknesses. The failures occur because of the weaknesses, not the weather. Weather will happen. Although it is not “ordinary,” it is expected.

In a business setting, there are two important characteristics that lead to what one of my professors used to call “stinkin’-thinkin’,” as exemplified by the following circumstances:

  1. The rigor of the scientific method has not been applied to understanding any of the perceived “patterns and relationships”.
  2. There is no organized system that forces rigor to be applied to all circumstances and events in a disciplined manner.

As a result, life goes on and people remain insistent upon the validity of their beliefs, even when there is no real data to support them. In a home or personal setting, these assumptions lead to problems associated with normal human biases such as:

  • Racism
  • Ethnic differences
  • Sexual biases
  • Etc.

Although those issues are troublesome, addressing them is well beyond the objectives of this book. The same problems, however, exist in asset-based business operations, and it is the objective of this book to address them there.

Most, if not all, things fail as the result of a consistent pattern of events. The elements of that pattern are different for each and every failure, but the pattern is consistent. If the pattern is understood, and the events in the pattern are treated in a consistent manner, a starting point exists for quickly dealing with the failures and even eliminating them before they occur.
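
To make the idea concrete, a failure pattern can be treated as plain data: an ordered list of factual elements tied to the failure it precedes. The sketch below is one assumed representation, not the book's prescribed format; the structure and the example entries are invented.

```python
# Minimal sketch: a failure pattern represented as plain data, so the same
# elements can be recorded the same way every time. The structure and the
# example pattern are illustrative assumptions, not the book's format.
from dataclasses import dataclass

@dataclass(frozen=True)
class FailurePattern:
    failure_mode: str        # the specific failure the pattern precedes
    events: tuple[str, ...]  # the ordered, factual elements of the pattern

# Hypothetical pattern for a corroded pressure-retaining device:
VESSEL_CORROSION = FailurePattern(
    failure_mode="pressure-boundary leak",
    events=(
        "protective coating breached",
        "wall thickness measured below minimum",
        "operating pressure at design limit",
    ),
)
```

Once patterns are recorded this consistently, observing the early elements of one becomes the trigger for intervening before the final element arrives.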

The starting point is to begin ridding people and organizations of non-fact-based, unrealistic belief systems. This exercise sounds easier than it is. People feel comfortable with their beliefs, and giving up their biases is tough. Often, personal identity and esteem are built upon those biases, and asking that they be relinquished leaves the person in uncertain territory.

Suppose I am an operator and I believe that the systems I operate are inherently unreliable. I believe that these systems fail on occasion because of that unreliability, and in no way because of my poor operation. Under that scenario, I can maintain a belief system that supports my self-esteem as a capable and knowledgeable operator.

Now you come along and tell me that something I am doing is causing deterioration and, ultimately, failures. My belief system tells me that good operators do not cause failures. It is important to me to view myself and be viewed by others as a good operator. The two views are inconsistent. I cannot both believe you and continue to maintain the belief system I have held for my entire career. This contradiction causes what is called a “personal crisis”. It is a situation in which I am learning for the first time what the world around me has known for quite some time.

As stated above, this dilemma is the starting point. All members of an organization must have this kind of epiphany, and they must become so hardened to the painful truth revealed in dealing with the facts of each and every situation that these facts are no longer painful. 

It is far easier to deal with information that you only half believe, and that is probably why the TV commentators mentioned at the beginning of the chapter remain employed. Can you imagine how the world would be if everyone in the public media described only exact facts? First, they would be confined to saying very little, because the portion of what they say that is certain fact is quite small. Second, the public reaction to their revelations would be devastating: the TV commentator would say that a recession or depression was coming, and everyone would go into hiding. Our intuition that most people exaggerate their knowledge is helpful in situations where a significant portion of what is being said is for entertainment value.

The problem is that this TV mentality tends to intrude into our work and business lives. We allow people to exaggerate in situations where facts, and only facts, should be the standard. To administer that standard, people must say only what they know, and to ensure they do, we must ask for information in a different manner: our manner of asking must restrict the answer to the range of possible facts.

Think about it: when a person in power asks a question, there is an implied expectation that an answer can be provided. Many of the structured systems used by our organizations carry a further implication that the questions are being asked by the individuals in power. When something breaks, a failure reporting system asks a person to describe what went wrong. The conditions created by the system compel the person to provide a response. Not knowing what form of response is expected, the respondent provides a colorful description primarily intended to distance himself from any possible blame. As a result, a lot of useless information is provided.

Added color and added distancing in the description create a starting point that cannot be reconciled with other data to describe a starting pattern for mapping failures.
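
One way to picture asking in this different manner is a reporting form in which every field is a closed, factual choice, leaving no room for colorful narrative. The sketch below is a hypothetical illustration; the field names and choices are invented and are not taken from this book or any particular maintenance system.

```python
# Sketch of a fact-constrained failure report: every field is a closed,
# factual choice rather than a free-text narrative, so the record can be
# reconciled with other data later. All field names/choices are invented.
from dataclasses import dataclass
from enum import Enum

class Detection(Enum):
    ALARM = "alarm"
    OPERATOR_ROUND = "operator round"
    FOUND_FAILED = "found failed"

class EquipmentState(Enum):
    RUNNING = "running"
    STARTING = "starting"
    SHUT_DOWN = "shut down"

@dataclass
class FailureReport:
    equipment_id: str
    detected_by: Detection
    state_at_detection: EquipmentState
    # Deliberately absent: a "what do you think caused it?" free-text
    # field that would invite speculation and blame-distancing color.

report = FailureReport(
    equipment_id="P-101",  # hypothetical pump tag
    detected_by=Detection.ALARM,
    state_at_detection=EquipmentState.RUNNING,
)
print(report)
```

The respondent can answer every question truthfully without guessing, because each question can only be answered with a fact.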

The same situation often exists at the endpoint of a failure that has been repaired. The greatest desire on the part of both the individual and the organization is to tackle the next problem rather than taking the time to document the previous one. As a result, the endpoint of the failure map is left unpopulated or only poorly populated, and mapping is impossible. 

Again, this sequence of events creates the need for yet another epiphany among the individuals performing the repair, documenting the findings, and searching for the cause. Entire organizations, as well as each and every member, need to be committed to collecting the information required to create failure maps.

In the distant past, there was a detective series on television titled Dragnet. As the show’s leading character, Sergeant Joe Friday was remembered for the guidance he always provided when asking questions. Joe always said, “Just the facts” to emphasize how important it was for people to only describe what they knew for certain. 

As stated above, Failure Mapping is a system that recognizes patterns of events or conditions that have a direct relationship with subsequent failures. Our ability to accurately describe the meaningful elements of those patterns and then to recognize the relationship between them and a failure depends on our ability to capture simple, unadorned facts.
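
Once the facts are captured in a constrained form and the known patterns are stored as data, recognizing a relationship can start as a simple comparison of the two. The sketch below is an assumed, minimal matcher; the patterns and observed facts are invented, and real failure-mapping tooling would be far richer.

```python
# Minimal sketch: recognize which known patterns the captured facts match.
# Patterns and facts are plain strings here; a real system would use the
# constrained report fields sketched earlier. All example data is invented.

PATTERNS = {
    "pressure-boundary leak": {
        "coating breached", "wall below minimum", "pressure at limit",
    },
    "bearing seizure": {
        "lubrication interval missed", "vibration trending up",
    },
}

def matches(observed_facts: set[str], threshold: float = 1.0) -> list[str]:
    """Return failure modes whose pattern elements are observed
    at or above the given fraction (1.0 = the complete pattern)."""
    hits = []
    for mode, elements in PATTERNS.items():
        fraction = len(elements & observed_facts) / len(elements)
        if fraction >= threshold:
            hits.append(mode)
    return hits

facts = {"lubrication interval missed", "vibration trending up",
         "coating breached"}
print(matches(facts))                 # ['bearing seizure']  (complete pattern)
print(matches(facts, threshold=0.3))  # ['pressure-boundary leak', 'bearing seizure']
```

Lowering the threshold surfaces partially completed patterns, which is exactly where a failure can still be prevented rather than merely explained.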
