
How the RCM Scorecard Morphed into The RCM Project Manager’s Guide

The RCM Scorecard provides prospective RCM project managers, their supervisors, champions and team members with sets of metrics or key performance indicators (KPIs) they should be aware of that define results from the four (4) phases of any RCM effort they are likely to undertake, regardless of the RCM approach chosen or the asset to which it is applied. The phases addressed are called "Decision," "Analysis," "Implementation" and "Benefits." The new document, entitled the "RCM Project Manager's Guide," uses the RCM Scorecard as a base and also contains:

- explanations on pitfalls to avoid,

- success factors that work and

- readiness factors an organization must ensure are in place

prior to committing further on an RCM project (i.e., during the Decision Phase).

Background: As differing approaches to the methodology we now call RCM began appearing, a major controversy emerged within the Maintenance and Reliability (M&R) community concerning which approach to RCM was "best" in terms of benefits to users and protection against legal liability in the event of a major catastrophe involving an asset to which RCM methodology had been applied. The controversy erupted strongly after a series of conference papers and professional magazine articles appeared that touted SAE Standard JA1011 RCM as the only valid, legally defensible approach to RCM.[3] The defense was based on sound logic and used graphic, world-famous examples such as the December 1984 Union Carbide plant disaster in Bhopal, India, but demonstrated no direct connection between such incidents and any particular RCM approach used beforehand.

In an effort to quiet the controversy, it was initially proposed that a set of metrics could be used to define the results of any RCM effort. Later, the concept was broadened to cover all phases of a project, providing a metrics-based method of evaluating progress and benefits during each phase. Ultimately, comparison of approaches to RCM became subordinate to simple use of the various metrics to help manage and evaluate the results of an RCM project.

The effort to produce the RCM Scorecard was facilitated by the author of this paper, by then a former RCM practitioner with over 30 years' experience with the various approaches to the methodology, who no longer competed for RCM analysis work and had gone on record as a neutral observer pledged to focus on the positive aspects of RCM in all of its forms.

The resulting document reflects a truly collaborative effort and is now available free of charge to all who have an interest in it.

A draft RCM Scorecard was made public in January 2005. A workshop was convened in March 2005 to discuss and refine the approach. The workshop was attended by over 100 persons from many different types of organizations in 16 countries. A significant number of leading practitioners of the various approaches to RCM participated as contributors and co-facilitators of breakout sessions addressing the four phases mentioned above. The result of this extensive, one-day effort was a consensus on what the content of the RCM Scorecard should be.

The remainder of this paper describes the initial controversy and its resolution in more detail. It also describes what happened after the consensus document was published and how it morphed into the RCM Project Manager's Guide.

Revisiting the Controversy Leading to the RCM Scorecard: Anyone with experience of an RCM-based maintenance program after it is implemented can attest to its value and effectiveness compared to earlier approaches to maintenance. However, once more experience is gained in analysis and actual implementation of results on several systems or plants, it becomes apparent that patterns of failures, failure modes and effects are present in many classes of components (pumps, valves, motors, piping, instruments, etc.) regardless of their operating context, installer or manufacturer. Performing full-blown RCM analysis on a system very similar in design, function and operating context to one already analyzed was also not always possible due to limited finances, personnel and time.

Thus, a search began in the early 1980s for means to obtain close to the same results in similar assets using the refined outputs from those analyses during which all seven (7) steps and essential elements of "Classical" RCM had been thoroughly documented.

This led to the development of variants and derivatives from the Classical RCM approach and a stratification of RCM methodologies as illustrated in the following figure.

RCM Morphed

The terms used in the figure above are defined below[4].

Classical RCM - The analysis approach described in the United Airlines report to DOD in 1978[5]

Super-Classical RCM - an approach that is more rigorous than the Classical RCM methodology

RCM Variant - An RCM approach that skips or combines steps found in Classical RCM or incorporates substitutes for or supplements to Failure Mode and Effects Analysis in order to reduce the time and resources needed for a project

RCM Derivative - an analysis approach that produces a non-redundant, RCM-like set of tasks (Time-Directed Intrusive and Non-intrusive, Condition-Directed and Failure-Finding) derived from what is already in the Preventive Maintenance and/or Predictive Maintenance (PdM) program or within the capability of the PdM technologies used.

As RCM variants and derivatives began to appear, the question of what constituted "real" Reliability Centered Maintenance methodology was raised. By the 1990s, the U.S. military services had issued standards, instructions and handbooks defining the steps required and the desired RCM process to be followed. However, the key document, the Military Standard in which the services defined "real" RCM, had to be abandoned when the Secretary of Defense in the early 1990s mandated use of commercial standards wherever feasible. The services and other interested parties then began vigorously supporting development of an RCM standard that would serve commercial, utility and government purposes[6].

The result was a proposed standard issued for membership approval in 1999 by the Society of Automotive Engineers (SAE).[7] Organizations and individual consulting practitioners offering RCM services that adhered to the SAE RCM standard began touting its merits and criticizing any approach that did not. Participants at professional conferences with any track that included the subject of RCM were exposed to the criticism that anyone offering an "approach" to RCM not in complete compliance with the SAE standard wasn't doing RCM. Adherents to the SAE standard strongly suggested that practitioners of RCM-like analysis techniques drop the term "Reliability Centered Maintenance" and the abbreviation "RCM" from the title and description of their approach and call it something else. Potential clients were advised to avoid RCM variants or derivatives altogether. The fact that many couldn't afford or embrace (for many reasons) Classical RCM or the more rigorous approach prescribed in the SAE standard was largely ignored.

Indeed, some suppliers of RCM services developed counter-arguments implying that anyone could afford the approach because of the projected (largely subjectively determined) return on investment that could be expected. Since then, as competition in the field of RCM analysis has grown, there has been a concerted effort by purveyors of all approaches to RCM to prove that their approach is better, faster and cheaper. Unstated was the reality that companies offering analysis services were rarely involved with implementation and were long gone before implementation was far enough along for the benefits of any given project to be proven or disproven.

Basically, the argument between practitioners centered on:

- validity or completeness of results from any analysis approach not in compliance with the SAE standard,

- comparison between results from various approaches to Classical or Super-classical RCM and

- the liability and possible criminal prosecution exposure of any practitioner that failed to use the most rigorous approach.

Thus, you had not only the "ours is better than theirs" argument but also use of a "fear factor" that was not without merit. However, the examples used had no demonstrated connection to any more (or less) rigorous RCM approach. There was no direct cause and effect proven by actual history.

Purveyors of some RCM Variant methodologies claim the output from their approach yields virtually the same results as Classical RCM methodology. Further, claims are made that such results are obtained at a fraction of the cost of using the classical approach. This claim was based initially on independent analyses, using Classical RCM and an RCM Variant methodology, of a single system of a power generating plant in the early 1990s. The comparison study was sponsored by the Electric Power Research Institute (EPRI), which funded two analyses (one using a classical approach and one using an RCM Variant approach - Streamlined RCM - developed under EPRI sponsorship). EPRI used this result to convince fossil utilities to adopt the lower-cost approach to RCM rather than forgo RCM altogether. The supporting contractor, Erin Engineering, now part of SKF Corporation, developed its own version of the approach, SRCM.

The controversy reached a peak in late 2002 at the annual conference of the Society for Maintenance and Reliability Professionals (SMRP) in Nashville, Tennessee. There, a paper was presented that showed radically different results from application (by two different vendors) of two different approaches (Classical and Variant RCM) in nearly identical systems used in the same operating context in two plants owned by the same company.[8]

There are many reasons for attaining less-than-stellar results using any RCM-based methodology, even the most rigorous. These may be generally defined as the readiness factors and "pitfalls" described in the new document entitled "The RCM Project Manager's Guide," which is attached as an appendix to this paper. Readiness factors and pitfalls are also discussed later in this paper.

Unfortunately, all this controversy made it very difficult at the time to understand, or to cooperate in assessing, the relative value and management of projects employing the various approaches to RCM.

Resolution of the Controversy About Which Is the "Best" Approach to RCM:

In a paper presented at the 2003 SMRP Conference, it was suggested that leading practitioners of RCM (users, vendors, trainers, etc.) consider cooperating to create a common set of metrics for use in evaluating projects, approaches and benefits to clients.[9] It was suggested that metrics could be used prior to, during and after a project to educate, assess progress and determine benefits and/or return on investment. These metrics might also be of value in assessing RCM team performance and in selecting contractors to provide support services and products. If industry members cooperated, it was proposed, a more targeted set of metrics might allow refinement for assessment of projects in progress and for comparison of the expected outputs of various RCM and RCM-like (variant and derivative) methodologies.

Challenges were issued at various events, including RCM overview workshops at the 2003 and 2004 SMRP Conferences and during a paper on RCM describing the proposed scope of effort at the first Maintenance and Reliability Technology Summit (MARTS) in March 2004. Work began in earnest later that year to develop the draft metrics-based document.

A number of prominent RCM practitioners accepted the challenge and/or agreed to participate as reviewers and advisors[10]. Terrence O'Hanlon, having been cognizant of the controversy and in touch with those at the center of it, decided to sponsor development of a document that would outline an approach to metrics for RCM projects acceptable to a broad spectrum of practitioners and possible clients for projects. He named the project and the document that was to emerge from it "The RCM Scorecard." He decided to support a series of actions and events that would yield a definitive result, including:

  • Development of a draft document, entitled "The Preliminary RCM Scorecard" that would be the basis for discussion and critique by anyone interested
  • Publication of the preliminary document on the Internet for downloading and study for a period of over 2 months prior to the RCM Conference in Clearwater Beach in March 2005
  • Facilitation of communication of comments and recommendations for improvement of the document to the Scorecard's content originator and facilitator
  • Sponsorship of a day-long Workshop at the RCM 2005 Conference for the express purpose of having a large number of attendees reviewing and arriving at a consensus on content of the document
  • Subsequent publication on the Internet of the resulting "consensus" document for general reference by non-commercial users without permission and by commercial purveyors of services with permission. O'Hanlon followed through on all of the above items.

Based on recommendations from the many participants in its development, the RCM Scorecard was expanded from its original five (5) tables (those first developed by Mac Smith as metrics for the analysis phase of projects) to eleven (11) tables of metrics, measures or KPIs. Definitions of terms used in the document were added for better understanding. In addition, descriptive notes were added at the beginning of the document to provide for consistent interpretation of the numbers that could be derived. A 12th table was added at the end to indicate what constituted desirable directions of trends in the metrics included in the benefits tables of the document. As mentioned above, those who took up the challenge at MARTS 2004 suggested that the original purpose of the metrics (comparison of approaches) be subordinated to what they considered more important uses, which are listed below.

Provide RCM users, participants and other interested parties with a "shopping list" or "menu" from which a few pertinent metrics may be selected to help determine:
- Whether or not to do any RCM analysis using any approach on a given system

- Progress during analysis and implementation and to demonstrate how successful a given project is while it is in progress and during the period afterward when benefits are realized

- How successful the project was in adjusting a maintenance program for a system after completion.

Provide managers, supervisors or "champions" with a tool to measure:

- Progress on a given asset or set of assets during analysis and implementation phases of a project

- Benefits derived from the project during and after it is complete.
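The metric-based uses above can be illustrated in miniature: pair each benefits metric with its desirable trend direction (the kind of pairing the Scorecard's final table provides) and check periodic readings against it. The following Python sketch is hypothetical; the metric names, values, class and function names are the author's assumptions for illustration, not content taken from the Scorecard itself:

```python
from dataclasses import dataclass
from enum import Enum

class Trend(Enum):
    UP = "increasing is desirable"
    DOWN = "decreasing is desirable"

@dataclass
class Metric:
    name: str
    desired: Trend
    readings: list  # periodic observations, oldest first

    def trending_well(self) -> bool:
        """Compare the first and last readings against the desired direction."""
        if len(self.readings) < 2:
            return False
        delta = self.readings[-1] - self.readings[0]
        return delta > 0 if self.desired is Trend.UP else delta < 0

# Hypothetical benefits-phase metrics; names and numbers are illustrative only.
scorecard = [
    Metric("Equipment availability (%)", Trend.UP, [91.0, 92.5, 94.1]),
    Metric("Corrective maintenance hours/month", Trend.DOWN, [410, 385, 360]),
]

for m in scorecard:
    print(f"{m.name}: {'on track' if m.trending_well() else 'review'}")
```

A project manager could run such a check at each reporting interval to flag any metric whose trend runs opposite to the desired direction.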

The controversy over which approach to RCM is best seems to have quieted down recently. It cannot be determined whether or not publication of the RCM Scorecard influenced this one way or the other, but the metrics are now there for use if some organization decides to fund a comparison effort.

It also appears that a consensus is building that all approaches may be applicable to a set of assets in a facility or mobile platform. The more rigorous approaches may be used on the most critical assets (perhaps no more than 20% of the total systems in a power generating plant, for example) and progressively less rigorous variant and derivative approaches to the balance of plant.

In some cases, conversion of a maintenance program to an RCM basis may successfully employ Derivative and/or Variant approaches for the least troublesome, "well-behaved" systems. This may be done simply to reduce the maintenance workload in order to free personnel for participation in more rigorous analysis and implementation efforts on more critical assets. Reduction of maintenance requirements is a typical result of RCM analysis on systems that are very reliable and/or which may be safely and more economically "run-to-failure."

Also, after publication of the "Consensus RCM Scorecard" in April 2005, many other RCM practitioners took note of its contents and began to suggest ways to build on the work of those who had contributed to the document.[11] In particular, practitioners of the late John Moubray's approach (laid out in his book RCM II, published by Industrial Press in editions during the 1990s), who were initially critical of the RCM Scorecard, began to see the value of the approach. Their fears that metrics would be detrimental to a more rigorous application of RCM were proven groundless, as the document is entirely neutral and silent concerning the rigor of any approach. During and after a workshop at the IVARA Maintenance and Reliability Conference in October 2006, ALADON employees (by then part of IVARA) and RCM II franchisees suggested ways in which the document could be modified to become a useful "guide" for persons assigned to manage RCM projects. Thus, the idea of the RCM Project Manager's Guide was formulated by persons who up to that time had been deeply critical and suspicious of some of the content of the document used as its basis.

As a result of the events described above, sections addressing "readiness to conduct an RCM project," "pitfalls to avoid during execution of an RCM project" and "RCM project success factors" were added, among some other guidelines useful to prospective project managers, supervisors, champions and project participants.[12]

All of the items mentioned above had previously been addressed in RCM overview and related workshops conducted in many venues by the author of this paper since 2003. So, the subject matter had been seen by upwards of 500 attendees at these workshops. Comments from workshop critiques and directly from participants during the session had been progressively incorporated into the handouts used for the workshops.[13]

The RCM Project Manager's Guide - New Content beyond the Scorecard:

Purposes of the "Scorecard" document carried over into the "Guide" remain as previously stated. The Guide has one additional purpose, listed first and stated as follows:

Provide leaders of RCM initiatives with the knowledge and identification of tools needed to be successful.

The three additions concerning readiness, pitfalls and success factors mentioned in the preceding section of this paper are discussed next.

Readiness to conduct an RCM Project must be addressed during the "Decision" phase of a proposed project.

Readiness of an organization to initiate an RCM Project with a reasonable expectation of success is different from the decision to initiate one. After the decision is made, the commitments made by early supporters must be carried out consistently and continuously or failure may result. Determination of readiness for success of an RCM Project may be assessed using guidelines discussed below.

The designated or self-appointed "champion" or prospective RCM project manager (organizer) must determine:

- Whether or not to perform RCM on the basis of expected outcome

- See the metrics described in the attached document and the other factors below to consider in making the decision

- Whether or not you are starting a "pilot" RCM project or a more extensive effort

- What the level of outside support will be and for how long it may be needed

- Are you going to internalize the effort at some point?

- If yes, do you have good candidate(s) for facilitators/recorders internally?

- Are you going to be able to get them adequate training, experience and mentoring?

Level of outside support may be determined by:

- RCM methodology or methodologies selected

- Cost per system

- Time commitment of internal RCM analysis and support team members

Leaders of RCM projects under consideration must also determine:

Level of resources to be made available to support the project through all phases (Analysis, Implementation, Benefits) including but not limited to:

  • Funding
  • Personnel to be assigned to the project (must be your best support & Subject Matter Experts - SMEs)
  • Facilities & Equipment - RCM team work space, computer, software
  • Prior history of organization in change management - bureaucratic elasticity - Will it work for a while then return to what you did before?
  • Is your organization saturated by the "Flavors of the Month?"
  • Steadfastness of management & supervisor support for new initiatives
  • Likelihood of recommended changes in what maintenance to perform being permanently adopted
  • Does your organization have a commitment to procedure-based maintenance? If not, how are changes to "stick"?
  • Are maintenance requirements routinely performed on the basis of written schedules?

Having considered all of the above and determined that the factors listed will not impede or defeat a project, the designated organizer(s) must ensure commitments for outside support are met.
After commitments are made for outside support needed, organizer(s) must determine:

  • Who is going to provide outside support?
  • Determined by what RCM approach(es) you are going to use
  • Ensure during the procurement phase that the support selected has adequate experience, a good track record and credibility with prospective RCM analysis & implementation team members
  • Availability of outside person(s) may be a factor
  • What RCM software is to be used and who will provide it?

Then the project manager must determine, coordinate and schedule:

  • Training of analysis & implementation team(s) and support personnel
  • Orientation of cognizant managers & supervisors
  • Start date(s) for analysis

-  Must ensure no conflicts with major events or peak vacation periods

-  Set so most qualified & credible Subject Matter Experts (SME's) can participate on analysis and implementation teams

-  Ensure key internal support personnel will be available

-  Notify all prospective participants well in advance so no one is surprised or given a rationale for opting out

-  Advise everyone chosen of seriousness of their commitment and expectations for constant participation in the long term

-  Availability of dedicated work space conducive to good effort for duration of analysis & implementation phases of the project

Must schedule/acquire:

-  Support equipment (computer & peripherals, RCM software)

Support personnel should assemble documentation needed by the project team:

  • Maintenance history
  • Copies of PM & PdM procedures/schedules
  • System & equipment tech manuals
  • Performance standards
  • P&IDs and electrical drawings
  • Stockroom or in-place spares inventory lists
  • Training manuals on systems and equipment
  • Operating procedures & checklists
  • RCM reference materials covering the approach(es) to be applied, etc.

The project manager must decide:

  • What reports are to be made and to whom
  • What the backup plan/schedule is in case of plant emergency
  • Include contingency for outside support
  • What the plan is for implementation
  • Recommend implementation start before analysis is complete

Avoidance of pitfalls and factors leading to success should also be known by prospective RCM Project sponsors, champions and participants. These are also addressed in the new document as follows.

Given that the decision is made, based on readiness factors described above, to conduct an RCM Project, the next consideration for the project manager is to manage the program so as to ensure success. Given that a large percentage (estimated at 50% or more) of RCM projects fail, a review of some of the reasons for failure might be instructive.

RCM projects fail for individual as well as multiple reasons. Some of the many pitfalls that cause failure and that must be avoided are described briefly below.

  • Too much analysis and time off the "real" job required by Subject Matter Experts (SME's) either because proper preparation wasn't made in advance or the tools available to support the analysis didn't support high productivity.
  • Failure to provide for prompt, if not simultaneous, recording of results of analysis so they are immediately available to the team for each new day's effort or failure to provide a competent recorder of results (ideally from inside the organization who can control and provide access to them later).
  • Failure to plan, before analysis begins, for implementation of results at the earliest possible time, resulting in delay in realizing return on investment of time and resources in the project.
  • Failure to pick a field-proven, RCM project software program for use during and after the analysis phase - Constant updating during and after initial analysis is difficult or nearly impossible without good software because it causes long delays between team sessions.
  • Manpower requirements and costs for training, orientation, analysis and implementation are underestimated or ignored resulting in lack of "buy-in" by stakeholders in the outcome of the project
  • No "buy-in," especially when created by outsiders without any substantive input from stakeholders inside the organization
  • Lack of failure data or will to conduct thorough Root Cause Failure Analysis and Failure Modes and Effects Analysis and follow-up or difficulty or impossibility of collecting data on maintenance history or equipment failures
  • Failure to assign most experienced SME's to RCM Analysis either because they simply aren't available or are able to avoid such assignments
  • Lack of commitment by those who control assignment of SME's time and expenses, when the RCM project manager or "champion" does not control the personnel assets assigned to the project or the travel and living budget for those who come from outside a facility to support the project
  • Failure to have those engaged in the analysis phase lead the implementation of results
  • Too little knowledge of or aversion to PdM technologies - clinging to time-based, intrusive tasks
  • Failure to involve parties skilled in predictive maintenance, at least during the task selection step of the analysis phase
  • Getting bogged down having to "slog" through analysis results to identify "new" tasks to be scheduled and "old" tasks to be deleted because analysis reporting mechanism isn't definitive enough
  • Review and approval chain too long and over-controlling, causing unnecessary delays in implementation and subsequent realization of benefits from the project
  • Culture of the organization really does not embrace change or lacks a support mechanism for incorporating change on a permanent basis. (See readiness factors.)
  • Lack of appreciation that the Analysis Phase is easy compared to the Implementation Phase of an RCM project and that the longer time between analysis and implementation the lower the probability of ever getting tangible results
  • Lack of assurance that management will provide the resources to support the whole effort or be supportive at all either in short or long term
  • Failure to keep sponsoring "champion(s)" and other stakeholders current on progress of the project and/or selection of inadequate report elements (e.g., no meaningful metrics)
  • Failure to select the right systems or components for analysis - that is, the most troublesome with the greatest payback potential
  • Committing to do analysis on too many systems at once or in sequence before seeing any return on investment from the first ones analyzed
  • Failure to ensure that any other initiatives being implemented concurrent with RCM projects don't interfere or supplant them

Frequent, adverse outcomes of RCM projects, which imply things to avoid, are:

  • Results under-utilized or ignored
  • Time and budget overruns
  • Deferred or truncated programs
  • No funding or manpower allowance for implementation
  • Living Program never established so as to provide for feedback on items missed during analysis or that emerged later and no follow-up to ensure prompt implementation of new maintenance requirements
  • No attention paid to long term benefits realized from the analysis (Benefits Phase too short to document return on investment) which in turn results in RCM being under-utilized and full potential never realized.

Conversely, the requirements for success, which should also be in the minds of all stakeholders in the planned project, include but are not limited to:

  • Clear, understandable goals, objectives, and expectations
  • Strong and continuous management support
  • Dedicated staff
  • Ownership and continuity of assignments to the project
  • Processes and systems selected to gain best results
  • Strategic and rapid implementation
  • Accurate measurement of results


Conclusions:

- The RCM Scorecard was initially conceived to settle a professional controversy. However, it has evolved into what may be a useful document, the RCM Project Manager's Guide, especially for those with little or no experience with this methodology.

- Although RCM methodology is approaching 50 years old, it is still the best means to determine what maintenance should be performed on an asset to get closest to what has been designed in for maximum reliability.

- Given what we learned while trying to settle the controversy over RCM, all approaches to this methodology have merit because of the fundamental truths RCM reveals when its principles are embraced in any form.

- A variety of approaches (e.g., Super-classical, Variant and/or Derivative) should be considered when the goal is to convert all assets to an RCM basis.

- As first stated by Terrence O'Hanlon at a recent conference, "A little RCM is better than no RCM."


[1] Smith, A. M., Reliability Centered Maintenance, McGraw-Hill, New York, NY, 1993, ISBN 0-07-059046-X

[2] Smith, A. M., & Hinchcliffe, G., RCM: Gateway to World Class Maintenance, Elsevier Butterworth-Heinemann, 2003, ISBN 0-7508-7461-X

[3] SAE JA1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes, first issued August 1999 with subsequent updates. This standard is available for purchase from SAE.

[4] Nicholas, J. R., & Young, R. K., Advancing Maintenance and Reliability (2nd Edition), MQS LLC, 2006, ISBN 0-9719801-2-8, to be available before the end of 2007

[5] DOD report on Reliability-Centered Maintenance by Nowlan & Heap of United Airlines, dated December 29, 1978, which used data from the 1960s and 1970s and earlier papers and studies referenced therein to describe what has become known as "Classical RCM."

[6] By that time Classical RCM methodology had been in use for almost 40 years, and at least one variant had been used for almost 20 years.

[7] The SAE RCM standard JA1011 - "Evaluation Criteria for Reliability Centered Maintenance (RCM) Processes" was adopted and issued as an approved document in 2000. The standard provides criteria defining the "essential elements of RCM" and defines compliance with "full effort" RCM. This phrasing distinguishes full-fledged Classical RCM from "Streamlined," or any other method (e.g., Total Productive Maintenance [TPM]) for developing a program leading to machinery reliability.

[8] Smith, Anthony M. (Mac) (AMS Associates) and Hefner, Rod (Mid Atlantic Energy Company) "The Application of RCM to Optimizing a Coal Pulverizer Preventive Maintenance Program, Figure 10, Page 15." October 2002, SMRP Conference, Nashville, TN.

[9] Nicholas, J. R., "Settling the Controversy Over Reliability Centered Maintenance," Proceedings of the 2003 Annual Conference of the Society for Maintenance and Reliability Professionals

[10] Contributors included Doug Plucknette (Reliable Solutions, Inc., now affiliated with Allied Reliability Inc), Derek Burley (Cargill), Roger Zavagnin (IVARA), Anthony M. (Mac) Smith (AMS Associates) and many others who chose to remain anonymous.

[11] The consensus RCM Scorecard is available for free download. The follow-on document, "RCM Project Manager's Guide," is expected to be available either in addition to or in place of the RCM Scorecard.

[12] For a discussion of some of these items see "How to Get the Most From Reliability Centered Maintenance (RCM)" by Jack R. Nicholas, Jr., P.E., CMRP, (then) CEO of MQS LLC, in Proceedings of the Association of Iron and Steel Engineers Annual Conference, September 2003, Pittsburgh, PA

[13] The handouts consisted of excerpts from or the complete text of the following: Nicholas, J. R., & Young, R. K., Advancing Maintenance and Reliability (2nd Edition), MQS LLC, 2006, ISBN 0-9719801-2-8
