
Proving Asset Management Delivers

We have an opportunity to structure, and then simplify, our asset management endeavors, and in doing so to manage their complexity.


The maturation of the PAS 55 asset management standard (a publicly available specification) and the exciting work on the ISO 55000 series of standards are bringing welcome order to a field that can be, and often has been made, unnecessarily complex.

How shall we capture, share and leverage outcomes of that improved order?

Let’s accept that PAS 55 represents true progress and that there has been “success.” But let’s then ask how we might view this progress in terms of actual versus designed performance, using the best facts, as neutrally and objectively as practically possible.

Success in asset management depends on an understanding and acceptance of the following facts:

  • Asset management is not something that can be “installed,” nor is it ever “finished.”
  • Asset management has to be living and dynamic.
  • Excellence is, has always been and always will be Notional, Unique, Relative and Dynamic.

With that said, we also have to acknowledge that there will be both similarities and differences across organizations as each of them progresses with their own version of asset management. Being able to capture, collate and share those similarities and examine those differences, however subtle, should be of much mutual interest and, when used properly, of great tangible value.

How to embark upon this goal? Refer to Figure 1. With coordinated effort, we can create further definition and structure for if, how, what and when to measure. Consider that we can now look at hierarchy or decomposition as shown from business objectives through to individual capability and competency to do what needs doing and be able to prove how well it was done.


Figure 1: Line of sight from corporate business objectives to tasks
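The line of sight in Figure 1, from business objectives down through key result areas (KRAs) and KPIs, can be sketched as a simple data model. The following is a minimal, hypothetical sketch; the class names, fields and example values are illustrative assumptions, not taken from PAS 55 or any standard.

```python
from dataclasses import dataclass, field

# Hypothetical model of the line of sight:
# business objective -> key result area (KRA) -> KPI.
# All names and values below are illustrative only.

@dataclass
class KPI:
    name: str
    target: float   # designed performance
    actual: float   # measured performance

    def gap(self) -> float:
        """Shortfall of actual versus designed (target) performance."""
        return self.target - self.actual

@dataclass
class KRA:
    name: str                           # e.g. cost, availability, quality
    kpis: list = field(default_factory=list)

@dataclass
class BusinessObjective:
    statement: str
    kras: list = field(default_factory=list)

    def line_of_sight(self):
        """Trace each KPI back to this objective, preserving the hierarchy."""
        return [(self.statement, kra.name, kpi.name, kpi.gap())
                for kra in self.kras for kpi in kra.kpis]

objective = BusinessObjective("Deliver product at lowest sustainable cost")
availability = KRA("availability", [KPI("plant availability %", 95.0, 91.5)])
objective.kras.append(availability)
print(objective.line_of_sight())
```

A structure like this makes “actual versus design” a question that can be answered per KPI, while preserving the trail back to the business objective it serves.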

A number of new standards and initiatives bring significant opportunity for organizations to measure better than they presently do. We now have sufficient detail to know if, when, why and how to measure specific happenings. Consider the following standards and publications (see Helpful Links at bottom of page):

EN 15341 - Maintenance - Maintenance Key Performance Indicators – 2007

This European standard describes a system for managing key performance indicators (KPIs) to measure maintenance performance in the framework of influencing factors, such as economic, technical and organizational aspects, and to appraise and improve efficiency and effectiveness in achieving excellence in maintaining technical assets.

Global Maintenance and Reliability Indicators (GMARI) - Fitting the Pieces Together - 4th Edition

This is a publication of the European Federation of National Maintenance Societies (EFNMS) vzw and the Society for Maintenance & Reliability Professionals (SMRP). Harmonized indicators are those that are similar between the SMRP and EN 15341 indicators, and those for which any differences can be identified. The harmonized indicators provide a common platform for global organizations to benchmark their facilities across borders.

Society for Maintenance & Reliability Professionals (SMRP) - Best Practices - 3rd Edition (Compendium)

This comprehensive document was developed by the SMRP Best Practices Committee to standardize how maintenance and reliability professionals measure and calculate common, and not-so-common, key performance indicators. It provides the necessary tools for measuring and comparing performance using consistent measuring systems.

    As organizations embark upon, refine and correct their approach to their asset management, there are real opportunities to embed the means for capturing critical business information to ensure success. These include:

    1. What their business objectives were and what they are now.
    2. Which key result areas (KRAs) they measured, for example, cost, availability, quality, etc.
    3. Where those KRAs sit in terms of the asset management balanced scorecard (BSC) perspective.
    4. Which KPIs were chosen and fitted to measure certain specific issues.
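The four capture items above could be recorded in a structure as simple as the following sketch. Every field name and example value here is an illustrative assumption, not a prescribed schema from any of the standards discussed.

```python
# Hypothetical capture record for the four items above; all field
# names and example values are illustrative only.
capture_record = {
    "business_objectives": {
        "then": ["Maximize throughput"],           # item 1: what they were
        "now": ["Lowest sustainable unit cost"],   # item 1: what they are now
    },
    "kras_measured": ["cost", "availability", "quality"],    # item 2
    "bsc_perspective": {                                     # item 3
        "cost": "financial",
        "availability": "internal process",
        "quality": "customer",
    },
    "kpis_chosen": {                                         # item 4
        "availability": ["plant availability %"],
        "cost": ["maintenance cost per unit produced"],
    },
}

# With records in this shape, similarities across organizations can be
# collated field by field, e.g. which KRAs two organizations share.
other_org_kras = {"cost", "safety"}
shared_kras = set(capture_record["kras_measured"]) & other_org_kras
print(sorted(shared_kras))
```

The point is not the particular fields but that the capture is structured, so similarities and differences across organizations can be compared mechanically rather than anecdotally.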

    If we don’t do this, we will still have the better order referred to earlier, but how will we be able to determine, over time, whether what we designed to meet specific business objectives actually worked, and to what extent?

    If we tackle this better now, and then think through the next details of the structure, we will have the opportunity to merge these parallel standards and initiatives to establish a sound, credible and ongoing foundation to:

    1. Perpetually challenge the structure of the theoretical asset management master model, system, and underpinning business processes and then improve and update these based on structured managed feedback.
    2. Know what works to what extent.
    3. Have the means to maintain, publish and leverage a dynamic set of asset management benchmarks.
    4. Put the next level of working details to PAS 55 system requirements Section 4.3.2 and greatly assist with requirements Section 4.6.4 and Section 4.6.5.

    Aren’t these matters paramount to maintaining the current momentum and advancing the asset management profession? We must uniquely design and then manage each change, and be able to articulate how similar changes have unfolded, as we each craft the next change.

    Tenets of Asset Management

    When developing unique approaches to asset management, we offer several tenets that must be understood and adopted.

    1. Organizations will vary in their approach to their asset management. This is entirely logical and practically what we would expect.
    2. No two organizations are exactly the same, nor would we expect or want them to be.
    3. Organizations will naturally be at different places with their own developmental maturity.
    4. Organizations will vary in their abilities to make changes and mature.
    5. What is a best practice for one organization isn’t automatically a necessary practice to apply to any other organization.
    6. What is excellence for one organization will never be arbitrarily the same for any other organization.
    7. The performance of each organization is therefore unique and each must be striving towards its unique organization’s dynamic targets at its own rate.
    8. Perfection isn’t necessary and is rarely justifiable commercially. Good enough (better than relevant peers, safe and optimal) is entirely acceptable.
    9. The use of the term “world-class asset management” without best clarity is a misnomer.
    10. World-class asset management isn’t a point or score. Even if it is conveyed that way, such scores, data, or benchmarks must never be used as legitimate targets to which to aspire.
    11. Performing any assessment in whatever way may well provide a view of your developmental maturity profile and reveal gaps.
    12. If, when, why and how to close certain gaps, by what amounts, at what speed, in what order, will always be unique.
    13. If your change plan is dominated by technical or mechanistic steps, activities and tasks, then be concerned over whether it is balanced and robust enough for it to likely succeed.
    14. Each change plan has to be crafted to manage that unique change. We would expect to see details of how (process) and who (people in functions) will make, sustain and inculcate these changes.
    15. We must uniquely design and then manage each change and be able to articulate how such similar change has unfolded as we craft new/next change.
    16. There are generic themes that we would expect organizations to consider as core and be good at doing. For example, we should all be good at planning and scheduling work well and deploying the right resources correctly.
    17. We have to be able to demonstrate to what extent the application of the best practices within PAS 55 (and forthcoming ISO 55000) worked and put into place benchmarks for ensuring that these remain current and proven to be best.

    So what must we take from this?

    We need to “craft” each organization’s approach to asset management.

    Know your current situation

    • Trying to go to some other place from wherever you are now must be based upon sound knowledge of why you are where you are now.
    • How we behave is influenced by our beliefs or cultures.

    From that data

    • Have you had any difficulties in making progress and managing prior change initiatives?
    • If you have, then what are you going to do differently to make this next change successful?

    Know who you are and need to be as a business

    • Why do you need to be at what place next?
    • If we are going to this next place from where we are now, then we must be confident that our targets are sound, have a basis, etc.

    The “art” here is to truly understand the unique organizational factors and drivers and to move in a structured way at the right pace. Design each change and then manage that change as it unfolds.

    We mutually need an asset management benchmarking monitor and change plan designer

    My suggestion is to marry the metrics (SMRP/EFNMS, EN 15341, etc.) with a proven way to verify whether these metrics apply to the version of the base model that fits each unique organization.

    How can we logically allow these new standards and initiatives for asset management performance measurement to coexist, yet be wholly disconnected from PAS 55 and its assessment methodology (PAM), which describes the best practices but fails to fully deal with how and why certain measurements could be made? This is not a defensible position.

    This suggestion could be progressed via an online, neutrally managed asset management benchmarking monitor and change plan designer, such as that hosted by the Global Forum on Maintenance & Asset Management. (See Helpful Links at bottom of page).

    In my view, the crucial point is this: it is great to have a set of metrics, but how do you know which ones apply, and when, to a specific and unique organization? Furthermore, how do you know to what extent performance is occurring (maturity) in that area, and what of the interaction and connectivity of such measurements when embedded into a set of master business processes?

    Would this not be better addressed in the following manner: whoever conducts an asset management assessment captures why an organization is where it is now, and a plan is then assembled for if, when and why to close which gaps, and by what amount. That same plan can (and must) be expressed in a similar order. Refer to Figure 2 for an outline of an asset management benchmarking monitor and change plan designer application.


    Figure 2: Bold and underlined text describes what the logical screens > tabs of an asset management benchmarking monitor and change plan designer application might look like.


    Figure 3: Asset management benchmarking monitor and change plan designer application logic
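The assessment-then-plan ordering described above, where each gap carries its own if, when, why and by-what-amount decision, could be represented by records as simple as the following. This is a hypothetical sketch; the field names, dates and gap areas are invented for illustration and are not part of any assessment methodology.

```python
# Hypothetical change-plan records: each gap found by an assessment
# carries if/when/why/how-much fields, so the plan can be expressed
# (and later audited) in the same order it was assembled.
gaps = [
    {"area": "work planning",  "close": True,  "by": "2014-Q2",
     "why": "backlog breaches business objective", "amount": 0.30},
    {"area": "spares holding", "close": False, "by": None,
     "why": "cost of closure exceeds benefit",     "amount": 0.0},
]

# The plan keeps only the gaps the organization chose to close,
# ordered by their planned completion date.
plan = sorted((g for g in gaps if g["close"]), key=lambda g: g["by"])
print([g["area"] for g in plan])
```

Note that a gap deliberately left open still carries its "why", which is exactly the kind of decision record a shared benchmarking database would need to capture.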

    Conclusions & Recommendations

    If asset managers now require the organizations endorsed to conduct certification and/or gap analyses to adopt and use such an approach, and to make mandatory online updates to the master database each time a plan is created or updated, then all can see and use this information. (See Helpful Links at bottom of page).

    We should then turn our collective efforts and attention to not just the conduct of certification and/or gap analysis, but more crucially to the subsequent ordered preparation and execution of asset management improvement plan/s and then proving to what extent what worked, and next to examining this and leveraging what we have learned.

    It isn’t in anybody’s interest to allow divergence, and even fracture, when we have 30-plus organizations conducting such services, yet we cannot see a dynamic view of where the asset management organizations stand. Where are our dynamically evolving, statistically managed asset management benchmarks resulting from the conduct of PAS 55 assessments?

    In this author’s opinion, we must continue to detail the order we have created here and capture just how whoever did the work converted the designed and agreed business objectives > KRAs > KPIs into the core programs > activities that would have to be undertaken to fulfill the business objectives and achieve (even surpass) the KPIs.

    Let’s require ourselves to describe how the ‘system’ is evolving and to show each new and/or updated core program by producing drawn ‘business processes’ that show visually how each program happens logically.

    A practice that is not practiced cannot be best (or otherwise). We must, therefore, be able to see how the practices that are presumed to be best are practiced; or how can we know that they remain a best practice?

    Helpful Links:
