Top 10 Issues Caused by Poor Facility Data Quality and Information

The consequences of not having accessible, complete, and accurate information can be very costly across a facility lifecycle that often spans multiple decades. Ensuring that the information your operating personnel receive is accurate allows them to run your facilities effectively and avoid the following issues caused by poor information quality.

Top 10 Issues:

  1. Higher information handover costs.
  2. Errors, duplication and omissions in spare parts data.
  3. Delayed completions/commissioning and turnaround activities, caused by productive man-hours wasted looking for the information needed to conduct the work.
  4. Delayed Pre-Start-up Safety Review (PSSR) and handover (including Transfer of Care, Custody and Control, TCCC).
  5. Increased cost for modification and upgrade projects.
  6. Incorrect Process Safety Information (PSI) that can lead to bad decisions and incidents.
  7. Increased likelihood of failed internal and external audits, and of citations.
  8. Safety incidents with Permit to Work (e.g. lock-out/tag-out issues).
  9. Errors in operating procedures, leading to process upsets and outages.
  10. Errors in RBI/RBA information used in CMMS systems, leading to decreasing mean times between failures (MTBFs) and increasing mean times to repair (MTTRs).

Technology is not the issue. A number of solutions enable you to ensure information quality, not only for new facilities but for existing ones as well.

The challenge is to support the new technology with the right information, improved processes, and an appropriately trained organization.

5 Key Solutions:

  • Contracting Strategy. A contracting strategy (with teeth!) that granularly defines the information you demand for your facility, typically as set out in your information handover guide/standards. The contracting strategy should be applied and enforced across all initiatives by a contract compliance manager.
  • Data Gateways. Data gateways are crucial for checking and validating the information your suppliers and contractors deliver against your information handover guide/standards. The information manager (i.e. the data custodian) should be responsible for enforcing the information handover guide/standards for the entire asset/facility project lifecycle.
  • A Class Library Manager. A class library manager can check that the information you are receiving conforms to your asset classifications and that the mandated asset attribute information is complete and correct.
  • Use Technology to Structure Information. In areas you have existing unstructured information, use technology to structure the information; then have your experienced operations and maintenance personnel review and validate it through a quarantine process. Once validated, this can be released for use in the organization.
  • Systems Completions/Commissioning. Systems completions/commissioning and handover is an ideal opportunity to validate your information (i.e. the so-called digital/virtual asset) against the actual facility being built. This needs a management process that can include mobile devices to capture the field runs and as-built mark-ups (i.e. to replace the almost unmanageable ‘old stick files’). These ‘as-built’ changes need to be part of the PSSR/TCCC sign-offs to ensure the captured changes are being actioned. Just after systems completions/commissioning, and prior to operation, is when the facility information will be at its best quality; thereafter it will decay over time unless it is controlled with a robust management of change (MOC) process. Surveys show that for a typical refinery, up to one-third of the information will be in error or out of date after five years of operation. The problem is that incorrect information cannot easily be identified, so all information has to be treated as suspect and handled with caution.
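To make the data-gateway and class-library ideas above concrete, here is a minimal sketch of an incoming-data check: each handover record is validated against a class library that defines the mandatory attributes for its asset class, and failing records are quarantined for review. The class names, tags, and attributes are invented for illustration and do not come from any real handover standard.

```python
# Hypothetical class library: mandatory attributes per asset class.
CLASS_LIBRARY = {
    "centrifugal_pump": {"tag", "design_pressure_barg", "design_temp_c", "material"},
    "pressure_vessel": {"tag", "design_pressure_barg", "volume_m3", "material"},
}

def validate_record(record):
    """Return a list of quality issues found in one handover record."""
    asset_class = record.get("class")
    if asset_class not in CLASS_LIBRARY:
        return [f"{record.get('tag', '?')}: unknown asset class {asset_class!r}"]
    # An attribute counts as present only if it has a non-empty value.
    present = {k for k, v in record.items() if v not in (None, "")}
    missing = CLASS_LIBRARY[asset_class] - present
    return [f"{record['tag']}: missing attribute {attr!r}" for attr in sorted(missing)]

def gateway(records):
    """Release clean records; quarantine the rest with their issue lists."""
    released, quarantined = [], []
    for rec in records:
        problems = validate_record(rec)
        (quarantined if problems else released).append((rec, problems))
    return released, quarantined
```

In practice the quarantined records would go to experienced operations and maintenance personnel for correction before release, matching the quarantine process described above.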

Once the facility is handed over, the information becomes the property of the owner; generally, engineering is the custodian of the design bases, their associated information, and the ‘as-operating’ plant information. The MOC process now becomes the prime controller of all ‘not-in-kind’ changes. It needs to be enforced, and it must be capable of very granular electronic impact/change assessment covering the affected assets and all related information (including procedures, registers, alarm and trip settings, bypasses, etc.).
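The electronic impact assessment described above can be sketched as a traversal of a relationship graph linking each asset to the information items that reference it: everything reachable from the changed asset must be reviewed before the change is closed out. The graph, tags, and item names below are purely illustrative assumptions.

```python
from collections import deque

# Hypothetical relationship graph: asset/item -> information items that
# depend on it (procedures, alarm settings, registers, training, ...).
RELATED = {
    "P-101": ["procedure:OP-12", "alarm:PAH-101", "register:relief-valves"],
    "procedure:OP-12": ["training:OP-12-course"],
    "alarm:PAH-101": [],
    "register:relief-valves": [],
    "training:OP-12-course": [],
}

def impact_set(changed_item):
    """Breadth-first walk over related items; every item returned should be
    reviewed and signed off as part of the MOC before close-out."""
    seen, queue = set(), deque([changed_item])
    while queue:
        item = queue.popleft()
        for dep in RELATED.get(item, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return sorted(seen)
```

Note that the walk is transitive: changing pump P-101 pulls in operating procedure OP-12, which in turn pulls in its training course, which is exactly the granularity the article argues an MOC process needs.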

Too often, new technology is implemented as the answer to an organization’s information issues, only to fail because the existing delinquent processes are still followed with no regard for what the new technology makes possible. In addition, the change management process is overlooked or ignored, with the consequence that there is no buy-in or support for the new technology from key stakeholders and users.

The right information, processes and resources, supported by the right technology, are the only way you can categorically say: ‘We have got this covered!’
