Taming the Explosion of IIoT Data with 3rd Gen AI Methods and More


IMC-2020 Presentation 23:49 Minutes

by Dominic Gallello, Symphony Industrial AI & Symphony AzimaAI

Asset reliability and process professionals face an enormous challenge in taming the tsunami of data and turning it into useful information. With the advent of cheap IIoT devices, sensor data is growing 50 times faster than business data, yet only 2% of it is actually used. By 2025 the number of devices will double to 21 billion. These sensors can analyze multiple signals such as sound frequencies, temperature, pressure and vibration to determine whether a machine requires repair before it causes a trip, or whether a process can be optimized dynamically based on constantly changing conditions. There is hardly an equipment manufacturer that has not already put sensors on its assets or is not planning to.

This data comes from multiple sources, including high-frequency vibration, time series, inspection imagery and video, and semi-structured and unstructured events and logs. The human mind simply cannot keep up with it. While AI has become an essential component of making sense of the data, AI alone is not sufficient to detect anomalies without raising a huge number of false positives, nor to turn macro-level warnings about plant condition into a detailed apparent cause at the component level, where problems can lead to expensive outages or process inefficiencies.
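As a rough illustration of the false-positive problem (this is only a sketch, not the presenter's method; the window size, z-score threshold and persistence count are arbitrary assumptions), a naive detector over one time-series channel can suppress spurious alerts by requiring several consecutive abnormal windows before flagging anything:

```python
# Illustrative sketch only -- not Symphony AzimaAI's implementation.
# Rolling-statistics anomaly detection on a single sensor channel, with a
# persistence filter so one noisy window does not raise an alert.
import numpy as np

def detect_anomalies(signal, window=256, z_threshold=4.0, persistence=3):
    """Return indices of windows flagged as anomalous.

    signal      : 1-D array of sensor samples (e.g. vibration, temperature)
    window      : samples per analysis window
    z_threshold : baseline standard deviations that count as "abnormal"
    persistence : consecutive abnormal windows required before alerting
    """
    n_windows = len(signal) // window
    rms = np.array([
        np.sqrt(np.mean(signal[i * window:(i + 1) * window] ** 2))
        for i in range(n_windows)
    ])
    # Assume the first 10 windows represent healthy operation (an assumption).
    baseline_mean = rms[:10].mean()
    baseline_std = rms[:10].std() + 1e-9
    abnormal = np.abs(rms - baseline_mean) / baseline_std > z_threshold

    alerts, streak = [], 0
    for i, flag in enumerate(abnormal):
        streak = streak + 1 if flag else 0
        if streak >= persistence:   # only alert after a sustained deviation
            alerts.append(i)
    return alerts
```

In practice the detection models are far richer than rolling statistics, but the same idea of demanding persistence or corroboration across signals is one common way to keep false positives down.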

The methodology involves building fit-for-purpose anomaly and performance digital twins using 3rd generation AI methods that work in tandem with a failure library (FMEA) engine and a recommendation advisory rule base. The digital twin detects faults and allows “what-if” scenarios to be evaluated in order to create the best possible outcome. The FMEA engine incorporates mechanical component data such as vibration, process data and failure modes to identify the cause of an anomaly and recommend fixes that address it.
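To make the described flow concrete, the sketch below shows one possible shape of the pipeline: a detected anomaly is matched against a failure library to produce a component-level apparent cause and a recommendation. All class names, failure modes and rules here are hypothetical placeholders, not Symphony Industrial AI's actual engine.

```python
# Hedged sketch of the described pipeline: digital-twin anomaly output feeding
# an FMEA failure library and a recommendation rule base.
from dataclasses import dataclass

@dataclass
class Anomaly:
    asset: str
    symptom: str       # e.g. "high 1x vibration", "bearing temperature rise"
    severity: float    # 0..1 score from the anomaly digital twin

# Toy failure library keyed by symptom: maps an observed symptom to a
# candidate failure mode and the fix the rule base would recommend.
FMEA_LIBRARY = {
    "high 1x vibration": ("rotor imbalance", "schedule field balancing"),
    "bearing temperature rise": ("lubrication breakdown", "re-grease and inspect bearing"),
}

def advise(anomaly: Anomaly) -> str:
    """Turn a macro-level anomaly into a component-level cause and recommendation."""
    mode, fix = FMEA_LIBRARY.get(
        anomaly.symptom, ("unknown failure mode", "escalate to analyst")
    )
    return (f"{anomaly.asset}: probable cause '{mode}' "
            f"(severity {anomaly.severity:.2f}) -> recommendation: {fix}")

print(advise(Anomaly("FD fan #2", "high 1x vibration", 0.82)))
```

A real failure library would span thousands of component and process failure modes, and the “what-if” evaluation would run against the digital twin rather than a static lookup, but the division of labor between detection, cause identification and advisory is the same.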