1. Smart Sensors
Smart sensors differ from plain sensors or transducers in that they are advanced platforms with onboard technologies, such as pre-processing, intermediate storage, diagnostics and smart connectivity, that transform traditional analog signals into true digital insights.
What is unique, and most relevant to the smart sensor concept, is the pre-processing capability at the sensor edge.
2. Embedded Edge
Embedded computing capabilities have grown substantially in recent years while power consumption has dropped, enabling data pre-processing and analysis at or near the source, or “edge”.
“Edge computing”, or the “embedded edge”, is here to stay and is the next great milestone: it reduces the amount of data forwarded between the sensor and the analysis platform while greatly improving its quality.
3. Sensor Fusion
Sensor fusion is a combination of technologies that allows multiple different sensors to be fitted to the same asset. Data from these sensors is then analyzed in time synchronization in order to detect small degradation patterns over time.
Digital Twin definition: “creating a virtual model of the asset from the point of view of its physical parameters. This virtual model, a digital twin, is a replica of what is happening in the asset.” This can only be achieved with sensor fusion technology.
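To make the time-synchronization idea concrete, here is a minimal sketch (with hypothetical sensor names and columns) that aligns two independently sampled sensor streams onto a common time base using pandas:

```python
# Illustrative sketch only: time-aligning two sensor streams so they can be
# analyzed together. Sensor names and column names are hypothetical.
import pandas as pd

# Hypothetical vibration and temperature readings with their own time stamps.
vibration = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00:00.00",
                                 "2024-01-01 00:00:00.10",
                                 "2024-01-01 00:00:00.20"]),
    "rms_g": [0.12, 0.15, 0.11],
})
temperature = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00:00.05",
                                 "2024-01-01 00:00:00.18"]),
    "deg_c": [41.2, 41.5],
})

# merge_asof pairs each vibration sample with the latest temperature reading
# taken at or before it, within a 100 ms tolerance.
fused = pd.merge_asof(vibration, temperature, on="timestamp",
                      tolerance=pd.Timedelta("100ms"))
print(fused)
```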
OUR DATA PROCESS
Sensor Data
Capturing the physical variables
A sensor is a device that provides information on a physical variable of a process or substance in a predictable, consistent and measurable way.
At the physical edge, a transducer converts the physical variable into an electrical signal, which is then subjected to different processing algorithms in order to be useful for assessing the real-time condition of the industrial asset under study.
The sensor features an electronic front-end that conditions the weak signal delivered by the transducer to the parameters required for digitization by an analog-to-digital converter.
The digital output of the sensor is then ready to be pre-processed.
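As a rough illustration of that signal chain, the sketch below maps raw ADC counts back to a physical value; the resolution, reference voltage and sensitivity are assumed figures, not the specification of any particular sensor:

```python
# Minimal sketch: converting raw ADC counts into a physical value.
# The resolution, reference voltage and sensor sensitivity are assumed values.

ADC_BITS = 12          # resolution of the converter
V_REF = 3.3            # full-scale reference voltage in volts
SENSITIVITY = 0.100    # assumed transducer output, volts per engineering unit

def counts_to_units(counts: int) -> float:
    """Map an ADC reading (0 .. 2**ADC_BITS - 1) to the measured physical value."""
    volts = counts / (2 ** ADC_BITS - 1) * V_REF
    return volts / SENSITIVITY

print(counts_to_units(2048))  # roughly mid-scale reading
```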
Data Pre-Processing
Making Industry 4.0 possible
The sensor delivers digitized raw data. This type of data is called Time-series, continuous, or unstructured data.
The fact that this raw data from the sensor is unstructured and continuous creates the following problems:
- Data Quality: Raw data fills the database with meaningless information.
- Storage space: In an example given by G.E., twenty industrial sensors produce 1 TB of data… a day!
- Post-Processing: Raw data is difficult to post-process, as it contains non-relevant “garbage” data from idle periods that amounts to unwanted noise.
- Modeling: Raw data makes creating good predictive maintenance models difficult.
- AI was not made to process garbage.
In a nutshell, raw data has to be pre-processed in order to be used efficiently. We use different algorithms to perform the pre-processing without destroying the valuable underlying information about the environment.
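The sketch below illustrates the general idea of this kind of pre-processing (it is not Alteria's own algorithm): idle windows are discarded and only compact summary features are kept from the raw time series.

```python
# Illustrative pre-processing sketch: discard idle windows and keep only
# compact summary features from the raw time series.
import numpy as np

def preprocess(signal: np.ndarray, window: int = 256,
               idle_rms: float = 0.01) -> list[dict]:
    """Split the raw signal into windows, skip idle ones, keep RMS and peak."""
    features = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        rms = float(np.sqrt(np.mean(chunk ** 2)))
        if rms < idle_rms:          # idle machine -> "garbage" data, skip it
            continue
        features.append({"start": start, "rms": rms,
                         "peak": float(np.max(np.abs(chunk)))})
    return features

raw = np.random.default_rng(0).normal(0, 0.05, 4096)  # stand-in for sensor data
print(len(raw), "samples reduced to", len(preprocess(raw)), "feature records")
```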
Structuring Data
Providing Sensor data a homogeneous structure
Modern smart sensors deliver data-rich content that cannot be directly categorized as discrete, classification-ready data.
Homogenization is the process of making things uniform or similar.
Alteria has patented an algorithm that simplifies the homogenization without destroying the underlying information that is needed to build condition-based predictive maintenance models.
A very complex continuous signal is simplified to the point of being fully compatible with a discrete data string.
This data format is uniform and can be analyzed and processed using the same models as discrete data.
Yet the data set still represents a true model of the original signal, so it remains valid for creating a condition-based predictive maintenance model.
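As a purely illustrative sketch of the concept (this is not Alteria's patented algorithm), a continuous signal can be reduced to a short discrete string by averaging it per segment and quantizing each average to a small alphabet of symbols:

```python
# Illustrative only: reduce a continuous signal to a short discrete string
# while keeping its overall shape.
import numpy as np

def to_discrete_string(signal: np.ndarray, segments: int = 8,
                       levels: str = "abcd") -> str:
    """Average the signal per segment, then quantize each average to a symbol."""
    chunks = np.array_split(signal, segments)
    means = np.array([c.mean() for c in chunks])
    # Map each segment mean onto one of len(levels) equal-width bins.
    bins = np.linspace(means.min(), means.max(), len(levels) + 1)[1:-1]
    return "".join(levels[i] for i in np.digitize(means, bins))

t = np.linspace(0, 1, 1024)
# Prints an 8-symbol discrete representation of the sine wave.
print(to_discrete_string(np.sin(2 * np.pi * 3 * t)))
```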
Connectivity
Forwarding data from the nodes to gateways
Forwarding data to the server side, using our Node and Gateway products. Understanding connectivity and its limitations is very important, as connectivity is often the weakest link in the Industrial IoT chain.
Connectivity here refers to the capacity to interconnect sensors, using Node and Gateway products to forward data to servers in an efficient way that allows fluid data interchange at optimal speeds.
Basically, there are two main media to interconnect Industry 4.0 equipment:
- Wiring, using a differential noise-immune data bus.
- Wireless, using radio modems.
There is no recipe across the board. Issues like bandwidth, effective payload, timing, latency, and the limitations of radio modulation systems and protocols make connectivity a complex subject where different solutions have to be discussed in order to make the right choice.
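As a back-of-the-envelope illustration of how overhead eats into a wireless link, the figures below are assumptions chosen only for the example, not the parameters of any specific radio:

```python
# Back-of-the-envelope sketch: how protocol overhead shrinks the effective
# payload of a wireless link. All figures below are assumed for illustration.

link_rate_bps = 250_000     # nominal radio data rate
packet_bytes = 127          # maximum frame size
overhead_bytes = 31         # headers, addressing, CRC, etc.

payload_bytes = packet_bytes - overhead_bytes
airtime_s = packet_bytes * 8 / link_rate_bps
effective_bps = payload_bytes * 8 / airtime_s

print(f"airtime per packet: {airtime_s * 1e3:.2f} ms")
print(f"effective payload rate: {effective_bps / 1000:.1f} kbit/s "
      f"of a nominal {link_rate_bps / 1000:.0f} kbit/s")
```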
Data Sets
Organizing data, using data brokers
Once data is received at the gateway, it has to be organized using different tools, such as an MQTT data broker.
Data is formatted into data sets, and time synchronization is added from the time stamp.
This step is vital to enable the homogenization of the different data structures from each sensor into a fully compatible data set.
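A minimal sketch of this step, assuming the paho-mqtt 2.x client and a hypothetical broker address and topic layout, could group incoming readings per sensor and time stamp like this:

```python
# Minimal sketch of organizing incoming readings with an MQTT broker,
# assuming the paho-mqtt 2.x client. The broker address and topic layout are
# assumptions, not a description of any specific gateway product.
import json
import paho.mqtt.client as mqtt

data_sets = {}  # sensor id -> list of time-stamped readings

def on_message(client, userdata, msg):
    """Group each reading under its sensor id, keyed by its time stamp."""
    reading = json.loads(msg.payload)          # e.g. {"ts": ..., "value": ...}
    sensor_id = msg.topic.split("/")[-1]       # hypothetical topic: plant/line1/<sensor>
    data_sets.setdefault(sensor_id, []).append((reading["ts"], reading["value"]))

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("gateway.local", 1883)          # assumed broker address
client.subscribe("plant/line1/#")              # assumed topic hierarchy
client.loop_forever()
```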
NoSQL Database
Storing your data safely
Nonrelational (NoSQL) databases are used to store Industrial IoT data for several reasons, flexibility and rapid access being the most important.
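As an illustrative sketch (host, database and collection names are assumptions), a pre-processed data set could be stored as a document in MongoDB via pymongo:

```python
# Illustrative sketch of storing a pre-processed data set in a document
# (NoSQL) database, here MongoDB via pymongo. Host, database and collection
# names are assumptions.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["iiot"]["sensor_features"]

document = {
    "sensor_id": "vib-01",                       # hypothetical sensor id
    "timestamp": datetime.now(timezone.utc),
    "features": {"rms": 0.15, "peak": 0.42},     # values from pre-processing
}
collection.insert_one(document)
print(collection.count_documents({"sensor_id": "vib-01"}))
```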
Predictive Analytics
Understanding and analyzing your data
A historian client is often used to start analyzing data stored in the server database.
We use platform-agnostic applications where data can be consulted from any smart device, such as a laptop or a cell phone.
Understanding your data is paramount for the next steps. Please check out the CRISP-DM process for data mining, as our solutions are based on this worldwide standard.
Analyzing your data will lead you to the next step: using your data to create a prediction model based on your condition catalog.
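Continuing the assumed collection from the storage sketch above, a first look at the data can be as simple as pulling the stored features into pandas and summarizing them over time:

```python
# Illustrative first look at the stored data, continuing the assumed
# "sensor_features" collection from the storage sketch above.
import pandas as pd
from pymongo import MongoClient

collection = MongoClient("mongodb://localhost:27017")["iiot"]["sensor_features"]

# Flatten the documents into a table and summarize one feature per day.
df = pd.json_normalize(list(collection.find({"sensor_id": "vib-01"})))
df["timestamp"] = pd.to_datetime(df["timestamp"])
print(df.set_index("timestamp")["features.rms"].resample("1D").mean())
```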
Creation of Predictive Models
Using your data to discover
Usually, you will want to start creating your predictive models based on the discovery of hidden patterns in your data.
There are many machine learning and statistical tools that can be used for that purpose.
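As a hedged sketch of a first model, the example below trains a scikit-learn classifier on placeholder feature vectors labeled with a machine condition; a real model would be trained on your own condition catalog and feature history:

```python
# Sketch of a first predictive model: a scikit-learn classifier trained on
# feature vectors labeled with the machine condition. The data here is random
# placeholder data, not a real condition catalog.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                      # e.g. rms, peak, temperature per window
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)    # 1 = degraded, 0 = healthy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```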
Useful results
Improving the models: Iteration
Don’t forget that models are only as good as the useful results they give.
Models have to be improved through multiple iterations in order to clean “garbage” results out of the model.
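One simple way to make each iteration measurable, sketched below with placeholder data, is to compare model settings with cross-validation rather than a single train/test split:

```python
# Sketch of one improvement iteration: compare two model settings with
# 5-fold cross-validation instead of trusting a single train/test split.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                      # placeholder feature vectors
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)    # placeholder condition labels

for depth in (3, None):
    model = RandomForestClassifier(max_depth=depth, random_state=0)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"max_depth={depth}: mean accuracy {scores.mean():.3f}")
```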
Artificial Intelligence
Using your data to change the way things are done.
This is the last step in your digital voyage.
Applying resources such as neural networks, which emulate how the human brain works, will lead you to the last milestone: predicting failure and improving productivity at its best.
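A minimal neural-network sketch on placeholder data, using scikit-learn's MLPClassifier, shows what such a failure-prediction model can look like in code; a production model would be trained on the real feature history and condition catalog:

```python
# Minimal neural-network sketch for failure prediction, using scikit-learn's
# MLPClassifier on placeholder data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))                     # placeholder sensor features
y = (X[:, 0] ** 2 + X[:, 2] > 1.5).astype(int)     # 1 = failure expected soon

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
net = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=1)
net.fit(X_train, y_train)
print("hold-out accuracy:", net.score(X_test, y_test))
```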