One of the primary advantages of digital manufacturing is that the data it generates can be leveraged for various forms of analytics. When an organization invests millions in digital manufacturing to set up a “smart” factory and to develop a digital twin, the real ROI comes from exploiting the data these smart factories generate.
There are various areas in which data generated by smart sensors can be leveraged in the manufacturing process. In this post, we will cover a few of these application areas and the methodology at a very high level.
I intend to focus on how the analytical approaches work rather than on the math involved. The math behind these concepts, however, is anything but simple. Complexity arises from the sheer number of variables: with 100 sensors on the floor, we are looking at equations with up to 100 degrees of freedom. Each equation may be computationally easy to solve, but the interactions among subsystems grow the complexity exponentially. This, in essence, is a Big Data problem. Let us now jump into applications in the following areas:
(1) Design and testing
(2) Anomaly detection
(3) Quality testing
(4) Process optimization
Since my undergrad was in electrical engineering, I will use the example of a plant that manufactures induction motors. The key parts of an induction motor are shown below.
The manufacturing company is planning to launch a new model of induction motor and the design engineers are working on testing a prototype.
Leveraging data for design and testing
During the test runs, the sensors on the prototype generate data related to rotor speed, angular velocity, coil temperature, torque generated, etc. As you might expect, many of these variables are interconnected: the value of one affects the others.
The data from the sensors is collected by a broker and ingested in real time into a NoSQL database. (Note that I am simplifying the technical architecture here.)
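As a rough sketch of that ingestion step, the snippet below simulates a broker with an in-memory queue and a NoSQL store with a simple document dictionary. The sensor names, message schema, and store layout are illustrative assumptions, not any specific product’s API.

```python
import json
import queue

# Simulated message broker: sensors publish JSON readings to a queue.
broker = queue.Queue()

# Simulated NoSQL store: lists of readings keyed by sensor id.
nosql_store = {}

def publish(sensor_id, reading):
    """A sensor publishes one reading to the broker."""
    broker.put(json.dumps({"sensor": sensor_id, "value": reading}))

def ingest():
    """Drain the broker and write each document into the store."""
    while not broker.empty():
        doc = json.loads(broker.get())
        nosql_store.setdefault(doc["sensor"], []).append(doc["value"])

publish("rotor_speed", 1475.0)
publish("coil_temp", 68.2)
ingest()
print(nosql_store)  # {'rotor_speed': [1475.0], 'coil_temp': [68.2]}
```

In a production setup the queue would be a real broker (MQTT, Kafka, etc.) and the dictionary a managed NoSQL service, but the shape of the flow is the same.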
The NoSQL database feeds the training dataset of a model; the trained model then reacts to live data. Finally, the result is translated into an action and sent back to the IoT device (an actuator). The actuator tweaks physical characteristics such as rotor speed to vary the operating parameters. Hardware components may also be replaced based on the sensor data (a coil with a thicker gauge, a rotor with a slightly smaller diameter, etc.).
This “update and improve” process continues until the prototype parameters align with the ideal design parameters of the product.
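A minimal sketch of that loop is below. The parameter names, target values, tolerance, and correction gain are all illustrative assumptions, not Motor Co.’s actual tuning procedure; the point is only the shape of the feedback cycle: measure, compare to design targets, actuate, repeat.

```python
# Hypothetical design targets and prototype state (illustrative values).
design_targets = {"rotor_speed": 1500.0, "coil_temp": 65.0}
prototype = {"rotor_speed": 1450.0, "coil_temp": 72.0}
TOLERANCE = 1.0  # acceptable deviation per parameter
GAIN = 0.5       # fraction of the error corrected per iteration

def within_tolerance(state, targets):
    return all(abs(state[k] - targets[k]) <= TOLERANCE for k in targets)

def actuate(state, targets):
    """Nudge each parameter a fraction of the way toward its target,
    standing in for the actuator adjusting the physical hardware."""
    for k in targets:
        state[k] += GAIN * (targets[k] - state[k])

iterations = 0
while not within_tolerance(prototype, design_targets):
    actuate(prototype, design_targets)
    iterations += 1

print(iterations, prototype)
```

Each pass halves the remaining error, so the prototype converges to within tolerance after a handful of iterations; a real system would replace `actuate` with commands to physical actuators and model-driven step sizes.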
Analyzing process parameters
Finally, the engineers have validated and approved the final design, and after passing all testing phases, Motor Co. starts manufacturing these motors. However, since this is a new product, and given the significant investment behind it, they want to monitor the production of the first few batches very carefully.
Each piece of equipment on the shop floor has certain operating parameters, and sensors on the floor monitor these parameters closely. Anomaly detection, a subset of pattern recognition (which is itself a subset of machine learning), is employed in many classes of analytics. In general, model variables, or equations of model variables, are mathematically compared for equivalency or for ranges of equivalency.
For example, anomaly detection can be applied to the friction coefficient in the roller bearings of a robotic arm. In this one-dimensional case, stream analysis flags an anomaly whenever the coefficient exceeds a predefined threshold. The shop engineers know that a friction coefficient above, say, 0.35 is abnormal, so when 0.36 is detected, it triggers an action such as sending an alert or lubricating the bearing. The same data can also drive descriptive analytics that graphically display any model variable that is out of the norm.
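In one dimension this amounts to a simple check over the stream. The 0.35 threshold comes from the example above; the sample readings and the alert message are illustrative.

```python
FRICTION_THRESHOLD = 0.35  # engineers' known upper bound for normal operation

def check_stream(readings, threshold=FRICTION_THRESHOLD):
    """Return the indices of anomalous readings in a stream of
    friction-coefficient measurements."""
    return [i for i, r in enumerate(readings) if r > threshold]

stream = [0.31, 0.33, 0.36, 0.32, 0.40]
for i in check_stream(stream):
    print(f"Anomaly at sample {i}: coefficient {stream[i]} > {FRICTION_THRESHOLD}")
    # A real deployment would trigger an action here,
    # e.g. a lubrication request or an operator alert.
```

Multidimensional anomaly detection replaces the single threshold with a model of normal behavior across many variables, but the trigger-on-deviation pattern is the same.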
Quality and testing
As the first few batches of motors start to come out of the production lines, Motor Co. wants to make sure the product is world class and meets stringent quality criteria.
In the testing bay, motors are run while connected to sensors that capture key quality parameters and relay the data. One example: when the rotor in an induction motor moves, it makes a squeak. Normally the squeak is so faint that we can’t hear it, but it can still be measured. The squeak is largely due to friction between the bearings and the ring holding them. By capturing squeak-frequency data from faulty motors, Motor Co. engineers have developed a squeak profile (left-hand graph). In the quality testing runs, sensor data is used to generate each motor’s squeak profile, which is then compared against the threshold profile to ensure the motor passes this criterion.
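The comparison can be sketched as a per-frequency-band check against the threshold profile. The band edges and amplitudes below are made-up illustrative numbers, not real motor data.

```python
# Hypothetical squeak profiles: amplitude (dB) per frequency band (Hz).
threshold_profile = {500: 12.0, 1000: 15.0, 2000: 10.0}

def passes_quality(measured, threshold):
    """A motor passes if its squeak amplitude stays at or below the
    threshold in every frequency band of the profile."""
    return all(measured.get(band, 0.0) <= limit
               for band, limit in threshold.items())

good_motor = {500: 8.5, 1000: 11.0, 2000: 7.2}
bad_motor = {500: 9.0, 1000: 16.5, 2000: 8.0}  # exceeds the limit at 1000 Hz

print(passes_quality(good_motor, threshold_profile))  # True
print(passes_quality(bad_motor, threshold_profile))   # False
```

In practice the measured profile would come from a frequency-domain transform (e.g. an FFT) of the microphone or vibration signal, with bands and limits set from the faulty-motor data described above.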
Solve or Optimize
We generally don’t need new data for this type of analysis, as it is mostly done after the process completes. Motor Co. leverages data from the entire manufacturing process to:
(1) Ensure that the process flow is optimal: algorithms can evaluate millions of permutations and combinations to determine the optimal process flow or path. The objective can be minimizing total cycle time, or the cycle time for a particular work area.
(2) Reverse engineer: leverage the data to work backward from a desired result or product specification.
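As a toy illustration of the first point, the sketch below brute-forces the station ordering that minimizes total changeover time. The station names and the changeover-time matrix are invented for the example; real process-flow optimization would use solver libraries rather than exhaustive enumeration, since the number of orderings grows factorially.

```python
from itertools import permutations

# Hypothetical changeover times (minutes) between work areas.
changeover = {
    ("winding", "assembly"): 4, ("winding", "testing"): 9,
    ("assembly", "winding"): 6, ("assembly", "testing"): 3,
    ("testing", "winding"): 8, ("testing", "assembly"): 5,
}
stations = ["winding", "assembly", "testing"]

def cycle_time(order):
    """Total changeover time for visiting the stations in this order."""
    return sum(changeover[(a, b)] for a, b in zip(order, order[1:]))

# Brute force over all orderings: fine for a handful of stations.
best = min(permutations(stations), key=cycle_time)
print(best, cycle_time(best))  # ('winding', 'assembly', 'testing') 7
```

With dozens of stations this becomes a combinatorial optimization problem (a variant of job-shop scheduling), which is where the “millions of permutations and combinations” come in.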
Note that one of the key aspects of these approaches is the digital twin. When I refer to a “model”, it is essentially a digital twin: a replica of the manufacturing process or processes. Though outside the scope of this article, I have included a link to an article in the appendix below that will introduce you to the concept of digital manufacturing.
The article below from Forbes is a good general introduction to digital twins for beginners.