Physics calculations may work perfectly well in theory. On a blackboard, academic science is pretty predictable (outside of the quantum realm, perhaps). Yet, nothing is manufactured in a complete vacuum, is it? When it comes to real-world settings, millions of factors could impact the state of a physical object — material, friction, temperature, pressure, altitude, wear… the list goes on.
With so many tangible conditions increasing the likelihood of deviation, it can be difficult to build a digital twin that accurately reproduces real-world conditions. This is, in part, why some believe the next generation of digital twins will be driven more by artificial intelligence (AI). By leveraging neural networks trained on sensor data, production teams could trust intelligence gathered from trends across data points instead of relying solely on theoretical models. This could enable better predictability, optimized operational performance and overall improved products.
To discover what the next generation of digital twins will look like, I recently met with Faustino Gomez, co-founder and CEO of NNAISENSE. Below, we’ll define what a digital twin is and why the concept needs an upgrade. We’ll also explore a few digital twin use cases to see how the next wave of AI-driven digital twins will be “learning directly from the data,” as Gomez describes.
What Is a Digital Twin?
No, we’re not discussing your cyber clone. Digital twins are a bit less transhuman than that. Put simply, a digital twin is a digital duplicate of a physical object or system. Digital twins are commonly used across many industries to monitor and test intricate apparatus. Industrial manufacturing, for example, often leverages a digital twin to monitor production and test the performance of equipment within a simulated environment.
Digital Twin 1.0
Some of the first digital twins are attributed to NASA. By reproducing mechanical objects locally, NASA could diagnose issues and test solutions for physical components 200,000 miles away in space, far beyond direct human intervention.
The digital twin concept also found roots in other mechanical devices, such as jet engines, enabling on-the-ground mechanics to diagnose issues in the air. Using digital twins, smart factories could mimic the behavior of manufacturing robots to test performance and identify potential bottlenecks.
Digital Twin 2.0 Adds More Data
Using a digital twin, engineers could represent the complex inner workings of joints and parts on a computer screen, a 3D reflection of the device. Compared with constructing physical models, virtual simulations significantly reduced the effort and resources required to test new concepts and diagnose problems. But simulating reality in a black box wasn’t always easy.
Reproducing complex physical counterparts with a digital twin is fraught with complications that can be difficult to model in a standard CAD simulation, said Gomez. So, engineers began taking measurements from physical devices, such as vibrations or temperature, and integrating actual production data into the twin. This enabled operators to better forecast system performance by accounting for real-world elements. Yet truly predictive forecasting was still lacking. That’s where AI comes in.
Digital Twin 3.0 Adds AI
Until recently, digital twins and AI have been independent concepts. But with the advent of more sensors and greater data collection, integrating AI is becoming an inevitable evolution of the digital twin, said Gomez.
Adding an AI layer to a digital twin would involve training a deep learning model on hundreds of time-series data outputs from sensors. Generating this data would require capturing a chronological record of how the system behaves over a period of time. “Then, take data and split it,” said Gomez. “Use some to train, some to test.” The process is not too dissimilar from walk-forward testing, which is common in financial trading, added Gomez.
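To make that more concrete, here is a minimal sketch of what a chronological, walk-forward-style split of sensor data could look like. The array shapes, fold counts and window sizes are illustrative assumptions, not details from NNAISENSE.

```python
# A minimal sketch of a chronological (walk-forward) split for sensor data.
# Shapes, fold counts and minimum-training size are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
T, n_sensors = 10_000, 8                      # 10k timesteps, 8 sensor channels
readings = rng.normal(size=(T, n_sensors))    # stand-in for logged time-series data

def walk_forward_splits(n_samples, n_folds=5, min_train=2_000):
    """Yield (train_idx, test_idx) pairs where each test window
    comes strictly after the data used to train for it."""
    fold_size = (n_samples - min_train) // n_folds
    for k in range(n_folds):
        train_end = min_train + k * fold_size
        yield np.arange(0, train_end), np.arange(train_end, train_end + fold_size)

for train_idx, test_idx in walk_forward_splits(T):
    train, test = readings[train_idx], readings[test_idx]
    # fit the model on `train`, evaluate its forecasts on `test` here
    print(len(train), len(test))
```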
Instead of using traditional statistical methods and standard physics, a neural net could unify disparate data streams and find nonlinear relationships among them, regardless of type. The end result could quickly identify cause-and-effect correlations between data sets (which simulated physics engines may miss) and predict future conditions based on new input.
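As a rough illustration of the modeling step, the sketch below (assuming PyTorch is available) feeds a window of mixed sensor channels into a small neural network that predicts a single future reading. The channel count, window length and architecture are placeholder assumptions, not a description of any production system.

```python
# A minimal sketch: map a window of mixed sensor channels to one predicted future value.
import torch
import torch.nn as nn

WINDOW, N_SENSORS = 50, 8           # 50 past timesteps of 8 channels (torque, temperature, vibration, ...)

model = nn.Sequential(
    nn.Flatten(),                   # (batch, WINDOW, N_SENSORS) -> (batch, WINDOW * N_SENSORS)
    nn.Linear(WINDOW * N_SENSORS, 128),
    nn.ReLU(),                      # nonlinearity lets the net capture nonlinear sensor interactions
    nn.Linear(128, 1),              # predicted condition one step ahead (e.g., a temperature)
)

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic stand-in batch; in practice it would come from the chronological training split above.
x = torch.randn(32, WINDOW, N_SENSORS)
y = torch.randn(32, 1)

for step in range(100):             # a few gradient steps just to show the training loop
    pred = model(x)
    loss = loss_fn(pred, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```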
Use Cases
With better intelligence, you get a more accurate depiction of possible outcomes. The use cases for these crystal-ball digital twins are numerous. Gomez described how 3D printing and factory maintenance could utilize digital twins:
- Additive 3D printing: 3D printing metal powder involves a complex thermodynamic process driven by high-intensity lasers. The process is often less than perfect, leaving cold and hot spots from layer to layer, and ignoring them could result in defects. By analyzing each layer’s thermal displacements, a deep learning model could be trained on the intricate details of the process. This could be used to build a more refined process model, essentially enabling manufacturers to “pre-compute the part.”
- Predicting glass quality: Gomez described how a specialty glass company utilizes AI-driven digital twins. The company’s process model takes sensor readings at various chambers throughout the melting and cooling process, analyzing temperatures to maintain a uniform mix. Based on those controls, operators can predict what the glass quality will be in the future. Essentially, it’s “a user assistance system based on a digital twin,” described Gomez; a rough sketch of that idea follows this list.
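The snippet below is a deliberately simplified, hypothetical sketch of the user-assistance idea: a stand-in for a trained process model scores candidate control settings against current chamber temperatures and surfaces the best predicted outcome. The function name, quality formula and numbers are invented for illustration only.

```python
# Hypothetical sketch of a digital-twin-based assistance loop for the glass example.
import numpy as np

def predict_quality(chamber_temps, cooling_rate):
    """Stand-in for a trained process model: returns a quality score in [0, 1]."""
    uniformity = 1.0 / (1.0 + np.std(chamber_temps))        # a more uniform melt scores higher
    return float(uniformity * np.exp(-abs(cooling_rate - 2.0)))

current_temps = np.array([1450.0, 1448.0, 1446.0, 1440.0])  # readings at four chambers, degrees C

# Operator assistance: sweep a control setting and surface the best predicted outcome.
candidates = np.linspace(1.0, 3.0, 9)
scores = [predict_quality(current_temps, r) for r in candidates]
best = candidates[int(np.argmax(scores))]
print(f"Recommended cooling rate: {best:.2f} (predicted quality {max(scores):.3f})")
```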
As you can see in the examples above, a digital twin isn’t necessarily a single object; it could represent an entire process model for a large industrial operation. According to Gomez, having this digital twin could boost testing and R&D, shorten development time, and decrease time to market.
Digital Twin 3.0 Forecasts the Future
For digital twins to become more effective, these simulations must be faithful to current and future real-world conditions. But computing tests for situations involving wet chemicals, unstable materials or thermodynamics is tricky. Therefore, to Gomez, predicting such dynamics really requires “learning directly from the process model.”
The key thing here is that a neural net doesn’t care what it’s monitoring. Be it torque, matter, friction, temperature, fumes or vibration, the model will learn regardless of what’s thrown at it. The benefit is that you don’t have to know the physics or chemistry, but setting up this process requires significant customization, which could differ depending on the type of materials and process at hand.
The fusion of disparate data sets could bring interesting capabilities for testing and forecasting. Yet the quality and quantity of data are paramount, noted Gomez. Furthermore, these models won’t spring up in an instant; they must be tested and validated against real-world measurements before being deemed useful. While digital twin 3.0 is a promising step forward, more research will be required to ensure the predicted digital future reflects physical outcomes.
AI-infused digital twins use “data from the process to predict what the process will be in the future,” said Gomez. This goes far beyond traditional 3D representations to produce more faithful, predictive digital twins.