‘Methodology, not product’ — and other key digital twins considerations

High data quality and system interoperability need to be top of mind when employing a digital twins methodology

The use of digital twins — digital representations of real-world physical products, systems or processes — is increasing across several industries, most notably in verticals like advanced manufacturing and energy, but more recently in fields like healthcare and smart campus and city planning. A panel discussion at the Private Networks Forum explored the digital twins concept and what enterprises should keep in mind, particularly around data quality and system cohesion, when considering implementing this technology.

Digital twins is a methodology, not a product

For one panelist, Nazim Choudhury, director of market development at iBwave, it was important to clear up one thing: While often considered a product, digital twins is actually more of a methodology. He explained that it’s about capturing “the right data at the right time” and ensuring that the data is accurate so that it can truly mirror the real-life physical environment in a virtual one. “And when you think of when you have that capability, what you can do is you can do everything in a virtual world versus doing it in the physical world — think about testing, optimizing, monitoring.”

The digital twins methodology makes it possible to do all of these things — and likely more — without a physical environment, making them less costly and time-consuming. In fact, Choudhury claimed that costs associated with things like testing and monitoring become “almost negligible” in a virtual environment, barring, of course, the initial technology investments to make it all possible.
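To make the methodology concrete, here is a minimal Python sketch; the pump asset, its sensor fields and the toy heat model are all invented for illustration, not anything described on the panel. The point is the pattern: mirror the physical state in software, then run the risky or expensive experiments against the mirror.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Hypothetical digital twin of a pump: a software mirror of its state."""
    rpm: float = 0.0
    temperature_c: float = 20.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Capture 'the right data at the right time' from the physical asset."""
        self.rpm = reading.get("rpm", self.rpm)
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.history.append(reading)

    def simulate_load(self, extra_rpm: float) -> float:
        """Test a what-if scenario virtually (toy invented heat model)."""
        return self.temperature_c + 0.01 * (self.rpm + extra_rpm)

twin = PumpTwin()
twin.ingest({"rpm": 1500.0, "temperature_c": 41.0})
# Run the risky experiment against the twin, not the plant floor.
print(twin.simulate_load(extra_rpm=500.0))  # predicted temperature at higher load
```

In this pattern, testing, optimizing and monitoring all become queries against the virtual state, which is where Choudhury’s “almost negligible” marginal cost comes from.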

Beyond cost savings, Christina Yan Zhang, CEO and founder of The Meta, pointed out that digital twins make it possible to test scenarios that are too dangerous to investigate in the real world, such as military operations.

Data synchronization

As Choudhury mentioned, the success of a digital twin relies on capturing the right data; Dan Isaacs, CTO of the Digital Twin Consortium, further highlighted the criticality of data synchronization: “It’s the synchronization to allow data to be able to be accurately represented through this synchronization of your virtual representation,” he said, adding that with high accuracy you get high precision, which ensures the fidelity of your virtualization.

“From that data, you gain an actionable insight from that information,” he continued. “And that information allows you through the analytics to be able to conditionally operate on that physical entity and/or process to achieve the optimal outcome… As long as there’s a synchronization to ensure that you have that requisite level of fidelity between your virtual and your physical, you have that digital twin to allow you to have that crystal ball effect.”
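Isaacs’ synchronization-and-fidelity loop can be sketched in a few lines of Python. Everything here is assumed for illustration: the telemetry stub, the relative-deviation metric and the 5% tolerance are invented stand-ins, not the Digital Twin Consortium’s definitions. The sketch shows the gating idea: analytics on the twin are only trusted while the virtual state tracks the physical measurements within tolerance.

```python
import time

FIDELITY_TOLERANCE = 0.05  # invented threshold: 5% allowed relative deviation

def read_physical_sensor() -> float:
    """Stand-in for a real telemetry feed (hypothetical)."""
    return 100.0

def synchronize(measured: float, modeled: float) -> tuple[float, bool]:
    """Overwrite the virtual state with the measurement and report fidelity."""
    deviation = abs(measured - modeled) / max(abs(measured), 1e-9)
    return measured, deviation <= FIDELITY_TOLERANCE

virtual_state = 99.0  # the twin's current belief about the physical value
for _ in range(3):
    measured = read_physical_sensor()
    virtual_state, in_sync = synchronize(measured, virtual_state)
    # The "crystal ball" is only trusted while the twin tracks reality.
    print("analytics enabled" if in_sync else "resync required")
    time.sleep(0.1)
```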

For the best crystal ball effect, though, a digital twin needs high-quality data. “We also need to look at standardization of high-quality data because the fundamental of digital twins is need to have high quality data,” said Yan Zhang. She pointed out that only about 20% of the data at any given organization in the world is analyzed to inform decision making, while the remainder is “dark data,” or data acquired but not used in any valuable way.

“If we want to get digital twins right, we need to ensure the data is properly analyzed before they can become value like the new oil for our digital twin application,” she cautioned.

System interoperability

However, as a digital twin is scaled — from a small building to a smart campus or even a smart city, for instance — a challenge around the many different sources of data emerges. “You need to understand number one, how is that data that you’re taking from all these disparate sources and areas, how do you ensure the free flow of that data? Because in those types of examples, you’re going from a discrete digital twin to a composite digital twin, to a system of systems,” explained Isaacs.

Therefore, interoperability needs to be top of mind when employing a digital twins methodology. “Even when you look at that smart building case, you… realize how many different sources are there generating data,” Choudhury agreed. “You [can] have temperature sensors, water sensors, light sensors… you need to make sure that you have the ability to have interoperability between [those elements and] … the wireless communication system that you’re using.”
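One common way to get that interoperability is a normalization layer: every source, whatever its radio or protocol, is wrapped to speak one schema before it feeds the twin. The sketch below is a hypothetical Python illustration; the sensor classes, the reading schema and the BuildingTwin name are invented, and a real deployment would sit this layer on top of the actual wireless and building integrations.

```python
from typing import Protocol

class Sensor(Protocol):
    """One contract that every data source agrees to speak (hypothetical)."""
    def read(self) -> dict: ...

class TemperatureSensor:
    def read(self) -> dict:
        return {"kind": "temperature", "value": 21.5, "unit": "C"}

class WaterSensor:
    def read(self) -> dict:
        return {"kind": "water_level", "value": 0.8, "unit": "m"}

class LightSensor:
    def read(self) -> dict:
        return {"kind": "illuminance", "value": 300.0, "unit": "lux"}

class BuildingTwin:
    """A discrete twin; several of these can compose a campus-scale twin."""
    def __init__(self, sensors: list[Sensor]):
        self.sensors = sensors

    def snapshot(self) -> list[dict]:
        # Free flow of data: every source emits the same schema.
        return [s.read() for s in self.sensors]

campus = [BuildingTwin([TemperatureSensor(), WaterSensor(), LightSensor()])]
for building in campus:  # composite twin: a system of building systems
    print(building.snapshot())
```

Isaacs’ scaling path falls out of the same structure: each BuildingTwin is a discrete twin, a collection of them is a composite campus twin, and composing campuses gives the system of systems.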

However, Choudhury also claimed that digital twins can actually aid in ensuring interoperability, because the methodology allows a user to test the compatibility of disparate system components before committing to them.

ABOUT AUTHOR

Catherine Sbeglia Nin
Catherine is the Managing Editor for RCR Wireless News and Enterprise IoT Insights, where she covers topics such as Wi-Fi, network infrastructure and edge computing. She also hosts Arden Media's podcast Well, technically... After studying English and Film & Media Studies at The University of Rochester, she moved to Madison, WI. Having already lived on both coasts, she thought she’d give the middle a try. So far, she likes it very much.