

Augmented Reality is Becoming the User Interface for IoT - RTInsights


Augmented reality can help bring IoT data and machine data to the person that’s actually trying to fix a machine in the field.

Industrial IoT is powerful, capable of tying together vast networks of plant equipment, systems, and products into intelligent networks that can assure smoother and more trouble-free operations. But up to this point, something has been missing from the equation: the people with the skills or expertise to respond to IoT-based summons for maintenance, repairs, and updates. That’s the word from Ian Hughes, analyst with S&P Global Market Intelligence. In a recent Q&A, he explains how augmented reality (AR) helps companies realize the full value of industrial IoT.

“The focus of industrial IoT has been on the machinery of the manufacturing process, and finding those,” Hughes says. “Initially it was instrumenting the things, and now it’s working out what data makes sense and filtering out the noise, and getting to the point that you can now train AI models or machine learning models to do this predictive maintenance for you.”

See also: Augmented Reality Now Bringing IoT Data to Life for Frontline Workers

Time to pay more attention to the human side of IoT, to “help enable the workforce to operate and do their job and fix the machines in a better way,” he continues. “With predictive maintenance, if you then send somebody inexperienced onto the shop floor to fix a bearing on a machine, and they’ve never done it before, and they’re relying on paper manuals and using brute force to fix it, you’re not getting the full benefit and virtuous circle of industrial IoT.”

That’s where emerging technologies such as augmented reality come into play, Hughes says. AR can be instrumental in “helping to bring that IoT data and machine data to the person that’s actually trying to fix the machine.” AR – enabled through headpieces or mobile devices – can “present IoT data into the view of the person that’s doing the work, and give them extra directions as to which bolt to undo, how tight something needs to be, whether something’s live or not, or which bit they need to replace.” This combines the person’s expertise with remote or AI-based support, he adds.

“I often say that augmented reality is the user interface for IoT,” Hughes says. “Because you have this spatial nature of machinery, and all this instrumentation, which is also spatial.” An AR-based system assisting a worker will “know which part it’s talking about on a machine… it may say, ‘it’s the bit over here, not the bit over there, that you need to engage with.’ It improves how the workforce can do their jobs.”
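The pairing Hughes describes — spatial machine data plus live instrumentation — can be sketched in a few lines of Python. Everything here is an illustrative assumption, not any particular AR SDK: `Anchor`, the `ANCHORS` registry, the temperature threshold, and `overlay_labels` are hypothetical names standing in for however a real AR client maps sensor IDs to positions on the machine.

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    """A 3D position on the machine where an AR label should render (hypothetical)."""
    x: float
    y: float
    z: float

# Hypothetical layout: each instrumented part has a spatial anchor
# registered when the machine's AR model is built.
ANCHORS = {
    "bearing_3": Anchor(0.42, 1.10, 0.05),
    "motor_coupling": Anchor(0.40, 0.95, 0.05),
}

def overlay_labels(readings: dict[str, float], warn_above: float = 80.0):
    """Pair live IoT readings with their spatial anchors so the AR view
    can draw each value next to the physical part it belongs to —
    'the bit over here, not the bit over there'."""
    labels = []
    for part, temp_c in readings.items():
        anchor = ANCHORS.get(part)
        if anchor is None:
            continue  # part is not in the spatial model; nothing to draw
        status = "CHECK" if temp_c > warn_above else "OK"
        labels.append((anchor, f"{part}: {temp_c:.1f} °C [{status}]"))
    return labels

labels = overlay_labels({"bearing_3": 92.4, "motor_coupling": 61.0})
```

The point of the sketch is the join: sensor readings are keyed by part ID, anchors are keyed by the same ID, and the AR interface is just the place where the two meet in the worker's field of view.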

This forms the core of the developing digital twin concept, on a highly granular level, he states. “Whereas original IoT instrumentation is just sensor data going down a wire and appearing somewhere else on a dashboard, here people are now gathering that data into a virtual model that matches the physical device or the physical plant.” This can be exploded into a virtual model of an entire plant – a “3D model that you can spin around, look at, engage with, and zoom in on a particular part. And you also have a live stream of information about the actual state of the plant.”
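At its most granular, the digital twin Hughes describes is just a virtual model kept current by the live sensor stream. Here is a minimal Python sketch under that reading — `DigitalTwin`, `ingest`, and the machine/component names are all hypothetical, not a real platform's API:

```python
from datetime import datetime, timezone

class DigitalTwin:
    """Minimal digital twin: a virtual model that mirrors the last known
    state of each instrumented component of one physical machine."""

    def __init__(self, machine_id: str):
        self.machine_id = machine_id
        self.state: dict[str, float] = {}          # latest value per component
        self.last_update: dict[str, datetime] = {}  # when each value arrived

    def ingest(self, component: str, value: float) -> None:
        """Apply one message from the live sensor stream to the model,
        instead of letting it vanish onto a dashboard."""
        self.state[component] = value
        self.last_update[component] = datetime.now(timezone.utc)

twin = DigitalTwin("press_line_7")
for component, value in [("bearing_3", 92.4), ("spindle_rpm", 1480.0)]:
    twin.ingest(component, value)
```

A plant-level twin would then be a collection of these per-machine models, attached to the 3D geometry you can "spin around, look at, engage with."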

The result, Hughes continues, is a digital twin model “infused with data, that you can, for instance, use in a simulation as a training tool. You can do analysis, and say, ‘What would happen if we changed these IoT values?’ ‘What if we changed suppliers?’ You then help workers learn to fix something in an augmented reality environment.”
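The "what would happen if we changed these IoT values?" analysis can be sketched as simulating on a copy of the twin's state rather than the live values. The cost model below is a toy and every name and threshold in it is an assumption for illustration:

```python
# Hypothetical what-if analysis: copy the twin's live values, override
# one, and re-run a simple model to compare outcomes.

def throughput(state: dict[str, float]) -> float:
    """Toy model: throughput tracks spindle speed but is derated when
    bearing temperature exceeds a safe limit (illustrative only)."""
    derate = 0.5 if state["bearing_temp_c"] > 85.0 else 1.0
    return state["spindle_rpm"] * derate

live = {"bearing_temp_c": 92.4, "spindle_rpm": 1480.0}

# Simulate the change on a copy so the live twin state is never mutated.
scenario = dict(live, bearing_temp_c=70.0)

baseline = throughput(live)      # derated: bearing is running hot
improved = throughput(scenario)  # full speed after the hypothetical fix
```

The same copy-and-override pattern supports the training use Hughes mentions: a worker can practice a repair against the simulated state without touching the real plant.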
