November 20, 2023
Ever since Mark Zuckerberg went all in on the Metaverse, the excitement (and, therefore, the hype) has been huge. So far it has mostly been used for recreation and entertainment via VR headsets and controllers, but above all, Meta wants it to revolutionize work. With that goal in mind, where are we now?
Meta describes the Metaverse as a way to expand work experiences and “unlock the full potential of Meta’s work solutions by discovering new apps and experiences to bring VR to life, or connect to the 2D tools you already love to use.”
At the recent RISC-V Summit, Prahlad Venkatapuram, Senior Director of Engineering at Meta, devoted his keynote to Meta’s use of RISC-V processors in its data center SoCs (RISC-V in the data center was a major theme of the event). That public commitment to a key embedded architecture sparked plenty of conversation about whether the Metaverse has a place in the industrial world.
In concept, a virtual reality space like the Metaverse has use cases for industrial applications. Uses like digital twins or virtualized workspaces for remote operations and monitoring come immediately to mind. Going deeper, we could think about the Metaverse as “an opportunity to solve complex problems by adopting a system-of-systems approach. The goals of every system within the metaverse can be tied to increasing overall efficiency and affordability and reducing waste from a global perspective. Positive or negative changes within a single system are balanced against the overall impact to the system of systems,” as we wrote in this series.
But those are all theoretical; none has been fully implemented in an industrial environment. We spoke with Vishal Shah, Senior Director of Product Line Management and Strategy at Synaptics, who leads the company’s VR efforts. He said that when the Metaverse was first announced, the tools and applications weren’t ready for commercial use, but that’s been changing.
There are three phases of VR, Shah said. The first is the VR we’re familiar with now: headsets and fully digital environments. The second is what he called “mixed reality” (often conflated with augmented reality), which layers digital or virtual elements over physical environments. The third, which would be a full Metaverse implementation, is what Shah calls “augmented reality”: a heads-up-display-style, fully interactive and integrated physical/digital world.
As we get closer to that highest level, more applications will become viable and available. The most obvious, and likely one of the earliest to mature, is a fully realized digital twin. “Digital Twin is still in early phases,” Shah said, and though that’s true, digital twins are already used for training in healthcare, the military, and emergency services. As the technology develops, we will likely see more uses for digital twins in warehousing, inventory management, and other supply chain logistics.
Right now, the Metaverse at its most sophisticated mostly amounts to digital assets placed in a physical world. But development is ongoing and likely to accelerate. Though we’re not there yet, these virtual tools will eventually gain the fidelity and reliability needed to control physical infrastructure, operations, and work. We can see the earliest stages of this in UAVs, robotics, and control room technology.
So, is the Metaverse all sizzle and no steak? Right now, I’d say it’s a tasty appetizer. It will be a while before we get an entrée, but I wouldn’t bet against it arriving sooner than expected.