Explore the XR technology powering the Wireless Metaverse with top Qualcomm inventors

Today, “metaverse” is an overused buzzword, much like “AI” a few years ago, tossed around haphazardly for its prestige. The grand vision of people sharing virtual experiences that are indistinguishable from reality is invoked everywhere, from enterprise collaboration platforms to the new virtual social network that Meta (Facebook) has recently been marketing heavily. However, actually enabling or building any kind of metaverse requires an extremely complex ecosystem of hardware and software, preferably one without wires tethering humans to systems or wall sockets. Wires break the immersion for most users, making virtual worlds feel less real, less believable, or less useful, depending on the application.
In short, to get augmented reality, virtual reality, or XR (extended reality) right, wireless headsets and glasses are the better route to the metaverse. However, wireless XR systems present a very difficult set of problems that require cutting-edge mobile technology: positional sensors, ultra-low-power processing engines for vision, and of course extremely low-latency wireless communication between content sources and the user. So it makes perfect sense that mobile processing and communications giant Qualcomm has been working on these problems for over a decade.
My analyst firm partner Marco Chiappetta and I had the opportunity to sit down with Martin Renschler, senior director of XR technology at Qualcomm Technologies (one of the company’s earliest XR engineers and arguably one of the founding architects of the metaverse), to discuss the company’s efforts to develop wireless AR, VR, and MR technologies, and where we’re headed in the future. Here are some highlights from our Q&A interview with Martin, offering a fascinating look at how all this virtual and mixed reality wireless magic comes together, and how the metaverse as we know it will continue to take shape.
When did Qualcomm decide to make XR a focus area and a market opportunity worth pursuing? Were there early development platforms Qualcomm was involved in that the public might find interesting?
[MR] Back in November 2011, the XR R&D team was part of our Mission Impossible team, established by the Office of the Chief Scientist to push the possibilities of mobile technology. Our goal: leverage Qualcomm’s mobile technology expertise to develop Snapdragon-based AR devices. By CES 2014, we had a working 720p, 80 fps prototype that already included eye tracking, voice control, and ML-based gesture detection. In April 2015, BMW showed augmented vision glasses based on this design. New features in the BMW design were infrared head-pose tracking in a moving car and video transmission from the car’s cameras via Wi-Fi. This design also contributed to the ODG glasses.
[MR] More than 50 XR devices based on Snapdragon processors have launched. Some of them use Qualcomm’s entire XR stack. Beyond that, these devices build on a Qualcomm-powered mobile technology base, with 4G, 5G, and Wi-Fi being the most obvious examples.
It appears that Qualcomm’s ubiquity in smartphones, and its experience managing the wide range of sensors across those devices, has some important parallels in XR. How do the sensors and sensing requirements of today’s mobile devices differ from those of XR platforms?
[MR] Smartphones and XR devices do have a lot of similarities, especially in core technical requirements such as multiple connectivity methods (4G/5G, Wi-Fi, Bluetooth, etc.), complex sensor arrays and perception, and a very small power and size envelope. XR as a technology area depends heavily on extremely low latency and on many concurrent cameras operating at high frame rates. We realized early on that we could use Snapdragon hardware for XR applications, but from day one we had to design a new software stack for XR, creating pipelines between different hardware modules without going through expensive, sophisticated API layers. In later chipsets, we moved more and more XR algorithms into dedicated hardware to further reduce latency and power consumption. Today, we have a complete line of chipsets dedicated to XR devices.
But devoting significant design resources to bringing a new XR-optimized chipset to market is no easy feat, especially when you consider emerging market opportunities like the metaverse in its current state. So what did Qualcomm actually have to redesign in its Snapdragon mobile chip portfolio to bring AR, VR, and XR to life? “On the VR side, we’ve added support for more concurrent tracking cameras, new IP that hardens XR-specific algorithms like 6DOF tracking, XR-specific features for GPUs (such as foveated rendering) and DPUs (hardened chromatic aberration correction for VR lenses). On the AR side, stay tuned for that…” Renschler noted.
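As a rough illustration of why foveated rendering, which shades the central (foveal) region of the frame at full resolution and the periphery at reduced resolution, pays off on a power-constrained GPU, here is a hypothetical back-of-the-envelope sketch (the numbers are illustrative, not Qualcomm’s):

```python
def foveated_pixel_count(width: int, height: int,
                         fovea_fraction: float, periphery_scale: float) -> int:
    """Pixels actually shaded when a central region covering `fovea_fraction`
    of the frame is rendered at full resolution and the rest is rendered at
    `periphery_scale` resolution in each dimension."""
    total = width * height
    fovea = int(total * fovea_fraction)
    periphery = int((total - fovea) * periphery_scale ** 2)
    return fovea + periphery

full = 2000 * 2000                                 # 4,000,000 pixels per eye
foveated = foveated_pixel_count(2000, 2000, 0.25, 0.5)
# 1,000,000 foveal + 750,000 peripheral pixels = 1,750,000 shaded,
# roughly a 56% reduction in pixel-shading work for this configuration.
```

Eye tracking (mentioned in the CES 2014 prototype above) is what lets the full-resolution region follow the user’s gaze instead of staying locked to the screen center.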
Again, Renschler makes it sound easy, but it’s clear that Qualcomm not only has to invest heavily in R&D, it also needs to place specific silicon bets on the moving targets of fast-moving markets and technologies.
One of Qualcomm’s core competencies is clearly wireless connectivity. How do the wireless connectivity needs of a standalone XR device differ from those of a smartphone or an ACPC (always-connected PC)?
[MR] Standalone XR devices differ from phones in that they require multiple types of connectivity technology, all running simultaneously in the device (a problem Qualcomm excels at) – but XR in particular has many unique requirements driven by the user experience it provides. We had to develop specialized low-latency protocols and codecs for XR devices running in split-rendering mode, where frames are rendered elsewhere and only reprojected on the device to correct for the time elapsed since rendering. Most cellular and Wi-Fi protocols are optimized for throughput, not for these low-latency, flawless frame-by-frame transfers – so that’s something we put extra effort into. Solutions like these are often overlooked, but they provide the backbone of how XR actually works as a technology.
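The on-device reprojection step Renschler describes can be sketched in a highly simplified form (a hypothetical planar approximation, not Qualcomm’s actual pipeline): if the head yaws slightly between the time a remote server renders a frame and the time the headset displays it, the device can shift the frame to match the newer head pose instead of waiting for a re-render.

```python
import math

def timewarp_shift_px(render_yaw_deg: float, display_yaw_deg: float,
                      image_width_px: int, horizontal_fov_deg: float) -> float:
    """Horizontal pixel shift that re-centers a remotely rendered frame for
    the head pose measured at display time (pinhole-camera approximation,
    valid for small yaw deltas near the image center)."""
    delta = math.radians(display_yaw_deg - render_yaw_deg)
    half_fov = math.radians(horizontal_fov_deg / 2.0)
    # Pixels per radian near the image center under a pinhole model.
    focal_px = (image_width_px / 2.0) / math.tan(half_fov)
    return focal_px * math.tan(delta)

# Example: the head turned 0.5 degrees while the frame was in flight, on a
# 2000 px wide, 90-degree-FOV display -> shift the frame by roughly 9 px.
shift = timewarp_shift_px(0.0, 0.5, 2000, 90.0)
```

Real reprojection operates on full 6DOF pose deltas and per-layer depth; the point of the sketch is that a cheap on-device correction hides network latency that the radio link alone cannot eliminate.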
Regarding the role that AI (artificial intelligence) and machine learning may play in all of this, Renschler noted that “there are several ML-based XR-specific capabilities – such as head pose prediction, 3D reconstruction, object tracking, and hand tracking – and Hexagon processors and dedicated AI modules are ideal for these tasks. The silicon area available for NPUs has increased dramatically with each generation of Snapdragon, and AI innovation is a key goal for our team and company.”
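Head pose prediction in shipping headsets uses far more sophisticated filters and learned models, but the core idea can be sketched with a hypothetical constant-angular-velocity extrapolator: estimate how fast the head is turning from recent samples, then project the pose forward by the motion-to-photon latency.

```python
def predict_yaw(samples, predict_ahead_ms: float) -> float:
    """Extrapolate head yaw a few milliseconds into the future from the two
    most recent (timestamp_ms, yaw_deg) samples, assuming angular velocity
    stays constant over the short prediction horizon."""
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    angular_velocity = (y1 - y0) / (t1 - t0)  # degrees per millisecond
    return y1 + angular_velocity * predict_ahead_ms

# Head turning at 90 deg/s (0.09 deg/ms); predict 20 ms past the last sample.
history = [(0.0, 10.0), (10.0, 10.9)]
predicted = predict_yaw(history, 20.0)  # 10.9 + 0.09 * 20 = 12.7
```

Rendering to the predicted pose rather than the last measured one is what keeps the virtual world visually stable despite the milliseconds of processing and display latency in between.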
While Qualcomm has a long history of developing GPU and graphics technology, and the vast majority of Qualcomm-powered devices have displays attached, XR applications and use cases have very different display requirements. What is Qualcomm doing to advance XR display technology or other display-related innovations?
[MR] We added several features for XR: for example, direct driving of color-sequential displays and chromatic aberration correction in hardware. These display features are designed to provide optimal efficiency and functionality for the XR experience. This goes back to the core of what we do at Qualcomm: driving ideas, solving problems, and laying the technology foundation to create the best end-user experience.
What experiences and capabilities does 5G connectivity offer XR that were previously unavailable on LTE or legacy networks? Is Wi-Fi also critical to the future of XR?
[MR] Typically, cellular networks are based on shared resources. However, XR requires consistently guaranteed high bit rates, since new high-quality information is needed for every frame. What 5G adds to help XR is enhanced beamforming, which allows many users in the same cell to achieve high throughput without interfering with each other, resulting in a significantly improved signal-to-noise ratio from tighter, more focused data streams. Likewise for Wi-Fi: best-effort transport protocols over shared channels are not suitable for XR when there is a lot of simultaneous background traffic, and therefore need to be optimized.
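To make the “new high-quality information every frame” point concrete, a back-of-the-envelope budget (illustrative numbers, not Qualcomm figures) shows how small the per-frame time and data windows are, and why a best-effort link that stalls under background traffic drops frames:

```python
def per_frame_budget(bitrate_mbps: float, frame_rate_hz: float):
    """Per-frame time budget (milliseconds) and data budget (kilobits) for a
    streamed XR feed at a given sustained link rate and display refresh rate."""
    frame_time_ms = 1000.0 / frame_rate_hz
    kilobits_per_frame = (bitrate_mbps * 1000.0) / frame_rate_hz
    return frame_time_ms, kilobits_per_frame

# A sustained 100 Mbps link feeding a 90 Hz headset leaves about 11.1 ms and
# about 1111 kb per frame; any contention-induced stall longer than one frame
# time means a missed frame, which the user perceives as judder.
frame_ms, frame_kb = per_frame_budget(100.0, 90.0)
```

The arithmetic is trivial, but it explains the answer above: average throughput is not the constraint, delivering that budget every single 11 ms window is.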
Development tools and software are key enablers for fostering new XR technologies. What areas is Qualcomm focusing on to attract and inspire developers to create new XR experiences?
[MR] We recently launched Snapdragon Spaces, an open developer platform that makes it easy to develop AR and VR applications. It also supports converting existing games to become immersive when running on our AR glasses. Qualcomm’s XR stack is based on OpenXR and is therefore compatible with games and game engines that support OpenXR.
What do you think are the two or three most important challenges in advancing the XR/AR/VR experience? Is there a “holy grail” experience, capability, or feature that Qualcomm or the industry is striving for?
[MR] XR today is great for games that put you in a virtual world (think classic VR headsets that completely cover your eyes). Over the next few years, we’ll see mixed and augmented reality experiences that place information and animations in your work or home environment. For this to work well, we need to fully understand the environment: basically, we need real-time depth per pixel, we need to know what kinds of objects we’re looking at, and where all the light sources in the room are. This will require distributed computing, because the power budget of wearables is limited. Wireless technology that reliably connects you to distributed computing nodes with low latency is key to making that happen, and Qualcomm is well positioned to get the job done.
When we asked Martin what he sees as Qualcomm’s role in the metaverse, and what hope it holds for the future of human collaboration and communication, he noted: “Qualcomm is fundamentally an enabler – we innovate across extremely complex domains to solve technical problems and deliver the best device and end-user experiences. We are strong supporters of standards (our cellular standards work actually predates our chipset business) because standards enable a wide variety of devices to utilize our platforms and solutions in a simple, compatible way.
The metaverse vision includes standardization, such that the same virtual item or virtual clothing can be brought into various virtual worlds from different vendors and interacted with in a standardized, predefined way. In our traditional role as an enabler of the underlying technology, we will develop hardware acceleration to provide efficient encoding, encryption, transmission, decoding, decryption, and rendering of metaverse assets and interactions, and drive this standardization forward.”
Renschler makes a good point here. For the metaverse to take off, we all need to speak basically the same language. The enabling technologies for designing, delivering, and rendering virtual worlds are obviously critical, but so is allowing disparate systems and solutions to interact with each other through standardized communications and visual representations. Otherwise, our virtual experiences will be limited to a collection of closed, custom islands rather than open virtual worlds.
Regardless, it’s clear that Qualcomm is establishing a beachhead in wireless XR with the goal of enabling untethered virtual worlds – worlds in which we can move freely in the physical world while exploring new experiences in collaboration, entertainment, the modern workplace, and virtually connected social interaction.


Post time: Aug-05-2022