Building the Open Metaverse

How Hugo Swart and Qualcomm are paving the road to XR

Hugo Swart of Qualcomm discusses the company's role as a technology provider enabling XR devices. He shares Qualcomm's vision for XR as the next compute platform and its progress toward an open, interoperable metaverse.

Guests

Hugo Swart
VP and GM of XR, Qualcomm Technologies, Inc.


Announcer: 

Today, on Building the Open Metaverse

Hugo Swart:
What I saw was the opportunity to do something better than PC VR, better than slapping a phone in front of your eye, which was doing a standalone headset where all the processing, fast connectivity, and displays made for VR are all put in one device.

Announcer:

Welcome to Building the Open Metaverse, where technology experts discuss how the community is building the open metaverse together. Hosted by Patrick Cozzi and Marc Petit.

Marc Petit:

Hello, and welcome back, metaverse builders, dreamers, and pioneers. I am Marc Petit, and this is my co-host, Patrick Cozzi.

Patrick Cozzi:

Hey, Marc. Always a pleasure to be here.

Marc Petit:

You are listening to Building the Open Metaverse, season five, the podcast that is your portal into open virtual worlds and spatial computing.

We bring you the people and the projects on the front lines of building the immersive internet of the future: that open and interoperable metaverse for all that we all want.

Patrick Cozzi:

And today, we have a very special guest joining us on that mission. Hugo Swart from Qualcomm, where he is VP and GM of XR and Metaverse.

Hugo, welcome to the show.

Hugo Swart:

Hi, Marc, Patrick, and everyone listening to the show. First of all, big congratulations to you guys. I really love the show, and I am very happy to be here to discuss how we bring the industry together and create the open metaverse.

Marc Petit:

Describe for us in your own terms your journey to the metaverse.

Hugo Swart:

I'm going to start in the early 2000s. Back then, of course, only a select few people were thinking about VR, let alone the metaverse. Why start there? Because I started my career in mobile, when we were transitioning from computing on the PC to computing on mobile.

My first job was to evangelize wireless data technology, meaning how to get the internet onto the phone. The early 2000s were remarkably similar, in their challenges, to what we're doing right now in creating XR and the metaverse. The use cases many people thought about were navigation, communication, playing games, and taking pictures.

In the early 2000s, the processors were not ready, the displays were not ready, and even for communication there was only a path toward it. We did not get this marvel that is the smartphone into everyone's hands overnight. It was year-over-year improvement along these vectors.

Compute processors, connectivity, displays, the whole infrastructure, and of course the content. But it does happen: when everyone sees the use case and the value to the consumer, the investment continues. That's where I started my career.

I spent, I want to say, seven to ten years around wireless data, and that was when I saw this full transformation happen. Then I moved to a division of Qualcomm that was incubating new businesses beyond phones, using our technology and our processors in new market segments. It started with smart-TV kinds of businesses; the first Amazon Fire TV actually had a Snapdragon in it. Then, around 2015, while we were trying to incubate these new businesses at Qualcomm, came one of the first waves of hype around virtual reality, if you remember, with Facebook acquiring Oculus, the PlayStation VR, and a lot of cardboard. What I saw back then was the opportunity to do something better than PC VR, better than slapping a phone in front of your eyes: a standalone headset where all the processing, fast connectivity, and displays made for VR are put in one device.

I started with one of our partners, Goertek, an ODM, and we built the reference design. In 2016, we announced that reference design at IFA. It already had 6DoF tracking running locally, with reasonably good graphics, on the Snapdragon 820; that was the first chip we used.

To me, that was the big catalyst for the standalone VR category. You saw the Oculus Go launching on Snapdragon. You saw a Daydream VR headset following a reference design, and HTC with their VIVE Focus device, all following that first standalone concept. Then things started to pick up. There was a clear benefit to a standalone VR headset versus slapping a phone on your face.

There was a clear benefit versus PC VR, too: not the same graphics level, but far easier to use. Things just grew organically, naturally, in volume and in functionality. It goes back to the mobile days, where year over year, generation over generation, you start to see almost step-level improvements on the hardware side. With the hardware getting better, the content gets better. That's a little bit of the history.

From that incubation of standalone VR headsets, we decided to create a separate business unit at Qualcomm dedicated to XR. I started to focus 100% of my time on XR; I'm really passionate about it, and really looking forward to seeing the transition from mobile into XR and the metaverse.

Patrick Cozzi:

Very cool journey, and I agree wholeheartedly about the standalone VR headsets. Qualcomm has such breadth and depth; we're wondering if you could give a high-level overview to help bring our audience up to speed.

Hugo Swart:

Qualcomm is not a brand everyone recognizes when sitting at a dinner table talking about products and companies. But Qualcomm is a key foundational technology provider to various industries. When I say technology, of course, our bread and butter is the silicon: the processors with connectivity. But it's also core technology. We have a big R&D team working on everything in connectivity, from the second generation of cellular communication with CDMA, to 3G, 4G, 5G, already working on 6G, plus Wi-Fi, Bluetooth, and satellite communication.

Communication is actually where the name came from: quality communications, that's Qualcomm. That's how the company started. But then, as cell phones became more and more capable, with more and more processing on them, processors became a core part of our business. We have our own GPUs, the Adreno GPUs; we have camera IP and video IP.

When we build a processor, it's not just a CPU, it's not just a graphics card; it's all in one. It's an SoC, and that means we have deep expertise in each of these areas. It's like we have a graphics company inside Qualcomm, and a camera company inside Qualcomm. All these specialties run really, really deep. AI is another one; as we talk about AI, both in the context of XR and everywhere else, that's something we are investing in heavily.

We produce the silicon, these processors with communication; we call it enabling the intelligent connected edge. When we say edge, we mean devices. It could be your car; that's a big area where Qualcomm is growing a lot. Of course there are the phones, where the great majority of premium Android phones use our Snapdragon processors, plus watches, industrial products, and IoT products. We see ourselves as an ecosystem enabler, the behind-the-scenes technology provider that helps create new industries. That's what we do, and it's what we're aiming for with XR and the metaverse as well.

Marc Petit:

Well, thank you for that, Hugo. Qualcomm is such an important company in our industry because, as you said, it's an enabler of so many things.

Give us a sense of the size in terms of people, how big is Qualcomm today?

Hugo Swart:

Well, we are above 40,000 people. Revenue last year was $44 billion, and we are in a very good position to lead the digital transformation. The same kind of transformation that we started and built on phones, we are now enabling across industries and around the world, as everything needs more intelligent connected devices.

Marc Petit:

At Qualcomm, you have talked about your vision for XR as the next compute platform, and you have been a very, very strong advocate of accelerating and opening the XR ecosystem to make this vision a reality. How is this going?

Hugo Swart:

I talked a little bit about the past from 2015 to now, and I can talk a little bit about where we are and where we are going.

Let me separate the question into two parts: first, what we're doing, what products, and how we're enabling the ecosystem; then I want to double-click on the open part of it, which is very important. In terms of what we have been doing in XR, my first internal strategy deck on XR, from around 2015, is actually still valid. I talked about four pillars that we needed to build to be successful in XR, not only as a business for Qualcomm but in enabling the industry. First is, of course, the processors, the chips, both for raw compute and for connectivity. That's number one. Number two is technology for spatial computing. And by the way, I'm not just copying one of the newest companies in this space, which started calling this a spatial computer.

We have actually been talking about spatial computing for a while. I think even Magic Leap and others have used the term "spatial computer."

So we have the first pillar, which is the chips. The second is technologies for spatial computing. What is that? It's really the perception tech for user understanding: head tracking, hand tracking, eye tracking, all kinds of user inputs, and the corresponding outputs as well. Then there's environment understanding: where's the floor? Where's the table? What objects are in my field of view? And so forth. We work on these technologies.

Let me just pause to explain why we work on this technology. It's twofold. One is because our customers need these features, this functionality. If Qualcomm can provide it as a horizontal platform, that helps more people come to the market.

Even more important is that we have to harden these features, these algorithms for spatial computing. We have to put them in silicon, because all this AI computer vision I talked about for user understanding is a lot of compute, and you want to do it in less than a watt. You can't do this in the cloud, because it has to be super low latency, and it would actually cost more power to transmit all this camera information to the cloud. I have to do it locally. How do I do it locally? If I just run these algorithms on the CPU, they alone will consume more than a watt. So I have to create custom silicon, custom IP blocks.

It's almost like how we have the GPU for graphics: I have a video engine for video, I have an ISP for the camera, so I need an IP block for XR. That's why it's so important for us to develop that second pillar of core technologies. First pillar, the silicon; second, the technology. There are also a lot of multimedia technologies, reprojection on the display, and so forth. But that's the second pillar.
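A back-of-the-envelope sketch makes this budget argument concrete. Every number below is an illustrative assumption, not a Qualcomm figure; only the orders of magnitude matter.

#include <cstdio>

int main() {
    // Illustrative assumptions only, not Qualcomm figures.
    const double frameBudgetMs     = 20.0;   // common motion-to-photon target
    const double cloudRoundTripMs  = 40.0;   // glasses -> cloud -> glasses
    const double cameraBitsPerSec  = 200e6;  // several camera feeds streamed out
    const double radioJoulesPerBit = 20e-9;  // assumed radio energy cost per bit

    // Streaming raw perception data to the cloud blows both budgets:
    const double radioWatts = cameraBitsPerSec * radioJoulesPerBit;  // = 4 W
    std::printf("radio power alone: %.1f W (device budget ~1 W)\n", radioWatts);
    std::printf("cloud round trip: %.0f ms (frame budget %.0f ms)\n",
                cloudRoundTripMs, frameBudgetMs);
    // Hence the dedicated IP blocks: run perception locally, inside a
    // roughly one-watt envelope, with on-chip latency.
    return 0;
}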

The third pillar is the reference design: building a reference design that lowers the bar for our customers to create devices. Of course, the more experienced OEMs leverage it less. They look at our reference design; they may use it, or they may not. But if you are a smaller company, a startup, then the reference design really helps you get started, working with ODMs and so forth.

The fourth pillar is working with the ecosystem. When we started working with the ecosystem in 2015, it was more about talking to partners in general, ISVs, and enterprise players. But more recently, we developed our own SDK, which we call Snapdragon Spaces. It's essentially a plugin for Unreal Engine and Unity, but it follows OpenXR. We just wanted a tool for developers, and also OEMs, who want to start working with our Qualcomm hardware, even before commercial products are out there. That's Snapdragon Spaces.
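Because Snapdragon Spaces follows OpenXR, the Khronos cross-vendor standard, an application written against the OpenXR C API can target any conformant runtime. The sketch below is generic OpenXR in C++, not Snapdragon Spaces-specific code, with most error handling elided.

#include <openxr/openxr.h>
#include <cstring>
#include <cstdio>

int main() {
    // Describe the application to the runtime.
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strcpy(createInfo.applicationInfo.applicationName, "SpacesDemo");
    createInfo.applicationInfo.applicationVersion = 1;
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    // Any OpenXR-conformant runtime can service this call;
    // that portability is the point of the standard.
    XrInstance instance = XR_NULL_HANDLE;
    if (xrCreateInstance(&createInfo, &instance) != XR_SUCCESS) return 1;

    // Ask for a head-mounted display system.
    XrSystemGetInfo systemInfo{XR_TYPE_SYSTEM_GET_INFO};
    systemInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId systemId = XR_NULL_SYSTEM_ID;
    xrGetSystem(instance, &systemInfo, &systemId);

    // Query what the hardware can do (tracking, resolution, and so on).
    XrSystemProperties props{XR_TYPE_SYSTEM_PROPERTIES};
    xrGetSystemProperties(instance, systemId, &props);
    std::printf("runtime system: %s\n", props.systemName);

    xrDestroyInstance(instance);
    return 0;
}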

So those are the four pillars we are developing and working on, and I think we're getting a lot of traction in all of them. On the processor side, it's really important to highlight that we have dedicated chips for XR. When we started, back in 2015, we took a mobile chip, the highest-end smartphone chip, and put it in a standalone.

It was much better than having a smartphone on your face, using the same processor with minor tweaks, where we would ask my mobile peers, "Hey, can you please put this feature on your phone chip?" That's how we started. But now we have dedicated chips and a dedicated portfolio for XR, targeting both the VR/MR category and AR glasses; we have those two lines. You asked how we are doing? Excellent. It's a huge investment to do a new chip, and we can already fund and invest in chips dedicated to this market.

On customer traction, we have already launched more than 80 devices. It sounds crazy to say 80. Where are they? Well, of course, everyone knows about the Quest, and no doubt that's the most popular one.

Also, when you look into China, we launched tens of devices in that market. Pico, now part of ByteDance, is, I think, the highlight, along with HTC, Panasonic, and Lenovo. Lenovo is a big one, with both VR and AR. When you look at all of them, we are up to 85.

Going back to your question, how are we doing? How did we get to 85 devices? Not by working one by one with big investments from Qualcomm. That's why we need the reference design; without it, how could I support startups? And here I'll start transitioning to the open part of your question. One of the reasons we consider ourselves open, kind of horizontal, is that we're enabling not only the big players with our technology and our processors, but startups as well. Look at Lynx in France as an example.

They built an MR headset using our XR2, using a lot of our technology. We are able to scale with this strategy. I'm very bullish, very optimistic about the future. Products like the just-announced Quest 3 are a step function from the Quest 2. If you compare the XR2 Gen 1, which was in the previous Quest, with the XR2 Gen 2, we have more than two times the GPU horsepower and more than eight times the AI capability. Then you look at the display, the pancake lenses, and how the devices are becoming smaller, lighter, more capable.

That's why I continue to be very bullish: looking back at mobile history, I'm seeing the same movie. It's almost like déjà vu, from the creation of the smartphone to what we are seeing now. The second part of your question is about openness. I think I addressed some of it already with our horizontal platform; we like to think of ourselves as open because of that.

When we look at standards, they are very important. For the market to flourish, we need standards. In the mobile days, it was things like 3GPP shaping how we were going to do wireless data and have interoperability between the infrastructure and the handsets, headsets, and so forth.

Marc Petit:

Patrick will want to geek out on distributed computing. But before we get there, I think everybody has two questions in their mind. The first: how far are we from those lightweight, sleek glasses?

Can you open your crystal ball for us and give us your perception of that?

Hugo Swart:

The holy grail we all want is one pair of glasses that can do both: fully immersive, all day long, and transparent, with optical see-through. That's not a short timeframe; I see it as a north star, where we want to go. But I like to look more at what we can do in the next five years. I'm not sure we're going to get to those kinds of glasses in five years; it's probably going to be a little bit more. That doesn't mean I'm pessimistic. On the contrary, I think a lot of progress can and will be made in the span of a couple of years. Today, I think about very separate categories. When I say MR, what I really mean is video pass-through: a VR headset with video pass-through.

VR, MR, I think a lot is going to happen over the next couple of years. You saw Apple with their product, which is essentially an MR device. Also the Quest 3, an amazing product that will really democratize access to MR. I think you have these two extremes of products in terms of price points and target markets. But I think that MR is going to evolve very rapidly over the next two to three years. I'm bullish about it.

Then, when it comes to AR-type glasses or smart glasses, we see two types. One is, let's call it, the smart glass; I don't even want to call it an AR glass. These are glasses with cameras that can do not only video recording and picture taking, but also AI on top of that. You can do a lot with just cameras and some local AI. I think that's going to be very powerful.

From the smart glasses, the next step is adding a display, maybe not a hundred-degree field of view, maybe even just a 30-degree field of view, but one that can provide a lot of value. With a camera plus a small display, now you can talk about navigation, about real-time language translation, or, for people who are hearing impaired, subtitles as someone is speaking. There are tons of applications that can happen, and this is now. It's not five years from now. It's technology we can do now.

That's the other product we just announced, the AR1, which is targeted at this type of smart glasses. I think we're going to see a great amount of progress in the smart glass category. It's a continuum; it's not "oh, in 6.5 years I think we're going to get there."

I'm going to go back to the mobile example. When did everyone on earth have a smartphone? It took years for the phones to mature. But if you look at what we had in 2005 and the progress we made by 2010, it was just crazy. It was amazing. That's the region we're in right now, with the first dedicated chips, the first dedicated displays, and, of course, content getting ready. I'm bullish on growth over the next three years. In three years, I think we're still going to see these two disparate product categories. As for the holy-grail kind of glasses, I'm going to stick to what I think most of your guests say: ten years.

Marc Petit:

One last question from me that probably our listeners are also wondering about: you referenced the Apple Vision Pro. What did this entry mean for Qualcomm?

Hugo Swart:

It's a big endorsement of the XR and metaverse industry when one of the leading consumer and technology brands of the world joins in. I think it's big. It helps in particular with building excitement among developers, getting them to try new things. And with our horizontal, foundational technology, we can enable products across all tiers.

I'm very excited about Quest 3-type products because they are more accessible to more people while having amazing technology and an amazing user experience.

I think again, the Apple Vision Pro brings awareness to consumers, enterprise, and developers. But I think that in the short term, products like the Quest 3 have a bigger chance of getting much higher volumes.

Patrick Cozzi:

Hugo, we want to ask you a bit about use cases. The new fusion capability looks really promising, being able to merge that smartphone AR with head-worn AR/VR. For this hybrid experience, what use cases are you most excited about?

Hugo Swart:

What we saw in engaging with developers is that it's a big commitment for a developer to say, "Let me build a full-on 3D spatial application from scratch." You have those developers, they're great, and we work closely with them. But then there's a huge pool of mobile developers who already have their app on the phone. What if, when the user has AR glasses, say AR2-based AR glasses, they could add a touch of AR to their application?

Let's say I have a shooting game or a racing game that is still played on the phone. If I have my glasses on, maybe I can now have the map overlaid on top of the real world, off to the side of the phone, or an explosion. Or say I have a furniture-purchasing app where I still scroll through the app on the phone, but when I click on a given couch, I can see that couch in the real world as a 3D object. We feel that can really lower the barrier for mobile developers to start experimenting with AR, and we see both consumers and enterprise customers leveraging it. I highly encourage the audience to watch our AWE keynote presentation, where we showed one application from Kittch.

This is a cooking app where famous chefs show you how to cook. The example is very similar: the app is still on the phone, but with the glasses, I can put timers on top of a pan, or get the videos overlaid on the real world.

The other one was with Red Bull. It was a really cool experience where, again, the app is on the phone, but I can take the video feed out of it; I think they had it for a downhill bike run. You can see the 2D video, but at the same time, I can put a 3D rendering of the mountain on a table and watch the bike going down the trail. It's a very immersive experience, yet it's a mobile app; when the user has glasses on, I can pull out these experiences.
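The pattern behind these demos can be pictured as one application with two possible displays, where individual pieces of content are routed to the glasses only when they are present. The C++ types below are hypothetical, purely for illustration; the actual feature ships as part of the Snapdragon Spaces plugins for Unity and Unreal.

#include <string>
#include <vector>
#include <cstdio>

// Hypothetical sketch of the "touch of AR" pattern: one app, two displays.
enum class Display { Phone, Glasses };

struct Content {
    std::string name;
    Display preferred;  // where this piece of content wants to render
};

// Route each piece of content; with no glasses, everything stays 2D.
Display route(const Content& c, bool glassesConnected) {
    return (c.preferred == Display::Glasses && glassesConnected)
               ? Display::Glasses : Display::Phone;
}

int main() {
    std::vector<Content> scene = {
        {"recipe UI", Display::Phone},         // stays a normal mobile app
        {"timer over the pan", Display::Glasses},
        {"3D couch preview", Display::Glasses},
    };
    bool glasses = true;  // user put AR glasses on mid-session
    for (const auto& c : scene)
        std::printf("%s -> %s\n", c.name.c_str(),
                    route(c, glasses) == Display::Phone ? "phone" : "glasses");
    return 0;
}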

We're very encouraged; we're getting very positive reactions from developers around the globe on it.

Marc Petit:

That's good to hear. You also mentioned Qualcomm being a key player in automotive. Do you see a time when we can create experiences that work on a phone, a head-worn headset, and a windshield as well?

Hugo Swart:

Definitely, but I'm going to use that question to bridge a little into what Patrick was talking about, distributed compute, because I think it bears on the car as well.

The vision we have for split compute is this: you have your glasses on, and again, you want them at less than a watt, or as low power as possible, not only because of battery life but also because of thermals. I cannot dissipate a lot of heat in my glasses.

But for many experiences, I need 10 watts, 50 watts, maybe even 100 watts. I cannot put that on my face; it needs to be one watt. So I need a split compute architecture, where the glasses can identify a compute host, ideally nearby in proximity, but possibly in the cloud.

Then I can have this distributed compute architecture with my phone, with my PC, with my car. It's also about how these devices work together depending on the inputs and outputs available. What do I mean by input and output? If someone has glasses on, I have the inputs from the glasses, my hands, my eyes, everything, and a display in front of my eyes. Even if I'm in the car, I may not use the windshield; I may use what's on my glasses. If I don't have them on, then it can go to the car. That's the world we are thinking about and designing for. What we want is to make developers' lives as easy and as open as possible, building something that, of course, has value to consumers.
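One way to picture this split-compute idea is a simple host-selection policy: prefer the most capable host whose link still fits the frame's latency budget. The sketch below is hypothetical, with made-up types and numbers, not an actual Qualcomm API.

#include <vector>
#include <string>
#include <cstdio>

// Hypothetical model: the glasses stay near ~1 W and delegate heavy
// rendering to whichever host fits the latency and power budget.
struct ComputeHost {
    std::string name;
    double watts;        // sustained power the host can spend on rendering
    double roundTripMs;  // link latency from the glasses
};

// Pick the most capable host whose round trip still fits the frame budget.
const ComputeHost* pickHost(const std::vector<ComputeHost>& hosts,
                            double latencyBudgetMs) {
    const ComputeHost* best = nullptr;
    for (const auto& h : hosts) {
        if (h.roundTripMs > latencyBudgetMs) continue;   // too far away
        if (!best || h.watts > best->watts) best = &h;   // prefer more compute
    }
    return best;
}

int main() {
    std::vector<ComputeHost> hosts = {
        {"glasses", 1.0, 0.0},       // always available, lowest power
        {"phone", 10.0, 5.0},        // nearby, short link
        {"car", 50.0, 8.0},          // in-vehicle compute
        {"edge cloud", 300.0, 35.0}, // over 5G, often beyond the frame budget
    };
    if (const ComputeHost* h = pickHost(hosts, 20.0))  // ~20 ms frame budget
        std::printf("render on: %s\n", h->name.c_str());
    return 0;
}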

Marc Petit:

In terms of communication infrastructure, can we do this on 5G? We had Michaël Trabbia, the CTO of Orange, here, talking about 6G and software-defined networks. Can we achieve that level of connectivity with the current generation of telecommunication infrastructure?

Hugo Swart:

We definitely can from, let's call it, the air interface side, the 5G connectivity. But it's not only the air link, not only the wireless link; it's also a question of where that compute sits. Are you putting GPU farms at every base station? In your local neighborhood? Is there one for the whole country? Because there's latency, and every hop adds latency. So it's not only the air link. But starting from 5G, we have many R&D programs at Qualcomm on this, showing that it's possible.

We showed it at the last Mobile World Congress in February this year: glasses to phone to cloud. There's distributed processing at three levels: some in the glasses, some in the phone, some in the cloud. Between phone and cloud, you can switch dynamically. If I have a good 5G connection, then move more to the cloud; if the signal starts to degrade, move more of the processing back to the phone. Of course, you're not going to have the 300-watt graphics experience, but you still maintain a level of service.
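A minimal sketch of that dynamic switching, with hysteresis so the experience does not flap between tiers. The thresholds and the link-quality metric are illustrative assumptions, not Qualcomm's actual algorithm.

#include <cstdio>

enum class RenderTier { Cloud, Phone };

struct OffloadPolicy {
    RenderTier tier = RenderTier::Phone;
    // Hysteresis: switch to the cloud only on a clearly good link, and fall
    // back to the phone before the link becomes unusable, so the user keeps
    // a continuous (if less detailed) experience.
    RenderTier update(double linkQuality /* 0.0 = dead, 1.0 = perfect */) {
        if (tier == RenderTier::Phone && linkQuality > 0.8)
            tier = RenderTier::Cloud;   // good 5G: move work to the cloud
        else if (tier == RenderTier::Cloud && linkQuality < 0.5)
            tier = RenderTier::Phone;   // degrading: pull work back locally
        return tier;
    }
};

int main() {
    OffloadPolicy policy;
    for (double q : {0.9, 0.85, 0.6, 0.4}) {  // link-quality samples over time
        RenderTier t = policy.update(q);
        std::printf("quality %.2f -> %s\n", q,
                    t == RenderTier::Cloud ? "cloud" : "phone");
    }
    return 0;
}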

Patrick Cozzi:

Hugo, we really appreciate you being on the show, and we love you sharing your optimism for the future. As you know, we like to wrap things up by asking if you'd like to give a shout-out or two to any person or organization.

Hugo Swart:

First of all, thanks so much for inviting me here and letting me share a bit more about what Qualcomm is doing to enable the open metaverse.

In terms of shout-outs, definitely my engineering team, which is working relentlessly on new chips, technologies, and Snapdragon Spaces. But I also think we all need to appreciate what Meta is doing, the big investment they have in the space. They have been great partners of ours, and with the launch of the Quest 3 and the new Ray-Ban Stories, I think that is really going to help drive us forward.

But it's not only Meta. We work with many different partners and customers around the world, and it's such a good set of people that it's hard to give all the shout-outs. I'll also mention the people who get less attention, like the ODMs, Goertek for example, the display manufacturers, and a lot of these lesser-known brands that are really helping drive the hardware ecosystem.

Marc Petit:

Hugo, thank you so much. You are a true metaverse pioneer, and we're very honored to have you with us today. You gave us an amazing glimpse into Qualcomm, a company that, as you said, is not always top of mind, but is so important in our industry. Thank you for doing this for us today.

And a big thank you to our listeners. We did 56 episodes over the first four seasons. We're into season five, and we're super happy to get your feedback.

You can follow us on LinkedIn. We have a page there where you can find all those episodes and on our webpage, buildingtheopenmetaverse.org.

Thank you, everybody. Thank you again, Hugo and Patrick. We'll see you soon.