NVIDIA CloudXR
with Greg Jones

Season 1 / Episode 4

Welcome to the LikeXR podcast, season one, about the extended reality market, from people who really understand the industry. This week we are joined by the Director of Global Business Development for XR and Product Manager of CloudXR at NVIDIA. Greg Jones focuses on partnerships and projects that use NVIDIA professional XR products. Before coming to NVIDIA, Dr. Jones was the Associate Director of a 200-person academic research institute at the University of Utah, focused on scientific computing, image analysis, computer graphics, and data visualization.


Greg Jones
Director of Global Business Development for XR and Product Manager of CloudXR at NVIDIA
Could you tell us more about your role at NVIDIA and what you are working on now?
— NVIDIA is a large company, and everybody knows that. We make graphics cards, but we do a lot of other things; artificial intelligence is at center stage. We have an XR team covering VR and AR, and we look at it two different ways: the consumer side and the enterprise side. We think about how VR, AR, and XR can help auto manufacturing, architecture, all those types of enterprise users, and entertainment. My role in that is business development: I manage many of the partnerships with companies that work with NVIDIA. We have a variety of products, and XR is one of them. The team I work with finds partners to demonstrate our products and show the industry how well those products serve enterprise users in VR or AR. That is what we focus on. We bring back the feedback we get while working with those partners, our engineers take it in, and they change the product. It's a really nice cycle: forming partnerships and feeding the results back to improve our products. CloudXR is a very exciting product, the advent of streaming in AR and VR.
I want to talk more about CloudXR as a solution, because by now you have started forming an ecosystem around it. I believe it is a fantastic opportunity for the XR industry in general. What inspires you the most about this technology?
— If you look across enterprise users, and even consumers, the bar to be a VR user in enterprise today is quite high. Photorealism is required: automotive design reviews and architecture all need photorealism. That usually means I need a big GPU and a lot of power for that GPU. So every enterprise VR user who wants to see a realistic building or a realistic product has to have a relatively expensive workstation right at their fingertips, and that's true for everybody. If we stream instead, from a machine in the machine room or from a cloud service provider, VR users no longer need a desktop right next to them: they reach that desktop somewhere else, and multiple people can share the machine. The user is no longer tied to the workstation, no matter who buys it. And if you think of a Quest 2, it's pretty affordable; for an enterprise it's just 600 USD. The bar for being a high-fidelity VR user has been lowered, so that market can expand. You just don't need a desktop anymore. One of my favorite examples is VR in architecture firms. In most firms, even the large ones, VR means a couple of special VR rooms, all set up with tracking, a desktop, and a VR technician to take care of them. So if I want to see my building in VR, I have to get up from my desk, walk down the hallway, and get the technician's time; he or she needs to load the data on the workstation, and then I get into the headset. It takes a lot of time, because VR carries a lot of complexity. Now we stream VR. I can stay at my desk, put my all-in-one headset on, point it at the server's IP, and I'm in VR. Wherever there is a network, I can be in VR right away.
So if I'm looking at my building and I'm putting a 40-foot wall into it, what's really hard to understand on a small display is what a 40-foot wall actually looks like. I just want to pop in quickly and see that wall at real scale. With streaming XR, I pop the headset on, see the wall, then take the headset off and get back to work. That ease of use, I think, matters even more than democratizing access to headsets or big servers. I think ease of use will change the market, and the exponential growth we're seeing in consumer VR will start taking place in enterprise too.
It definitely breaks down the barrier.
— That's right. We're definitely seeing growth on the consumer side: the last numbers I saw were about three-to-one all-in-one headset sales compared to tethered headsets, while in enterprise it's about 50/50. It was ease of use that made the consumer side take off, and I think with streaming XR we'll see that same growth curve in enterprise.
Sounds really interesting; it definitely has tons of potential. It's an interesting era. Tell us more about the most remarkable cases or projects that became real because of CloudXR.
— What have I seen with CloudXR? The very best VR scenario I have experienced was with the Varjo headset. That's not a CloudXR story, but that's where VR became real for me. I was in a room looking at a Volvo car in pass-through mode, and that super-high-resolution headset let me reach out and touch the car. I was captivated: I believed there was a car right there with me, and I could see my colleagues around the room. I was in a headset, looking at a beautiful car, with my colleagues right there. In my mind that has been the ultimate enterprise experience. Going fully into a virtual world is completely different: walking through an architectural scene, I actually started believing I was in the building. That's the goal: to stream those experiences to super-high-definition headsets with pass-through mode. With CloudXR we have a SteamVR-based system right now, and the VR is really good; I can go into that architectural scene and be fully in the room. Making it available to everybody means we can be in VR wherever we want, but we still have work to do to get that ultimate experience into streaming form, and that's what will really form the market. I think it's all possible within one or two years, but for me the best VR experience is still the Varjo headset; it was amazing. For home use right now I put on a Quest 2. I think I have a GeForce RTX 2080 in my machine, because I can't get a 3080. I pop the headset on, connect, and I can walk around a car model, which is a really convenient, easy thing to do at my house. And with CloudXR, if I'm on a network, I can do the same thing from an AWS server 50 milliseconds away. The ease of use for a common enterprise experience is just incredible; I'm really excited about that. It works at 50 milliseconds of network latency. I didn't think that was possible two years ago, until we started with CloudXR.
Does this mean that I can now play Half-Life: Alyx on my Oculus Quest?
— Yes! From the cloud or from your home desktop: you can run it on AWS, Google Cloud, Tencent, Alibaba. That's absolutely crazy.

Amazing. I have to try it. Since we started talking about gaming: cloud gaming is not a new industry, but many users complain about problems with latency and framerate. This problem is particularly painful for HMD users. How do you handle it in CloudXR?
— That's a great question. The nice thing about CloudXR is that it is built on the GeForce NOW streaming stack. GeForce NOW has been refining cloud gaming for years, and every quality-of-service improvement they make to that stream, CloudXR gets to use right away. That's a big deal. The network side is also improving with the 5G movement: the 5G protocols give better uplink and better signaling, and part of those specifications describes how fast the network has to respond, which matters directly for our workload. Our internal networks and the telco networks are going to keep getting better over the next few years, and all the benefits networks gain, whether for phone calls or for whatever services telcos stream, flow into cloud gaming, XR, and cloud usage. So yes, it's hard to get a low-latency, high-framerate experience on a poor network today, but it's going to get better. The funny thing is that the worst XR experience of our lives is the one we have right now; it will only continue to get more amazing. It's pretty cool.
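To get a feel for what 50 milliseconds of network latency means against a headset's refresh rate, here is a rough back-of-envelope budget. All the numbers (render, encode, and decode times) are illustrative assumptions for the sketch, not CloudXR measurements or specifications:

```python
# Back-of-envelope motion-to-photon budget for streamed VR.
# All numbers here are illustrative assumptions, not CloudXR specifications.

def frame_time_ms(refresh_hz: float) -> float:
    """Time available per frame at a given headset refresh rate."""
    return 1000.0 / refresh_hz

def streamed_latency_ms(network_rtt: float = 50.0,
                        render: float = 8.0,
                        encode: float = 5.0,
                        decode: float = 5.0) -> float:
    """Added delay for one streamed frame: the head pose travels up and the
    rendered, encoded frame travels back down (network_rtt covers both
    directions), plus server render, encode, and client decode time."""
    return network_rtt + render + encode + decode

if __name__ == "__main__":
    for hz in (72, 90, 120):
        print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
    total = streamed_latency_ms()
    # Streaming clients generally hide this delay by predicting the head
    # pose ahead of time and re-projecting the decoded frame on the headset.
    print(f"streamed pipeline adds ~{total:.0f} ms "
          f"(~{total / frame_time_ms(90):.1f} frame intervals at 90 Hz)")
```

The point of the arithmetic: a streamed frame arrives several refresh intervals late, which is why pose prediction and late re-projection on the client are what make remote rendering feel responsive despite the round trip.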
CloudXR 3.0 is already available, and with it users received bi-directional audio, an essential part of collaboration and social interaction. What are the next steps in the development of CloudXR? What should we expect from CloudXR 4.0?
— Our CEO doesn't like us to talk about what we haven't already shipped, so I'll be careful in that regard. A lot of what we're focusing on for the next generation of CloudXR is scalability: how you get into CloudXR more easily. For example, we just announced an integration with Workspace ONE, for modern enterprises that already have a system for user, content, and device management, with CloudXR as the streamer underneath. Think of it this way: the workspace system knows the user and their device model, so when the user puts the headset on, the right VR experience can be rendered for them. We keep refining that ease of use and scalability, because we now have two customers in a sense: the end user, who wants a great experience, and the IT manager, who has to administer all of this as it moves from the machine room into the cloud. Making sure the IT manager also has a good experience delivering XR bits to end users is an important piece of the growth curve for CloudXR. So I can safely give away that scalability will be the nature of things for us. There is also a bunch of headsets coming out now, all-in-ones from Pico, Oculus, and others, with more on the way, so we are figuring out how to support all those different headset operating systems. And AR headsets are part of the future for CloudXR.
Which devices does CloudXR support these days?
— We have a client sample that works with the Focus 3, and one for the Oculus Quest 2; one for Pico is going to be released, and HTC is going to release something specific for the Focus 3 itself. Certainly the Vive Pro and the Valve Index, the tethered cases, all work, and then you can do AR-type experiences with wireless handhelds. What we do is make client samples, so people can take a sample and build their own client for their own headset or their own use case. Those are the platforms we support with client samples.
As you highlighted scalability, let's talk about a hot topic like the Metaverse. I'm sure you've heard Facebook's latest announcement that the Metaverse will play a crucial role in their future development and strategy. What were your first thoughts when you heard about that?
— I think the Metaverse is really exciting. Sure, I read Snow Crash back in the day, and that vision of living, learning, playing, and working in the Metaverse is really appealing. What Facebook is doing is a kind of social Metaverse, and NVIDIA is working on an enterprise Metaverse. We call our product Omniverse: a collaborative simulation environment where we can work together and I can visualize the scene as we work, using an Omniverse renderer. The enterprise Metaverse lets you step into that virtual world through a headset, and eventually you can imagine AR applications where I see a camera view of my real scene with digital content overlaid. In the near future, AI will be able to map the virtual world onto the real world and give me context by comparing the two. So the idea of an enterprise Metaverse built on digital twins, where I go in with AI agents and collaborators, we design in the digital world, and then we build and understand it in the real world, is to me a really exciting piece of the Metaverse. It goes beyond the social. As soon as you talk about an enterprise Metaverse, the avatars get more real and photorealism is required, so the computing and the simulation really step up. I'm not just talking about making water look like water; I'm talking about water moving around according to computational fluid dynamics, behaving the way it should. The mathematics and simulation all have to be at enterprise level. In that world, if I can design a car with colleagues all around the world, I can look at the airflow across that car in a high-resolution simulation, and I'm doing it with avatars and AI agents in the Metaverse. That's what I see, and it is really exciting.
Do you think these attempts from juggernauts like Facebook, Google, or Microsoft will lead to a breakthrough, and maybe to the next big thing in the XR industry?
— Absolutely. It's great that big companies are involved, but as you mentioned earlier, it really requires an ecosystem. It's not just a big company and its users; it's a company like NVIDIA, with deep expertise, plus all the partners coming in, joining, and wiring connectors into Omniverse. Ecosystem development is as important as what the big companies are doing. I like the press announcements saying "we're building the Metaverse," but partners and an ecosystem will be needed to actually create it, especially for enterprises. That's a lot of work, and I think people forget about it. It's not just a big company saying "we're going to build it"; it's an ecosystem saying "we're going to work on it collaboratively."

What, in your opinion, can we expect from Facebook's Metaverse in general? Let me explain: many people I've spoken to on this topic imagine it will be like Ready Player One. How realistic is that? What do you think?
— It's a good question. I'm not sure, but think about it: if my Metaverse has to run natively on my headset, then it has to be powered by a battery. That's not a 300-watt graphics card like an A40, so it will be limited by that compute. My Metaverse will have relatively simple graphics and relatively simple simulation. I think the social Metaverse, where we get together and play games with our friends, will demand that developers be clever about building it on a mobile chipset. Even in those cases streaming is a big deal, because it gets us more compute. At the graphics level the social Metaverse will stay where it is for a while, and as we start streaming it will step up, and the Metaverse will be a really cool place to be. The enterprise Metaverse will be a little ahead of gaming, because it will use remote computing from the very start. When those two come together it will be really interesting, because then we will really work and play in the same Metaverse, with all the networking and computing figured out. That will be a really cool thing, and it will take a few years.
Do you have any concerns about trusting your data to a company like Facebook, which does not have the best reputation in this context, or is that not a problem for you?
— This is just me, but I worry less about what a company is doing with my data, because I know what they're trying to do: they're trying to sell me things, to understand me, and to help their ecosystem understand me so my ads get better. I worry about the nefarious players out there, in a world that is in fact all wired together; I worry far more about cybercrime than I do about companies. But I understand both worries. Seeing the news now, companies holding clients' data are really under pressure to do better, and I hope consumers keep them honest on management and transparency. That transparency will help protect users' data and improve overall security in general. I worry about the criminals far more than I worry about the big companies. I could be wrong.
In one of our previous podcasts a very interesting concern was raised: we do not know how AR and VR technologies influence people in these virtual worlds. We do not yet know the rules, but we still want to enter. It's like a theater where there is no difference between the actors and the audience; the lights go out, and nobody knows what is happening except the director, the owner of the Metaverse. What do you think about the consequences, these hidden dark holes, when we do not know how it can affect our health, our psychological health? You and I are old enough to handle it ourselves, but what about our children, who are strongly affected by all these new technologies? We had this discussion when TV was invented and nobody realized how it could affect us; this virtual world is more immersive, so the effect could be much more significant. Would you allow your children to enter it? Because I have some concerns about that.
— My kids are older, so the damage I did as a parent is already done; I worry less now. But let me narrow it down. Watch kids' TV: whether it's a cartoon character or a real actor, they're watching made-up worlds, and even a film with a real actor is still a made-up social situation. I think that if my kids spent time in virtual reality with other people, even as made-up avatars, that would still be better social interaction than TV. People interacting with people, learning the rules of society: that's an important piece of growing up. So I believe a virtual world, in context, will actually be better than TV for developing minds. That's how I view it. I think it will help everyone understand each other, because it offers rich social interaction rather than just watching a performance.
Speaking of the Metaverse's ethical side: who should rule it? Should it be creators, users, Facebook, politicians?
— That's a great question. I haven't thought hard about it, because enterprise is pretty straightforward: we want to design things together, and the rule is simply that I don't want what we design to leak out of the collaboration. I don't think I'm the one to design the rules of social engagement needed for an ethical Metaverse. Society is complex, so it will take all of those actors, acting in good faith, to design the right system. And it's really hard to design a system when each of us wants to take care of ourselves first and the next person second. Social interaction is about growing empathy; virtual reality can help with that, and help society make better rules for usage. Enterprise is straightforward: let's design together, and if it's a new design, let's keep it secret until we're ready to tell the world. The rules there are like the rules for data. The social side is harder; it will take all those actors you mentioned, coming together, to build a reasonable system.
What opportunities does this kind of Metaverse reveal for NVIDIA? What do you think?
— The opportunities for NVIDIA are partly the reason why I'm here, and they are unlimited. Our graphics cards are at the heart of creating a virtual world that looks real, feels real, and is accessible. There are development platforms like Omniverse, and AI running on GPUs with neural networks, working and designing alongside experts; sometimes these systems see things very well, and they can advise me on the next design, or even drop the GUI (graphical user interface) so that I have a conversational user interface instead for some applications. That future is what NVIDIA is solving, across graphics cards, hardware, software, and networking, and it makes NVIDIA a perfect place to be. I'm excited to be at NVIDIA because I think streaming XR is the answer to XR growth. These are the worst XR experiences of our lives right now, and it will only get better. It includes the Metaverse, working with AI, optimizing chip designs, understanding our world. I couldn't think of a better company to be with, to witness this growth, this next generation of XR.
Can you remember your first moment with XR, a spark, a first touch? How did you fall in love with XR?
— My growth into XR was pretty gradual. I was in an academic group for years, and we were visualizing data. The idea was that visualization is nothing but a human-computer interface, but could we make visualizations so interactive that they become human-data interaction? That included putting crystal structures on a 10-foot screen in stereo and really trying to immerse people in data. I didn't pay much attention to VR at that time. Then a colleague from that academic group, who had finished his PhD and become a director of VR at NVIDIA, said: "Hey! You should come work with me in VR!" And I said: "VR isn't great. I can't imagine why VR would have a future in academic visualization." We both live here in Salt Lake City; David is his name. He put me in VR Funhouse, back on a Maxwell card. I looked at the flames, the physics, a great simulation, and the graphics were great. I thought: oh my gosh, how does VR do this, what am I looking at? This is really cool; we could port visualization for enterprise. So VR Funhouse was my introduction to VR, even though I had been working in 3D graphics for years. I thought, if VR Funhouse is possible, what else can we do at NVIDIA? That's how I came into VR just a few years ago, right from Maxwell. Funhouse was great in the right sense.
On one of our previous podcasts, Tom Fiske, maybe you're familiar with him, asked us a question I'm still thinking about. He asked: if you could do one project or film in XR, what would you choose? Any project you can imagine.
— I'm a really simple guy, not complex. I have some guitars in the background here. I started taking blues lessons on my guitar, and I'm not very good. But every bad guitar player dreams of sitting up on stage with their guitar in front of 100,000 people. So if I could do one VR experience, all 100,000 would be avatars, not real people, because I don't want anybody actually listening to my guitar, and I would play a concert with some band I could pick out. Maybe a Stevie Ray Vaughan concert. That would be my ultimate Metaverse experience.
As a co-founder of an XR-solution development company, I'm really interested in how I can get your technology for commercial use.
— CloudXR is what we call a gated release. You go to the CloudXR website, apply, and write down your intended usage; the reason we do it this way is that we focus on enterprise users, so availability for consumers is limited right now. Apply, and you'll get an email about our products. If you don't hear back on an application, you should email CloudXR support with your use cases. Or go to the AWS Marketplace and get it there to test it out. It's pretty widely available.
Which do you believe in more: AR or VR?
— I believe in both. If I've designed a building and I'm going to walk through it, then VR is the perfect vehicle: photorealism, VR rendering, the actual scene. If I'm trying to install a really complex system into that building, then I'm for AR, because I want to overlay that data onto my real scene. Both have really strong, specific use cases. I'm not a believer in AR versus VR; as a digital expert I can see working on something in either experience with what I'd call a friend, a colleague. Both can be extremely important, and I don't have a favorite. Although, if I'm playing the guitar, I want to be in AR, because I don't want that crowd to be real!