Join gaming leaders, alongside GamesBeat and Facebook Gaming, for their 2nd Annual GamesBeat & Facebook Gaming Summit | GamesBeat: Into the Metaverse 2 this upcoming January 25-27, 2022. Learn more about the event.
I’ve been talking with our speakers for our upcoming metaverse event to get a preview of their views on the challenges of building the metaverse. And one interesting thing I’ve heard so far is the problem of the sniper in the metaverse.
Kim Libreri, chief technology officer at Epic Games, brought it up to me first in a preview of our talk at GamesBeat Summit: Into the Metaverse 2.
As Libreri described it, the challenge of the metaverse — the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One and the latest Matrix Resurrections movie that Libreri is actually in — is that it’s a networking problem.
“Normally, the way that people would think about distributing a hugely parallel world is you’ll divide it into a grid,” Libreri said. “And players would be in little areas of that grid and move from grid to grid to do it.”
In a racing game, this kind of grid-based partitioning of a game world works pretty well and isn’t as hard to do. The driver will be in one grid cell and may be moving to an adjacent one, but that kind of movement is something a connected server can keep up with. The sniper in a combat game is harder.
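The grid scheme Libreri describes can be sketched in a few lines. This is a minimal illustration, not Epic’s code: the cell size and the 3×3 neighborhood are assumptions chosen for simplicity.

```python
# Minimal sketch of grid-based world partitioning (illustrative, not Epic's code).
# Each player maps to a cell by position; a server only needs to sync state
# with the player's cell and its 8 neighbors -- the area a fast mover (like a
# race car) could reach before the next update.

CELL_SIZE = 100.0  # meters per grid cell (an assumed value)

def cell_for(x, y):
    """Map a world position to a grid-cell coordinate."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def cells_to_sync(x, y):
    """The player's own cell plus its 8 neighbors."""
    cx, cy = cell_for(x, y)
    return {(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

# A driver at (250, 430) lives in cell (2, 4) and needs state from 9 cells.
print(cell_for(250, 430))            # (2, 4)
print(len(cells_to_sync(250, 430)))  # 9
```

The appeal of the scheme is that the sync cost stays constant no matter how big the world gets, as long as everyone only interacts locally. The sniper breaks exactly that assumption.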
“If you’re on the top of a mountain, and you have a high-powered sniper rifle, and you look through it, and you can see somebody that is miles and miles away,” Libreri said. “Now you’re not only just having to communicate simple network traffic between these grid locations, but you also have to deal with the rendering coming from a completely different machine.”
The game company has to transfer networking data between different players so their computers can render the correct point of view. Everything the sniper sees has to be documented. As the sniper moves around the environment, the system has to record and send the sniper’s relationship to every other moving object.
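This is what networked games call interest management or relevancy filtering: each client only receives updates about entities it can plausibly see. The sketch below shows why a scoped sniper is so expensive; the radii and entity format are hypothetical, invented for illustration.

```python
import math

# Hypothetical interest-management filter. Normally relevance falls off with
# distance, so each client gets updates about only a small nearby set of
# entities. A scoped sniper blows that radius up by an order of magnitude,
# pulling in entities (and their owning servers) from far-away grid cells.

NORMAL_RADIUS = 300.0    # meters an ordinary player can "see" (assumed)
SCOPED_RADIUS = 5000.0   # a high-powered scope can see for miles (assumed)

def relevant_entities(player_pos, scoped, entities):
    """Return the entities whose updates this client must receive."""
    radius = SCOPED_RADIUS if scoped else NORMAL_RADIUS
    return [e for e in entities
            if math.dist(player_pos, e["pos"]) <= radius]

entities = [{"id": 1, "pos": (100, 0)},     # nearby teammate
            {"id": 2, "pos": (4000, 0)}]    # distant target, 4 km away
print(len(relevant_entities((0, 0), scoped=False, entities=entities)))  # 1
print(len(relevant_entities((0, 0), scoped=True, entities=entities)))   # 2
```

The moment the player raises the scope, the server has to start forwarding state for entities it would otherwise never mention, possibly simulated on an entirely different machine.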
Now you may understand why only 100 players are allowed in a Fortnite battle royale match. So much data has to be collected on each player’s relative location and movement, then passed to the server and synchronized with all the other players. Take away a lot of the computing power, beef up the 3D graphics requirements with a virtual reality environment, and pack the electronics into a wireless, portable, compact device like the Meta Quest 2, and you can fit only 16 players in a game, as is the case with the Population: One VR game. That’s not much of a metaverse.
Now if you try to do this with 1,000 players or 100,000 players in the same grid (the same server, or the same shard), your networking problem explodes: every player potentially needs updates about every other player, so the message count grows with the square of the player count.
“With this concept of how you handle massively distributed gameplay in infinitely big worlds, there’s a lot of research that we still need to do,” said Libreri. “I think Tim [Sweeney, CEO of Epic Games] would argue that we probably need a new programming language for gameplay when it comes to these sort of big, massive, parallel simulations.”
Raph Koster, CEO of Playable Worlds and the creator of multiple virtual worlds like Star Wars Galaxies, warns that these problems won’t be easy to solve. Handling the networking for lots of players in the same space is a gargantuan task, he said.
For instance, imagine the split-second timing that has to be worked out among those players.
“It’s going to be very hard when the person across the stadium is dodging behind a door,” said Dave Baszucki (another speaker), CEO of Roblox. “If the player across the stadium is in Moscow and I’m [a player] in San Francisco, we probably have a quarter second or half second of latency.”
Latency refers to interaction delays, or how responsive a simulation feels as a user makes inputs.
“Do they give preference to the sniper’s [movement] or the person who’s dodging the bullet?” Baszucki said. “It’s a really tricky problem.”
A concert could be easier to render and synchronize among lots of people because you might only see 50 people close to you, and you’ll see them in higher fidelity. They won’t be moving as much as players in a combat game. Still, it’s a challenge. All those people could be dancing and moving a lot in one place.
“We have dabbled in working solutions for this [sniper] problem,” said Herman Narula (another speaker), CEO of Improbable.
Nobody has really solved this problem just yet, but it’s one of the things that has to be solved on the path to the metaverse. We’ll hear possible solutions from folks like Narula.
The metaverse will be a massive simulation or a massive set of simulations. As Matthew Ball of Epyllion (another speaker) observed in his massive metaverse explanation story (soon to be a book), Microsoft Flight Simulator is the most realistic consumer simulation in history, with two trillion individually rendered trees, 1.5 billion buildings, and other features that require 2.5 billion petabytes of data. No single consumer device can store all that.
You don’t really want to fly down to see those trees up close (you’ll probably crash the plane if you do). The only way Microsoft can display the data to you in real time is by feeding it into your computer as needed from internet-connected data centers. The data is streamed in real time to the machine running the game.
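The feeding-as-needed pattern amounts to caching world tiles around the player and fetching the rest on demand. Here is a toy sketch of that idea under stated assumptions (a tiny LRU cache, invented tile names); it is illustrative only and not Microsoft’s actual streaming pipeline.

```python
from collections import OrderedDict

# Toy sketch of on-demand world streaming (illustrative, not Microsoft's
# pipeline). The client keeps a small least-recently-used cache of terrain
# tiles and fetches missing ones from the data center as the plane moves.

CACHE_TILES = 4  # deliberately tiny cache, for illustration

class TileStreamer:
    def __init__(self):
        self.cache = OrderedDict()
        self.fetches = 0  # how many times we had to hit the network

    def fetch_from_datacenter(self, tile):
        self.fetches += 1            # stands in for a real network request
        return f"terrain-{tile}"

    def get_tile(self, tile):
        if tile in self.cache:
            self.cache.move_to_end(tile)         # mark as recently used
        else:
            self.cache[tile] = self.fetch_from_datacenter(tile)
            if len(self.cache) > CACHE_TILES:
                self.cache.popitem(last=False)   # evict least-recently-used
        return self.cache[tile]

streamer = TileStreamer()
for tile in [(0, 0), (0, 1), (0, 0), (0, 2)]:  # a plane drifting east
    streamer.get_tile(tile)
print(streamer.fetches)  # 3 -- (0, 0) came from the cache the second time
```

This works beautifully for static scenery, which is the author’s point below: the scheme gets much harder once the tiles contain other people who move.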
While that seems impressive, those trees don’t move. You won’t see the wind blowing through the trees. They stay put in a grid, and the developers don’t have to worry that you’ll suddenly want to see the terrain of Dubai when you’re flying over San Francisco. Now god forbid a sniper might be in one of those planes. Or imagine networking a bunch of planes flying together at the same time. Then you start needing to synchronize all that data and movement with other machines.
Now you may see that the metaverse is one of the most difficult computing problems of all time. It’s no surprise that Raja Koduri, chief architect at chipmaker Intel, predicts that we’re going to need 1,000 times more computing power in order to power a metaverse with billions of people interacting in real time. Of course, Koduri wants us all to buy lots of chips. But as you can see with the problem we’ve described, this is a huge computing and networking problem.
“The CPU itself isn’t even the big challenge,” Koster said. “Actually, the networking is a bigger problem. Because you can store that data. This is what so many folks are trying to solve with concurrency. If you have one person in a forest and they clap, great, you need to send a network message back out to one person.”
Koster added, “If there are two people, one clap generates two outgoing claps. If there are four people, one clap generates four outgoing claps. Here’s where it gets really thorny. If there are four people, but I clap your hand, that’s one message to me, one message to you; that is different. Third parties need to see Raph clap Dean’s hand. Okay, those are not the same network message. It’s exponential. So by the time you get to 100, there are well over 1,000 different messages going out. And that’s the, you know, that’s what causes the concurrency problem.”
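Koster’s clap arithmetic can be written down directly. This is my restatement of his example, not his exact model: if every one of n players performs one action and every other player must be told about it, the server sends n × (n − 1) outgoing messages per round.

```python
# My restatement of Koster's clap arithmetic (a sketch, not his exact model).
# If each of n players performs one action, and every *other* player must be
# notified of each action, the server's outgoing message count per round is:

def outgoing_messages(n_players, actions_per_player=1):
    return n_players * (n_players - 1) * actions_per_player

for n in (2, 4, 100, 1000):
    print(n, outgoing_messages(n))
# 2 players -> 2 messages; 4 -> 12; 100 -> 9,900 (Koster's "well over
# 1,000"); 1,000 -> 999,000. The growth is quadratic -- each new player
# makes every existing player slightly more expensive.
```

This is why simply renting a bigger server doesn’t fix concurrency: doubling the player count roughly quadruples the messaging load.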
People are working on solutions. Comcast this week said it has completed tests on delivering greater bandwidth of 4 gigabits per second (and eventually 10 Gbps in both directions) over its cable network. Over time, it hopes to deliver this better bandwidth to our homes.
Bandwidth delivers more throughput, like adding more lanes on a highway to push more data through the internet. Latency is the time it takes a data signal to travel from one point on the internet to another point and then come back. This is measured in milliseconds (a thousandth of a second). If the lag is bad, then fast-action games don’t work well. Your frame rate can slow down to a crawl, or you can try to shoot someone and miss because, by the time you aim at a spot, the person is no longer there. Subspace believes it can deliver 80% lower latency for players across 60 countries.
In the past couple of years, Subspace has built out its parallel network using its own networks and hardware as well as partnerships with providers of dark fiber, the unused excess capacity in the internet’s fiber-optic infrastructure. And now it is rolling out its self-serve network-as-a-service. The network lets developers — such as the makers of real-time games — deliver real-time connectivity for their users. (We’ve teamed up to work with Subspace on a Metaverse Forum, for thought leadership on the open metaverse.)
Founder Bayan Towfiq started working on this problem because the public internet is failing key applications that need real-time communication, such as games. The internet was never built for real-time interaction, and it is beset with problems such as latency, jitter, and packet loss that ultimately hurt engagement.
Subspace has deployed a global private network, including a dedicated fiber-optic backbone, patented internet weather mapping, and custom hardware in hundreds of cities. This network pulls gaming traffic off the internet close to users and ensures the fastest and most stable path.
Subspace, for the first time, lets existing games and internet applications bring private networking to every internet-connected device without changes to code, VPN clients, or on-premise hardware, the company said. Subspace has customers with hundreds of millions of users already.
Ball wrote that the average person doesn’t even notice if audio is out-of-sync with video unless it arrives more than 45 milliseconds (ms) too early or more than 125ms late (170ms total). Acceptability thresholds are even wider, at 90ms early and 185ms late (275ms). With digital buttons, such as a YouTube pause button, we only think our clicks have failed if we don’t see a response after 200–250ms.
Other firms are working on the problem. RP1 hopes to be able to put 100,000 people in a single shard so that you could have a huge concert in the metaverse and do it in real time.
Dean Abramson, chief architect of RP1, said at our metaverse event last year that he believed RP1 can reach about 100 million users with about 2,500 servers. That’s anywhere from 200 times to 500 times more efficient than anything else, he said. That’s encouraging, but perhaps hard to fathom. We’ll see what kind of progress it can make with worlds that are extremely complicated — where each of those 100 million users has a lot of detail.
Libreri noted that the metaverse will also require developing many new ways of distributing computation and data management across cloud infrastructure. Add all that up, and you get the scope of the problem.
Then you have the challenge not only of networking data between these grid cells on a map, but also of networking data between worlds. In a metaverse, we’re supposed to be able to move between worlds quickly. But the computer doesn’t know which world you want to visit next. In contrast to getting an update for a single game world, it can’t anticipate what you’re going to do, so it can’t predownload a world just so it loads quickly when you decide to visit it.
How would that go over? I try to hop from world to world to world. But wait. I’ll catch up with my friend later because I have to download 2.5 billion petabytes of data — or at least start streaming it — before I can load that next world. Ball did some calculations on what is needed and it isn’t pretty.
Nvidia CEO Jensen Huang recently said his company wants to marshal a huge number of supercomputers and AI experts to create a climate model of the world. The company wants to produce a “digital twin” of the Earth on a meter-level scale to be able to predict how the Earth’s climate will change over time. That is a massive amount of data to capture, and Koster points out that the meter-level detail is going to be constantly changing. That’s going to be pretty hard to model. But once Nvidia models it, the metaverse of the digital twin of the Earth — built in the Omniverse simulation world for engineers — will be available for free for others to use as they wish.
“As long as the metaverse world you want to explore happens to look like whatever the Earth looked like at the moment that Nvidia captured its snapshot, that is useful to you as a game developer,” Koster said. “But we all know a model with meter-level accuracy is going to be out of date within 30 seconds, right?”
So how are we going to create the metaverse? Here’s my suggestion. First, we kill all the snipers.
GamesBeat’s creed when covering the game industry is “where passion meets business.” What does this mean? We want to tell you how the news matters to you — not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it.