
GPEG can make streaming games and animations much more efficient and interactive

Primal Space Systems has raised $8 million for its Instant Interactive subsidiary and will use the money to develop a technology dubbed GPEG, a kind of cousin to the MPEG format used for video, but for graphics.

GPEG, a content streaming protocol, is a different way of looking at data, and its creators hope it can broaden the appeal of games and make people feel like they can be part of an animated TV show. Instant Interactive wants to use GPEG to stream games more efficiently on the one hand, and on the other to transform passive video entertainment into something more interactive and engaging.

This may be the point where most people stop reading this story. But I think this technology has legs. The idea for the Geometry Pump Engine Group (GPEG) was born with Instant Interactive’s co-founders Barry Jenkins (a doctor turned graphics expert), John Scott (formerly of Epic Games), and Solomon Luo (a medical vision expert), who have thought about this challenge for years and created the startup, Primal Space Systems, and its game-focused division, Instant Interactive.

Investors include a variety of seed and angel backers, including ImmixGroup co-founder Steve Charles. The capital will support the development and initial launch of GPEG, which can be used with game engines such as Epic’s Unreal. Instant Interactive has been working on the technology since 2015, and it has just seven employees.

Jenkins has spent more than a decade working on the technology, and the company has 11 patents covering better ways to stream games and to turn television shows into interactive entertainment. While MPEG (short for Moving Picture Experts Group) gave us the technology to compress video so it can be viewed easily across networks, GPEG can make some very expensive game and entertainment technologies much more practical, said Bill Freeman, the company’s president, in an interview with GamesBeat.


Above: Bill Freeman is president and COO of Instant Interactive.

Image credit: Instant Interactive

“This type of technology can enable interactive content on over-the-top programming” such as Netflix’s interactive shows, said Freeman.

For cloud gaming and interactive television, GPEG replaces video-based streaming with pre-encoded content packets, which can be transmitted more efficiently using GPEG middleware. Packets are prefetched to eliminate latency, which also reduces overall streaming costs. GPEG’s middleware is designed to work with all existing content delivery networks and to be integrated into any game engine, including Epic’s Unreal Engine 4, enabling efficient delivery of real-time, personalized content.
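The company hasn’t published GPEG’s packet format or APIs, but as a rough conceptual sketch (all names and data below are hypothetical), a client built on this approach might prefetch pre-encoded geometry packets from an ordinary CDN ahead of where the player is headed, rather than receiving rendered video frames:

```python
# Conceptual sketch only -- GPEG's actual packet format and APIs aren't public.
# Idea illustrated: instead of receiving rendered video frames, the client pulls
# pre-encoded geometry/texture packets (keyed here by a hypothetical "viewcell"
# ID) from an ordinary CDN and has them on hand before the player needs them.

# Stand-in for a CDN: static, pre-encoded packets indexed by viewcell ID.
CDN_PACKETS = {
    "cell_001": b"<geometry + textures visible from cell 001>",
    "cell_002": b"<geometry + textures visible from cell 002>",
    "cell_003": b"<geometry + textures visible from cell 003>",
}

def predict_next_cells(current_cell: str, count: int = 2) -> list:
    """Hypothetical predictor: guess which viewcells the player will enter next."""
    n = int(current_cell.split("_")[1])
    return [f"cell_{n + i:03d}" for i in range(1, count + 1)]

def prefetch(current_cell: str, cache: dict) -> None:
    """Fetch packets for predicted cells so they are local before they're needed."""
    for cell in predict_next_cells(current_cell):
        if cell not in cache and cell in CDN_PACKETS:
            cache[cell] = CDN_PACKETS[cell]   # a plain static fetch; no GPU server

# Simulated session: by the time the player reaches a cell, its packet is cached.
cache = {"cell_001": CDN_PACKETS["cell_001"]}
for cell in ["cell_001", "cell_002", "cell_003"]:
    prefetch(cell, cache)
    print(cell, "packet ready locally:", cell in cache)
```

Because the packets are static files, the expensive part of the pipeline moves offline, and delivery can ride on the same CDN infrastructure that serves ordinary video today.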

“We think it’s possible to bring interactivity to the entertainment industry, break interactivity out of gamer silos, and create content that everyone can consume,” Jenkins said in an interview with GamesBeat.

The possibilities for interactive entertainment

Above: Netflix’s Black Mirror: Bandersnatch lets you choose the outcome.

Image Credit: Netflix

If you’re wondering what that could look like, you may have heard that Epic Games’ Unreal Engine, used primarily to create games, was used to create cinematic special effects for the Disney+ TV show The Mandalorian. GPEG can be used with Unreal and could enable a TV show where you participate in the action, taking the story in the direction you want.

With GPEG, interactive entertainment, as exemplified by Netflix’s Black Mirror: Bandersnatch, where you can choose your own story, could get a big technological boost. Jenkins envisions GPEG as a more effective way to create transmedia, or content that spans multiple media such as comics, films, and games.

Today’s games often have visually rich cinematic sequences or “movies” within the game that are rendered in real time. Advanced real-time rendering effects can now give games a sophisticated cinematic look that was not possible before. At the same time, the art of interactive storytelling has evolved considerably since Professor Henry Jenkins (then at MIT, now at USC) spoke of transmedia.

Narrative games like Detroit: Become Human and Life Is Strange 2 focus on letting users actively discover and participate in the conflict and resolution cycles that are fundamental to the story.

And if you look at streaming, today’s content delivery network (CDN) infrastructure delivers pre-encoded video streams at a very low cost per user and on a global scale. However, this infrastructure is not enough to support video-based cloud gaming systems such as Stadia or GeForce Now, which depend on expensive game server hardware hosted in specialized data centers, said Freeman.

While most game content today is delivered over CDNs, that delivery takes the form of slow game downloads, which often require users to wait minutes or hours to start playing. New methods of streaming game engine and VR content to game consoles, gaming PCs, and mobile devices could eliminate these download delays and provide virtually instant access to interactive content.

Using existing CDN infrastructure, this new type of stream could change the way games are delivered over broadband and wireless, and could also enable new types of instant interactive content for cable and over-the-top (OTT) audiences (think Netflix).

Jenkins said to imagine animated programming or special effects sequences streamed to your game console or gaming PC through an app like “HBO Max Interactive” or “Netflix Interactive.” Such programming would deliver the lean-back, story-based entertainment experience provided by video, but would also let users pick up a game controller and customize the characters, explore a different narrative arc, or take on a short challenge and otherwise lean in to the experience in more deeply engaging ways than the simple branching video of Bandersnatch.

Such programming would naturally appeal to gamers and could also attract nongamers and the mainstream public to the unique entertainment value of interactivity. It could improve on the traditional transmedia approach by allowing convergent media experiences that combine the impact of cinematic storytelling with the kind of engagement made possible by modern game engines.

Instant Interactive wants to create a graphics revolution

Above: GPEG encoder output for a single viewcell. The fully visible triangles (blue) are computed using a conservative method of precomputing visibility from the region, which is much faster and more accurate than ray tracing.

Image credit: Instant Interactive

Instant Interactive is pioneering the development of GPEG, a game engine middleware protocol for streaming interactive content to game consoles, PCs, mobile devices, and next-generation set-top boxes.

Primal Space Systems itself has used the technology to let drones transmit data more efficiently while flying over an area and capturing images and location data. The U.S. Army is using that technology. Instant Interactive, meanwhile, is using it to stream games more efficiently and to turn passive video entertainment into something more interactive and engaging.

“We all grew up with MPEG (a video format),” said Freeman. “This is not MPEG. It is a new way of encoding and transmitting 3D data. Games have been our centerpiece, but we see GPEG going beyond games, bringing interactivity to what has historically been passive content and making it more lean-forward.”

Cloud gaming and interactive entertainment need help. Putting games in the cloud so they can run on powerful servers is, in theory, a great way to deliver them. You stream a video of the game scene to a user’s computer or console. When the user presses a button on the controller, the input is sent to the data center, where its effect is computed, and a new scene is sent back to the user’s machine as video. This offloads the heavy lifting to the cloud, so a high-end game can run on a low-end laptop.
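To make that round trip concrete, here is a rough, illustrative sketch of the video-based cloud gaming loop described above (not Stadia’s or GeForce Now’s actual code, and the timing numbers are placeholders): every displayed frame pays for an input upload, server-side simulation and encoding, and a video download before the player sees anything.

```python
# Illustrative sketch of the video-based cloud gaming loop described above
# (not any real service's code; the millisecond values are placeholders).

def remote_render(game_state: dict, player_input: str) -> bytes:
    """Server side: apply the input, simulate, render, and encode a video frame."""
    game_state["position"] += {"forward": 1, "back": -1}.get(player_input, 0)
    return f"<encoded video frame at position {game_state['position']}>".encode()

def cloud_gaming_frame(game_state, player_input, uplink_ms, server_ms, downlink_ms):
    """One round trip: input goes up, a finished video frame comes back down."""
    frame = remote_render(game_state, player_input)        # runs in the data center
    round_trip_ms = uplink_ms + server_ms + downlink_ms    # all paid before display
    return frame, round_trip_ms

state = {"position": 0}
frame, delay = cloud_gaming_frame(state, "forward",
                                  uplink_ms=15, server_ms=20, downlink_ms=25)
print(frame.decode(), f"-- displayed ~{delay} ms after the button press")
```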

The problem is that this kind of streaming consumes a lot of bandwidth, and only recently have cable modem systems been able to transfer data at speeds high enough to enable cloud gaming services such as Google Stadia and Nvidia GeForce Now. With GPEG, Jenkins says, the data can be reduced significantly and transferred using a fraction of the bandwidth needed today.

“We know what OnLive tried to do years ago,” said Freeman. “But we can stream fully interactive content without delays, losses, frame drops, or downloads, and provide an instant interactive experience at the highest quality and at a much lower cost using today’s infrastructure. We are most excited about interactivity. VR, MR, and AR content is really possible with our technology. And since we are never downloading all of the content, we take piracy out of the equation. This is not traditional cloud gaming as others have approached it. It’s a completely new way to transmit data.”

The crazy doctor

Above: Barry Jenkins is co-founder of Instant Interactive.

Image credit: Instant Interactive

Jenkins earned a medical degree from Harvard Medical School. But he was deeply interested in computational models of human vision, computer vision, and real-time graphics. He wrote software for processing huge 3D datasets that defense contractor Northrop Grumman used to render imagery.

“We are focused on the 3D data itself, not video data, which is exactly what Stadia and GeForce Now deal in,” said Jenkins. “They run the whole game, obviously, in a data center, then compress the video frames and send them to the user. And as you know, this was possible many years ago, but not necessarily practical. It has a very high cost per user in the data center. We estimate that Stadia dedicates $1,500 worth of equipment to one user at a time.”

Compressing game content as video is not very efficient and therefore consumes a lot of bandwidth. There is also a lot of latency, or interaction delay, in a cloud game while the player’s laptop sends input, the input travels to the data center, the result is computed there, and a video frame is sent back to the computer. At 720p resolution, that delay is tolerable. But not with today’s 4K televisions.

How to fix game streaming

Above: In this top view of a game level, the green sections are the parts of a scene that a user could see.

Image credit: Instant Interactive

Instant Interactive’s software can solve this problem: it integrates into Unreal Engine and will allow game publishers, developers, and distributors to offer their users a better experience, Jenkins said.

“We are actually streaming the game engine content itself in a very agile way to the game console, PC, or mobile device,” said Jenkins. “We take the game itself, all the levels that make it up, process them offline through our GPEG encoder, and transform it into packetized data, which is GPEG. That packet data can live on any server. It doesn’t need a GPU; practically any CDN server will do.”

He added: “That data streams into our GPEG client software, which is a plug-in for Unreal Engine. So we integrated this into Unreal Engine, and it was written to be integrated into other engines as well. Basically, instead of downloading the whole game up front, we very quickly stream just the data the game needs at the moment. It’s all prefetched. So, you know, the user sees no latency.”

The software takes a scene and divides it into cells. It then precomputes exactly which surfaces, such as textured triangles, are visible from each cell, and encodes the change between cells. All of this data sits on a server, and the server predicts where a user is in a game and therefore sends only the parts of a scene that the user can see. It also predicts what the user could see in the next few moments and prefetches that data for delivery. The game engine does not have to render the rest of the scene.
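GPEG’s real encoder output and packet format aren’t public, but a minimal sketch of the idea Jenkins describes, using made-up viewcell data, might look like this: the offline encoder stores the set of triangles visible from each cell, and the stream only carries the delta as the player’s predicted cell changes.

```python
# Sketch with hypothetical data: per-viewcell visible sets computed offline,
# and delta packets streamed as the predicted cell changes. Not GPEG's actual
# encoder output, just an illustration of the per-cell visibility idea.
from typing import Optional

# Offline encoder output (hypothetical): triangle IDs visible from each viewcell.
VISIBLE_FROM_CELL = {
    "cell_A": {1, 2, 3, 4},
    "cell_B": {3, 4, 5, 6},   # overlaps with cell_A
    "cell_C": {5, 6, 7, 8},
}

def delta_packet(prev_cell: Optional[str], next_cell: str) -> dict:
    """What the stream carries when the player is predicted to enter next_cell."""
    prev = VISIBLE_FROM_CELL.get(prev_cell, set())
    nxt = VISIBLE_FROM_CELL[next_cell]
    return {"add": sorted(nxt - prev), "drop": sorted(prev - nxt)}

# The client keeps resident only what the current (or imminent) cell can see.
loaded = set()
for prev, nxt in [(None, "cell_A"), ("cell_A", "cell_B"), ("cell_B", "cell_C")]:
    pkt = delta_packet(prev, nxt)
    loaded |= set(pkt["add"])
    loaded -= set(pkt["drop"])
    print(nxt, "streams", pkt, "-> resident triangles:", sorted(loaded))
```

Because neighboring cells see mostly the same geometry, each delta packet is small, which is where the bandwidth savings over shipping whole levels (or whole video frames) would come from.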

“So instead of downloading your Call of Duty game as a 140-gigabyte download, we could really start with a few dozen megabytes of data,” said Jenkins. “There is a real opportunity here to improve game performance by managing very precisely what the game engine must actually render at any given moment.”

Reaching into the past

Above: Quake, from 1996.

Image credit: id Software

Computer scientists have studied this concept for decades. John Carmack, the graphics guru who once worked at id Software on games like Doom and Quake, was the first to use this type of technology in a game, in the Quake engine in the 1990s.

“It actually accelerated the game by a factor of three or four in frame rate by using this precalculated visibility,” said Jenkins.
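As a minimal sketch in the spirit of that approach (the map layout, names, and data here are hypothetical, not id’s code), a precomputed potentially visible set lets the renderer skip whole regions of a level with no per-frame visibility work:

```python
# Minimal sketch of precomputed visibility at render time, in the spirit of
# Quake's PVS. Map cells, objects, and visibility data below are made up.

# Computed offline: for each cell of the map, which cells are potentially
# visible from anywhere inside it.
PVS = {
    "hall":   {"hall", "lobby"},
    "lobby":  {"lobby", "hall", "stairs"},
    "stairs": {"stairs", "lobby"},
}

OBJECTS_BY_CELL = {
    "hall":   ["pillar", "enemy_1"],
    "lobby":  ["fountain"],
    "stairs": ["enemy_2", "powerup"],
}

def objects_to_render(camera_cell: str) -> list:
    """Only draw objects in cells the camera's cell can possibly see."""
    visible_cells = PVS[camera_cell]
    return [obj for cell in sorted(visible_cells) for obj in OBJECTS_BY_CELL[cell]]

# From the hall, everything in 'stairs' is skipped with no per-frame cost.
print(objects_to_render("hall"))   # ['pillar', 'enemy_1', 'fountain']
```

GPEG, as the founders describe it, extends the same offline visibility idea from culling work inside one machine to deciding what to send over the network in the first place.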

While others left this technology behind, Jenkins did not. He applied it and brought it into the modern world.

Instant Interactive wants to license its middleware to other companies.

“We would like to see it used by many distributors and game makers, from Valve to Epic, Activision, and EA,” said Freeman. “The bandwidth requirement is actually much more reasonable than video-based game streaming because we are not relying on compression. We are taking advantage of the structure of the game data itself.”

Jenkins said GPEG enables better interactivity, instant access, efficient lossless data transmission, and fast delivery of 4K and VR content.

“This is really middleware designed to help the game engine run better,” said Jenkins. “And say you are watching Netflix and decide to jump away from the cinematographer’s camera direction, follow a different narrative, or take on a short challenge and become more immersed and engaged in the story. Video-based streaming doesn’t really let you do that. In a matter of seconds, you’re actually playing more content. A 22-minute video episode could become an hour or more of engagement every week.”
