Intel wants to see computing power shared across networks

For several years, it has been possible to stream games from the “cloud”. In practice, a graphics card in a server environment handles the rendering and sends the finished result as a video stream to a local machine, which itself needs little performance. When Intel lays out its vision for the future, it wants to take the concept of borrowed computing power one step closer to the user.
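The paragraph above describes a render-remotely, display-locally pipeline. The sketch below is purely illustrative: the function names and payloads are invented, and the "encoding" is a stand-in for the hardware video encoders (H.264/H.265/AV1) real streaming services use.

```python
# Conceptual sketch of cloud game streaming: the server renders and
# encodes each frame; the client only decodes and displays it.
# All names and data here are illustrative, not a real streaming API.

def render_frame(frame_no: int) -> bytes:
    # Server side: the GPU renders; here a placeholder payload.
    return f"frame-{frame_no}".encode()

def encode(frame: bytes) -> bytes:
    # Stand-in for hardware video compression.
    return frame[::-1]

def decode(packet: bytes) -> bytes:
    # Client side: cheap decode of the incoming video stream.
    return packet[::-1]

def stream(n_frames: int) -> list[bytes]:
    # End-to-end: heavy work happens "server side"; the client
    # ends up with fully rendered frames while doing minimal work.
    return [decode(encode(render_frame(i))) for i in range(n_frames)]

print(stream(3))
```

The point of the structure is that everything expensive sits behind `render_frame` and `encode`, which is why the local machine can be weak.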

In a post on Intel’s website, “Powering the Metaverse”, Raja Koduri, Intel’s head of graphics and formerly at AMD, writes about the challenges of creating the metaverse: a virtual world that connects billions of people around the globe through virtual reality (VR) and augmented reality (AR). Koduri writes that the metaverse could be the next major platform after the World Wide Web and the mobile phone.

The concept is not new, and the term was coined by science fiction author Neal Stephenson nearly 30 years ago, but only recently has it become the word on everyone’s lips. Beyond the possibilities of connecting people, Koduri sees challenges on the computing side: several orders of magnitude more computing power is needed than is available today, along with far lower latency than today’s Internet offers.

Koduri divides the work into three layers. Within the meta-intelligence layer, Intel’s work focuses on a unified programming model, software development tools, and open libraries that let developers deploy complex applications more easily. The meta-ops layer describes the infrastructure that delivers computing power to users beyond what is available to them locally. Finally, the meta-compute layer is the raw horsepower needed to run these metaverse experiences.

One solution to the performance problem is to share computing power between devices on the same network, and here Intel uses gaming as a concrete, easy-to-grasp example. In a concept demonstration, a laptop relying on the processor’s integrated graphics is used to play Hitman 3. Even at the lowest settings, the result can hardly be called playable.


Everything changes when the system borrows computing power from other devices on the network; here Intel points to a gaming desktop as the example. The laptop, which previously could barely handle the game, can then raise both the resolution and the settings. From the user’s perspective, a local device running locally installed applications while borrowing computing power is supposed to run entirely seamlessly.
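The borrowing described above can be sketched as a weak client handing a heavy job to a stronger machine on the same network. Intel has not published a protocol for this, so everything below (the message format, the socket setup, the workload) is a hypothetical illustration using plain TCP on localhost.

```python
import socket
import threading

# Hypothetical sketch of "borrowed" compute: a laptop offloads a heavy
# task to a desktop on the same network. The wire protocol here is
# invented for illustration only.

def desktop_worker(server_sock: socket.socket) -> None:
    # The "desktop": accepts one job, computes it, returns the result.
    conn, _ = server_sock.accept()
    with conn:
        workload = int(conn.recv(64).decode())
        result = sum(i * i for i in range(workload))  # the "heavy" job
        conn.sendall(str(result).encode())

def laptop_offload(host: str, port: int, workload: int) -> int:
    # The "laptop": too weak to do the job itself, so it sends the
    # job description over the network and waits for the answer.
    with socket.create_connection((host, port)) as s:
        s.sendall(str(workload).encode())
        return int(s.recv(64).decode())

# Wire the two roles together on localhost for the demonstration.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=desktop_worker, args=(server,), daemon=True).start()

result = laptop_offload("127.0.0.1", port, 10)
print(result)
```

The laptop only serializes the job and deserializes the answer; the compute happens entirely on the peer, which is the property the Hitman 3 demonstration relies on.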