As the hottest topic in the recent science and technology field, the Metaverse has given countless companies an opportunity to tout their technological reserves and roadmaps. Semiconductor giants such as AMD and Nvidia have successively announced Metaverse-related news, and their stock prices have soared as a result. Recently, Intel has finally been unable to sit still.
In a recent report, Raja Koduri, Intel’s senior vice president and head of its accelerated computing systems and graphics division (and former chief architect of AMD’s graphics division), said: “The Metaverse may be the next major computing platform after the World Wide Web and the mobile Internet.” He also believes: “Our current computer, storage, and network infrastructure is simply not enough to realize a true Metaverse,” and “We need cluster computing performance 1,000 times or more greater than today’s servers.”
Although Raja Koduri believes the Metaverse cannot be realized yet, that has not stopped Intel from deploying Metaverse-related technologies. Beyond Metaverse infrastructure such as processors and servers, Intel has also announced new work in software. Some commentators believe this technology may become one of the cornerstones of the Metaverse.
Intel’s Metaverse starts with software?
Recently, Intel announced a new gaming technology called Continual Compute. Its main purpose is to solve the problem of low-performance devices being unable to run high-performance applications effectively, so that any device can run AAA or even Metaverse-level games and applications.
In the technical demonstration, Intel had a laptop without a discrete graphics card successfully run “Hitman 3”. As an AAA game launched this year, “Hitman 3” has far from low PC requirements; even mainstream discrete graphics cards struggle to run it smoothly.
With the help of Intel’s Continual Compute technology, the laptop calls the high-end discrete graphics card on another device on the same network: it transmits the data that needs to be computed to that device over the network, retrieves the computed results the same way, and finally presents them on the user’s display.
Sound familiar? Players familiar with PC games may first think of cloud gaming or game streaming. Neither technology is new, but for various reasons neither has been widely adopted over the years.
It should be noted that although it sounds similar, Intel’s Continual Compute is fundamentally different from both at the application level. Whether cloud gaming or streaming, both run the game itself on a cloud server or another device and then transmit the rendered frames to your device over the network.
Intel’s Continual Compute, by contrast, calls on the high-performance hardware of other devices to participate in the computation, exchanging data over a high-speed network via a special software protocol rather than directly transmitting rendered frames. The difference is that with the former, the final conversion of the computed results actually happens on your PC, making your PC part of the whole computing system; with the latter, your PC is just a monitor, and you are merely watching a “video”.
In a poor network environment, the latter suffers obvious problems such as image-quality drops and stuttering, while Intel’s new technology can temporarily reduce its data-transmission requirements by lowering the frame rate when facing network fluctuations, keeping the user’s gaming experience smooth.
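The frame-rate throttling described here can be sketched as a small policy function. The bandwidth-per-frame cost model, the default values, and the function name are assumptions for illustration; Intel has not published how its adaptation actually works.

```python
def adjust_frame_rate(target_fps: int, measured_mbps: float,
                      mbps_per_frame: float = 0.5, floor_fps: int = 30) -> int:
    """Pick a frame rate whose per-second data volume fits the currently
    measured bandwidth, instead of letting the image stutter.

    mbps_per_frame is an assumed cost model: bandwidth consumed per frame.
    floor_fps keeps the experience playable even on a very weak link.
    """
    # Highest frame rate the current link can sustain under the cost model.
    sustainable = int(measured_mbps / mbps_per_frame)
    # Never exceed the target, but also never drop below the floor.
    return max(min(target_fps, sustainable), floor_fps)
```

On a healthy link the target is delivered unchanged; as bandwidth drops, frames are shed before quality or smoothness visibly break down.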
Moreover, because the hardware is called directly, hardware-specific features can also be supported, such as Nvidia’s DLSS 2.0. Such technologies are difficult to expose through cloud gaming and other streaming approaches, but with Intel’s Continual Compute, you may in the future get the benefit of these advanced AI technologies directly on your laptop.
So the question is: where do you find this device equipped with a high-performance graphics card and processor? Intel’s answer is to “requisition idle devices on the same network”, provided, of course, that the other party “agrees to have its equipment requisitioned”. Simply put, there are two prerequisites for enabling this feature: the devices must be on the same network, and the owner must consent to sharing. Only when both are satisfied can the function be activated.
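The two prerequisites can be expressed as a simple eligibility check. This is a sketch of the policy as the article describes it, not Intel's implementation; the function name and CIDR-based notion of "same network" are assumptions.

```python
import ipaddress


def can_requisition(local_cidr: str, peer_ip: str, peer_consents: bool) -> bool:
    """Both prerequisites must hold: the peer sits on the same local
    network, AND its owner has agreed to share the device."""
    same_network = ipaddress.ip_address(peer_ip) in ipaddress.ip_network(
        local_cidr, strict=False
    )
    return same_network and peer_consents
```

A peer on another network, or one whose owner has opted out, is simply never a candidate.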
In theory, any hardware that implements the software protocol can join this “rental device” network; whether the other machine uses AMD or NVIDIA hardware, your PC can borrow it as long as its owner is willing to share. However, the current technology preview Intel has shown supports only protocol handshakes between Intel processors, and the supported graphics card types are not stated.
Judging from the gameplay video Intel showed, at the same image quality there is no obvious difference in smoothness between a game running through Continual Compute and one running directly on the machine. It is worth noting, however, that Intel set “Hitman 3” to low graphics settings and did not show how the technology performs at high image quality. According to the documentation Intel provided, the technology is still in development, and this is only an interim result.
Will Continual Compute become one of the cornerstones of the “Metaverse”?
After Intel demonstrated Continual Compute, why would anyone call it a cornerstone of the future “Metaverse”? When it comes to the Metaverse, we have to mention virtual-reality technologies such as AR and VR; only with their help can a true “Metaverse” be achieved.
However, whether it is AR, VR, or the more comprehensive XR, serving users presumes that they own a device powerful enough to run the software. Taking VR games as an example, an ordinary VR game today needs a device at least more powerful than a PS4 to run, and even then the effect falls short of giving players a truly immersive experience.
Considering the rendering and real-time feedback an immersive experience requires, even today’s high-end personal PCs cannot meet the demand. That is why Intel’s executives point out in related articles that we need to increase computing performance by more than 1,000 times before it becomes possible to meet the requirements of true “Metaverse” software.
Although technological progress will eventually let us build equipment that meets those performance requirements, its initial size and power consumption will certainly not be low, making it unlikely to be carried and used by individuals at any time. If Metaverse users then want a complete experience more conveniently, they could use Intel’s Continual Compute to call, over the network, a high-performance PC sitting across the room, or simply rent one online.
With Continual Compute, the user’s own device needs only a processor of modest performance, a display, and a network connection to run applications that would otherwise require high-performance hardware. Simply put, Continual Compute lets users obtain the performance of a high-performance PC anytime, anywhere to assist their games or other applications. In the Metaverse, this technology would let players get the complete experience at any time and at lower cost.
So, what advantages does Intel’s technology have over cloud-computing services such as cloud gaming? First, individuals or service providers do not need to build huge online server farms; they only need to connect devices with the relevant hardware to Intel’s shared network. Second, users obtain data from the devices closest to them, which can greatly reduce the latency and packet loss caused by long-distance data transmission.
Moreover, in Intel’s plan, this technology will eventually form a network-level pool of computing resources, with allocation handled by the intelligent resource-allocation mechanisms of Windows and other operating systems. Every idle computer connected to the network becomes part of this resource pool.
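The resource pool and its allocation mechanism can be sketched as a small registry. The `ComputePool` class, its fields, and the closest-first allocation rule are illustrative assumptions, not Intel's design; the article combines the pool idea with the earlier point that nearby devices reduce latency.

```python
class ComputePool:
    """Registry of idle machines on the network; allocation picks the
    closest (lowest-latency) device with enough spare GPU capacity."""

    def __init__(self):
        self.devices = []

    def register(self, name: str, tflops: float, latency_ms: float) -> None:
        """An idle computer joins the pool, advertising its capacity."""
        self.devices.append({"name": name, "tflops": tflops,
                             "latency_ms": latency_ms, "busy": False})

    def allocate(self, min_tflops: float):
        """Reserve the nearest free device that meets the requirement,
        or return None if the pool cannot satisfy it."""
        candidates = [d for d in self.devices
                      if not d["busy"] and d["tflops"] >= min_tflops]
        if not candidates:
            return None
        best = min(candidates, key=lambda d: d["latency_ms"])
        best["busy"] = True
        return best["name"]
```

Buying compute "like buying traffic", as the next paragraph puts it, would amount to metering how long such an allocation is held.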
Seeing this, Xiao Lei is reminded of idle-PC resource-sharing schemes such as the Phoenix Project, where participating users are rewarded in proportion to the resources they contribute. In the future Metaverse, we may be able to buy computing resources from the pool the way we buy mobile data, without bearing the cost of high-end hardware ourselves.
In addition, Intel has disclosed that Continual Compute is only one of its technical reserves for the Metaverse. The company is also exploring how to reduce the computing requirements of immersive content such as VR through techniques like deep learning and neural networks, in order to meet users’ needs.
However, Intel also admits that over the next five to ten years, chip performance will still struggle to meet the needs of true Metaverse applications; real-time interaction, ultra-fine real-time rendering, and similar problems all pose huge challenges to performance, networking, and other infrastructure. Therefore, the first use of Continual Compute may be as another “cloud gaming” project; at least in today’s Minecraft-level “Metaverse” games, the technology has little use.