NetEase Fuxi's Li Renjie: Virtual humans are the future, and the metaverse is a beneficial complement to the real world

The greatest value of the metaverse lies in being a beneficial complement to the real world: it can solve real-world problems, and the virtual world must be used to serve the real world, including but not limited to the education metaverse, the industrial metaverse, and the cultural metaverse.

In 2021, with the listing of Roblox, the first "metaverse stock", this vision of the future, derived from the science-fiction novel Snow Crash, gradually became widely known.

The metaverse has become the hottest track in investment circles, but building such a grand, high-freedom, highly imaginative virtual world requires solutions from many different fields working together: AI, content-creation engines, content ecosystems, network communication technology, the AR/VR ecosystem, and blockchain.

Many Internet companies have entered the metaverse. NetEase Fuxi wants to apply its experience combining games and artificial intelligence to a broader field: "We believe virtual humans are the future. The greatest value of the metaverse should be as a beneficial complement to the real world, and it must be used to solve real-world problems. The virtual world serves the real world, including but not limited to the education metaverse, the industrial metaverse, and the cultural metaverse."

The following is the live speech by Li Renjie, co-head of NetEase Fuxi:


I will mainly talk about three things today: first, the concept of the metaverse; second, our technological accumulation for the metaverse; and third, what NetEase Fuxi has done based on this concept.

On the Internet you can find many discussions of the metaverse, with widely varying definitions. I chose this picture because I think it is open enough and its ecosystem is diverse. It says hardware is the key that opens the door to the metaverse, but not whether that hardware is AR or VR; it mentions payment services, but not whether they are blockchain-based or whether they should be decentralized; it mentions content and assets, but does not insist on UGC. Because the metaverse today is a field where a hundred schools of thought contend and a hundred flowers bloom, we believe it must be diverse. As Professor Cai Weide said this morning, it does not necessarily have to use digital assets, nor AR or VR, nor user-created content. Microsoft, for example, does not insist that the content be its own.

How should the value of the metaverse be measured? We feel its greatest value should be as a beneficial complement to the real world: it can solve real-world problems, and the virtual world must be used to serve the real world, including but not limited to the education metaverse, the industrial metaverse, and the cultural metaverse.

Let me introduce what NetEase Fuxi does; perhaps the name is unfamiliar to you. Our predecessor was called Fuxi Lab, and as that name suggests, we do foundational, low-level research. What we did before was explore the combination of games and artificial intelligence, helping our developers reduce costs and increase efficiency, and helping our players improve their gaming experience.

After years of accumulation and exploration, Fuxi has done a great deal for games. We feel we have reached a stage where this ability to combine games and AI can be applied in a wider area: not only games, but also culture and entertainment, and even industry. So we entered our second stage, hoping to bring this immersive, interactive experience to every field and enrich everyone's spiritual life.


These are the six main directions we work on: reinforcement learning, natural language processing, visual intelligence, virtual humans, user profiling, and the computing platform that provides the underlying big-data and AI infrastructure for the other five research directions. We have done a great many things, but since time is limited today I will mainly introduce some of our explorations in the direction of virtual humans. We believe virtual humans are the future, a very important part not only of games but of the virtual world as a whole. To reduce costs and increase efficiency, every link in their production may be fully handled by AI in the future.

Speaking of virtual humans, perhaps the first thing that comes to mind is the hope that a virtual human can resemble yourself. I don't know how many gamers are here; I am an old game fan myself. In the early MMORPGs, players could not create their own characters. Later, as our domestic games developed, we could customize characters extensively in-game; I once fine-tuned my character across some 200 dimensions. But not everyone has good artistic sense. I, for example, could not create a character that looked much like me, which gave rise to the profession of "face-pinching teacher": for 50 yuan a session, someone will sculpt an avatar for you. How do we apply artificial intelligence here? The user only needs to upload a single 2D photo to generate a 3D avatar. We have implemented smart face creation in many games, and recently cooperated with Naraka: Bladepoint. Some players want an avatar that looks like themselves; some want one that looks better than themselves. After many iterations, a player now only needs to upload one photo to get their in-game avatar.
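The talk does not describe Fuxi's actual face-generation pipeline, but a common approach to "photo in, 3D face parameters out" is to fit a linear face model (a 3DMM-style basis) to landmarks detected in the photo. The sketch below is a hypothetical minimal version of that fitting step: the model names, dimensions, and ridge regularization are illustrative assumptions, not Fuxi's method.

```python
import numpy as np

def fit_face_parameters(landmarks_2d, mean_shape, basis, reg=1e-8):
    """Fit per-identity face parameters from detected 2D landmarks.

    Assumes a hypothetical linear face model:
        landmarks_2d ~= mean_shape + basis @ params
    where landmarks_2d and mean_shape are flattened (2*K,) vectors of
    K landmark coordinates and basis is (2*K, D). Solves a
    ridge-regularized least-squares problem for the D parameters.
    """
    A = basis.T @ basis + reg * np.eye(basis.shape[1])
    b = basis.T @ (landmarks_2d - mean_shape)
    return np.linalg.solve(A, b)

# Toy check: recover known parameters from synthetic landmarks.
rng = np.random.default_rng(0)
K, D = 68, 50                       # 68 landmarks, 50 shape dimensions
basis = rng.normal(size=(2 * K, D))
mean_shape = rng.normal(size=2 * K)
true_params = rng.normal(size=D)
landmarks = mean_shape + basis @ true_params

est = fit_face_parameters(landmarks, mean_shape, basis)
print(np.allclose(est, true_params, atol=1e-4))
```

A production system would first detect the landmarks with a trained network and also estimate camera pose and texture; this sketch covers only the geometric fitting.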

What is missing once you have a face? Hairstyles, which we also generate now. I, for example, have given up on hairstyles in real life, where things may only be getting worse, but I still hope my hairstyle in the virtual world is better. The same goes for generating your body and clothing. We can also scan a scene with a drone to generate a three-dimensional model of it.

Once you have your model, the next step is making it speak and move. There are currently three ways to animate a virtual human. The first is the film-grade production everyone has seen: actors wear motion-capture suits, and their performance is transferred onto the model. The second is camera-based: you perform actions in front of a camera, and expressions can be captured as well. But both of these require a human performer behind the avatar. If virtual humans are to develop, even to thousands of virtual humans across thousands of metaverses, it is impossible for every virtual human to be performed by a person. The only way to achieve that scale is through AI: given language and text, AI can generate the animation, expressions, and so on.

The same goes for expressions and actions: given only text, the virtual human can speak it by itself, and the player can adjust its emotion, for example making it happy or sad. When such a virtual human performs automatically and acts as an NPC in a game, the cost drops a great deal.
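The speech only states that text plus an adjustable emotion drives the performance. One plausible shape for such an interface is sketched below; the emotion presets, blendshape names, and frame structure are all hypothetical stand-ins for whatever Fuxi's real text-to-animation system produces.

```python
from dataclasses import dataclass

# Hypothetical emotion presets: each emotion biases facial blendshape weights.
EMOTION_PRESETS = {
    "happy":   {"mouth_smile": 0.8, "brow_raise": 0.3},
    "sad":     {"mouth_frown": 0.7, "brow_lower": 0.5},
    "neutral": {},
}

@dataclass
class Frame:
    phoneme: str
    blendshapes: dict

def text_to_animation(text, emotion="neutral"):
    """Rough sketch: emit one frame per character as a stand-in for a real
    phoneme/viseme sequence, blending a lip-sync weight with an emotion
    preset. A real system would run TTS, align phonemes to audio, and
    predict full blendshape curves with a trained model."""
    preset = EMOTION_PRESETS[emotion]
    frames = []
    for ch in text.replace(" ", ""):
        shapes = {"jaw_open": 0.5}   # placeholder lip-sync weight
        shapes.update(preset)        # emotion bias applied to every frame
        frames.append(Frame(phoneme=ch, blendshapes=shapes))
    return frames

frames = text_to_animation("hello", emotion="happy")
print(len(frames), frames[0].blendshapes["mouth_smile"])
```

The key design point the talk implies is that emotion is a separate, player-adjustable input rather than something baked into the text, which is why it enters here as an independent parameter.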

Once we have an avatar that can speak and move, how do we string these things together? We are now experimenting with a song in which everything, from writing the lyrics to the arrangement to the singing and the movements, is done automatically by AI. The song is called "Wake Up", released at our NetEase conference last year. Traditionally it takes at least two months to produce a song, but with AI it can be done in one minute. Beyond expressions, movement, and speech, we hope the virtual human will be smarter still, which is where our research in reinforcement learning comes in.

In games you can play against NPCs. This uses reinforcement learning: the agent cannot play at all at first, but slowly learns, and eventually even learns to feint.
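The "can't play at first, slowly learns" behavior is characteristic of trial-and-error training. As a minimal illustration of the idea (not Fuxi's game AI, which would use deep RL at far larger scale), here is tabular Q-learning on a toy task where an NPC learns to move along a track toward a goal:

```python
import random

# Minimal tabular Q-learning sketch: an NPC on a 1-D track learns to move
# right toward a goal state, improving from random play via reward feedback.
N_STATES, GOAL = 6, 5
ACTIONS = [-1, +1]        # move left / move right

def step(state, action):
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

Q = [[0.0, 0.0] for _ in range(N_STATES)]
random.seed(0)
for episode in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit the current Q-table, explore 10%
        if random.random() < 0.1:
            a = random.randrange(2)
        else:
            a = max(range(2), key=lambda i: Q[s][i])
        nxt, r, done = step(s, ACTIONS[a])
        # standard Q-learning update (learning rate 0.5, discount 0.9)
        Q[s][a] += 0.5 * (r + 0.9 * max(Q[nxt]) - Q[s][a])
        s = nxt

# After training, the greedy policy moves right from every pre-goal state.
policy = [max(range(2), key=lambda i: Q[s][i]) for s in range(N_STATES)]
print(policy[:GOAL])
```

An NPC that "learns to feint" would come from the same loop run in self-play against other agents, where deceptive moves emerge because they earn reward.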

That was some of our exploration and research on virtual humans. Once we have the people, we still hope to have scenes, so we built a product called "Yaotai", with which we hope everyone can hold meetings or events in the virtual world. Its background is the sudden outbreak of the epidemic last year. After the epidemic arrived, many of you may have worked from home for a month or two and used many online conference systems. We found that although these systems bring a lot of efficiency, much of the sense of ritual, immersion, and sociality that offline meetings provide cannot be replaced.

For example, today's 36Kr conference could in fact have been held online, so why wasn't it? First, we need a sense of ceremony: 36Kr invites everyone to attend, and attendees may even hope to announce something at the end. We also need social interaction. For example, I just met the creator of Liu Yexi sitting next to me; I very much wanted to know him, so we added each other as friends. It is rare to have such an experience in an online meeting, so we wondered whether we could do something to enrich traditional online meetings.

So in October last year, when the metaverse concept was not yet so popular, we made a first attempt, just to verify our ideas. Unexpectedly, although it was a conference held by scholars, everyone had a good time, which strengthened our confidence that there is a lot to do with an immersive event system. Here is a group photo taken after that meeting. It is a bit blurry; everyone in it used AI face creation for their avatars. This is the difference from the traditional online meeting I just mentioned.


Based on that, we began to think further and rebuilt the entire conference system; it is now completely different from what you saw in October last year. For example, we added many brand-new participation experiences: you can have your own exclusive character, and we added motion transfer and expression transfer, so actions you perform in front of the camera are carried onto your avatar. It is unrealistic for everyone to download a client of dozens of gigabytes just for a meeting, so we use cloud-gaming technology to make the system available on demand, without installation or noticeable delay.

We also built a virtual-human tutorial for newcomers and teleportation within the venue. Imitating offline behavior, when two people come together the camera turns on automatically and your avatars appear. There is translation between languages, and for conference organizers there are tools such as speaker review and feedback, and even a lottery system. We have also made many visual styles, such as ancient, fantasy, realistic, and futuristic exhibition styles. We are constantly improving this immersive event system, hoping it can be used in the future not only for conferences but also for exhibitions. As more resources are added, individuals will be able to mix and match them freely to create their own event scenes: you could arrange a family gathering or even stage your own murder-mystery game.

Because time is limited, that is all I will introduce today. If you are interested, you can search for us or contact us. Thank you.

Posted by: CoinYuppie. Reprinted with attribution to:
