[Guide] At the Connect 2021 conference, Mark Zuckerberg renamed Facebook to Meta and went all-in on the metaverse. In many people's eyes, this is just a slide-deck metaverse. In fact, Meta has long been running the Codec Avatars project and can already create realistic virtual characters, right down to individual strands of hair.
Zuckerberg has even taken on the viral "lightsaber transformation" challenge.
You think he pulled it off? No, no, no!
Zuckerberg's version is not as cool as Tang Yixin's, but it is a bit funnier.
He hides the "sword" behind his right hand, then brandishes it with his left: a bottle of barbecue dipping sauce.
This is Zuckerberg's virtual avatar in the "metaverse," and the sword-drawing video has been widely reposted.
Since Facebook changed its name to Meta and officially entered the Metaverse, the popularity of the term “Metaverse” has not diminished.
The word "avatar" is frequently mentioned alongside "metaverse."
An avatar, simply put, is a person's image in the virtual world.
On December 10th, “Metaverse” pioneer Meta opened its own virtual world platform Horizon Worlds.
When users enter Horizon Worlds, they need to create a 3D virtual avatar.
With an avatar, users can play freely in this customized world.
What is slightly embarrassing, however, is that these avatars' facial expressions are very stiff.
So although today's 3D virtual avatars are barely usable, they are still miles away from the realistic avatars of science-fiction films such as Ready Player One.
The Codec Avatars project represents Meta’s efforts in the field of virtual avatars.
The "Metaverse" is not a slide deck; Meta is really building it
Zuckerberg's decision to rename Facebook was no joke.
Realistic virtual avatars are the key that opens the door to genuine interaction in the metaverse.
The Codec Avatars project aims to realize a system that can capture and represent realistic avatars for XR.
Based on 3D capture and AI technology, Codec Avatars allows people to quickly and easily build their own virtual avatars, and make the interaction in the virtual space as natural as in the real world.
The project started with a high-quality head-avatar demonstration and has gradually progressed to constructing full-body virtual avatars.
Beyond simply scanning a person's body, a major challenge is making it move in a realistic way, not to mention running the entire system in real time so the avatar can be used in an interactive environment.
At the Connect 2021 conference, researcher Yaser Sheikh demonstrated the team's latest achievement: full-body Codec Avatars.
He said that Codec Avatars now supports more complex eye movements, facial expressions, hands and body postures.
In addition, Meta also showed that the virtual avatar can achieve real rendering of hair and skin under different lighting conditions and environments.
Clothing can also be simulated in the metaverse: it stretches and wrinkles naturally as the body moves.
Beyond faithfully reproducing body details and clothing, there is also a high-fidelity virtual space.
Meta built a virtual apartment, where all the objects in the apartment are a mapping of the real world.
In this way, the user’s moving objects or interactions in the virtual space can be synchronized with the physical space.
For now, this is a "best case" scenario: the company built a mapped real-world environment specifically for the experiment.
Hand tracking is equally important inside the metaverse. Waving your arms around to interact, as you would in the physical world, is not only tiring but also unsightly.
To this end, Meta has been working on a more subtle and natural method, namely XR input through an EMG wristband.
Some time ago, Meta also unveiled a haptic glove, another effort toward new forms of interaction in the metaverse.
How are Codec Avatars built?
The Codec Avatars project traces back seven years.
In 2014, Yaser Sheikh, head of Panoptic Studio, a 3D-capture laboratory at Carnegie Mellon University's Robotics Institute, met Oculus chief scientist Michael Abrash, and the two hit it off.
In 2015, Yaser Sheikh joined Facebook (now Meta) and has led the Codec Avatars research team ever since.
"To create a realistic avatar, the foundation is measurement," said Thomas Simon, a research scientist on Codec Avatars.
"For an avatar to be indistinguishable from the real person, you need accurate data, which requires good measurement. The key to constructing a realistic avatar is therefore finding a way to measure the physical details of human expression, such as the way a person squints their eyes or wrinkles their nose."
The Codec Avatars team at the Pittsburgh Lab uses two main modules to measure human expressions: encoder and decoder.
First, the encoder uses cameras to capture what the subject is doing, then compresses the captured information into a compact code representing the state of the person's body and environment. This code can be transmitted wherever it is needed, whenever it is needed.
The decoder then converts the code back into a video signal.
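The encoder-decoder split described above can be sketched in miniature. This is a toy illustration of the codec idea only; the names `AvatarCode`, `encode`, and `decode` are invented here, and the real system uses deep neural networks over multi-camera input rather than simple pooling. The point is the shape of the pipeline: compress a frame into a small latent code that is cheap to transmit, then reconstruct a frame-sized signal on the receiving end.

```python
from dataclasses import dataclass

@dataclass
class AvatarCode:
    """Compact latent state of a person: a small vector instead of raw pixels."""
    latent: list

def encode(frame_pixels, dim=4):
    """Toy 'encoder': compress a raw frame into a fixed-size code."""
    # Naive pooling: average consecutive chunks of the frame into `dim` numbers.
    chunk = max(1, len(frame_pixels) // dim)
    latent = [sum(frame_pixels[i * chunk:(i + 1) * chunk]) / chunk
              for i in range(dim)]
    return AvatarCode(latent)

def decode(code, out_len):
    """Toy 'decoder': expand the code back into a frame-sized signal."""
    reps = out_len // len(code.latent)
    return [v for v in code.latent for _ in range(reps)]

frame = [float(i % 7) for i in range(16)]   # stand-in for camera pixels
code = encode(frame)                        # tiny code, cheap to transmit
recon = decode(code, len(frame))            # reconstructed on the receiver
print(len(code.latent), len(recon))         # 4 16
```

The payoff of this design is bandwidth: instead of streaming raw video between headsets, only the small code crosses the network, and each receiver renders the avatar locally.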
An ordinary smartphone camera can also shoot vivid video. Codec Avatars, however, combines massive amounts of physical data with complex software, and is far more complicated than it might seem.
The Codec Avatars project uses hundreds of high-resolution cameras to capture face details, and each camera captures data at a rate of 1 GB per second.
How much data is that?
Just three seconds of recording from these cameras is enough to fill a 512 GB disk.
The team records sessions of roughly 15 minutes, the limit of their current hardware, but collecting as much data as possible lays the foundation for creating the most realistic virtual avatars.
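These figures imply a camera count the article never states. A quick back-of-envelope check, taking the 1 GB/s per-camera rate and the 3-seconds-to-512-GB claim at face value (the derived camera count and session size are inferences, not figures from Meta):

```python
GB_PER_CAMERA_PER_SEC = 1   # per-camera capture rate stated in the article
DISK_GB = 512               # filled in three seconds, per the article
SECONDS = 3

# How many 1 GB/s cameras fill 512 GB in 3 seconds?
cameras = DISK_GB / (GB_PER_CAMERA_PER_SEC * SECONDS)
print(round(cameras))       # about 171 cameras

# At that rate, a 15-minute session produces:
session_gb = round(cameras) * GB_PER_CAMERA_PER_SEC * 15 * 60
print(f"{session_gb / 1024:.0f} TB")  # roughly 150 TB
```

That order of magnitude (hundreds of cameras, hundreds of terabytes per session) is why the team describes 15 minutes as the limit of its hardware.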
The Codec Avatars team is using this data to train the AI system so that AI can quickly build virtual avatars from videos.
Sheikh said: “We can now restore the 3D heads of people with exaggerated hairstyles, and even restore the heads of people wearing EEG capture caps.”
“This is one of the most advanced methods in the world that can automatically generate real digital portraits of humans. It provides a new way of virtual face-to-face communication, which can be widely promoted in the future.”
With the emergence of Codec Avatars, will people’s acceptance of “Metaverse” social interactions become higher and higher?
Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/meta-metaverse-avatar-project-exposure-xiaozha-lightsaber-turned-into-a-full-512gb-hard-drive-in-3-seconds/