Zuckerberg also took up the “lightsaber dress-up” challenge.
Think he pulled it off? No, no, no!
Zuckerberg is not quite as cool as actress Tang Yixin, but he is a bit funnier.
He hides the saber behind his right hand, then reveals the “weapon” in his left: a bottle of BBQ dipping sauce.
This is Zuckerberg’s virtual avatar in the Metaverse, and the saber-reveal video has been widely shared.
Since Facebook changed its name to Meta and officially entered the Metaverse, the popularity of the word “Metaverse” has not diminished.
And a term frequently mentioned alongside the Metaverse is “avatar”.
An avatar, in the popular sense, is a person’s image in the virtual world.
On December 10, the “Metaverse” pioneer Meta opened its own virtual world platform Horizon Worlds.
When users enter Horizon Worlds, they need to create a 3D avatar.
With avatars, users can have fun in this customized world.
However, it is a little embarrassing that the expressions of these avatars are very stiff.
So although today’s 3D avatars are barely usable, they are still worlds away from the realistic avatars of science-fiction films such as Ready Player One.
The Codec Avatars project represents Meta’s efforts in the field of virtual avatars.
The Metaverse is not a slide deck; Meta is really building it
Zuckerberg’s decision to rename Facebook to Meta was no joke.
To achieve real interaction in the Metaverse, the virtual avatar is the key that unlocks its door.
The Codec Avatars project aims to implement a system capable of capturing and representing realistic avatars for XR.
Based on 3D capture and AI technology, Codec Avatars allow people to quickly and easily build their own virtual avatars and make interactions in virtual spaces as natural as in the real world.
Initially the project started with a high-quality avatar demo, and gradually moved towards building full-body virtual avatars.
Beyond simply scanning a person’s body, a major challenge is getting it to move in a realistic way — not to mention getting the entire system to run in real-time so the avatar can be used in an interactive environment.
At Connect 2021, researcher Yaser Sheikh presented the team’s latest achievement, Full-body Codec Avatars.
He said Codec Avatars now support more complex eye movements, facial expressions, hand and body poses.
In addition, Meta showed virtual avatars achieving realistic rendering of hair and skin under different lighting conditions and environments.
Clothing, too, can be realistically simulated in the Metaverse: fabric stretches elastically when you pull it and wrinkles when you shake it.
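The elastic behavior described above is commonly modeled with mass-spring systems, a classic building block of cloth simulation. The sketch below is purely illustrative (the article does not say which method Meta uses): two particles joined by a single spring, which pulls them back when stretched.

```python
import numpy as np

# Toy mass-spring step, the classic building block of cloth simulation.
# Illustrative only: the article does not say which method Meta uses.
def spring_step(x1, x2, v1, v2, rest=1.0, k=50.0, dt=0.01):
    """Advance two spring-connected particles by one explicit-Euler step."""
    d = x2 - x1
    length = np.linalg.norm(d)
    force = k * (length - rest) * d / length   # Hooke's law along the spring
    v1, v2 = v1 + force * dt, v2 - force * dt  # unit mass for both particles
    return x1 + v1 * dt, x2 + v2 * dt, v1, v2

# Start stretched 50% past the rest length and let the spring relax.
x1, x2 = np.array([0.0, 0.0]), np.array([1.5, 0.0])
v1, v2 = np.zeros(2), np.zeros(2)
for _ in range(10):
    x1, x2, v1, v2 = spring_step(x1, x2, v1, v2)
print(np.linalg.norm(x2 - x1))  # shrinking back toward the rest length 1.0
```

A real cloth simulator connects thousands of such particles in a grid of structural, shear, and bend springs; the wrinkling comes from the same restoring forces acting across the whole mesh.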
Body details and clothing can be faithfully reproduced, and so can a high-fidelity virtual space.
Meta built a virtual apartment in which everything mirrors its real-world counterpart.
In this way, the user’s movement of objects or interactions in the virtual space can be synchronized with the physical space.
For now, this seems to be the best-case scenario for building a mapped real-world environment for companies to experiment with.
Once inside the Metaverse, hand tracking is just as important: interacting by swinging your arms around in the physical world is both tiring and awkward.
To that end, Meta has been working on a more subtle and natural approach, XR input via an EMG wristband.
Some time ago, Meta also unveiled a haptic glove, all part of the effort to enable new ways of interacting in the Metaverse.
How are Codec Avatars built?
Meta started work on Codec Avatars seven years ago.
In 2014, Yaser Sheikh, head of Panoptic Studio, a 3D capture lab at Carnegie Mellon University’s Robotics Institute, met Oculus chief scientist Michael Abrash, and the two hit it off immediately.
Yaser Sheikh joined Facebook (now Meta) in 2015 and has been leading the Codec Avatars research team ever since.
To create a lifelike avatar, the foundation lies in measurement, says Codec Avatars research scientist Thomas Simon.
An avatar needs accurate data to look indistinguishable from the real person, and that requires good measurements. So the key to building a realistic avatar is finding a way to measure the physical details of human expression, such as the way a person squints their eyes or wrinkles their nose.
The Codec Avatars team at the Pittsburgh lab uses two main modules to measure human expressions: an encoder and a decoder.
First, the encoder uses a camera to capture what the subject is doing, and then assembles the captured information into a unique code that represents a person’s physical and environmental state, which can be sent wherever needed at any time.
The decoder then converts the code into a video signal.
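The encoder/decoder split described above can be illustrated with a deliberately tiny stand-in. In this sketch, "encoding" is just block averaging, whereas the real system learns a compact neural code; but the shape of the pipeline is the same: compress a captured frame into a small code, transmit it, and reconstruct on the far side. The function names here are hypothetical.

```python
import numpy as np

# Toy illustration of the encoder/decoder pipeline described above -- NOT
# Meta's actual model. "Encoding" here is simple block averaging; the real
# system learns a compact latent code with neural networks.
def encode(frame: np.ndarray, factor: int = 4) -> np.ndarray:
    """Compress a frame into a much smaller code (here: 4x4 block means)."""
    h, w = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def decode(code: np.ndarray, factor: int = 4) -> np.ndarray:
    """Reconstruct a full-size frame from the transmitted code."""
    return np.kron(code, np.ones((factor, factor)))

frame = np.random.rand(64, 64)   # stand-in for one captured camera frame
code = encode(frame)             # 16x16 code: 16x fewer values to transmit
recon = decode(code)             # approximate reconstruction at the far end
print(code.size / frame.size)    # -> 0.0625, the compression ratio
```

The point of the code is that it is small enough to send "wherever needed at any time", as the article puts it; the decoder on the receiving side does the heavy lifting of turning it back into imagery.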
A regular smartphone camera can also capture vivid video, but Codec Avatars combine massive amounts of physical data with complex software, and the problem is far more complicated than it might seem.
The Codec Avatars project captures face details using hundreds of high-resolution cameras, each of which captures data at 1 GB per second.
How much data is that?
For example, three seconds of data recorded by these cameras is enough to fill a 512GB disk.
The Codec Avatars team records for about 15 minutes at a stretch, the limit of its existing hardware; capturing as much data as possible lays the foundation for the most realistic virtual avatars.
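A quick back-of-the-envelope calculation, using only the figures quoted above and assuming each camera really sustains 1 GB per second, makes these numbers concrete:

```python
# Back-of-the-envelope check of the capture figures quoted above.
# Assumption: each camera sustains exactly 1 GB/s, as the article states.
PER_CAMERA_GBPS = 1.0   # GB per second, per camera
DISK_GB = 512.0         # disk filled by three seconds of footage
SECONDS = 3.0

aggregate_gbps = DISK_GB / SECONDS            # total data rate, all cameras
cameras = aggregate_gbps / PER_CAMERA_GBPS    # implied camera count

session_seconds = 15 * 60                     # the ~15-minute hardware limit
session_tb = aggregate_gbps * session_seconds / 1024  # using 1 TB = 1024 GB
print(f"~{cameras:.0f} cameras, ~{session_tb:.0f} TB per session")
```

The implied count of roughly 171 cameras is consistent with the article’s description of “hundreds of high-resolution cameras”, and a single 15-minute session works out to around 150 TB of raw capture.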
The Codec Avatars team is using this data to train AI systems that can quickly build virtual avatars from videos.
Sheikh said: “We can now reconstruct 3D avatars of people with exaggerated hairstyles, and even avatars of people wearing EEG caps.”
This is one of the most advanced methods in the world capable of automatically generating realistic digital portraits of humans. It offers a new way of virtual face-to-face communication that could be widely rolled out in the future.
With the emergence of Codec Avatars, will people become more accepting of socializing in the Metaverse?
Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/metaverse-is-not-ppt-it-has-developed-to-this-point/