“I suspect the kissing sensation reproduced by a VR headset will just be an unbearable feeling of loneliness.”
Researchers at Carnegie Mellon University have developed a hardware module for the Meta Quest 2 headset that lets users experience the tactile sensations of drinking water, brushing their teeth, and kissing in VR.
At the height of the Metaverse craze, skeptics were already pouring cold water on it: “For all this talk of long-term immersive virtual experiences, can the VR world actually reproduce a kiss or a hug?”
Just recently, researchers at Carnegie Mellon University reproduced the “kiss” in the Metaverse.
Recreate the feel of lips and tongue in the Metaverse
Aside from the occasional vibrating handheld controller, most common VR devices available to consumers ignore senses like taste, smell, and touch, focusing instead on sight and sound.
That’s enough to make the VR experience more engaging than it was decades ago, but not enough to truly trick the user’s brain into believing that what the eyes see is real.
Over the years, researchers have worked to develop new VR devices and improve old ones so that virtual reality feels as real as it looks.
The most striking example comes from Carnegie Mellon University’s Human-Computer Interaction Institute, specifically its Future Interfaces Group (FIG), which has developed a technology that lets users feel touch on the lips, teeth, and tongue in virtual reality.
The tactile fidelity may not yet be good enough to let you pet a virtual cat or dog and feel realistic fur under your hands.
But the tactile experience of drinking from a virtual water dispenser is now really reproducible.
Of course, you can also feel the breeze brushing past your lips as you ride.
Or the spray of debris hitting your lips and teeth when you smash a crate…
That one… hurts just to hear about…
Of course, it would be a waste to use such a promising new technology only to reproduce tooth-brushing and water-drinking in VR.
The first thing anyone thinks of, researchers included, is using this technology to reproduce the sweet touch of a kiss in VR.
The device works by using thin, lightweight ultrasonic transducers in the add-on module to aim ultrasonic pulses at points in and around the user’s mouth, concentrating acoustic pressure at small, precisely targeted spots.
In doing so, it creates the sensation of touch on the user’s lips and teeth, and even on the tongue when the mouth is open.
This allows more than a simple touch on the lips and teeth: by changing the hardware control code, the ultrasonic pulses can be programmed into various specific spatiotemporal patterns.
This makes it possible to recreate the sensation of an object sliding over or swiping across your lips, or the steady vibration of a virtual stream of water splashing into your mouth as you lean over a virtual water fountain to drink.
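As a rough illustration (this is not the researchers’ actual code, and every name below is hypothetical), such a pattern can be thought of as a time-varying focal-point target. A “swipe across the lips” is simply a focal point that sweeps sideways while its intensity is amplitude-modulated down into the tactile frequency range, since an unmodulated 40 kHz carrier is far too fast for skin to perceive.

```python
import math

def swipe_effect(t, duration=0.5, width_cm=4.0, am_hz=145.0):
    """Hypothetical sketch: describe a 'swipe across the lips' effect
    as a time-varying focal-point target.

    Returns (x_cm, y_cm, z_cm, amplitude) at time t seconds. The focal
    point sweeps left to right across the lip line, and its intensity
    is amplitude-modulated at am_hz so the skin can feel it.
    """
    progress = min(max(t / duration, 0.0), 1.0)
    x = -width_cm / 2 + width_cm * progress   # sweep across the mouth
    y, z = 0.0, 0.0                           # stay on the lip line
    amplitude = 0.5 * (1 + math.sin(2 * math.pi * am_hz * t))
    return (x, y, z, amplitude)

# sample the sweep at a handful of time points
trajectory = [swipe_effect(i * 0.1) for i in range(6)]
```

In practice, the modulation would be applied on the driver side; the sketch only shows how an effect can be reduced to a stream of focal-point targets.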
Naturally, that also includes the various tactile interactions of lips and tongues during a kiss.
Many people have already seen the business opportunity: if FIG’s technology is commercialized, the world’s first Metaverse hardware device that can reproduce kissing remotely will be born.
Pair it with other hardware and technologies and, ahem, the possibilities are unimaginable: abundant potential, unlimited business opportunities, a whole blue ocean…
And it’s not just this middle-aged editor rubbing his hands and chuckling at the thought; netizens have had the same idea.
“Coming soon in XXX version.”
But the most poignant was this netizen’s question:
“This technology is amazing, but for those of us who have been single forever and haven’t kissed a real person in eight or ten years, can anyone describe what kissing actually feels like, in words we can understand?”
Another user put it even more sadly: “I suspect the kissing sensation reproduced by a VR headset will just be an unbearable feeling of loneliness.”
However, wear it while playing a horror game and the experience may be a bit too intense.
This device lets you experience, simultaneously, the sight and touch of a giant spider lunging at your face, its mucus dripping down your face and into your mouth.
One woman in the experiment was just walking along when a “facehugger” lunged at her without warning and frantically “interacted” with her lips…
Another tester said the effect was so real that he found himself slapping wildly at his own face in reality, trying to swat the spider away…
Previous studies often relied on real physical contact to “reproduce” touch in VR scenes.
For example, poking a robotic arm directly at your mouth, or using a heating wire to generate warmth and a fan to simulate airflow.
Obviously, these methods are clumsy and suffer from high latency: you might have walked several steps before the robotic arm finishes a single motion.
CMU researchers took a different approach.
On the bottom of the VR headset, they mounted a custom PCB carrying 64 ultrasonic transducers operating at 40 kHz, angled at 45° toward the user’s mouth.
The full array measures 17.9 × 10.6 × 1.5 cm and weighs 107 g. That is still relatively thin and light overall, but it adds 15.8% to the Oculus Quest 2’s 676 g mass, and its 15 mm thickness increases the device’s overall height by about 16.5%.
Next, the researchers connected the array via ribbon cables to an Ultraino driver board, upgraded with IX4427N driver ICs, paired it with an Arduino Mega, and flashed the Ultraino DriverMEGA firmware.
This allows all 64 transducers to be controlled independently with a time-step resolution of 2.5 µs, equivalent to a phase resolution of π/5 at 40 kHz, and a focal-node positioning accuracy of 0.9 mm.
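The phased-array math behind those numbers is straightforward. The sketch below (a simplified illustration, not the team’s firmware) computes per-transducer firing delays so that all 40 kHz wavefronts arrive at a chosen focal point in phase, then quantizes each delay to the 2.5 µs time step, which yields exactly 10 steps per 25 µs carrier period, i.e. π/5 of phase.

```python
import math

C = 343.0            # speed of sound in air, m/s
F = 40_000.0         # transducer carrier frequency, Hz
PERIOD = 1.0 / F     # 25 us carrier period
TIME_STEP = 2.5e-6   # 2.5 us driver resolution: 10 steps per period

def firing_delays(transducer_positions, focus):
    """Illustrative phased-array focusing (positions in meters).

    Each transducer fires earlier by its extra travel time relative to
    the farthest one, so all wavefronts arrive at `focus` in phase.
    Delays are wrapped into one carrier period and quantized to the
    2.5 us time step, giving integer steps in 0..9.
    """
    dists = [math.dist(p, focus) for p in transducer_positions]
    d_max = max(dists)
    delays = [(d_max - d) / C for d in dists]            # seconds
    return [round((dt % PERIOD) / TIME_STEP) % 10 for dt in delays]

# toy 2x2 patch of the array, 1 cm pitch, focusing 5 cm in front of it
positions = [(x * 0.01, y * 0.01, 0.0) for x in range(2) for y in range(2)]
steps = firing_delays(positions, (0.0, 0.0, 0.05))
```

With a focus directly above one corner, the farthest transducers fire first and the nearest fires last, producing a small spread of step indices.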
On the software side, the researchers chose to develop in Unity.
Using the known positions of the virtual object and the headset, together with the geometry of the ultrasound array, the software can place haptic effects at the required three-dimensional positions. The computed “transmission scheme” is then passed to the Arduino at 10 Hz.
Although this data travels over USB, it still arrives an average of 2 ms ahead of the visuals rendered on the Oculus Quest 2, which is close enough in time to be considered synchronous.
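To make that pipeline concrete, here is a hedged sketch of what packing and streaming such a “transmission scheme” frame might look like. The byte layout, sync byte, and checksum below are invented purely for illustration; the actual Ultraino wire format is not described in the article.

```python
import struct
import time

SYNC = b"\xAA"   # hypothetical sync byte, not the real protocol

def pack_frame(phase_steps):
    """Pack one transmission-scheme frame: 64 phase-step indices
    (0..9), one byte each, with a sync byte in front and a simple
    checksum at the end. Layout invented for illustration only."""
    assert len(phase_steps) == 64 and all(0 <= s <= 9 for s in phase_steps)
    payload = bytes(phase_steps)
    checksum = sum(payload) & 0xFF
    return SYNC + payload + struct.pack("B", checksum)

def stream(frames, send, rate_hz=10.0):
    """Push frames to the driver at a fixed cadence (10 Hz in the
    article). `send` would be a serial write in practice; here it is
    just any callable taking bytes."""
    for frame in frames:
        send(pack_frame(frame))
        time.sleep(1.0 / rate_hz)
```

At 66 bytes per frame and 10 frames per second, the bandwidth is trivial for USB; the interesting constraint is the timing budget relative to the rendered visuals.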
However, the current plan is not perfect.
The researchers can currently generate haptics only for the object closest to the lips; multi-point haptics are not yet supported.
The good news is that users usually bring only one object to their mouth at a time (e.g., a cup to drink from).
(Figure: haptic parameters used in each scene)
The result, in a nutshell: a big hit!
Perhaps this result isn’t all that surprising; after all, haptics improve realism and immersion.
In particular, haptics can provide VR event information beyond the user’s field of vision, which greatly enhances the user’s experience.
Of course, the researchers didn’t stop there.
A next-generation upgrade is already in development.
Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/kissing-in-the-metaverse-cmu-launches-vr-headset-plug-in-replicating-the-lifelike-touch-of-lips/