NVIDIA GTC Symposium: How to Build an Industrial Metaverse

Omniverse Replicator for autonomous driving.

Nvidia’s Omniverse is a precursor to an open Metaverse, an interconnected universe of virtual worlds like those depicted in the novels Snow Crash and Ready Player One. And it’s rapidly moving beyond science fiction into industrial applications.

That’s because the Metaverse is no longer just about fun and games. Businesses are using Omniverse’s open platform to simulate the digital world and then build something in the real one. Creating “digital twins” of BMW factories, climate models, or even the entire planet gives the Metaverse true meaning in enterprise and other non-gaming applications.

This was my third time at Nvidia’s GTC event, and I hosted a panel on Omniverse to discuss the progress companies are making in building 3D assets, simulating engineering, and sharing across company product lines. Our group assessed how much progress has been made and how much the panelists expect in one, five, or 10 years.

Panelists included Rev Lebaredian, head of Omniverse and simulation technologies at Nvidia; Virginie Maillard, global head of research for simulation and digital twin products at Siemens; Amy Bunszel, executive vice president of architecture, engineering and construction design solutions at Autodesk; Timoni West, vice president of augmented and virtual reality at Unity; and Lori Hufford, vice president of engineering collaboration at Bentley Systems.

The following is an edited transcript of the panel discussion.

VentureBeat: Let’s start with your brief view of the Metaverse and your company’s place in it. Virginie, can you open for us?

Maillard: For Siemens, the industrial Metaverse has existed for some time in the form of digital twins and digital threads. We have applied it to factories, transportation, buildings, and smart cities. It runs through the whole life cycle of design, production, and operation. Our industrial engineering software already provides key building blocks for the industrial Metaverse, including immersive user experiences with AR and VR capabilities. For example, we can already provide this functionality in Simcenter or Process Simulate.

In the future, we believe that the industrial Metaverse will expand as multiple technologies become available and are brought together to provide more realistic immersion, real-time feedback, and more openness and interoperability between tools. We believe these new capabilities will allow our customers to expand their experience, improve their interaction with the digital world, enable collaboration between different people in the digital world, and combine the physical and the digital to test different real-world scenarios more easily and quickly. For Siemens, the industrial Metaverse will be an evolution of what we already offer to enable better digital twins and digital threads.

Nvidia CEO Jensen Huang introduced Omniverse Avatar.

Bunszel: One of the things that excites me is how it will remove barriers to collaboration and drive innovation, while also helping us operate all of our assets more efficiently. That includes retrofits and demolitions, which are very important for sustainability.

I feel that, on the one hand, the industrial Metaverse is capable of creating digital twins of places, processes, and real-world objects, but I also want to talk about the Metaverse as a rich context for new designs. If you think about it, our clients can create a Metaverse that helps them visualize, simulate, and analyze physical and functional performance before building anything. That’s good for buildings and infrastructure, because it allows them, before anything is built, to engage broadly with stakeholders around the world and explore more ideas, from the crazy to the practical.

Another advantage is that the Metaverse will enable our customers to obtain data from any provider. The Metaverse is inherently open, enabling our clients to aggregate data from anywhere. I think we all know that people use tools from multiple vendors, and pulling everything together can sometimes be a challenge.

As for the role of Autodesk, we’re not building the Metaverse. Our solutions feed the Metaverse. We’re looking for ways to help our clients unlock the value of the data they create in our solutions, enabling them to easily bring that data to the Metaverse and explore all the possibilities there.

West: My take is a little different. I tend to think of the Metaverse as a metaphor, like the way we talked about the information superhighway in the ’90s. Rather than being something that will exist in the future, it’s a way of describing what I think is a more fundamental and interesting shift: how computing has changed over the past decade. Computers went from being networked and portable, which was the huge change we saw around 2010, to now having a plethora of miniaturized, high-fidelity sensors and machine learning. This gives computers the ability to really understand things: scanning a space, recognizing objects, recognizing faces, recognizing sounds, and treating all of that as input to the system.

When we discuss the Metaverse, we are talking about applications of contextual computing. Computers now have context through things like sensors and machine learning. The Metaverse refers to what people will do in this new wave of technology. That’s how I tend to think about it. Having real, real-world data, whether live or recorded, that can be simulated and virtualized allows us to have highly realistic simulations and real-time updates, for example a real-time 3D mirror of the world. This brings us to some very interesting industry use cases.

As for Unity’s place in that, we love getting all this data from all of our partners, from the real world itself, directly from the device, so that our developers and our community can make apps that make sense of that data. This is where we are in the ecosystem.

Hufford: Everyone has some great insights into the Metaverse. Bentley is an infrastructure engineering software company. So what does the Metaverse mean for infrastructure? Why do we need it?

How to balance our relationship with the natural world is a pressing question. Infrastructure plays a key role in this. We believe that the intersection of the digital twin, the Metaverse, and the real world is the perfect place to invest so teams can work better together and leverage talent from anywhere to solve these problems.

A digital twin of infrastructure is a digital representation of the physical world that is constantly updated with information from the physical world. It must contain rich engineering data and must be open, so that data can be brought in from any source. It must also be able to track and visualize changes in real-world conditions, fed by sensors, IoT devices, and sources such as drones.

But one key difference from entertainment is that engineering precision and the laws of physics don’t necessarily apply in entertainment, because bending them is part of the fun; in infrastructure they must apply. Digital twins must provide millimeter-level accuracy, geospatial alignment, and support for complex engineering schemas.

Where does the Metaverse come in? In designing, building, and operating the world’s infrastructure, the Metaverse allows people to teleport into or immerse themselves in these digital twins, or augment the real world with information provided by the digital twins.

Lebaredian: We got some really interesting responses. I agree with everything I’ve heard. It’s funny, because this big idea is still amorphous. There are many views on what the Metaverse is. It’s a bit like 1993, when the first web browser, Mosaic, came along. People asked, “What is the internet? What is the web?” When it first came out, it was hard to answer that question.

Omniverse Showroom is a new feature for viewing digital assets.

From our perspective, whatever the Metaverse ends up being called, it’s clear that virtual worlds, our world in digital form, are a key part of it, and that how humans interact with these virtual worlds, our interface to them, will be more like the natural human experience before computers. Until now we have used computers through two-dimensional, abstract interfaces. With three-dimensional spatial technology overlaying all of our information, we can now experience it the same way we have always experienced the world around us.

This opens up many opportunities. For the most part, discussions about the Metaverse have focused on entertainment, social experiences, media, and especially VR, which is just one mode of experiencing it. But it’s actually much bigger than that. Our panel discussion here is interesting because it offers a very different perspective, which in my opinion is a more direct and important use of the Metaverse and all of these technologies for humanity.

As for Nvidia’s place in this, we think the Metaverse is like the internet: bigger than any one company. To achieve our goals, we must all contribute and build it together. What Nvidia can do is leverage our core competencies, our skills and passions, which are computing and the computing stack, to power the simulation of these virtual worlds.

To do this, the first thing we need is a way to describe these virtual worlds that everyone agrees on, including how they work, so that we can exchange these worlds between tools and simulators. Then the best tools and the best simulators, the best toolsets for every job, can work together in harmony. That’s our focus.

VentureBeat: The first thing we’re going to talk about is how to get to the Metaverse. Where have we started? The first wave of Metaverse development came from game developers and content creators like Roblox. Rev, I was wondering if you could tell us how we’ve started from that perspective.

Lebaredian: I think if you want to know where technology is going in the future, you should keep your eyes on games. What our kids and teens are doing today is what adults will be doing 10 or 15 years from now. In recent decades, all the interesting things in technology appeared first in games. Voice over IP, social media, digital currencies: all of these were concepts in the gaming world before they became mainstream. Simulation of 3D worlds has existed in video games for over 20 years. Now we’re using it for many non-gaming purposes.

But we can start by looking at how these players interact and how they use these tools; they’re the canaries in the coal mine for what’s to come. Then we can apply those lessons to non-gaming, non-entertainment purposes.

West: It’s interesting. It turns out that every game needs a game system, and game designers create those systems. I once read a great article explaining the different roles people play in a game studio. Game designers ask, “When we walk to a door, does the door open? Do I have to press a button? Does it magically disappear? Can I pull it? Can I just walk right through it?”

Every game engine needs to let you describe, in real time, whatever the user is doing in the system. The weird thing is, it turns out that when you want to do a real-world simulation, or when you want to feed in simulations that mirror real-world situations, you basically need what we used to call game design but are now starting to call world design, to reflect the fact that it absorbs real-world data.

This is somewhat accidental. I don’t think there was a long-term plan around it, although there have always been very interesting futurists in the game design world, and I believe some of them predicted this long ago. But the reality is that everything you need to design a very complex game world, realistic or not, can be used to mirror real-world information, or to simulate and predict it. We’ve arrived at this really interesting moment: the tools are here, just as computers capable of powering them are coming online.

This GTC featured a lot about robots.

VentureBeat: Timoni, can you talk about how gaming technology is leading the industry to the Metaverse? I happened to talk to Meta’s Jason Rubin recently, and he said he doesn’t think the Metaverse can be built without a game engine. It seems like a foundational tool. Do you have your own view on how we get to the industrial Metaverse?

West: I think he’s right. We’ll see an evolution in how these tools are used. For example, back in the ’70s, the killer application for the first commercial microcomputers was the spreadsheet. But spreadsheets were used in a very specific way: for finance and inventory. Now you see spreadsheets used everywhere. Airtable is a great example of how spreadsheets have become more like general data manipulators for people who understand how they want to work with their data.

Likewise, game engines were made for gaming, but they are already starting to be used for education, scientific visualization, and simulating datasets to train self-driving-car AI. As Lori said, the fidelity of sensor data keeps increasing. The truth is, to reach industrial quality you have to have a level of precision that video games don’t need. You need to be able to ingest and react to information faster. You need better, higher-fidelity visuals.

Our computers are fast enough and our technology advanced enough that we can start thinking about game engines beyond games. We are in a phase where we are actively developing tools to support these kinds of use cases.

VentureBeat: Now, let’s talk about the industrial side of the Metaverse. The Metaverse is often showcased in consumer experiences in VR, but the industrial opportunity appears to be broader. Virginie, can you start for us?

Maillard: We identify different application areas for the industrial Metaverse at all stages of the product life cycle, from design engineering to testing and validation, to production, operations, and even training. Let me give you an example in each area.

First, during the design and engineering phase, the industrial Metaverse will allow for more collaboration between stakeholders, with more possibilities to involve non-technical stakeholders in the process thanks to the new user experience. From this perspective, the industrial Metaverse drives the democratization of simulation, enabling experts to communicate more effectively with non-experts. It will make it possible to explore more design and manufacturing options in a more interactive way, involving more perspectives, resulting in better products and shorter time to market. This is our first concrete vision of the Metaverse.

Second, in the testing and validation phase, combining realistic environments with multidisciplinary, multiphysics simulations offers us the possibility to virtually explore more operational scenarios. For example, we can generate synthetic data through machine learning to train and test autonomous driving systems, such as AGVs in factories, robots, or self-driving cars. Since everything is virtual, we can investigate a large number of critical situations, such as accidents, and perform large-scale tests that we cannot afford in real life. Intensive virtual training and system testing ensures more safety and a shorter time to market.
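The appeal of synthetic data is that ground-truth labels come for free: the simulator knows every object it placed. As a minimal sketch of the underlying idea, often called domain randomization, the toy generator below samples hypothetical scene parameters (lighting, weather, obstacle layout). All names and value ranges here are invented for illustration; a real pipeline, such as one built on Omniverse Replicator, would render these configurations into labeled images.

```python
import random

def sample_scene(rng: random.Random) -> dict:
    """Sample one randomized scene configuration with free ground-truth labels.

    Parameter names and ranges are illustrative, not from any real pipeline.
    """
    num_obstacles = rng.randint(0, 5)
    scene = {
        "sun_elevation_deg": rng.uniform(5.0, 85.0),  # randomized lighting
        "fog_density": rng.uniform(0.0, 0.3),         # randomized weather
        "obstacles": [
            {
                "class": rng.choice(["pallet", "forklift", "worker"]),
                "x_m": rng.uniform(-10.0, 10.0),
                "y_m": rng.uniform(0.0, 40.0),
            }
            for _ in range(num_obstacles)
        ],
    }
    # Labels come for free: the simulator knows every object it placed.
    scene["labels"] = [o["class"] for o in scene["obstacles"]]
    return scene

def generate_dataset(n: int, seed: int = 0) -> list[dict]:
    """Generate n randomized scenes, reproducibly, from a single seed."""
    rng = random.Random(seed)
    return [sample_scene(rng) for _ in range(n)]
```

Because every rare or dangerous situation is just another point in the sampled parameter space, critical scenarios such as accidents can be generated at scale rather than waiting for them to occur in the real world.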

The third example is in production. The primary use case for the industrial Metaverse here is virtual commissioning. This means you can install and test new equipment and software without disrupting the production line. For example, you can use augmented reality to visualize new equipment in a factory, or you can simulate its interactions with existing manufacturing assets. This already exists, but the industrial Metaverse is expected to bring an enhanced user experience to the field.

The last example is on the operational side. Here, the main use cases for the industrial Metaverse are already well known: visualization of workflows and data around physical assets. For example, in augmented reality, the operational data of a machine can be displayed on a tablet screen. We think the next step will be to get shop-floor information to the operator through new interfaces so they can make faster and better decisions. We call these executable digital twins: simplified, self-contained models that can run on edge devices and compile the full range of information we can have on the shop floor. As you can see, the Metaverse has a wide range of industrial applications.
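An executable digital twin, in the sense described above, is a simplified, self-contained model deployed next to the asset. The sketch below is purely illustrative, assuming a hypothetical machine monitored by a single temperature sensor; a real executable twin would be distilled from full multiphysics engineering models rather than hand-written rules.

```python
from collections import deque

class ExecutableTwin:
    """A toy 'executable digital twin': a self-contained, simplified model of
    one machine, small enough to run on an edge device beside it.

    The temperature threshold and trend logic are invented for illustration.
    """

    def __init__(self, warn_temp_c: float = 80.0, window: int = 10):
        self.warn_temp_c = warn_temp_c
        self.history = deque(maxlen=window)  # recent sensor readings

    def ingest(self, temp_c: float) -> dict:
        """Feed one sensor reading; return an operator-facing summary."""
        self.history.append(temp_c)
        avg = sum(self.history) / len(self.history)
        # Simple trend estimate over the window (degrees per reading).
        trend = (self.history[-1] - self.history[0]) / max(len(self.history) - 1, 1)
        return {
            "avg_temp_c": round(avg, 2),
            "trend_c_per_step": round(trend, 3),
            "maintenance_alert": avg > self.warn_temp_c,
        }
```

The design point is that the model keeps only a bounded window of state and does all inference locally, so it can give the operator instant feedback without a round trip to central simulation infrastructure.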

Nvidia’s Earth-2 will be the digital twin of Earth.

VentureBeat: Amy, what opportunities are you looking at in the Metaverse?

Bunszel: First and foremost, there is a lot of overlap with what Virginie talked about. Overall, the theme of collaboration and innovation, bringing all this data together with all the stakeholders, is going to be revolutionary for every industry we work in, especially the AEC field: architecture, engineering, and construction.

We look at design, construction, and operation. One of the great things about looking at the full lifecycle is that you can also start focusing on sustainability, pulling decisions you would otherwise make during the build phase forward into the design phase, where you can take advantage of all the context you get in the Metaverse or digital twin to really understand the downstream implications of choices about materials or construction methods. For infrastructure and construction, we are excited that the Metaverse will enable our clients to make high-impact decisions early in the design cycle, when they can have the greatest long-term impact.

VentureBeat: Lori, where is your focus on the development of the industrial Metaverse?

Hufford: I agree with a lot of the themes Virginie and Amy are talking about. For us, it’s the convergence of the digital and physical worlds to create better outcomes for infrastructure. That’s where our focus is. For example, during the design process, humans interact with the digital twin, conducting interdisciplinary design reviews across the supply chain on a global scale. In the world we live in today, it’s important to be able to leverage talent anywhere, ultimately resulting in the best possible design.

With the digital twin of the infrastructure, we are able to bring in models, real-world context, and maps. We can identify issues and visualize changes during the design process, communicate through markups, and use the digital tools of the Metaverse to ensure design validity. In construction, we are also considering the use of augmented reality to visualize and evaluate construction sequences, minimize risk, and lead to better outcomes. I can also echo some of the things Virginie said about operations: using augmented reality and infrastructure digital twins to improve safety, doing simulation training, things like that. There are so many opportunities to bring these worlds together to produce better outcomes.

VentureBeat: How important is it for people to be able to bring assets from one virtual world to another? How do you overcome the exchange problems that are prevalent here to move data from one industrial world to another?

Lebaredian: If you go back to 1993, when the first web browsers came out, it’s hard to imagine that the web and everything we know about the internet would exist today if we didn’t have a common way of describing web pages. HTML, a format that can be exchanged between browsers, is not just a nice thing to have; it is essential. If every web browser had a different way of describing content, we wouldn’t be where we are today.

If we see the Metaverse as an evolution and extension of the Internet, a spatial overlay of a network, then the content in this Metaverse must be in an exchangeable form that everyone can contribute, understand, and participate in. But we are still early. Like HTML 1.0 in 1993, it still doesn’t do much. You can put some text together with pictures and hyperlinks. It took us about 20 years to add all the features we love today, including videos, interactive content and programs, and more. It will be a long time before we reach our goals in the Metaverse, but there is much more we can do today.

We decided to start with Universal Scene Description (USD). We haven’t seen anything better, and frankly, we think it’s pretty good. But there is still a lot of work to be done. We need to develop more standards for how to describe everything we could possibly want in a world, and how to do it in a physically accurate way that matches the real world, not just something that looks good. We are making significant contributions to USD, and we invite all of our partners and others to do the same. Over the next few years, we’ll sort out what the concrete form is: how we describe the specific schemas for different aspects of these worlds for different industries, layered on top of a common foundation.
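For reference, USD scenes can be authored in a human-readable text form (.usda), which is part of what makes exchange between tools practical. The fragment below is a hypothetical minimal example with invented prim names, sketching how a factory assembly and one placed component might be described:

```usda
#usda 1.0
(
    defaultPrim = "Factory"
    metersPerUnit = 1.0
    upAxis = "Z"
)

def Xform "Factory" (
    kind = "assembly"
)
{
    def Xform "Robot_01" (
        kind = "component"
    )
    {
        double3 xformOp:translate = (2.5, 0, 4.0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because USD supports layering and composition, different tools in a pipeline can each contribute non-destructive edits to a shared scene like this one rather than overwriting a single monolithic file.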

Nvidia unveiled its Earth-2 digital twin goals at its 2021 GTC conference.

VentureBeat: Timoni, do you want to weigh in too? How important are portability and exchange, and how are we going to achieve them?

West: Rev provided great background, and I love the way he talks about the early days of the web. There has always been a question of what kind of interoperability we really need. The reason I’m not exactly cheering “interoperability!” is that, while I’ve long been very interested in networking standards and still am, being able to describe how an object behaves in one space versus another is very, very tricky. Here I go back to the way a game designer thinks.

Maybe it’s a little easier for industry, because they’ve gone ahead and created amazing standards and levels of interoperability that let them handle huge amounts of data. But a classic example we often see is: I want to take a sword from one game and use it in another. Or another space, I should say, another virtual world. What does it do there? What if that’s a game without swords? What if it’s a peaceful world? We may be able to work out the file-format aspects, but for interoperable, interactive behavior, the key aspect that differentiates the Metaverse from the internet we know and love today, I really don’t have an answer. My guess is that it will become a must: as people demand it, as they discover a need for it, we’ll see homebrew efforts to achieve the kinds of behavior users want.

But in terms of standards and file formats and interoperability, I think we’re making huge strides. We’ve had great conversations, done good work, and have been actively discussing these things, including multiple conversations with Nvidia and Autodesk.

Lebaredian: I totally agree. It’s hard. There are no easy answers to these questions. Having said that, I don’t think that means we shouldn’t try. There are a lot of smart people in the world, and a lot of engineers who can solve these problems. If there’s a will, I think we’ll figure it out.

If you go back to the early days of the web, we said the same thing about how to make applications behave and interact. We had many false starts. We had things like Flash and ActiveX. The matter eventually resolved itself. Strangely enough, we ended up with JavaScript as the way to describe everything.

If you had told me 20 years ago that JavaScript would not only power the internet and the web, but that we would also compile C++ to run on the web through JavaScript and WebAssembly, I would have called you crazy. That’s impossible; it will never work. But engineers can do wonders when they need to. If necessary, I think we’ll figure this out too. We just need to be willing to try. We will run many experiments; many will fail, but some will succeed.

West: I don’t want to sound negative. What I’m thinking is that we’re talking about the same thing. I know people will want this, and I’m very curious to see what shakes out. Like you said, it was impossible to predict that JavaScript would become the de facto lingua franca of the dynamic internet. It will be very interesting. I’m happy to be in this exploratory phase now.

Ericsson’s panoramic environment.

VentureBeat: I’m curious: can the rest of our panelists come up with a portability scenario? Why might you want to use someone else’s stuff in your Metaverse app? Gamers buy a lot of digital items and create virtual characters, and they want to take these things from one place to another. But what about the industrial Metaverse? Can you think of a reason you would want this portability?

Maillard: Of course. What we call openness or interoperability is part of the future of this industrial Metaverse, an enabler, for the simple reason that it is a way of connecting tools together, providing appropriate interfaces and consistent data management between tools. The second aspect is the easy transfer of virtual assets to different virtual environments. For example, a model of a car can go from design and verification to production, testing, and operation, gaining more maturity at each stage, serving as a complete connected system that feeds back and improves the design for the next iteration. To do this, we need interoperability, because we need different tools to communicate with each other.

VentureBeat: I think in a digital factory, someone might be building a car, but the parts of those cars might show up in different factories and in different digital twins. In that case, it becomes useful not to have to create the same parts twice for two different digital twin factories.

I’m curious about the current state of the Metaverse. What is tangible to you? What are some specific examples of Metaverse workflows?

West: It’s interesting. Again, it depends on how you view the Metaverse. In terms of tangibility, I think the most reliable and concrete application of contextual computing is the combination of devices. The Metaverse started out as an idea rooted in virtual reality. When I first got into VR, I was thinking, “This is it! It’s going to replace all our screens with AR glasses! No hands!” Over the past seven years, I’ve come to realize that screens and keyboards are actually great. Every computing device we own has its pros and cons. Virtual reality is great for presence, but it’s terrible for writing code. If you want to write the great American novel, or do transcription, you need a keyboard. All of these devices are just tools to help you achieve your goals.

So a good example of what I call a futuristic use case is when you have multiple people using different devices in the ways that best suit their workflows. For example, someone wearing holographic glasses is looking at a detailed report on an engine. Some people hold iPads, while others join them in virtual reality. This interoperability between devices, this real-time interactivity, and the ability to present information through the device that best suits each person’s needs is, for me, the best and truest expression of what excites me most about this new type of computing.

Hufford: I totally agree with Timoni; we have to be able to meet people’s needs. We have to be able to integrate this technology into their existing processes and workflows. This is important to us. It has to be accessible, and it has to be open, so that users can imagine their own applications and develop their own Metaverse applications, in our case for the design, construction, and operation of infrastructure.

I can give a concrete example of how a leading organization today is using an infrastructure digital twin in the Metaverse. France’s ITER project is a partnership of 35 countries building the world’s largest tokamak, a magnetic fusion device designed to demonstrate the viability of nuclear fusion as a large-scale, carbon-free energy source. It works on the same principle as the sun and stars: nuclear fusion, not fission. It is a huge design and build project with many complex engineering models and millions of parts, which must be built off-site and then hoisted into place.

ITER wanted to get humans inside the digital twin to experience it. The model was brought into Bentley Synchro, which is powered by the iTwin platform, for 4D construction modeling and sequencing. Then, leveraging Unreal and Omniverse, and using CloudXR, it all came together to create an amazing experience that immerses the team.

Realism is important here. We must be able to do this without reducing model complexity, so that it can provide engineering value. It was the first time customers were able to get inside the model. Realistic lighting enhances the experience, while a digital twin of the infrastructure, containing all the required engineering data, enables customers and users to take full advantage of it.

Another example in use today is remote bridge inspection. There is so much existing infrastructure in the world. Conducting bridge inspections to ensure safety and determine repairs can be dangerous and time-consuming. The Metaverse can help with this. Again, there has to be a high level of detail to see things like cracks and rust. Drones were used to capture a realistic model of a bridge, and inspectors then used HoloLens to inspect the resulting model. Inspectors can perform more inspections with fewer trips and work remotely, while also improving their safety.

Lockheed Martin is using the Omniverse to simulate wildfires.

VentureBeat: I’m curious what the next direction for the Metaverse is going to be. What do you think will become a reality in, say, a year or five? Which ones might take 10 years?

Maillard: As I said at the beginning, for Siemens the industrial Metaverse is an evolution of what we already offer to enable digital twins and digital threads. In the future, we see realistic immersion, real-time feedback, and openness allowing our customers to expand their experience of the industrial Metaverse.

We’ve talked about interoperability, so let me talk about immersion and real time. By immersion, I mean the realistic quality of the virtual 3D world. It will certainly make human cognition and interaction with the digital world more natural. The second benefit is training autonomous systems, as I mentioned earlier, using synthetic data. This is why immersion is important as an evolution. It won’t necessarily involve VR and AR devices, though; a laptop or tablet will suffice, as they already provide immersion. The same is true of games today.

Another important aspect is real-time collaboration. In the future, users should get instant feedback from their actions and interactions with others; they will feel a synchronized experience. The challenge is not to sacrifice simulation and prediction accuracy. Ultimately, the purpose of this real-time simulation is to quickly test various possible scenarios and troubleshoot problems in the early stages of design and production. That’s why at Siemens we already offer a suite of multiphysics, high-fidelity tools, and we continue to work on next-generation technologies that accelerate simulations while maintaining the required precision. Even though we already have a solid foundation, the cornerstone of the industrial Metaverse, we still have an exciting time ahead.

Hufford: A lot of people here are talking about communication. Over time, our view of communication will evolve and broaden. Between people, of course, as we’ve been discussing, but also people to assets, assets to people, and assets to assets, across various modes including XR and others.

These assets can alert operators to an operating condition that requires maintenance, or guide maintenance personnel through a repair process, to name a few examples. Think of a virtual lobby where people and assets can interact. This will allow us to use our talent more effectively to design, build and operate infrastructure. For example, using holograms to bring remote experts into the field for consultations expands our ability to leverage talent globally.

As I mentioned before, we need to manage our relationship with the natural world. As the Metaverse continues to evolve, we at Bentley want to be good stewards of the planet. The amount and complexity of infrastructure data will power the Metaverse, so we need to look for architectures and software designs that minimize computation and reduce the carbon footprint along the way.

VentureBeat: What are the technical barriers to achieving these goals?

Bunszel: First, I want to reiterate that all of our customers are able to experiment with existing technology today. Digital twins have been around for many years. They provide a great environment for people to jump in and do more. But when I think about obstacles, three come to mind.

First, in many cases we are taking existing design data and trying to use it for new purposes. We need to automate getting the right amount of information into the Metaverse. And given the computing and carbon footprint of everything we run in the Metaverse, we need to prepare the data so it can best meet our requirements.

Another topic we haven’t really discussed is trust. In many cases, our clients put their intellectual property into the Metaverse. We need to figure out how to leverage all the technologies we have in security, privacy, compliance, reputation, fraud detection and so on. There’s a lot of potential to do amazing things, but we also need to make sure that all the IP that goes into these environments is protected.

For me, the third part is the concept of dynamics, which I think Virginie mentioned. The Metaverse is not static. Things change in the Metaverse, and those changes may need to be reflected back in the original data, and vice versa. Having this real-time, interactive, always up-to-date environment will unlock even more power for collaboration as we figure out how to seamlessly transfer all the data back and forth.


Jensen Huang of Nvidia.

VentureBeat: What are the next big technological leaps needed to get to the next level of the Metaverse?

Lebaredian: Everything we’ve been talking about here today about the Metaverse implicitly assumes that we have the ability to accurately simulate these worlds and have them match the real world we’re in. To this day, the world simulators we have are primarily designed for entertainment. They are designed for games. We need to find ways to extend these world simulators to be physically correct, and push our existing computing power to the limit, so that we can get these superpowers.

Once you can simulate a world accurately, you gain some superpowers. If we simulate the interaction of light and matter, we can see these worlds. We can see what they look like. If you can take a place from the real world, rebuild it in the virtual world and simulate it, we can virtually teleport anywhere in the world. If you can run a physics simulation, see how the state of the world changes, and have confidence in how it evolves, you basically have a time machine. We can go into the future and see what the world will be like at some point. This allows us to explore many possible futures. We can change the initial state, explore the possibilities, and choose the future we think will be the most efficient, the most sustainable, the safest, or whatever you are optimizing for.

The challenge for us is achieving that physical accuracy, which will require a lot of computation, but it also requires a lot of cooperation between all the tools and all the simulators that exist, so we can combine all the elements of these worlds into a form that can be simulated. If you think about it, every man-made object around us exists somewhere in some 3D digital form. Someone used a CAD tool for every product you’re using and every building you’re in. Unfortunately, at some point those models strayed from what was actually built, from what exists in reality. The better we can make these tools work in a collaborative framework and keep the real-world and digital-world versions of things synchronized, the sooner we can achieve those teleportation and time-travel superpowers.

VentureBeat: Timoni, any final thoughts?

West: I would like to add something. We do have CAD files for a lot of things that exist today. But for things that were handcrafted, things that existed before computer-aided design, we don’t have that data. I love that we’re going deep into things like photogrammetry and object reconstruction, because we need to be able to bring the real world online and merge the digital and physical into a single place where we can run these kinds of surreal simulations. This will help us all make better decisions and, more importantly, help computers respond to us in a truly democratized way, allowing everyone to use computers in a way that makes sense for them.

Posted by:CoinYuppie,Reprinted with attribution to:https://coinyuppie.com/nvidia-gtc-symposium-how-to-build-an-industrial-metaverse/