a16z: What Game Changes Will the Metaverse Bring?

You install the new parkour game everyone is talking about, and your avatar instantly gains a new set of skills. After a few minutes in the tutorial level, climbing walls and clearing obstacles, you are ready for a bigger challenge. You teleport into your favorite game, Grand Theft Auto: Metaverse, follow a tutorial set up by another player, and soon you’re vaulting over the hood of your car and leaping from rooftop to rooftop. Wait a minute… what’s that light coming from below? Oh, a hyper-evolved, fire-breathing dragon! You take a Pokéball out of your pocket, capture it, and continue on your way…

This kind of scenario cannot happen in games today, but in my opinion, it will happen in our future. I believe the concepts of composability (recycling, reusing, and recombining basic building blocks) and interoperability (letting components of one game work in another) are emerging in games in ways that will revolutionize how games are built and experienced.

Game developers will build faster because they won’t have to start from scratch every time. They will build more creatively, free to try new things and take new risks. And there will be more of them, because the barriers to entry will be lower. The nature of games themselves will expand to include new “meta-experiences” that, like the examples above, work within and across other games.

Of course, any discussion of “meta-experiences” also touches on another idea that gets a lot of attention: the Metaverse. It’s true that many people picture the Metaverse as a well-crafted game, but its potential is far greater than that. Ultimately, the Metaverse represents the entirety of how we humans will interact and communicate online in the future. In my opinion, game creators, building on game technology and game production processes, will be the key to unlocking that potential.

Why game creators? No other industry has as much experience building large-scale online worlds in which hundreds of thousands (sometimes tens of millions!) of participants regularly interact simultaneously. Modern games are about more than just “playing” – they are just as much about “trading,” “skilling,” “streaming,” or “buying.” The Metaverse will add even more verbs to that list – think “working” or “loving.” Just as microservices and cloud computing kicked off a wave of innovation in the tech industry, I believe the next generation of game technology will usher in a new generation of gaming innovation and creativity.

This has happened in limited ways. Many games now support User Generated Content (UGC), which allows players to build their own extensions to existing games. Some games, like Roblox and Fortnite, are so extensible that they’ve called themselves Metaverses. But the current generation of gaming tech, still largely built for single-player games, can only get us so far.

This revolution will require innovation across the technology stack, from production pipelines and creative tools, to game engines and multiplayer networks, to analytics and real-time services.

This post outlines my vision for the transformative phase of gaming, and then dissects the new areas of innovation needed to usher in this new era.

Upcoming game changes

Games have long been largely monolithic, fixed experiences. Developers would build them, release them, and start on the sequel. Players would buy them, play them, and then move on to other games once they had cleared all the content – usually after 10 to 20 hours of gameplay.

We are now in the era of games as a service, where developers continually update their games after release. Many of these games also feature UGC and experiences adjacent to gaming, such as virtual concerts and educational content. Roblox and Minecraft even have marketplaces where player-creators can get paid for their work.

Crucially, however, these games remain (purposefully) isolated from each other. While their respective worlds may be large, they are closed ecosystems and nothing can be transferred between them – not resources, skills, content, or friends.

So how do we break free from the legacy of the walled garden to unlock the potential of the Metaverse? As composability and interoperability become important concepts for Metaverse-minded game developers, we will need to rethink how we approach the following issues:

  • Identity. In the Metaverse, players will need a single identity that they can carry from game to game and from platform to platform. Today, each platform insists on its own player profile, so players have to tediously rebuild their profile and reputation from scratch with every new game they play. (A minimal sketch of such a portable profile follows this list.)
  • Friends. Likewise, games today maintain separate friend lists – or, at best, use Facebook as the de facto source of truth. Ideally, your network of friends would follow you from game to game, making it easier to find friends to play with and to share competitive leaderboard information.
  • Property. Currently, items you earn in one game cannot be transferred to or used in another game – and for good reason. A game that lets players bring modern assault rifles into a medieval setting might be briefly satisfying, but it would quickly ruin the experience. With the right constraints, however, exchanging (some) items across games could open up new creative and emergent ways of playing.
  • Gameplay. Games today are tightly coupled to their gameplay. The whole fun of the “platformer” genre, for example – games like Super Mario Odyssey – is achieving mastery of the virtual world. But by opening games up and allowing their elements to be “remixed,” players could more easily create new experiences and explore their own “what if” narratives.
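
To make the ideas above a bit more concrete, here is a minimal sketch, in TypeScript, of what a portable player profile might look like if identity, friends, and inventory were shared across games rather than rebuilt inside each one. Every interface and field name here is hypothetical; no such standard exists today.

```typescript
// Hypothetical sketch: a portable player profile that travels between games.
// None of these interfaces come from an existing platform; they only
// illustrate composable identity, friends, and property.

interface PortableIdentity {
  id: string;              // globally unique, platform-independent player ID
  displayName: string;
  avatarUri: string;       // reference to a shared avatar asset
  reputation: number;      // carried between games instead of rebuilt per game
}

interface FriendEdge {
  friendId: string;
  since: Date;
}

interface OwnedItem {
  itemId: string;          // points into a shared item catalog
  category: "weapon" | "vehicle" | "clothing" | "furniture";
  grantedBy: string;       // which game or world issued it
}

interface PortableProfile {
  identity: PortableIdentity;
  friends: FriendEdge[];
  inventory: OwnedItem[];
}

// A game that opts into the shared profile accepts it on login,
// rather than creating a fresh, isolated account.
function onPlayerJoin(profile: PortableProfile): void {
  console.log(`${profile.identity.displayName} joined with ` +
              `${profile.inventory.length} portable items and ` +
              `${profile.friends.length} friends.`);
}
```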

I see these changes happening at three distinct layers of game development: the technical layer (game engine), the creative layer (content production), and the experiential layer (live operations). At each layer, there are clear opportunities for innovation, which I describe below.

Side note: Making a game is a complex process involving many steps. Even more than other art forms, it’s highly non-linear, requiring frequent loops and iteration, because no matter how fun something sounds on paper, you don’t know if it’s actually fun until you play it. In that sense, it’s more like choreographing a new dance, where the real work happens through repetition in the studio with the dancers.

Technical Layer: Reimagining Game Engines

At the heart of most modern game development is the game engine, which powers the player’s experience and makes it easier for teams to build new games. Popular engines like Unity or Unreal provide common functionality that can be reused across games, freeing game creators to focus on what is unique to their game. Not only does this save time and money, it also levels the playing field, allowing smaller teams to compete with larger ones.

That said, the fundamental role of the game engine relative to the rest of the game hasn’t really changed over the past 20 years. While engines have expanded the services they offer – extending from mere graphics rendering and audio playback to multiplayer and social services, as well as post-launch analytics and in-game advertising – these engines are still primarily delivered as code libraries, packaged entirely inside each game.

When considering the Metaverse, however, the engine takes on a more important role. To break down the walls that separate one game or experience from another, games may need to be packaged and hosted inside an engine, rather than the other way around. In this expanded view, engines become platforms, and the communication between those engines will largely define what I think of as the shared Metaverse.

Take Roblox, for example. The Roblox platform provides the same key services as Unity or Unreal, including graphics rendering, audio playback, physics, and multiplayer. But it also offers other, unique services: player avatars and identities shared across its game catalog; expanded social services, including shared friend lists; robust safety features to help keep communities safe; and a library of tools and assets to help players create new games.

However, Roblox still falls short as a Metaverse, as it is a walled garden. While there is some limited sharing between games on the Roblox platform, there is no sharing or interoperability between Roblox and other game engines or gaming platforms.

To fully unlock the Metaverse, game engine developers must innovate in 1) interoperability and composability, 2) improved multiplayer services, and 3) automated testing services.

Interoperability and Composability

For example, in order to unlock the Metaverse and allow for the hunting of Pokémon in the Grand Theft Auto universe, these virtual worlds will require unprecedented levels of cooperation and interoperability. While a single company could conceivably control the common platform that powers a global Metaverse, that outcome is neither desirable nor likely. Instead, decentralized game engine platforms are more likely to emerge.

Of course, I can’t talk about decentralized technologies without mentioning Web3. Web3 refers to a set of blockchain-based technologies that use smart contracts to decentralize ownership by transferring control of key networks and services to users/developers. In particular, concepts such as composability and interoperability in Web3 help to solve some of the core issues facing the move towards the Metaverse, especially identity and property, and a lot of research and development is going into the core Web3 infrastructure.

Still, while I believe Web3 will be a key component in reimagining game engines, it’s not a panacea.

Perhaps the most obvious application of Web3 technology in the Metaverse is allowing users to buy and own items in the Metaverse, such as virtual real estate plots or clothing for digital avatars. Since transactions written to the blockchain are a matter of public record, purchasing an item as a non-fungible token (NFT) could theoretically let a player own that item and use it across multiple Metaverse platforms, as well as in other applications.

However, I don’t think this will happen in practice until the following issues are addressed:

  • Single user identity, so players can move between virtual worlds or games with one consistent identity. This is necessary for everything from matchmaking to content ownership to blocking trolls. One service working on this problem is Hellō, a multi-stakeholder collaboration pursuing a user-centric vision of identity built primarily on centralized Web2 identities. Others use a Web3 decentralized identity model, such as Spruce, which lets users control their digital identities through wallet keys. Meanwhile, Sismo is a modular protocol that uses zero-knowledge proofs for decentralized identity management and more.
  • Common content formats, so content can be shared between engines. Today, each engine has its own proprietary formats, which are needed for performance. To exchange content between engines, however, standard open formats are required. One such standard, originally created for film, is Pixar’s Universal Scene Description (USD), which NVIDIA’s Omniverse builds on. But standards are needed for every content type.
  • Cloud-based content storage that enables content needed for one game to be located and accessed by others. Today, the content required for a game is often either packaged with the game as part of the distribution package, or available for download over the web (and accelerated via a content delivery network or CDN). For content to be shared between worlds, there needs to be a standard way to find and retrieve this content.
  • Shared payment mechanisms, so Metaverse owners have a financial incentive to allow assets to be transferred between Metaverses. Digital asset sales are one of the main ways platform owners are compensated, especially under the “free to play” business model. To incentivize platform owners to loosen control, asset owners might pay a “corkage fee” for the privilege of using their assets on a platform. Or, if the asset in question is famous, a Metaverse might even pay asset owners to bring their assets into its world.
  • Standardized functionality, so a Metaverse can know how a given item is meant to be used. If I want to bring my fancy sword into your game and use it to kill monsters, your game needs to know it’s a sword – not a fancy stick. One approach is to create a taxonomy of standard object interfaces that each Metaverse can choose to support or not. Categories might include weapons, vehicles, clothing, or furniture (see the sketch after this list).
  • Negotiated look and feel, so content assets can change their appearance to match the world they are entering. For example, if I have a high-tech sports car and I want to drive it into your strictly steampunk-themed world, my car may have to become a steam-powered SUV to be allowed in. Either my asset knows how to adapt itself, or the Metaverse world I’m entering takes responsibility for providing an alternate look and feel.
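
As one way to picture the “standardized functionality” and “negotiated look and feel” ideas above, here is a hedged TypeScript sketch of what a shared item interface might look like. The taxonomy, types, and method names are invented for illustration; they are not drawn from any existing standard.

```typescript
// Hypothetical sketch of a cross-world item standard. The taxonomy and
// method names are invented for illustration only.

type ItemCategory = "weapon" | "vehicle" | "clothing" | "furniture";

interface WorldStyle {
  theme: string;                       // e.g. "steampunk", "medieval", "high-tech"
  allowedCategories: ItemCategory[];
}

interface PortableItem {
  id: string;                          // resolvable against cloud-hosted content storage
  category: ItemCategory;
  // Standardized functionality: the hosting world queries capabilities
  // instead of guessing what the asset is.
  capabilities: Record<string, number>;      // e.g. { damage: 12, speed: 0 }
  // Negotiated look and feel: the item offers an appearance that fits the
  // destination world, or the world substitutes a themed stand-in.
  appearanceFor(style: WorldStyle): string;  // returns a model/texture URI
}

// A world decides whether to admit an item and how it should look.
function admitItem(item: PortableItem, world: WorldStyle): string | null {
  if (!world.allowedCategories.includes(item.category)) {
    return null;                       // e.g. no assault rifles in a medieval world
  }
  return item.appearanceFor(world);    // e.g. sports car re-skinned as a steam SUV
}
```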

Improved multiplayer system

A big area of focus is multiplayer and social features. More and more games today are multiplayer, because games with social features vastly outperform single-player games. And because the Metaverse will, by definition, be entirely social, it will be subject to all the issues specific to online experiences. Social games must worry about harassment and toxicity; they are more vulnerable to DDoS attacks that drive players away; and they often must run servers in data centers around the world to minimize lag and deliver the best player experience.

Given the importance of multiplayer capabilities to modern games, it is surprising that there is still a lack of complete off-the-shelf solutions. Engines such as Unreal or Roblox and services such as Photon or PlayFab provide the fundamentals, but developers must fill gaps such as advanced matchmaking themselves.

Innovations in multiplayer systems may include:

  • Serverless multiplayer, where developers can implement authoritative game logic that is automatically hosted and scaled in the cloud, without having to worry about running actual game servers.
  • Advanced matchmaking, to help players quickly find others of a similar level to play with, including AI tools for estimating player skill and ranking. In the Metaverse this becomes even more important, because “matchmaking” broadens far beyond combat (e.g., “find me someone to practice Spanish with,” not just “find me a group to raid a dungeon with”). A toy sketch follows this list.
  • Anti-toxicity and anti-harassment tools, to help identify and remove toxic players. Any company hosting a Metaverse needs to be proactive here, because online citizens won’t spend their time in unsafe spaces – especially if they can jump to another world with their identity and possessions intact.
  • Guilds or clans, which help players team up, compete against other groups, or simply share a more social experience. Virtual worlds are full of opportunities for players to pursue common goals together, which creates opportunities for services that create and manage guilds and sync with external community tools like Discord.
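
To illustrate how matchmaking might generalize beyond combat, here is a toy TypeScript matchmaker that pairs players by activity tag and closeness of rating. It is a deliberately simplified assumption, not a description of how any existing service works.

```typescript
// Toy matchmaker: pairs players by activity tag and closeness of rating.
// Entirely illustrative; real systems also weigh latency, party size, trust, etc.

interface Ticket {
  playerId: string;
  activity: string;   // e.g. "dungeon-raid" or "practice-spanish"
  rating: number;     // skill or proficiency estimate
}

function matchPairs(queue: Ticket[], maxGap = 100): [Ticket, Ticket][] {
  // Group tickets by requested activity.
  const byActivity = new Map<string, Ticket[]>();
  for (const t of queue) {
    const bucket = byActivity.get(t.activity) ?? [];
    bucket.push(t);
    byActivity.set(t.activity, bucket);
  }

  const pairs: [Ticket, Ticket][] = [];
  for (const bucket of byActivity.values()) {
    bucket.sort((a, b) => a.rating - b.rating);   // neighbors are closest in skill
    for (let i = 0; i + 1 < bucket.length; i++) {
      if (bucket[i + 1].rating - bucket[i].rating <= maxGap) {
        pairs.push([bucket[i], bucket[i + 1]]);
        i++;                                      // consume both tickets
      }
    }
  }
  return pairs;
}
```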

Automated Testing Services

Testing is an expensive bottleneck when launching any online experience, as a small group of playtesters must iterate through the experience to make sure everything works as expected, with no glitches or bugs.

Games skip this step at their peril. Consider the launch of the highly anticipated Cyberpunk 2077, which was heavily criticized by players for its numerous bugs. But because the Metaverse is essentially an “open world” with no set course, exhaustive testing could be prohibitively expensive. One way to ease this bottleneck is to develop automated testing tools, such as AI agents that play the game like a player, looking for glitches, crashes, or bugs. A side benefit of this technology is believable AI players that can either be swapped in for real players who quit a multiplayer match unexpectedly, or provide early multiplayer “match liquidity” to reduce the time players wait for a match to start.

Innovations in automated testing services may include:

  • Automatic training of new agents by observing real players interacting with the world. One benefit: agents will keep getting smarter and more believable the longer the virtual world runs.
  • Automatic identification of glitches or bugs, along with deep links that jump directly to the point of failure so human testers can reproduce the problem and work on a fix (see the sketch after this list).
  • Substitution of AI agents for real players, so that if one player suddenly quits a multiplayer experience, the other players’ experience doesn’t end. This feature also raises interesting questions, such as whether players could “tag in” an AI stand-in at any time, or even “train” their own surrogate to compete on their behalf. Will “AI-assisted” become a new category of competition?
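
The automated-testing idea above might look something like the following sketch: a deterministic agent that feeds actions into a hypothetical GameSession harness and emits a reproducible bug report with a deep link. The API names are invented stand-ins, not any engine’s real interface.

```typescript
// Hypothetical automated playtest loop. `GameSession` and its methods are
// invented stand-ins for whatever harness a real engine would expose.

interface GameSession {
  step(action: string): { crashed: boolean; error?: string; position: string };
}

interface BugReport {
  seed: number;
  stepIndex: number;
  error: string;
  deepLink: string;   // lets a human tester jump straight to the failure
}

function runAgent(session: GameSession, seed: number, maxSteps = 10000): BugReport[] {
  const actions = ["move", "jump", "interact", "attack", "idle"];
  const reports: BugReport[] = [];
  let state = seed || 1;                       // keep the generator out of its fixed point

  for (let i = 0; i < maxSteps; i++) {
    state = (state * 48271) % 2147483647;      // MINSTD-style deterministic randomness
    const action = actions[state % actions.length];
    const result = session.step(action);

    if (result.crashed) {
      reports.push({
        seed,
        stepIndex: i,
        error: result.error ?? "unknown",
        deepLink: `game://replay?seed=${seed}&step=${i}&pos=${result.position}`,
      });
      break;                                   // a real harness would restart a fresh session
    }
  }
  return reports;
}
```

Because the action sequence is driven by a fixed seed, any crash it finds can be replayed exactly, which is what makes the deep link useful to a human tester.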

Creative Layer: Reimagining Content Production

As 3D rendering technology becomes more powerful, the amount of digital content required to create games continues to increase. Consider the latest Forza Horizon 5 racing game – the largest Forza download ever, requiring over 100 GB of disk space, versus 60 GB for Horizon 4. And that’s just the tip of the iceberg. The original “source art” files – the files created by artists and used to build the final game – can be many times larger. Assets keep growing because these virtual worlds keep growing in size and quality, with ever higher levels of detail and fidelity.

Now consider the Metaverse. As more and more experiences move from the physical world to the digital world, the demand for high-quality digital content will continue to increase.

This is already happening in the film and television world. The recent Disney+ show, The Mandalorian, broke new ground by filming on a “virtual set” running in the Unreal game engine. This is revolutionary as it reduces production time and cost while increasing the range and quality of the finished product. In the future, I hope more and more works will be shot in this way.

Also, unlike physical film sets, which are often torn down after shooting because keeping them intact is expensive, digital sets can be easily stored for future reuse. In fact, it may make sense to invest more, not less, building a fully realized world that can later be reused to produce a fully interactive experience. Hopefully, we’ll eventually see these worlds opened to other creators to make new content within these fictional realities, further advancing the Metaverse.

Now consider how this content gets created: increasingly, by artists distributed around the world. One of the lasting effects of COVID-19 is a permanent shift toward remote development, with teams scattered across the globe, often working from home. The benefits of remote development are obvious (the ability to hire talent anywhere), but the costs are real, including the challenges of creative collaboration, of keeping the massive number of assets needed to build modern games in sync, and of maintaining intellectual property security.

Given these challenges, I see three areas of innovation that will emerge in digital content production: 1) AI-assisted content creation tools, 2) cloud-based asset management, building and publishing systems, and 3) collaborative content generation.

AI-assisted content creation

Today, nearly all digital content is still built by hand, adding to the time and cost required to release modern games. Some games have experimented with “procedural content generation”, where algorithms can help generate new dungeons or worlds, but building these algorithms themselves can be very difficult.

However, a new wave of AI-assisted tools is coming that will help artists and non-artists alike create content faster and at higher quality, reduce the cost of content production, and democratize game content creation.

This is especially important for the Metaverse, as nearly everyone will become a creator – but not everyone will be able to create world-class art. By art, I mean digital assets of every kind, including virtual worlds, interactive characters, music and sound effects, and more.

Innovations in AI-assisted content creation will include conversion tools that can convert photos, videos, and other real-world artifacts into digital assets, such as 3D models, textures, and animations. Examples include Kinetix, which creates animations from video; Luma Labs, which creates 3D models from photos; and COLMAP, which creates navigable 3D spaces from still photos.

There will also be innovation in creative assistants that artists can direct to iteratively generate new assets. For example, Hypothetic can generate 3D models from hand-drawn sketches. Both Inworld.ai and Charisma.ai use AI to create believable characters that players can interact with. DALL-E can generate images from natural language input.

An important aspect of using AI-assisted content creation in games is repeatability. Because creators must frequently go back and make changes, it is not enough to store the output of an AI tool. The game creator must also store the entire set of instructions that produced the asset, so an artist can go back and make changes later, or copy the asset and modify it for a new purpose.
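
One way to picture this repeatability requirement is to store a “recipe” alongside each generated asset rather than only the baked output. The sketch below is a hypothetical data shape, not the format of any real tool.

```typescript
// Hypothetical "recipe" record for an AI-generated asset: enough information
// to regenerate, tweak, or fork the asset later, instead of keeping only the
// final output file.

interface AssetRecipe {
  toolId: string;            // which generator produced it
  toolVersion: string;       // generators change; pin the version
  prompt: string;            // natural-language or sketch-derived input
  parameters: Record<string, number | string>;   // seeds, style weights, etc.
  sourceInputs: string[];    // URIs of reference photos, sketches, audio
}

interface GeneratedAsset {
  assetUri: string;          // where the final model/texture/animation lives
  recipe: AssetRecipe;       // stored so an artist can revisit and revise it
}

// Forking an asset for a new purpose: copy the recipe, change a parameter,
// and regenerate, rather than hand-editing the baked output.
function forkRecipe(asset: GeneratedAsset, overrides: Partial<AssetRecipe>): AssetRecipe {
  return { ...asset.recipe, ...overrides };
}
```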

Cloud-based asset management, build and release system

One of the biggest challenges a game studio has to face when building a modern video game is managing all the content needed to create an engaging experience. Today, this is a relatively unsolved problem with no standardized solution; each studio has to cobble together its own solution.

To understand why this is such a difficult problem, consider the vast amount of data involved. Large games can require millions of files of different types, including textures, models, characters, animations, levels, visual effects, sound effects, recorded dialogue, and music.

Each of these files is changed repeatedly during production, and a copy of each variant must be kept in case the creator needs to go back to an earlier version. Today, artists often meet this need by simply renaming files (for example, “forest-ogre-2.2.1”), which results in a proliferation of files. Because these files are typically large and hard to compress, and each revision must be stored in full, this consumes a great deal of storage. Source code is different: each revision can be stored efficiently as just its changes. With many content files, such as artwork, changing even a small part of an image can change almost the entire file, so storing deltas doesn’t help.

Furthermore, these files do not exist in isolation. They are part of an overall process commonly referred to as a content pipeline, which describes how all these individual content files fit together to create a playable game. In this process, the “source art” files created by the artist are converted and assembled into “game assets” through a series of intermediate files for use by the game engine.

Today’s pipelines are not very intelligent and are often unaware of the dependencies between assets – for example, that a specific texture belongs to a 3D basket held by a specific farmer character who lives in a particular level. Therefore, whenever any asset changes, the entire pipeline must be rebuilt to ensure all changes are picked up and merged. This is a time-consuming process that can take hours or more, slowing the pace of creative iteration.

The needs of the Metaverse will exacerbate these existing problems and create new ones. For example, the Metaverse is going to be huge — bigger than the biggest games today — so all existing content storage issues apply to the Metaverse. Additionally, the “always-on” nature of the Metaverse means that new content needs to be streamed directly into the game engine; it is impossible to “stop” the Metaverse to create new builds. The Metaverse needs to be able to update itself instantly. To achieve composability goals, remote and distributed creators will need a way to access source assets, create their own derivatives, and then share them with others.

Meeting these needs of the virtual world will create two major opportunities for innovation. First, artists need a GitHub-like, easy-to-use asset management system that will give them the same level of version control and collaboration tools that developers currently enjoy. Such a system needs to be integrated with all popular authoring tools such as Photoshop, Blender and Sound Forge. Mudstack is an example of a company focusing on this space today.

Second, there is a lot of work to be done with content pipeline automation, which can modernize and standardize the art pipeline. This includes exporting source assets to intermediate formats and building those intermediate formats into game-ready assets. Smart pipelines will understand the dependency graph and be able to do incremental builds so that when assets change, only those files with downstream dependencies will be rebuilt – greatly reducing the time it takes to view new content in-game.
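
The “smart pipeline” idea above boils down to tracking a dependency graph between assets and rebuilding only what sits downstream of a change. Here is a minimal sketch, with invented asset names, of how such an incremental rebuild might be computed.

```typescript
// Minimal dependency-aware rebuild: given which source assets changed,
// rebuild only the assets that (directly or transitively) depend on them.

type AssetId = string;

// Edges point from an asset to the assets that depend on it.
const dependents: Record<AssetId, AssetId[]> = {
  "ogre-texture.psd": ["ogre-material", "forest-level"],
  "ogre-material":    ["ogre-model"],
  "ogre-model":       ["forest-level"],
  "forest-level":     [],
};

function assetsToRebuild(changed: AssetId[]): Set<AssetId> {
  const dirty = new Set<AssetId>(changed);
  const queue = [...changed];
  while (queue.length > 0) {
    const current = queue.shift()!;
    for (const next of dependents[current] ?? []) {
      if (!dirty.has(next)) {
        dirty.add(next);
        queue.push(next);
      }
    }
  }
  return dirty;
}

// Changing one texture rebuilds its material, its model, and the level
// that contains them, but nothing else in the game.
console.log(assetsToRebuild(["ogre-texture.psd"]));
```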

Improved collaboration tools

Despite the distributed, collaborative nature of modern game studios, many of the professional tools used in game production are still centralized, single-creator tools. For example, by default, both the Unity and Unreal level editors only allow one designer to edit one level at a time. This slows down the creative process because teams cannot work in parallel on one world.

On the other hand, both Minecraft and Roblox support collaborative editing; that’s one of the reasons these consumer platforms are so popular, despite their lack of other professional features. Once you’ve seen a group of kids build a city together in Minecraft, you can’t imagine wanting to do it any other way. I believe collaboration will become an essential feature of the Metaverse, allowing creators to come together online to build and test their work.

Overall, collaboration in game development will become real-time in almost all aspects of the game creation process. Some of the ways in which collaboration may evolve in order to unlock the Metaverse include:

  • Real-time collaborative world building, so multiple level designers or “world builders” can simultaneously edit the same environment and see each other’s changes in real time, with full version control and change tracking. Ideally, level designers should be able to switch seamlessly between playing and editing for the fastest possible iteration. Some studios are experimenting with this using proprietary tools such as Ubisoft’s AnvilNext game engine, while Unreal has offered real-time collaboration as a beta feature originally built to support TV and film production.
  • Real-time content review and approval, so teams can experience and discuss their work together. Group discussion has always been a key part of the creative process. Film has long had “dailies,” where production teams collectively review each day’s work, and most game studios have a large room with a big screen for group reviews. But the tools for remote development are much weaker: screen sharing in tools like Zoom lacks the fidelity to accurately view a digital world. One solution for games could be a “spectator mode” in which the entire team can log in and see through the eyes of a single player. Another is to improve the quality of screen sharing, trading less compression for higher fidelity, including faster frame rates, stereo sound, more accurate color matching, and the ability to pause and annotate. Such tools should also integrate with task tracking and assignment. Companies trying to solve this problem for film include frame.io and Sohonet.
  • Real-time world adjustment, so designers can tweak any of the thousands of parameters that define a modern game or virtual world and experience the results immediately. This tuning process is critical to creating a fun, balanced experience, but these numbers are often hidden in spreadsheets or configuration files that are hard to edit and cannot be adjusted live. Some game studios have tried using Google Sheets for this, allowing changes to configuration values to be pushed immediately to game servers to update how the game behaves, but the Metaverse will need something more powerful (a rough sketch follows this list). Another benefit of this capability is that the same parameters can be modified for live events or new content updates, making it easier for non-programmers to create new content. For example, a designer could create a special event dungeon and fill it with monsters that are harder to defeat than usual but drop unusually generous rewards.
  • A virtual game studio located entirely in the cloud, where members of a game team (artists, programmers, designers, and so on) can log in from anywhere, on any type of device (including low-end PCs or tablets), and instantly access high-end game development tools and a complete library of game assets. Remote desktop tools like Parsec may play a role here, but this is about more than remote desktops; it also touches how creative tools are licensed and how assets are managed.
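
As a rough picture of live tuning, the sketch below models a parameter store that running game servers subscribe to, so a designer’s change takes effect without a new build or restart. The API is invented for illustration only.

```typescript
// Hypothetical live-tuning store: designers update named parameters and
// running game servers react immediately, without a rebuild or restart.

type Listener = (key: string, value: number) => void;

class LiveConfig {
  private values = new Map<string, number>();
  private listeners: Listener[] = [];

  get(key: string, fallback: number): number {
    return this.values.get(key) ?? fallback;
  }

  // Called from a designer-facing dashboard or spreadsheet sync job.
  set(key: string, value: number): void {
    this.values.set(key, value);
    for (const l of this.listeners) l(key, value);
  }

  onChange(listener: Listener): void {
    this.listeners.push(listener);
  }
}

// Example: a special-event dungeon with tougher monsters and better loot.
const config = new LiveConfig();
config.onChange((key, value) => console.log(`tuned ${key} -> ${value}`));
config.set("eventDungeon.monsterHealthMultiplier", 1.5);
config.set("eventDungeon.lootDropRate", 0.25);
```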

Experience Layer: Reimagining Operational Live Services

The final layer of the Metaverse rebuild involves creating the tools and services needed to actually operate the Metaverse itself – arguably the hardest part. Building an immersive world is one thing; running it 24/7 for millions of players worldwide is another.

Developers must deal with:

  • The social challenges of running any large municipality, crowded with citizens who don’t always get along and whose disputes need adjudicating.
  • The economic challenges of effectively running a central bank, with the power to mint new money and the need to monitor the money supply and its flows to control inflation and deflation.
  • The monetization challenges of running a modern e-commerce site with thousands or even millions of items for sale, and the associated demand for offers, promotions, and global marketing tools.
  • The analytical challenge of understanding what’s going on in their vast world in real time, so they can be quickly alerted before problems escalate out of control.
  • The communication challenges of reaching their digital citizens in any language, individually or collectively, since the Metaverse is (in principle) global.
  • The content challenges of shipping frequent updates to keep their Metaverse growing and evolving.

To meet all these challenges, companies need well-equipped teams with access to extensive back-end infrastructure, plus the dashboards and tools necessary to operate these services at scale. Two areas especially ripe for innovation are LiveOps services and in-game commerce.

LiveOps as a field is still in its infancy. Commercial tools such as PlayFab, Dive, Beamable, and LootLocker implement only part of a complete LiveOps solution, so most games still feel compelled to build their own LiveOps stack. An ideal solution would include: real-time event calendars, with the ability to schedule events, preview them, and create event templates or replicate previous events; personalization, including player segmentation and targeted promotions and offers; messaging, including push notifications, email, and in-game inboxes, along with translation tools to communicate with users in their own language; notification authoring tools, so non-programmers can create in-game pop-ups and notifications; and testing tools to simulate upcoming events or new content updates, including a mechanism to roll back changes if something goes wrong.
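
A hedged sketch of just the event-calendar slice of such a LiveOps stack might look like the following; the types, field names, and rollback approach are assumptions made for illustration.

```typescript
// Hypothetical LiveOps event calendar: schedule an event from a template,
// target a player segment, and roll it back if something goes wrong.

interface EventTemplate {
  name: string;
  configOverrides: Record<string, number>;   // ties into live-tuned parameters
}

interface ScheduledEvent {
  template: EventTemplate;
  segment: string;          // e.g. "returning-players" or "all"
  startsAt: Date;
  endsAt: Date;
  active: boolean;
}

const calendar: ScheduledEvent[] = [];

function scheduleEvent(template: EventTemplate, segment: string,
                       startsAt: Date, endsAt: Date): ScheduledEvent {
  const scheduled = { template, segment, startsAt, endsAt, active: false };
  calendar.push(scheduled);
  return scheduled;
}

function activateDueEvents(now: Date): void {
  for (const e of calendar) {
    e.active = e.startsAt <= now && now < e.endsAt;   // rollback = shrink endsAt
  }
}

// Clone a previous promotion as a template for next weekend's event.
const doubleXp = scheduleEvent(
  { name: "Double XP Weekend", configOverrides: { xpMultiplier: 2 } },
  "all",
  new Date("2022-07-02T00:00:00Z"),
  new Date("2022-07-04T00:00:00Z"),
);
console.log(`scheduled: ${doubleXp.template.name}`);
```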

More developed, but still in need of innovation, is in-game commerce. Considering that nearly 80% of digital game revenue comes from selling in-game items and microtransactions in otherwise free-to-play games, it’s surprising that there isn’t a better off-the-shelf solution for managing in-game economies – a Shopify for the Metaverse, so to speak.

The solutions that exist today only solve part of the problem. The ideal solution would need to include: a catalog of items, with arbitrary metadata per item; an app-store interface for real-money sales; offers and promotions, including limited-time and targeted offers; reporting and analytics, with targeted reports and graphs; user-generated content, so games can sell content created by their own players and return a percentage of the revenue to those players; advanced economic systems, such as item crafting (combining two items into a third), auction houses (so players can sell items to each other), trading, and gifting; and full integration with the Web3 world and blockchains.
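
To make the catalog-plus-revenue-share idea slightly more concrete, here is a small illustrative sketch. The schema and the 30% creator share are placeholders, not any platform’s actual terms.

```typescript
// Illustrative item catalog entry and creator revenue share. All names and
// numbers are placeholders, not any platform's actual schema or rates.

interface CatalogItem {
  id: string;
  priceCents: number;
  metadata: Record<string, string>;   // arbitrary per-item metadata
  creatorId?: string;                 // present when the item is player-made UGC
  creatorShare?: number;              // fraction of revenue paid to the creator
}

function settleSale(item: CatalogItem): { platformCents: number; creatorCents: number } {
  const share = item.creatorId ? (item.creatorShare ?? 0.3) : 0;  // 0.3 is a placeholder
  const creatorCents = Math.round(item.priceCents * share);
  return { platformCents: item.priceCents - creatorCents, creatorCents };
}

// A player-created hat sells for $4.99; its creator gets the configured cut.
console.log(settleSale({
  id: "ugc-hat-42",
  priceCents: 499,
  metadata: { rarity: "rare", slot: "head" },
  creatorId: "player-123",
  creatorShare: 0.3,
}));
```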

Next Steps: Game Development Team Transformation

In this article, I share my vision of how games will transform as new technologies open up composability and interoperability between games. I hope others in the gaming community will share my excitement about the potential that is coming, and hope they will be inspired to join me in building the new companies needed to start this revolution.

The coming wave of change will not only create opportunities for new software tools and protocols. It will change the very nature of game studios, as the industry moves away from monolithic, do-everything studios and specializes in new ways.

In fact, I think in the future, we’ll see greater specialization in the game production process. I also think we’ll see:

  • World builders, who specialize in creating playable worlds that are both believable and fantastical, filled with creatures and characters that fit those worlds. Think of the incredibly detailed version of the Wild West built for Red Dead Redemption. Instead of pouring so much into a single game set in that world, why not repurpose the world to create many games? Why not keep investing in it, letting it grow and evolve over time to meet the demands of those games?
  • Narrative designers, who craft engaging, interactive narratives within these worlds, full of storylines, puzzles, and quests for players to discover and enjoy.
  • Experience creators who focus on gameplay, reward mechanics and control schemes to create playable experiences across worlds. Creators who can bridge the real and virtual worlds will be especially valuable in the years ahead as more companies try to bring part of their existing businesses into the virtual world.
  • Platform builders that provide the underlying technology that the aforementioned experts use to get the job done.

The team at a16z Games and I are excited to be investing in this future, and I can’t wait to see the incredible levels of creativity and innovation that these transformative changes in our industry will unleash. Gaming is already the largest single sector in the entertainment industry, but it promises to get even bigger as more and more sectors of the economy go online and into virtual worlds.

We haven’t even touched on some of the other exciting developments on the horizon, like Apple’s new augmented reality headset, Meta’s recently announced VR prototype, or the arrival of 3D technology in web browsers via WebGPU.

The best time to truly be a creator is now.

