Thoughts on the VeVeVerse

I just got back from DCon where the VeVe team shared an early version of their VeVeVerse, and my mind is just buzzing with ideas!

I was super impressed with the demo, and it really helped make the idea of the metaverse concrete for me in a way that other projects like Meta or Decentraland haven't: the VeVeVerse just looks so much better!

What really has me excited are the tools for creators. The VeVeVerse will allow users to bring in NFTs from other projects, as well as create and incorporate their own 3D objects, and even add interactivity via a scripting language.

I think there will be tremendous opportunity in the near future for creators in the metaverse, with metaverses such as the VeVeVerse coming online and Apple getting into the AR/VR game next year with their glasses.

I have literally zero knowledge when it comes to creating 3D objects, but I'm excited to learn. I will be dedicating the next few months to diving deep into this subject and learning about the current state of 3D object creation: how it works, the file formats, the programs used to create them, and so on.

From my limited knowledge, I think scanning and recreating existing objects could be a shortcut to creating 3D objects to populate the metaverse with (and potentially sell to other users). I wouldn't be able to use licensed/trademarked/copyrighted objects, of course, but there is a massive number of objects out there that could be digitized: natural objects (flowers, rocks, insects, animals), people I know, generic goods, food, and more.

In a way, I see it like the early days of stock photography: when digital photography and the web were just getting started, there was tremendous opportunity in capturing everyday scenes that could be sold on stock photo sites. I see similar opportunities for 3D objects, or perhaps in the creation of a 3D object library (if something like that doesn't exist already).

Helping businesses prepare for the metaverse by creating metaverse spaces and digitizing their merchandise could also be a service. Again, I see parallels to the early days of the web, when every business was rushing to get online and those with web development skills (like myself) made a lot of money helping them do it. Scanning objects for the metaverse could be the Web3 equivalent of that.

I also see, in the not-too-distant future, generative AI services such as DALL-E being used to generate 3D objects. Such models are already used by some companies for texture generation, and it shouldn't be a big leap to go from creating 2D pictures to 3D objects.

The future is looking exciting again, and I feel the same buzz and energy in the air as at the start of the Dotcom era. I’m looking forward to diving into (and sharing my learnings from) the world of 3D modeling and metaverse building.
