One week after the launch of the MATR1X Apollo Project, a handful of lucky users have obtained its top legendary weapon NFT, "Judgment", by opening treasure chests. "Judgment" has traded for as much as 120,000 US dollars, sparking heated discussion in the NFT market and raising expectations for Web3 in-game item NFTs. This article walks through how a MATR1X legendary weapon NFT is born.

The Origin of Everything - The Birth of a Legendary Weapon

When the server's random generator ran, a lucky player drew a legendary weapon at a very low probability. Inside the generator, this legendary weapon has only a few core parameters: the NFT number, a configuration ID, the degree of wear, souvenirs, and prints. Once these raw attributes are generated, the weapon's information is fully determined. However, this raw information alone is not enough to present a true legendary weapon; to give players a weapon with unique significance, there is still a lot of work to do.
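
As a rough illustration, the raw draw result can be thought of as a small record plus a weighted random roll. The field names, config IDs, and drop weights below are hypothetical, not MATR1X's actual values:

```python
import random
from dataclasses import dataclass, field

# Hypothetical shape of a raw draw result; field names are illustrative.
@dataclass
class RawWeapon:
    nft_id: int            # globally unique NFT number
    config_id: int         # key into the weapon configuration tables
    wear: float            # degree of wear, e.g. 0.0 (pristine) .. 1.0 (worn)
    souvenirs: list = field(default_factory=list)
    prints: list = field(default_factory=list)

def draw_weapon(next_nft_id: int) -> RawWeapon:
    # Assumed rarity table: config IDs weighted so legendaries are rare.
    config_ids = [1001, 1002, 2001]    # common, rare, legendary (illustrative)
    weights    = [0.90, 0.099, 0.001]  # legendary at 0.1%
    config_id = random.choices(config_ids, weights=weights, k=1)[0]
    return RawWeapon(
        nft_id=next_nft_id,
        config_id=config_id,
        wear=random.random(),
    )
```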

Creating Something Out of Nothing - NFT Metadata Generation for Web3

First, we need to complete the weapon's metadata. Using the configuration ID, we can read the weapon's basic information, such as its type, name, and quality, from several configuration tables. However, for an NFT we cannot write every piece of weapon information into the metadata unfiltered. Therefore, we first run the weapon's full configuration through an attribute filter, which keeps only the attributes we want to present to players and writes them into the metadata.
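
A minimal sketch of this filtering step, assuming a hypothetical whitelist and configuration-table layout:

```python
# CONFIG_TABLES and the whitelist are illustrative assumptions,
# not the real configuration schema.
CONFIG_TABLES = {
    2001: {
        "weapon_type": "rifle", "name": "Judgment", "quality": 7,
        "recoil": 0.3, "drop_group": 12,  # internal-only fields, filtered out
    },
}

METADATA_WHITELIST = {"weapon_type", "name", "quality"}

def filter_attributes(config_id: int) -> dict:
    # Keep only the attributes we want players to see in the NFT metadata.
    config = CONFIG_TABLES[config_id]
    return {k: v for k, v in config.items() if k in METADATA_WHITELIST}
```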

Another problem we encountered is that many weapon attributes are raw data: for example, the number 7 represents legendary quality, but we cannot show the player a bare 7. We therefore added an attribute converter that turns these values into player-readable information. After filtering, the converter processes each attribute that goes into the metadata: on one hand it maps raw data to display values, and on the other it localizes every description in the process (English is currently the default). The result of this conversion is the metadata the player actually sees.
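
A minimal sketch of the converter, using an illustrative quality-name table and the trait_type/value shape many marketplaces read; the actual mapping and locale handling are assumptions:

```python
# Illustrative localized display names for the raw quality values.
QUALITY_NAMES = {
    "en": {5: "Rare", 6: "Epic", 7: "Legendary"},
}

def convert_attributes(attrs: dict, locale: str = "en") -> list:
    # Turn raw values into player-readable, localized trait entries.
    converted = []
    for key, value in attrs.items():
        if key == "quality":
            value = QUALITY_NAMES[locale].get(value, str(value))
        converted.append({"trait_type": key, "value": value})
    return converted

# e.g. convert_attributes({"quality": 7})
#   -> [{"trait_type": "quality", "value": "Legendary"}]
```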

Wonderful Writing - A Cloud Rendering Solution Based on Blender

Next, we need to handle the most distinctive and important part of the metadata: the picture of the weapon. Because each legendary weapon has unique wear information and its own set of decals, MATR1X FIRE contains a vast number of weapons with different wear levels and decal combinations, and we cannot have designers draw all of these pictures in advance. We therefore designed a weapon-picture rendering solution based on the Blender software and deployed it on cloud servers.

First, our technical art team built a material setup in Blender that matches the Unity engine, so that material resources created by designers render the same in both Blender and Unity. On top of this, we developed a material conversion tool between Blender and Unity, so that weapons already made in the game project can be imported into Blender automatically rather than remade. In addition, the designers built several lighting and camera rigs in the Blender scene for rendering pictures from different angles, and tuned the post-processing of the rendering environment to achieve the final look.
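
To make the pipeline concrete, here is a minimal sketch of what a headless render job could look like; the scene, node, and material names are hypothetical, but the bpy calls are standard Blender scripting:

```python
# Run headless as:
#   blender weapon_studio.blend --background --python render_weapon.py -- \
#       decal.png 0.25 out.png
import sys
import bpy

# Arguments after "--" belong to the script, not to Blender itself.
argv = sys.argv[sys.argv.index("--") + 1:]
decal_path, wear, out_path = argv[0], float(argv[1]), argv[2]

# Apply the weapon's unique decal texture and wear value to the material.
# "WeaponMaterial", "DecalTexture", and "WearFactor" are assumed node names.
mat = bpy.data.materials["WeaponMaterial"]
nodes = mat.node_tree.nodes
nodes["DecalTexture"].image = bpy.data.images.load(decal_path)
nodes["WearFactor"].outputs[0].default_value = wear

# Render a single still using the preconfigured camera and lighting rig.
scene = bpy.context.scene
scene.render.filepath = out_path
bpy.ops.render.render(write_still=True)
```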

Through this solution, we can create and generate exclusive resources for each legendary weapon based on its metadata, and finally render the NFT image for display, presenting players with a truly unique weapon.

From Point to Surface - Building a High-Performance Rendering Cluster

After solving the rendering of a single legendary weapon, the next problem is mass rendering and response speed. To address this, the team built a dedicated rendering cluster consisting of a central control node and multiple rendering load nodes.

When a new weapon is created in the game, its metadata is immediately sent to the cluster's central control node and enters the rendering queue. The central control node selects an idle node from the rendering load nodes and dispatches the rendering task to it. On receiving the task, the load node invokes its bound Blender renderer to render the weapon's NFT image and uploads it to cloud storage for later download.
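
A minimal sketch of the central control node's dispatch loop, assuming a hypothetical submit_to_node() RPC between the control node and a load node:

```python
import queue
import threading

render_queue: "queue.Queue[dict]" = queue.Queue()  # weapons waiting to render
idle_nodes: "queue.Queue[str]" = queue.Queue()     # addresses of idle load nodes

def submit_to_node(node: str, task: dict) -> None:
    # Placeholder for the RPC that hands a render task to a load node and
    # blocks until the node has rendered and uploaded the image.
    ...

def run_task(node: str, task: dict) -> None:
    try:
        submit_to_node(node, task)
    finally:
        idle_nodes.put(node)  # mark the node idle again, even on failure

def dispatch_loop() -> None:
    while True:
        task = render_queue.get()  # blocks until a weapon needs rendering
        node = idle_nodes.get()    # blocks until a load node is free
        threading.Thread(target=run_task, args=(node, task)).start()
```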

By building this rendering cluster, we can generate a weapon's exclusive NFT image for the player as soon as the weapon is obtained.

Back to Basics - Metadata and Resource Distribution for Mobile Devices

As an NFT in a GameFi project, this legendary weapon has one most important use: fighting with it in the game. How do we let a player who owns this legendary weapon wield the exact same weapon in-game? We did a great deal of work to achieve this.

For metadata, each weapon is stored in a metadata database when it is generated. During NFT trading and display, this database serves as the source of the metadata and supports putting the NFT on-chain via the MATR1X platform. When a player enters the game with the weapon, the game server also reads the weapon's information from this database, so the player receives the same weapon in-game.
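
For context, marketplaces typically read NFT metadata in the ERC-721 JSON shape sketched below; the field values are illustrative examples, not real "Judgment" data:

```python
# Assemble an ERC-721-style metadata document from the stored weapon record.
def build_metadata(weapon: dict, image_url: str) -> dict:
    return {
        "name": weapon["name"],
        "description": f"A {weapon['quality']} weapon from MATR1X FIRE",
        "image": image_url,  # the cloud-rendered NFT picture
        "attributes": [
            {"trait_type": "Quality", "value": weapon["quality"]},
            {"trait_type": "Wear", "value": weapon["wear"]},
        ],
    }
```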

To render the same weapon model in-game, the intermediate materials produced during NFT image rendering are retained in cloud storage. The game downloads these materials and combines them with its built-in assets to display the unique weapon. For better performance and stability, we also built a resource distribution network: every material used in the game is automatically compressed into a texture format suited to mobile GPUs, and resources of an appropriate specification are delivered according to the device's performance tier. In this way we balance the performance limits of mobile devices against the fidelity of the NFT images.
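
A minimal sketch of per-device texture spec selection; ASTC and ETC2 are common mobile GPU texture formats, and the tier table itself is an illustrative assumption:

```python
# Hypothetical spec table mapping device tiers to texture formats and sizes.
TEXTURE_SPECS = {
    "high": {"format": "ASTC_4x4", "max_size": 2048},
    "mid":  {"format": "ASTC_6x6", "max_size": 1024},
    "low":  {"format": "ETC2",     "max_size": 512},
}

def pick_spec(device_tier: str) -> dict:
    # Fall back to the lowest spec for unknown devices.
    return TEXTURE_SPECS.get(device_tier, TEXTURE_SPECS["low"])
```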

Going a Step Further - UGC and 3D Preview

Technology must keep iterating so we can present players with ever more exciting NFT display effects, and we have several ideas for the solution's future. First, we plan to extend the solution deeper into the resource production pipeline: offline rendering can do more than produce display images, since materials can be created in more complex and richer ways along the way. Through this technology, we hope to bring UGC content into both the game and the NFTs. Imagine players engraving their own 2061, KUKU, or other partner NFT artwork onto weapons as themes. Moreover, this technique is not limited to weapons; future assets, including characters, can use the same solution to achieve the same effect.

In addition, we are considering giving players a full 3D display of their game assets outside of the game. Thanks to the cloud offline rendering capability, we can do the heavy resource processing in the cloud and send the processed materials to a web page running on an H5 game engine. In this way, even people who do not play the game can appreciate and inspect the rich game assets of MATR1X FIRE.