Table of Contents
- Links
- Project Recap
- Key Points of the Project
- Media
Links

Label | Link |
--- | --- |
Mint | https://mint.nftkazoku.com/ (the site is closed, but you can view the media at the bottom of the page) |
Manager | https://manager.nftkazoku.com/ (the site is closed, but you can view the media at the bottom of the page) |
OpenSea | |
Etherscan | |
Twitter | |
Project Recap
The Kazoku is an NFT project started in January 2022. Its initial goal was to promote a web3 ecosystem integrated into The Sandbox.
Several features were developed for the project's benefit:
- Smart contract for minting
- Smart contract for tokens
- Smart contract for a marketplace
- Smart contract for a staking system
- Interface to interact with each smart contract
- DAO system (web2/web3)
- Infrastructure for automated deployment
The project took longer than planned to release, largely because of the communication difficulties inherent in this type of project.
The crypto crash caused by LUNA occurred during the project's launch period, which greatly slowed down sales.
The entire collection was minted by the community (~180 Holders), but the mint price had to be significantly reduced.
Key points of the project
The mint contract
This is the foundation of any NFT project: a smart contract capable of issuing non-fungible tokens. Several standards exist, each with its advantages and disadvantages. The best known are ERC721, ERC721A, and ERC1155. For this project, Azuki's ERC721A was chosen. This implementation offers cheap batch minting, like ERC1155, while keeping the basic ERC721 interface.
The ERC721A implementation in the Kazoku project differs in two respects:
URL management
The functions that return the URLs pointing to the NFTs' metadata are overridden. The goal of this change is to display a "placeholder" image until the real designs are revealed to users.
Adding a signlist
Many NFT projects use a whitelist system to reward certain members of their community. There are several ways to implement whitelists in Solidity, but the most gas-efficient technique I have found to date relies on address signatures:
- An account A signs the address of buyer B off-chain.
- The signature is passed as an argument when calling the mint function.
- The smart contract uses the recover method to verify the signature's authenticity against the sender's address.
An example of the function used to verify the validity of the signature:
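As a minimal sketch of this flow, here is a Python analogue. It substitutes an HMAC for the on-chain ECDSA signature and recover call, and the signer key and addresses are hypothetical:

```python
import hashlib
import hmac

# Stand-in for the ECDSA key held by signing account A (hypothetical).
SIGNER_SECRET = b"project-signer-key"

def sign_address(buyer_address: str) -> str:
    """Off-chain step: account A signs buyer B's address."""
    return hmac.new(SIGNER_SECRET, buyer_address.lower().encode(),
                    hashlib.sha256).hexdigest()

def verify_signature(sender_address: str, signature: str) -> bool:
    """On-chain analogue: recompute the signature from the sender's
    address and compare it to the one supplied with the mint call."""
    expected = sign_address(sender_address)
    return hmac.compare_digest(expected, signature)
```

In the real contract the check is done with ECDSA recovery (for example via OpenZeppelin's ECDSA library), so only the signer's public address needs to be known on-chain.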
I talk about this in more detail on my Instagram: https://www.instagram.com/p/Cb93bcYt1jt/.
The Resolver
When a third-party service wants to display the visual representation of an NFT, it must call the `tokenURI(uint256 tokenId)` function to obtain a URL where the NFT's information is hosted, usually in JSON format. Several services, such as Pinata, make it easy to upload your metadata online. However, I am not a fan of these solutions: you are no longer in control of your data, some services are paid, and you depend on their resilience.
That's why I set up a Python application using FastAPI and the Alchemy API to create my own metadata resolver.
The `tokenURI` method of my smart contract points to this micro-service running on one of my servers. I then use Alchemy's services to determine whether the requested NFT has already been minted; if so, I return the corresponding metadata.

(Figure: Alchemy dashboard showing the usage of their API.)
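The resolver's decision logic can be sketched as follows. This is a simplified stand-in: the minted-supply check is a plain function instead of an Alchemy API call, and the names and IPFS URIs are hypothetical:

```python
import json

IPFS_BASE = "ipfs://QmHypotheticalHash"   # hypothetical IPFS directory
PLACEHOLDER_IMAGE = "ipfs://QmPlaceholder"  # hypothetical placeholder image

def is_minted(token_id: int, total_minted: int) -> bool:
    # In production this check goes through the Alchemy API.
    return token_id < total_minted

def resolve_metadata(token_id: int, total_minted: int) -> str:
    """Return the JSON metadata served for tokenURI(token_id)."""
    if not is_minted(token_id, total_minted):
        # Token not minted yet: serve the placeholder design.
        return json.dumps({"name": f"Kazoku #{token_id}",
                           "image": PLACEHOLDER_IMAGE})
    return json.dumps({
        "name": f"Kazoku #{token_id}",
        "image": f"{IPFS_BASE}/{token_id}.png",
        "attributes": [],  # the real attributes come from the server
    })
```

In the real service these functions sit behind FastAPI routes, one per token ID.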
And what about data decentralization?
This system has a flaw: the NFTs' metadata is stored on a conventional server. However, this is not a fatal flaw for this type of project.
- Images are stored on IPFS; only metadata such as the name, description, and attributes are stored on the server.
- To ensure data integrity, the hashes of all metadata are made public at the beginning of the sale, and a means is given to verify their validity.
- The user always has the option to Freeze their data via OpenSea.
- Most NFT projects want to make some adjustments shortly after release. This setup gives them a little more flexibility while staying in line with their community.
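The hash-publication idea above can be sketched like this; the hashing scheme (canonical JSON + SHA-256) and the sample metadata are assumptions for illustration:

```python
import hashlib
import json

def metadata_hash(metadata: dict) -> str:
    """Deterministic SHA-256 hash of one NFT's metadata."""
    canonical = json.dumps(metadata, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Before the sale: publish the hash of every token's metadata.
published = {
    7: metadata_hash({"name": "Kazoku #7",
                      "attributes": [{"trait_type": "Faction",
                                      "value": "Water"}]}),
}

def verify(token_id: int, served_metadata: dict) -> bool:
    """Anyone can later check that the metadata being served still
    matches the hash published before the sale."""
    return metadata_hash(served_metadata) == published[token_id]
```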
DApps and other features
This project is not only about a mint contract. In its initial version, a whole ecosystem of DApps was supposed to be deployed. Namely, a staking application, a marketplace, an ERC20 token, and a voting system.
The ERC20 token called $OHZ could only be generated through the staking system. It was used to purchase items in the marketplace.
To regulate the number of tokens in circulation, the tokens spent on marketplace purchases were burned.
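The mint-through-staking and burn-on-purchase mechanics can be sketched as a toy supply model; the class and method names are hypothetical:

```python
class OhzToken:
    """Toy model of the $OHZ supply mechanics (names are hypothetical)."""

    def __init__(self):
        self.total_supply = 0
        self.balances: dict[str, int] = {}

    def mint_reward(self, holder: str, amount: int) -> None:
        # Only the staking system creates new tokens.
        self.balances[holder] = self.balances.get(holder, 0) + amount
        self.total_supply += amount

    def buy_item(self, holder: str, price: int) -> None:
        # Marketplace purchases burn the tokens spent,
        # shrinking the circulating supply.
        if self.balances.get(holder, 0) < price:
            raise ValueError("insufficient $OHZ")
        self.balances[holder] -= price
        self.total_supply -= price
```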
The staking system is divided into two categories:
- Active staking: you deposit your NFT into a contract to generate rewards.
- Passive staking: you keep your NFT in your wallet and still generate rewards.
A rigorous implementation was required for this contract: many parameters had to be taken into account, such as the NFT's faction, its rarity, and whether that faction was in a winning period.
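As an illustration of how these parameters could combine, here is a hypothetical reward formula; none of the rates or multipliers below are the project's actual values:

```python
# All constants here are illustrative assumptions, not the real rates.
BASE_RATE = 10.0  # $OHZ per day (hypothetical)
RARITY_MULTIPLIER = {"common": 1.0, "rare": 1.5, "legendary": 2.0}

def daily_reward(rarity: str, faction_is_winning: bool, active: bool) -> float:
    """Combine rarity, faction status, and staking mode into a daily reward."""
    reward = BASE_RATE * RARITY_MULTIPLIER[rarity]
    if faction_is_winning:
        reward *= 1.25   # faction bonus (hypothetical)
    if not active:
        reward *= 0.5    # assume passive staking earns less (hypothetical)
    return reward
```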
The voting system is more traditional, relying on an API linked to an RPC to verify whether the user has NFTs from the collection and thus check their eligibility to vote. Once a person votes, their address is temporarily blacklisted.
This web2 system was chosen by the project managers to avoid excessive gas fees for users and for the Kazoku ecosystem.
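The eligibility check and temporary blacklist can be sketched as follows; the RPC-backed ownership lookup is stubbed with a dict, and all names are hypothetical:

```python
# Stub for the RPC call that counts a wallet's NFTs from the collection.
holders = {"0xaaa": 3, "0xbbb": 0}
blacklist: set[str] = set()

def can_vote(address: str) -> bool:
    """Eligible if the address holds at least one NFT and has not voted yet."""
    return holders.get(address, 0) > 0 and address not in blacklist

def cast_vote(address: str, choice: str, tally: dict) -> None:
    if not can_vote(address):
        raise PermissionError("not eligible or already voted")
    tally[choice] = tally.get(choice, 0) + 1
    blacklist.add(address)   # temporary blacklist until the vote closes
```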
Integration on a website
Two Next.js applications have been deployed for the project. One to allow NFT minting and another to give users the ability to interact with the different DApps of the project.
A web3 module lets users connect the sites with various wallet providers.
Once connected, users can freely use the DApps with their wallet and sign transactions. Infura services are used to power certain providers.
Deployment automation
To allow for maximum responsiveness during the deployment and development phases, many tasks are automated.
For smart contract deployment, scripted pipelines were created to avoid mistakes that would incur high gas fees.
Numerous unit and functional tests cover all project features, and gas-reporting modules predict how much gas each method will consume. Security audits are also conducted with tools such as Slither.
For deploying the web applications, database, reverse proxy, and resolver, Docker and Docker Compose are a great help. They make it possible to quickly launch, rebuild, modify, or stop the entire application stack, or just part of it. This is very useful, for example, for quickly scaling server capacity up or down depending on the situation.
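As an illustration, a Docker Compose stack of roughly this shape could tie the pieces together; the service names, images, and ports below are hypothetical, not the project's actual configuration:

```yaml
# Hypothetical sketch of the stack described above.
services:
  web:
    build: ./frontend          # Next.js mint/manager apps
    ports: ["3000:3000"]
  resolver:
    build: ./resolver          # FastAPI metadata resolver
    depends_on: [db]
  db:
    image: postgres:14
    volumes: [dbdata:/var/lib/postgresql/data]
  proxy:
    image: nginx:alpine        # reverse proxy in front of everything
    ports: ["80:80", "443:443"]
    depends_on: [web, resolver]
volumes:
  dbdata:
```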
Cloudflare is also used as a proxy in front of the production servers. Its caching reduces server load, and it adds a layer of security to the project.