NFT Cryptopunk Generation Using Machine Learning Algorithm (DCGAN)
DOI: https://doi.org/10.5755/j02.eie.38435
Keywords: Blockchain, Cryptopunks, DCGAN, Non-fungible token (NFT)
Abstract
A non-fungible token (NFT) is a digital asset that certifies ownership or authenticity of a unique item or piece of content, such as artwork, music, film, or a tweet. This study investigates how a deep convolutional generative adversarial network (DCGAN) can be used to generate distinctive Cryptopunk images that can be minted as NFTs. Cryptopunks, a pioneering form of NFT, were introduced on the Ethereum blockchain in 2017 as part of a social experiment and have since become popular collectibles in the NFT community. To create new, previously unseen characters, we trained a model on a dataset of existing Cryptopunks using the DCGAN architecture. To raise the quality of the generated images, we tested various hyperparameter settings and layer combinations, and we assessed the outputs against several criteria, including the inception score and the Fréchet inception distance, to verify that they were distinctive and of high quality. Our experiments yielded a 15 % increase in the inception score and a 20 % decrease in the Fréchet inception distance, showing that our DCGAN model produces images that are more visually appealing and closer in quality to real Cryptopunks. These results highlight the effectiveness of our machine learning approach in improving the quality and uniqueness of NFT assets.
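Although the abstract does not publish the training code, the setup it describes maps closely onto the standard DCGAN recipe. The sketch below is a minimal, hypothetical PyTorch implementation, assuming the Cryptopunk images are resized to 64 × 64 RGB and normalized to [-1, 1]; the latent size, layer widths, and Adam settings follow the common DCGAN defaults (Radford et al.) rather than the authors' exact hyperparameters.

```python
# Minimal DCGAN sketch (assumption: 64x64 RGB punks normalized to [-1, 1]).
# Layer sizes, LATENT_DIM, and optimizer settings are illustrative defaults,
# not the paper's reported configuration.
import torch
import torch.nn as nn

LATENT_DIM = 100  # assumed size of the noise vector z

class Generator(nn.Module):
    """Maps a latent vector z to a 3x64x64 image via transposed convolutions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT_DIM, 512, 4, 1, 0, bias=False),  # -> 4x4
            nn.BatchNorm2d(512), nn.ReLU(True),
            nn.ConvTranspose2d(512, 256, 4, 2, 1, bias=False),         # -> 8x8
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),         # -> 16x16
            nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),          # -> 32x32
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 3, 4, 2, 1, bias=False),            # -> 64x64
            nn.Tanh(),  # outputs in [-1, 1], matching the normalized images
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores a 3x64x64 image as real or generated via strided convolutions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1, bias=False),     # -> 32x32
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, 2, 1, bias=False),   # -> 16x16
            nn.BatchNorm2d(128), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 256, 4, 2, 1, bias=False),  # -> 8x8
            nn.BatchNorm2d(256), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(256, 512, 4, 2, 1, bias=False),  # -> 4x4
            nn.BatchNorm2d(512), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(512, 1, 4, 1, 0, bias=False),    # -> single logit
        )

    def forward(self, x):
        return self.net(x).view(-1)

# One adversarial update step: BCE-with-logits on real vs. generated batches.
if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    opt_g = torch.optim.Adam(g.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(d.parameters(), lr=2e-4, betas=(0.5, 0.999))
    loss_fn = nn.BCEWithLogitsLoss()

    real = torch.randn(8, 3, 64, 64)        # stand-in for a real punk batch
    z = torch.randn(8, LATENT_DIM, 1, 1)

    # Discriminator step: push real scores toward 1, fake scores toward 0.
    fake = g(z).detach()
    loss_d = loss_fn(d(real), torch.ones(8)) + loss_fn(d(fake), torch.zeros(8))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: fool the discriminator into scoring fakes as real.
    loss_g = loss_fn(d(g(z)), torch.ones(8))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

The inception score and Fréchet inception distance used in the evaluation can be computed with off-the-shelf metrics. One possible route, assumed here since the paper does not name its tooling, is torchmetrics (with its image extras installed):

```python
# Hedged sketch: evaluating generated punks with torchmetrics.
# Random tensors stand in for real and generated image batches.
import torch
from torchmetrics.image.fid import FrechetInceptionDistance
from torchmetrics.image.inception import InceptionScore

real = torch.rand(16, 3, 64, 64)   # placeholder real images, floats in [0, 1]
fake = torch.rand(16, 3, 64, 64)   # placeholder generated images

fid = FrechetInceptionDistance(feature=2048, normalize=True)
fid.update(real, real=True)
fid.update(fake, real=False)
print("FID:", fid.compute().item())    # lower is better

is_metric = InceptionScore(normalize=True)
is_metric.update(fake)
mean, std = is_metric.compute()        # higher mean is better
print("Inception score:", mean.item())
```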
License
Copyright for papers published in this journal is retained by the author(s), with first publication rights granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement, under which papers in the journal are licensed.
By virtue of their appearance in this open access journal, papers are free to use, with proper attribution, in educational and other non-commercial settings, with acknowledgement of their initial publication in the journal.
Funding data
King Saud University, Grant number RSPD2024R553