Blender, Lambda, and NFTs — Oh My!
A deep dive into the architecture behind Ethentic’s beautification layer.
Johnny and I (Spools) have known each other for a long time, so when Johnny approached me in the spring of 2021 about an NFT project with a truly physical aspect, there was only one response I could provide:
After witnessing the incredible success of and interest in platforms like Art Blocks and fxhash, we knew that we wanted to create something that was generative by nature. Thanks to Johnny’s experience with 3D printing, we wanted to take our project one step further and provide art that was not only generated via algorithm, but could also be fabricated in the physical world. For the two months prior, I had been teaching myself 3D modeling and rendering in Blender, so it was all the more serendipitous that Ethentic would require some level of image-rendering automation to complete its beautification layer.
This article will act as a window into the effort that went into creating and architecting Ethentic’s beautification layer. I will touch on the software we used and the methods we implemented to create a generated-at-mint, 3D-printable NFT collection whose beautification layer consists of not only a few JPEGs, but also a browser-based 3D renderer that truly showcases the added physical dimension of the Causeways collection.
What the heck is an STL and how do I make it beautiful?
One of the main challenges of a physical collection is, well, making sure that it can actually be 3D-printed, so we had to ensure that our algorithm could output to a file format that was widely accepted across most modern 3D printers. STL (short for stereolithography) is a file format fully supported by the majority of 3D printers, as well as most CAD applications. Moreover, STLs can be easily imported into Blender, which made them a perfect choice for satisfying the digital-rendering aspect of the Causeways collection.
For those not familiar with Blender, it is a free and open-source 3D-rendering software that can be used for digital art, film-making, architecture, and much, much more. There is an insanely helpful and creative community for Blender, and many of these members have provided enhancements to the software with user-created plugins. Additionally, if you come from a developer background, Blender has an extremely robust Python API, allowing you to automate any action of the software via script and command line, which is a huge plus for projects that need as little manual touch points as possible.
Ultimately, we were able to use Blender to create the scene for our digital images, and to automate the placement of the STL, the material applied to it, and other aspects derived from the on-chain data stored in the Ethentic smart contract. With a single Python script and Blender, we were able to create the beautiful digital renders that can be found on our website.
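To give a sense of what that automation looks like, here is a minimal sketch of how a headless Blender invocation can be assembled from Python. The script name and the flags after the `--` separator are hypothetical illustrations; only `--background` and `--python` are standard Blender CLI options.

```python
def build_blender_command(blend_file, script_path, stl_path, output_path, seed):
    """Build a headless Blender invocation. Everything after the `--`
    separator is ignored by Blender and passed through to the Python
    script (readable there via sys.argv)."""
    return [
        "blender", blend_file,
        "--background",           # run without the GUI
        "--python", script_path,  # the automation script
        "--",                     # separator: remaining args go to the script
        "--stl", stl_path,        # hypothetical script arguments
        "--out", output_path,
        "--seed", str(seed),
    ]

cmd = build_blender_command(
    "scene.blend", "render_causeway.py",
    "causeway_42.stl", "causeway_42.png", 12345,
)
# subprocess.run(cmd, check=True)  # requires Blender on the PATH
```

Running this command for each mint is exactly the step we needed to automate.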
Okay cool, but how do I avoid manually calling this command 2500 times?
The Causeways collection consists of 2500 unique pieces, all of which have to be rendered after the NFT is minted (the trait metadata and seed used to create the STL are only generated once a token has been minted). For quite obvious reasons, waiting around for an individual to mint from our collection and then running our Blender automation manually was not a tenable solution (a developer has to sleep sometime, right?). Thus, we needed a way to systematically trigger our Blender script. Moreover, because people can mint at any time, we had to ensure that this could happen asynchronously.
This is where we introduced two new tools to our beautification pipeline: Docker and AWS Lambda. Docker images can be used to create containerized, portable Linux environments that are tailored to a specific application and need. Using Docker, we were able to containerize our Blender script so that it could run anywhere the Docker Engine is supported.
AWS Lambda is a serverless solution that offers near-infinite scalability, as well as event-driven architecture (e.g., when an NFT is minted 😉). We chose AWS Lambda specifically because, whereas most other serverless solutions provide only generic runtime environments (e.g., Python, NodeJS, Go), Lambda supports the use of custom Docker images as the runtime environment.
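Our actual handlers ran on a NodeJS runtime, but the shape of a container-based Lambda entry point is easy to sketch in Python; the event field names here are illustrative assumptions, not our real schema.

```python
import json

def handler(event, context=None):
    """Lambda-style entry point: unpack the mint payload from the HTTPS
    event body, then hand it off to the render step."""
    body = json.loads(event["body"])
    token_id = body["tokenId"]  # hypothetical field names
    seed = body["seed"]

    # In the deployed container, this is where the OpenSCAD/Blender
    # command would actually be executed (e.g., via subprocess.run).
    return {
        "statusCode": 200,
        "body": json.dumps({"tokenId": token_id, "seed": seed, "queued": True}),
    }

resp = handler({"body": json.dumps({"tokenId": 7, "seed": "0xabc"})})
```

Because each invocation is independent, Lambda happily runs these in parallel whenever several mints land at once.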
Our setup consisted of two separate Lambda functions:
- A Dockerized version of OpenSCAD, the programming language our algorithm uses and the program that produces the 3D-printable STL file (you can read more about our experience with OpenSCAD in this article).
- A Docker container that uses a NodeJS runtime for handling logic and running commands, built on top of the New York Times’ publicly available Blender Docker image.
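As a rough illustration, a container image for a Lambda like this might look as follows. The base-image tag, file names, and handler path are all assumptions (our real image used a NodeJS runtime), but the key idea holds: AWS requires a Runtime Interface Client, such as `awslambdaric`, for custom images to speak the Lambda invocation API.

```dockerfile
# Illustrative sketch only — the tag and paths are assumptions.
FROM nytimes/blender:latest

# Lambda custom images need a Runtime Interface Client to handle
# the Lambda invocation API.
RUN pip install awslambdaric

# The automation script and the pre-built Blender scene.
COPY render_causeway.py scene.blend /app/
WORKDIR /app

ENTRYPOINT ["python", "-m", "awslambdaric"]
CMD ["render_causeway.handler"]
```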
Putting the Pipeline Together
So we have our Blender script for our renders, and we have a way to run it asynchronously in a fully automated manner. The final pieces required to complete our pipeline were:
a) A watcher that observes events occurring on the Ethentic smart contract. The watcher captures the seed and traits generated when a Causeway is minted and delivers them in an HTTPS event that triggers the OpenSCAD Lambda.
b) Somewhere to store our beautification-layer assets. The OpenSCAD Lambda produces an STL file and a preview JPEG, and the Blender Lambda produces the HD render of the Causeway, all of which need to go somewhere! We stored these assets in two backends: DigitalOcean and the InterPlanetary File System (IPFS) via Fleek.
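To make the watcher step concrete, here is a sketch of how a decoded mint event could be packaged into the HTTPS request that wakes the OpenSCAD Lambda. The endpoint URL and field names are hypothetical stand-ins:

```python
import json
import urllib.request

LAMBDA_URL = "https://example.com/openscad-trigger"  # hypothetical endpoint

def build_trigger_request(token_id, seed, traits):
    """Package the on-chain mint data into the HTTPS request that
    triggers the OpenSCAD Lambda. Field names are illustrative."""
    body = json.dumps({"tokenId": token_id, "seed": seed, "traits": traits})
    return urllib.request.Request(
        LAMBDA_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_trigger_request(7, "0xabc", ["bridge", "river"])
# urllib.request.urlopen(req)  # fire the trigger (not executed here)
```

The watcher simply loops over new contract events and fires one of these requests per mint.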
Finally, with all of the puzzle pieces assembled, we get this flow:
But wait: How do we truly showcase something 3D?
No offense to JPEGs, but an image can only capture a single, two-dimensional still of a 3D object. For people buying a Causeway, we had to ensure that they could experience the art in its full, three-dimensional glory. Moreover, because the entirety of the website is served statically over a CDN, whatever solution we used had to render everything client-side.
ThreeJS is a fantastic library for WebGL that supports client-side rendering. Additionally, there is a wonderful React-based library for ThreeJS called react-three-fiber that makes building ThreeJS scenes as React components very simple. However, when dealing with 3D assets, especially those that will be rendered client-side, it is important to manage the size and complexity of your 3D scene so that the user isn’t stuck waiting an unacceptably long time for your application to render.
Using Blender, we were able to export the basic layout of our 3D scene as a GLB file and bake the scene’s lighting into a JPEG that could be applied to the objects in that GLB file. This way, everything in our 3D scene except the user’s Causeway STL file could be pre-baked, leading to a much faster render time, as ThreeJS only had to calculate the lighting applied to the Causeway and nothing else in the scene.
Ultimately, this allowed us to create a 3D viewing-room component on our site where all of the Causeway files can be viewed and manipulated in 3D space, capturing the full experience of the artwork we provide to our minters.
There were many challenges in getting Ethentic ready for the launch of the Causeways collection. From the novel use of sorting algorithms in the smart contract (article to come!) to the additional complexity required to showcase a 3D-printable asset, the journey to completion has been a long but extremely gratifying one.
We hope that Ethentic will inspire others to push the boundaries of what is possible with digital collectibles, and to explore uncovered ground in the realms of generative art.