“In another reality I am half-human, half-machine. I can read Base64 and see you.”
Kim and I created self-portraits for the future.
The self-portraits are Base64 representations of images taken with a webcam. The ideal viewer is a human-machine hybrid/cyborg that is capable of both decoding etched Base64 and recognizing the human element of the artifact.
After going through several iterations with felt, honey and rituals, we finally settled on etching timeless digital code onto a physical medium that ages over time.
The image is taken with a Node.js chat app that I created for LiveWeb: it takes a picture with the user's webcam every three seconds and shows the Base64 code on the page, again an example of an interface for the human-machine hybrids or cyborgs of the future.
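The core of the encoding step can be sketched in a few lines. This is a minimal illustration, not the app's actual code: in the browser the Base64 string comes straight from the canvas via `canvas.toDataURL('image/jpeg')`, while the Node.js version below uses `Buffer` on hypothetical placeholder bytes to show the same transformation.

```javascript
// Sketch of the encoding step: raw image bytes in, Base64 text out.
// The four bytes here are a hypothetical stand-in for a JPEG frame;
// in the browser app, canvas.toDataURL('image/jpeg') yields the
// Base64 directly after drawing the webcam frame onto a canvas.
const jpegBytes = Buffer.from([0xff, 0xd8, 0xff, 0xe0]);

// Encode: this Base64 text is what gets shown on the page (and etched).
const base64 = jpegBytes.toString('base64');
console.log(base64); // "/9j/4A=="

// Decode: the future cyborg viewer simply reverses the step.
const decoded = Buffer.from(base64, 'base64');
console.log(decoded.equals(jpegBytes)); // true
```

The round trip is lossless, which is what makes the etched code a genuine portrait rather than a visual texture: the image survives intact inside the text.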
After taking the portraits with my web app, we copied the code for each of us into macOS TextEdit, exported it as a PDF, pasted its contents into Adobe Illustrator, and converted the type to outlines. We chose Helvetica because it remains legible even at very small sizes; since our laser's computer did not have Helvetica installed, we outlined all the letters.
The files were very large, as the portraits varied between 160,000 and 180,000 characters. After two days of test etchings on Crescent medium gray and Crescent black mounting boards, and experiments with different laser settings (speed, power and focus), we started the final etchings: speed 90, power 55, 600 dpi, and a focus slightly closer to the material than usual produced the best results on our 75-watt laser. Each portrait took 1 hour 33 minutes to etch; in total we completed four portraits.
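The character counts line up with Base64's fixed overhead: every 3 bytes of image data become 4 characters of text. A small sketch of the arithmetic (the 170,000-character figure is just the midpoint of our range) shows roughly how much raw image data each etched portrait carries.

```javascript
// Base64 turns every 3 input bytes into 4 output characters,
// so encodedLength = 4 * ceil(rawBytes / 3).
const encodedLength = (rawBytes) => 4 * Math.ceil(rawBytes / 3);

// Inverting (ignoring padding): rawBytes ≈ encodedChars * 3 / 4.
const rawBytes = (encodedChars) => Math.floor((encodedChars * 3) / 4);

console.log(rawBytes(170000));     // ≈ 127,500 bytes (~125 KB) of image data
console.log(encodedLength(127500)); // back to 170,000 characters
```

So each etched board holds on the order of 125 KB of webcam JPEG, expanded by a third into the text that the laser spent 93 minutes writing.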