Digital Fabrication Final: Clay VR Future Nudes

Continuing from our explorations with base64 and clay, Kim and I created a combined VR/physical sculpture that imagines cyborg art of the future.

concept

We meditated the whole semester on the relationship between our physical and our digital bodies, specifically on the space between the physical and the digital object. We tried to deconstruct the notion of sculpture as a form of human memory down to the blueprint of code - only to re-imagine it in VR.

We called our piece “future nudes”, as only cyborgs in the future will be able to see the real body behind the sculpture without the help of VR goggles - we imagine them being able to decipher code in real time.

process

A depiction of the artist’s body is laser-etched in binary code into wet clay, spread out over 12 clay tablets. These tablets are fired in a traditional kiln, like the earliest form of human written media, Babylonian cuneiform. Images of the tablets are then placed in an immersive VR environment: when the audience touches a real clay tablet, they see their hands interacting with the physical object of the tablet - which is itself the depiction of a real person converted into a digital blueprint, binary code. In the final version, the pixels of the body parts on each tablet pop up in the VR environment when the fingers touch it.

As a first step, the highly pixelated image (pixelated to minimize the binary file size) is converted to binary via a Python script (one part of the image is shown here):

male_torso_illustrator_binary.png
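The script essentially maps every pixel of the black-and-white image to a 0 or a 1. Here is a minimal sketch of that step, assuming the Pillow library and a hypothetical file name - not our exact script:

```python
# Minimal sketch: turn a black-and-white image into rows of 0s and 1s.
# Pillow usage and the file name are assumptions, not the original script.
from PIL import Image

def image_to_binary(path, threshold=128):
    img = Image.open(path).convert("L")  # load and convert to grayscale
    rows = []
    for y in range(img.height):
        row = ""
        for x in range(img.width):
            # dark pixel -> 1, light pixel -> 0 (an arbitrary convention)
            row += "1" if img.getpixel((x, y)) < threshold else "0"
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(image_to_binary("male_torso_illustrator.png"))
```

Each row of digits corresponds to one row of pixels of the body image, which is what eventually gets etched onto the tablets.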

After that we prepared the low-firing raku clay for laser-etching and kiln-firing. After applying manual focus on the 75-watt laser we used the following settings:

  • 600 dpi raster etching

  • 90 % speed

  • 30 % power

We got the best results using fresh clay straight out of the packet, without adding any water to it. The clay fired well at cone 04 for the bisque fire.

Here are a few pics from our process:

IMG_7751.JPG
IMG_7827.JPG
IMG_8074.JPG
Oi%H9qYxSQ+WRE0a4f5V7A.jpg
IMG_8350.jpg

And finally, the experimental and still unfinished port into VR using Unity, an HTC Vive and Leap Motion (in the final version, the pixels of the body parts pop up in the VR environment when the audience’s fingers touch the tablet):

IMG_8114.JPG


Machine Learning for the Web: Code Bodies with tensorflow.js

concept

We are code. At least for a great part of our lives. How can we relate to this reality/non-reality with our bodies in the browser?

tech setup

I experimented with tensorflow.js and its person-segmentation model running the smaller MobileNet architecture. I used multiple canvases to display the camera stream’s base64 code in real time behind the silhouette of the person detected in the webcam. Given that I do pixel manipulation on two different canvases and run a tensorflow.js model at the same time, it still runs relatively fast in the browser - although the frame rate is visibly slower than a regular stream with just the model.
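For reference, here is a minimal sketch of that layering, assuming the current BodyPix package (the successor of the person-segmentation model) and hypothetical element ids - not my exact code. The webcam frame is encoded as base64 for a background canvas, and the segmentation mask cuts the person’s silhouette out of a foreground canvas:

```javascript
// Minimal sketch, not the original code: BodyPix (successor of the
// person-segmentation model) plus two stacked canvases. Ids are assumptions.
import '@tensorflow/tfjs';
import * as bodyPix from '@tensorflow-models/body-pix';

const video = document.getElementById('webcam');
const codeCanvas = document.getElementById('code-layer');     // base64 text behind
const personCanvas = document.getElementById('person-layer'); // silhouette in front
const grab = document.createElement('canvas');                // offscreen frame grab

async function setup() {
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();
  const net = await bodyPix.load(); // MobileNet weights by default
  requestAnimationFrame(() => draw(net));
}

async function draw(net) {
  // 1. Grab the current frame and encode it as base64.
  grab.width = video.videoWidth;
  grab.height = video.videoHeight;
  grab.getContext('2d').drawImage(video, 0, 0);
  const base64 = grab.toDataURL('image/jpeg', 0.3).split(',')[1];

  // 2. Fill the background canvas with the running base64 text.
  const codeCtx = codeCanvas.getContext('2d');
  codeCtx.fillStyle = '#000';
  codeCtx.fillRect(0, 0, codeCanvas.width, codeCanvas.height);
  codeCtx.fillStyle = '#fff';
  codeCtx.font = '10px monospace';
  const perLine = 80;
  for (let i = 0; i * perLine < base64.length; i++) {
    codeCtx.fillText(base64.substr(i * perLine, perLine), 0, 10 + i * 10);
  }

  // 3. Segment the person and keep only their silhouette on the front canvas.
  const segmentation = await net.segmentPerson(video);
  personCanvas.width = grab.width;
  personCanvas.height = grab.height;
  const frame = grab.getContext('2d').getImageData(0, 0, grab.width, grab.height);
  for (let i = 0; i < segmentation.data.length; i++) {
    if (segmentation.data[i] === 0) frame.data[i * 4 + 3] = 0; // background -> transparent
  }
  personCanvas.getContext('2d').putImageData(frame, 0, 0);

  requestAnimationFrame(() => draw(net));
}

setup();
```

In this sketch the two canvases are simply stacked with CSS, so the code layer shows through wherever no person is detected.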

prototypes

A brief screen recording:

Another version with a bigger screen:

 


 

Digital Fabrication Final: Future Nudes

For our final project, Kim and I were experimenting with laser-etching wet clay this week - a poetic and exciting exploration!

To make it short: so far the results were surprisingly great. That said, we still need to fire the clay in a kiln before we can give a final verdict.

Here is the basic setup for cutting the clay:

IMG_2948.JPG

We used low-firing white raku clay, as we will use a kiln that fires at cone 06. It had a nice plasticity and was easy to work with. The tricky part was getting a consistent height for the clay slab. To achieve that, we used a kitchen roller and two pieces of plywood of equal height to roll evenly over the clay. We then cut it to shape.

IMG_2951.JPG
IMG_2952.JPG

To keep clay particles from falling through the laser bed we used felt.

After applying manual focus on the 75-watt laser we used the following settings:

  • 600 dpi raster etching

  • 90 % speed

  • 30 % power

and it worked well in two passes. Now we just have to let it dry for two days, do a first firing, glaze one part and then fire it again. It’s still not clear how it will turn out - something to look forward to!

IMG_2957.JPG

Digital Fabrication Final Proposal: Future Nudes

Kim and I want to create nude portraits/sculptures for the future (see a more detailed blogpost on Kim’s website).

After laser-etching our portraits in Base64 code on mat-board for our midterms, we decided to continue with this topic and play more with form and materials:


We want to use either Base64 or binary code for encoding the image. As the base material for the project we want to iterate on etching wet clay. It seems to be the best material, as it preserves our portraits pretty much forever (the earliest form of human writing, cuneiform, was “hand-etched” into clay). It has the advantage that we could later form it into a sculpture or play with the shapes before firing. And it has a lot of symbolic meaning regarding the human body in different contexts (Bible, Kabbalah, …).

Etching into clay and then forming it is pretty experimental; only very few sources online have tried etching wet clay, with mixed success. So we gotta play!

Why nude portraits? We like the idea that the portraits of our bodies will only be visible in a distant future - once humans are able to decipher code naturally, maybe as human-machine hybrids. We abstract our bodies into code and preserve them for a time when our bodies will have changed a lot: we might be half human, half machine by then. This aspect of preservation for a distant future reminds us of the SKOR Codex from a few years ago.

For the look of the ceramics we are inspired by the rough surfaces of dark stoneware, dark clay and playful sculptural explorations.

An interesting iteration would be collecting a couple of different nudes, then cutting the pieces up into a row of beaded curtains that people could walk through - so they could interact with our bodies in code.

Digital Fabrication Midterm: Future Portraits

“In another reality I am half-human, half-machine. I can read Base64 and see you.”

final iteration

Kim and I created self-portraits for the future.

concept

The self-portraits are Base64 representations of images taken by a web-camera. The ideal viewer is a human-machine hybrid/cyborg that is capable of both decoding etched Base64 and recognizing the human element of the artifact.

process

We went through several iterations with felt, honey and rituals: Our initial idea of capturing a moment in time and etching or cutting it into an artifact morphed into an exploration of laser-cut materials with honey. We were interested in the symbolic meaning of honey as a natural healing material. Our goal was to incorporate it into a ritual to heal our digital selves, represented by etched Base64 portraits that were taken with a webcam and encoded. We used felt as it is made of wool fibers that are not woven but stick to each other through the application of heat and pressure. This chaotic structure seemed a great fit for honey and clean code:

IMG_2855.JPG

Soon we dropped the idea of using felt, as it seemed to be too much of a conceptual add-on, and reduced the piece to honey and digital etchings - the healing should be applied directly onto the material in a ritual.

IMG_2854.JPG

After gathering feedback from peers and discussing the ritualistic meaning, we struggled to justify the honey as well: most of the people we talked to preferred the idea that only human-machine hybrids of a near or far future would technically be able to see the real human portrait behind the code. After a few discussions about both variations of our concept dealing with digital identities and moments in time, we favored the latter one and dropped the honey.

So we finally settled on etching timeless digital code onto a physical medium that ages over time - self-portraits for the future: maybe we will look back at them in 20 years and see, through the code, our own selves from back in the day?

A little bit about the technical process: the image is taken with a Node.js chat app that I created for LiveWeb. It takes a picture with the user’s webcam every three seconds and shows the Base64 code on the page - again an example of an interface for human-machine hybrids or cyborgs of the future.

machine_facetime.gif
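A minimal sketch of that capture loop on the client side, with hypothetical element names - not the actual app code:

```javascript
// Minimal sketch of the webcam-to-Base64 loop (element ids are assumptions;
// the real app is a Node.js chat app, this only shows the capture part).
const video = document.getElementById('webcam');
const output = document.getElementById('base64-output');
const frame = document.createElement('canvas');

async function start() {
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();
  setInterval(capture, 3000); // one portrait every three seconds
}

function capture() {
  frame.width = video.videoWidth;
  frame.height = video.videoHeight;
  frame.getContext('2d').drawImage(video, 0, 0);
  // toDataURL returns "data:image/jpeg;base64,<code>" - we display only the code.
  output.textContent = frame.toDataURL('image/jpeg').split(',')[1];
}

start();
```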

After taking portraits with my web app, we copy-pasted the code for each of us into macOS TextEdit, exported it as a PDF, copy-pasted its contents into Adobe Illustrator and outlined the font, Helvetica. We chose this font as it is very legible, even at a very small size. Our laser computer did not have Helvetica installed, which is why we outlined all the letters.

fullsizeoutput_310.jpeg

The files were very large, as portraits varied between 160,000 and 180,000 characters.

These digital preparations were followed by two days of test etchings on Crescent medium gray and Crescent black mounting boards and experiments with different laser settings (speed, power and focus).

IMG_2912.JPG

We discovered that getting the focus right was difficult: the laser beam seems to get weaker once it travels all the way to the right side of the bed. This makes the etching illegible on that side, whereas on the far left it is fine. Focusing a bit deeper into the material with the manual focus fixed this issue on the white cardboard, whereas on the black cardboard the fonts then looked blurry on the left side - it was etching too deep there. Finding a focus that fit both sides equally well took a long time and a lot of trial and error.

IMG_2915.JPG

Once we got the focus right, we started with the final etchings: speed 90, power 55, 600 dpi and a focus slightly closer to the material than usual produced the best results on our 75-watt laser.

Each portrait took 1 hour 33 minutes to etch; in total we completed four portraits.

We see them as a starting point for further iterations, as “exploration stage one”: The concept of creating physical artifacts for the future that preserve personal digital moments in time with code and laser is very appealing to us.

For the final, we will probably keep iterating on our portraits for the future.

IMG_6999.JPG
IMG_2887.JPG
IMG_2864.JPG
IMG_7004.JPG
IMG_6998.JPG

Digital Fabrication: Midterm Iterations on Healing, Honey and Felt

“In your backbone you feel a pointed something and it works its way up. The base of your spine is tingling, tingling, tingling, tingling. Then n|om makes your thoughts nothing in your head”

[Kxao ≠Oah - a healer from |Kae|kae area, quoted in Biesele, Katz & St Denis 1997:19]

(taken from JU|’HOANSI HEALING SONGS on NTS-radio)

IMG_6856.jpg



Screen Shot 2018-10-11 at 12.28.37 AM.png

For our midterm in Digital Fabrication, we iterated on the idea of swarm-based behavior. After reading more about Joseph Beuys’ use of felt and honey in his Fluxus performances in the late 60s, and about the JU|’HOANSI tribe and their use of healing through dance, we shifted our focus to the idea of a healing ritual for the digital age.

We are thinking about making a piece that uses three components in an interactive installation:

  • an image in Base64 code on paper (used for image encoding on the web)

  • felt with a perforated structure

  • honey

As shown in the image above, a depiction of the artists in a waterfall-like silhouette made from Base64 encoding is printed on paper. This is the foundation of the sculpture; it reflects the selves we want to heal. Above the paper is a layer of felt cut into strips and shaped into a web. It filters the honey dripping from above, which acts as part of the healing ritual: Beuys saw honey as a healing material, as it is gathered by bees, which to him represented a “peaceful” entity.

A coincidence we discovered while cutting felt with the laser: it smells like honey afterwards.

IMG_2852.JPG

Digital Fabrication: Felt_Laser Explorations

Our assignment this week took us into unknown lands of laser-cutting (unknown for us, at least…): I teamed up with Kimberly Lin and we went to Mood Fabrics in the NY fashion district to buy different shades of grey felt in a thicker quality (around 1/8 inch).

 

We opted for grey felt because of its iconic role in sculpture and art installations in the 20th century: Robert Morris, Joseph Beuys and Bianca Pratorious were our main inspirations for the choice of material. We were curious how the material could be cut with the laser - and how the digital process could alter the artistic output and execution. So we decided to play!

IMG_2831.JPG
IMG_6596.JPG
 

We used Vectorworks to create simple slots and lines for our first prototype. After copying the file into Adobe Illustrator we started laser-cutting - and to our surprise the felt turned out to be a great choice: it has a certain sturdiness and structural integrity that helps maintain the form, while still leaving a lot of movement and room for re-shaping the object. And the laser cuts it pretty fast, efficiently and precisely. We did three passes (to keep the material from burning too much) at conservative settings: 500 Hz frequency, 30 speed and 10 power worked well on the 60-watt laser.

We prototyped different shapes and arrangements of the fabric pieces as we plan to build a large-scale kinetic sculpture later in the semester.

IMG_6624.jpg
IMG_6620.jpg
IMG_6690.jpg
IMG_6677.jpg

After playing with different combinations we decided to keep a self-standing organic structure for now, and later experimented with how it would behave in motion.

Digital Fabrication: 2-D Object Drawing

I chose the Teenage Engineering OP-1 synthesizer for my Vectorworks object drawing assignment.

IMG_2818.JPG
 

It was challenging to get all the measurements correctly into the drawing; after a while I became a bit detail-obsessed, as Vectorworks gives you the opportunity to be very exact:

Screen Shot 2018-09-19 at 3.26.36 AM.png

It took me quite a while to get used to the basic tools in Vectorworks - but it was great fun, very meditative.

Here is my measured drawing:

op1_vectorworks1.png
 

And because it is such a beautiful object, here it is without measurements: