For this final, I combined the techniques I learned in “Live Web” with those from “Machine Learning for the Web” and built a peer-based meditation piece:
A user can experience the base64 code of a remote webcam stream flowing through their own body.
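As a rough sketch of how a frame becomes that stream of base64 characters: draw the camera frame onto a canvas, encode it as a JPEG data URL, and send the base64 part to the peer. Everything here is an assumption for illustration, not the project's exact code: `video` is a `<video>` element playing the phone camera, and `channel` is an already-open RTCDataChannel (the WebRTC signaling handshake is not shown).

```javascript
// Hypothetical sender sketch — not the original implementation.
const frameCanvas = document.createElement('canvas');
frameCanvas.width = 320;   // keep frames small so each message stays well
frameCanvas.height = 240;  // under typical data-channel message size limits

function sendFrame(video, channel) {
  frameCanvas.getContext('2d').drawImage(video, 0, 0, 320, 240);
  // toDataURL yields "data:image/jpeg;base64,...." — the characters after
  // the comma are the "code" of the frame that flows through the body.
  const base64 = frameCanvas.toDataURL('image/jpeg', 0.4).split(',')[1];
  channel.send(base64);
  requestAnimationFrame(() => sendFrame(video, channel));
}
```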
The result is a meditation experience reminiscent of the calming effect of watching static noise on TV screens. Because it is peer-based, the user's body becomes a vessel for the blueprint and essential idea of something or somebody else, which hopefully creates a sense of connectedness and community while meditating.
While staring at a screen might not be everyone's preferred way of meditating, the idea is worth exploring from a futuristic perspective: we will probably be able to stream information directly onto our retinas one day, so an overlaid meditation experience might be worth considering.
Here is a short video walkthrough streaming video frames from my mobile phone camera as base64 code through my own silhouette (the latter is picked up by the camera on my laptop and run through a body detection algorithm). Everything is web-based, with machine learning in the browser powered by tensorflow.js and WebRTC peer-to-peer streaming (code):
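On the receiving side, the silhouette masking could be sketched roughly as follows, assuming the BodyPix model from tensorflow.js is the body detection algorithm (a plausible choice, not confirmed by the post): segment the laptop camera feed, then make every text pixel outside the detected body transparent. `localVideo` and `textCanvas` are hypothetical names; tfjs and `@tensorflow-models/body-pix` are assumed loaded via `<script>` tags, exposing the `bodyPix` global.

```javascript
// Hypothetical receiver sketch — not the original implementation.
// `textCanvas` (same size as the video) is assumed to already hold the
// incoming base64 characters drawn as rows of text; in the real piece the
// text would be redrawn each frame before masking.
async function maskTextWithBody(localVideo, textCanvas) {
  const net = await bodyPix.load();            // load the segmentation model once
  const ctx = textCanvas.getContext('2d');

  async function render() {
    const segmentation = await net.segmentPerson(localVideo); // 1 = person pixel
    const frame = ctx.getImageData(0, 0, textCanvas.width, textCanvas.height);
    // Zero the alpha channel of every character pixel outside the silhouette.
    for (let i = 0; i < segmentation.data.length; i++) {
      if (segmentation.data[i] === 0) frame.data[i * 4 + 3] = 0;
    }
    ctx.putImageData(frame, 0, 0);
    requestAnimationFrame(render);
  }
  render();
}
```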