Technology promises to connect us and to serve human communication in ways never before possible. In theory, for example, a long-distance relationship should now be as fulfilling as one conducted face-to-face.
But what happens when the tech dries up and becomes an obstacle, and resentment toward our devices as conduits for our relationships kicks in?
Using the untethered freedom of a conversational user interface (CUI), speech-to-text, and the internet, these ‘coupled’ objects capture the spoken last thoughts and wishes of one partner's ‘Good night’, give them physical form, and deliver them to the other on waking.
Individually, these physical receipts, artifacts or tokens may seem trivial as they are collected, used as bookmarks, lost, pinned to the fridge, thrown away or stored in the wallet. However, as they accumulate, they begin to represent something larger. They are a distillation of the sentiment of humans trying to communicate effectively over distance. They are also a stand against the ethereal digitization of our experiences and relationships, the automation of intimacy and our losing grip – so to speak – on things to touch and hold.
My thesis is called Awaken. It is an experimental art project in which I recreated my personal dreams for others to experience in a Gear VR headset.
The dream sequence runs about 8.5 minutes from opening to ending, with four dreams and various transitions. The user wears headphones and, ideally, sits in a comfortable armchair.
Piano Vision: Dark Side of The Moon is a machine learning and music composition project. Simply put, this is a piano that watches a television screen playing movies and composes an original score. Using MaxMSP, the patch reads the RGB values from the screen and plays music according to the intensity of each color. For example, if there is fire (more red) it plays fast, water (more blue) it plays slower, and trees (more green) it plays somewhere in the middle. Using a machine learning program called Wekinator, this project attempts to teach a computer different musical notes that correspond to color values in order to trigger varied responses. The responses are then recorded, or *learned*, and used to score different films.

The instrument is an upright piano with the keyboard and hammers removed, exposing the internal *guts* of the piano. The playing is achieved with solenoids that hit the strings and servos that pluck them. This project will be the orchestra to all of your movies: haunting and serene, glitchy and humorous. In this iteration the piano watches the trailer for the movie Die Hard, searching for explosions and fire and playing compositions familiar to the action movie genre.

The intention of this piece is to juxtapose the absurdity of the destruction constantly displayed in our consumable entertainment with the incredible technology used to bring these sequences to life. When we pair a half-destroyed piano with a film conveying over-the-top destruction, can we sense the connections between the two? Have we become so desensitized to what we watch that cities falling to the ground no longer bothers us? Does this not bother you? Does a broken piano trying to play music bother you? Both a piano using machine learning and a big-budget Hollywood movie utilize incredible technologies, but in the end, what do they make us feel, if anything?
We want to demonstrate that our ability to use these incredible technologies is a beautiful dance between art, creativity, and imagination.
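The color-to-music mapping described above can be sketched in a few lines. This is a minimal illustration, not the actual MaxMSP patch: the function names and the idea that each frame is reduced to an average RGB color are assumptions for the example.

```javascript
// Hypothetical sketch of the color-to-tempo mapping described above.
// avgColor is assumed to be the mean RGB of the current video frame,
// with each channel in the range 0-255.

// Map the dominant channel to a base tempo in beats per minute:
// red (fire) -> fast, blue (water) -> slow, green (trees) -> in between.
function tempoForColor(avgColor) {
  const { r, g, b } = avgColor;
  const max = Math.max(r, g, b);
  if (max === r) return 160; // fiery scenes play fast
  if (max === b) return 60;  // watery scenes play slowly
  return 110;                // green scenes sit in the middle
}

// Scale the base tempo by the intensity of the dominant channel, so
// brighter, more saturated frames push the playing harder.
function scaledTempo(avgColor) {
  const base = tempoForColor(avgColor);
  const intensity = Math.max(avgColor.r, avgColor.g, avgColor.b) / 255;
  return Math.round(base * (0.5 + 0.5 * intensity));
}
```

In the installation itself, a Wekinator model would stand in for this hand-written rule, learning the note-and-tempo response for each color region rather than hard-coding it.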
Storytelling through a day-to-day object: this is the story of a scanner in an elementary school. When Mora, a new teacher, starts messing around with the machine, she accidentally spills coffee on it, and the scanner's point of view gives a new perspective…
Exploring ways to reflect on our relationships with our phones and ask the question: what can life be? Featured projects include Shakie, the best selfie app for blurry photos; Chance of Rainbows, because what's better than checking the weather? Rainbows!; Post Texts, for sending physical text messages; Crete, the world's first connected brick, for disconnecting while still getting notifications so you can sleep at night; and Zombie Crossing, a simple update to dated civic infrastructure.
INSIDE THE MACHINE is an interactive experimental installation focused on finding new ways for people to use their bodies to interact with physical construction and animation.
People can see their real-time body projection and use their hands to play with motors and lights.
“Good Boy, Sammy” comes to life through voice recognition (the p5.js speech library), projection, archived footage from 2009, and a hologram screen. Sammy responds to commands including sit, down, give me 5, 5 dollars, yodel, and speak. When a user approaches the cage, they can get close, give a command, and experience Sammy reacting in real time. It feels like he is alive and in the space with you.
Step aside, taxidermy: holo-dogging is taking over.
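The command handling behind Sammy can be sketched as a simple lookup from a recognized utterance to an archived clip. This is an assumption-laden illustration: the p5.js speech library delivers each recognized phrase as a string, and the clip filenames here are invented for the example.

```javascript
// Hypothetical sketch of Sammy's command handling. The p5.js speech
// library is assumed to hand us each recognized utterance as a string;
// the clip filenames are invented for illustration.
const commands = {
  'sit': 'sammy_sit.mp4',
  'down': 'sammy_down.mp4',
  'give me 5': 'sammy_give5.mp4',
  '5 dollars': 'sammy_5dollars.mp4',
  'yodel': 'sammy_yodel.mp4',
  'speak': 'sammy_speak.mp4',
};

// Return the archived clip for the first known command found in the
// utterance, or null if Sammy didn't understand.
function clipForUtterance(utterance) {
  const heard = utterance.toLowerCase();
  for (const [command, clip] of Object.entries(commands)) {
    if (heard.includes(command)) return clip;
  }
  return null;
}
```

In the live piece, the recognition callback would call a function like this on each result and play the matching footage on the hologram screen.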
Real-Time Avatar Puppeteering is an investigation of movement and identity through the creation of photorealistic avatars and movement-based control of those avatars using contemporary motion capture technologies. Using photogrammetry, a technique for turning a large quantity of two-dimensional photo data into a three-dimensional model of the photographed subject, we can create believable facsimiles of our bodies with consumer-level technologies. These 3D models can then be rigged with an internal joint structure that allows them to be manipulated by a variety of motion capture technologies, many of which allow for real-time control.
This project is inspired by my ongoing research on New York City as an ecosystem and on one of the city's most prominent residents: the rat. New York City has a growing population not only of humans but also of rats. The way trash is treated in the city and the growing number of people in it increase the rat population and push rats toward human habitats.
Trat is a found-object installation: a classic outdoor trash bin lit by a “street light”. Filling it with trash reveals the truth behind what at first sight seems to be typical New York trash.
Readymades, Temporary Expert: Design + Science in the Anthropocene