Have you ever lived far from home and felt homesick? What if an object could 'physically' put your hometown's real-time weather into a jar on your table, letting you see your hometown's weather at a glance, anytime?
People who live far from home often bring something along as a reminder or representation of their connection to their hometown, and the goal of 'Weather in a Jar' is to make this connection even stronger. A city's real-time weather reflects a very specific moment and location, which can create a unique connection between a person and their hometown regardless of physical distance and time zone.
Right now, the 'weather jar' and the Pepper's ghost effect are working, assembled, and ready to show. Inspired by Chunhan Chen's Pepper's Cone ICM final project, we are hoping to combine our projects and display the real-time weather effect in 3D.
CyberScamp is a project connecting physical input to digital output. Based on the user's interaction with a physical stuffed animal, an animated dog responds in a p5.js sketch. These animations would likely be made in an outside program and then exported into p5.js. The project uses an Arduino, a stuffed animal dog, a force-sensitive resistor, and p5.js; the resistor sits in the back of the stuffed animal.

By assigning values to the levels of pressure exerted on the dog, we can break those pressures into ranges. The first range is very low pressure, to which the animated dog does not respond at all; he is neutral. This animation might show the dog looking eagerly at the user, waiting for some kind of attention. Because it is displayed when no pressure is exerted (no user interaction), this scene encourages someone to come and interact with the project in the first place. The second range, medium to high pressure, is reached by petting or patting the stuffed animal, and the animated dog responds well: he would be very happy, possibly rolling on his back with his tongue out. The third range is very high pressure, in case the user punches or squeezes the stuffed animal too hard. While this isn't an ideal interaction with the project, we feel it's necessary to add an output that addresses it. The animated dog reacts poorly to this treatment, possibly looking sad, upset, and hurt. This upset reaction may last a bit longer than the happy reaction, but eventually fades back and resets to neutral.
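The range logic above could be sketched roughly as follows. This is only an illustration, not the project's actual code: the threshold values, durations, and function names are all assumptions, and a real build would calibrate the thresholds against the Arduino's 0–1023 analog readings from the force-sensitive resistor.

```javascript
// Assumed thresholds for the three pressure ranges (placeholders to calibrate).
const HAPPY_THRESHOLD = 300; // below this: neutral, no interaction
const HURT_THRESHOLD = 850;  // above this: squeezed or punched too hard

// Classify a single FSR reading into one of the three ranges.
function classifyPressure(reading) {
  if (reading >= HURT_THRESHOLD) return "hurt";
  if (reading >= HAPPY_THRESHOLD) return "happy";
  return "neutral";
}

// A minimal state holder so the "hurt" reaction lingers longer than the
// "happy" one before resetting to neutral, as described above.
// Durations are in milliseconds and are illustrative.
function makeDogState(happyMs = 2000, hurtMs = 5000) {
  let state = "neutral";
  let expiresAt = 0;
  return {
    update(reading, now) {
      const mood = classifyPressure(reading);
      if (mood !== "neutral") {
        state = mood;
        expiresAt = now + (mood === "hurt" ? hurtMs : happyMs);
      } else if (now >= expiresAt) {
        state = "neutral";
      }
      return state; // the p5.js draw loop would pick the animation from this
    },
  };
}
```

In a p5.js sketch, `update()` would be called each frame with the latest serial reading and `millis()`, and the returned state would select which animation to draw.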
"Space between us" is an exploration of human interaction at an architectural scale, focused on transforming individual experience into a collaborative connection through light. The project is composed of two screens suspended back to back with a light matrix embedded between them. The lights are activated by data collected from two cameras on opposite sides of the screens that capture movement, which is then processed live and displayed on the respective screens. Users are physically separated by the panels but visually perceive the other side's movement, similar to seeing through a filtered window.
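One common way to turn captured movement into per-cell light activations is frame differencing: compare consecutive grayscale camera frames and light a matrix cell when the average brightness change in that region exceeds a threshold. The sketch below is a hedged illustration of that idea; the grid dimensions, threshold, and function name are assumptions, not the project's actual implementation.

```javascript
// Downsample the per-pixel difference between two grayscale frames
// (flat arrays of brightness values) into a cols x rows grid of
// booleans, one per light-matrix cell.
function motionGrid(prev, curr, width, height, cols, rows, threshold = 30) {
  const cellW = Math.floor(width / cols);
  const cellH = Math.floor(height / rows);
  const grid = [];
  for (let r = 0; r < rows; r++) {
    const row = [];
    for (let c = 0; c < cols; c++) {
      let diff = 0;
      // Sum the absolute brightness change over this cell's pixels.
      for (let y = r * cellH; y < (r + 1) * cellH; y++) {
        for (let x = c * cellW; x < (c + 1) * cellW; x++) {
          const i = y * width + x;
          diff += Math.abs(curr[i] - prev[i]);
        }
      }
      // Light the cell when the average change exceeds the threshold.
      row.push(diff / (cellW * cellH) > threshold);
    }
    grid.push(row);
  }
  return grid;
}
```

Each camera would feed its own grid, driving the lights on the corresponding side of the suspended screens.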
Introduction to Computational Media, Introduction to Physical Computing