Jinxed Blinks

Zoe Wells

A painting that tells a looping story by changing every time the viewer blinks.



This project tells the story of Solveg, a girl from the Roald Dahl book “The Witches”. In the story, a witch curses Solveg to be stuck inside a painting of a farmhouse that her family owns. No one ever sees the painting move, but whenever they look at it, she has changed position slightly, aging and eventually disappearing as the years go by. I was also inspired by “La Jetée”, a film told as a slideshow of still images, and “In the Blink of an Eye”, a book by Walter Murch that compares cuts in editing to natural blinking patterns.

In my project, I use a facial recognition library in Python to watch for the moment the viewer’s eye aspect ratio (EAR) drops below a certain threshold for 2 consecutive frames. To improve blink detection accuracy, I added a calibration feature: by pressing the spacebar, which viewers are prompted to do when they approach the piece, the blink threshold is tuned to their own EAR. Each blink triggers the image to change to the next still in the story, in which the girl ages and shifts position in the painting while her environment ages with her. For the final version, I plan to have 15 images in an oil-painted style for the full loop of the girl growing old and disappearing, since the average person blinks around 15 times a minute.
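The blink-trigger logic described above can be sketched as follows. This is a minimal illustration, assuming EAR values arrive once per camera frame from facial landmarks; the `BlinkDetector` class, the default threshold, and the 0.75 calibration factor are my own illustrative choices, not the project's exact code.

```python
import math

def eye_aspect_ratio(eye):
    """EAR for six (x, y) eye landmarks: (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    d = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return (d(eye[1], eye[5]) + d(eye[2], eye[4])) / (2.0 * d(eye[0], eye[3]))

class BlinkDetector:
    CONSEC_FRAMES = 2              # eye must read closed this many frames

    def __init__(self, threshold=0.21):
        self.threshold = threshold
        self.closed_frames = 0

    def calibrate(self, open_ear):
        # Spacebar calibration: derive the threshold from the viewer's
        # own open-eye EAR (the 0.75 factor is an assumption).
        self.threshold = open_ear * 0.75

    def update(self, ear):
        """Feed one frame's EAR; returns True on the frame a blink completes."""
        if ear < self.threshold:
            self.closed_frames += 1
            return False
        blinked = self.closed_frames >= self.CONSEC_FRAMES
        self.closed_frames = 0
        return blinked
```

Requiring 2 consecutive closed frames filters out single-frame landmark jitter, so only real blinks advance the story.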

My ideal setup for the project would be a monitor with a frame around it to complete the “painting” effect. My laptop, with the camera for facial recognition, would be placed below the monitor. I also plan to have a soundscape of nature noises coming from the painting, with a sound effect when a change happens.


Comm Lab: Hypercinema, Creative Computing

The Space Well

Haozhong Yao, Robert Ye

An experience of floating and diving into the deep space through a well.



The universe is a mysterious well of unknown depth. Traveling through space can feel like falling into an abyss. We want to weave gravity from three different frames into our project: one inside the well, one on the ground where the user stands, and one in outer space. Feeling the Earth's gravity while watching a gravity-less space environment, the user may find a spirit of exploration in the process of falling into the well.

User Experience:
When you stand by the well and look into the water, a silhouette of you, made of stars, appears in the center of the well.
If you raise your arms parallel to the ground, your projection will begin to fall into deep space as the stars zoom in and scatter with acceleration, creating a sense of space travel. And when you reach a certain falling speed, you will see the big bang.
If you swing your right hand very hard, your projection will spin and flash.
If you hold your arms just a little away from your torso, the projection will rotate slightly to create a sense of floating.
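The pose rules above could be classified from Kinect joint positions roughly as follows. This is a sketch under assumptions: the angle thresholds, the swing-speed cutoff, and the screen-space coordinate convention are illustrative, not the project's actual values.

```python
import math

def arm_angle(shoulder, wrist):
    """Degrees the arm hangs below horizontal (0 = parallel to the ground)."""
    dx = abs(wrist[0] - shoulder[0]) or 1e-6
    dy = wrist[1] - shoulder[1]          # screen y grows downward
    return math.degrees(math.atan2(dy, dx))

def classify_pose(left_angle, right_angle, right_hand_speed):
    """Map arm angles (and right-hand speed) to one of the scene states."""
    if right_hand_speed > 2.0:           # a hard right-hand swing
        return "SPIN"
    avg = (abs(left_angle) + abs(right_angle)) / 2
    if avg < 15:
        return "DIVE"                    # arms parallel to the ground
    if avg < 75:
        return "FLOAT"                   # arms slightly away from the torso
    return "IDLE"                        # arms hanging at the sides
```

Each frame, the classified state would drive the star animation: zooming and scattering for DIVE, slow rotation for FLOAT, spinning and flashing for SPIN.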

The project is a 34-inch-wide round inflatable pool, covered by a military waterproof black canvas and filled with 5 gallons of water.
A Kinect camera 8 ft away captures the data of the user's body.
A projector hanging from the ceiling projects the animation of the stars in outer space, along with the animation of the user's figure, into the water from above.


Comm Lab: Hypercinema, Creative Computing


Alison Huang

Music Your Way



My project is a video-game-controller-like object, but for music. The controller is made with foam for a squishy effect, giving the user comfort while remaining durable. There are currently two joysticks on the top that a user controls with their thumbs and four joysticks on the bottom that a user controls with their middle and ring fingers, all connected to either a song, a tune, or a sound effect. Currently, the three joysticks on the right side of the controller are connected to three songs, and the three joysticks on the left side are connected to three tunes/sound effects. The project should be user-friendly and intuitive to figure out. Each joystick is mapped to volume and speed, allowing the user to “remix” the songs. No joystick directly affects another, so all sounds can be played at once at different volumes and speeds. When a joystick returns to a range close to its origin, the sound from that joystick stops or pauses. The joysticks are connected to an Arduino Mega that is connected to p5 through my laptop. This project requires little to no setup, little space, and, most importantly, no mess.
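One joystick's mapping could look like the sketch below. It assumes 10-bit Arduino analog reads (0–1023, roughly 512 at rest); the deadzone radius and the rate/volume output ranges are illustrative assumptions, not the project's exact numbers.

```python
REST = 512         # approximate analog reading with the stick centered
DEADZONE = 40      # readings this close to center stop/pause the sound

def map_range(v, in_lo, in_hi, out_lo, out_hi):
    """Linear remap, like p5's map() or Arduino's map()."""
    return out_lo + (v - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def joystick_to_sound(x, y):
    """One stick's x axis -> playback rate, y axis -> volume.
    Returns None when the stick is back near its origin (stop/pause)."""
    if abs(x - REST) < DEADZONE and abs(y - REST) < DEADZONE:
        return None
    rate = map_range(x, 0, 1023, 0.5, 2.0)     # half to double speed
    volume = map_range(y, 0, 1023, 0.0, 1.0)
    return {"rate": round(rate, 3), "volume": round(volume, 3)}
```

Because each stick maps independently to its own sound, running this per joystick lets all six sounds layer at different volumes and speeds, as described above.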


Creative Computing

Food Have Feelings Too

James Baffour, Sama Srinivas, Yulin Lai

The depressing, inevitable truth about life is that at some point people leave, no matter what you have to say about it. This is our take on a silver lining, so that everybody can find companions in their sadness.



This is an interactive storytelling piece built from anthropomorphic food characters sculpted in clay, photo sensors linked to an Arduino, and the p5.js editor. Users interact with the clay food, and the interaction, triggered by the Arduino and light sensors, activates pre-made animations created in After Effects.

This is for everyone who enjoys food, memes about grumpy old men, annoying teenagers, sad little boys and our visually appealing world and the characters we have created for it. Also, this is for those who can relate to the heart aching pain that comes when someone leaves or is taken out of one’s life, whether surprisingly or expected.

We also wanted to tell funny yet sad stories in a playful way using the skills we’ve learned so far. We wanted to say something about that inevitable truth, but do so in a playful and implicit way that seemingly skims over the true pain that it can cause an individual. We wanted to explore a new interaction with well-known and well-loved foods using the skills we have learned this semester.

We have three animated anthropomorphic food characters that we created to complement the interaction.

Storyline: Our three characters are a young, sad boy, a hormonal and annoying teenage girl, and a grumpy, old man. Using these archetypes of people in society, we are going to make scenarios using animation to create the reactions of these characters as the ones they like and love leave or are taken away.

Roger (Doughnut): A grumpy, old man who's super bitter about everything and is very mad at humans picking up his family and brothers in arms, because it reminds him of his impending doom.

Raechel (Pizza): An annoying teenager who wants to do nothing but talk about her boyfriends and acts like she doesn’t care if you take her boyfriends away. But now, she has to deal with the harsh reality of life and loneliness.

Ronnie (Dumpling): A sad, lonely boy who has encountered too much loss in his life when it comes to his friends leaving. He has become jaded and thinks that inevitably everybody will leave him.

The interaction works like this: the user is prompted by a sign that says something along the lines of “Pick up the food one at a time”. The user can start at any prototype and put on the headphones; it is pretty self-explanatory from there. One by one, the user watches each animation and moves on to the next prototype if they wish. Hopefully, nothing is broken after the user is done. There is a failsafe in the code to keep the interaction from breaking if two foods are picked up at the same time.
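The pickup detection and the two-foods failsafe could be sketched like this. It assumes each photo sensor reads dark (low) while its food sits on top of it and bright (high) once the food is lifted; the threshold and the first-in-sensor-order tiebreak are my assumptions, not the project's exact code.

```python
LIGHT_THRESHOLD = 600   # sensor reading above this means the food was lifted

class FoodStation:
    def __init__(self, names):
        self.names = names
        self.active = None            # which animation is playing, if any

    def update(self, readings):
        """Feed one set of sensor readings; returns the animation to play."""
        lifted = [n for n, v in zip(self.names, readings)
                  if v > LIGHT_THRESHOLD]
        if self.active in lifted:
            return self.active        # keep the current animation running
        # Failsafe: if two foods come up in the same frame, only the
        # first in sensor order wins; the others are ignored.
        self.active = lifted[0] if lifted else None
        return self.active
```

A second food lifted while an animation is already playing changes nothing, so the interaction cannot be scrambled by impatient hands.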

Pick up a food to try it out! Be warned, they are NOT edible!!


Creative Computing

Solar System Tour and Navigator

Apoorva Ramakrishnan, Sarah Peng

An educational tour of our solar system with a twist that puts a planet right in the palms of the user.



Space is big and cool, and traditional point-and-click solar system tours are a short and sweet way to learn about it. But just pointing and clicking at stuff with a cursor is kinda stale, so we wanted to make it possible for a user to really get their hands in there. That's how the controller came to be. It's a 6DOF sensor encased in a sphere, allowing users to spin planets around to look at them every which way. It's educational too, with facts displayed for each planet. Combined with a traditional solar system tour interface made in p5.js, the alternative controller is meant to subvert an expectation while creating a freshly immersive and playful experience for users.
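One simple way the spherical controller could drive the on-screen planet is by integrating the sensor's rotation rates. This is only a sketch under assumptions: I'm assuming the 6DOF sensor reports gyroscope rates in degrees per second, and the axis naming and sample timing are illustrative.

```python
class PlanetRotation:
    """Accumulates gyro samples into the planet's display rotation."""

    def __init__(self):
        self.rx = self.ry = self.rz = 0.0   # rotation in degrees per axis

    def update(self, gx, gy, gz, dt):
        """Integrate one gyro sample (deg/s) over dt seconds; wraps at 360."""
        self.rx = (self.rx + gx * dt) % 360
        self.ry = (self.ry + gy * dt) % 360
        self.rz = (self.rz + gz * dt) % 360
        return self.rx, self.ry, self.rz
```

Each frame, the p5.js side would apply the three accumulated angles to the planet model, so spinning the sphere in your hands spins the planet the same way.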


Creative Computing


Dingwen Kou, Ruyi Chen

Our oblivious sonic footprint can cause acoustic trauma or even death to underwater life.



“We now know that the underwater world is anything but silent. In fact, today's researchers are concerned that underwater noise produced by humans is distracting, confusing – and even killing – aquatic animals.”

Based on prior and current research, noise pollution caused by human activities in the ocean (underwater explosions, ship and boat sound, construction, etc.) can and does severely disrupt whales' behaviors, such as their communication and their ability to find food or mates and avoid predators, threatening their lives or even directly causing death. We therefore decided to use the microphone as the primary input and interaction in this project, to simulate how underwater acoustic trauma is produced by humans and how heavily it affects marine species, in this case whales.

When the sound made by the user, as picked up by the microphone, hits a certain level:

The background color changes from black to a bloody red, and the choppiness of the ocean is driven by the volume of the noise. The whales are heavily disturbed and lose track of their peaceful, orderly swimming orbits, which are likewise affected by the volume of the noise.
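The volume mapping could be sketched as below. It assumes the mic level arrives normalized to 0.0–1.0 (as p5.sound's `getLevel()` would provide); the trigger threshold, the specific red, and the linear blend are illustrative choices, not the project's exact values.

```python
NOISE_THRESHOLD = 0.4    # above this level, the scene turns "traumatic"

def lerp(a, b, t):
    """Linear interpolation between a and b by t in [0, 1]."""
    return a + (b - a) * t

def scene_state(level):
    """Map one mic reading to background color and ocean choppiness."""
    # How far past the threshold we are, clamped to [0, 1].
    t = min(max((level - NOISE_THRESHOLD) / (1.0 - NOISE_THRESHOLD), 0.0), 1.0)
    black, red = (0, 0, 0), (139, 0, 0)
    color = tuple(round(lerp(b, r, t)) for b, r in zip(black, red))
    return {"color": color, "choppiness": t, "whales_disturbed": t > 0.0}
```

The same `t` value would also scale how far the whales stray from their swimming orbits, so color, water, and whales all respond to one reading.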

When the environment calms down again, fortunately yet ironically, everything returns to the serene state the piece starts with; in reality, the trauma caused would most often take decades to ease and heal, and the tragedies of death are irreversible.

Primarily, this project acts as an educational warning piece for every single one of us, even those who think they are not doing anything harmful.

These ocean threats, however, are also human threats.

As Cousteau said, “For most of history, man has had to fight nature to survive; in this century he is beginning to realize that, in order to survive, he must protect it.”


Creative Computing


Helen Hutchens, Xiaoshi Liang

Sensorship explores what happens when censorship is no longer an omission but a transmutation, by challenging two people, whether near-strangers or old friends, to hop in and figure out how to convey a message when your earpiece is against you.



Our relationships with those around us are increasingly carried out on screen. As the magnitude of digital communication grows in inverse proportion to face-to-face interaction, it raises the question: what happens when what we mean to say is not faithfully transmitted to the person we're communicating with?

The censorship of thought, language, and expression has, of course, been carried out in past ages, our current times, and likely future eras. Yet most of this has come across via blockage—a hand obstructing a printing press, a story cut from a manuscript, a forced deletion on social media…

What happens, though, when your message comes through? We trust that our machines will relay our messages in good faith. But what happens when your message is altered, warped, and effectively pre-packaged for its recipient? What can you do when the medium of communication itself believes in 'fake news'?

Users assume different roles: one is a scientist on the UN's Intergovernmental Panel on Climate Change (IPCC), the other an emissary of the research to the world. They have to try to communicate, somewhat ironically by speaking into two analog phones, through a system that twists the terms that matter most (e.g. “global warming”), replacing them with other terms and thus other meanings. Using a speech-to-text-to-speech process and the RiTa language library, both parties can hold a conversation, albeit a tampered-with one. The emissary's task is to recover the correct message from the researcher. They'll have to beat the system in creative ways.
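The core substitution step can be sketched like this. The replacement table is entirely illustrative; the project itself uses the RiTa language library for richer transformations, and this only shows the twist-the-key-terms idea on plain recognized text.

```python
import re

# Hypothetical censor table: each loaded term gets a bland euphemism.
CENSOR_MAP = {
    "global warming": "gentle seasonal variety",
    "climate change": "weather diversity",
    "emissions": "contributions",
}

def censor(message):
    """Swap out the terms that matter most before text-to-speech replays them."""
    for term, euphemism in CENSOR_MAP.items():
        message = re.sub(re.escape(term), euphemism, message,
                         flags=re.IGNORECASE)
    return message
```

Because the substitution is deterministic, a clever emissary can learn the system's tells, which is exactly the creative loophole the piece invites users to find.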


Creative Computing

Plant Boye

Faith Zeng

Experience the satisfaction of growing your own plant friend without dealing with the real responsibilities and risks!



Using sensors on the Arduino, users will be able to pick, name, and grow a plant on the computer. Users interact with each sensor, going through the motions of planting, such as patting down soil or watering the plant. Each physical change causes the virtual plant to grow a bit more. At the end, the user can take a “picture” (a screenshot) of their plant.
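The growth loop could be sketched as a small state machine. The stage names and the set of recognized actions are my own illustrative assumptions; in the project, each action would arrive as an Arduino sensor event.

```python
STAGES = ["seed", "sprout", "seedling", "budding", "bloomed"]
ACTIONS = {"pat_soil", "water", "sunlight"}   # hypothetical sensor events

class PlantBoye:
    def __init__(self, name):
        self.name = name        # the name the user picks for their plant
        self.stage = 0

    def interact(self, action):
        """Each recognized physical interaction grows the plant one stage."""
        if action in ACTIONS and self.stage < len(STAGES) - 1:
            self.stage += 1
        return STAGES[self.stage]
```

Unrecognized input leaves the plant unchanged, and growth caps at the final stage, so there is no way to harm the plant: the satisfaction without the risk.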


Creative Computing


Heather Kim, Katie Krobock

CyberScamp creates a unique, playful experience where interaction with a pup reaches into both the physical and digital worlds.



CyberScamp is a project connecting physical input to a digital output. Based on the user's interaction with a physical stuffed animal, an animated p5.js output shows how the animal responds. These animations would likely be made in an outside program, then exported into p5.js. The project uses an Arduino, a stuffed animal dog, a force sensitive resistor, and p5.js. The resistor sits in the back of the stuffed animal. By assigning values to the levels of pressure exerted on the dog, we can break those pressures into ranges.

One range is very low pressure, to which the animated dog does not respond at all; he is neutral. This animation might be the dog looking eagerly at the user, waiting for some kind of attention; because it is displayed when no pressure is being exerted (no user interaction), this scene would encourage someone to come and interact with the project in the first place.

Another range is medium to high pressure, achieved by petting or patting the stuffed animal, which the animated dog responds well to. The animated dog would be very happy, possibly rolling on his back with his tongue out.

A third range is very high pressure, in case the user punches or squeezes the stuffed animal too hard. While this isn't an ideal interaction with the project, we feel it's necessary to add an output that addresses it. The animated dog would react poorly to this treatment, possibly looking sad, upset, and hurt. This upset reaction may last a bit longer than the happy reaction, but would eventually fade and reset to neutral.
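The pressure-range mapping above can be sketched as follows. It assumes 10-bit Arduino readings from the force sensitive resistor; the cutoff values and the frame counts for how long each reaction lingers are illustrative assumptions, not the project's tuned numbers.

```python
NEUTRAL_MAX = 100                 # below this: no interaction detected
HURT_MIN = 700                    # at or above this: punched/squeezed too hard
HAPPY_HOLD, HURT_HOLD = 30, 90    # frames before fading back to neutral

class CyberScamp:
    def __init__(self):
        self.state, self.timer = "neutral", 0

    def update(self, pressure):
        """Feed one FSR reading per frame; returns the animation to show."""
        if pressure >= HURT_MIN:
            self.state, self.timer = "hurt", HURT_HOLD
        elif pressure >= NEUTRAL_MAX:
            self.state, self.timer = "happy", HAPPY_HOLD
        elif self.timer > 0:
            self.timer -= 1       # the reaction lingers, hurt longest
            if self.timer == 0:
                self.state = "neutral"
        return self.state
```

The larger hold count for "hurt" implements the longer-lasting upset reaction, which still decays back to the neutral, attention-seeking animation once the dog is left alone.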


Creative Computing