Ceiling Drums

An installation and performance involving percussion objects hung from a ceiling, rhythmically resonated by solenoid motors.

Eamon Goodman

https://vimeo.com/489257121

Description

zoom link:

https://nyu.zoom.us/j/91942675074?pwd=Sm13YzQyVm9abEFUOXlFdUsrSk8rUT09

GO TO
https://midi-sender.herokuapp.com/
OR CLICK PROJECT WEBSITE TO PLAY THE DRUMS YOURSELF!

I have collected, arranged, and hung seven percussive and sonic objects in an array around the listener's ear, be it a human or electronic eardrum. Attached to each object is a solenoid motor that strikes it, and I control this striking both live with buttons and by creating rhythmic MIDI clips in Ableton Live. I'll then explore the vocabulary of sounds possible with my room-sized instrument, incorporating it into musical performance: on its own, played and manipulated by multiple people, and with other sound sources, for instance a pitch-detecting harmonizer I created, or an acoustic instrument like the bass clarinet. If I have time and luck, I'll make it possible for spectators to trigger the sculpture over the web.
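The routing from MIDI clips to the seven solenoids could look something like the sketch below. This is a minimal illustration, not the project's actual code: the base note (60, middle C), the one-note-per-object layout, and the function names are all assumptions.

```javascript
// Hypothetical sketch: route incoming MIDI note-on messages to one of
// seven solenoid channels. The base note and one-note-per-object
// mapping are assumptions for illustration, not the artist's setup.
const NUM_SOLENOIDS = 7;
const BASE_NOTE = 60; // MIDI note assigned to the first hung object

// Return the solenoid index (0-6) for a note, or -1 if out of range.
function noteToSolenoid(note) {
  const index = note - BASE_NOTE;
  return index >= 0 && index < NUM_SOLENOIDS ? index : -1;
}

// A note-on with velocity 0 is conventionally treated as note-off,
// so only positive velocities fire a strike.
function handleMidiMessage([status, note, velocity], fire) {
  const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
  const solenoid = noteToSolenoid(note);
  if (isNoteOn && solenoid !== -1) fire(solenoid, velocity);
}
```

The same handler works whether the messages come from Ableton Live, a button box, or a web page relaying triggers, which is what makes remote playing over the web feasible.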

ITPG-GT.2301.00006, ITPG-GT.2994.001, ITPG-GT.2048.00005
Intro to Phys. Comp., The Neural Aesthetic (Online), ICM – Media
Sound,Performance

The Trace.

The Trace is a project that combines ICM and Physical Computing; with it I try to capture proof of existence.

En-Tung Liu

https://www.youtube.com/watch?v=064WTEDT5nI&feature=youtu.be

Description

This project comes from two questions: how can people exist, and how can existence be proven? I combine the idea of long-exposure photography with a p5 sketch and set the installation in a dark space. When an audience member triggers the sketch, it starts to capture their movement and draws a light trace on the dark canvas; when they leave, the trace disappears.
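The fading-trace behavior can be sketched in a few lines. This is an illustration of the idea, assuming a per-frame brightness decay; the names and constants are not from the project's code.

```javascript
// Illustrative sketch of the fading light trace: each captured position
// becomes a point whose brightness decays every frame, so the trace
// vanishes once the audience stops adding new points.
const DECAY = 0.9;        // brightness multiplier per frame (assumed)
const MIN_BRIGHTNESS = 1; // below this, the point is dropped

function addPoint(trail, x, y) {
  trail.push({ x, y, brightness: 255 });
}

// Fade every point and drop those that are no longer visible.
function fadeTrail(trail) {
  return trail
    .map(p => ({ ...p, brightness: p.brightness * DECAY }))
    .filter(p => p.brightness >= MIN_BRIGHTNESS);
}
```

With a decay of 0.9, a point disappears after roughly 50 frames, so the trace lingers for about a second after the audience leaves, much like a long-exposure afterimage.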

ITPG-GT.2301.00002, ITPG-GT.2048.00005
Intro to Phys. Comp., ICM – Media
Performance,Art

SEEN

I see and I exist, no fears to be naked, no absolutes in black and white.

Zhuolin Wu

https://youtu.be/IeNJmdwtiRI

Description

This is my semester-long project from my Fall 2020 at ITP. It involves a collection of eye symbols and graphics expressing the theme: looking for a witness of life. It was inspired by Katy Perry’s 5th studio album, an underestimated pop record that kept me alive, got me thinking in depth, and stimulated my soul and energy.

I deconstructed and restructured the storyline and meaning of each track on this album, collected the lyrics that resonated with me, and visualized them in different forms such as Unity animation, video and audio, projection mapping, installation, and graphic art. It was not easy to show that sincerity and intimacy, let alone to transfer those memories, emotions, and feelings into a programming visual language, physical interactions, and fabrication.

Those works speak for me, and through them I want to open more potential connections and communications in the future: not only connections to others who can be by your side and ride the journey with you, but also to the self.

Aside from the visual composition, I bring more personality, narrative, and intimacy into the piece by etching my own monologue into it. Visit the project website for more details about SEEN.

ITPG-GT.2301.00008, ITPG-GT.2005.00003, ITPG-GT.2233.00003, ITPG-GT.2002.00005
Intro to Phys. Comp., CL – Visual Language (Online), ICM, CL – Animation
Art,Narrative/Storytelling

WordEater

WordEater is a mini game where you can use your webcam to gobble up words in order to generate a sentence.

Jeeyoon Hyun

https://youtu.be/CZt9pIUUvpk

Description

https://jeeyoonhyun.github.io/WordEater/

Ever felt confused by all the words floating around the Internet?

WordEater is a browser-based game that lets you gobble up a bunch of meaningless words in order to make another meaningless sentence, eventually removing every word you see on the screen.

It doesn't matter if you don't understand what the words or sentences are trying to say – after all, they are going to be swallowed and eaten anyway. All you need to do is get some peace of mind by consuming all the disturbing, shattered pieces of information that make complete nonsense. The goal of the game is to make your web browser cleaner by scavenging fragmented data with your mouth. After all, your web browser also needs some refreshment from the gibberish it encounters every day!

WordEater uses the Facemesh API in ml5.js to detect your mouth in your webcam feed. You can play the mouse version if you can't use your webcam – for example, if you are wearing a mask.
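The mouth-detection step could be implemented along these lines. This is a sketch under stated assumptions: ml5's Facemesh predictions expose an array of [x, y, z] keypoints per face, and the lip indices used here (13 for the upper inner lip, 14 for the lower, following the MediaPipe face mesh) and the open-mouth threshold are illustrative choices, not the game's actual code.

```javascript
// Illustrative mouth-open check on Facemesh-style keypoints.
// Indices 13/14 (inner lips) and the 0.05 ratio are assumptions.
const UPPER_LIP = 13;
const LOWER_LIP = 14;

// The mouth counts as "open" when the lip gap exceeds a fraction of
// the face height, so the test is roughly scale-invariant: it works
// whether the player sits near the webcam or far from it.
function isMouthOpen(keypoints, openRatio = 0.05) {
  const gap = Math.abs(keypoints[LOWER_LIP][1] - keypoints[UPPER_LIP][1]);
  const ys = keypoints.map(p => p[1]);
  const faceHeight = Math.max(...ys) - Math.min(...ys);
  return gap / faceHeight > openRatio;
}
```

A word would then be "eaten" when the mouth is open and its position overlaps the mouth's on-screen location.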

ITPG-GT.2233.00005, ITPG-GT.2465.001
ICM, Machine Learning for the Web (Online)
Machine Learning,Play/Games

Microhabitat

A microhabitat is a virtual open-door to a tiny habitat that can be seen only through a microscope.

Jung Huh

https://vimeo.com/488923446

Description

A microhabitat is a virtual open-door to a tiny habitat that can be seen only through a microscope. It welcomes audiences into the tiny room of a person living in NYC. The tour is not as big and fancy as the ones in many of YouTube's open-door videos. Comparing the size of the room to those in other open-door videos, it may feel like looking at microscope slides whose details are only visible through a microscope.

The work brings up the housing problem that people in their 20s and 30s are facing. Finding a habitat has become more difficult. The closer you get to a central city such as NYC or Seoul, the more expensive rent gets, the tougher it is to find a place, and the smaller the room becomes. However, no matter how small the room may be, there lives a person with their own unique story and a big dream.

The audience peeks into the small room of a person through a microscope that has two controllers. Using the knob on the right side, the stage controller, the audience can look around the room as it rotates the camera situated at the center of the microhabitat. Using the knob on the left side, the coarse adjustment, the user can look into the details of specific objects located in the room. Each object contains a personal story of the person living there, as if you were reading the product details for the objects in someone's open-door video.
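The two-knob scheme could be wired up roughly as below. This is a minimal sketch, assuming Arduino-style analog readings (0-1023) and an orbiting camera; the value ranges and function names are illustrative, not the project's code.

```javascript
// Illustrative two-knob control: the stage knob maps to a full rotation
// around the room's center, the coarse-adjustment knob to a zoom level.
// The 0-1023 input range (a typical 10-bit analog read) is assumed.
function knobToRotation(raw) {
  return (raw / 1023) * 2 * Math.PI; // radians around the room's center
}

function knobToZoom(raw, minZoom = 1, maxZoom = 5) {
  return minZoom + (raw / 1023) * (maxZoom - minZoom);
}

// Position of the orbiting camera at a given rotation and distance,
// mirroring how a microscope stage turns under a fixed eyepiece.
function cameraPosition(angle, radius) {
  return { x: radius * Math.cos(angle), y: radius * Math.sin(angle) };
}
```

Mapping the physical stage knob to rotation and the coarse-adjustment knob to zoom keeps the interaction consistent with how a real microscope is operated.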

Virtual Experience
https://editor.p5js.org/jhuh3226/present/ErsWzJ5mn

ITPG-GT.2301.00008, ITPG-GT.2048.00003
Intro to Phys. Comp., ICM – Media
Narrative/Storytelling,Social Good/Activism

I want to see the world

An interactive experience that tells how we see the world via biological processes.

Tiange Hou

https://vimeo.com/488284023

Description

We perceive the world through different sensory systems. But do you know what happens inside our bodies that leads us to perceive the world? When light enters our eyes, how do we get vision from it?

“I want to see the world” is an interactive experience during which users help a person (with eyes, a hand, and a brain) to see. This project is an overview of how we see the world via biological processes. When light enters the eyes in darkness, a critical cell-signaling pathway called phototransduction starts in cells in our eyes. The sodium channel closes so that Na+ cannot enter the cell. Then everything else that helps us see can occur. Information is sent from the eyes to the brain via nerves, and more visual information processing happens in the brain, indicated by LED lights and drawings. The computational drawings simulate a biological technique, fluorescence imaging.

Hands are responsible for all the interactions. Users make hand gestures toward the “hand,” implying that when we cannot see, other sensory systems, such as touch, help us. First, users wave to the “hand” to let the person know that they are around. Second, users move their hands up and down in front of the “hand” to let the person know that they wish to help. Third, users follow the instructions to shine a light on the “eyes.” And last, users wave their hand again to let the eyes see the final image.
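The four-step interaction above can be sketched as a tiny state machine: each expected gesture advances the experience to the next stage, and unexpected gestures do nothing. The stage and event names below are illustrative, not from the project's code.

```javascript
// Illustrative state machine for the four-gesture interaction sequence.
// Stage names are assumptions; each stage waits for its own gesture.
const STAGES = ['wave-hello', 'offer-help', 'shine-light', 'wave-final', 'seeing'];

function createInteraction() {
  return { stage: 0 };
}

// Advance only when the event matches the gesture the current stage
// expects; return the name of the stage now waiting for input.
function handleGesture(state, event) {
  if (state.stage < STAGES.length - 1 && event === STAGES[state.stage]) {
    state.stage += 1;
  }
  return STAGES[state.stage];
}
```

Gating each stage on its own gesture keeps the narrative order intact: the light cannot be shone on the “eyes” before the person has been greeted and offered help.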

ITPG-GT.2233.00003, ITPG-GT.2301.00007
ICM, Intro to Phys. Comp.
Education,Art

Deep Dream

Let Deep Dream bring you to the deep ocean; you sink into it slowly, as if the water wants to bring you somewhere.

Jingyuan Li

https://youtu.be/l8WtFeFsc1I

Description

Deep Dream is a creative coding project based on the ICM sound assignment I made with Sihan.
The code is divided into two main parts: sound and visuals. The background sound is created with p5.Oscillator, and the bubble sounds are controlled by the overlapping bouncing balls. As for the visual part, the wave reacts to the oscillator's frequency. The school of fish and the bubbles are controlled by a Teachable Machine model, which can identify bubble sounds.
NOTICE! It works better if you change your audio input from the microphone to the computer's audio speaker. (Or you could make bubble sounds with your mouth and see if it works.)
Code link: https://editor.p5js.org/Lea-0821/sketches/n_EVbM6Ty
Elements: p5.js, Sound, Music Visualization, Teachable Machine
Reference:
https://www.openprocessing.org/sketch/891024
https://www.openprocessing.org/sketch/802077
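The frequency-reactive wave can be sketched as a pure function: sample a sine whose on-screen wavelength shrinks as the oscillator's frequency rises. This is an illustration of the idea only; the constants and function name are assumptions, not the sketch's actual code.

```javascript
// Illustrative frequency-reactive wave: higher oscillator frequency
// produces more cycles across the canvas width. Amplitude and sample
// count are assumed defaults.
function wavePoints(freq, width, amplitude = 50, samples = 100) {
  const points = [];
  for (let i = 0; i <= samples; i++) {
    const x = (i / samples) * width;
    // freq cycles fit across the canvas width.
    const y = amplitude * Math.sin((2 * Math.PI * freq * x) / width);
    points.push({ x, y });
  }
  return points;
}
```

In a p5 draw loop, the returned points would be joined into a vertex curve each frame, so the wave tightens and relaxes as the p5.Oscillator's frequency changes.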

ITPG-GT.2233.00003
ICM
Sound,Machine Learning