No Present, No Past, No Future

Meditative Sound Art

Russell Ladson


No Present, No Past, No Future is an exploration of our perception of time. Using the Tibetan singing bowl, No Present, No Past, No Future attempts to visualize how the bowl observes and records time. It recognizes that our perception of time is a close examination of our existence – realizing that we, like time, are evolving or devolving towards nothingness.


Ceiling Drums

An installation and performance involving percussion objects hung from a ceiling, rhythmically resonated by solenoid motors.

Eamon Goodman




I have collected, arranged, and hung 7 percussive and sonic objects in an array around the listener's ear, be it a human or electronic eardrum. Attached to each object is a solenoid motor that strikes it, and I control this striking both live, with buttons, and by creating rhythmic MIDI clips in Ableton Live. I'll then explore the vocabulary of sounds possible with my room-sized instrument, incorporating it into musical performance: on its own, played and manipulated by multiple people, or with other sound sources, for instance a pitch-detecting harmonizer I created, or an acoustic instrument like the bass clarinet. If I have time and luck, I'll make it possible for spectators to trigger the sculpture over the web.
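The MIDI-to-solenoid routing described above can be sketched as a simple lookup from note number to hung object. This is a hypothetical Python sketch; the note numbers, object names, and pulse timing are assumptions, not the piece's actual configuration.

```python
# Hypothetical mapping: one MIDI note per hung object.
# Names and note numbers are illustrative only.
SOLENOID_MAP = {
    60: "woodblock",
    62: "cymbal",
    64: "pot_lid",
    65: "frame_drum",
    67: "bell",
    69: "chime",
    71: "gong",
}

def handle_note_on(note, velocity):
    """Return which object to strike and how long to energize the coil (ms)."""
    target = SOLENOID_MAP.get(note)
    if target is None:
        return None                   # note not assigned to any object
    pulse_ms = 5 + velocity // 16     # assumption: harder hits hold the coil longer
    return (target, pulse_ms)
```

In hardware, the returned pulse length would drive a transistor energizing the solenoid; here it only reports which object would be struck.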

ITPG-GT.2301.00006, ITPG-GT.2994.001, ITPG-GT.2048.00005
Intro to Phys. Comp., The Neural Aesthetic (Online), ICM – Media


cyber$exb0t is an interactive cybersex voice-chat experience with your computer. Users can choose from a selection of voices, ranging in pitch, to have cybersex with.

Tina Rungsawang, Zeyao Li


What would it be like if we had the chance to have intimate experiences with different kinds of people and different kinds of voices? How would it be different to get intimate with a low-pitched voice entity versus a high-pitched one?

cyber$exb0t is an experience that allows users to playfully explore intimacy with different voices from the computer. They have the option of choosing a voice to interact with, then go through a series of on-screen interactions to satisfy the computer in a cybersex chat. We hope that, through this experience, people will become more open to having intimate interactions with different voices.

The mood of this experience is sexy, nostalgic, and fun! The aesthetics are heavily inspired by early-2000s cybersex chatrooms on AIM.

Hello, Computer: Unconventional Uses of Voice Technology

Tidal Observation Instrument

The Tidal Observation Instrument brings tides from around the globe into the audible range and allows their tones to mix and mingle.

Daniel Fries


The Tidal Observation Instrument starts with four tides. It uses historical data from the Intergovernmental Oceanographic Commission to log those tides as wavetables, and allows an instrumentalist to control the speed of the signal, from the native tidal frequency of 0.0000016 Hz up to a more typical low-frequency oscillator range of 0.01 Hz. This LFO is allowed to modulate a sample of the waveform playing at 440 Hz (concert A). As the four tides play, it's possible to hear the last several days of tides across the globe resonate with each other, leading towards a strange new metaphor for the waves.
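The frequency scaling at the heart of the instrument is a simple ratio: play the logged tide fast enough to move it from its native frequency into LFO range. A rough sketch in Python, using the figures quoted above; the playback-rate math and the sinusoidal stand-in tide are assumptions about the implementation, not the instrument's actual code.

```python
import math

TIDAL_HZ = 0.0000016   # native tidal frequency quoted above
LFO_HZ_MAX = 0.01      # upper end of the instrument's LFO range
CARRIER_HZ = 440.0     # concert A

def speedup(target_hz, native_hz=TIDAL_HZ):
    """How many times faster the logged tide must play to reach target_hz."""
    return target_hz / native_hz

def modulated_amplitude(t, lfo_hz, depth=0.5):
    """Amplitude of the 440 Hz carrier at time t, for a sinusoidal stand-in tide."""
    return 1.0 + depth * math.sin(2 * math.pi * lfo_hz * t)
```

At the top of the range, the tide plays back 6,250 times faster than it actually occurred.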



A poetic interactive painting of rain that reflects the viewers through splendid colors

Bei Hu, Junoh Yu


“Soundrop” is a poetic interactive painting of rain that reflects the viewer through splendid colors. In this piece, we focused on ‘rain’ and ‘meditation’, and through conversation we decided to write a poem about rain as part of the making process. We worked to create a relaxing, meditative visual-sound experience for the viewer.

Using PoseNet and a webcam, we capture the viewer’s position and render the viewer’s silhouette with rain. When the viewer stands in front of the p5.js painting, with rain falling from the top, the white raindrops turn into rainbow colors as soon as they pass through the viewer’s body outline. The changing colors update with the viewer’s body movement, and the speed of the rain and the volume of the rain sound change according to the position of the viewer’s body.

We want people to experience rain in a new way: standing in it while listening to our self-written poem. (Co-created by Junoh Yu and Bei Hu)
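The coloring rule can be sketched as a per-drop check against a body mask derived from the pose estimate. This is a hypothetical Python sketch; the actual piece is written in p5.js, and the mask representation and palette here are assumptions.

```python
import colorsys

def drop_color(x, y, body_mask, width):
    """White outside the body outline, a rainbow hue inside it.

    body_mask is a 2D grid of booleans (True = inside the viewer's silhouette),
    an assumed stand-in for the PoseNet-derived outline.
    """
    if not body_mask[y][x]:
        return (255, 255, 255)        # white raindrop outside the body
    hue = x / width                   # hue sweeps across the canvas
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (int(r * 255), int(g * 255), int(b * 255))
```

Each falling drop would call this per frame, so the rainbow region tracks the viewer as they move.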


Musical Garden

Make music by tending to a virtual garden.

Elias Jarzombek


This web instrument allows you to make music by planting and watering different kinds of “audio seeds” that grow into lush melodies and textures.

Watering the seeds causes them to grow both visually and sonically, and distinct areas in the garden cause the plants to behave in different ways.

Composing with this interface is more spatial than linear. Plants emanate sound that you navigate through using the mouse, so moving through the space influences the mix of sounds.

The implementation represents different types of sound using basic geometric forms and generates growth patterns algorithmically using L-systems, a grammar-based way of modeling generational growth. These patterns are at times also used to produce melodies.
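An L-system is a parallel string-rewriting grammar: at each generation, every symbol is replaced according to its rule, and the resulting strings can drive both the drawn branches and the melodies. A minimal sketch in Python; the axiom and rules are illustrative, not the garden's actual grammar.

```python
def expand(axiom, rules, generations):
    """Rewrite every symbol in parallel, once per generation."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's classic algae system: A -> AB, B -> A
expand("A", {"A": "AB", "B": "A"}, 3)   # -> "ABAAB"
```

Mapping each symbol to a scale degree turns such a string into a melody that grows along with the plant.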

The musical garden invites exploration, and can be found at

ITPG-GT.2653.00001, ITPG-GT.2048.00004
The Code of Music, ICM – Media


Nonclub is a room in passing, modeled after spaces we've lost access to and offered as an interruption, a break from digital overload.

Christina Lan


Nonclub developed over the course of exploring re-enactment, simulation, and memory in relation to underground spaces and sound. Thinking about the mental-health effects of losing certain spaces, this piece hopes to create an analogous experience that offers new ways of perceiving embodiment and forming connections.

Critical Experiences

Animal Talk – Chatnimal

Animal Talk is an interactive web-based tool that takes text messages as input and converts them into animal sounds, forming an encrypted audio message.

Christina Lan, Dorian Janezic


Animal Talk is a web-based application created for the Connections Lab class in the Fall 2020 term by Christina Lan and Dorian Janezic. Our collaboration started with a midterm project, for which we had to create an interactive website for multiple users.

We are both very interested in sound, so we decided to explore the p5 sound library and created a pitch-matching game in which users try to click as close as possible to the frequency emitted by an oscillator. We were happy with the final output and the feedback, so we decided to continue our collaboration for the final project.

Our ideas shifted towards a performative interactive piece in which multiple users communicate with animal sounds, creating their own encrypted language by mapping letters, words, or phrases to specific sounds. Right now, users can select the animal sound that will carry their message to other users; when you receive a message, you hear it in a converted, transformed, or encrypted language. Each sound is paired with an animation that plays while the message is heard. Users can also preview the animal sounds they have selected by clicking on the canvas.
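The encoding step can be sketched as a lookup from letters to sound clips. This is a hypothetical Python sketch; the clip names and the modulo mapping are assumptions, not the project's actual scheme.

```python
# Illustrative clip names; the real project maps to the sender's chosen animal.
ANIMAL_SOUNDS = ["bark.mp3", "meow.mp3", "moo.mp3", "quack.mp3", "ribbit.mp3"]

def encode(message):
    """Map each letter of the message to one of the animal sound clips."""
    clips = []
    for ch in message.lower():
        if ch.isalpha():
            index = (ord(ch) - ord("a")) % len(ANIMAL_SOUNDS)
            clips.append(ANIMAL_SOUNDS[index])
    return clips
```

The receiver hears the clip sequence in order; decoding it means guessing, or learning, the mapping.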

We want to develop the project further and add new features that would create new interactions between users. One of the next steps is to improve the UX and add an option for users to click on a text message, listen to it, and try to decode it, or at least guess which animal the sender chose.

Special thanks to Christina for the collaboration, to Craig and Mathura for making Connections Lab an unforgettable experience, and to all the residents and the IMA Low Res team.

Connections Lab

ITP Weather Band: Introduction

ITP Weather Band is an experimental band creating music, interactive objects, and visuals with weather data collected from a DIY weather station. Come meet the band and learn about our weather system.

Atchareeya Name Jattuporn, Lu Lyu, Cy Kim, Schuyler DeVos, Siyuan Zan, Sihan Zhang, Yiting Liu, Yeseul Song


ITP Weather Band is an experimental band creating music, interactive objects, and visuals with weather data collected from a DIY weather station. We built a DIY weather station system and created experimental instruments that turn environmental data into music and visuals. We explore new ways of delivering information and stories about our immediate environment through the auditory and visual senses. We're growing into an open-source project.

The band consists of faculty, alumni, and graduate students at New York University's Interactive Telecommunications Program (NYU ITP) and beyond. We started the weather station project in 2019 as a sub-group of ITPower and launched the band in January 2020.

See here for the details and full credits.

We had a special collaborator, Yichan Wang, pursuing her MFA in Design and Technology at Parsons School of Design, for the weather band website.

At the Winter Show 2020, we're introducing:

* ITP Weather Station System by ITP Weather Band
A DIY weather station system that posts data to the web to be accessed from anywhere in the world for creative projects.

* Weather Data Visualization Interfaces by Yiting Liu, Atchareeya Name Jattuporn, and Cy Kim
On this website, you get to see the visualization of all the data or select a specific part of data based on dates, id numbers, and categories.

* Weather Band Website by Yichan Wang (New School, visual system design) and Schuyler W DeVos (development).
A visual system and website with a graphical weather mobile that sensitively reacts to the real-time weather data.

* Weather Instruments made by the band members, including

Meteor Shower by Sihan Zhang
: A p5 sketch that generates a meteor shower scene and relaxing music using wind and rain data

Gust Sound by Lu Lyu (shown at a separate spot on Yorb)

Leaf Dance by Siyuan Zan (shown at a separate spot on Yorb)

ITPG-GT.2808.001, ITPG-GT.2301.00004, ITPG-GT.99999.001
Understanding Networks, Intro to Phys. Comp., ITP Weather Band


Find inner peace and present it physically through sound and light

Jezzy Zheng Lu


Equilibrium is a light and sound sculpture that uses glass beads as its main medium, changing light and sound through human breathing. Equilibrium is an experience and a self-introspective journey. It invites the audience to take some time to get back in touch with themselves.

The interaction is as follows: first, an FSR under the rug senses when a person sits down and keeps still; then a time-of-flight sensor begins to detect the person's breathing frequency from the rise and fall of their chest. The sensor picks up different distances, within a given threshold, between inhalation and exhalation, and this frequency is mapped to the brightness of the LEDs and the frequency of the vibration motor. Different amplitude values cause the glass beads to change position, thereby changing the light path and creating sound through friction.

Simultaneously, a servo performs a regular up-and-down movement, letting the glass beads fall from a high place to a low place; repeated continuously, this produces a sound similar to wind or rain in nature. The white-noise quality of the sound helps audiences gradually find inner peace.

On one hand, Equilibrium attempts to make visible the subtle changes of human consciousness, giving them physical properties, from intangible to tangible. On the other hand, the attributes of light and sound remain relatively abstract compared to concrete objects.
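The sensor-to-output mapping described above can be sketched as a linear remap, in the spirit of Arduino's map(). This is a Python sketch; the distance range, brightness range, and motor range are assumptions, not the installation's calibrated values.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linear remap with clamping, like Arduino's map() but for floats."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

def breathing_to_outputs(chest_distance_mm):
    """Map one time-of-flight reading to LED brightness and motor PWM.

    The 200-260 mm window stands in for the chest's travel between
    inhalation and exhalation; all numbers here are illustrative.
    """
    brightness = map_range(chest_distance_mm, 200, 260, 0, 255)    # LED PWM
    vibration = map_range(chest_distance_mm, 200, 260, 40, 180)    # motor PWM
    return round(brightness), round(vibration)
```

Called once per sensor reading, this makes the light and vibration swell and fade with each breath.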

Equilibrium's entire device is also pursuing a balance, whether it is the teeter-totter structure that keeps repetitive motion or the continuous and stable state of brightness and amplitude with the steady flow of human breathing.

Intro to Phys. Comp.