VIOLASTREAM 2.0

On one end of the cam: Viola. On the other end of the web: all of you. Over a livestream, observe and control Viola's behaviors.

David Currie, Viola He

https://vimeo.com/465958625

Description

[VIOLASTREAM 2.0 LIVE]

Wednesday, December 16, 8-10pm EST
Thursday, December 17, 8-10pm EST

To participate, go to: https://violola.herokuapp.com/

To watch stream, go to: twitch.tv/violola/

>>>>>>>>>>>>>>>>>>>

VIOLASTREAM 2.0 is an online interactive performance where I hand control of my actions over to my audiences, who will collectively vote for my next tasks, movements, emotions, and when to do them.

Audiences have access to a webpage with an embedded livestream, action choices, and a comment section. This page is connected to my stream, which displays those inputs in real time. While a computer voice reads all your commands to me, only the highest-voted task will be acted out. To stop the current action and trigger the next, you need to vote for “stop task” until its count surpasses the vote count that triggered the task.
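The voting rule described above could be sketched roughly as follows; the command names and function names here are hypothetical, not taken from the project's actual code:

```javascript
// Tally command votes: the highest-voted task runs, and a "stop task"
// vote only takes effect once it overtakes the current task's count.
function tallyVotes(votes) {
  // votes: array of command strings, e.g. ["wave", "wave", "stop task"]
  const counts = {};
  for (const v of votes) counts[v] = (counts[v] || 0) + 1;
  return counts;
}

function nextAction(counts, currentTask) {
  // The current task keeps running until "stop task" surpasses its count.
  const stop = counts["stop task"] || 0;
  const current = counts[currentTask] || 0;
  if (currentTask && stop <= current) return currentTask;
  // Otherwise the highest-voted non-stop command wins.
  let best = null, bestCount = -1;
  for (const [cmd, n] of Object.entries(counts)) {
    if (cmd === "stop task") continue;
    if (n > bestCount) { best = cmd; bestCount = n; }
  }
  return best;
}
```

In this sketch, a task with 3 votes keeps running until "stop task" reaches 4, at which point the next highest-voted command takes over.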

With performance artists Tehching Hsieh and Marina Abramovic in mind, I'm utilizing web technologies and streaming platforms to explore my own body and identity performance in relation to others. While the audiences act as commanders and spectators, Viola's body performs the role of the object and machine, creating a cybernetic relationship with webcam and livestream as the medium.

ITPG-GT.2074.001, ITPG-GT.2041.001
Cybernetics of Sex: Technology, Feminisms, and the Choreography of Control, Population Infinite: The Future of Identity (Online)
Performance

Net-Natyam

Net-Natyam is a hybrid training system and performance platform that explores the relationships between music, machine learning, and movement through electronic sound composition, pose estimation techniques, and classical Indian dance choreography.

Ami Mehta, David Currie

https://davidalexandercurrie.github.io/net-natyam/

Description

Bharatanatyam is a form of classical Indian dance that involves using complex footwork, hand gestures, and facial expressions to tell stories. The dance is traditionally accompanied by Carnatic music and an orchestra consisting of a mridangam drum, a flute, cymbals, and other instruments. Net-Natyam uses three ml5.js machine learning models (PoseNet, Handpose, and Facemesh) and a webcam to detect the movements of a Bharatanatyam dancer and trigger a corresponding sequence of electronically composed sounds.
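ml5's PoseNet reports a pose as a list of named keypoints with screen positions, which makes this kind of movement-to-sound mapping possible. The sketch below is a hypothetical example of such a mapping; the gesture rule, threshold, and sound names are assumptions, not the project's actual choreography:

```javascript
// Map a PoseNet-style keypoint list to a sound trigger.
// keypoints: [{ part: "rightWrist", position: { x, y } }, ...]
function soundForPose(keypoints) {
  const find = (part) => keypoints.find((k) => k.part === part);
  const rightWrist = find("rightWrist");
  const rightShoulder = find("rightShoulder");
  if (!rightWrist || !rightShoulder) return null; // pose not detected
  // A raised right hand (smaller y = higher on screen) triggers the cymbal;
  // otherwise the default drum sound plays.
  if (rightWrist.position.y < rightShoulder.position.y) return "cymbal";
  return "mridangam";
}
```

In a live sketch, this function would run inside the model's `pose` callback each frame, and the returned name would key into a table of preloaded sound files.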

ITPG-GT.2994.001, ITPG-GT.2048.00006
The Neural Aesthetic (Online), ICM – Media
Performance,Machine Learning

Ceiling Drums

An installation and performance involving percussion objects hung from a ceiling, rhythmically resonated by solenoid motors.

Eamon Goodman

https://vimeo.com/489257121

Description

zoom link:

https://nyu.zoom.us/j/91942675074?pwd=Sm13YzQyVm9abEFUOXlFdUsrSk8rUT09

GO TO
https://midi-sender.herokuapp.com/
OR CLICK PROJECT WEBSITE TO PLAY THE DRUMS YOURSELF!

I have collected, arranged, and hung 7 percussive and sonic objects in an array around the listener's ear, be it a human or electronic eardrum. Attached to each object is a solenoid motor that strikes it, and I control this striking both live with buttons and by creating rhythmic MIDI clips in Ableton Live. I'll then explore the vocabulary of sounds possible with my room-sized instrument, incorporating it into musical performance: perhaps on its own, played and manipulated by multiple people, or with other sound sources, for instance a pitch-detecting harmonizer I created, or an acoustic instrument like the bass clarinet. If I have time and luck, I'll make it possible for spectators to trigger the sculpture over the web.
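One common way to drive hardware like this from MIDI clips is to dedicate one note per solenoid. The routing below is a minimal sketch under that assumption; the base note and channel count are illustrative, not the project's actual mapping:

```javascript
// Route incoming MIDI notes to the seven solenoid channels:
// seven consecutive notes each strike one hung object.
const BASE_NOTE = 60;     // middle C -- an assumed starting note
const NUM_SOLENOIDS = 7;  // one channel per hung object

function solenoidForNote(note) {
  const channel = note - BASE_NOTE;
  // Notes outside the seven-note window are ignored.
  if (channel < 0 || channel >= NUM_SOLENOIDS) return null;
  return channel;
}
```

A note-on handler would call this and pulse the returned channel's solenoid, so a rhythmic clip in Ableton Live becomes a strike pattern across the seven objects.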

ITPG-GT.2301.00006, ITPG-GT.2994.001, ITPG-GT.2048.00005
Intro to Phys. Comp., The Neural Aesthetic (Online), ICM – Media
Sound,Performance

The Trace.

The Trace is a project combining ICM and Physical Computing; with it, I try to capture proof of existence.

En-Tung Liu

https://www.youtube.com/watch?v=064WTEDT5nI&feature=youtu.be

Description

This project comes from two questions: how can people exist, and how can existence be proven? I combine the idea of long exposure in photography with a p5 sketch and set the installation in a dark place. When the audience triggers the sketch, it starts to capture the audience's movement and draws a light trace on the dark canvas; when the audience leaves, the trace disappears.
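A light trace that fades once the audience leaves can be modeled as a list of points whose opacity decays every frame. This is a sketch of that idea only; the decay rate and data shape are assumptions:

```javascript
// Update the trail for one frame: fade every stored point, drop points
// that have become invisible, and append the newly captured position
// (or null when no one is in front of the camera).
function updateTrail(trail, newPoint, decay = 0.95) {
  const faded = trail
    .map((p) => ({ ...p, alpha: p.alpha * decay }))
    .filter((p) => p.alpha > 0.01); // invisible points disappear
  if (newPoint) faded.push({ ...newPoint, alpha: 1 });
  return faded;
}
```

Called once per frame in a p5 `draw()` loop, this leaves a glowing trace behind a moving visitor that empties out within a few seconds of stillness.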

ITPG-GT.2301.00002, ITPG-GT.2048.00005
Intro to Phys. Comp., ICM – Media
Performance,Art

Sensor Battle (Star Wars Edition)

Remotely controlled synchronous RC car battle entertainment

Minyoung Bang, Rui Shang, Sara Ro, Yifei Gao, Zhaozhi Liu

https://vimeo.com/488984000

Description

In the spirit of COVID-19 remote learning, we wanted to create synchronous entertainment done remotely – the first ITP Sensor Battle. We are a group of 5 members spread across time zones 13 hours apart, and we created a live sensor battle connected via a UDP server. This is a match among 3 players to determine the best and strongest sensor. The 3 players, in the USA, battle with 3 different sensors – joystick, gesture, and muscle sensors – connected to the battle (UDP) server to control their respective RC cars on a battleground in Beijing.
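For a relay like this, each sensor reading typically arrives at the server as a small datagram that must be parsed and forwarded to the right car. The packet format below ("player:steer,throttle") is purely an assumption for illustration, not the team's actual protocol:

```javascript
// Parse one UDP datagram from a player's sensor into a drive command.
// Assumed wire format: "player:steer,throttle", e.g. "p1:-0.5,0.8".
function parseSensorPacket(msg) {
  const [player, values] = msg.toString().split(":");
  const [steer, throttle] = values.split(",").map(Number);
  return { player, steer, throttle };
}
```

In a Node.js relay built on the `dgram` module, the server's message handler would parse each packet this way and forward the command to the socket of the matching RC car in Beijing.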

The team ideated 4 different gameplays for the battle, but after putting it to a vote with the ITP class of 2022, we decided on the Paintball gameplay. Each RC car has a canvas and a paint gun mounted, and the players eliminate the others by shooting paint onto the other players’ mounted canvases.

Technicality aside, we imagined ourselves as extraterrestrial beings representing different sensor communities. The Sensor Battle is taking place at Star-dium stadium on the remote frontier planet of Batuu. Hope you enjoy! *Vulcan Salute*

ITPG-GT.2734.00001, ITPG-GT.2301.00004
Live Web (Online), Intro to Phys. Comp.
Performance,Play/Games

KameHameHa Ball

The KameHameHa Ball is an interactive sphere inspired by Dragon Ball Z.

Chenyou Lee

https://vimeo.com/489948512

Description

How to use:

1. Put your left hand on the hand shape on the sphere. Starting from your thumb and moving to your little finger, press down the sensors on the fingertips one at a time.

2. Pressing down a sensor activates one set of LEDs inside the sphere and triggers an animation on the monitor.

3. Each sensor controls one set of LEDs and a clip of the animation. When you have pressed down all five of them, you will complete the “Kame Hame Ha.”
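The thumb-to-little-finger ordering above can be tracked as a simple sequence counter. This is an illustrative sketch, not the project's firmware; the in-order rule is an assumption drawn from the instructions:

```javascript
// Sensors are indexed 0 (thumb) through 4 (little finger).
// Only pressing the next finger in order advances the sequence.
function updateSequence(pressedSoFar, sensorIndex) {
  if (sensorIndex === pressedSoFar) return pressedSoFar + 1;
  return pressedSoFar; // out-of-order presses are ignored
}

function isComplete(pressedSoFar) {
  // All five fingers pressed: the "Kame Hame Ha" is complete.
  return pressedSoFar === 5;
}
```

Each press would also switch on the corresponding LED set and queue the matching animation clip, with the fifth press triggering the finale.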

ITPG-GT.2301.00001
Intro to Phys. Comp.
Performance,Play/Games

Sound and Visual Bender

Bending sounds, color and form with our bodies.

Natalie Fajardo, Rebecca Melman

https://vimeo.com/489637134

Description

In Sound and Color Bender, we explore the relationship between the movements of our bodies, color, form, and sound. This project is the beginning of what we would like to become a tool for performers to create music and visual art simultaneously. What is the connection between the gesture of an arm moving, the frequency of a melody, and a visual pattern on a screen? While based on opposite sides of the country, Natalie and I worked together to create a glove that responds to the movement of the user’s hand. We used the microcontroller’s built-in accelerometer and gyroscope to measure the tilt and acceleration of the hand and sent those values to audio and visual software. The project’s current state provides a meditative space, with stimulating visuals and an airy, atmospheric audio experience.
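A tilt-to-pitch mapping like the one described is usually a clamped linear map from sensor range to frequency range. The ranges below are assumptions for illustration (tilt in degrees, one octave-spanning band), not the project's actual calibration:

```javascript
// Map hand tilt (-90..90 degrees) linearly onto a frequency band.
function tiltToFrequency(tiltDeg, minHz = 220, maxHz = 880) {
  const t = Math.min(Math.max(tiltDeg, -90), 90); // clamp to sensor range
  const norm = (t + 90) / 180;                    // normalize to 0..1
  return minHz + norm * (maxHz - minHz);
}
```

The same normalized value could simultaneously drive a visual parameter (hue, pattern density), which is what lets one gesture bend sound and image together.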

ITPG-GT.2301.00002
Intro to Phys. Comp.
Performance,Wearables

Interactive DJ Pillars

Immersive DJ experience with the dancing light art

Wyatt Zhu, Shinnosuke Komiya, Weiwei Zhou

https://vimeo.com/491479341

Description

A DJ stands in the middle of a set of LED pillars. DJ control panels are attached to the headphones; panning the panels changes the electronic effects of the music as well as the shifting gradient of the LED pillars. We want the main material of the pillars to be half-transparent, with several LEDs at the bottom of each pillar. The device is meant to be played in a fairly dark place: the light shines through the half-transparent material and gradually changes over time.

ITPG-GT.2301.00005
Intro to Phys. Comp.
Performance,Music

Animal Talk – Chatnimal

Animal Talk is an interactive web-based tool that takes text messages as input and converts them into animal sounds and forms – an encrypted audio message.

Christina Lan, Dorian Janezic

https://vimeo.com/489374329

Description

Animal Talk is a web-based application created for the Connections Lab class in the Fall 2020 term by Christina Lan & Dorian Janezic. Our collaboration started with a midterm project, where we had to use socket.io to create an interactive website for multiple users.

We are both very interested in sound, so we decided to explore the p5 sound library and created a pitch-matching game where users try to click as close as possible to the frequency emitted by an oscillator. We were happy with the final output and the feedback, so we decided to continue our collaboration for the final project. Our ideas shifted towards a performative interactive piece in which multiple users could use animal sounds to communicate, creating their own encrypted language by mapping letters, words, or phrases to specific sounds. Right now, users can select the animal sound that will carry their message to other users; when you receive a message, you hear it in a converted, transformed, or encrypted language. Each sound is associated with an animation that plays while listening to the message. Users can also preview the animal sounds they have selected by clicking on the canvas.
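The letter-to-sound mapping could look something like the sketch below. The animals, sound names, and modulo scheme are all hypothetical stand-ins, since the project's real mapping isn't described in detail:

```javascript
// Encode a text message as a sequence of one animal's sounds,
// mapping each letter to a sound by its position in the alphabet.
const ANIMAL_SOUNDS = {
  dog: ["woof", "bark", "growl"],
  cat: ["meow", "purr", "hiss"],
};

function encodeMessage(message, animal) {
  const sounds = ANIMAL_SOUNDS[animal];
  return [...message.toLowerCase()]
    .filter((ch) => ch >= "a" && ch <= "z")       // keep letters only
    .map((ch) => sounds[(ch.charCodeAt(0) - 97) % sounds.length]);
}
```

Over socket.io, the sender would transmit the text plus the chosen animal, and each receiving client would play the encoded sequence back with its matching animation.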

We want to further develop the project and add new features that would form new interactions between users. One of the next steps is to upgrade the UX and add an option for users to click on a text message, listen to it, and try to decode it – or at least guess which animal the sender chose.

Special thanks to Christina for collaboration, Craig and Mathura for making Connections Lab an unforgettable experience and all the residents and Ima Low Res Team.

IMALR-GT.201
Connections Lab
Sound,Performance

Moving On :||

A real-time solo performance that builds on top of itself.

Marcel Truxillo

https://drive.google.com/file/d/1-vZOIY_taekskRdQ9o-Ilb3LoZ43eRRR/view?usp=sharing

Description

ZOOM PASSWORD : ITP

Every 10 minutes the performance resets. Using the audio and motion-capture data of the renditions before, the live actor responds to the ghosts of eir past experience, building on top of itself until it becomes a cacophonous crowd. The actor cannot escape eir past as it continues to present itself; ey can only respond… and respond to the response. If ey cannot change what has happened and is happening, can ey still find resolve? Does the moment of resolve then become another reflection? A study of presence.

ITPG-GT.2032.001
Performing Reality
VR/AR,Performance