Sensor Battle (Star Wars Edition)

Remotely controlled synchronous RC car battle entertainment

Minyoung Bang, Rui Shang, Sara Ro, Yifei Gao, Zhaozhi Liu

https://vimeo.com/488984000

Description

In the spirit of COVID-19 remote learning, we wanted to create synchronous entertainment done remotely – the first ITP Sensor Battle. We are a group of five members spread across time zones 13 hours apart, and we created a live sensor battle connected via a UDP server. The match pits three players against each other to determine the best and strongest sensor. The three players, in the USA, battle with three different sensors – joystick, gesture, and muscle sensors – connected to the battle (UDP) server to control their respective RC cars on the battleground in Beijing.
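A minimal sketch of what the sensor-to-server link might look like, assuming a simple comma-separated text protocol over UDP. The message format, field names, and value ranges here are illustrative assumptions, not the project's actual protocol.

```javascript
// Pack one sensor reading as "playerId,steering,throttle",
// with both axes clamped to [-1, 1] so a noisy sensor can't
// send out-of-range values to the cars.
function encodeReading(playerId, steering, throttle) {
  const clamp = (v) => Math.max(-1, Math.min(1, v));
  return `${playerId},${clamp(steering).toFixed(2)},${clamp(throttle).toFixed(2)}`;
}

// Unpack on the server side before forwarding to the matching RC car.
function decodeReading(message) {
  const [playerId, steering, throttle] = message.split(',');
  return { playerId, steering: Number(steering), throttle: Number(throttle) };
}

// A Node.js UDP client would then do roughly (host/port hypothetical):
//   const socket = require('dgram').createSocket('udp4');
//   socket.send(encodeReading('joystick', 0.5, -0.3), PORT, SERVER_HOST);
```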

The team ideated four different game modes, but after putting them to a vote in the ITP class of 2022, we settled on the Paintball gameplay. Each RC car has a canvas and a paint gun mounted; players eliminate each other by shooting paint at the other players' mounted canvases.

Technicalities aside, we imagined ourselves as extraterrestrial beings representing different sensor communities. The Sensor Battle takes place at the Star-dium stadium on the remote frontier planet of Batuu. Hope you enjoy! *Vulcan Salute*

ITPG-GT.2734.00001, ITPG-GT.2301.00004
Live Web (Online), Intro to Phys. Comp.
Performance, Play/Games

Ripple Inside

Tell me your feeling, I hear, and I feel you.

Eden Chinn, Rui Shang

https://youtu.be/AsxloEstQ4U

Description

As humans, our existence is defined by different emotional states. When we feel an emotional impulse, it's like a ripple is dropped inside of us. This ripple flows outward and is reflected in how we perceive the world around us, as well as how we act within it.

For this project, we wanted to visualize emotional states using colors, shapes, and sounds in a poetic way.

The first thing we did was divide all emotion words into six categories: happy, content, sad, angry, shocked, and afraid. We then used p5.speech to recognize the spoken words rather than training our own model in Teachable Machine, because it is far more accurate; for now, the project can recognize over 110 emotion words.
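The word-to-category mapping described above could be sketched as a simple lookup. The word lists here are a small illustrative subset of the 110+ words the project recognizes, not the actual lists:

```javascript
// Emotion words grouped into the six categories.
// (Illustrative subset only.)
const EMOTION_WORDS = {
  happy:   ['happy', 'joyful', 'delighted', 'cheerful'],
  content: ['content', 'calm', 'peaceful', 'relaxed'],
  sad:     ['sad', 'gloomy', 'heartbroken', 'lonely'],
  angry:   ['angry', 'furious', 'annoyed', 'irritated'],
  shocked: ['shocked', 'stunned', 'astonished', 'startled'],
  afraid:  ['afraid', 'scared', 'anxious', 'terrified'],
};

// Look up the category for a recognized word, or null if it
// isn't one of the known emotion words.
function classifyWord(word) {
  const w = word.trim().toLowerCase();
  for (const [category, words] of Object.entries(EMOTION_WORDS)) {
    if (words.includes(w)) return category;
  }
  return null;
}
```

With p5.speech, a function like this would run inside the recognizer's result callback on the recognized transcript string.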

We created a flowing 3D object and used the sin() function to generate a beautiful ripple. More importantly, we generated multiple audio filters for the song in response to different emotions, and the song's amplitude affects the frequency of the ripple. For the visuals, we matched colors and custom shapes to the different emotion words based on color and shape psychology, aiming to give people an immersive experience.
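The amplitude-driven ripple could be sketched as a pure function: the displacement of a vertex at distance d from the center at time t, where the song's loudness raises the ripple's spatial frequency. The parameter names and constants are assumptions for illustration:

```javascript
// Height of a sin()-based ripple at distance d and time t.
// `level` is the song's current amplitude in [0, 1] (e.g. from
// p5.Amplitude's getLevel()): the louder the song, the higher
// the ripple's frequency, per the description above.
function rippleHeight(d, t, level, baseFreq = 0.1, speed = 2, height = 20) {
  const freq = baseFreq * (1 + level); // amplitude modulates frequency
  return height * Math.sin(freq * d - speed * t);
}
```

In a p5.js draw() loop, this would be evaluated per vertex of the 3D object each frame to make the surface flow outward from the center.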

Tell me your feeling with one word.

I hear you, I feel you.

ITPG-GT.2048.00007
ICM – Media
Machine Learning, Music