Mimi (Yue) Yin – ITP Spring Show 2018

Greeting Booth
https://itp.nyu.edu/shows/spring2018/greeting-booth/

Anthony Bui, Nick Wallace

What's in a greeting? Exploring relationship dynamics in everyday interactions

https://docs.google.com/presentation/d/11HveWLXetx3nr4Tvuh6k9iafrblRY4pNSPTjDjzTYOs/


Description

What happens when two people who know each other meet unexpectedly? How do they decide to greet each other? What does their greeting say about their relationship?

Classes

Collective Play

IMVII
https://itp.nyu.edu/shows/spring2018/imvii/

Brandon Kader

I created a system that seamlessly integrates music, visuals, and depth-sensor data so that I can perform as a conductor. The performer moves through a space to conduct music samples and generative visuals.

https://www.brandonkader.com/thesis


Description

I made an immersive experience that requires no handheld device or VR headset. Having tried a VR headset, I felt disoriented and extremely uncomfortable, and I most definitely do not want to work in that medium. It is important that the performer not touch anything: like a conductor, who plays no specific instrument in the orchestra but instead directs the performance. In my experience, piano performance is not very physical; the piano is stationary, the performer is seated at the keyboard, and there is little room for showmanship or stage presence. I have felt the need to perform my music in another way.

I explored how to make an immersive experience – combining music and visual effects – with a touch-free interface that lets modern composers perform room-scale installations. A Kinect camera gathers depth data in Processing, which feeds a system I built in Max/MSP/Jitter, a visual programming language that patches the sensor data to musical and visual elements. A TripleHead2Go drives a three-projector display to create a room-scale visual experience.
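
As an illustration, here is a minimal sketch of how the Processing side of such a pipeline might read Kinect depth data and forward a single control value to a Max/MSP/Jitter patch over OSC. It assumes the Open Kinect for Processing and oscP5 libraries; the OSC address and port numbers are placeholders, not the configuration of the actual installation, where the mapping to music and visuals happens inside the Max patch.

// Hypothetical Processing sketch: average the Kinect depth image into one
// "performer distance" value and send it to Max/MSP over OSC.
import org.openkinect.processing.*;
import oscP5.*;
import netP5.*;

Kinect kinect;
OscP5 osc;
NetAddress maxPatch;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();                            // start the depth stream
  osc = new OscP5(this, 12000);                  // local OSC port (placeholder)
  maxPatch = new NetAddress("127.0.0.1", 7400);  // Max listening port (placeholder)
}

void draw() {
  image(kinect.getDepthImage(), 0, 0);

  // Average the raw depth readings, skipping pixels with no depth data.
  int[] depth = kinect.getRawDepth();
  long sum = 0;
  int count = 0;
  for (int i = 0; i < depth.length; i++) {
    if (depth[i] > 0) {
      sum += depth[i];
      count++;
    }
  }

  if (count > 0) {
    float avg = sum / (float) count;
    OscMessage msg = new OscMessage("/performer/depth");
    msg.add(avg);                                // Max maps this value to sound and visuals
    osc.send(msg, maxPatch);
  }
}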

Classes

Algorithmic Composition, Choreographic Interventions, Thesis
