Wearables – Winter Show 2020 /shows/winter2020/ ITP/IMA/IMA Low Res Thu, 17 Dec 2020 23:35:37 +0000

Breathe IN/OUT, Float UP/DOWN /shows/winter2020/breathe-in-out-float-up-down/ Tue, 15 Dec 2020 18:23:25 +0000

Devices that enhance meditative experiences in both remote and physical locations, acknowledging that meditation is a communal, multi-sensory practice.

Eden Chinn, Lucas Wozniak, Mee Ko, Rajshree Saraf

Image captions: a participant stands under a physical installation that lifts up and down in response to their breath; two participants with breath sensors attempt to breathe in sync to animate a feather; the breath sensor; wearing the breath sensor.
https://youtu.be/DWl0pBt_lhw

Description

The user wears a belt with an attached stretch sensor that measures their breath. As the sensor stretches with the expansion of the chest, its data is recorded and used to manipulate remote and physical objects. Our project has two variations on this output:

REMOTE/SERVER COMMUNICATION: People in different locations can wear breath sensors. Together, their breathing patterns drive a shared p5.js sketch, in which breathing is signified by a feather animation moving up and down. Both users' breathing patterns create this change collaboratively.

PHYSICAL INSTALLATION: The stretch sensor allows breath to become a controller that manipulates physical phenomena. An installation of yarn pompoms is suspended from a frame and moves in reaction to the wearer’s inhalation and exhalation.
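The mapping from sensor to feather can be sketched roughly like this (an illustrative sketch, not the project's actual code; the function names, the 0–1023 analog range, and the canvas size are all assumptions):

```javascript
// Exponential smoothing softens jitter in the raw stretch-sensor reading.
function smoothBreath(prev, raw, alpha = 0.1) {
  return prev + alpha * (raw - prev);
}

// Linear map from an assumed sensor range to a feather y-position on a
// 400px-tall canvas, inverted so a deeper inhale lifts the feather higher.
function breathToFeatherY(value, inMin = 200, inMax = 800, canvasH = 400) {
  const t = Math.min(Math.max((value - inMin) / (inMax - inMin), 0), 1);
  return canvasH * (1 - t);
}

// Two remote users' readings averaged, so the feather moves collaboratively.
function combinedFeatherY(valueA, valueB) {
  return breathToFeatherY((valueA + valueB) / 2);
}
```

Averaging the two readings is one simple way to make the output collaborative; the sketch could equally sum or interleave them.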

Zoom link:
https://nyu.zoom.us/j/99001077668

p5 link:
https://editor.p5js.org/mnk2970/sketches/ylgyvcaia

ITPG-GT.2301.00008
Intro to Phys. Comp.
Art, Wearables
Meet the Future /shows/winter2020/meet-the-future/ Mon, 14 Dec 2020 18:50:15 +0000

Come join me for a virtual trip to enjoy different scenery around the world, from the comfort of your own home!

Mingren Fu

Image captions: looking at the world through the lens of futuristic glasses; a 3D virtual sphere that displays the snowy scene; crashing waves seen from a perspective inside the ocean virtual sphere.
https://youtu.be/B_wdEPwhsrQ

Description

In my project, I've created a mixed reality (XR) experience for my audience by combining augmented reality (AR) and virtual reality (VR). As a traveler, I would like to experience different environments before I have physically been to a place. One goal of my project is therefore to give people a convenient way to get a sense of different outdoor environments from the comfort of their own home. The environments featured in my project include both natural sights (such as the ocean) and urban ones (such as the park).

To achieve this, I've proposed an installation for a pair of smart glasses that will hopefully exist one day in the future. The concept behind the installation is that, by simply wearing these futuristic glasses, people will be able to view everyday objects at home from a completely new perspective. The outdoor environments are displayed in multiple virtual 3D spheres, created from 360° panoramic images taken at different places around the world; each sphere is attached to a related real-life object.

When the audience walks into one of the virtual spheres, they are surrounded by the whole 360° space and can look around in all directions, which makes for an immersive experience. The immersion comes not only from the visuals but from audio as well: people hear different sounds as they enter each sphere. For instance, when they walk into the ocean sphere, they hear the sound of waves splashing while surrounded by virtual nature.
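The trigger logic can be sketched as a simple containment test (an assumption about how such an experience might be wired up, not the project's actual code; the sphere names, positions, and radii are made up for illustration):

```javascript
// Each environment is a sphere anchored near a real-life object, with an
// associated ambient soundscape.
const spheres = [
  { name: "ocean", center: { x: 0, y: 0, z: 0 }, radius: 2, sound: "waves" },
  { name: "park",  center: { x: 5, y: 0, z: 0 }, radius: 2, sound: "birds" },
];

// Euclidean distance between two 3D points.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Returns the sphere the viewer currently stands inside, or null. When a
// sphere is active, its panorama and soundscape would be played.
function activeSphere(viewerPos, sphereList) {
  return sphereList.find(s => distance(viewerPos, s.center) < s.radius) || null;
}
```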

IMNY-UT.102.00003
Comm Lab: Hypercinema
VR/AR, Wearables
iWind_ /shows/winter2020/iwind_/ Mon, 14 Dec 2020 18:49:58 +0000

Use the movement of your body to play with the wind

Binyan Xu

iWind
https://youtu.be/xytzRsYMmDE

Description

My project is iWind.

You can use the rotation of your head to drive the Arduino, controlling the character's flight through a simulated real world, like wind passing over forests and lakes. You can also try swimming through the universe.

The Arduino is installed on an action-camera helmet mount and connected to Unreal Engine.
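The control mapping might look something like this (a hedged sketch under assumptions: yaw/pitch angles in degrees from a helmet-mounted IMU, a small dead zone, and a ±45° full-deflection range; none of these names or thresholds come from the project):

```javascript
// Ignore tiny head movements so the character doesn't drift.
function applyDeadZone(angle, zone = 5) {
  return Math.abs(angle) < zone ? 0 : angle;
}

// Normalize an angle in roughly [-45, 45] degrees to a control input in [-1, 1].
function angleToInput(angle, maxAngle = 45) {
  const a = applyDeadZone(angle);
  return Math.max(-1, Math.min(1, a / maxAngle));
}

// Head yaw steers; pitch climbs or dives (pitching down climbs, hence the sign flip).
function headToFlight(yawDeg, pitchDeg) {
  return { turn: angleToInput(yawDeg), climb: -angleToInput(pitchDeg) };
}
```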

(iWind is designed and made by Michelle Xu and Wei Wu)

ITPG-GT.2301.00007, ITPG-GT.2079.00001
Intro to Phys. Comp., Virtual Production (Online)
Play/Games, Wearables
Sound and Visual Bender /shows/winter2020/sound-and-visual-bender/ Mon, 14 Dec 2020 18:49:24 +0000

Bending sounds, color and form with our bodies.

Natalie Fajardo, Rebecca Melman

Example Image of Visual Aspect
https://vimeo.com/489637134

Description

In Sound and Color Bender, we explore the relationship between the movements of our bodies, color, form, and sound. This project is the beginning of what we would like to become a tool for performers to create music and visual art simultaneously. What is the connection between the gesture of a moving arm, the frequency of a melody, and a visual pattern on a screen? While based on opposite sides of the country, Natalie and I worked together to create a glove that responds to the movement of the user's hand. We used the microcontroller's built-in accelerometer and gyroscope to measure the tilt and acceleration of the hand, and sent those values to audio and visual software. The project's current state provides a meditative space, with stimulating visuals and an airy, atmospheric audio experience.
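One way to picture the tilt-to-output mapping (an illustrative sketch, not the project's actual code; the ±90° tilt range, the A3–A5 pitch range, and the hue mapping are all assumptions):

```javascript
// Clamp tilt to [-90, 90] degrees and normalize to [0, 1].
function normalizeTilt(tiltDeg) {
  return (Math.max(-90, Math.min(90, tiltDeg)) + 90) / 180;
}

// Map tilt linearly onto a pitch range, here 220 Hz (A3) to 880 Hz (A5).
function tiltToFrequency(tiltDeg, minHz = 220, maxHz = 880) {
  return minHz + normalizeTilt(tiltDeg) * (maxHz - minHz);
}

// Map the same tilt onto a hue (0-360) for the visual side.
function tiltToHue(tiltDeg) {
  return Math.round(normalizeTilt(tiltDeg) * 360);
}
```

Driving both frequency and hue from one gesture value is what ties the sonic and visual outputs together.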

ITPG-GT.2301.00002
Intro to Phys. Comp.
Performance, Wearables
Handy /shows/winter2020/handy/ Mon, 14 Dec 2020 18:48:49 +0000

Handy is a wearable interface translating tangible interaction into intangible experiences.

Peiling Jiang

Handy
https://vimeo.com/379383886/52f9b9d41f

Description

Handy is a wearable interface translating tangible interaction into intangible experiences, using EMG signals and machine learning. Unlike current popular hand-tracking interfaces, including those based on LiDAR and infrared cameras, which require us to consciously pose our bodies for them, on-body sensing based on EMG signals allows more intuitive and relaxed interaction.
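A typical first step in an EMG pipeline like this is windowed feature extraction before classification. The sketch below is an assumption about that preprocessing stage, not Handy's actual code; the window size and the RMS feature choice are illustrative:

```javascript
// Root-mean-square energy of one window of EMG samples.
function rms(window) {
  const sumSq = window.reduce((acc, v) => acc + v * v, 0);
  return Math.sqrt(sumSq / window.length);
}

// Split the raw sample stream into non-overlapping windows and reduce each
// to a single RMS feature; a classifier can then map features to hand poses.
function emgFeatures(samples, windowSize = 4) {
  const feats = [];
  for (let i = 0; i + windowSize <= samples.length; i += windowSize) {
    feats.push(rms(samples.slice(i, i + windowSize)));
  }
  return feats;
}
```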

IMNY-UT.224.001
Introduction to Machine Learning for the Arts (Online) (UG)
Wearables
Prosthetic Biomagnetic /shows/winter2020/prosthetic-biomagnetic/ Mon, 14 Dec 2020 18:48:20 +0000

Add a new dimension to your perception. No need to wait until 2077.

Jingxi Guo

Prosthetic Biomagnetic
https://youtu.be/jFEUR7Vr76o

Description

I have always been interested in how the human body reacts to magnetic fields. Several years ago, I formed an assumption about the human biomagnetic sense: that the modern human body (including the brain) can sense the Earth's magnetic field, as other animals do. We may have had that ability as infants, but nobody ever told us which sensation was the feeling of the magnetic field, so without that guidance we lost the ability to recognize it as we grew.

This is a wearable device. It detects the direction of the magnetic field through a magnetometer and converts that direction into vibration, transmitted to the user through the skin, allowing the user to distinguish the direction of the magnetic field through the vibration signal.
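The direction-to-vibration mapping could be sketched as follows (a hedged illustration, not the device's actual code: it assumes four motors spaced 90° around the wearer, and the project's real motor count and heading convention may differ):

```javascript
// Heading in degrees from the magnetometer's horizontal x/y components,
// with 0 = magnetic north and angles increasing clockwise.
function headingDeg(mx, my) {
  const deg = Math.atan2(my, mx) * 180 / Math.PI;
  return (deg + 360) % 360;
}

// Index of the motor nearest to the heading, out of n motors evenly
// spaced around the body; that motor would be pulsed to indicate north.
function nearestMotor(heading, n = 4) {
  return Math.round(heading / (360 / n)) % n;
}
```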

In terms of technology, I used the Seeeduino XIAO, a smaller Arduino-compatible board, with a BNO055 as my sensor. The current version uses an ordinary vibration motor. Before the start of the winter show, I might replace the board on the back with a custom PCB so that the device becomes smaller. The attachment piece is made from flexible 3D-printing material.

ITPG-GT.2301.00008
Intro to Phys. Comp.
Wearables