Expressive Interfaces: Introduction to Fashion Technology – ITP Spring Show 2018

Conductive Crystallized Textiles

Sandy Hsieh

Cultivating conductive embellished textiles with electrolysis and exploring their potential for embedded electronic circuitry.

http://doc.sandyhsieh.com/category/fashiontech/

Main Project Image

Description

Inspired by “artificilae/matter”, Hannah Croft's extensive collection of samples in which crystals are grown onto woven and embroidered textiles to create mineral surfaces of ‘cultivated embellishment’, I want to explore and expand the potential applications of growing conductive crystallized textiles with electrolysis. Conventional methods of crystal application typically involve hand-sewing or adhesives, both of which can be labor-intensive and expensive. Growing crystals is an elegant alternative, and the results have an organic aesthetic. I will focus on integrating the crystallization of copper within textiles to generate traces for electronic circuits.

Classes

Expressive Interfaces: Introduction to Fashion Technology, Soft Robots and Other Engineered Softness

Haptic Wayfinding Bracelet

Vidia Anindhita, Sandy Hsieh

A wayfinding wearable device and mobile app that translate turn-by-turn directions into haptic vibrations to safely guide visually impaired users to their desired destination.

Main Project Image

Description

For people with visual impairments or blindness, wayfinding, the process of navigating unfamiliar spaces to reach a destination, is a complex and intimidating task. Although mobile phones now come preinstalled with screen readers (e.g. iPhone's VoiceOver and Android's TalkBack), Google Maps and similar GPS navigation apps remain difficult to use. People with visual impairments or blindness rely heavily on multisensory feedback from their environment to navigate spaces, but the screen reader's auditory feedback tends to overpower the other senses, making it hard for the user to concentrate on what steps to take next. This can leave the user in a distressed and vulnerable state, especially if others in the surrounding environment are also distracted (i.e. "smartphone zombies"). We propose a different user experience based on the principles of calm technology: information technology designed so that interaction occurs in the user's periphery rather than constantly at the center of attention. Using the Maps API and Bluetooth LE, we are developing a wayfinding bracelet and mobile app that translate turn-by-turn directions into haptic vibrations to safely guide visually impaired users to their desired destination.
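As a sketch of how such directions might be encoded for the bracelet, the snippet below maps maneuver names to vibration pulse patterns packed into bytes for a Bluetooth LE characteristic write. The maneuver names, pulse durations, and byte layout are all illustrative assumptions, not the project's actual protocol:

```python
# Hypothetical maneuver-to-haptics encoding; names and durations are illustrative.
# Each pattern is a list of (motor_on_ms, pause_ms) pulses.
HAPTIC_PATTERNS = {
    "turn-left":  [(200, 100)],          # one short pulse
    "turn-right": [(200, 100)] * 2,      # two short pulses
    "continue":   [(80, 80)],            # brief tick: keep going straight
    "arrive":     [(500, 200)] * 3,      # long triple buzz: destination reached
}

def encode_maneuver(maneuver: str) -> bytes:
    """Pack a maneuver's pulse pattern into bytes for a BLE write."""
    pulses = HAPTIC_PATTERNS.get(maneuver, HAPTIC_PATTERNS["continue"])
    payload = bytearray()
    for on_ms, off_ms in pulses:
        # Durations are sent in 10 ms units so each fits in one byte.
        payload.append(on_ms // 10)
        payload.append(off_ms // 10)
    return bytes(payload)
```

The phone app would write these payloads to the bracelet whenever the navigation API reports a new step, leaving the audio channel free.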

Classes

Bluetooth LE, Designing Meaningful Interactions, Expressive Interfaces: Introduction to Fashion Technology, Looking Forward 2: Design for Accessibility, Tangible Interaction Workshop

Join

Max Horwich

A voice-controlled web VR experience that invites you to sing along with a robotic choir.

https://wp.nyu.edu/maxhorwich/2018/04/30/join/

Main Project Image

Description

Join is an interactive musical experience for web VR. A choir of synthesized voices sings from all sides in algorithmically-generated four-part harmony, while the user changes the environment by raising their own voice in harmony.

Inspired by the Sacred Harp singing tradition, the music is generated in real time from Markov chains derived from the original Sacred Harp songbook. Each of the four vocal melodies is played from one of the four corners of the virtual space toward the center, where the listener experiences the harmony in head-tracked 3D audio. A microphone input allows the listener to change the VR landscape with sound, transporting them as they join in song.
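A note-level Markov chain of the kind described can be sketched in a few lines; the toy corpus here stands in for the Sacred Harp songbook, and melodies are represented as lists of MIDI note numbers:

```python
import random
from collections import defaultdict

def build_transitions(melodies):
    """Count note-to-note transitions across a corpus of melodies."""
    transitions = defaultdict(list)
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            transitions[a].append(b)
    return transitions

def generate(transitions, start, length, rng=random):
    """Walk the chain: draw each next note from notes that followed the current one."""
    note, line = start, [start]
    for _ in range(length - 1):
        choices = transitions.get(note)
        if not choices:
            break  # dead end: this note never appeared mid-melody in the corpus
        note = rng.choice(choices)
        line.append(note)
    return line
```

Because repeated transitions appear multiple times in the list, more common intervals in the songbook are drawn proportionally more often.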

While the choir is currently programmed to sing only in solfège (as songs in the Sacred Harp tradition are traditionally sung for the first verse), I am in the process of teaching the choir to improvise lyrics as well as melodies. Using text also drawn from the Sacred Harp songbook, I am training a similar set of probability algorithms on words as well as notes. From there, I will use a sawtooth oscillator playing the MIDI Markov chain as the carrier and a synthesized voice reading the text as the modulator, combining them into one signal to create a quadraphonic vocoder that synthesizes hymns in real time.
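The carrier/modulator idea can be sketched in simplified single-band form; a real quadraphonic vocoder splits the modulator into many frequency bands, but the principle of imposing the voice's amplitude contour onto a sawtooth carrier is the same. All parameters here are illustrative:

```python
SAMPLE_RATE = 8000  # assumed sample rate for this sketch

def sawtooth(freq, n_samples, rate=SAMPLE_RATE):
    """Naive sawtooth carrier with values in [-1, 1)."""
    return [2.0 * ((i * freq / rate) % 1.0) - 1.0 for i in range(n_samples)]

def envelope(signal, window=64):
    """Crude amplitude envelope: mean absolute value over a trailing window."""
    env = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window):i + 1]
        env.append(sum(abs(s) for s in chunk) / len(chunk))
    return env

def vocode(modulator, carrier_freq=110.0):
    """Impose the modulator's amplitude envelope onto a sawtooth carrier."""
    carrier = sawtooth(carrier_freq, len(modulator))
    return [c * e for c, e in zip(carrier, envelope(modulator))]
```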

For this show, I present Join in a custom VR headset: a long, quilted veil affixed to a Google Cardboard. Rather than strapping across the user's face, this headset is draped over the head and hangs down, completely obscuring the face and much of the body. After experiencing the virtual environment, participants are invited to decorate the exterior of the headset with patches, fabric pens, or anything else they see fit, leaving their own mark on a piece that has hopefully left some mark on them.

Classes

Algorithmic Composition, Electronic Rituals, Oracles and Fortune-Telling, Expressive Interfaces: Introduction to Fashion Technology, Interactive Music, Open Source Cinema

Mimosa

Ridwan Madon

Mimosa is a piece of wearable technology jewellery that lets women deflect unwanted stares at their bustline.

https://www.ridwanmadon.com/single-post/2018/03/23/Project-Development-Progress

Main Project Image

Description

Inspired by the mechanics of the mimosa plant, which folds its leaves when touched, the project reflects on the sole purpose of the wearable piece. Mimosa was made for women who are confident in their bodies but feel uncomfortable when they receive stares at their bust line. Mimosa is triggered and controlled by the wearer as and when she feels her space is being invaded. Using only her phone and Bluetooth to trigger the servo motor, the piece extends the wearer's ability to shield herself from this unjustified behaviour.
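The phone-to-servo path could be sketched as below: a Bluetooth message toggles the shield state, and the target angle is converted to the standard 50 Hz servo pulse width. The trigger byte, angles, and pulse range are assumptions for illustration, not the project's actual values:

```python
# Hypothetical trigger logic; the 'T' byte and angle values are illustrative.
OPEN_ANGLE = 0     # resting position
CLOSED_ANGLE = 90  # shielded position, folded over the bustline

def angle_to_pulse_us(angle: float) -> int:
    """Map 0-180 degrees to the typical 1000-2000 microsecond servo pulse."""
    return int(1000 + (angle / 180.0) * 1000)

def handle_ble_message(message: bytes, shielded: bool):
    """Toggle the shield on a 'T' (trigger) byte; return (new_state, pulse_us)."""
    if message == b"T":
        shielded = not shielded
    angle = CLOSED_ANGLE if shielded else OPEN_ANGLE
    return shielded, angle_to_pulse_us(angle)
```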

Classes

Expressive Interfaces: Introduction to Fashion Technology, Project Development Studio

Rhythmic skin

Nicolas Peña-Escarpentier

A wearable device for one-person procedural musical performances.

http://itp.nicolaspe.com/2018/04/rhythmic-skin/

Main Project Image

Description

Using a set of six vibration motors, Rhythmic Skin transmits haptic messages to the performer, giving them essential information about the different layers of the music. Built around a Teensy microcontroller, it can connect as a MIDI instrument, making it compatible with most audio software.
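The routing from incoming MIDI to the six motors might look like this, sketched in Python for clarity (the firmware itself would run on the Teensy in C++). The base note and the one-motor-per-note layout are assumptions, not the project's actual mapping:

```python
# Hypothetical MIDI-to-motor routing; base note and layout are illustrative.
NUM_MOTORS = 6
BASE_NOTE = 36  # MIDI C2, an assumed lowest trigger note

def note_to_motor(note: int):
    """Map an incoming MIDI note to one of the six motors, or None if out of range."""
    index = note - BASE_NOTE
    if 0 <= index < NUM_MOTORS:
        return index
    return None

def velocity_to_pwm(velocity: int) -> int:
    """Scale MIDI velocity (0-127) to an 8-bit PWM duty cycle for motor intensity."""
    return min(255, velocity * 2)
```

Each layer of the music can then be sequenced on its own note, so the performer feels kick, snare, and melodic cues at distinct points on the body with velocity-controlled intensity.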

Classes

Expressive Interfaces: Introduction to Fashion Technology
