Interactive Origami Lighting

Mathura Govindarajan, Cristina Cannella

Interactive modular origami lights that respond to changes in physical arrangement.

Description

The project is a modular origami structure with lighting inside. The origami model is designed so that its spatial arrangement can be changed by manually moving it. Each movement results in a different 'circuit': a different arrangement leads to a different circuit connection, which in turn changes the behaviour of the light inside the model. The main interaction is physical, in the sense that the model has to be handled and played with to see changes in the lighting.

Classes

Introduction to Physical Computing

Hyper Headphones

Dhruv Damle, Viniyata Pany

A playful twist on an ordinary product! These headphones not only play, but also pan and boost the music you love.

http://dhruvdamle.com/hyperheadphones/

Description

In addition to just playing music, hyper headphones respond in real time when the user tilts their head left, right, forward, or back. The left and right tilt maps linearly to left and right audio channel panning, while the forward and back tilt changes the volume level.

The product houses a sensor that measures the X and Y rotation. A microcontroller, also encapsulated inside the product, then maps these parameters to volume levels and channel panning. This project transforms ordinary headphones into an interactive and playful gadget. Use it as a wearable that accompanies you wherever you go, or use it as a cool head-banging gadget when you listen to the albums by your favorite band!
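The mapping described above can be sketched in a few lines. This is an illustrative sketch, not the authors' firmware: the angle ranges, function names, and output scales are assumptions.

```python
# Hypothetical tilt-to-audio mapping: roll (left/right tilt) -> pan,
# pitch (forward/back tilt) -> volume. Ranges are assumed, not measured.

def map_range(x, in_min, in_max, out_min, out_max):
    """Linearly map x from one range to another (like Arduino's map())."""
    x = max(in_min, min(in_max, x))  # clamp to the input range
    return out_min + (x - in_min) * (out_max - out_min) / (in_max - in_min)

def tilt_to_audio(roll_deg, pitch_deg):
    """Return (pan in [-1, 1], volume in [0, 1]) for head tilt in degrees."""
    pan = map_range(roll_deg, -45, 45, -1.0, 1.0)
    volume = map_range(pitch_deg, -45, 45, 0.0, 1.0)
    return pan, volume
```

With a level head, `tilt_to_audio(0, 0)` gives centered pan and mid volume; tilting fully to one side pans hard to that channel.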

Classes

Introduction to Physical Computing

Moolody

Peiyu Luo

A recorder player that translates human sound into pitches and plays them.

http://www.peiyu-luo.com/2015/12/07/winter-show-doc/

Description

My final project is a machine that reads patterns and plays music. It has two parts:
The first part is a drawing machine. The user sings a song, and the machine analyzes the notes, translates them into specific patterns, and prints them.
The second part is a synthesizer. This machine reads those special patterns and plays the music in an interesting way.
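The note-analysis step could work along these lines. This is a minimal sketch of pitch-to-note conversion under standard A4 = 440 Hz tuning, not the project's actual code; the function and table names are hypothetical.

```python
import math

# Hypothetical sketch: convert a detected vocal frequency to the nearest
# equal-tempered note name, the kind of analysis the drawing machine
# would need before translating notes into printable patterns.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz):
    """Return (note name, octave) for the nearest pitch, with A4 = 440 Hz."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))  # nearest MIDI note
    return NOTE_NAMES[midi % 12], midi // 12 - 1
```

For example, a sung 440 Hz maps to A4, and roughly 261.6 Hz maps to middle C.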

Classes

Cooking With Sound, Introduction to Physical Computing

Disembodied

Shangbo Chen, Yun Li

An interactive sound installation in which people move through different environments based on their own choices and have otherworldly experiences.

http://kylinsays.com/pcomp/disembodied/

Description

Size: 2m x 2m ~ 2.4m x 2.4m

Environment: Dark and Quiet

Using: Kinect, Projections on ceiling

Our project is an installation that aims to provide otherworldly experiences for audiences using Kinect, projections and sound. The participant walks across an empty grid of projected tiles on the ground; each tile contains a different environment that shifts as the participant moves. We put people in charge of where they want to go, letting them make their own explorations shaped by our design and their very own imaginations.

We use a Kinect, paired with Processing, to track people’s movement on a 5×5 board of tiles, 2 to 2.4 meters on each side. A dark environment makes our projections more visible. Once the Kinect locates the participant, the program plays a soundtrack through headphones, composed of many environmental sounds that build up the virtual world around them. Lights project the grid on the ground as well as flavor text containing an elusive, suggestive storyline that helps participants understand the world they are currently in.
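The tile lookup at the heart of the tracking could be sketched as follows. This is an assumed reconstruction, not the authors' Processing code; the grid and board constants mirror the dimensions given above.

```python
# Hypothetical sketch: map a tracked floor position (from the Kinect)
# to one of the 5x5 tiles, so the matching soundtrack can be triggered.

GRID_SIZE = 5       # 5x5 tiles
BOARD_METERS = 2.0  # board side length (2 to 2.4 m in the installation)

def position_to_tile(x_m, y_m):
    """Return (col, row) for a position in meters from the board's corner."""
    tile = BOARD_METERS / GRID_SIZE
    col = min(int(x_m / tile), GRID_SIZE - 1)  # clamp to the last tile
    row = min(int(y_m / tile), GRID_SIZE - 1)
    return col, row
```

A participant standing near one corner lands on tile (0, 0); crossing the board steps through tiles up to (4, 4), each switching the soundscape.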

We built this project to encourage people to go wild with their imaginations. At first glance, it is an empty grid of nothing. We hope that when people finish the walk and look back, they will see something out of this empty space: a vivid world created solely by their own curiosity, memories and emotions.

Classes

Introduction to Physical Computing

Real Me

Jingwen Zhu

Real Me is a live chatting app with BLE-connected sensors that uses graphics and messages to reveal the user's real emotions while chatting.

Description

Real Me is a chatting app with a BLE-connected muscle sensor, pulse sensor and vibration sensor. It tracks the user's emotions while chatting and uses system messages to reveal the user's real feelings. The sensor readings also draw geometric motion graphics as the background of the chat room.

Classes

Designing for Data Personalization, Live Web, Understanding Networks

Rube Telephone

Chino Kim, Aaron Montoya-Moraga

A Rube Goldberg telephone that tangibly processes and translates conversation between two people, creating a fun audiovisual experience using various old and new communications technologies and producing unexpected outcomes.

Description

Using Morse code, language translation and optical character recognition, our contraption processes and manipulates conversation between two hacked rotary phones. Each step in the chain reaction is a tangible experience and occasionally distorts the message as it passes it along.

A one-way transmission may look like this:

– Someone speaks into phone 1.

– Their sentence gets tapped out on a Morse key by a solenoid.

– The resulting Morse code is translated back to English, then into Spanish.

– The Spanish string is spoken by the machine and translated back to English.

– The resulting sentence is printed by a thermal printer; a camera takes a photo of the printout, and the text is pulled out of the image using OCR. Both the camera view and the OCR output are shown on a display.

– The resulting sentence is then spoken through the earpiece of phone 2.
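The Morse step of the chain can be sketched with a standard International Morse table. This is an illustrative encoder, not the project's code; the project's own encoding (and the solenoid timing) may differ.

```python
# Hypothetical sketch: encode an English sentence into dots and dashes
# that a solenoid could tap out on the Morse key. Standard International
# Morse letters only; digits and punctuation are omitted for brevity.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def to_morse(text):
    """Encode letters; separate letters with spaces and words with ' / '."""
    words = text.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE) for word in words
    )
```

For example, "SOS" encodes to `... --- ...`; unknown characters are simply dropped, one plausible source of the message distortion the piece plays with.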

To do:

– Reverse the signal flow when going from phone 2 to phone 1.

– Use audio thresholds to control the direction of transmission and properly route audio signals.

– Play hold music in phone earpieces while message is being processed.

– Improve OCR (crop photo, error handling).

– Implement autocorrect so that words mangled by Morse or OCR get converted to actual words before they’re passed off to the next step (Google search suggestions API).

– Code ringer behavior – if one phone is picked up, the other rings (we are already able to make the phones ring, we just need to program the Arduino to control them).

– Build/buy stands and mounts for each “station” (two phones, Morse, translation, OCR).

– Write an Arduino program that will control a light bulb progress bar (we already have the relays and wiring in place for this).

– Write a Morse code program that will convert the signals tapped out on the Morse key back to English (this isn’t a priority since we can fake it and use the English string from the previous step).

Classes

Introduction to Computational Media, Introduction to Physical Computing

Sound Boxing

Yuan Gao, Yuchi Ma

Experience sonification and visualization while practicing boxing punches.

https://vimeo.com/148029990

Description

Our main sensors, an accelerometer and a gyroscope, are built into the boxing gloves. At the show, we give users the pair of boxing gloves and a Thai pad to play with each other, asking them to punch like a professional boxer in practice. The data sent from the gloves over Bluetooth controls a spotlight mounted above the show area and sound effects played through a speaker connected to a laptop. Users experience the spotlight changing (colours, strobe) and the sound beats changing according to the punching gestures they make.
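One plausible way to turn the glove data into light and sound triggers is a simple threshold on acceleration magnitude. This is an assumption about the approach, not the authors' code, and the threshold value is made up for illustration.

```python
import math

# Hypothetical punch detector: treat a spike in the accelerometer's
# magnitude as a punch, which could then trigger a spotlight or sound
# change. The threshold is an assumed value, in units of g.

PUNCH_THRESHOLD_G = 3.0

def is_punch(ax, ay, az):
    """True if the acceleration magnitude exceeds the punch threshold."""
    return math.sqrt(ax * ax + ay * ay + az * az) > PUNCH_THRESHOLD_G
```

A glove at rest reads about 1 g (gravity) and stays below the threshold; a sharp jab spikes well above it.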

Classes

Introduction to Computational Media, Introduction to Physical Computing

True Love Tinder Robot

Nicole He

Literally put your love life in a robot's hands.

http://nicole.pizza/itp/tag/tinder-robot/

Description

The True Love Tinder Robot will find you love, guaranteed. With Tinder open, you put your phone down in front of the robot hand. Then you place your own human hands on the sensors. As you look at each Tinder profile, the robot reads your true heart's desire through the sensors and decides whether or not you are a good match with that person based on how your body reacts. If it determines that you're attracted to that person, it swipes right. If not, it swipes left. Throughout the process, it comments on your involuntary decisions.

In a time when it's very normal for couples to meet online, we trust algorithms on dating sites to find us potential partners. Simultaneously, we use consumer biometric devices to tell us what's going on with our bodies and what we should do to be healthy and happy. Maybe it's not a stretch to consider what happens when we combine these things.

This project explores the idea that the computer knows us better than we know ourselves, and therefore it has better authority on who we should date than we do. In a direct way, the True Love Tinder Robot makes the user confront what it feels like to let computers make intimate decisions for us.

The robot is built with an Arduino, servos, a text-to-speech module, LEDs, a couple of sheets of metal acting as a galvanic skin response sensor, a bunch of wires, a box, and a speaker. The code is available on GitHub. You can also find my in-progress documentation on my ITP blog.
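The swipe decision could be as simple as comparing skin response while viewing a profile against a resting baseline. This is a hypothetical sketch of that logic, not the code on GitHub; the function name, readings, and threshold factor are all assumptions.

```python
# Hypothetical decision rule: if mean galvanic skin response while a
# profile is shown rises noticeably above the resting baseline, swipe
# right; otherwise swipe left. The factor of 1.2 is an assumed value.

def swipe_decision(baseline_readings, profile_readings, factor=1.2):
    """Return 'right' if mean GSR while viewing exceeds baseline * factor."""
    baseline = sum(baseline_readings) / len(baseline_readings)
    viewing = sum(profile_readings) / len(profile_readings)
    return "right" if viewing > baseline * factor else "left"
```

A clear jump in the readings produces a right swipe; a flat response, a left one.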

This is my final project for my Intro to Computational Media and Intro to Physical Computing classes during fall of 2015 at NYU ITP. Special thanks to my teachers, Tom Igoe and Lauren McCarthy.

Classes

Introduction to Computational Media, Introduction to Physical Computing

Play Pixels

Jonathan James Gallagher, Soyeon Chung

A DIY 16×16 RGB LED Matrix with camera acting as a video mirror with 256 large silicone buttons on top for drawing pixel art.

http://www.soyeonchung.net/2015/11/03/pcomp-final-project-proposal/

Description

We built a 16 x 16 LED matrix using Sparkfun button pads; it is a large version of an Arduino-based monome. Each LED is RGB, driven by TLC5940 ICs, and the matrix can output a 16 x 16 downsampled version of a live camera feed. Each LED is also a button; we use this feature to let users draw pixel art by pressing buttons to change their colour.

The purpose of this project is to provide a novel interface that lets users perceive themselves in the context of the form our device displays. Only large differences in colour or light are tracked, resulting in a recognisable outline devoid of the finer detail that we are used to seeing, and perhaps obsessing over, when using a mirror. We display the basic form of a person, hopefully making people realise that image is primarily a matter of perception, which varies from observer to observer.
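The downsampling step can be sketched by block-averaging the camera frame, one block per LED. This is an assumed illustration, not the project's code; the frame representation (a 2D list of RGB tuples) is chosen for simplicity.

```python
# Hypothetical sketch: downsample a camera frame to a 16x16 grid of RGB
# values, one per LED, by averaging each block of pixels.

MATRIX = 16

def downsample(frame):
    """frame: rows of (r, g, b) tuples, with dimensions divisible by 16."""
    bh, bw = len(frame) // MATRIX, len(frame[0]) // MATRIX
    out = []
    for row in range(MATRIX):
        out_row = []
        for col in range(MATRIX):
            block = [frame[row * bh + y][col * bw + x]
                     for y in range(bh) for x in range(bw)]
            n = len(block)
            # average each colour channel over the block
            out_row.append(tuple(sum(p[c] for p in block) // n for c in range(3)))
        out.append(out_row)
    return out
```

A 320 x 240 camera frame would collapse into 16 x 16 averaged colours, losing exactly the fine detail the piece wants viewers to stop obsessing over.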

Classes

Introduction to Physical Computing

A Flash in the Pan

Yiting Liu, Yue Hu

Grabbing the star and evoking magic effects.

Description

The basic idea of the project is to evoke life by grabbing falling stars and dragging them to particular positions. The life appears in both the virtual and physical worlds in several forms. In the virtual world, when a player grabs a star and puts it on the tree, the flowers blossom. Likewise, when a player drags a star to the stone in the virtual world, the stone in the physical world comes to life as well.

Through this project, we would like to show that life is short, like a flash in the pan or a meteor shooting across the sky. However, life is also infinite and powerful, since it can be transformed into various forms and last forever.

In the project, a Leap Motion controller detects hand positions and gestures. We used Processing to code the virtual effects and Arduino to build the connection between the virtual and physical worlds.

Classes

Introduction to Computational Media, Introduction to Physical Computing