Interior Design with Machine Learning

Zahra Khosravi

An interior-design color predictor that learns a user's selection patterns and uses machine learning to match furniture colors (3 options) with interior wall paint colors (an unlimited number of randomly generated options). Users enter their choices on a touch-screen platform.

https://wp.nyu.edu/zahra_k_itp/interior-design-with-machine-learning/

Description

The user pairs their favorite furniture colors with interior wall paints a limited number of times, and the color predictor learns their selection pattern to predict their favorite combinations for an unlimited number of wall paint options. The furniture comes in three colors (dark gray, white, red); the wall paints are randomly generated by the code. The user records each choice by tapping their foot on a wooden platform fitted with sensors. Then, by running the code, they see on the monitor which furniture color it predicts for other wall paint options. It is very exciting to see your favorite furniture and wall paint combination predicted by the code.
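The description doesn't name the model the project uses, so purely as an illustration, here is a minimal sketch of one way such a predictor could work: a k-nearest-neighbors vote over the user's recorded choices, in plain JavaScript. The sample data and the distance-in-RGB heuristic are assumptions for the example, not the project's actual method.

```javascript
// Hypothetical sketch: predict a furniture color for a new wall paint
// using k-nearest neighbors over the user's recorded choices.

const FURNITURE = ['darkGray', 'white', 'red'];

// Each recorded choice pairs a wall paint [r, g, b] with the furniture color picked.
const choices = [
  { wall: [240, 240, 235], furniture: 'darkGray' },
  { wall: [180,  30,  40], furniture: 'white' },
  { wall: [ 60,  60,  70], furniture: 'red' },
  // ...more taps recorded from the platform
];

function distance(a, b) {
  // Euclidean distance in RGB space
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

function predictFurniture(wall, k = 3) {
  // Find the k recorded choices whose wall paint is closest to the new one
  const nearest = [...choices]
    .sort((p, q) => distance(p.wall, wall) - distance(q.wall, wall))
    .slice(0, k);
  // Majority vote among their furniture colors
  const votes = {};
  for (const c of nearest) votes[c.furniture] = (votes[c.furniture] || 0) + 1;
  return FURNITURE.reduce((best, f) => (votes[f] || 0) > (votes[best] || 0) ? f : best);
}

// A randomly generated wall paint, as in the installation:
const randomWall = [Math.random() * 255, Math.random() * 255, Math.random() * 255];
console.log(predictFurniture(randomWall));
```

Nearest-neighbor voting suits this setup because it needs only the handful of examples a user can comfortably tap in, with no training step between recording and prediction.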

Classes

The Nature of Code

Reactive Display

M.H. Rahmani

A display that reacts, zooms in, and shows you more when you take a closer look.

http://blog.mhrahmani.com/project/reactive_display/

Description

This project tracks the viewer looking at a big display and zooms in on the area they are looking at when they lean in for a closer look. The premise is simple, but after making my first draft I realized how intuitive and helpful this kind of interaction with a display is. Imagine trying to look closer at a subway map: wouldn't it be nice if the map also helped you out?
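As an illustration of the interaction, here is a minimal p5.js sketch of the pan-and-zoom math, assuming some face tracker supplies a gaze point and a lean-in signal; mouse position and mouse press stand in for both here, and the asset name is a placeholder.

```javascript
// Minimal p5.js sketch of the reactive zoom. In the piece, a camera-based
// tracker would supply the gaze point and the lean-in signal.

let img;
let zoom = 1;

function preload() {
  img = loadImage('map.png'); // placeholder content, at least canvas-sized
}

function setup() {
  createCanvas(1280, 720);
}

function draw() {
  const gazeX = mouseX, gazeY = mouseY; // stand-in for the tracked gaze point
  const target = mouseIsPressed ? 3 : 1; // stand-in for lean-in detection
  zoom = lerp(zoom, target, 0.1); // ease toward the target zoom

  // Draw the source region centered on the gaze point, scaled to fill the canvas.
  const sw = width / zoom, sh = height / zoom;
  const sx = constrain(gazeX - sw / 2, 0, img.width - sw);
  const sy = constrain(gazeY - sh / 2, 0, img.height - sh);
  image(img, 0, 0, width, height, sx, sy, sw, sh);
}
```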

Classes

Live Image Processing and Performance

Digital Rothko

Jason Yung

Colour field painting in the digital age

https://

Description

Digital Rothko aims to bring painting into a new medium: light. Artist Jason Yung mixes colours and creates forms with light in the same way a painter uses oil paints. Digital Rothko is a colour field light work that is an interpretation of “the life within the squares” that Yung sees in the work of Abstract Expressionist painter Mark Rothko.

Classes

Project Development Studio

1968 Washington Square Park

Ilana Pecis Bonder

A site-specific augmented-reality experience that transports the user back in time to 1968 Washington Square Park.

http://www.ilanabonder.com/back-to-1968

Description

1968 was a unique moment in the life of downtown Manhattan, and a year that is said to have changed the world. By creating a site-specific cinematic AR time-travel experience, we transport the user back in time to '68 Washington Square Park.

To create a coherent storyline, we collaborated with playwright Joanna Evans, who wrote the spoken text that guides users as they roam alongside characters of the time, extracted from original 1968 footage and images and layered in space across the square.

Exploring new and immersive storytelling methods is an essential step in creating an intergenerational chain, bringing historical depth and meaning to the social protests taking place in the US in 2018. Our goal is to introduce our audience to both the upheavals and the potential of 1968, as they encounter those who waved protest signs in the same place, and often on behalf of the same values.

Classes

Magic Windows, Prototyping Augmented Reality

Faceblurk

Chelsea Chen

Together, the mosaic pixels paint the landscape of the digital surveillance state: a blurred image of depersonalization.

https://chelseachenchen.com/2018/04/30/faceblurk/

Description

According to Manuel Castells’s The Power of Identity, in this digital age the biggest threat to our privacy is no longer being watched by “Big Brother.” Instead, “little sisters” – business organizations that have access to our private lives – are more entangled with our personal lives. Judging by the ads I receive on Facebook and Instagram, I feel my social media knows me better than I know myself. But it doesn't really care who I am; it only sees me, or any of us, as a target customer, a sample in the big data, a dispensable pixel of its business.

In this project, I want to convey this concept using the image processing skills I learned in the Pixel by Pixel class, creating a depersonalized album – Faceblurk.

It's my fight back.
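For a sense of the technique, here is a minimal p5.js sketch of a mosaic effect like the one described: each block of the image is painted with a single sampled color, dissolving any face into anonymous pixels. The block size and asset name are placeholders, not the project's actual values.

```javascript
// Minimal mosaic effect: reduce an image to large blocks, each painted
// with the color sampled at its center.

let img;
const BLOCK = 16; // mosaic block size in pixels (placeholder value)

function preload() {
  img = loadImage('portrait.jpg'); // placeholder asset
}

function setup() {
  createCanvas(img.width, img.height);
  noStroke();
}

function draw() {
  for (let y = 0; y < img.height; y += BLOCK) {
    for (let x = 0; x < img.width; x += BLOCK) {
      const c = img.get(x + BLOCK / 2, y + BLOCK / 2); // sample block center
      fill(c);
      rect(x, y, BLOCK, BLOCK);
    }
  }
  noLoop(); // static album image; render once
}
```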

Classes

The World, Pixel By Pixel

Haptic Wayfinding Bracelet

Vidia Anindhita, Sandy Hsieh

A wearable wayfinding device and mobile app that translate turn-by-turn directions into haptic vibrations, safely guiding visually impaired users to their desired destination.

Description

For people with visual impairments or blindness, wayfinding – the process of navigating unfamiliar spaces to reach a destination – is a complex and intimidating task. Although mobile phones now come preinstalled with screen readers (e.g., iPhone VoiceOver and Android’s TalkBack), Google Maps and similar GPS navigation apps are not user-friendly. People with visual impairments or blindness rely heavily on multisensory feedback from their environment to navigate spaces, but the auditory feedback from the screen reader tends to overpower all other senses, making it hard for the user to concentrate on what steps to take next. This can leave the user distressed and vulnerable, especially if others in the surrounding environment are also distracted (i.e., smartphone zombies).

We propose a different user experience based on the principles of calm technology: information technology designed so that interaction with the user occurs in their periphery rather than constantly at the center of attention. Using a Maps API and Bluetooth LE, we are developing a wayfinding bracelet and mobile app that translate turn-by-turn directions into haptic vibrations to safely guide visually impaired users to their desired destination.
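As a rough sketch of how turn-by-turn directions might become vibrations, here is a hedged example using the browser's Web Bluetooth API; the service and characteristic UUIDs, the maneuver names, and the pattern encoding are all hypothetical, not the project's actual protocol.

```javascript
// Hypothetical mapping from maneuvers to vibration patterns, sent to the
// bracelet over Bluetooth LE via the Web Bluetooth API.

// Patterns as alternating [on, off] durations in units of 10 ms (illustrative).
const PATTERNS = {
  'turn-left':  new Uint8Array([20, 10, 20, 10]),      // two short pulses
  'turn-right': new Uint8Array([50, 0]),               // one long pulse
  'arrive':     new Uint8Array([10, 5, 10, 5, 10, 5]), // three quick pulses
};

const SERVICE_UUID = '0000aaaa-0000-1000-8000-00805f9b34fb';        // hypothetical
const CHARACTERISTIC_UUID = '0000bbbb-0000-1000-8000-00805f9b34fb'; // hypothetical

async function connectBracelet() {
  // Prompt the user to pick the bracelet, then open its vibration characteristic.
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [SERVICE_UUID] }],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService(SERVICE_UUID);
  return service.getCharacteristic(CHARACTERISTIC_UUID);
}

// Send the pattern for each maneuver as the directions API reports it.
async function vibrate(characteristic, maneuver) {
  const pattern = PATTERNS[maneuver];
  if (pattern) await characteristic.writeValue(pattern);
}
```

Keeping the patterns short and distinct is what makes the feedback "calm": a pulse at the wrist registers in the periphery without demanding the sustained attention a spoken instruction does.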

Classes

Bluetooth LE, Designing Meaningful Interactions, Expressive Interfaces: Introduction to Fashion Technology, Looking Forward 2: Design for Accessibility, Tangible Interaction Workshop

Revenge of Left Shark

Caleb Ferguson, Gabriel Goncalves

Play DDR with your hands! Many years after messing up at Super Bowl XLIX, LEFT SHARK is ready to teach you his moves.

http://www.blog.calebfergie.com/2018/05/02/revenge-of-left-shark/

Description

Revenge of Left Shark is an interactive rhythm and dance game. In other words, DDR with your hands. The game uses computer vision to place the player on the beach, supply visual cues and provide real-time scoring information. Show Left Shark your moves!
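The description doesn't say which computer-vision technique places the player on the beach; one classic approach from the Pixel by Pixel class is background subtraction. The sketch below is a guess at the method rather than the game's actual code: it composites webcam pixels that differ from a stored background frame onto a beach image.

```javascript
// Hypothetical p5.js background-subtraction compositor: wherever the live
// camera differs from a stored empty background, draw the camera pixel
// (the player) over the beach.

let cam, beach, bg;
const THRESHOLD = 60; // summed RGB difference that counts as "player"

function preload() {
  beach = loadImage('beach.jpg'); // placeholder asset
}

function setup() {
  createCanvas(640, 480);
  pixelDensity(1); // keep canvas and camera pixel arrays the same size
  cam = createCapture(VIDEO);
  cam.size(width, height);
  cam.hide();
}

function keyPressed() {
  // Press any key while standing out of frame to store the empty background.
  bg = cam.get();
}

function draw() {
  image(beach, 0, 0, width, height);
  if (!bg) return; // wait until a background frame has been captured
  cam.loadPixels();
  bg.loadPixels();
  loadPixels();
  for (let i = 0; i < cam.pixels.length; i += 4) {
    const diff =
      Math.abs(cam.pixels[i]     - bg.pixels[i]) +
      Math.abs(cam.pixels[i + 1] - bg.pixels[i + 1]) +
      Math.abs(cam.pixels[i + 2] - bg.pixels[i + 2]);
    if (diff > THRESHOLD) {
      // Pixel changed since the background frame: treat it as the player.
      pixels[i]     = cam.pixels[i];
      pixels[i + 1] = cam.pixels[i + 1];
      pixels[i + 2] = cam.pixels[i + 2];
      pixels[i + 3] = 255;
    }
  }
  updatePixels();
}
```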

Classes

The World, Pixel By Pixel

Rocket Slinger

Lucas Chung, Shreiya Chowdhary

An ITP arcade game that teaches you the fundamentals of orbital trajectories using rockets!

http://chung.work/blog/2018/02/orbital-slingshot-game/

Description

Rocket Slinger is an exciting game that teaches kids about orbital trajectories. It features a physical controller that lets users aim and launch rockets with precision. It will be on display for a two-hour period on May 13th at the New York Hall of Science.
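As an illustration of the fundamentals the game teaches, here is a minimal p5.js sketch of a rocket under a planet's gravity, in the style of a Nature of Code attractor. The constants and the simple Euler integration are illustrative, not the game's actual physics.

```javascript
// Minimal orbital-trajectory demo: each frame, gravity accelerates the
// rocket toward the planet, and the launch velocity decides whether it
// orbits, escapes, or crashes.

let rocket;
const planet = { x: 320, y: 240, mass: 5000 }; // G folded into mass

function setup() {
  createCanvas(640, 480);
  // Launch sideways: a roughly tangential velocity produces an orbit.
  rocket = { x: 320, y: 100, vx: 6, vy: 0 };
}

function draw() {
  background(0);

  // Gravitational acceleration toward the planet: a = M / r^2
  const dx = planet.x - rocket.x;
  const dy = planet.y - rocket.y;
  const r = Math.max(Math.hypot(dx, dy), 10); // clamp to avoid huge forces
  const a = planet.mass / (r * r);
  rocket.vx += (a * dx) / r; // unit vector toward the planet, times a
  rocket.vy += (a * dy) / r;
  rocket.x += rocket.vx;
  rocket.y += rocket.vy;

  fill(100, 150, 255);
  circle(planet.x, planet.y, 40); // the planet
  fill(255);
  circle(rocket.x, rocket.y, 8);  // the rocket
}
```

Try changing `vx` at launch: below the circular-orbit speed (about 6 here) the rocket spirals in, and well above it the rocket slingshots away, which is exactly the intuition the game builds.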

Classes

Designing Games for Kids

35 Years of Mass Shootings in the US

Mary Notari

A piece of music generated using Mother Jones's comprehensive dataset of the past 35 years of mass shootings in the US as the score, in an attempt to answer the question: how can music move us?

http://www.marynotari.com/2018/03/28/interactive-music-final-project/

Description

Since first publishing a report in 2012, Mother Jones has continuously updated a spreadsheet of dates, locations, casualties, and other metrics for each mass shooting that has occurred in the US since 1982. This project was a result of Yotam Mann's class, “Interactive Music,” in which we were challenged to imagine and perform a completely novel musical score – namely, one that did not use traditional music notation. To that end, I conceived of this dataset as my musical score.

Each column of data points corresponds to different audio events within a browser-based sketch. I use the Tone.js and Moment.js libraries to generate the audio and time it out. Mozart's “Requiem” provides the basis for the chord progression. The audio events occur in concert with a p5 animation over a map of the US, with the shootings visualized as ellipses sized according to casualty counts. Users may hover their mouse over each ellipse to see a detailed description of the shooting as provided by Mother Jones.

After 3 minutes of continuous play, a button with the words “Stop this” begins to fade in. The song and animation will loop infinitely unless the user clicks the button, which leads them to a 5calls.org page about anti-gun-violence advocacy. Like the shootings in real life, nothing will change unless those who are able take action. This feature connects the function of the piece to its conceptual core: what is the point of aestheticizing data this fraught? Can there be a tangible connection between aesthetics and action? How can music be used to make subjects that might otherwise be paralyzing and overwhelming accessible and knowable? Put another way: how can music move us?
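The description names Tone.js, so here is a rough sketch of the scheduling idea using Tone.js (assuming a recent version with `toDestination()`) and native Date in place of Moment.js for brevity. The field names and the note mapping are hypothetical stand-ins for the actual Mother Jones columns and the piece's rules.

```javascript
// Hypothetical data-to-audio mapping: each row of the dataset becomes a
// note scheduled on the Tone.js transport, with the date deciding when it
// plays and the casualty count shaping the pitch.
// (Browsers require a user gesture before audio can start.)

const synth = new Tone.Synth().toDestination();

// One row per shooting; in the piece each column drives a different audio event.
const rows = [
  { date: '1984-07-18', fatalities: 21 },
  { date: '1999-04-20', fatalities: 13 },
  // ...the full dataset
];

// Compress 35 years into a few minutes of transport time.
const start = new Date('1982-01-01').getTime();
const end = new Date('2017-12-31').getTime();
const DURATION = 180; // seconds of playback

rows.forEach((row) => {
  const t = ((new Date(row.date).getTime() - start) / (end - start)) * DURATION;
  // Higher casualty counts play lower notes (illustrative mapping).
  const midi = 72 - Math.min(row.fatalities, 24);
  Tone.Transport.schedule((time) => {
    synth.triggerAttackRelease(Tone.Frequency(midi, 'midi'), '8n', time);
  }, t);
});

Tone.Transport.start();
```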

Classes

Interactive Music

Live in Concert

Aidan Nelson

A virtual reality experience in which you've got to show up and make the best of a weird situation.

https://www.aidanjnelson.com/blog/tpos/live-in-concert/

Description

This virtual reality experience thrusts the player into a situation for which they are wholly unprepared. Based in equal parts on 'gig' jobs in the NYC arts scene and their accompanying anxiety dreams, this should be a 'good learning experience' and 'paid in cash.' The only way you can fail is if you don't show up!

Built in the Unity3D game engine using the SteamVR toolkit.

The experience lasts about 5 minutes.

Classes

The Poetics of Space