Dan O’Sullivan – ITP / IMA Winter Show 2018
A Show for the Recently Possible.

UnBlue
/shows/winter2018/unblue/

Tsimafei Lobiak

A relaxing musical experience with generative art

https://editor.p5js.org/Luxone/full/Hk1riG03m

Main Project Image

Description

UnBlue is a relaxing experience for those who want to take a break and enjoy generative patterns. The name UnBlue comes from the idea that this experience can hopefully improve your ‘blue’ mood; it also comes from a deep love of blue colors. The project is intended as a single-person experience using headphones and a computer. It consists of a main screen, three chapters showing different interactive patterns, and two transitions (one at the beginning and one at the end). Each transition is accompanied by lyrics to inspire or relax the listener, and each chapter has its own musical composition and interactions. The first chapter, Fossil, features a spiral made of distorted ellipses that creates the illusion of a 3-D object. The second chapter, Seed, is a drawing tool over moving rectangular tiles. The last chapter, Atom, is a complex particle system controlled with mouse clicks. The work is mostly complete (you can test it out yourself), but there are plans to add more chapters and interactions before the show.
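A minimal p5.js sketch of the kind of distorted-ellipse spiral the Fossil chapter describes (the counts, colors, and noise parameters below are illustrative assumptions, not the project's actual code):

```javascript
// Illustrative only: a spiral of distorted ellipses, loosely in the
// spirit of the Fossil chapter. Every constant here is an assumption.
function setup() {
  createCanvas(600, 600);
  noFill();
  stroke(80, 120, 255, 120); // translucent blue
}

function draw() {
  background(5, 10, 30);
  translate(width / 2, height / 2);
  const t = frameCount * 0.01;
  for (let i = 0; i < 120; i++) {
    const wobble = 20 * noise(i * 0.1, t); // Perlin-noise distortion
    push();
    rotate(i * 0.25 + t);                  // walk around the spiral
    ellipse(i * 2.2, 0, 60 + wobble, 30 - wobble * 0.5);
    pop();
  }
}
```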

Classes

Introduction to Computational Media

Jinxed Blinks
/shows/winter2018/jinxed-blinks/

Zoe Wells

A painting that tells a looping story by changing every time the viewer blinks.

https://itp.nyu.edu/classes/cc-f18-tth/final-project-progress-2/

Main Project Image

Description

This project tells the story of the girl Solveg from the Roald Dahl book “The Witches”. In the story, Solveg is cursed by a witch to be stuck inside a painting of a farmhouse that her family owns. No one ever sees the painting move, but whenever they look at it, she has changed position slightly, aging and eventually disappearing as the years go by. I was also inspired by “La Jetée”, a film that takes the form of a slideshow of still images, and “In the Blink of an Eye”, a book by Walter Murch that compares cuts in editing to natural blinking patterns.

In my project, I use a facial recognition library in Python to watch for when the viewer’s eye aspect ratio (EAR) drops below a certain threshold for two consecutive frames. To improve blink detection accuracy, I added a feature that lets the user calibrate the blink threshold to their own EAR by pressing the spacebar, which they are prompted to do when they approach the project. Each blink triggers the image to change to the next still in the story, which shows the girl aging and changing position in the painting as her environment ages with her. For the final version, I plan to have 15 images in an oil-painted style for the full loop of the girl growing old and disappearing, since the average person blinks around 15 times a minute.
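The detection itself runs in Python; purely to illustrate the eye aspect ratio and the two-frame threshold described above, here is a small JavaScript sketch (the landmark input, the 0.2 default threshold, and the advanceToNextStill helper are all assumptions):

```javascript
// Eye aspect ratio (EAR) over six eye landmarks p1..p6:
// EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
// Shown in JavaScript only for illustration; the project computes this
// in Python with a facial-landmark library.
function dist2d(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function eyeAspectRatio(p) {
  // p: array of six {x, y} landmarks around one eye
  return (dist2d(p[1], p[5]) + dist2d(p[2], p[4])) / (2 * dist2d(p[0], p[3]));
}

let threshold = 0.2;   // replaced by the spacebar calibration step
let framesBelow = 0;

function onFrame(eyeLandmarks) {
  const ear = eyeAspectRatio(eyeLandmarks);
  framesBelow = ear < threshold ? framesBelow + 1 : 0;
  if (framesBelow === 2) advanceToNextStill(); // hypothetical helper: show the next painting still
}
```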

My ideal setup for the project would be a monitor with a frame around it to complete the “painting” effect. My laptop, with the camera for facial recognition, would be placed below the monitor. I also plan to have a soundscape of nature noises coming from the painting, with a sound effect when a change happens.

Classes

Comm Lab: Hypercinema, Creative Computing

The Space Well
/shows/winter2018/the-space-well/

Haozhong Yao, Robert Ye

An experience of floating and diving into deep space through a well.


Description

Concept:
The universe is a mysterious well of unknown depth, and traveling through space can feel like falling into an abyss. We want to weave gravity from three different dimensions (one in the well, one from the ground the user stands on, and one in outer space) into our project. Feeling the earth’s gravity while experiencing a gravity-less space environment, the user may find a spirit of exploration in the process of falling into the well.

User Experience:
When you stand by the well and look into the water, your silhouette, made of stars, appears in the center of the well.
If you raise your arms parallel to the ground, your projection begins to fall into deep space as the stars zoom in and scatter with acceleration, creating a sense of space travel. When you reach a certain falling speed, you see the big bang.
If you swing your right hand very hard, your projection spins and flashes.
If you hold your arms just a little away from your torso, the projection rotates slightly to create a sense of floating.
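A minimal sketch of this pose-to-state mapping, assuming wrist and shoulder positions (in screen coordinates, y increasing downward) are already available from the Kinect pipeline; the joint names and thresholds are hypothetical:

```javascript
// Hypothetical joint data: {x, y} points from the Kinect skeleton,
// y increasing downward. All thresholds are illustrative only.
function classifyPose(leftWrist, rightWrist, leftShoulder, rightShoulder) {
  const wristsAtShoulderHeight =
    Math.abs(leftWrist.y - leftShoulder.y) < 30 &&
    Math.abs(rightWrist.y - rightShoulder.y) < 30;
  const armsSlightlyOut =
    Math.abs(leftWrist.x - leftShoulder.x) > 40 ||
    Math.abs(rightWrist.x - rightShoulder.x) > 40;

  if (wristsAtShoulderHeight) return 'falling';  // arms parallel to the ground
  if (armsSlightlyOut) return 'floating';        // arms held away from the torso
  return 'standing';
}

// Each frame, the projection picks its behavior from the detected pose:
// 'falling' accelerates the star field, 'floating' rotates the silhouette.
```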

Setup:
The project is a 34-inch-wide round inflatable pool, covered with a black waterproof military canvas and filled with 5 gallons of water.
A Kinect camera 8 ft away captures data about the user’s body.
A projector hanging from the ceiling projects the animation of stars in outer space, together with the user’s figure, down onto the water.

Classes

Comm Lab: Hypercinema, Creative Computing

Business Fan
/shows/winter2018/business-fan/

Joseph Baker

Bleeding the physical and the digital worlds together with an iconic folding fan.

https://

Description

Tokyo, Japan

As I was waiting for the train on a hot summer day, I saw a businessman pull out a discreetly crafted folding fan. After stepping into the subway, he placed it in his front blazer pocket as the air conditioning took over. This memory is tied to a country where I let my happiness thrive in the face of unfamiliarity.

The allure is finding a purpose in it all. When time is spent questioning what someone wants to do, the physical world starts to peel back little by little, like the folds of a fan, tucked out of view but not forgotten.

The fan’s projection evokes what it feels like to open yourself to the world, incorporating depth and bringing in another dimension. A single wave of the fan has rippling effects, even if they are not in the physical world.

Classes

Introduction to Computational Media, Introduction to Physical Computing

Peace Bomb
/shows/winter2018/peace-bomb/

Alizarin Waissberg

An interactive experience aimed at cheering the world up with an explosion of positivity

https://editor.p5js.org/AlizarinZ/full/SkCF_3gJV

Main Project Image

Description

Peace Bomb is a simple, one-click interactive experience. Instant and surprising, lasting no longer than 30 seconds, the aftershock of Peace Bomb shall shake you to the core long after you leave the scene. Loaded with invasive audio and armed with visual shrapnel, the target of the Peace Bomb is to violently bring you joy.

Classes

Introduction to Computational Media

Vo-5ynth
/shows/winter2018/vo-5ynth/

Tushar Goyal

A generative music instrument that makes music algorithmically from the user’s voice

https://wp.nyu.edu/tushargoyal/2018/12/05/vo-5ynth/

Main Project Image

Description

Vo-5ynth, pronounced “vo-synth” (short for “voice-synthesizer in p5”), is a music instrument, but not an ordinary one. First, it doesn’t have a sound of its own; instead, it uses the user’s voice to make music. Second, it’s a generative music instrument, so it doesn’t give the user complete control over the music-making process; instead, it encourages the user to collaborate (in a way) with the instrument (“the machine”) to make music. The user defines the pool of notes from which music can be generated, but the actual output (notes for the bass, melody, and harmony) is generated algorithmically by the instrument. The user can change the selected notes in real time to change how the music sounds, and can thus play Vo-5ynth live as an instrument, use it as an ideation tool, or just play with it for fun.
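A minimal p5.js/p5.sound sketch of the core idea, generating notes algorithmically from a user-editable pool (the note pool, timing, and single-oscillator voicing are illustrative assumptions; the actual instrument derives its sound from the user's voice and generates separate bass, melody, and harmony lines):

```javascript
// Illustrative only: pick notes at random from a user-defined pool and
// play them with one oscillator. The real instrument is voice-driven and
// generates separate bass, melody, and harmony lines.
let pool = [60, 62, 65, 67, 70]; // MIDI notes the user has selected
let osc;

function setup() {
  createCanvas(400, 200);
  osc = new p5.Oscillator('sine');
  osc.start();
  osc.amp(0);
}

function draw() {
  if (pool.length > 0 && frameCount % 30 === 0) { // roughly twice per second
    const note = random(pool);                     // algorithmic note choice
    osc.freq(midiToFreq(note));
    osc.amp(0.2, 0.05);                            // quick fade in
    osc.amp(0, 0.3, 0.15);                         // then fade back out
  }
}

function keyPressed() {
  // Stand-in for real-time pool editing: keys 1-5 toggle notes in and out.
  const scale = [60, 62, 65, 67, 70];
  const idx = Number(key) - 1;
  if (idx >= 0 && idx < scale.length) {
    const n = scale[idx];
    pool = pool.includes(n) ? pool.filter(x => x !== n) : [...pool, n];
  }
}
```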

Classes

Intro to Fabrication, Introduction to Computational Media, Introduction to Physical Computing, The Code of Music

Pandemonium
/shows/winter2018/pandemonium/

Dingwen Kou, Ruyi Chen

Our oblivious sonic footprint can cause acoustic trauma or even death to underwater life.

https://wp.nyu.edu/rc3989/2018/12/05/whats-under/

Main Project Image

Description

“We now know that the underwater world is anything but silent. In fact, today's researchers are concerned that underwater noise produced by humans is distracting, confusing – and even killing – aquatic animals.”

Based on prior and current research, noise pollution caused by human activities in the ocean (underwater explosions, ship and boat noise, construction, etc.) can and does severely disrupt whales’ behavior, including their communication and their ability to find food, find mates, or avoid predators, threatening their lives or even directly causing death. We therefore decided to use the microphone as the primary input and interaction in this project, to simulate how underwater acoustic trauma is produced by humans and how strongly it affects marine species, in this case whales.

When the sound picked up by the microphone hits a certain level, the background color changes from black to a bloody red, and the choppiness of the ocean grows with the volume of the noise. The whales are heavily disturbed and lose their peaceful, orderly swimming orbits, which are likewise affected by the volume of the noise.
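A minimal p5.js sketch of that microphone-to-color mapping (the level thresholds and the red color are assumptions; the wave and whale behavior is only noted in a comment):

```javascript
// Illustrative mapping from microphone level to background color,
// roughly as described above. Thresholds and colors are assumptions.
let mic;

function setup() {
  createCanvas(640, 360);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  const level = mic.getLevel();                 // 0.0 (silence) to ~1.0
  const intensity = constrain(map(level, 0.05, 0.4, 0, 1), 0, 1);
  const bg = lerpColor(color(0), color(150, 10, 10), intensity);
  background(bg);
  // A fuller sketch would also scale the ocean's choppiness and perturb
  // the whales' swimming paths by the same intensity value.
}
```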

When the environment calms down again, everything fortunately, yet ironically, returns to the serene state the piece starts with; in reality, the trauma caused usually takes decades to ease and heal, and the deaths are irreversible.

Primarily, this project acts as an educational and cautionary piece for every single one of us, even those who think they are not doing anything harmful.

These ocean threats, however, are also human threats.

As Cousteau said, “For most of history, man has had to fight nature to survive; in this century he is beginning to realize that, in order to survive, he must protect it.”

Classes

Creative Computing

sensorship
/shows/winter2018/sensorship/

Helen Hutchens, Xiaoshi Liang

Sensorship explores what happens when censorship is no longer an omission but a transmutation, by challenging two people—whether near-strangers or old friends—to hop in and figure out how to convey a message when the earpiece is working against them.

https://carrefinho.com/sensorship/

Main Project Image

Description

Our relationships with those around us are increasingly carried out on screen. As digital communication grows while face-to-face interaction shrinks, it raises the question: what happens when what we mean to say is not faithfully transmitted to the person we’re communicating with?

The censorship of thought, language, and expression has, of course, been carried out in past ages, in our current times, and likely in future eras. Yet most of it has come about through blockage—a hand obstructing a printing press, a story cut from a manuscript, a forced deletion on social media…

What happens, though, when your message comes through? We trust that our machines will relay our messages in good faith. But what happens when your message is altered, warped, and effectively pre-packaged for its recipient? What can you do when the medium of communication itself believes in 'fake news'?

Users assume different roles: one person is a scientist who is a member of the UN’s Intergovernmental Panel on Climate Change (IPCC), and the other an emissary carrying the research to the world. They have to try to communicate—somewhat ironically, by speaking into two analog phones—through a system that twists the terms that matter most (e.g. “global warming”), replacing them with other terms and thus other meanings. Using a speech-to-text-to-speech process and the RiTa language library, both parties can hold a conversation, albeit a tampered-with one. The emissary’s task is to receive the correct message from the researcher, so they will have to beat the system in creative ways.
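A minimal sketch of the speech-to-text-to-speech substitution step, using the browser Web Speech API as a stand-in (the project itself uses the RiTa language library for its replacements; the substitution map and helper below are made-up examples):

```javascript
// Illustrative only: transcribe speech, swap flagged terms, and speak the
// altered result. The substitution map is an assumption; the project uses
// the RiTa library for richer replacements.
const substitutions = {
  'global warming': 'seasonal variation',
  'climate change': 'weather',
};

function censor(text) {
  let out = text.toLowerCase();
  for (const [term, replacement] of Object.entries(substitutions)) {
    out = out.split(term).join(replacement);
  }
  return out;
}

const recognition = new webkitSpeechRecognition(); // Chrome-only speech-to-text
recognition.continuous = true;

recognition.onresult = (event) => {
  const heard = event.results[event.results.length - 1][0].transcript;
  const altered = censor(heard);
  // Speak the altered message into the other analog phone's earpiece.
  speechSynthesis.speak(new SpeechSynthesisUtterance(altered));
};

recognition.start();
```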

Classes

Creative Computing

The invisible bird
/shows/winter2018/the-invisible-bird/

Tanic Nakpresha, Yuguang Zhang

How long have we been trapped in these cages that we built?

https://www.nicktanic.com/blog/2018/12/5/itp-winter-show-2018

Main Project Image

Description

We think we are the owners of our thoughts. But the truth is, our thoughts are unconsciously influenced by the social groups with which we share a sense of unity.

The invisible bird is an interactive sculpture that uses a computer algorithm to analyze various topics on Twitter and lets the viewer guess what the public’s attitudes toward those topics are. In this way, we hope people realize that there are invisible cages of thought we live inside, and how long we’ve been trapped in these cages that we built.

Classes

Introduction to Computational Media, Introduction to Physical Computing

Plant Boye
/shows/winter2018/plant-boye/

Faith Zeng

Experience the satisfaction of growing your own plant friend without dealing with the real responsibilities and risks!

http://itp.nyu.edu/classes/cc-f18-tth/plant-boyes-in-progress/

Main Project Image

Description

Using sensors on an Arduino, users can pick, name, and grow a plant on the computer. They interact with each sensor, going through the motions of planting, such as patting down soil or watering the plant. Each physical action causes the virtual plant to grow a bit more. At the end, the user can take a “picture” (screenshot) of their plant.
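A minimal sketch of the growth logic on the computer side (how sensor readings actually arrive from the Arduino, e.g. over serial, is not shown; the onSensorEvent helper, the growth stages, and the 's' key shortcut are assumptions):

```javascript
// Illustrative growth-stage logic. onSensorEvent stands in for however
// sensor readings reach the sketch from the Arduino.
const stages = ['seed', 'sprout', 'seedling', 'budding', 'blooming'];
let stage = 0;

function onSensorEvent(sensorName) {
  // e.g. 'soilPatted' or 'watered' - hypothetical sensor names
  if (stage < stages.length - 1) stage++;
}

function setup() {
  createCanvas(400, 400);
  textAlign(CENTER, CENTER);
  textSize(24);
}

function draw() {
  background(230, 245, 230);
  text(stages[stage], width / 2, height / 2); // a real sketch would draw the plant
}

function keyPressed() {
  if (key === 's') saveCanvas('plant-boye', 'png'); // the "picture" at the end
}
```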

Classes

Creative Computing
