Game of life – Eco

Sid Chou

A cellular automaton for the ecosystem.

https://sidchoublog.wordpress.com/2018/11/13/icm-week-9/

Description

Unlike the well-known Conway's Game of Life, the cellular automaton I'm creating models an entire ecosystem. On the canvas, the following categories of species are modeled: vegetation, herbivores, and carnivores, in addition to a landscape element, the river.

A new world, with a river and vegetation, is generated every time the game starts. The player plays as a "God," placing herbivore and carnivore species on the field. After some time, "humans" are introduced into the world and begin consuming the resources. When a cell's resources run out, it turns gray and enters a permanent dead state; when all the resources are used up, it is game over.

The player can strategically place carnivores on the canvas to fight the humans; however, carnivores consume herbivores, and herbivores consume vegetation, so they use up resources of their own. The goal is to find a good balance that sustains the game as long as possible without using up the resources.

The game will most likely end with all the resources used up; it is intended to be almost impossible to achieve equilibrium, which is meant to make the player consider our impact on the environment.
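As a rough illustration of the kind of rules described above, here is a purely hypothetical sketch of one update step (the state names, neighborhood, and eating rules are my assumptions, not the project's actual code):

```python
# Hypothetical sketch of one ecosystem update step on a wrapping grid.
# Cell states (illustrative): EMPTY, VEGETATION, HERBIVORE, CARNIVORE, DEAD.
EMPTY, VEGETATION, HERBIVORE, CARNIVORE, DEAD = range(5)

def neighbors(grid, r, c):
    """Return the states of the 8 cells surrounding (r, c), wrapping at edges."""
    rows, cols = len(grid), len(grid[0])
    return [grid[(r + dr) % rows][(c + dc) % cols]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)]

def step(grid):
    """One generation: herbivores eat adjacent vegetation,
    carnivores eat adjacent herbivores; consumed cells become empty."""
    new = [row[:] for row in grid]
    for r in range(len(grid)):
        for c in range(len(grid[0])):
            state, near = grid[r][c], neighbors(grid, r, c)
            if state == VEGETATION and HERBIVORE in near:
                new[r][c] = EMPTY   # eaten by a neighboring herbivore
            elif state == HERBIVORE and CARNIVORE in near:
                new[r][c] = EMPTY   # eaten by a neighboring carnivore
    return new
```

A real version would also track per-cell resource levels so that exhausted cells can enter the permanent gray DEAD state.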

Classes

Introduction to Computational Media

Reflected Landscapes

Dana Elkis, Matthew Ross

An interactive sound and light sculpture that is driven by the principles of echo, feedback and collision.

https://pinkeey9.wixsite.com/danaelkisblog/blog/pcompfinal-process-documentation

Description

Our inspiration and goal for this project are to create an effective metaphor and medium for the term echo as it relates to music. In music, echo can be defined as a "repetition or mimicking of a certain passage, usually with less force and volume than the original statement" (https://musicterms.artopium.com/e/Echo.htm). Our goal is to give people who aren't necessarily musicians the ability to understand and engage with the principles of echo.

We have implemented this concept by building a 3 x 2 ft box with a diffused window overlooking a large (700-pixel) LED matrix, with 5-7 sliders positioned on the x and y axes of the box. As users move the sliders, they "create echoes" that begin spreading through the LED matrix and emit sound as they move and grow. If users move the sliders in such a way that the echoes "collide," they are absorbed into each other's feedback loops, creating a new, larger sound in the network and introducing more complexity in the visual and audio feedback. Multiple people can interact with our project at once, and our goal is to create a novel experience as opposed to an instrument that one can "master".
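One way to picture the collision behavior described above is as expanding rings that merge when they overlap. This is only an illustrative sketch under my own assumptions (class names, decay rate, and merge rule are invented), not the installation's actual code:

```python
import math

class Echo:
    """A hypothetical echo: a ring that spreads outward each frame
    and loses intensity, like a repetition 'with less force'."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.radius = 0.0
        self.intensity = 1.0

    def update(self):
        self.radius += 1.0        # the ring spreads across the matrix
        self.intensity *= 0.95    # each repetition is quieter

def collides(a, b):
    """Two echoes collide when their rings touch or overlap."""
    return math.hypot(a.x - b.x, a.y - b.y) <= a.radius + b.radius

def merge(a, b):
    """Absorb two colliding echoes into one larger echo in the network."""
    m = Echo((a.x + b.x) / 2, (a.y + b.y) / 2)
    m.radius = max(a.radius, b.radius)
    m.intensity = min(1.0, a.intensity + b.intensity)
    return m
```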

Echo is a powerful and compelling effect, one used universally in music and easily understood by a wide audience. As opposed to using echo purely as an effect, we want to apply it as our driving artistic statement and, in doing so, give people the opportunity to easily participate in an engaging and playful musical experience. We are purposefully foregoing detailed instructions on how to use our creation, as our goal is to get people to explore all the possibilities within our system.

Classes

Intro to Fabrication, Introduction to Physical Computing

Cosmic Harp

Sid Chou, Louise Lessel

Inspired by an armillary sphere, two concentric circles make up this laser harp instrument and allow the user to play sounds driven by satellite data.

http://www.louiselessel.com

Description

The Cosmic Harp is an instrument that allows the user to play abstractions of satellite data. The harp's design is inspired by the armillary sphere used in astronomy in the 16th century.

Designed as a futuristic-looking midi instrument, and using lasers instead of strings, the harp plays with notions of the new and the old. It is equipped with an accelerometer to measure the rotation of the user’s interaction with the instrument, and thus inspires a new investigation of how to play a harp.

The inner circle rotates and creates patterns in the sound when swung around. The outer circle allows the user to play more traditional harp strings.

By Louise Lessél and Sid Chou

Classes

Introduction to Physical Computing

The invisible bird

Tanic Nakpresha, Yuguang Zhang

How long have we been trapped in these cages that we built?

https://www.nicktanic.com/blog/2018/12/5/itp-winter-show-2018

Description

We think we are the owners of our thoughts. But the truth is, our thoughts have been unconsciously influenced by the social groups with which we share a sense of unity.

The invisible bird is an interactive sculpture that uses a computer algorithm to analyze various topics on Twitter and lets the viewer guess what the public's attitudes toward those topics are. In this way, we hope people realize that there are invisible cages of thought that we're living in, and how long we've been trapped in these cages that we built.

Classes

Introduction to Computational Media, Introduction to Physical Computing

Mine and Yours

Rui An

I'll let users share the attention I had when I was shooting some of my favorite videos on my phone.

https://wp.nyu.edu/ra2901/2018/12/05/mine-and-yours/

Description

I instantly became fascinated by James Benning's work when my roommate introduced his films to me. The small, mundane views of the world become interesting, even provocative, when we are forced to pay attention to them. The roaring cars, the empty crossroads, and the random bushes scattered alongside the fence all come together and start to make sense to the viewer, because our attention ultimately originates from our wish to understand somebody or something. In this project, I want to push people further, to try to understand not only the videos but also me, the person who shot them: to share the same attention I had when I was shooting the videos.

In the project, a user first chooses a video to play on a phone attached to a robotic arm, using only their eye gaze. (Touch will not trigger anything; as long as the user is looking at the screen, a bubble that follows their gaze appears, hinting that the device is controlled with the eyes.) The robotic arm then adjusts the phone to roughly the same position and orientation as the phone I used when shooting that video. Users are then prompted to look at the same point I looked at (where my attention was) to play the video; if they move their gaze away from that point, the video stops and prompts them to look back. After the video finishes, the user can choose to watch another video.

Classes

Code to Racist Technologies

Winnie Yoe

"Code to Racist Technologies" is a project about implicit racial bias and colorism, and a subversive argument against technologies developed without thoughtful considerations of implications.

https://www.winnieyoe.com/icm/racist-webcam

Description

“Code to Racist Technologies” is a project about implicit racial bias and colorism. It is also a subversive argument against technologies developed without thoughtful considerations of implications. The project is inspired by Ekene Ijeoma’s Ethnic Filter, Joy Buolamwini’s research “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification” and Dr. Safiya Noble’s talk “Algorithms of Oppression: How Search Engines Reinforce Racism” at Data & Society.

As machine learning is increasingly infiltrating aspects of our society, it is important to recognize how our biases, in this case, implicit racial bias, are translated to technologies we design, and how that could lead to drastic and often detrimental effects.

The project, in the form of an interface experience, is two-fold. The user first completes an implicit bias association test, which I partially adapted from a test developed by researchers at Harvard, UVA, and the University of Washington. Once it is completed, the user enters section two, where they find out their probability of developing a racist technology and how this technology would affect people of different skin tones. What the user does not know is that none of their test results are recorded. In fact, their result always shows that they prefer light skin relative to dark skin, with a random percentage from 80-90% that they will develop a racist technology.

Initially, the project consisted only of section two, a "racist webcam" that isolates skin tone from the background, determines where the skin tone falls on the Fitzpatrick Skin Type Classification, then maps that value to a degree of pixelation in the user's webcam capture. The darker your skin tone is, the more pixelated your video appears, and the less visible you are. The program is laid out so that each time it runs, the user's video capture is shown alongside eight pre-recorded videos. The juxtaposition of the user's experience with the other participants' heightens the visual metaphor of the effects of racial bias on one's visibility and voice.
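The mapping described above could be sketched roughly as follows. The lightness thresholds and block sizes here are illustrative assumptions of mine, not the project's actual values, and the real piece runs in the browser rather than in Python:

```python
def fitzpatrick_type(lightness):
    """Map a skin-lightness value (0 = darkest, 100 = lightest)
    to a rough Fitzpatrick type I-VI. Thresholds are illustrative."""
    thresholds = [(80, 1), (70, 2), (60, 3), (45, 4), (30, 5)]
    for cutoff, ftype in thresholds:
        if lightness >= cutoff:
            return ftype
    return 6

def pixelation_block_size(lightness, min_block=2, max_block=24):
    """Darker skin tones get larger pixelation blocks,
    i.e. a more degraded, less visible image."""
    ftype = fitzpatrick_type(lightness)
    # Type I maps to the smallest blocks, Type VI to the largest.
    return min_block + (ftype - 1) * (max_block - min_block) // 5
```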

My goal for this project, and the reason I added section one, is that I hope users will realize that all of us are biased, and that only with very conscious awareness will we stop creating racist technologies that bring detrimental effects.

What is the code to racist technologies? The code is you and me, especially the you and me who assume because we are “liberal” and “progressive”, we are not or will not be a part of the problem.

Classes

Introduction to Computational Media

Soft Steen

Rachel Lim

A soft, small scale interactive reproduction of Jan Steen's painting, "As the Old Sing, So Pipe the Young"

https://

Description

This project was inspired by Jan Steen's dynamic and humorous paintings and my desire to share my excitement about his works. The painting my project references, "As the Old Sing, So Pipe the Young", depicts a family raucously celebrating a baby's baptism and is punctuated with moralizing symbols and gestures, implying that Steen meant to warn the viewer rather than invite them to copy the family's behavior.

As a small (the size of printer paper), simplified soft reproduction, it is intended to serve as a gentle and friendly introduction to Steen's work and 17th-century Dutch paintings, which are abundant in historical motifs and symbolism. It will not be extremely detailed, as I feel accuracy may be intimidating or too reminiscent of actual artworks at a museum. I'd like to keep it loose (the fabrics in the video are rough outlines!) and simple to keep it playful and create a more inviting appearance.

The symbols in this particular painting (a dog, a hat, a pipe, and a pot) are removed and act almost like puzzle pieces, which the user must correctly place within the painting. Once a piece is placed, supplementary text and audio on an iPad next to the work explain its significance and provide context. The setup is intended to mimic the way artworks are hung at museums and galleries, with the work and informative text beside it. When all the pieces are put together, the figure at the center of the painting raises her wine goblet and congratulatory text appears on the iPad screen to signify that the painting has been completed.

Classes

Introduction to Physical Computing

Conducting Paintings

Chenyu Sun, Nianqi Zhang, Yunze Shi

Conducting the painting — an integrated interactive way to appreciate paintings

https://wp.nyu.edu/nianqi/2018/12/05/pcomp-finalworking-in-progress/

Description

In this project, we want to offer a different approach to appreciate landscape paintings.

We will create a sound interface that lets users interact with a Chinese landscape painting and a landscape oil painting through a baton.

The baton will trigger sounds reflecting the elements in the paintings. The music will also change based on the movement of the baton.

On the music shelf, users can choose between two pages of music notes. When a page is selected, users can interact with the painting that the page reflects.

Classes

Introduction to Physical Computing

Plant Boye

Faith Zeng

Experience the satisfaction of growing your own plant friend without dealing with the real responsibilities and risks!

http://itp.nyu.edu/classes/cc-f18-tth/plant-boyes-in-progress/

Description

Using sensors on the Arduino, the users will be able to pick, name, and grow a plant on the computer. The users will interact with each sensor, going through the motions of planting such as patting down soil or watering the plant. Each physical change will cause the virtual plant to grow a bit more. At the end, the user will be able to take a “picture” (screenshot) of their plant.
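The sensor-to-growth loop described above might look something like this. The event names and growth thresholds are hypothetical placeholders; the actual project reads physical sensors from an Arduino:

```python
# Hypothetical sketch: each physical interaction nudges the
# virtual plant toward its next growth stage.
GROWTH_STAGES = ["seed", "sprout", "seedling", "young plant", "full plant"]

class VirtualPlant:
    def __init__(self, name):
        self.name = name
        self.growth = 0  # accumulated care points

    def handle_sensor(self, event):
        """Map a sensor event (e.g. parsed from serial input) to growth."""
        points = {"pat_soil": 1, "water": 2, "sunlight": 1}
        self.growth += points.get(event, 0)

    @property
    def stage(self):
        index = min(self.growth // 2, len(GROWTH_STAGES) - 1)
        return GROWTH_STAGES[index]
```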

Classes

Creative Computing

Household Tongs

Tong Wu

Could human bodies become household goods?

http://tongwumedia.com/blog/performative-avatars-final

Description

The products of technology, such as household goods, have developed far beyond the mere role of everyday tools; they have such a significant existence that they have transcended their natural form. Their meanings in our lives have expanded into something more complex. You could say humans create objects, and objects re-create humans.

I am deeply obsessed with the vague boundary between humans and objects in the modern context. Are humans objects? Could the human body become an object? Inspired by this idea, I made a series of tiny AR installations, in which I turned my avatar into a fan, a spiral bone, and a set of headphones. What I want to explore through this project is this: if we deprive ourselves of the uniqueness of being "us," human beings, freeing our body parts to be duplicated, cut off, folded, twisted, enlarged, and shrunk, adapting our bodies into inanimate objects and using them to recreate daily-life contexts, would the concept of being human change, or would it just better depict the identity of "human" in this era of tech mania?

Classes

Performative Avatars