Category Archives: Final Project

Our Lives as Snakes

What and Why

For the final project, I made a simple snake game that records a replay of each game. The replays of everyone who has played can then be watched at the same time, and the interactions between the snakes become very apparent. The point of the project is to illustrate the way we interact with the people in our lives, whether they are complete strangers, simple acquaintances, or close friends. Here is the message that is displayed after a player finishes a game:

“That seemed like a pretty basic snake game, and up until this point it was. But a recording of the game you just played was saved and can be viewed alongside all the other people who played this game. Your snake might have been in the same spot at the same time that another’s was, but there was no way for you to know until you watch all of these recordings in parallel.

Living in New York City, I pass by hundreds of people just on my way to class and I never interact with any of them. It’s interesting to think about where they might have come from and where they might be headed. In half an hour I’d be sitting in a lecture I’m probably not paying attention to and they could be on a train to Boston. But for that brief moment in time while we waited at that stop light, our lives intersected. Maybe we’ve stood at that same spot together multiple times before, or maybe that was the first time we had ever “met”. Maybe those few seconds of our lives would be the only few seconds we will ever spend in the same spot. The point is, we tend not to think about these things. We tend to focus on our own lives and our own goals. While playing this game, you happened to follow a very similar path as someone else who wandered to this site.”

After this message is displayed, the player can watch their own recording alongside the replay that most closely matches it. This way, the player can see how they were running alongside someone else at the same time without noticing it while playing alone. Once the replay is finished, another message is displayed to the user:

“As you may have seen, there were points where the two snakes intersected, and there were points where they ran in parallel. I think it’s an interesting illustration of how we interact with people, whether they be complete strangers, colleagues, or close friends. We each have our own goals, but that doesn’t mean we should block out everyone else’s.”

Players can also watch everyone’s replays at the same time here:  http://snakes.kzhang.io/replays

How

To implement this, I used Node.js on the backend to serve the static HTML files and to store all of the replays in a JSON file. I also used socket.io for real-time functionality, such as new replays appearing at /replays the instant someone finishes a game. I used p5.js to draw the game board, snakes, and food. A “snake” is a simple array of coordinates, from which I draw a series of ellipses that become progressively more transparent toward the tail, so the snake appears to fade out.
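
In simplified form, the drawing code looks something like this (the sizes and colors here are placeholders, not the exact values from the real sketch):

// snake is an array of {x, y} segments, with the head last
function drawSnake(snake) {
  noStroke();
  for (let i = 0; i < snake.length; i++) {
    // segments closer to the tail get a lower alpha, so the snake fades out
    const alpha = map(i, 0, max(snake.length - 1, 1), 30, 255);
    fill(0, 200, 100, alpha);
    ellipse(snake[i].x, snake[i].y, 16, 16);
  }
}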

To implement replays, I store all of the information needed to recreate a game in a JavaScript object.
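
Roughly, the object looks like this (the field names here are illustrative rather than the exact ones in the repo):

const replay = {
  id: 'a1b2c3',                  // identifies this play-through
  frames: [
    {
      snake: [{ x: 120, y: 80 }, { x: 130, y: 80 }], // the snake's segments this frame
      food: { x: 200, y: 150 }                       // where the food was this frame
    }
    // ...one entry per frame of the game
  ]
};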

This object is stored on the server in memory and also in a JSON file that is parsed when the server starts up. This way, if the server ever crashes, the replays are still saved. To display the replays, I simply step through each frame of a replay and draw the snake and the food at that point in time. This happens for all of the replays at once, so they can all be viewed in parallel.
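
The persistence part only takes a few lines of Node.js; roughly (the file name here is a placeholder):

const fs = require('fs');

const REPLAY_FILE = 'replays.json';

// load any previously saved replays when the server starts
let replays = [];
if (fs.existsSync(REPLAY_FILE)) {
  replays = JSON.parse(fs.readFileSync(REPLAY_FILE, 'utf8'));
}

// keep the file in sync whenever someone finishes a game
function saveReplay(replay) {
  replays.push(replay);
  fs.writeFileSync(REPLAY_FILE, JSON.stringify(replays));
}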

The code can be found on my GitHub: https://github.com/Kinzeng/socket-snakes

Demo

To see all of the replays currently stored on the server, simply click this link: http://snakes.kzhang.io/replays

If you want to try it out, go to http://snakes.kzhang.io and hit space!

Social Weight Instagram Machine

For my final project I set out to create a machine that puts Instagram notifications in the physical world. I initially wanted to work with the Instagram API so that I could simply type in a username and pull the notification data from that user’s most recent post (this should have been possible given that it is public content, but apparently Facebook hates developers). When I discovered that I couldn’t access public content without greater API approval than the sandbox tier, and was rejected from that approval process, I decided that, as a proof of concept, I would simply build the machine for myself.

The idea for the final product had two elements: a physical representation of the “likes” and a textual representation of the user’s “self-worth” based on the number of likes on a photo. Every time the photo got a like, a ball would drop and a message would show on the screen. The messages begin as condescending and progress to more pleasant and kind ones. For example, the first message is “sad”; at 15 likes, “not everyone hates you, just most people” appears; and at 40 likes, “good content” is displayed.

Of these two pieces, I was only able to get the second one to work consistently. The messages, stored in an array, are displayed in the most condescending font imaginable: Comic Sans. I would have liked to make this code a bit more efficient, perhaps by storing the like thresholds in an array as well, so the font information only has to be set once rather than for every message (typing it repeatedly became tedious). Regardless, it works as written.
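
The refactor I have in mind would look something like this, using the three messages mentioned above (the threshold list would be longer in practice):

// parallel arrays: thresholds[i] is the like count at which messages[i] appears
const thresholds = [0, 15, 40];
const messages = ['sad', 'not everyone hates you, just most people', 'good content'];

function drawMessage(likes) {
  // pick the last message whose threshold has been reached
  let current = messages[0];
  for (let i = 0; i < thresholds.length; i++) {
    if (likes >= thresholds[i]) current = messages[i];
  }
  textFont('Comic Sans MS'); // the font is set once, not once per message
  textSize(32);
  text(current, width / 2, height / 2);
}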

The second piece of the project proved much more difficult. I’m still not incredibly proficient with JavaScript (and my lack of sleep didn’t help toward the end, sorry for being incompetent at times…), so making this part took a lot of work, and a bit of help. For a while I was able to print the notification info in the console, but for some reason (which I still don’t know) I couldn’t access it anywhere else. I decided it was best to rewrite everything from scratch, organize it more sensibly, and comment more of my code to make it easier to find things.

The machine eventually worked for a bit at a time before failing, so going forward I’d like to do a few things to improve it.

First – I need to update the communication between the Arduino and p5 to set up a “checks and balances” sort of system where p5 doesn’t send the Arduino new information until the Arduino asks for it (see the sketch after this list). Currently too much info is being sent, and the whole program freezes up after a few notifications.

Second – On the physical build of the machine, I’d like either to work out the correct angle of the delivery tube and the rate at which the servo motor opens and closes, or to create a new delivery system that doesn’t rely on such finicky parts.

Third – I want to house the whole project in a nice chassis. I plan to 3D print one to make the project more aesthetically pleasing and smoother to operate.

Finally – Instead of using Instagram’s API I want to switch to YouTube, making the machine a subscription counter. YouTube’s API is much easier to work with and much less restrictive, so the whole machine would hopefully work better.
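
For the first item, the p5 side of that handshake might look something like this with the p5.serialport library (the port name and the “ready” message are assumptions):

let serial;
let likeCount = 0;

function setup() {
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem1411'); // port name differs per machine
  serial.on('data', onSerialData);
}

function onSerialData() {
  // the Arduino sends 'ready' once it has finished the last ball drop;
  // only then does p5 send the next like count
  const msg = serial.readLine().trim();
  if (msg === 'ready') {
    serial.write(likeCount + '\n');
  }
}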


My code is available here

It will currently only work with media IDs from my own Instagram, @chestahab, due to Instagram’s restrictions.

You can try it here (but it’ll just be the text responses since you don’t have the machine; it also might not work unless you download it).

Here are a few media IDs for testing:

1469227156566679526_40290595

1470499988697034800_40290595

1471185248866935939_40290595

1471956652910591666_40290595

1247180011493478957

Personally Victimized by Small Robot Hand

I made a video detailing my process here. The basic algorithm is that when I put my phone down (i.e., when the force-sensing resistor (FSR) registers a reading), a twenty-five-minute timer starts counting down and the ultrasonic sensor and servo kick in. I tried to make it as precise as possible, so that the servo with the tiny hand starts beating down when my hand is in range of the phone. When the phone is picked up (i.e., the FSR no longer registers a reading), the annoying piezo alarm goes off until the phone is put back down, and the timer resets. When the timer reaches 00:00, a five-minute break timer begins and it’s safe to pick the phone up without any consequences.
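
The timing logic, written out here in JavaScript for readability (the real version is an Arduino sketch, and the helper names are made up):

const WORK_MS = 25 * 60 * 1000;  // the 25-minute focus timer
const BREAK_MS = 5 * 60 * 1000;  // the 5-minute break timer

let state = 'idle';
let deadline = 0;

function update(now, phoneIsDown, handInRange) {
  if (state === 'idle' && phoneIsDown) {
    state = 'working';             // FSR registers a reading: start the timer
    deadline = now + WORK_MS;
  } else if (state === 'working') {
    if (!phoneIsDown) {
      soundAlarm();                // piezo nags until the phone goes back down
      deadline = now + WORK_MS;    // ...and the timer resets
    }
    if (handInRange) swatHand();   // servo hand beats down near the phone
    if (now >= deadline) {
      state = 'break';             // 00:00 reached: five consequence-free minutes
      deadline = now + BREAK_MS;
    }
  } else if (state === 'break' && now >= deadline) {
    state = 'idle';
  }
}

function soundAlarm() { /* drive the piezo */ }
function swatHand() { /* sweep the servo */ }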

Here’s a picture of the final product with all the components. It involves some super tight breadboarding of five individual components: the LCD, ultrasonic sensor, FSR, piezo, and servo motor. I could definitely have used a bigger space, but I was pretty amazed at how I organized it, and there was the added comedic effect of the hand just destroying the ultrasonic sensor.

Code here! Big thanks to the CountUpDown library for timer help and the Arduino.org tutorial on playing melodies with a piezo. I still need to come back and update a few older blog posts, but I really enjoyed making this project for class, and my friends LOVED it.

Final Project

I made an interactive poem that should be read aloud; when certain keywords are said, pieces of an illustration appear on screen, so that when you’re finished you are left with a complete image.

I did this because I was thinking about imagery in creative text, and how words are so apt to inspire visuals. Reading aloud, especially, is a potent way to bring text “to life,” and I thought there should be some sort of interface that makes this intangible feeling more tangible. There is an important connection between text, sound, and images; here, the text inspires the sound (reading aloud) and the sound inspires the illustration.

I did this by linking words to hidden images in code that combines CSS, HTML, and JavaScript (basically a website), setting them as triggers so that when they are activated the hidden images become visible. Then I used the voice-recognition library annyang to make the triggers voice-activated.
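
The core of the trigger code is only a few lines (the keyword and element id here are examples, not the actual ones from the poem):

if (annyang) {
  annyang.addCommands({
    // when a keyword is heard, reveal the matching hidden image
    'moon': function () {
      document.getElementById('moon-image').classList.remove('hidden');
    }
  });
  annyang.start({ autoRestart: true }); // keep listening through the whole reading
}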

Now, it all didn’t quite work out how I wanted or expected it to. The software is relatively unresponsive, so you have to be aggressive in speaking the keywords; they can’t just be read in the context of the entire line. This makes for a dramatic reading with too many pauses and lots of yelling out of frustration on my part. It’s a lot more effort than I thought, and it kills the reader’s ability to read the text naturally.

HOWEVER, I realized that this is probably a more accurate representation of what I was trying to achieve. It’s pretty much impossible to FORCE someone to see what you are seeing when you are reading a text. You have to try so hard, repeating words and sometimes yelling, to communicate a visual message with words alone. That’s the beauty of text: people see different things when they read it. It’s very frustrating to reach common ground. I think the overall effect, and the difficulties involved, emulate the process of translation between text, speech, and images.

CODE https://github.com/g2eleni/Final-Project

101 things to do before graduation

My project combines a countdown timer until graduation day and a checklist of things to do before graduation. The checklist is taken from NYU’s list of 101 things to do before you graduate.

I made this project because I was inspired to visualize the passage of time and the time remaining until graduation, in an effort to make the most of my time left at NYU. I decided to combine the countdown timer with the number of items remaining on the checklist to visualize the outstanding items and how much time users have to complete them. My hope was for the project to motivate users to go out and make experiences in their remaining time at NYU. Personally, I received this checklist during my first semester and have made minimal progress in the past two years. By having a constant physical reminder of the elapsing time, I hoped my device would inspire me to spring into action.

My project consists of two parts. The first part is a checklist webpage that displays all 101 items and allows users to check off items they’ve completed. A counter at the top of the page displays the number of outstanding items, as well as the number of completed items.
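
The counter is simple DOM work; something like this (the element ids are illustrative):

function updateCounter() {
  const boxes = document.querySelectorAll('input[type=checkbox]');
  const done = [...boxes].filter(function (b) { return b.checked; }).length;
  document.getElementById('outstanding').textContent = boxes.length - done;
  document.getElementById('completed').textContent = done;
}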

 


The second part of my project is an LCD display that shows the days, hours, minutes, and seconds remaining until 11:00 AM on May 17th, as well as the number of outstanding items from the checklist. The display assembly is accompanied by two LEDs. The red LED blinks every second and serves as an indicator of passing time, while the green LED lights up once the user has completed all items, or when the timer runs out.
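
The countdown math itself is straightforward (the year here is an assumption):

const target = new Date('2017-05-17T11:00:00'); // 11:00 AM on May 17th

function timeRemaining(now = new Date()) {
  const ms = Math.max(0, target - now);
  return {
    days: Math.floor(ms / 86400000),      // 1000 * 60 * 60 * 24
    hours: Math.floor(ms / 3600000) % 24,
    minutes: Math.floor(ms / 60000) % 60,
    seconds: Math.floor(ms / 1000) % 60
  };
}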


If the user checks off all the items on the webpage before the timer runs out, the page displays, “Congratulations, you have done everything!” If the user fails to check off every item before the timer runs out, the page displays, “It’s OK, you don’t have to do everything”. The messaging congratulates users for completing all the items but also reassures them if they don’t. Either way, the program ends when the timer runs out, marking commencement as an achievement and end goal in itself.

Ultimately, I wanted to capture the importance of making experiences with one’s time remaining at NYU. Whether or not my project achieves that is up for debate. However, I think seeing the constant passage of time and things to do helps foster a sense of urgency in users.

See it in action | Try it yourself | Code on GitHub

Final Project

I made a photo booth for my final project. It works the way one would expect a photo booth to work, but it also has elements of Japanese photo booths, called Purikura, which are popular in Asia. I’ve always really enjoyed making these photos, which are printed on stickers that people used to stick to their phones (before smartphones existed), but it’s always really expensive. I thought it would be nice to recreate some of the fun on my computer. The two key differences between photo booths in the US and Japanese ones are that the Japanese ones edit your pictures to make you more attractive, and that after the photos are taken, users can draw on the photos and place images on top of them. I was really only able to recreate the latter experience, but I think that even without the automatic airbrushing, using this photo booth is a bit of a unique experience for those unfamiliar with Asian culture. I would imagine this version being used at parties as a kind of replacement for a guest book.

This photo booth takes photos of the users and lets them select small images with the number pad and paste them on top of their photos with a click of the mouse, before finally saving the whole collection to the computer. To retake the photos, they can refresh the window, and they can delete objects they’ve placed using the backspace key.
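
In p5.js terms, the stamping works roughly like this (the file names and details are placeholders):

let stamps = [];    // the selectable stamp images
let placed = [];    // stamps the user has pasted onto the photo
let selected = 0;

function preload() {
  stamps.push(loadImage('stamp0.png')); // placeholder file name
}

function keyPressed() {
  if (key >= '1' && key <= '9') selected = int(key) - 1; // number pad picks a stamp
  if (keyCode === BACKSPACE) placed.pop();               // backspace deletes the last stamp
}

function mousePressed() {
  if (stamps[selected]) {
    placed.push({ img: stamps[selected], x: mouseX, y: mouseY });
  }
}

function draw() {
  for (const s of placed) image(s.img, s.x, s.y); // draw the stamps over the photo
}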

Using this booth is a bit difficult, and I’ve intentionally left the instructions a bit vague. While I’ve taken photos in these booths all around the world, I’ve never been with someone who could actually read or understand Japanese, so even though the machines have many clear instructions, I’m never completely sure if I’ve got it right. I was taught how to use them by people who figured it out on their own, and I’ve done the same for others. Still, one is able to choose and paste stamps where they want, deleting images whose placement they didn’t like. They should be able to save the images when they’re done as well.

Below is a demonstration video ft. my roommate.

This is the class using the photo booth:

You should be able to check it out here.

The code is available here.

Final Artificial Light

WHAT

I made a light source for my living room which imitates the brightness of the sun and the color of the sky.

WHY

New York City apartments often do not have natural light in most rooms (like my apartment, where only the bedrooms get any). It’s dark. It’s gloomy. It’s also kind of lonely.

HOW

The materials I used include an Arduino, an AC/DC power switch tail, an addressable RGB LED strip, thin paper strips, masking tape, and artist’s tape.
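
The heart of the p5 sketch is a mapping from the hour of the day to a sky color; a simplified guess at that mapping (the actual colors live in the sketch below):

// blend from night blue to day blue based on the hour (0-23)
function skyColor(hour) {
  const night = color(10, 10, 40);
  const day = color(120, 180, 255);
  const t = 1 - abs(hour - 12) / 12; // 0 at midnight, 1 at noon
  return lerpColor(night, day, t);
}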

Here’s my p5 sketch:

As well as my Arduino sketch:


Final Project

My final project idea was to create a piano with the Arduino and a display for it in p5. The piano uses capacitive touch sensors, so the keys play when you touch them. I got the ideas from here:

http://www.instructables.com/id/Capacitive-Touch-Arduino-Keyboard-Piano/

http://playground.arduino.cc/Main/CapacitiveSensor?from=Main.CapSense

The piano works fine, but for some reason there is a large delay between the Arduino and p5 and it is not very responsive. I did get one key to work very well in the test trial but forgot to record it.

For the visuals I started with a very simple display that would tell you what note you were playing, and I wanted a way for you to see how often you had played each note. So I created rain and blocks. I originally wanted the blocks to stack, but that didn’t work: I was keeping track of a bottom variable and moving it up each time a key was pressed for that section, but it wasn’t working. For the second idea I wanted to use Perlin noise, but I had difficulty modifying the code to do the things I wanted, like changing all the points’ colors at once on a key press, or using textToPoints() to draw the notes. So I made minor changes to Shiffman’s code and added an amplitude meter so you could tell how loud the sounds were. I thought I could map the points of each letter’s outline, turn those points into static lines in the flow field, and then add points as vectors to flow through the letters. This is Shiffman’s original code:

https://github.com/CodingTrain/Rainbow-Code/tree/master/challenges/CC_24_PerlinNoiseFlowField

Since the first part was almost completely my code and the final part was all my code, I decided to keep this part in, and I just really liked how it looked.

For the last part I made a program that can tell whether you are playing “Mary Had a Little Lamb” properly and slowly draws a lamb for you as you go. I put in the code to hook it up to the physical keyboard, but since the keyboard still isn’t working that well, I left it commented out. I drew the lamb myself and animated it in a stop-motion style using Illustrator.
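
The melody check is a simple pattern match; something like this (treating notes as letter names):

const melody = ['E', 'D', 'C', 'D', 'E', 'E', 'E']; // the opening of the song
let progress = 0; // how much of the lamb has been revealed

function notePlayed(note) {
  if (note === melody[progress]) {
    progress++;       // right note: reveal the next frame of the lamb drawing
  } else {
    progress = 0;     // wrong note: start the song (and the lamb) over
  }
}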

The piano sounds are from here:

https://www.freesound.org/people/jobro/packs/2489/?page=1#sound

I was going to have more octaves initially but for some reason the code I wrote did not work. I was going to add something like:
if (keyCode === UP_ARROW) {
  // shift every note up an octave (12 semitones)
  for (let i = 0; i < notes.length; i++) notes[i] += 12;
}

But it wasn’t doing anything. I made them into three separate programs because of the way they were set up; some handled the background differently, and it would have been a lot to combine with all the sound files to load. It is doable, but I like it better as three programs. Since the physical keyboard doesn’t work that well, I also built in the option of using the computer keys, laid out the same as a piano. In the videos you won’t hear the oscillator because you can only hear it with earbuds.

Here is a link to all of my code.

I originally built 13 keys, but I was having difficulty with the shift register, so I left it at 8 and didn’t use sharps and flats. I got the keyboard to work but forgot to press enter, so it was stuck in oscillator mode, which doesn’t work without headphones. I also went back and fixed it so everything else, like rain and blocks, works with the physical keyboard as well.

The Plantsitter

My final project takes care of plants. It came out a little different from what I imagined when I pitched the idea in class.

First of all, I went to Canal Plastic to buy a plastic half-dome, but they didn’t have it and they were very rude, so I decided to take my business elsewhere. I ended up buying a birdcage from a friendly hardware store near my apartment.

It’s not exactly how I planned it, but I am happy with this. And I like the idea of a plant trapped in a birdcage, too.

Second of all, instead of making a gravity water system with a water valve, I decided to use a water pump. It kind of complicated things, but I will talk about it later.

I started by making the soil moisture sensor work. I picked up a moisture sensor from Tinkersphere. I won’t be able to keep the Arduino connected to my desktop all the time, so I used an LCD to show the soil moisture instead of the serial monitor.

I wanted the Arduino to water the plant only in the evening, but like I said, it won’t always be connected to the desktop (to get the time), so I had to get an RTC (real-time clock), which meant going back to Tinkersphere again. The RTC has its own backup battery, so the Arduino can now track time even after being disconnected from the computer.

The next part was controlling the UV light. I ordered a Power Switch Tail, as was suggested in class. It came with a manual (unlike most other parts you get for the Arduino) and was pretty self-explanatory.

At 3pm, the piezo speaker plays “Twinkle Twinkle” and the light turns on; it turns off at 8pm. Except on Sundays, because I like to sleep in. I ended up planting California bluebells, and they like shade, so only getting UV light from 3pm to 8pm is good for them.


So far, so good. Then I tried to use a water pump, and that’s when things got confusing. I know we learned how to use a 12V motor in class, but I forgot how, and I was really scared of frying the USB port. I went to Tinkersphere again (the cashier remembers me by now) to get a transistor and a diode to protect the Arduino from the motor’s kickback. I also had to get a relay, which I didn’t know how to use and couldn’t find useful information about online, so the cashier at Tinkersphere helped me. Anyway, I tested the water pump and it worked.

At 7:30pm, the plantsitter checks whether the soil is dry (soil moisture less than 50%); if it is, it plays “Twinkle Twinkle” again and waters the plant. If not, it doesn’t do anything. I did some testing, and leaving the pump on for 20 seconds was best: enough time for the water to travel through the tube and water the plant, but not so long that it overflows the pot.
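
The check itself is simple; sketched here in JavaScript for readability (the real version lives in the Arduino code):

const PUMP_SECONDS = 20;    // enough to fill the tube and water the pot
const DRY_THRESHOLD = 50;   // percent soil moisture

function eveningCheck(hour, minute, moisture) {
  if (hour === 19 && minute === 30) {   // 7:30 pm
    if (moisture < DRY_THRESHOLD) {
      playTwinkleTwinkle();
      runPump(PUMP_SECONDS);
    }
  }
}

function playTwinkleTwinkle() { /* piezo melody */ }
function runPump(seconds) { /* switch the relay on, wait, switch it off */ }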

Here is a video of the plantsitter watering the plant after I mounted it to the birdcage. Originally I had the water container on the floor, but the pump wasn’t strong enough to push the water all the way up, so now it’s on the shelf. The plastic container doesn’t look very nice, so I am planning on getting a nice glass vessel to hold the water.

I’ve added some moss for the look.

Now it’s finished!


The code can be found here.

Final Post

Final Project 1.1

My final project is an instrument. The player can play the sounds of nature and the sounds of a guitar by pressing the keys ASDF and JKL; on the keyboard. QWER and UIOP turn the reverb on and off.
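
A rough version of the key mapping, using p5.sound (the file names are placeholders):

let sounds = {};  // keyboard key -> loaded sound
let reverb;

function preload() {
  sounds['a'] = loadSound('guitar1.mp3'); // placeholder file names
  sounds['j'] = loadSound('rain.mp3');
}

function setup() {
  reverb = new p5.Reverb();
  reverb.process(sounds['a'], 3, 2); // a 3-second reverb with a decay rate of 2
}

function keyPressed() {
  if (sounds[key]) sounds[key].play();  // ASDF / JKL; rows trigger the sounds
  if (key === 'q') reverb.drywet(1);    // QWER / UIOP rows toggle the reverb...
  if (key === 'w') reverb.drywet(0);    // ...fully wet or fully dry
}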

outmersion

(working title)

The intention of the project is to provide a relaxing environment within our modern lives. Modern life has lost the background sounds of the environment that human beings evolved in: the sound of wind, trees, rain, and thunder is muted by traffic and the other noises of the city.

This instrument attempts to combine the sounds of a man-made instrument (the guitar) with the sounds of nature: wind, rain, birds, and thunder.

The visuals were inspired by the painting Kalinda by Sohan Qadri.

Kalinda is a Hindi name meaning “Sun”. 

I gave some thought to the colors of the piece and decided there would be two gradients. One is the color of the night and daytime sky; when sound is played, the amplitude shifts the gradient of the outer squares toward yellow, the color of the sun.
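
The amplitude mapping, in simplified form (the colors are approximations):

let amp;

function setup() {
  createCanvas(400, 400);
  amp = new p5.Amplitude(); // follows the loudness of whatever is playing
}

function draw() {
  const level = amp.getLevel(); // 0.0 in silence, approaching 1.0 when loud
  const sky = color(20, 30, 80);   // night-sky blue
  const sun = color(255, 200, 0);  // sun yellow
  background(lerpColor(sky, sun, constrain(level * 4, 0, 1)));
}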

The code can be found here: https://github.com/adamnnagy/FinalProject (there are many separate JavaScript files).