All posts by Kevin Zhang

Our Lives as Snakes

What and Why

For the final project, I made a simple snake game that records a replay of each game. The replays of everyone who has played can then be watched at the same time, and the interactions between the snakes become very apparent. The point of the project is to illustrate the way we interact with people in our lives, whether they be complete strangers, simple acquaintances, or close friends. Here is the message that is displayed after a player finishes a game:

“That seemed like a pretty basic snake game, and up until this point it was. But a recording of the game you just played was saved and can be viewed alongside all the other people who played this game. Your snake might have been in the same spot at the same time that another’s was, but there was no way for you to know until you watch all of these recordings in parallel.

Living in New York City, I pass by hundreds of people just on my way to class and I never interact with any of them. It’s interesting to think about where they might have come from and where they might be headed. In half an hour I’d be sitting in a lecture I’m probably not paying attention to and they could be on a train to Boston. But for that brief moment in time while we waited at that stop light, our lives intersected. Maybe we’ve stood at that same spot together multiple times before, or maybe that was the first time we had ever “met”. Maybe those few seconds of our lives would be the only few seconds we will ever spend in the same spot. The point is, we tend not to think about these things. We tend to focus on our own lives and our own goals. While playing this game, you happened to follow a very similar path as someone else who wandered to this site.”

After this message is displayed, the player can watch their own recording alongside the replay that most closely matches it. This way, the player can see that they were running alongside someone else at the same time without noticing it while playing alone. Once the replay is finished, another message is displayed to the user:

“As you may have seen, there were points where the two snakes intersected, and there were points where they ran in parallel. I think it’s an interesting illustration of how we interact with people, whether they be complete strangers, colleagues, or close friends. We each have our own goals, but that doesn’t mean we should block out everyone else’s.”

Players can also watch everyone’s replays at the same time here:  http://snakes.kzhang.io/replays

How

To implement this, I used Node.js on the backend to serve the static HTML files and to store all of the replays in a JSON file. I also used socket.io for real-time features, like replays showing up at /replays instantly after someone finishes their game. I used p5.js to draw the game board, snakes, and food. A “snake” is simply an array of coordinates; I draw an ellipse at each coordinate, with the ellipses becoming more transparent toward the tail so that the snake looks like it is fading out.
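In p5.js, that fading effect boils down to something like this (the colors and sizes here are placeholders):

// snake is an array of {x, y} coordinates, with the head at the end of the array
function drawSnake(snake) {
  noStroke();
  for (let i = 0; i < snake.length; i++) {
    // alpha ramps from nearly transparent at the tail to fully opaque at the head
    const alpha = map(i, 0, max(snake.length - 1, 1), 30, 255);
    fill(50, 200, 100, alpha);
    ellipse(snake[i].x, snake[i].y, 20, 20);
  }
}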

To implement replays, I store all of the necessary information in a JavaScript object:
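In rough outline, each replay holds the frame-by-frame positions of the snake and the food, something like this (the field names here are illustrative, not the actual ones):

// One replay: a frame-by-frame record of a single game (illustrative shape)
const replay = {
  id: 'a1b2c3',              // identifies this game
  frames: [
    {
      snake: [{x: 5, y: 5}, {x: 6, y: 5}, {x: 7, y: 5}],  // body coordinates, head last
      food: {x: 12, y: 3}                                   // food position on this frame
    }
    // ... one entry per frame of the game
  ]
};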

This object is stored on the server in memory and also in a JSON file that is parsed when the server is started up. This way, if the server ever crashes, the replays are still saved. To display the replays, I simply go through each point in the replay and draw the snake as well as the food objective at that point. This happens for all of the replays at the same time so they can all be viewed in parallel.
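Playing the replays back is then just a matter of stepping through those frames in draw(); a minimal sketch, assuming a global replays array and the drawSnake() helper from above:

// frame counts up once per draw() call; every stored replay is advanced in lockstep
let frame = 0;

function draw() {
  background(0);
  for (const replay of replays) {
    if (frame < replay.frames.length) {
      const state = replay.frames[frame];
      drawSnake(state.snake);                      // the fading snake
      ellipse(state.food.x, state.food.y, 20, 20); // the food objective at that point
    }
  }
  frame++;
}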

The code can be found on my GitHub: https://github.com/Kinzeng/socket-snakes

Demo

To see all of the replays currently stored on the server, simply click this link: http://snakes.kzhang.io/replays

If you want to try it out, go to http://snakes.kzhang.io and hit space!

League of Legends API

For this assignment, I decided to work with Riot’s API for League of Legends. Specifically, I thought it would be interesting to compile some personalized stats for players, such as win rate and how they did in lane against specific champions. I’m guessing that most people in the class don’t play, so basically it just tracks how well you do in the game. Right now, it simply looks at a player’s match history and then stores all of the games and the relevant statistics in a Google Spreadsheet. To make this work, I used Google Apps Script to access a user’s Google Drive. Here’s the Spreadsheet that resulted from the script: https://docs.google.com/spreadsheets/d/1cg35tjZKfClkMnU8z3s9BSCmy56Dvty5qzLcA675yCg/edit?usp=sharing

The cool thing about the script is that it checks for new games every hour, so I can leave it running continuously and check back on the spreadsheet whenever I like. I chose Google Apps Script because I had to familiarize myself with the system for a project I’m working on for a nonprofit organization. Here’s the code for the script (sans the API key for Riot):
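In rough outline, it does something like this; the Riot endpoint, the response fields, and the spreadsheet ID below are placeholders rather than the actual values:

// Rough outline of the Apps Script; endpoint and field names are placeholders
var RIOT_API_KEY = PropertiesService.getScriptProperties().getProperty('RIOT_API_KEY');
var SHEET_ID = 'YOUR_SPREADSHEET_ID';

// Runs on an hourly time-driven trigger and appends any new games to the sheet
function checkForNewGames() {
  var url = 'https://na1.api.riotgames.com/lol/match/...'  // placeholder endpoint
          + '?api_key=' + RIOT_API_KEY;
  var matches = JSON.parse(UrlFetchApp.fetch(url).getContentText()).matches || [];

  var sheet = SpreadsheetApp.openById(SHEET_ID).getActiveSheet();
  matches.forEach(function (match) {
    // one row per game (skipping games that were already stored is omitted here)
    sheet.appendRow([new Date(match.timestamp), match.champion, match.lane, match.win]);
  });
}

// One-time setup: create the hourly trigger
function createTrigger() {
  ScriptApp.newTrigger('checkForNewGames').timeBased().everyHours(1).create();
}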


Nature of Computational Media

Why did computers appear?

Computers appeared because humans have always looked for ways to make tasks easier and more efficient. The wheel is a great example of a technological breakthrough that dates back thousands of years. As technology advanced, we came up with new layers of abstraction and kept building on them to form the computers we have today. I took a course on computer architecture last year and we learned about the actual circuitry of a processor. It’s crazy how everything we see on our screens boils down to 0s and 1s, and even crazier that our computers can convert the code we write into those 0s and 1s.

What is hard about communication? What do computers add?

Communication has always been difficult because we interpret things through our own lens. We might interpret what someone says completely differently from what they meant, and in many cases this leads to conflict. Computers and text messages add yet another opportunity for miscommunication, because texts carry none of the tone of voice or body language that is inherent to face-to-face conversation. Unless you know the sender very well, it is easy to misread a text, and most of us probably won’t ask “what did you mean?” We tend to assume a meaning based on our own interpretation, not on what the sender intended.

Is Computational Media better at expressing your internal life?

I think Computational Media can be very good at expressing my thoughts and beliefs if it’s done well. An interactive experience can evoke emotions that can’t necessarily be felt by just reading about what I think. We tend to be self-centered, and simply hearing about other people doesn’t affect us nearly as much as feeling something for ourselves.

Do narrative and interactivity go together? Is this good?

I strongly believe that interactivity greatly enhances a narrative. The biggest example I have is video games that aim to tell a story. Being put in a world and actually being able to interact with the characters makes the narrative a lot stronger because, again, we tend to be self-centered. If we feel that we are having an effect on the game, the high points in the story stick with us more, and I think this is a great thing. It helps us understand the emotions of the characters, because if the game is immersive enough it feels like we are right alongside them. Virtual reality aims to make things even more immersive, and I think once VR is more widely available (and fine-tuned), stories can become really powerful. Of course, non-interactive stories can still be very powerful, but I do think interactivity adds a dimension that books can’t take advantage of.

What is so important about Computational Media? How does Computational Media make us more or less understanding of people unlike ourselves?

I think empathy is a very important ability to have. It helps us communicate with those around us, including those we don’t necessarily understand, and it keeps us from dehumanizing them. Computational Media is a window into the minds of other people: if we can experience what others have experienced, we gain a more well-rounded point of view on life.

Final Project Idea

For the final project, I’m leaning towards making something musical. Since I made a touch sensor at the beginning of the semester, I was thinking of making some kind of keyboard using those touch sensors. To make the tones I can use just the Arduino, but I can also link it up to p5 using Serial and perhaps display a keyboard on the screen and show which keys are being pressed. I think I’d need to purchase a bunch of speakers and have each one play a certain note, but I’m not sure. I’ll have to look into what I can do with speakers and the Arduino. I’ll also need to solder several wires to conductive “keys” and connect those to the Arduino.

I’ve always wanted to make a keyboard of some kind using Arduino but I’ve always been sidetracked by school or other work, so I think this final project can be a really nice opportunity to turn this idea into reality.

Arduino Controlled Pong Game

For the Serial homework assignment, I extended last week’s homework to use an Arduino to control the paddles. To do this, I put together a circuit with three inputs: two potentiometers and a button. The potentiometers control the location of the paddle, while the button starts the game again when someone has scored. This is what the circuit looks like:

I then had to change a few things in my pong game code. After setting up the Serial code, I changed the Paddle class so that it has only one method for changing its location: setY. This function takes in the y coordinate of the paddle and moves it there, so I can simply pass in a position based on the potentiometer readings. Because the potentiometers determine the paddle locations directly, this actually simplified the code, specifically the Paddle class; it used to take in a lot more information, like the paddle speed, the boundaries of the canvas, and the keys that controlled it.
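Roughly, the slimmed-down class looks like this (the dimensions are placeholders, and the serial-reading code that feeds it is left out):

// Illustrative version of the simplified Paddle class: position comes only from setY
class Paddle {
  constructor(x) {
    this.x = x;
    this.y = height / 2;
    this.w = 10;
    this.h = 80;
  }

  // The single way to move the paddle: set its y coordinate directly
  setY(y) {
    this.y = constrain(y, 0, height - this.h);
  }

  draw() {
    rect(this.x, this.y, this.w, this.h);
  }
}

// When a potentiometer reading (0-1023) arrives over Serial, map it to a y position:
// leftPaddle.setY(map(potValue, 0, 1023, 0, height - leftPaddle.h));

Here’s a video demo of the game: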

At the end, the input started to lag a lot, so I couldn’t continue filming. I think it has to do with the whole Serial setup being somewhat inefficient, but I’m not exactly sure how to fix it. It also seems to take up a lot of RAM, and my laptop can’t really run all of the programs at once.

Spinner Servo

Last week I bought a toy called a fidget spinner, which is essentially just a few weights connected to a ball bearing; all it does is spin. Here’s a video of one spinning for a couple of minutes:

So for this week’s assignment I made a servo circuit that spins one of these things for a bit. Of course, servos aren’t that powerful, so it doesn’t spin the spinner very well, but it works! Three buttons determine the state of the servo: one for clockwise spinning, one to stop the spinner, and one for counterclockwise spinning. Here’s the circuit:
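The code driving it is just a few button reads and servo writes; a sketch along these lines, assuming a continuous-rotation servo (a standard servo would sweep between angles instead), with placeholder pin numbers:

#include <Servo.h>

// Placeholders: the real button and servo pins may differ
const int CW_BUTTON = 2;
const int STOP_BUTTON = 3;
const int CCW_BUTTON = 4;
const int SERVO_PIN = 9;

Servo spinner;

void setup() {
  pinMode(CW_BUTTON, INPUT);
  pinMode(STOP_BUTTON, INPUT);
  pinMode(CCW_BUTTON, INPUT);
  spinner.attach(SERVO_PIN);
  spinner.write(90);  // 90 = stopped for a continuous-rotation servo
}

void loop() {
  // Each button sets the servo's state: full speed one way, stopped, or the other way
  if (digitalRead(CW_BUTTON) == HIGH) {
    spinner.write(180);
  } else if (digitalRead(CCW_BUTTON) == HIGH) {
    spinner.write(0);
  } else if (digitalRead(STOP_BUTTON) == HIGH) {
    spinner.write(90);
  }
}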

Now, to actually spin the spinner, I had to hold the servo down so it wouldn’t move, but to take a video I needed something else to keep it still while I held the camera in one hand and pressed the buttons with the other. I tried a couple of things, like tying it down with a large piece of string and stacking a few decks of cards on top, but neither kept the servo from moving. So eventually I settled on taping it down, and I’m hoping the tape doesn’t mess up the deck’s box when I take it apart. Here’s a video of it in action:

For the p5 assignment, I drew a very amateur version of my face, which you can see here:

http://i6.cims.nyu.edu/~kz657/creative/assignment-4/


Stupid Pet Trick

For this week’s assignment, I wasn’t sure exactly what to solder because none of the components that came with my Arduino kit needed any soldering. In the end, I decided to solder some wires onto an LED, because the pins on the LED are so short; I figured that having longer legs could help in future projects. I also found soldering to be very satisfying! The way the solder melts as soon as it touches the hot metal felt really nice for some reason.

For the circuit assignment I ended up making a very simple phototransistor circuit. The phototransistor detects how much light there is and adjusts the LED brightness based on that:

I reversed the mapping so that when little light is detected, the LED is bright, and when there is a lot of light, the LED is dim. This makes more sense because we don’t need the LED to be bright if there’s already a lot of light.

Here is the code:
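In outline, it looks something like this (the pin numbers are placeholders for whatever the circuit actually uses):

// Phototransistor on an analog pin controls LED brightness, reversed:
// less light in -> brighter LED. Pin numbers here are placeholders.
const int SENSOR_PIN = A0;
const int LED_PIN = 9;     // must be a PWM pin for analogWrite

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int light = analogRead(SENSOR_PIN);            // 0 (dark) to 1023 (bright)
  int brightness = map(light, 0, 1023, 255, 0);  // reversed mapping
  analogWrite(LED_PIN, brightness);
}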


Password Switch

For the second assignment, I decided to extend the first switch by adding another touch sensor and a password sequence. If the correct password is entered, the light turns on or off. Since it was really cold out when I was working on this, I cut up an empty soda can instead of going out to buy aluminum foil:

I then attached the three rectangles to my circuit:

To get the password working, I needed to change my code a bit. I set up a password array that stores the sequence of sensors you need to press to activate the LED, along with an integer, current, that holds the index of the next sensor that has to be pressed. current starts at 0, so the sensor at index 0 is the correct one to press; when it is pressed, current is incremented to 1, so the sensor at index 1 becomes the correct one, and so on. For simplicity, the password is just 0, 1, then 2, which matches up with current.

At first, I wanted to make the code a bit more flexible, like allowing passwords of different lengths, but arrays in Arduino (C++) are different from what I’m used to, so I held off on that. I looked it up and found out that to get the length of an array, you call sizeof(array) and divide by the size of an element; since ints are 2 bytes each, the number of elements in an int array is sizeof(array) / 2 (or, more generally, sizeof(array) / sizeof(array[0])). Making the code more flexible is an optimization I can do at a later point.

Here is the code:
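In outline, the sketch looks like this, using the same CapacitiveSensor library as the first switch (the pin numbers and the touch threshold are placeholders):

#include <CapacitiveSensor.h>

// Three touch sensors sharing one send pin; pin numbers are placeholders
CapacitiveSensor sensor0 = CapacitiveSensor(4, 2);
CapacitiveSensor sensor1 = CapacitiveSensor(4, 6);
CapacitiveSensor sensor2 = CapacitiveSensor(4, 8);

const int LED_PIN = 13;
const long THRESHOLD = 100;   // readings jump well above this when touched

int password[] = {0, 1, 2};   // sequence of sensors that has to be pressed
int current = 0;              // index of the next correct sensor
bool ledOn = false;

long readSensor(int i) {
  if (i == 0) return sensor0.capacitiveSensor(30);
  if (i == 1) return sensor1.capacitiveSensor(30);
  return sensor2.capacitiveSensor(30);
}

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  for (int i = 0; i < 3; i++) {
    if (readSensor(i) > THRESHOLD) {
      if (i == password[current]) {
        current++;            // correct sensor: advance in the sequence
      } else {
        current = 0;          // wrong sensor: start over
      }
      delay(300);             // crude debounce so one touch counts once
      break;
    }
  }

  if (current == 3) {         // full password entered: toggle the LED
    ledOn = !ledOn;
    digitalWrite(LED_PIN, ledOn ? HIGH : LOW);
    current = 0;
  }
}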


Hands-Free Switch - Kevin Zhang

For my circuit, I wanted to make a switch that could be activated with any part of the body, so I looked up how to make a touch sensor. As it turns out, there is an Arduino library called CapacitiveSensor that is designed to do just that. The picture below shows my circuit.

Pins 2 and 6 are the sensor pins. They detect whether or not the wires connected to them are being touched. Since I didn’t have any aluminum foil or metal strips at home, I just used the wires themselves as the touch sensors; if I had the materials, I would have attached aluminum foil squares to the wires. In any case, the LED turns on when both sensors are touched. Of course, this sensor can be activated by touching with the hands, but it can also be activated by any part of the body including, yes indeed, the butt. Here is the code for the Arduino:
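A sketch of what that code looks like with the CapacitiveSensor library (the send pin and threshold here are assumptions):

#include <CapacitiveSensor.h>

// Pins 2 and 6 are the sensor (receive) pins; the shared send pin is a placeholder
CapacitiveSensor sensorA = CapacitiveSensor(4, 2);
CapacitiveSensor sensorB = CapacitiveSensor(4, 6);

const int LED_PIN = 13;
const long THRESHOLD = 100;  // untouched readings sit around 0-20, touched ones shoot up

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  long touch1 = sensorA.capacitiveSensor(30);
  long touch2 = sensorB.capacitiveSensor(30);

  // Turn the LED on only when both sensors are being touched
  if (touch1 > THRESHOLD && touch2 > THRESHOLD) {
    digitalWrite(LED_PIN, HIGH);
  } else {
    digitalWrite(LED_PIN, LOW);
  }
}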

The values touch1 and touch2 are the inputs from the sensors. When the sensors are not activated, they hang around the 0-20 range, but when touched they shoot up into the hundreds and sometimes the thousands. There are a lot of ways to use touch sensors like this; one idea I had was to attach up to nine sensors and have a key code to turn on the light!