All posts by Sarah Peng

Solar System Documentation

What it is

  This project is an interactive solar system that uses physical computing and p5. Using a sensor, we created a spherical controller that, when turned, rotates the selected planet on screen. There are many interactive solar system applications online, but ours is different because it integrates physical computing in a way that we have not seen before for these kinds of projects. The controller, and how one interacts with it, is what makes our project unique.

Solar System

Why we made it

  We started this project out of a love of outer space. It slowly transformed into something educational, so we could share that love with others.

Who uses it

  Our audience is people, particularly children, who do not know much about the solar system and want to explore. It is especially for tactile learners. Our alternative controller lets the natural movement of turning an object around in one’s hands translate into on-screen exploration.

How it works

  It is an online program controlled by a physical remote and the mouse. The planets and sun rotate in space. Using the mouse, you can click on a planet (or the sun) and the view will zoom in and display information such as its name, size, surface gravity, and more in the upper left-hand corner. From there, the controller can be rotated, and the planet will mirror this rotation on screen. To zoom back out, you click on the space around the planet, and the project returns to the first screen.

The process

  The process of making it was a tough one, because we had a lot of ideas but not all of them were possible given what we’d learned and the time constraint. Our first idea was to build a planet creator, so that anyone could customize their own solar system in Unity. We talked this out with one of the residents, Jenny Lim, and realized that it could not be built in time. So, we switched to p5 and Arduino and decided to create our solar system.

  At first, we created the planets on a side view that rotated around the sun using WebGL. Then, we made a planet class so that we could standardize the planets and their information. To make the planets more easily clickable, we changed the design so that the side view had the planets rotating in place rather than around the sun. We added a zoom so that when a planet is clicked, it becomes the focus of the screen. Finally, we added some ambient space-themed music looping in the background.
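For the curious, the clickability fix boils down to a circle hit test: a planet counts as clicked when the mouse falls within its radius. A tiny, framework-free sketch of that check (the helper name is made up, not from our sketch):

```javascript
// Circle hit test: is the point (mx, my) inside a planet drawn at
// (px, py) with diameter d? (Hypothetical helper, not from our code.)
function isPlanetClicked(mx, my, px, py, d) {
  const dx = mx - px;
  const dy = my - py;
  // compare squared distances to avoid a sqrt
  return dx * dx + dy * dy <= (d / 2) * (d / 2);
}

// e.g. for a planet of diameter 50 at (200, 200):
// a click at (210, 210) lands inside; one at (300, 200) does not
```

Because the planets rotate in place instead of orbiting, (px, py) stays fixed, which is exactly what made them easy to click.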


  The controller uses a 6DOF (six degrees of freedom) sensor to sense the rotation (Euler angles) of an object. Over serial communication, we fed the values the sensor recorded into the rotateX, rotateY, and rotateZ values of the planets. For the enclosure, we laser cut a pattern out of poster paper and folded it into a sphere-ish shape. The sensor then went inside a smaller, geometric paper shape and was suspended in the middle by string. At first, the sensor was connected directly to the Arduino with shorter, stiffer wires. For a more polished interaction and stable connection, we soldered on flexible wires, then moved all connections to a solderable breadboard. Both the breadboard and the Arduino were then placed inside a box, to prevent any wires from being pulled out.
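The serial part is less magic than it sounds: the Arduino prints the Euler angles as one comma-separated line, and the sketch converts each angle to radians before handing it to rotateX/rotateY/rotateZ. A stripped-down sketch of that step (helper names are mine; the real sketch does this inline):

```javascript
// Parse one serial line of Euler angles ("yaw,pitch,roll" in degrees).
// (Hypothetical helper names, for illustration only.)
function parseEulerLine(line) {
  const [yaw, pitch, roll] = line.trim().split(',').map(Number);
  return { yaw, pitch, roll };
}

// p5's rotateX/rotateY/rotateZ expect radians by default
function degToRad(deg) {
  return deg * Math.PI / 180;
}

const angles = parseEulerLine("90,0,45\r\n");
// degToRad(angles.yaw) would then drive one of the rotate calls
```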



Arduino + breadboard

Link to p5 sketch:

Arduino resources: (/Arduino/MPU6050/)

Processing Teapot sketch (used to test sensor in beginning stages):

Color Guesser?

My vision was to train an image classifier with a feature extractor to recognize colors. I thought it wouldn’t be too hard, since all I would have to do was replace the video with a loop of color pngs and make some buttons to classify the colors.

But uh…



Everything in the code seemed right, but the image just wouldn’t change. Using createImg with the p5 dom library worked even less, because it didn’t even make a png show up. And when I pressed the classify button, ml5 didn’t seem to register that there was any image at all.

I tried to make this work for three days at the expense of making progress on my other projects, so I’m gonna let this die now.

Cursèd js code:

let featureExtractor;
let classifier;
let colorimg;
let img;
//let index = 0;
let threshold = 20;
let loss;
let blackimg = 0;
let redimg = 0;
let orangeimg = 0;
let yellowimg = 0;
let greenimg = 0;
let blueimg = 0;
let purpleimg = 0;
let whiteimg = 0;
let greyimg = 0;
let brownimg = 0;

let counter = 1;

function preload() {
  colorimg = loadImage('assets/Color3.png');
  console.log("image loaded");
}

function setup() {
  // createImg() wants a path string, not a p5.Image --
  // in hindsight, probably part of why nothing ever showed up
  img = createImg(colorimg);
  // Extract the already learned features from MobileNet
  featureExtractor = ml5.featureExtractor('MobileNet', modelReady);
  // Create a new classifier using those features and give it the image we want to use
  classifier = featureExtractor.classification(img, imgReady);
  // Set up the UI buttons
  setupButtons();
}

// A function to be called when the model has been loaded
function modelReady() {
  select('#modelStatus').html('Base Model (MobileNet) Loaded!');
  classifier.load('./model/model.json', function() {
    select('#modelStatus').html('Custom Model Loaded!');
  });
}

function imgReady() {
  console.log('image ready');
}

// Classify the current image
function classify() {
  classifier.classify(gotResults);
}

// Cycle to the next color png on click
function mousePressed() {
  var imgPath = 'assets/Color' + counter + '.png';
  colorimg = loadImage(imgPath);
  counter++;
}

// A util function to create UI buttons
function setupButtons() {
  buttonA = select('#blackButton');
  buttonA.mousePressed(function() {
    classifier.addImage('black');
    select('#amountOfBlackImages').html(++blackimg);
  });
  buttonB = select('#redButton');
  buttonB.mousePressed(function() {
    classifier.addImage('red');
    select('#amountOfRedImages').html(++redimg);
  });
  buttonC = select('#orangeButton');
  buttonC.mousePressed(function() {
    classifier.addImage('orange');
    select('#amountOfOrangeImages').html(++orangeimg);
  });
  buttonD = select('#yellowButton');
  buttonD.mousePressed(function() {
    classifier.addImage('yellow');
    select('#amountOfYellowImages').html(++yellowimg);
  });
  buttonE = select('#greenButton');
  buttonE.mousePressed(function() {
    classifier.addImage('green');
    select('#amountOfGreenImages').html(++greenimg);
  });
  buttonF = select('#blueButton');
  buttonF.mousePressed(function() {
    classifier.addImage('blue');
    select('#amountOfblueImages').html(++blueimg);
  });
  buttonG = select('#purpleButton');
  buttonG.mousePressed(function() {
    classifier.addImage('purple');
    select('#amountOfpurpleImages').html(++purpleimg);
  });
  buttonH = select('#whiteButton');
  buttonH.mousePressed(function() {
    classifier.addImage('white');
    select('#amountOfwhiteImages').html(++whiteimg);
  });
  buttonI = select('#brownButton');
  buttonI.mousePressed(function() {
    classifier.addImage('brown');
    select('#amountOfbrownImages').html(++brownimg);
  });
  // Train Button
  train = select('#train');
  train.mousePressed(function() {
    classifier.train(function(lossValue) {
      if (lossValue) {
        loss = lossValue;
        select('#loss').html('Loss: ' + loss);
      } else {
        select('#loss').html('Done Training! Final Loss: ' + loss);
      }
    });
  });
  // Predict Button
  buttonPredict = select('#buttonPredict');
  buttonPredict.mousePressed(classify);
  // Save model
  saveBtn = select('#save');
  saveBtn.mousePressed(function() {
    classifier.save();
  });
  // Load model
  loadBtn = select('#load');
  loadBtn.changed(function() {
    classifier.load(loadBtn.elt.files, function() {
      select('#modelStatus').html('Custom Model Loaded!');
    });
  });
}

// Show the results
function gotResults(err, result) {
  // Display any error
  if (err) {
    console.error(err);
  }
  select('#result').html(result);
}


  <meta charset="UTF-8">
  <script src="p5.min.js"></script>
  <script src="p5.dom.min.js"></script>
  <script src="ml5.min.js" type="text/javascript"></script>
  <style>
  button {
    margin: 2px;
    padding: 4px;
  }
  img {
    width: 300px;
    height: 300px;
  }
  p {
    display: inline;
    font-size: 14px;
    margin: 4px;
  }
  h6 {
    font-weight: lighter;
    font-size: 14px;
    margin-bottom: 10px;
  }
  </style>


  <div id="imageContainer"></div>
  <h6><span id="modelStatus">Loading base model...</span> | <span id="videoStatus">Loading video...</span></h6>
    <button id="blackButton">Add Black</button>
    <p><span id="amountOfBlackImages">0</span> Black</p>
    <br><button id="redButton">Add Red</button>
    <p><span id="amountOfRedImages">0</span> Red</p>
    <br><button id="orangeButton">Add Orange</button>
    <p><span id="amountOfOrangeImages">0</span> Orange</p>
    <br><button id="yellowButton">Add Yellow</button>
    <p><span id="amountOfYellowImages">0</span> Yellow</p>
    <br><button id="greenButton">Add Green</button>
    <p><span id="amountOfGreenImages">0</span> Green</p>
    <br><button id="blueButton">Add Blue</button>
    <p><span id="amountOfblueImages">0</span> Blue</p>
    <br><button id="purpleButton">Add Purple</button>
    <p><span id="amountOfpurpleImages">0</span> Purple</p>
    <br><button id="whiteButton">Add White</button>
    <p><span id="amountOfwhiteImages">0</span> White</p>
    <br><button id="brownButton">Add Brown</button>
    <p><span id="amountOfbrownImages">0</span> Brown</p>
  <button id="train">Train</button><span id="loss"></span>
    <button id="buttonPredict">Start guessing!</button><br>
    Your custom model labeled this as: <span id="result">...</span>
  <button id="save">Save</button> 
  <label for="avatar">Load Model:</label>
  <input type="file" id="load" multiple=""> 
  <script src="sketch.js"></script>




Algorithms are very much present in our everyday lives, particularly in the consumer sphere. Corporations like Facebook and Google track our activity across websites so that they can display the most relevant ads or show us the posts they think we’d find interesting on social media. At this point it certainly feels like these companies have access to our microphones and webcams as well: if you’re talking about buying dog toys, even if you never search up “dog toys,” Google will display dog toy ads. While I don’t think algorithms like these really run our lives, I do think they lure us to buy more stuff, like ads always have. Capitalism and all that. I don’t necessarily think they change the way we go about life.

However, Joy Buolamwini’s talk about algorithmic bias brings another lens to this conversation. It’s almost funny how bias factors into so many things, even something as seemingly neutral as coding. Sometimes I forget that computer scientists have been historically white and male, and still make up the majority of the field; it’s probably because ITP and IMA are quite diverse, which I appreciate. The historical lack of diversity has taken its toll, though. Luckily, I think this can be a fairly simple wrong to right. After all, code can be easily shared and accessed, and in the case of machine learning, anyone can contribute to training sets. Initiatives like Joy’s Coded Gaze and Ari’s Afrotechtopia are truly important moving forward, as we think about the impact and reach of technology.

Project Progress

I got the sensor to work, and it turns out it wasn’t broken at all. I just needed to solder it. So, that made me livid, but anyway.


She lives

I spent the weekend getting the physical part together. It’s not exactly a core component, since the sensor doesn’t need a fancy-looking encasement, but it was the most practical move since I’ll be heading home for Thanksgiving, where I wouldn’t be able to laser cut anything.

Laser cutting template



The next step on my side would be to translate the Processing code to p5, so that the sensor can control the yaw/pitch/roll for WEBGL objects. Apparently, Processing and p5 are cousins, so it shouldn’t be overly difficult. I might eat those words in a hot second though.

Block Diagram:


End My Suffering

I tried being cool by finding other API keys to code with, but they didn’t really work out.

The first one I tried was with eBay, which I could get a JSON response from, but for some reason couldn’t get the data to load into the code. Scrapped.

The second one was from Flickr, which I wanted to use to make some kind of color palette out of a set of search results. I got the data to load from the search call this time, but not for individual images, even though I was following the documentation? So I was stumped, again. The partially successful code is below:

let domain = ""; // Flickr REST endpoint goes here
let apikey = "&api_key=df3ad9406618a769c6577eaa7b778fa1";
let tag;
let query = "&per_page=500&tags=";
let format = "&is_commons=true&format=json&nojsoncallback=1";

let url;
let imgurl;
let img;

let userinput;
let button;

function setup() {
  userinput = createInput();
  button = createButton('submit');
  button.mousePressed(makeRequest);
  createCanvas(400, 400);
}

function makeRequest() {
  tag = userinput.value();
  url = domain + apikey + query + tag + format;
  loadJSON(url, getData);
}

function getData(data) {
  // grab the pieces of the first search result needed to build a photo URL
  let farmid = data.photos.photo[0].farm;
  let serverid = data.photos.photo[0].server;
  let id = data.photos.photo[0].id;
  let secret = data.photos.photo[0].secret;
  imgurl = "https://farm" + farmid + ".staticflickr.com/" + serverid + "/" + id + "_" + secret + "_s.jpg";
  img = loadImage(imgurl);
}

function draw() {
}

function mousePressed() {
  if (img) {
    image(img, mouseX, mouseY);
  }
}
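For what it’s worth, the URL assembly follows Flickr’s documented static-photo pattern (farm, server, id, secret). Here it is as a standalone function, fed a made-up photo record:

```javascript
// Build a Flickr static-photo URL from one photo record in a
// flickr.photos.search response (pattern from Flickr's docs;
// the photo object below is made up for illustration).
function flickrThumbUrl(photo) {
  return 'https://farm' + photo.farm + '.staticflickr.com/' +
         photo.server + '/' + photo.id + '_' + photo.secret + '_s.jpg';
}

const fakePhoto = { farm: 5, server: '4444', id: '123456', secret: 'abcdef' };
// → "https://farm5.staticflickr.com/4444/123456_abcdef_s.jpg"
```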

And then I gave up and went back to the OpenWeather APIs. I took the UV data and made a sun that gets bigger or smaller depending on the UV index. The UV API is in beta, though, so the location inputs are based on latitude and longitude instead of city names.


God this is the dumbest thing I’ve ever made and I hate it but I’ve spent too much time with this API-induced migraine. I never want to touch them again, bye.

the thing:

the code:

let domain = ""; // OpenWeather UV endpoint goes here
let apikey = "&appid=4069f66b6c19b053d48d1a1d88750183";
let lat = "&lat=";
let lon = "&lon=";

let latslider;
let lonslider;

let latval, lonval;

let url;

let button;

let uvdata;

function setup() {
  latslider = createSlider(-90, 90);
  lonslider = createSlider(-180, 180);

  button = createButton('submit');
  button.mousePressed(makeRequest);

  createCanvas(400, 400);
}

function makeRequest() {
  url = domain + apikey + lat + latslider.value() + lon + lonslider.value();
  loadJSON(url, getData);
}

function getData(data) {
  // hold on to the response; draw() reads the UV index from it
  uvdata = data;
}

function draw() {
  background(0, 204, 255);
  if (uvdata) {
    let sz = map(uvdata.value, 0, 10, 50, 200);
    fill(255, 255, 0);
    ellipse(width / 2, height / 2, sz, sz);
  }
}
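The whole sketch really hinges on one line: map() rescaling the UV index (0–10) into a sun diameter (50–200). The same arithmetic without p5, in case anyone wants to sanity-check it:

```javascript
// p5's map(value, inMin, inMax, outMin, outMax) is just linear
// rescaling; reimplemented here so the sun-sizing line can be
// checked in isolation (function name is mine).
function rescale(value, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * ((value - inMin) / (inMax - inMin));
}

// UV index 0 → a 50 px sun, UV index 10 → a 200 px sun
const size = rescale(5, 0, 10, 50, 200); // → 125
```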


Prototypes + References

So, I really wanted to get a paper prototype together and the 6DOF sensor working this weekend, but I think I broke the thing before I could even get it to work? Or it arrived dead.


Connections: VCC to 5V, GND to GND, SCL to A5, SDA to A4, AD0 to GND, INT to D2

I’m waiting on another one to get here from Amazon, because MPU6050s cost $5 online and $30 at Tinkersphere, and uh…no. Fingers crossed for this next one.

I put the paper prototypes together, though, so they’ll be ready when I finally get the sensor to work.

Outer part:


I cut out long strips of paper to make each unit, then assembled them together into the sphere (see below).



The goal is to suspend the core inside the sphere, but I think that it might be too flimsy? I tried threading string through the core and tying it to points on the sphere, but it didn’t really hold, and it distorted the shape of the sphere. I’m thinking I’ll learn how to use the laser cutter so I can make the sphere out of cardboard or something more substantial instead, or I’ll create a wire frame inside that’ll prevent it from caving in on itself. For now, this is what I have:



DodecaLEDron (planet explorer/customizer) (another dodecahedron alternative controller) (mpu6050 stuff)



Spring, but gothic? I guess? Have I gotten more emo with my posts? Yes.

Fractal trees are dangerous, and this thing almost crashed my browser at least six times.


let song;
let angle;
let coef;
let l;
let branches;
let steps;

function preload() {
  song = loadSound('assets/spring.mp3');
}

function setup() {
  createCanvas(400, 400);
  song.loop();
  // leftover from an earlier object-based attempt (Branch class not shown):
  // let a = createVector(width / 2, height);
  // let b = createVector(width / 2, height - 100);
  // let root = new Branch(a, b);
}

function draw() {
  background(20);
  stroke(255);
  let vol = map(mouseY, 0, height, 1, 0);
  song.setVolume(vol);
  let fr = map(mouseY, 0, height, 1, -1); // (unused)
  l = map(mouseY, 0, width, 200, 50);
  branches = map(mouseY, 0, height, 3, 1);
  angle = map(mouseY, 0, height, 1.5, 0);
  coef = map(mouseY, 0, height, 0.5, 0);
  steps = map(mouseY, 0, height, 7, 0);
  translate(width / 2, height);
  branch(l, steps);
}

function branch(len, s) {
  if (s > 0) {
    let bcoef = angle / branches;
    // draw this segment, then recurse into each child branch
    line(0, 0, 0, -len);
    translate(0, -len);
    for (let i = 1; i <= branches; i++) {
      push();
      rotate(angle - 2 * bcoef * i);
      branch(len * coef, s - 1);
      pop();
    }
  }
}
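The near-crashes make sense on paper: the number of segments grows geometrically with recursion depth, b + b² + … + bˢ for b branches and s steps, so nudging the mouse can explode the draw cost. A quick count function (made-up helper) to show the blow-up:

```javascript
// Total segments drawn by a fractal tree with `branches` children
// per node and `steps` levels of recursion:
// branches + branches^2 + ... + branches^steps
function totalSegments(branches, steps) {
  let total = 0;
  let atLevel = 1;
  for (let s = 0; s < steps; s++) {
    atLevel *= branches; // segments at this depth
    total += atLevel;
  }
  return total;
}

// 3 branches at 7 steps (the sketch's maximums): 3 + 9 + ... + 2187 = 3279
```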


This week’s obligatory coding challenge that saved my ass:

And this other one:


I feel that there’s something sinister about globalization, about its historical context and the implications it has for the present and future. Because even if imperialism and colonization are out of fashion, we still live in a Western-centric world. English is the lingua franca. Language is a powerful instrument of control, and it was used by colonizers in Kenya to dissociate students from their own cultural environment by deliberately undervaluing their indigenous language in school. Women of color still bleach their skin or get their jaws shaved down and their noses fitted with silicone to look like white women, to fit global, Eurocentric beauty standards. The desire for a global market has contributed to the emergence and continuance of unethical (to say the least) working conditions, for both lithium miners and Amazon warehouse workers. Things are changing, sure, but globalization has taken its toll. I think that at least now the goal of globalization isn’t inherently malicious, but we tend to be detached from the situation, and might conclude that the ends justify the means.

One part of the globalization of technology in particular that I see as a positive is the spread of open-source hardware and software. It lays bare the inner workings of technology, a bit of an antithesis to the sealed, enigmatic shell of the Amazon Echo. Arduino boards, for example, are open-source, available for anyone to recreate or modify in their own designs. They invite the consumer to take things into their own hands, to investigate and tinker, to figure out what makes their machines tick and to build their own for personalized or public use. The consumer is then elevated; they can become sovereign over technology. I believe in putting knowledge and power in the hands of the many, and I think that globalization that involves the unbiased sharing of information, such as open-source code, plays a pivotal role in achieving this.


RM dropped mono and an MV for forever rain so


of course I’ve been really emotional about it for the past week and decided to base my assignment on it.

I made some interactive rainfall. There’s not much to it, but it’s pretty satisfying to watch the rain go by, and rain sounds nice. I watched these two Coding Challenge videos surrounding particle systems to get the vectors and forces:



Initially, I was going to go for something practical – a guitar amp was the first idea to come to mind. After talking to some of my friends from the other section, though, whose assignment was literally to create a “pet”, I decided to make a tamagotchi-esque goldfish. It follows the cursor around (supposed to represent the user’s finger), which is controlled by a joystick, and you can feed it by pressing a button. It responds to signs of affection, like being fed, with a heart.




p5js code:

let fish;
let serial;
let xpos, ypos, button;

let deltaX = 0.0, deltaY = 0.0;
var accelX = 0.0, accelY = 0.0;
var springing = 0.0009, damping = 0.98;

function setup() {
  // Instantiate our SerialPort object
  serial = new p5.SerialPort();

  // Let's list the ports available
  var portlist = serial.list();

  // Assuming our Arduino is connected, let's open the connection to it
  // Change this to the name of your arduino's serial port
  serial.open("/dev/cu.usbmodem14301");

  // Register some callbacks

  // When we connect to the underlying server
  serial.on('connected', serverConnected);

  // When we get a list of serial ports that are available
  serial.on('list', gotList);

  // When we get some data from the serial port
  serial.on('data', gotData);

  // When or if we get an error
  serial.on('error', gotError);

  // When our serial port is opened and ready for read/write
  serial.on('open', gotOpen);

  createCanvas(400, 400);
  fish = new Fish();
  print(fish.x, fish.y);
}

function gotData() {
  var currentString = serial.readStringUntil("\r\n");
  if (currentString.length > 1) {
    var values = split(currentString, ','); // split the string on the commas
    if (values.length > 1) {
      xpos = map(values[0], 0, 1024, 0, width);
      ypos = map(values[1], 0, 1024, height, 0);
      button = values[2];
    }
  }
}

function draw() {
  background(12, 169, 255);
  if (xpos === undefined) return; // wait for the first serial reading
  followCursor();
  fish.follow();
  fish.show();
  if (button > 0) {
    feed();
    fish.heart();
  }
}

function followCursor() {
  ellipse(xpos, ypos, 20, 20);
}

function feed() {
  // scatter food flakes above the fish
  for (let x = 0; x < width; x += 50) {
    let y = random(200);
    if (y < fish.y) {
      fill(206, 114, 74);
      ellipse(x, y, 8, 8);
    }
  }
}

function serverConnected() {
  print("We are connected!");
}

// Got the list of ports
function gotList(thelist) {
  // theList is an array of their names
  for (var i = 0; i < thelist.length; i++) {
    // Display in the console
    print(i + " " + thelist[i]);
  }
}

// Connected to our serial device
function gotOpen() {
  print("Serial Port is open!");
}

// Ut oh, here is an error, let's log it
function gotError(theerror) {
  print(theerror);
}

class Fish {
  constructor() {
    this.x = width / 2;
    this.y = height / 2;
  }

  follow() {
    deltaX = xpos - this.x;
    deltaY = ypos - this.y;
    deltaX *= springing;
    deltaY *= springing;
    accelX += deltaX;
    accelY += deltaY;

    // move the fish's center
    this.x += accelX;
    this.y += accelY;

    // slow down springing
    accelX *= damping;
    accelY *= damping;
  }

  heart() {
    // a little dot of appreciation above the fish
    fill(255, 105, 180);
    ellipse(this.x, this.y - 30, 12, 12);
  }

  show() {
    fill(255, 105, 12);
    rect(this.x, this.y, 30, 30);
  }
}
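The fish’s lag behind the cursor comes from the springing/damping pair: each frame, the offset to the target is scaled way down (springing = 0.0009) and folded into a velocity that slowly decays (damping = 0.98). Here’s a one-dimensional, p5-free version of that integrator, just to show it really does settle on the target:

```javascript
// 1-D version of the fish's spring-follow: a tiny spring constant
// pulls toward the target, damping bleeds off velocity each frame.
// (Toy helper, not the sketch's actual code.)
function springStep(pos, vel, target, springing = 0.0009, damping = 0.98) {
  vel += (target - pos) * springing;
  vel *= damping;
  return [pos + vel, vel];
}

let pos = 0, vel = 0;
for (let i = 0; i < 5000; i++) {
  [pos, vel] = springStep(pos, vel, 100);
}
// after enough frames, pos has crept essentially all the way to 100
```

The small spring constant is what makes the motion fish-like: it overshoots a little, wobbles, and settles instead of snapping to the cursor.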

Arduino code:

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  pinMode(2, INPUT); // joystick button (wired so pressed reads HIGH)
}

void loop() {
  // put your main code here, to run repeatedly:
  int Xpin = analogRead(A1);
  int Ypin = analogRead(A0);
  int button = digitalRead(2);

  Serial.print(Xpin);
  Serial.print(",");
  Serial.print(Ypin);
  Serial.print(",");
  Serial.println(button);
  delay(10);
}





Visual Spectrum

I wanted to make some kind of visualization of the color spectrum, where a user could control the color and wavelength of a sine wave type object with a slider. My search for a cool-looking sine wave led me to this video:

I think this kind of stuff is much more useful than the basic tutorial videos or blindly scrolling through the references. It’s a good medium, for me at least, since it introduced me to terms I didn’t know before, and I was able to follow along with the coding and breakdown process. After the video, I also took what I learned about mapping and lighting to get the end result:


let w = 30;
let angle = 0;
let ma;
let maxD;
let nm;

let wavelength;
let slider;
let div;

function setup() {
  slider = createSlider(400, 750);
  slider.input(gatherWavelength);
  wavelength = slider.value();
  div = createDiv("Current wavelength: " + wavelength);
  createCanvas(400, 400, WEBGL);
  ma = atan(1 / sqrt(2)); // "magic angle" for the isometric-ish view
  maxD = dist(0, 0, 200, 200);
}

function gatherWavelength() {
  wavelength = slider.value();
  div.html("Current wavelength: " + wavelength);
}

function draw() {
  background(0);
  // shift the light color as the slider moves across the spectrum
  let red = map(slider.value(), 400, 625, 255, 0);
  pointLight(red, 255, 255, 0, 0, 400);
  pointLight(red, 50, 100, -300, -300, height / 2);
  directionalLight(red, 150, 150, -0.8, -0.8, 0);
  ortho(-400, 400, 400, -400, 0, 1000);
  rotateX(ma);
  rotateY(QUARTER_PI);
  for (let z = 0; z < width; z += w) {
    for (let x = 0; x < width; x += w) {
      push();
      let d = dist(x, z, width / 2, height / 2);
      let offset = map(d, 0, maxD, -PI, PI);
      let a = angle + offset;
      let h = map(sin(a), -1, 1, 0, 200);
      translate(x - width / 2, 0, z - width / 2);
      box(w - 2, h, w - 2);
      pop();
    }
  }
  // higher wavelengths ripple more slowly
  nm = map(slider.value(), 400, 750, 0.3, 0.1);
  angle -= nm;
}




Nonlinear Thinking + Digitization

I like to think that I’m fairly non-linear in my thinking, or that I at least strive to be. One of the basic aspects of this brand of thinking, I believe, is being flexible and pursuing a web of possibilities rather than sticking to a single, rigid plan. When it comes to my creative process especially, I’ll develop the rough idea for an illustration or piece of writing or any other project, then open up around 20 tabs to research a variety of ways I can go about getting to that end result, or another result entirely. I allow myself to bounce around in the headspace, be influenced along the way, and respond to changes. Linear thinking tends to put me in a rut: when I try to make myself follow specific steps, I spend so much time thinking about how to adhere to the structure that I get stuck.

Open source and digitization, then, are very valuable to me and to all of us. They create for us the web from which we pull inspiration. It is truly through the 20 tabs of walk cycle tutorials and coding challenges, and infinite paths of digital information, that we can amalgamate ideas into something new, bold, and potentially important.

Communication and Evolution

Communication systems started with the most basic building blocks of symbols, representing the bare necessities of humanity at the time. As our needs and expressions became increasingly complex and diverse, communication systems evolved with us. When it became necessary to teach these systems to others, for example children or those who spoke a different language, humans transitioned to alphabets, as if instinctually. The exceptions are Chinese and, in part, Japanese, which kept characters as part of their systems; so naturally, Chinese characters and Japanese kanji are a struggle for foreign language learners. Technology has inevitably become a core aspect of communication as well. We do most of our communicating over texts, emails, and phone calls, and what I’m doing right now, as I write this post, is also a form of communication. While the typical argument has been that this kind of communication makes us emotionally numb and unresponsive, I would counter that it amplifies the power and reach of communication in new and profound ways. We can shoot a quick text to our friends to make plans within a few minutes, or call our parents when we miss them. Nothing quite compares to spending time with friends, though. Communication is made up of words, images, and above all, the physical aspect. Technology is cool and convenient, but it needs to be utilized in tandem with our human communication.

Cultural evolution involves societal and intangible change. To make things even more complex, the transfer of information in cultural evolution is network-like, a result of communication systems. Human creativity and mind reading, the specifically human capacity of being aware of what other people have in mind, are the engines of innovation, invention, and cultural evolution.

Jack in the Box (WIP)

Still a work in progress with a lot of kinks, but it’s proof of concept at least. We decided to make a jack-in-the-box, with the item inside being of conductive material so that it could connect to the next group’s like a basic switch. We’ll probably add in the “Pop Goes the Weasel” tune on a buzzer or speaker just for fun.

We didn’t get a chance to test it out with the groups before and after us, but we did know that the group before us used an LED as their output. We incorporated that into the schematic for now.

Photos and gifs:




#include <Servo.h>

int photocellPin = A0;
int light = 0;
Servo jitbservo;
int pos = 0;

void setup() {
// put your setup code here, to run once:
Serial.begin(9600);
jitbservo.attach(9); // servo signal pin (assumed here; check the schematic)
}

void loop() {
// put your main code here, to run repeatedly:
light = analogRead(photocellPin);
light = light / 2;
Serial.println(light);
if (light > 410) {
  // enough light on the photocell: sweep the lid open
  for (pos = 0; pos <= 90; pos += 1) {
    jitbservo.write(pos);
    delay(15);
  }
} else {
  // sweep back closed
  for (pos = 90; pos >= 0; pos -= 1) {
    jitbservo.write(pos);
    delay(15);
  }
}
}


Bodies, Emotions, the World

Our bodies are the sensory receptors of our experiences. They also serve as the output source for our reactions to these experiences, whether emotional, physical, or, oftentimes, a combination of both. For example, annoyance is always accompanied by at least a side glance, at least for me. The body is greatly influenced by metaphorical thought as well, and this translates to our perception of the world. In a way, bodies are like computers, receiving inputs and executing commands; they’re just a lot frailer, more emotional, and more flawed. It’s cool though. That’s the human experience.

While it would be nice if our decisions were driven by rationality, they’re controlled by emotions at least 90% of the time. Like right now: I’m up at 2 AM doing homework, even though the rational thing would be to take care of myself and go to sleep, but I’m in a weird state of sadness that’s making me want to torture myself. So here we are. Rationality is cool, and listening to other people try and reason with you is cool, but y’know what’s even cooler? Just doing everything based on impulse and regretting it later.

Computers are bridges between bodies and emotions. Through them, we can connect with different people around the world. I found all my favorite artists through social media, so now I can keep up with their work and draw inspiration from various sources all in my Instagram feed. If we’re talking about how computers themselves can reach us emotionally, as opposed to the content or functions of computers, we’re getting into Siri territory. I refer to her specifically because of that article on how Siri represents feminized labor and technologies. This kind of information contextualizes and humanizes the computer somewhat, so that it might become interesting on a more personal level.

Mr. Coffee (Kinda)

The dream was that I would hook up that coffee maker in the junk shelf to my Arduino. I wanted the coffee maker to brew either more or less coffee depending on someone’s energy level. The energy level would be determined by the bend in a person’s neck, measured by a flex sensor. For the expressivity aspect, I also wanted to give this system a bit of personality, so I added an LCD display that says “I’m Mr. Coffee!” and shows the “status” of the user. The whole idea didn’t quite convert to reality, for a couple of reasons.

First off, I couldn’t get my hands on a PowerSwitch Tail over the weekend for the life of me. The coffee maker is still sitting in my locker, waiting for a PSSR, but for now, I’ve substituted it with a test LED.



Second, all the tape in the world couldn’t secure the flex sensor to the back of my neck, so here’s me trying to give the effect of how it would work:

(Note: This started getting a little glitchy as the hours wore on, so the dimming/brightening wasn’t as smooth.)

I’ll be the first to say that this was kind of a failure. Except for the LCD display, which was my one good idea that worked out. Still, I stand by the Mr. Coffee concept, and I still think it’s feasible with more time.

Anyway, here’s the code and schematic. I’m hoping that I can return to this at one point, because that coffee maker calls to me.

#include <LiquidCrystal.h>

const int rs = 12, en = 11, d4 = 5, d5 = 4, d6 = 3, d7 = 2;
LiquidCrystal lcd(rs, en, d4, d5, d6, d7);

int flexPin = A0; // was analogRead(A0), which only samples once at boot
int mrCoffee = 9;
int value;

void setup() {
  // put your setup code here, to run once:
  lcd.begin(16, 2);
  lcd.print("I'm Mr. Coffee!");
  pinMode(mrCoffee, OUTPUT);
}

void loop() {
  // put your main code here, to run repeatedly:
  lcd.setCursor(0, 1);
  value = analogRead(flexPin);
  value = map(value, 700, 900, 0, 255);
  value = constrain(value, 0, 255);
  analogWrite(mrCoffee, 255 - value); // inverted: more neck bend, dimmer LED

  // status strings below are placeholders
  if (value >= 0 && value < 20) {
    lcd.print("Status: awake   ");
  } else if (value >= 20 && value < 40) {
    lcd.print("Status: okay    ");
  } else if (value >= 40 && value < 60) {
    lcd.print("Status: tired   ");
  } else if (value >= 60 && value < 80) {
    lcd.print("Status: zombie  ");
  }
}


And sources that I used:


While I would like to think of myself as a rational person, I know all too well that my moral decisions are almost always based on my emotional state or relationships, some of the most volatile factors in life. Living in my brain has been one of the most nerve-wracking anxiety adventures ever since I realized how fragile the two pillars of my judgement are. When I think I’ve made the right, morally good decision, I regret it about 5 minutes later, backtracking to analyze everything I did wrong. More often than not, in nearly every scenario that goes slightly awry, I come to the conclusion that I am to blame, that the ultimate epitome of moral righteousness comes down to taking the blame onto myself. It’s almost always been the easiest route, always the right one for me.

Technology, or more accurately, media, acts as an unconscious influence on our morals. Too often, it clouds my judgement. From behind a screen, people are less like people, and I start to idolize certain “people”, certain types of behavior, certain sets of morals. When I try to apply them to my own life, let’s just say that the ideal of maintaining a sarcastic cool girl image has backfired on me more than a couple of times when trying to make moral decisions.

The struggle of making moral decisions is one of the defining aspects of humanity. The whole point is that we wrestle with and evolve with morality so that we may come to a better understanding of it, even if it seems like a fruitless, tedious endeavor. It’s how we can even slightly confront the world before us. If that responsibility fell to machines, I think we would all lose our collective minds. Machines run on algorithms, nothing more and nothing less. Even if we were to program them with some kind of moral algorithm, they would never see the context, and something would always feel wrong with their decisions. The pillars that uphold my judgement may shake at the slightest disturbance, but they’re the only foundation I have. It’s not something I can give to a machine.

Humidity Switch v.2

I wanted to make some improvements to the humidity switch so that it was more obvious when the humidity was too low, just right, or too high. I decided to add another light to the circuit, and to specify the colors of the LEDs: green and red. These colors have pretty universal meanings, so I figured that they would be the most practical. In this newer version, the red light blinks to signal to the owner that their pet’s cage is in need of misting, or that the humidifier needs to be turned on. When the correct level of humidity is reached (still 80%-90% for now), the red light turns off, and the green light turns on. If the humidity gets too high, the green light turns off again, and the red light turns on. I think this makes the switch much easier to understand and use.

Green LED + red LED + DHT11
Updated schematic

Screenshots of the code

I touched a wet tissue to the module this time so that I wouldn’t break the thing again by spraying water into it.

Here’s a short video of the switch doing its thing.

Also, some other stuff that I didn’t put in my first post: (DHT library and tester code) (This guy made something similar but for hamsters and much more elaborate, w/ automated humidifiers and heaters.)

Media and the Universal Machine

Both computational media and traditional media serve the basic purpose of distributing information to a population. However, computational media refers to this distribution through means of technology, such as social media or video games, while traditional media makes use of books, newspapers, or radio. Perhaps most importantly, the former prioritizes some form of active interaction between the information/story, the user, and at times even the external environment, while the latter provides a kind of passive, private interaction that occurs between only the audience and the piece of media. Wikipedia is a good example of computational media, combining the features of an online encyclopedia with the creative participation of contributors in order to benefit its readers: technology + information + interaction + external influence. I believe that something like reading a paperback book also counts as interaction, but one that occurs solely within the individual.

At the same time, the division between computational and traditional media seems trivial to me, since there are so many points of overlap. If, instead of buying a physical album, I streamed it on Spotify, then music becomes new media. If I had a book on a Kindle instead of in paperback, the book is now new media. Eventually, everything seems to travel into the domain of “new media” and blurs together into a blob of just “media”. Even the line between the producers of each kind of media blurs, as evidenced by the fact that this major exists and that the only thing I had to show when applying was my illustration portfolio, yet here I am, learning how to code and make circuits.

In my mind, a Universal Machine should be something that is applicable to all people, everywhere, regardless of economic status or living conditions. However, the only things that might fit into this category are simple machines like levers and pulleys. And while they are machines, they don’t carry with them the intrinsic idealism we like to link so closely to machines. In this case, there is perhaps no machine that is truly universal, only machines that are “universal” in the technologically-advanced areas of the world, since those are the places that present themselves as being at the forefront, as representatives of the world as a whole.

Humidity/Water Switch

Small creatures rule my life, so I thought I’d make a switch inspired by one scaly category of them: reptiles. Their habitats require controlled levels of humidity, so reptile owners need something to monitor cage conditions. This switch is based on the humidity gauges that they usually use. The LED turns on once the DHT11 senses that the humidity has reached desired levels. In this case, we’ll say between 80%-90%. It turns off when the humidity drops below 80% or goes higher than 90%.

First, I had to figure out how to wire and set up the DHT11. Well, a bunch of people had already figured it out for me, but I had to tinker with the wires a little to get it to work. Otherwise, I just uploaded a default DHT11 sketch to my Uno.

Initial setup for DHT11
First attempt: DHT11 sketch + module reading the temperature and humidity in the suite

It seemed easy enough, but what followed was a lot more trial and error and modifications than I expected. I thought that all I would have to do was add some lines to the sketch so that the LED could be triggered to turn on and off depending on the DHT11’s recorded humidity. I ended up downloading another DHT library and running an entirely different code in order to get the new circuit with the LED to work.

DHT11 and LED hooked up to Uno

The good thing is that I eventually got it to run smoothly. In the gif below, I used a small spray bottle filled with water to mist the area around the module in order to increase humidity.

Before misting (<80%), after misting to 80%-90%, and after misting to >90%

And then I promptly broke the DHT11 after spraying water a little too directly at it. Whoops. It’s sitting in some rice right now.

Anyway, there you have it: a humidity gauge/automated light switch that tells reptile owners when their pets have the right humidity for them to be happy and healthy.

Beyond Binary?

Computers process the world through 1’s and 0’s, and though their patterns have become increasingly nuanced and advanced, their language is still primarily limited to two numbers. They are programmed into states of “true” or “false”, black or white, while individuals and society almost always exist in gray area. If computers could expand to base 3 or 4 and so on, there might be more room for error. The concept of creating a more inclusive device that might be integrated into society connects to the idea of machines gaining consciousness as well. True moral consciousness, after all, inevitably follows after the discovery of societal gray areas.