Category Archives: Prototype


For our prototype I’m working on the code in p5. Here is my code so far:

let angle = 0;
let sunRatio = 400;
let sunDistance = 200;
let mercText;
let earthText;

function preload() {
  // Load the textures before setup so they are ready on the first frame
  mercText = loadImage("mercuryTexture.jpeg");
  earthText = loadImage("earthTexture.jpeg");
}

function setup() {
  createCanvas(windowWidth, 600, WEBGL);
  sunDistance = (-windowWidth / 2) - 400;
}

function draw() {
  background(20, 20, 20);

  stroke(255, 255, 100); // sun
  fill(255, 220, 0);
  translate(sunDistance, 0, 0);
  // rotateY(frameCount * 0.01);
  sphere(sunRatio / 2); // sphere sizes are placeholders for now

  stroke(255, 220, 165); // mercury
  fill(85, 55, 20);
  translate(200 + 22.7, 0, 0);
  // rotateY(frameCount * 0.01);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(211, 113, 0); // venus
  stroke(255, 155, 0);
  translate(200 + 35.9, 0, 0);
  // rotateY(frameCount * 0.009);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(127, 208, 255); // earth
  stroke(235, 235, 255, 200);
  translate(200 + 50.8, 0, 0);
  // rotateY(frameCount * 0.008);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(231, 133, 0); // mars
  stroke(255, 155, 0);
  translate(200 + 75.4, 0, 0);
  // rotateY(frameCount * 0.007);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(188, 136, 84); // jupiter
  stroke(150, 150, 90);
  translate(200 + 256.1, 0, 0);
  // rotateY(frameCount * 0.006);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(214, 163, 61); // saturn
  stroke(224 + 20, 173 + 20, 71 + 20);
  translate(200 + 492.6, 0, 0);
  // rotateY(frameCount * 0.005);
  sphere(sunRatio / 9);
  angle += 0.01;
  // fill(255, 0, 0);
  // sphere(sunRatio / 9, 10, sunRatio / 9);

  fill(127, 208, 255); // uranus
  translate(200 + 1003.5, 0, 0);
  // rotateY(frameCount * 0.004);
  sphere(sunRatio / 9);
  angle += 0.01;

  fill(100, 100, 255); // neptune
  stroke(74, 44, 12);
  translate(200 + 1502, 0, 0);
  // rotateY(frameCount * 0.003);
  sphere(sunRatio / 9);
  angle += 0.01;
}



Here’s a screenshot of what it looks like when I run it:


Some of the planets are hard to see from here but they are much easier to see when actually on the computer.

Here is the link to the online version:

(The rotate calls are commented out in this version so I could make sure the textures on the planets looked right.)

I’m still working on getting the planets to rotate in place instead of around the sun, getting all the textures onto the planets, and adding the ability to zoom in with a mouse click. The rotation just needs a bit more research into WEBGL, the textures are easy to put in place, and the clicking will be the most work, but it is all doable!
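To sketch the rotate-in-place idea: wrapping each planet’s rotation in push()/pop() keeps the spin from affecting the translates that follow, so each sphere turns on its own axis instead of orbiting. This is just a sketch of the approach (drawPlanet is a helper name I made up, and the radii and spin rates are placeholders):

```javascript
// Spin-in-place sketch for the p5 WEBGL solar system.
// Assumes the p5 globals translate(), push(), pop(), rotateY(), sphere().
function drawPlanet(offsetX, radius, spin) {
  translate(offsetX, 0, 0); // step out along the orbit line (this persists)
  push();                   // isolate the rotation...
  rotateY(spin);            // ...so only this sphere spins in place
  sphere(radius);
  pop();                    // later translates are unaffected by the spin
}

// In draw(), each planet could then get its own spin rate, e.g.:
// drawPlanet(sunDistance, sunRatio / 2, 0);             // sun
// drawPlanet(200 + 22.7, sunRatio / 9, frameCount * 0.01);  // mercury
// drawPlanet(200 + 35.9, sunRatio / 9, frameCount * 0.009); // venus
```

Because the translate stays outside the push()/pop() pair, the planets keep their positions along the line while each one rotates independently.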

Project Progress

I got the sensor to work, and it turns out it wasn’t broken at all. I just needed to solder it. So, that made me livid, but anyway.


She lives

I spent the weekend getting the physical part together. It’s not exactly a core component, since the sensor doesn’t need a fancy-looking encasement, but it was the most practical move since I’ll be heading home for Thanksgiving, where I won’t have access to a laser cutter.

Laser cutting template



The next step on my side would be to translate the Processing code to p5, so that the sensor can control the yaw/pitch/roll for WEBGL objects. Apparently, Processing and p5 are cousins, so it shouldn’t be overly difficult. I might eat those words in a hot second though.
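The WEBGL rotation calls line up almost one-to-one between Processing and p5; the main detail is converting the sensor’s angles to radians before calling rotateX/Y/Z. A rough sketch, assuming the sensor reports yaw/pitch/roll in degrees (the helper names and the axis mapping are my guesses, not the final code):

```javascript
// Hypothetical sketch of applying sensor yaw/pitch/roll to a p5 WEBGL object.
function degToRad(deg) {
  return deg * Math.PI / 180;
}

// Assumed axis convention: yaw about z, pitch about x, roll about y.
// Call inside draw() before drawing the object.
function applyOrientation(yawDeg, pitchDeg, rollDeg) {
  rotateZ(degToRad(yawDeg));
  rotateX(degToRad(pitchDeg));
  rotateY(degToRad(rollDeg));
}
```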

Block Diagram:


Tangible Course Search – Proof of Concept

User Flow Diagram

Revised user flow

Physical Interface

I kept the physical interface relatively simple for this prototype, as most of the work has gone into programming the digital interface and connecting the physical to the digital. It’s currently just a simple potentiometer connected to the Arduino’s analog pin A2. The plan is to 3D print a ‘top’ for the potentiometer (along these lines) to make the component more welcoming to the user.

Server (Physical to Digital)

For the server, I reworked my previous Node + Johnny-Five application to allow for the expandability needed to add many more data sources. Now all of the variables are declared ahead of time, and the AJAX response is compacted into a single JSON object that is sent to the client. I tested the potentiometer values to ensure that they were being sent to the client correctly. I’ve also done some work on the backend of the course search with my team, though that isn’t included in this post as it isn’t part of the project yet.

// Variable Declaration
// Career Dial
const careerPotentiometerPin = 'A2';
var careerPotentiometerValue = 0; // 0 - 1023

// Node Server Setup Code
// Module Requirements
var express = require('express');
var path = require('path');
var app = express();

// Set public folder for client-side access
app.use(express.static(path.join(__dirname, 'public')));

// Send index.html at '/'
app.get('/', function(req, res){
    res.sendFile(path.join(__dirname, '/views/index.html'));
});

// Send AJAX data stream at '/data'
app.get('/data', function(req, res) {
    // Compile individual variables into object
    var dataToSendToClient = {
        'careerPotentiometerValue': careerPotentiometerValue
    };
    // Convert javascript object to JSON
    var JSONdata = JSON.stringify(dataToSendToClient);
    // Send JSON to client
    res.send(JSONdata);
});

// Set app to port 8080
app.listen(8080);

// Log start of app
console.log("App Started");

// Johnny-Five Code
var five = require("johnny-five"),
    board, potentiometer;

board = new five.Board();

board.on("ready", function() {

    // Create a new `potentiometer` hardware instance.
    potentiometer = new five.Sensor({
        pin: careerPotentiometerPin,
        freq: 250
    });

    // "data": get the current reading from the potentiometer
    potentiometer.on("data", function() {
        console.log("Career Potentiometer: " + this.value);
        careerPotentiometerValue = this.value;
    });
});

Digital Interface

As part of the prototype, I have finished coding the first component of the digital interface: the Academic Career selection. It consists of a simple dial that rotates based on the potentiometer values it receives, selecting among the careers as classified by Albert. All data is meant to be manipulated physically, so there are no mouse targets in the digital interface.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <meta http-equiv="X-UA-Compatible" content="ie=edge">
    <title>Tangible Course Search</title>
    <link rel="stylesheet" href="/css/style.css">
    <script src="/lib/jquery.js"></script>
    <script src="/js/script.js"></script>
</head>
<body>
    <!--<div id="header">Tangible Course Search</div>-->
    <div id="dialBackground"></div>
    <div id="chooseCareer">
        <div id="chooseMed" class="chooseItem">Medical</div>
        <div id="chooseGrad" class="chooseItem">Graduate</div>
        <div id="chooseUndergrad" class="chooseItem">Undergraduate</div>
        <div id="chooseLaw" class="chooseItem">Law</div>
        <div id="chooseDentist" class="chooseItem">Dentistry</div>
    </div>
    <div id="chooseDial" class="dial"><div class="arrow-right"></div></div>
</body>
</html>
// On Document Ready
$(document).ready(function() {

    // Log to check onload
    console.log('JQuery Loaded');

    // Send AJAX call to server every 20ms
    setInterval(function() {
        $.ajax({
            url : 'http://localhost:8080/data',
            type : 'GET',
            dataType : 'json', // parse the server's JSON string automatically
            // On Request Success
            success : function(data) {
                // Loop through attributes of given JSON Object
                // to deconstruct object into variables
                for (var property in data) {
                    // Set old property value as var to track change
                    window[property + 'Old'] = window[property];
                    // Set property name as var equal to property
                    window[property] = data[property];
                }
            },
            // On Request Error
            error : function(request, error) {
                console.log("Request: " + JSON.stringify(request));
            }
        });
    }, 20);

    // Change Values On-Screen every 20 ms
    setInterval(function() {
        changeCareerDial(careerPotentiometerValue, careerPotentiometerValueOld);
    }, 20);
});

// Function Declaration
// Change Career Dial Elements
function changeCareerDial(potentiometerValue, potentiometerValueOld) {
    // If the potentiometer value has changed
    if (potentiometerValue != potentiometerValueOld) {
        // 1| 0 to 205
        if (potentiometerValue >= 0 && potentiometerValue < 205) {
            $("#chooseDial").css({'transform' : 'rotate(-55deg)'});
        }
        // 2| 206 to 411
        else if (potentiometerValue >= 206 && potentiometerValue < 411) {
            $("#chooseDial").css({'transform' : 'rotate(-35deg)'});
        }
        // 3| 412 to 617
        else if (potentiometerValue >= 412 && potentiometerValue < 617) {
            $("#chooseDial").css({'transform' : 'rotate(0deg)'});
        }
        // 4| 618 to 823
        else if (potentiometerValue >= 618 && potentiometerValue < 823) {
            $("#chooseDial").css({'transform' : 'rotate(35deg)'});
        }
        // 5| 824 to 1023
        else if (potentiometerValue >= 824 && potentiometerValue <= 1023) {
            $("#chooseDial").css({'transform' : 'rotate(55deg)'});
        }
    }
}
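The five-band if/else chain that sets the dial angle can also be written as a table lookup, which would make adding more careers a one-line change. A sketch (potToAngle and the angle table are hypothetical helpers, mirroring the ranges used above):

```javascript
// Map the 0-1023 potentiometer range into five equal sectors and look up
// the dial rotation. Angles match the five careers, top to bottom.
const dialAngles = [-55, -35, 0, 35, 55]; // degrees

function potToAngle(value) {
  // Clamp to the sensor's range, then scale into an index 0-4
  const clamped = Math.min(Math.max(value, 0), 1023);
  const index = Math.min(Math.floor(clamped / 205), dialAngles.length - 1);
  return dialAngles[index];
}

// Usage: $("#chooseDial").css({'transform': 'rotate(' + potToAngle(v) + 'deg)'});
```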
/* Animations */
@keyframes dialBreathe {
    0%   { width: 150vh; height: 150vh; top: -25vh; }
    50%  { width: 155vh; height: 155vh; top: -27.5vh; }
    100% { width: 150vh; height: 150vh; top: -25vh; }
}

/* Styles */
/* (Selectors below are inferred from the markup; the original nesting was
   lost when the stylesheet was pasted in.) */
body {
    margin: 0;
    padding: 0;
    background-color: #1f1d1d;
    font-family: sans-serif;
    width: 100vw;
    height: 100vh;
    overflow: hidden;
}

#header {
    color: #999999;
    font-size: 2vw;
    position: absolute;
    top: 1vw;
    left: 1vw;
    font-weight: bold;
}

#chooseCareer {
    position: absolute;
    top: calc(50vh - 14vw);
    left: 0;
    width: 10vw;
    height: 30vw;
    z-index: 3;
}

.chooseItem {
    text-align: center;
    color: #999999;
    font-size: 1.7vw;
    cursor: pointer;
    font-weight: bold;
    border-radius: 1vw;
    padding: 0.5vw;
    width: auto;
    transition: all 1s ease-in-out;
}

#chooseMed {
    margin-left: 8vw;
    color: #5e5e5e;
}

#chooseGrad {
    margin-left: 14vw;
    color: #7c7c7c;
}

#chooseUndergrad {
    margin-left: 16vw;
    text-shadow: -2px 0 #57068C, 0 2px #57068C, 2px 0 #57068C, 0 -2px #57068C;
}

#chooseLaw {
    margin-left: 14vw;
    color: #7c7c7c;
}

#chooseDentist {
    margin-left: 8vw;
    color: #5e5e5e;
}

.dial {
    width: 20vw;
    height: 20vw;
    position: absolute;
    top: calc(50vh - 10vw);
    left: -10vw;
    border-radius: 20vw;
    background-color: #0d0d0d;
    border: 1vw solid #1a1a1a;
    z-index: 4;
    transition: all 1s ease-in-out;
}

.arrow-right {
    width: 0;
    height: 0;
    border-top: 1vw solid transparent;
    border-bottom: 1vw solid transparent;
    border-left: 1vw solid #1a1a1a;
    position: absolute;
    right: 1.4vw;
    top: calc(10vw - 0.7vw);
}

#dialBackground {
    position: absolute;
    left: -50vh;
    top: -25vh;
    z-index: 2;
    height: 150vh;
    width: 150vh;
    border-radius: 100vh;
    animation-name: dialBreathe;
    animation-duration: 6s;
    animation-iteration-count: infinite;
    animation-timing-function: ease-in-out;
}

The next step for the project is to take the adapter that has been built up and connect it, as a test, to only the academic-careers section of the code. After that, there should be more work on both the physical and digital interfaces (though I’m not sure how much fabrication I’ll be able to do over the break, so it’ll probably be mostly digital work with Arduino prototyping).

We Got Sole Diagram – Holly & Christshon


Here’s our user block diagram! So far, we have been coding in Arduino, making various patterns for our RGB LEDs and using a button to set them off. We decided against single-colored LEDs so we can mix our own colors. We are also thinking about adding a fingerprint scanner so the lights cannot be changed or activated by anyone but the user who owns the shoes.

To take our RGB LEDs to the next level, we could add potentiometers and have the LEDs change depending on the potentiometer’s range.

Also, for the design aspect, we are hoping to deconstruct a shoe and put the components inside.

Next on our agenda is to think of different textures for our video mapping that would look cool, such as a colorful amoeba or a sunset.


I have had a really busy week and did not get everything I wanted to get done for my prototype, but I still made progress.

These are my animatic clips for my projections:


I did not get a chance to finish and record my poem yet, which is the basis of my project, so without it everything else is at a halt.

The block diagram for this project is rather straightforward; the user does not have many options. The story pauses if the required interaction is not met, or continues otherwise. The interactions will be based on the different sensors and inputs the user triggers.

This is my block diagram:





So far I have made the pinwheels spin on the screen. I also added a wind sound whose volume should change with how fast the pinwheels spin; however, I don’t know how to control the volume yet. I plan to add a fat cat opposite the pinwheels, with both scenes using voice as input. It should look interesting, because it will seem as if the pinwheels are blowing on the fat cat. I might use the DOM to animate the fat cat’s fur so that it looks like it is being blown by the wind the pinwheels create.
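For the volume question, p5.sound’s setVolume() takes an amplitude between 0 and 1, so one option is to normalize the spin speed into that range and clamp it. A sketch (spinToVolume is a made-up helper, and maxSpeed is an assumed calibration constant):

```javascript
// Map a pinwheel spin speed onto p5.sound's 0-1 amplitude range.
function spinToVolume(speed, maxSpeed) {
  const v = Math.abs(speed) / maxSpeed; // normalize against the fastest spin
  return Math.min(v, 1);                // clamp so loud input can't exceed 1
}

// In the sketch, assuming windSound is a loaded p5.SoundFile:
// windSound.setVolume(spinToVolume(currentSpinSpeed, 0.3));
```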


So far, I’ve gotten the basic code working and am waiting for my joysticks to come in the mail. Instead of having 10 joysticks in a keyboard layout, I think I’m going to have 6 positioned in a “video game controller” style (see pictures below). Right now, all the joysticks are in a sponge because I don’t know exactly what I want the case to look like or how I’m going to make it, and the sponge was a cheap, easy way to experiment.

Now I need to figure out what type of sounds I want it to make, how I’m going to make the case, and whether I still want to add a display. Also, I ran out of analog pins on the Arduino and don’t know what to do about that.

Other ideas:

Maybe, instead of having each joystick’s volume and speed be directly proportional, I could have the X axis control volume and the Y axis control speed, so each sound could have any combination of the two. I was also thinking of possibly making it a song remixer instead of whatever I was originally going for.
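The X-for-volume, Y-for-speed idea could be sketched as a plain mapping from the 0-1023 analog readings to audio parameters (mapRange re-implements p5’s map() so the snippet stands alone; the output ranges are assumptions, e.g. playback speed from 0.5x to 2x):

```javascript
// Linear range mapping, same idea as p5's map().
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
}

// One joystick: X axis -> volume (0-1), Y axis -> playback speed (0.5-2).
function joystickToSound(x, y) {
  return {
    volume: mapRange(x, 0, 1023, 0, 1),
    speed: mapRange(y, 0, 1023, 0.5, 2),
  };
}
```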


#define PIN_ANALOG_X 0
#define PIN_ANALOG_Y 1
#define PIN_ANALOG_X2 2
#define PIN_ANALOG_Y2 3
#define PIN_ANALOG_X3 4
#define PIN_ANALOG_Y3 5

void setup() {
  pinMode(A0, INPUT_PULLUP);
  pinMode(A1, INPUT_PULLUP);
  pinMode(A2, INPUT_PULLUP);
  pinMode(A3, INPUT_PULLUP);
  pinMode(A4, INPUT_PULLUP);
  pinMode(A5, INPUT_PULLUP);

  // pinMode(2, OUTPUT);
}

void loop() {
  // The reads belong inside loop() rather than at global scope,
  // so the values actually update on every pass
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A1);
  int sensor3 = analogRead(A2);
  int sensor4 = analogRead(A3);
  int sensor6 = analogRead(A4);
  int sensor7 = analogRead(A5);
}







Hold like a video game controller.





Block diagram:



Current Progress: Prototyping

After my materials finally arrived // over a week later >:( //, I was able to build a prototype of my lungs. I decided to use red latex balloons to create a “lung,” since I thought latex would be the easiest material to blow up, offering the least resistance to my fan. I purchased a 12V fan off Amazon, and while many customers claimed it was strong, I suspected it might not be as strong as I needed it to be. Turns out, I was right! It was not strong enough at all.


With the help of Professor David, I learned how to wire my 12V fan to a more powerful power supply. Even then, however, the fan was not powerful enough to inflate or deflate my prototype.

Thus, I decided to pivot my idea and take this physical concept into AR. Earlier this week, Ellen had asked me if I was incorporating my love for webcams and gifs into my final project, and I began to wonder why I had not. I started to brainstorm ways my project could translate into a webcam form and thought of a digital mirror in which a user could see their “lungs” and “heart.” When the user performed a task, they could see how these organs change biologically under anxiety (heart rate increase, etc.).

I then decided to prototype what that digital mirror could look like. Since I am going home for Thanksgiving, I knew I would have plenty of time to create the animation on my 6-hour flights (12 hours total).

I wanted to learn exactly what the animation needed to look like. I used p5 to write a simple sketch that mapped a heart, and then a lung, onto the webcam image. While p5 claims that it can “tint” images to be transparent, the tint option does not work on gifs. Luckily, there are many online resources that can make a gif transparent.

Here is my output (I cropped out my face because it was not very flattering, haha):


(above is a gif, click to play)


I learned that the organs need to be connected in some way that creates an implicit storyline; otherwise it feels weird. For instance, the heart just seems strange, and perhaps this is simply because it is mirrored. Nonetheless, steps should be taken to create a more finished look. I also know that the design must be high quality and look as close to a real “organ” as possible; the heart seems more impactful than the lungs, which were lower quality and less realistic.

I tested it out with a user, and they provided similar feedback. I will create an animation on my flight on Tuesday and will be able to test with more users over the break!

Here is my block diagram for my new concept:




I am able to connect the Kinect to the Mac with Processing, and I can determine where the center of the user’s hand is. I think this is serious progress. To demonstrate it, I drew an ellipse at the center of my hand, following Dan’s tutorial.

Prototype for triggering and playing different videos

This week we focused on triggering the different animations/videos using different inputs. We ran into a few problems. First, p5 didn’t take large files, so we used “dummy” videos that matched the file requirements. Then we coded the trigger in the online editor, and ran into problems there as well. We were using keyPressed and mousePressed as our makeshift triggers, but keyPressed never worked until Terrick, the grad student, helped us and told us we had to click on the canvas first. Once we figured that out and got it working, we downloaded the entire project to run it locally with the Atom text editor and Chrome, but we hit problems there too: Chrome blocked the videos.



So we ran the code in Safari instead.



Music Lessons? Anyone?

So I’ve actually made some pretty good progress. I have further to go, but I’m now moving at a good pace: I’ve created 3 lessons, though without my controller applied yet. From here I will build a more in-depth system with pictures and actual controller input. I will also create more lessons to explain more complex rhythms, and I’ll top the whole thing off with a full song or two so the user can apply everything they’ve learned. Next week I would like to finish and polish my lessons, and the week after I will apply the controller.

Here is the first lesson. I have put the lessons on Tumblr, and all the examples and exercises are in p5. Lessons 1-3 are basically finished for this prototype; instead of the actual controller, simple clicks will do.
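For grading the click-based exercises, one approach is to compare each click’s timestamp against the lesson’s beat grid. A sketch (onBeat is a hypothetical helper; the 100 ms tolerance is an assumed difficulty setting):

```javascript
// Was a tap (in ms since the exercise started) close enough to a beat?
function onBeat(tapMs, bpm, toleranceMs = 100) {
  const beatMs = 60000 / bpm;                         // duration of one beat
  const offset = tapMs % beatMs;                      // distance into the beat
  const distance = Math.min(offset, beatMs - offset); // to the nearest beat
  return distance <= toleranceMs;
}

// In p5, mousePressed() could call onBeat(millis() - startMs, lessonBpm)
// and color the click green or red accordingly.
```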

Also, here is the block diagram for how this project should go:

Bongo Lessons