Category Archives: Project Progress

Inspiration for Our Final Project – We Got Sole

Our inspiration for this project was the Nike Air Mag, a shoe popularized by its appearance in the movie Back to the Future and recreated by Nike in 2015, with Michael J. Fox receiving the first pair.

These shoes gained popularity for their unique, futuristic, out-of-this-world look and their unbelievable price tag.


What I love about the shoes is how pretty the lights are and how they flow with the shoe, giving it a hollow-like glow from within. They are integrated into the design. This is something that Christshon and I strive for. We hope to make our project interactive by giving the user the ability to choose the colors displayed on their feet, hopefully through a button or a potentiometer.

Official Final Project Documentation

Here is a copy of the project:

https://drive.google.com/drive/folders/1KwPtWHd_sW3Uts3uCWJijKG2_r8ZjxUl?usp=sharing

Here is the project in action during the Sunday show.

While you cannot hear what the user is hearing, you can see what the user sees on the monitor before them and their reaction during the simulation.


Tools Used:

Illustrator, Photoshop, After Effects, Audition, p5.play, p5.js, PoseNet, Brackets, FreeSounds.com, and a lot of office hours

Process:

Creating a Prototype

The process of achieving the final result was surprisingly complicated. For my first step, I took free images of body parts online (lungs, heart, and veins), made them transparent in Photoshop, and then animated them in Adobe After Effects.


I then created a simple subway animation that would be masked to reveal the user, creating a "background" of sorts. Since I was unsure whether users would resonate with the subway background, I initially used free stock footage. I also created two text animations: one that gives users context before the simulation and one that provides closure afterwards.


 

Once these first-draft animations of the body parts and background were created, I decided to continue working in After Effects to create a prototype of my project. I simply used "movie magic" to apply these animations to prerecorded webcam footage of myself. This allowed users to get a general understanding of the storyline that would be displayed. Finally, I used Audition and FreeSounds.com to create the audio. There are two pieces of audio: the subway noises, which play at the beginning to help add context, and the panic attack audio, which imitates the internal noises (rapid heartbeat, heavy breathing, scattered/panicky thoughts) that a user would experience during a panic attack.


Here is a link to the prototype:

 

User Testing with Prototype

I primarily used the prototype for user testing because it allowed me to make changes easily, quickly, and without the sunk cost that fully coding it first would have carried. Users primarily gave me feedback on the general storyline, specifically providing insights on the mini story that unfolds when the user "experiences the panic attack" in the subway. Originally, the mini story thrust users into the situation without giving them time to understand the context and, in turn, the simulation. The user testing feedback therefore helped fix issues with the overall pacing. User testing also provided insights on the wording of the text displayed before and after the "simulation". Specifically, I discovered that the ending text was abrupt and did not provide the closure that a user needed after experiencing such a sensory overload.

 

Creating the final project

After testing with almost 20 users over the course of a week, I finally reached a version of my project that I was content with. Now all I had to do was bring it to life!

I started by working on getting the webcam and body tracking working. Since I knew I would be using large animation files, I opted to code in Brackets rather than the p5.js web editor. For some reason, I ran into a strange number of problems here: my computer was not properly capturing the video feed, and working outside the web editor made it difficult to debug.

Thus, I pivoted back to the web editor. I used facial mapping code instead, mapping the lungs x pixels away from the user's chin. Then I added "filler" animations to create a general structure for my code. I knew that my animations, regardless of file type, would be too large for the web editor. However, since I was having trouble debugging outside of it, I placed GIFs and .mov files that were small enough for the web editor where the real animations would eventually go. In other words, where the subway background would be was a random GIF of the earth. I just wanted to have the bones of my code down before I moved back to the local editor.

While the random earth GIF has since been replaced with the appropriate subway file, here is a link to the first web editor sketch: https://editor.p5js.org/andrikumar/sketches/BJuBq6cy4

During this time I also recorded my own video footage of the subway and substituted it for the stock footage I had been using during user testing.

With the bones created, I then transitioned back to Brackets so that I could drop in the correct files; yet I still faced a lot of hiccups. Essentially, After Effects renders extremely large files that would not even work locally. However, these files needed to maintain their transparency, so they could not be compressed after rendering. After playing around for days with different file types and ways to maintain transparency, I finally figured out what to do. I converted the subway background into 5 PNGs that loop using p5.play, and I turned the pre-text, post-text, and lungs animations into GIFs. While the lungs originally increased in speed gradually, I could only render 2 seconds of the animation to avoid too large a file size, so the user now sees rapid breathing throughout the simulation.
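For reference, the looping PNG background boils down to just a few lines with p5.play. This is only a rough sketch of the approach, assuming the classic p5.play loadAnimation()/animation() API and made-up frame filenames (the real project's files differ):

let subwayAnim;

function preload() {
  // made-up filenames: five numbered PNG frames, looped by p5.play
  subwayAnim = loadAnimation('assets/subway1.png', 'assets/subway5.png');
}

function setup() {
  createCanvas(1280, 720);
  subwayAnim.frameDelay = 8; // slow the loop down a little
}

function draw() {
  background(0);
  // draws the current frame centered at (x, y) and advances the loop
  animation(subwayAnim, width / 2, height / 2);
}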

Once I successfully added the animations to my code, I used different functions and addCue() to trigger the animations based on the audio, as well as to create the interactions.
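To give a sense of how addCue() works (the timestamps and filename below are made up, not the project's real values), p5.sound lets you attach callbacks to moments in the audio:

let panicAudio;

function preload() {
  panicAudio = loadSound('assets/panic-attack.mp3'); // made-up file name
}

function setup() {
  createCanvas(1280, 720);
  // addCue(seconds, callback) fires the callback when playback passes that time
  panicAudio.addCue(5, startHeartbeat);
  panicAudio.addCue(45, showClosingText);
}

function mousePressed() {
  // browsers require a user gesture before audio can start
  if (!panicAudio.isPlaying()) panicAudio.play();
}

function startHeartbeat() {
  // swap in the lungs/heartbeat animation here
}

function showClosingText() {
  // swap in the closing text GIF here
}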

Here is what I ended up with:

https://drive.google.com/open?id=1rZYTTyByN53vB8ByfKPy5V_aUJrzemkv

You can find my code here; you can open it up with a text editor to see the final work! I used Brackets!

Here is my code:


Final Changes for the Show

While presenting the project in class, I realized that facial mapping required an extremely well-lit room; otherwise the code could not "see" the user's chin. At first, I thought of simply switching the code to map from the eyes down, but if something is being mapped onto a user's body, they are very likely to move around, and if the code used the user's eyes, the animations would scatter everywhere. Thus, I needed to use something more stable.

As a result, I converted my code from facial mapping to PoseNet, mapping the animation of the body parts to the midpoint between the user's shoulders. I am terrible at math and struggled to find that mean position, but luckily I was able to in the end!
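For anyone curious, the shoulder-midpoint mapping looks roughly like the sketch below. This is only a simplified sketch assuming the ml5.js PoseNet wrapper, not the project's exact code:

let video;
let shoulderMid = null;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  // assumes the ml5.js library is loaded alongside p5
  const poseNet = ml5.poseNet(video);
  poseNet.on('pose', gotPoses);
}

function gotPoses(poses) {
  if (poses.length > 0) {
    const pose = poses[0].pose;
    // midpoint ("mean" position) between the two shoulders
    shoulderMid = createVector(
      (pose.leftShoulder.x + pose.rightShoulder.x) / 2,
      (pose.leftShoulder.y + pose.rightShoulder.y) / 2
    );
  }
}

function draw() {
  image(video, 0, 0, width, height);
  if (shoulderMid) {
    // the lungs animation would be anchored here, e.g.:
    // animation(lungAnim, shoulderMid.x, shoulderMid.y);
    fill(255, 0, 0);
    ellipse(shoulderMid.x, shoulderMid.y, 20, 20); // placeholder marker
  }
}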

Since I also understood p5.play better by then, I decided to take 15 PNGs of the lung animation and animate them through p5.play rather than using the GIF. I thought users would appreciate the higher-quality animation that p5.play offered. However, after a few rounds of A/B testing the GIF animation against the p5.play animation, I discovered that users preferred the GIF. They thought the low quality created an "abstractness" that allowed them to really be immersed in the story.

 

Conclusion

I am honestly happy that I faced all the issues I did because, as a result, I got the opportunity to explore libraries, like p5.play, which we did not get to use in class. I am quite proud of my work, especially because my freshman year I failed my first coding class and now I have coded this entire project! Of course, this project would not exist without the help my professors and friends provided. It was really rewarding during the show to hear users talk to me after the simulation about how anxiety disorders have affected their lives. A lot of users mentioned that they had a partner who had panic attacks, and while they had learned how to help their partner get through an attack, they never understood what had been going on. This experience gave them a glimpse of what it had been like for their partner and finally helped them understand the situation, something that endless conversations simply could not provide. I really hope to keep developing this project further so that it can serve as an educational tool!

Here is a video of my work during the show:

What I will be working on in the future

After having numerous people try out my project at the show, I was able to get a lot of user feedback! While most of it was positive, many users explained that the conclusion could still use some work. They still felt shocked and were unsure what to do after the simulation. One participant even asked if I had a teddy bear they could hold. I have always struggled with making powerful conclusions and so I think this will be the perfect opportunity to work on this skill.

I also got the opportunity to show my work to a medical student who is going to become a psychiatrist. Ideally, I would love for my project to be used to educate medical professionals about mental illness. The student gave me some insights on how I could add to the project to appeal to medical professionals' needs. For instance, he mentioned that I could have users experience the panic attack on the subway and then "go to the ER and hear from a doctor that it was just a panic attack". Not only would this make for a better story arc, but it would help medical professionals understand the importance of empathizing with patients who have just had a panic attack. I think this was a really powerful insight and I plan on brainstorming around it a bit more!

Tangible Course Search – Proof of Concept

User Flow Diagram

Revised user flow

Physical Interface

I kept the physical interface relatively simple for this prototype, as most of the work has gone into programming the digital interface and connecting the physical to the digital. It's currently just a simple potentiometer connected to the Arduino at analog port A2. The plan is to 3D print a 'top' for the potentiometer (along these lines) to make the component more welcoming to the user.

Server (Physical to Digital)

For the server, I lightly reworked my previous Node + Johnny-Five application to allow for the expandability needed to add a ton of data sources. Now all of the variables are declared ahead of time, and the AJAX response is compacted into a single JSON object that is sent to the client. I tested the potentiometer values to ensure that they were being sent to the client correctly. I've also done some work on the backend of the course search with my team, though that isn't included in this post as it isn't part of the project yet.

//---------------------------------------------
// Variable Declaration
//---------------------------------------------
//Career Dial
    const careerPotentiometerPin = 'A2';
    var careerPotentiometerValue = 0; // 0 - 1023

//---------------------------------------------
// Node Server Setup Code
//---------------------------------------------
// Module Requirements
var express = require('express');
var path = require('path');
var app = express();

// Set public folder for client-side access
app.use(express.static('public'));

// Send index.html at '/'   
app.get('/', function(req, res){
    res.sendFile(path.join(__dirname + '/views/index.html'));
});

//Send AJAX data stream at '/data'
app.get('/data', function(req,res) {
    // Compile individual variables into object
        var dataToSendToClient = {
            'careerPotentiometerValue': careerPotentiometerValue
        };
        
    // Convert javascript object to JSON
        var JSONdata = JSON.stringify(dataToSendToClient);
        
    //Send JSON to client
        res.send(JSONdata);
 });
 
//Set app to port 8080
app.listen(8080);

//Log start of app
console.log("App Started");

//---------------------------------------------
// Johnny-Five Code
//---------------------------------------------
var five = require("johnny-five"),
  board, potentiometer;

board = new five.Board();

board.on("ready", function() {

  // Create a new `potentiometer` hardware instance.
    potentiometer = new five.Sensor({
        pin: careerPotentiometerPin,
        freq: 250
    });

  // The "data" event provides the current reading from the potentiometer
    potentiometer.on("data", function() {
        console.log("Career Potentiometer: " + this.value);
        careerPotentiometerValue = this.value;
    });
});

Digital Interface

As part of the prototype, I have finished coding the first component of the digital interface: the Academic Career selection. It consists of a simple dial that rotates based on the potentiometer values received, selecting the different careers as classified by Albert. All data is meant to be manipulated physically, so there are no mouse targets in the digital interface.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <meta http-equiv="X-UA-Compatible" content="ie=edge">
    <title>Tangible Course Search</title>
    <link rel="stylesheet" href="/css/style.css">
    <script src="/lib/jquery.js"></script>
    <script src="/js/script.js"></script>
</head>
<body>
    <!--<div id="header">Tangible Course Search</div>-->
    <div id="dialBackground"></div>
    <div id="chooseCareer">
        <div id="chooseMed" class="chooseItem">Medical</div>
        <div id="chooseGrad" class="chooseItem">Graduate</div>
        <div id="chooseUndergrad" class="chooseItem">Undergraduate</div>
        <div id="chooseLaw" class="chooseItem">Law</div>
        <div id="chooseDentist" class="chooseItem">Dentistry</div>
    </div>
    <div id="chooseDial" class="dial"><div class="arrow-right"></div></div>
</body>
</html>
//On Document Ready
$(document).ready(function(){

    //Log to check onload
        console.log('JQuery Loaded');

    //Send AJAX call to server every 20ms
        setInterval(function() {
                $.ajax({
                    url : 'http://localhost:8080/data',
                    type : 'GET',
                    dataType:'json',
                    // On Request Success
                        success : function(data) {
                            // Loop through attributes of given JSON Object
                            // to deconstruct object into variables
                                for (var property in data) {
                                    // Set old property value as var to track change
                                        window[property  + 'Old'] = window[property];                             
                                    // Set property name as var equal to property
                                        window[property] = data[property];
                                }              
                        },
                    // On Request Error
                        error : function(request,error) {
                            console.log("Request: "+JSON.stringify(request));
                        }
                });
            }, 20);

    //Change Values On-Screen every 20 ms
        setInterval(function() {
            changeCareerDial(careerPotentiometerValue, careerPotentiometerValueOld);
        }, 20);

});

//------------------------------------------------------------------------
// Function Declaration
//-----------------------------------------------------------------------
// Change Career Dial Elements
    function changeCareerDial(potentiometerValue, potentiometerValueOld){
        //If the potentiometer value has changed
            if(potentiometerValue != potentiometerValueOld){
                // 1| 0 to 205
                    if(potentiometerValue>=0 && potentiometerValue<=205){
                        $('.chooseItem').removeClass('chooseSelected');
                        $('#chooseMed').addClass('chooseSelected');
                        $("#chooseDial").css({'transform' : 'rotate(-55deg)'});
                    }
                // 2| 206 to 411
                    else if(potentiometerValue>=206 && potentiometerValue<=411){
                        $('.chooseItem').removeClass('chooseSelected');
                        $('#chooseGrad').addClass('chooseSelected');
                        $("#chooseDial").css({'transform' : 'rotate(-35deg)'});
                    }
                // 3| 412 to 617
                    else if(potentiometerValue>=412 && potentiometerValue<=617){
                        $('.chooseItem').removeClass('chooseSelected');
                        $('#chooseUndergrad').addClass('chooseSelected');
                        $("#chooseDial").css({'transform' : 'rotate(0deg)'});
                    }
                // 4| 618 to 823
                    else if(potentiometerValue>=618 && potentiometerValue<=823){
                        $('.chooseItem').removeClass('chooseSelected');
                        $('#chooseLaw').addClass('chooseSelected');
                        $("#chooseDial").css({'transform' : 'rotate(35deg)'});
                        
                    }
                // 5| 824 to 1023
                    else if(potentiometerValue>=824 && potentiometerValue<=1023){
                        $('.chooseItem').removeClass('chooseSelected');
                        $('#chooseDentist').addClass('chooseSelected');
                        $("#chooseDial").css({'transform' : 'rotate(55deg)'});
                        
                    }
            }
    }
/*---------------------------------------------*/
/* Animations */
/*---------------------------------------------*/
@keyframes dialBreathe {
    0% { width: 150vh; height: 150vh; top: -25vh}
    50% { width: 155vh; height: 155vh; top: -27.5vh}
    100% { width: 150vh; height: 150vh; top: -25vh}
}

/*---------------------------------------------*/
/* Styles */
/*---------------------------------------------*/
html,body{
    margin: 0;
    padding: 0;
    background-color: #1f1d1d;
    font-family: sans-serif;
    width: 100vw;
    height: 100vh;
    overflow: hidden;
}
    #header{
        color: #999999;
        font-size: 2vw;
        position: absolute;
        top: 1vw;
        left: 1vw;
        font-weight: bold;
    }
    #chooseCareer{
        position: absolute;
        top: calc(50vh - 14vw);
        left: 0;
        width: 10vw;
        height: 30vw;
        display:flex;
        flex-direction:column;
        justify-content:space-around;
        z-index: 3;
    }
        .chooseItem{
            text-align: center;
            color: #999999;
            font-size: 1.7vw;
            cursor: pointer;
            font-weight: bold;
            border-radius: 1vw;
            padding: 0.5vw;
            width: auto;
            transition: all 1s ease-in-out;
        }
            .chooseItem:nth-child(1){
                margin-left: 8vw;
                color: #5e5e5e;
            }
            .chooseItem:nth-child(2){
                margin-left: 14vw;
                color: #7c7c7c;
            }
            .chooseItem:nth-child(3){
                margin-left: 16vw;   
            }
            .chooseItem:nth-child(4){
                margin-left: 14vw;
                color: #7c7c7c;
            }
            .chooseItem:nth-child(5){
                margin-left: 8vw; 
                color: #5e5e5e;
            }
            .chooseSelected{
                text-shadow: -2px 0 #57068C, 0 2px #57068C, 2px 0 #57068C, 0 -2px #57068C; 
            }
        .dial{
            width: 20vw;
            height: 20vw;
            position: absolute;
            top: calc(50vh - 10vw);
            left: -10vw;
            border-radius: 20vw;
            background-color: #0d0d0d;
            border: 1vw solid #1a1a1a;
            z-index: 4;
            transition: all 1s ease-in-out;
        }
            .arrow-right {
                width: 0; 
                height: 0; 
                border-top: 1vw solid transparent;
                border-bottom: 1vw solid transparent;
                border-left: 1vw solid #1a1a1a;
                position: absolute;
                right: 1.4vw;
                top: calc(10vw - 0.7vw);
            }
        #dialBackground{
            position: absolute;
            left: -50vh;
            top: -25vh;
            z-index: 2;
            height: 150vh;
            width: 150vh;
            border-radius: 100vh;
            background-color:#262626;
            animation-name: dialBreathe;
            animation-duration: 6s;
            animation-iteration-count: infinite;
            animation-timing-function: ease-in-out;
        }

The next step for the project is to take the adapter that has been built up and connect it, as a test, to just the academic careers section of the code. Afterwards, there should be more work done on both the physical and the digital interfaces for the project (though I'm not sure how much fabrication I'll be able to do over the break, so it'll probably be mostly digital work with Arduino prototyping).

Prototype

So far, I've gotten the basic code working and am waiting for my joysticks to come in the mail. Instead of having 10 joysticks in a keyboard-style layout, I think I'm going to have 6 positioned in a "video game controller" style (see pictures below). Right now, all the joysticks are in a sponge because I don't exactly know what I want the case to look like or how I'm going to make it, and the sponge was a cheap, easy way to experiment.

Now I need to figure out what type of sounds I want it to make, how I'm going to make the case, and whether I still want to make a display with it. Also, I ran out of analog pins on the Arduino and don't know what to do about that.

Other ideas:

Maybe, instead of having each joystick's volume and speed be directly proportional, I could have the X axis control volume and the Y axis control speed, so each sound could have any combination of volume and speed. I was also thinking of possibly making it a song remixer instead of whatever I was originally trying to go for.
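If I go that route, the p5 side of the mapping could be as simple as the sketch below, which uses the mouse as a stand-in for one joystick's X and Y values (0 to 1023 from the Arduino) and a made-up sound file:

let drumLoop;

function preload() {
  drumLoop = loadSound('assets/drum-loop.mp3'); // made-up sound file
}

function setup() {
  createCanvas(400, 400);
}

function mousePressed() {
  if (!drumLoop.isPlaying()) drumLoop.loop();
}

function draw() {
  background(220);
  // mouseX/mouseY stand in for one joystick's X and Y readings;
  // X controls volume, Y controls playback speed.
  const rawX = map(mouseX, 0, width, 0, 1023);
  const rawY = map(mouseY, 0, height, 0, 1023);
  drumLoop.setVolume(map(rawX, 0, 1023, 0, 1));   // 0 = silent, 1 = full volume
  drumLoop.rate(map(rawY, 0, 1023, 0.5, 2));      // half speed up to double speed
}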

Arduino:

#define PIN_ANALOG_X 0
#define PIN_ANALOG_Y 1
#define PIN_ANALOG_X2 2
#define PIN_ANALOG_Y2 3
#define PIN_ANALOG_X3 4
#define PIN_ANALOG_Y3 5

void setup() {
  Serial.begin(9600);
  // The joystick axes use the internal pull-ups on the analog pins
  pinMode(A0, INPUT_PULLUP);
  pinMode(A1, INPUT_PULLUP);
  pinMode(A2, INPUT_PULLUP);
  pinMode(A3, INPUT_PULLUP);
  pinMode(A4, INPUT_PULLUP);
  pinMode(A5, INPUT_PULLUP);
}

void loop() {
  // Each joystick is sent as the sum of its X and Y readings (0 to 2046).
  // The +3000 and +6000 offsets shift the second and third joysticks into
  // separate value ranges so the p5 sketch can tell the three apart.
  Serial.print(analogRead(PIN_ANALOG_X) + analogRead(PIN_ANALOG_Y));
  Serial.print(",");

  Serial.print((analogRead(PIN_ANALOG_X2) + analogRead(PIN_ANALOG_Y2)) + 3000);
  Serial.print(",");

  Serial.println((analogRead(PIN_ANALOG_X3) + analogRead(PIN_ANALOG_Y3)) + 6000);
  delay(10);
}

p5:

https://editor.p5js.org/ach549@nyu.edu/sketches/rk3t5zRpX

Prototype:

Hold like a video game controller.

Bottom


Top


Block diagram:


 

Current Progress: Prototyping

After my materials finally arrived (over a week later >:( ), I was able to finally build a prototype of my lungs. I decided to use red latex balloons to create a "lung", since I thought the latex material would be the easiest to blow up, offering the least resistance to my fan. I purchased a 12V fan off Amazon, and while many customers claimed it was strong, I knew that it might not be as strong as I needed it to be. Turns out, I was right! It was not strong enough at all.

HA

With the help of Professor David, I learned how to wire my 12V fan to a more powerful power supply. However, even then the fan was not powerful enough to inflate or deflate my prototype.

Thus, I decided to pivot my idea and take this physical concept into AR. Earlier this week, Ellen had asked me if I was incorporating my love for webcams and GIFs into my final project, and I began to wonder why I had not. I started to brainstorm ways my project could translate into a webcam form and thought of a digital mirror in which a user could see their "lungs" and "heart". When the user did a task, they could see how these organs biologically change due to anxiety (heart rate increase, etc.).

I then decided to prototype what that digital mirror could look like. Since I am going home for Thanksgiving, I knew I would have a lot of time on my 6-hour flights (12 hours total) to create the animation that will be displayed.

I wanted to learn exactly what the animation needed to look like. I used p5 to write a simple sketch that mapped on a heart and then a lung. While p5 claims that it can "tint" images to be transparent, the tint option does not work on GIFs. Luckily, there are many online resources that can turn a GIF transparent.
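The sketch itself is only a handful of lines. Here is a rough version of the digital-mirror idea, with a made-up transparent heart image standing in for the real assets:

let cam;
let heartImg;

function preload() {
  // made-up transparent PNG of a heart
  heartImg = loadImage('assets/heart.png');
}

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.size(width, height);
  cam.hide();
}

function draw() {
  // mirror the webcam so it reads like a reflection
  push();
  translate(width, 0);
  scale(-1, 1);
  image(cam, 0, 0, width, height);
  pop();

  // overlay the "organ" roughly over the chest
  image(heartImg, width / 2 - 50, height / 2, 100, 100);
}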

Here is my output (I cropped out my face because it was not very flattering, haha):


(above is a gif, click to play)

 

I learned that the organs need to be connected in some way that creates an implicit storyline, otherwise it feels weird. For instance, the heart just seems strange, and perhaps this is simply because it is mirrored. Nonetheless, steps should be taken to create a more complete look. I also know that the design must be high quality and look as close to a real "organ" as possible. The heart seems to be more impactful than the lungs, which were lower quality and less realistic.

I tested it out with a user and they provided similar feedback. I will create an animation on my flight on Tuesday and will be able to test it with more users over the break!

Here is my block diagram for my new concept:

userflow

Project Progress 1

In this project, I used the chance of studying sound and light in p5 to make some progress on my final project. I used the computer's mic as a basic detector and adjusted the volume of a song played by the computer according to the volume of the noise detected by the mic. This is the volume-adjusting part.

However, I ran into some problems when I was trying to visualize the volume of the music played by the computer. I wanted to show the amplitude as a wave, but it always turned out to be a straight line. I think it might be related to the volume-adjusting part of my code, because when I commented out the volume-adjusting part, the amplitude wave worked.

I am going to figure out how to connect these two parts better.
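For reference, the two parts described above look roughly like the sketch below. This is a simplified version with a made-up song file, not the exact code in the editor links:

let song;
let mic;
let amp;

function preload() {
  song = loadSound('assets/song.mp3'); // made-up file name
}

function setup() {
  createCanvas(600, 200);
  mic = new p5.AudioIn();
  mic.start();
  amp = new p5.Amplitude();
  amp.setInput(song); // measure the song's output, not the mic
}

function mousePressed() {
  if (!song.isPlaying()) song.loop();
}

function draw() {
  background(0);
  // Part 1: louder room noise -> louder song
  const noise = mic.getLevel(); // roughly 0.0 to 1.0
  song.setVolume(constrain(map(noise, 0, 0.3, 0.1, 1), 0, 1));

  // Part 2: visualize the song's amplitude as a rising bar
  const level = amp.getLevel();
  fill(255);
  rect(0, height - level * height * 4, width, level * height * 4);
}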

full screen:   https://editor.p5js.org/Ruojin/full/B19JpYL67

code: https://editor.p5js.org/Ruojin/sketches/B19JpYL67

Prior Art and My progress

Several artworks inspire me to do this project.

The first one is the augmented reality sandbox created by UCLA. This work is really impressive. People play with the sand to build mountains and use their hands to mimic rain. When someone changes the sand, the contour lines displayed on the sand change as well, and the displayed water follows physical principles. It is really fun to play with a real physical object and see the effects. It is very relevant to my project, because I may use the same kind of sensor that they used in the sandbox.

 


The second one is an installation by Urbane Screen called "320 Light". This part of the installation shows a sense of flow through small lines. It inspires me to show the magnetic field with magnetic induction lines. I hope I can achieve a similarly cool effect.


The third one is a photography work of the magnetic field that I found on the internet. It looks great. It elevates the physics experiment to art.

 

Progress:

I have begun to study The Nature of Code. It is quite hard.

I have found the sensor that I am going to use:

https://www.adafruit.com/product/3317

 

 

 

 

Food Factory Storylines

Doughnut man:

A stereotypical old man who is super bitter about everything and is very mad at humans picking up his friends, because it reminds him of his own impending doom.

Raechel (Pizza):

An annoying teenager who wants to do nothing but talk about her boyfriends and doesn't care if you take her boyfriends away; she is always too preoccupied with her current boyfriend. That is, until she leaves them and starts wishing for her old boyfriends back, but by then it's too late because they've already been eaten. Now she has to deal with the harsh reality of life and loneliness.

Dumpling man:

A fun, optimistic, and very passive person. Even when you take his family away, he doesn't seem to mind that much. But the more you take, the sadder he becomes, though he won't show it because he's very passive and prefers to see you happy rather than himself sad or angry.

References

 

 

Bodega Progress

I decided to make the structure of my bodega. I also decided that the other screens will be projected onto its surface instead of shown on the computer screen. I used p5.js to tap into my webcam. This way, I can code in p5.js to connect my sensors in order to make the experience interactive and responsive to the poem that will play in the background.


Finalizing the poem and recording it so that it can play in the background is still in progress. I plan to finish it by Wednesday and update this post.

Prototypes + References

So, I really wanted to get a paper prototype together and the 6DOF sensor working this weekend, but I think I broke the thing before I could even get it to work? Or it came dead.

RIP
Connections: VCC to 5V, GND to GND, SCL to A5, SDA to A4, AD0 to GND, INT to D2

I’m waiting on another one to get here from Amazon because MPU6050s  cost $5 online and $30 at Tinkersphere, and uh…no. Fingers crossed for this next one.

I put the paper prototypes together, though, so they’ll be ready when I finally get the sensor to work.

Outer part:


I cut out long strips of paper to make each unit, then assembled them together into the sphere (see below).

labor 

Core:

The goal is to suspend the core inside the sphere, but I think that it might be too flimsy? I tried threading string through the core and tying it to points on the sphere, but it didn't really hold, and it distorted the shape of the sphere. I'm thinking I'll learn how to use the laser cutter so I can make the sphere out of cardboard or something more substantial instead, or I'll create a wire frame inside that will prevent it from caving in on itself. For now, this is what I have:


References/Inspiration:

DodecaLEDron

https://managore.itch.io/planetarium (planet explorer/customizer)

https://pangenerator.com/projects/dodecaudion/ (another dodecahedron alternative controller)

https://www.hackster.io/Aritro/getting-started-with-imu-6-dof-motion-sensor-96e066

https://forum.arduino.cc/index.php?topic=452392.15 (mpu6050 stuff)

Final Project Progress

This week I started to draw out the final food for our animation, so we can start animating it.

dumpling

I also started to make the pizza and donut foods out of clay, but they are not done yet. We decided to make four slices of pizza and four donuts to make our lives easier.

Here are a few pictures that I am using as reference.

Image result for life sized donuts out of clay

Image result for pizza out of clay

Project Progress

This week I made the drawing part of the project. Each of the different fan layers has been separated in the document. I will try to upload the layers to p5; if I can't, I will use Adobe After Effects to make it an animation piece. I plan to buy the sensor next week and use the pressure sensor as a variable that changes how fast the fans rotate.
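As a rough sketch of how the fan layers could be driven in p5 (with mouseX standing in for the pressure sensor reading and made-up layer filenames), the rotation speed can simply be mapped from the sensor value:

let sceneImg;
let fanImg;
let angle = 0;

function preload() {
  // made-up exported layers from the drawing
  sceneImg = loadImage('assets/scene.png');
  fanImg = loadImage('assets/fan-blades.png');
}

function setup() {
  createCanvas(800, 600);
  imageMode(CENTER);
}

function draw() {
  image(sceneImg, width / 2, height / 2, width, height);

  // mouseX stands in for the pressure sensor value (0 to 1023 over serial);
  // more pressure -> faster rotation
  const speed = map(mouseX, 0, width, 0, 0.3);
  angle += speed;

  push();
  translate(width / 2, height / 2); // pivot point of the fan blades
  rotate(angle);
  image(fanImg, 0, 0);
  pop();
}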

here is the scene


Building Fake Dumpling for Final Project

This week our group decided to design the storylines for our final project and make some food samples for the installations.

We first thought about using 3D printing for the food samples, but it turned out to be too expensive for our project, so we decided to sculpt our food out of clay.

I used Super Sculpey clay to make a fake dumpling. 

super sculpey living doll

Of course, I didn’t use real stuffing for this dumpling. I actually wrapped a piece of tissue in this clay dumpling. The shape and the texture look pretty realistic. 

fake dumpling

fake dumpling

fake dumpling gif

 

Similar Project + My Own

Similar Projects

It was a bit difficult to find school appropriate examples of similar works, since many of them are focused on creating body parts for specific kinds of robots.

But luckily, and strangely enough, a lot of people have tried to make fake lungs.

Here is someone trying to make one with an Arduino:

https://www.researchgate.net/figure/The-set-of-artificial-organs-and-body-parts-present-in-the-manikin-From-top-to-bottom_fig14_313687401

Here is someone selling an entire fake lung kit:

https://www.boundtree.com/Training-%26-Simulation/Anatomical-Models/BioQuest-Inflatable-Lung-Kit/p/13277

I also looked at artists to see how they have represented anxiety.

This one was very focused on the physical aspects of anxiety, which I think aligns with my focus as well:

http://portfolios.risd.edu/gallery/69751543/Anxiety-Installation

However, I noticed that most artists tend not to depict anxiety with literal human-related images. It makes me a bit nervous, because I wonder if there is a reason so many people choose to avoid representing anxiety in a more literal sense.

Current Progress

Unfortunately, I am running a bit behind on my project, but I found DC fans which I think I can use to inflate the lungs. Also, as I noted above, there are a surprising number of tutorials and blog posts on creating lungs with Arduinos. I have also ordered red latex balloons to build the lungs. These should arrive Monday night, and I will build a mini prototype of the lungs to test the DC fans with when they finally arrive.

DC fans:

https://www.amazon.com/gp/cart/view.html/ref=nav_crt_ewc_hd

More Help:

https://forum.arduino.cc/index.php?topic=61940.0

Drums and Progress

I wanted this post to go towards both prompts, not because I'm lazy, but because I feel like I used this week's project to help me make progress on my final project.

Sound and Light:

FULL SCREEN: https://editor.p5js.org/totallyhypnosquid/full/B1zhm_xam

EDITOR:
https://editor.p5js.org/totallyhypnosquid/sketches/B1zhm_xam

Obviously the picture was just used for show and to add to this cute little project, but essentially, once you click on it, a song starts playing, and every time you click, a little drum sound plays, letting you kind of "jam out" with the music. Of course it is not too in-depth, but for my final project I will include much more depth.

Progress:

This ties into my final project: I now know how to make my input create sound in p5 and how to make sound play in p5, so I will use this as a base to have the user's physical input trigger sound in p5. This is a very basic interaction, but the goal is eventually to make these inputs affect the rhythm game on the screen and go with music that helps people understand rhythm.

Bongos

I have purchased two containers that I will be customizing to work with an Arduino, to create a controller that allows for two inputs when the user hits the top. I will be trying a variety of sensors to determine which creates the best effect for what I am going for. I am looking forward to making more progress!

Many

For this week, I decided to do something I could also use in my final project: a joystick that changes sounds when the user pushes it in different directions. Originally, I wanted the sound to come from an Arduino buzzer, but for some unknown reason it didn't want to work. So, I decided to connect it to p5 and have the sound play from my laptop. After some trial and error, it finally worked!

Click the link below to see!

Arduino:

#define PIN_ANALOG_X 0

#define PIN_ANALOG_Y 1

void setup() {
  Serial.begin(9600);
  pinMode(A0, INPUT_PULLUP);
  pinMode(A1, INPUT_PULLUP);
}

void loop() {
  // Print a direction code (0 to 3) depending on which way the stick is pushed
  if (analogRead(PIN_ANALOG_X) < 30 && analogRead(PIN_ANALOG_Y) < 30) {
    Serial.println(0);
  } else if (analogRead(PIN_ANALOG_X) < 30 && analogRead(PIN_ANALOG_Y) > 93) {
    Serial.println(1);
  } else if (analogRead(PIN_ANALOG_Y) < 30 && analogRead(PIN_ANALOG_X) > 93) {
    Serial.println(2);
  } else if (analogRead(PIN_ANALOG_Y) > 93 && analogRead(PIN_ANALOG_X) > 93) {
    Serial.println(3);
  }
  delay(200);
}


P5:

https://editor.p5js.org/ach549@nyu.edu/sketches/BJe9woL2m
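Roughly, the p5 side just reads the 0 to 3 codes printed by the Arduino and plays a different sound for each direction. Here is a minimal sketch of that idea, assuming the p5.serialport library (with its serial control app running) and made-up sound files, not the exact code in the editor link:

let serial;
let sounds = [];

function preload() {
  // one made-up sound per joystick direction (0 to 3)
  for (let i = 0; i < 4; i++) {
    sounds[i] = loadSound('assets/tone' + i + '.mp3');
  }
}

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem1411'); // replace with your port name
  serial.on('data', serialEvent);
}

function serialEvent() {
  const line = serial.readLine().trim();
  if (line.length === 0) return;
  const dir = int(line); // the Arduino prints 0, 1, 2, or 3
  if (dir >= 0 && dir < sounds.length && !sounds[dir].isPlaying()) {
    sounds[dir].play();
  }
}

function draw() {
  background(220);
}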