Category Archives: Sound and Light

Sound and Light

For this project, I went back and added audio-amplitude-based width calculations to my original concentric circle “Visual Elements” project. While this was just a fun little side project to get more familiar with using mic input, I want to do a similar volume-driven width modification, in real time during the show, for the side wall projections in the drama innovation lab, so this was a good way to start approaching that task. In the future I want to figure out how to make the concept look more fluid, with some sort of transition time between sizes.

Concentric Circles Responding to Audio Volume

var mic;
var ellipseWidth = 1;
var ellipseRadius = 1;

function setup() {
  createCanvas(windowWidth, windowHeight);

  // Create an audio input
  mic = new p5.AudioIn();

  // Start the audio input.
  // By default, it does not .connect() (to the computer speakers).
  mic.start();
}

function draw() {
  // Scale the mic level into a usable multiplier
  var vol = (mic.getLevel() * 1000) + 1;
  console.log(vol); // log the level while tuning

  background(30);

  // Start at 1 to avoid taking a modulo by zero
  for (var i = 1; i < 100; i++) {
    // Draw outer ellipses
    fill(30);
    stroke(240);
    ellipseRadius = (ellipseWidth % i) * 4 * vol;
    ellipse(windowWidth / 2, windowHeight / 2, ellipseRadius, ellipseRadius);

    // Draw inner ellipses
    stroke(255, 0, 0);
    ellipseRadius = (ellipseWidth % i) * 3 * vol;
    ellipse(windowWidth / 2, windowHeight / 2, ellipseRadius, ellipseRadius);
  }

  ellipseWidth++;

  if (ellipseWidth > 920) {
    ellipseWidth = 1;
  }
}

Editable: https://editor.p5js.org/ethanprintz/sketches/SJ2mTiYc7

Full Screen: https://editor.p5js.org/ethanprintz/full/SJ2mTiYc7
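One way to get the smoother transitions mentioned above would be to low-pass the raw mic level with lerp() before using it. A minimal sketch of that idea (the 0.1 smoothing factor is just a starting point to tune by ear):

var mic;
var smoothedVol = 1;

function setup() {
  createCanvas(windowWidth, windowHeight);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  background(30);

  // raw, jumpy reading from the mic
  var targetVol = (mic.getLevel() * 1000) + 1;

  // ease toward the new value instead of jumping straight to it;
  // 0.1 sets how quickly the circle catches up
  smoothedVol = lerp(smoothedVol, targetVol, 0.1);

  noFill();
  stroke(240);
  ellipse(width / 2, height / 2, smoothedVol, smoothedVol);
}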

Sound and Light

In this project, I capture the volume of the music with the amplitude analyzer in the p5.sound library and use it to draw the circulating shape in the middle of the canvas. In addition, I use peak detection to capture the peaks of the music.

full-screen: https://editor.p5js.org/Ruojin/full/BkGIC2IpX

code: https://editor.p5js.org/Ruojin/sketches/BkGIC2IpX
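Since only the links are posted here, a rough sketch of how amplitude and peak detection fit together in p5.sound might look like the following (this is not the author's exact code, and the file name is a placeholder):

let song, amplitude, fft, peakDetect;

function preload() {
  song = loadSound('music.mp3'); // placeholder file name
}

function setup() {
  createCanvas(400, 400);
  amplitude = new p5.Amplitude();   // tracks overall volume (0 to 1)
  fft = new p5.FFT();
  peakDetect = new p5.PeakDetect(); // fires when the energy spikes
  song.loop();
}

function draw() {
  background(0);

  // scale the circle with the current volume
  let level = amplitude.getLevel();
  let d = map(level, 0, 1, 20, width);

  // peak detection works off an FFT analysis
  fft.analyze();
  peakDetect.update(fft);

  if (peakDetect.isDetected) {
    fill(255, 0, 0); // flash red on a detected peak
  } else {
    fill(255);
  }
  ellipse(width / 2, height / 2, d, d);
}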

Congratulations

I decided to make something with a sound output for this project.

Ever since we started working with p5, the creation of shapes and movement reminded me of those pop ups that I always used to get on my home computer that said: “Congratulations, you are the 999,999th visitor! Claim your prize!”

I always wanted to know what would happen if you actually clicked those. It was probably a virus for your computer, so I never did. But since I’ve always wanted to click the button and can never truly know what would happen, I created my own reimagining of it.

screenshot

here is my code:

var video; // holds the sound file (keeping the original variable name)
var button;

function preload() {
  video = loadSound("iykyk.mp3");
}

function setup() {
  createCanvas(400, 400);
  button = createButton("Click here to claim");
  button.mousePressed(plays);
}

function draw() {
  textSize(10);
  background(255);

  // Pop-up title bar
  fill(4, 0, 122);
  stroke(220);
  rect(100, 150, 200, 20);

  // Pop-up body
  fill(220);
  rect(100, 170, 200, 150);

  // Minimize / maximize / close buttons
  rect(255, 155, 10, 10);
  rect(270, 155, 10, 10);
  rect(285, 155, 10, 10);

  fill(0);
  rect(257, 163, 7, 2.25);

  stroke(0);
  fill(220);
  rect(272, 157, 7, 7);

  fill(0);
  text("x", 287.5, 164);

  fill(255);
  textSize(15);
  text("Congratulations!", 105, 165);

  // Red close circle inside the pop-up
  fill(255, 0, 0);
  noStroke();
  ellipse(140, 200, 20, 20);

  fill(255);
  textSize(15);
  text("x", 136, 204);

  textSize(17);
  fill(0);
  text("You are the", 155, 205);
  text("999,999th visitor:", 136, 230);
  text("Congratulations", 140, 253);
  text("you WON!", 159, 275);
}

function plays() {
  // Toggle the sound on each button press
  if (video.isPlaying()) {
    video.stop();
  } else {
    video.play();
  }
}

 

And finally, here is the finished piece:

https://editor.p5js.org/aramakrishnan/full/rJW712UaX  

 

Dancing Little Man

I wanted to create a little guy who can dance to the music I insert, so I decided to use sound waves to simulate dancing arms. The sound waves are generated from the music file.

I tried to make a sound wave first. It didn’t look like a typical sound wave because I added a color fill to the wave, and the stroke of the wave is white, the same as the background color.

code: https://editor.p5js.org/Yulin/sketches/SJE6iw8pQ
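Since the sketch is only linked, a stripped-down version of the waveform-as-shape idea might look like this (the file name is a placeholder); p5.FFT's waveform() returns an array of samples between -1 and 1 that can be drawn as a filled shape:

let song, fft;

function preload() {
  song = loadSound('dance.mp3'); // placeholder file name
}

function setup() {
  createCanvas(600, 400);
  fft = new p5.FFT();
  song.loop();
}

function draw() {
  background(255);

  // waveform() gives an array of samples between -1 and 1
  let wave = fft.waveform();

  stroke(255);          // white stroke on a white background, as in the post
  fill(255, 150, 150);
  beginShape();
  for (let i = 0; i < wave.length; i++) {
    let x = map(i, 0, wave.length, 0, width);
    let y = map(wave[i], -1, 1, height * 0.25, height * 0.75);
    vertex(x, y);
  }
  endShape();
}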

Then I managed to add two sound waves as the little man’s arms. At this point I thought the motion in the piece might be a bit simple, so I added dancing eyebrows to the little man’s face. What’s different is that the eyebrows are controlled by the microphone input rather than the inserted music file, so the brows move not only when the computer plays music through the speaker, but also when there is sound outside the computer.

code: https://editor.p5js.org/Yulin/sketches/BJPpdF8Tm

Light filter thingy

I tried to make a cool filter trick from this website:

https://creative-coding.decontextualize.com/video/

Instead, it ended up not working or just crashing.


 

So I made this instead.

https://editor.p5js.org/jamesb/full/By_ECI8aX

https://editor.p5js.org/jamesb/sketches/By_ECI8aX

With the help of this website.

https://codeburst.io/instagram-filters-with-javascript-p5-js-83f28c9f7fda
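For anyone retracing the attempt, the basic loadPixels()/updatePixels() pattern those tutorials build on looks roughly like this on a webcam feed (a sketch of the general idea, not the exact filter from either link):

let capture;

function setup() {
  createCanvas(320, 240);
  pixelDensity(1); // keep the canvas pixel array the same size as the capture
  capture = createCapture(VIDEO);
  capture.size(320, 240);
  capture.hide();
}

function draw() {
  capture.loadPixels();
  loadPixels();
  // copy the webcam pixels onto the canvas, inverting each channel
  for (let i = 0; i < capture.pixels.length; i += 4) {
    pixels[i]     = 255 - capture.pixels[i];     // red
    pixels[i + 1] = 255 - capture.pixels[i + 1]; // green
    pixels[i + 2] = 255 - capture.pixels[i + 2]; // blue
    pixels[i + 3] = 255;                         // alpha
  }
  updatePixels();
}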

mirror & recorder

For presentation practice, I made a mirror with a recorder: press it once to start recording, a second time to stop, and a third time to download.

 

fullscreen:

https://editor.p5js.org/jeremycricchus@gmail.com/full/rkTUrOMTX

code:

https://editor.p5js.org/jeremycricchus@gmail.com/sketches/rkTUrOMTX
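Since only the links are posted, here is a hedged sketch of how that press-to-record flow could be wired up. It leans on the browser's MediaRecorder API rather than anything p5-specific, and the output file name is arbitrary:

let capture;
let recorder;
let chunks = [];
let presses = 0;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide();
}

function draw() {
  // mirror: flip horizontally so it behaves like a real mirror
  push();
  translate(width, 0);
  scale(-1, 1);
  image(capture, 0, 0, width, height);
  pop();
}

function mousePressed() {
  presses++;
  if (presses === 1) {
    // first press: start recording the canvas
    let stream = document.querySelector('canvas').captureStream(30);
    recorder = new MediaRecorder(stream);
    recorder.ondataavailable = (e) => chunks.push(e.data);
    recorder.start();
  } else if (presses === 2) {
    // second press: stop recording (data arrives via ondataavailable)
    recorder.stop();
  } else if (presses === 3) {
    // third press: download the clip
    let blob = new Blob(chunks, { type: 'video/webm' });
    let a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = 'mirror.webm'; // arbitrary file name
    a.click();
    presses = 0;
    chunks = [];
  }
}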

Photo Booth

What I decided to do for my project this week was play with the live webcam that was introduced in the last class. What I did was really simple: following a Coding Train tutorial, I created a button that allows ‘snapshots’, or paused images, of the live cam.

Here’s the code:

let capture;
let button;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(320, 240);
  capture.hide();
  button = createButton("PAUSE");
  button.mousePressed(takesnap);
}

function takesnap() {
  // Draw the current webcam frame onto the canvas and apply a black-and-white threshold
  image(capture, 0, 0, width, height);
  filter(THRESHOLD, 0.3);
}

Things I wanted to try but for some reason didn’t work:

  • try to add another video on the same canvas (see the sketch after this list)
  • try to make the live cam have another effect that distorts the image or plays with the pixels
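For the first idea, a hedged sketch of two videos on one canvas, using createCapture for the webcam and createVideo for a file (the file name is a placeholder, and the clip is muted so browsers allow it to autoplay):

let cam;
let clip;

function setup() {
  createCanvas(640, 240);
  cam = createCapture(VIDEO);
  cam.size(320, 240);
  cam.hide();
  clip = createVideo('clip.mp4'); // placeholder file name
  clip.size(320, 240);
  clip.hide();
  clip.volume(0); // mute so autoplay isn't blocked
  clip.loop();
}

function draw() {
  image(cam, 0, 0, 320, 240);    // live webcam on the left
  image(clip, 320, 0, 320, 240); // video file on the right
}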

I’ll keep trying this stuff out as I watch more tutorials! Look out for an updated version!!

full screen link: https://editor.p5js.org/samasrinivas/full/B1cVgXLTQ

@attributions

@The Coding Train! I followed the tutorial to add a button to the live video to create the photobooth effect!

@the p5.js library!

Remix!

I wanted to manipulate two different instrumentals in order to create a new one. This remix tool manipulates the volume and the rate at which each song plays so that a new sound is created.

fullscreen:

https://editor.p5js.org/yunglizard/full/ByVF3z86X

code:

https://editor.p5js.org/yunglizard/sketches/ByVF3z86X

*Sometimes the initial song overlaps itself and I’m not sure why*
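For context, the volume and rate manipulation in p5.sound comes down to setVolume() and rate(). A minimal two-track sketch might look like the following (file names are placeholders); guarding play() behind isPlaying(), as in mousePressed() here, is also one way the overlapping issue mentioned above can be avoided:

let trackA, trackB;

function preload() {
  // placeholder file names
  trackA = loadSound('beatA.mp3');
  trackB = loadSound('beatB.mp3');
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(0);
  if (trackA.isPlaying()) {
    // mouseX controls the balance between the two tracks,
    // mouseY controls how fast the second one plays
    trackA.setVolume(map(mouseX, 0, width, 0, 1));
    trackB.setVolume(map(mouseX, 0, width, 1, 0));
    trackB.rate(map(mouseY, 0, height, 0.5, 2));
  }
}

function mousePressed() {
  // only start the loops once, so the songs don't stack on top of themselves
  if (!trackA.isPlaying()) {
    trackA.loop();
    trackB.loop();
  }
}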

Big shout out to this tutorial video for helping me out:

 

Face Filters

Context

I became really intrigued by p5’s ability to track and map a face, so I decided to explore it a bit more for this assignment.

I started by first trying to understand the limitations of the facial map as well as a Mac’s webcam. Using the code that we made in lab, I swapped in a gif where we had originally created an ellipse. The ellipse was placed on a user’s nose (point 62). However, what I discovered was that gifs or images, even if they are transparent PNGs, have an invisible square/rectangular border around them. When placed on a spot on a user’s face in p5, the image is positioned based on the top-left corner of that invisible border, not its center.

https://editor.p5js.org/andrikumar/sketches/Hkng-e-TX

https://editor.p5js.org/andrikumar/full/Hkng-e-TX

As a result, when I placed the image onto the nose, the top-left corner of the image sat on the nose. I played with different face points until I was happy with the location (I settled on point 19, the tip of the user’s left eyebrow). However, I realized that when a user gets really close to the camera, the image/gif does not resize, and it becomes obvious that the image is mapped onto the eyebrow rather than the user’s face as a whole.
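For what it’s worth, the corner anchoring described above is p5’s default imageMode(CORNER); switching to imageMode(CENTER) puts the middle of the image on the point instead. A tiny hedged illustration, where the mouse position and a placeholder image stand in for the tracked face point and the real filter graphic:

let filterImg; // placeholder image

function preload() {
  filterImg = loadImage('filter.png'); // placeholder file name
}

function setup() {
  createCanvas(640, 480);
  // center the image on whatever point it is drawn at,
  // instead of hanging it off the top-left corner
  imageMode(CENTER);
}

function draw() {
  background(220);
  // mouseX/mouseY stand in here for a tracked face point
  image(filterImg, mouseX, mouseY, 100, 100);
}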

I tried to look up information about resizing, but unfortunately I could not find anything, so I went to Ellen’s office hours on Friday morning. I spent the time leading up to office hours conceptualizing my idea.

My Idea:

I have always been really inspired by Shel Silverstein’s poetic drawing “The Thinker of Tender Thoughts” (see below). Silverstein as a whole was a large part of my elementary education: all my teachers would read his poems to us during our reading breaks, and the students would be fascinated by the magical stories he wove.

[image: “The Thinker of Tender Thoughts”]

I wanted to amplify that experience a bit more and help elementary kids really engage with Silverstein’s works. His poems have such powerful messages and it would be amazing if these messages stuck with children for the rest of their lives. I decided that I could create a “face filter” to help children really see themselves as part of the poems and hopefully, apply the lessons the poems share in their own lives. Since “The Thinker of Tender Thoughts” has such a special place in my heart, I decided to make my face filter based off that piece.

Finalizing The Code! Thank You Ellen

In office hours, I learned that it is unfortunately difficult to get images to resize along with the face using a Mac’s webcam, since it cannot pick up depth well.

While that was frustrating, Ellen helped me learn how to code a few more interactions into my assignment so that I could really engage children who would use it.

-I wanted kids to be able to click the filter and have something added to the experience, and she taught me how to make the “filter” a button

-She also helped me understand how the facial mapping worked, specifically the coordinate system it is based on, which helped me place the “filter” better

End Result

full screen: https://editor.p5js.org/andrikumar/full/SJlVvINTm

code: https://editor.p5js.org/andrikumar/sketches/SJlVvINTm

 

Spring


https://editor.p5js.org/sspeng/full/rk4qG4S67

Spring, but gothic? I guess? Have I gotten more emo with my posts? Yes.

Fractal trees are dangerous, and this thing almost crashed my browser at least six times.

Code:

let song;
let angle;
let coef;
let l;
let branches;
let steps;

function preload() {
  song = loadSound('assets/spring.mp3');
}

function setup() {
  createCanvas(400, 400);
  // leftover from an object-based version; the Branch class is not defined here
  // let a = createVector(width / 2, height);
  // let b = createVector(width / 2, height - 100);
  // let root = new Branch(a, b);
  song.loop();
}

function draw() {
  // Mouse height drives both the sound and the shape of the tree
  let vol = map(mouseY, 0, height, 1, 0);
  let fr = map(mouseY, 0, height, 1, -1);
  song.setVolume(vol);
  song.rate(fr);

  background(50);
  l = map(mouseY, 0, width, 200, 50);
  branches = map(mouseY, 0, height, 3, 1);
  angle = map(mouseY, 0, height, 1.5, 0);
  coef = map(mouseY, 0, height, 0.5, 0);
  steps = map(mouseY, 0, height, 7, 0);

  stroke(200);
  translate(width / 2, height);
  branch(l, steps);
}

// Recursively draw a branch of length len, recursing s more levels;
// keeping steps and branches small is what keeps the browser alive
function branch(len, s) {
  line(0, 0, 0, -len);
  translate(0, -len);

  if (s > 0) {
    let bcoef = angle / branches;
    for (let i = 1; i <= branches; i++) {
      push();
      rotate(i * bcoef);
      branch(len * coef, s - 1);
      pop();
      push();
      rotate(-i * bcoef);
      branch(len * coef, s - 1);
      pop();
    }
  }
}

 

This week’s obligatory coding challenge that saved my ass:

And this other one:

Many faces

Using arrays, mouse interaction, sound, and object-oriented programming.


 fullscreen

edit

The version with sound is in my Brackets project. This is the version without sound, because I don’t know how to import sound files into the web editor.

// visual
let frame;
let _a = 0;
let faces = [];
// sound
let song;

function preload() {
  song = loadSound("My Song.mp3");
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  // Fill the canvas with a grid of 55x89 faces
  for (let j = 0; j < height / 89; j++) {
    faces[j] = [];
    for (let i = 0; i < width / 55; i++) {
      faces[j][i] = new Face(i * 55, j * 89, sin(j) * i * 2, cos(i));
    }
  }
  song.play();
}

function draw() {
  background(100);

  for (let j = 0; j < faces.length; j++) {
    for (let i = 0; i < faces[j].length; i++) {
      faces[j][i].show();
      faces[j][i].change();
      faces[j][i].hover();
    }
  }
}

class Face {

  constructor(x, y, a, b) {
    this.x = x;
    this.y = y;
    this.a = a;
    this.b = b;
  }

  // White background tile
  show() {
    fill(255);
    rect(this.x, this.y, 55, 89);
  }

  // Sweep a black bar back and forth across the tile
  change() {
    if (this.a < 55) {
      fill(0);
      noStroke();
      rect(this.x, this.y, this.a, 89);
      this.a = this.a + 0.3;
      this.b = 55;
    } else {
      fill(0);
      noStroke();
      rect(this.x, this.y, this.b, 89);
      this.b = this.b - 0.3;
      if (this.b < 0) {
        this.a = 0;
      }
    }
  }

  // Let the mouse push the bars around when it is near a tile
  hover() {
    let d = dist(mouseX, mouseY, this.x + 22.5, this.y + 44.5);
    if (d < 52) {
      this.a = mouseX - this.x + 30;
      this.b = mouseY - this.y + 30;
    }
  }
}

Drums and Progress

I wanted to make this post count toward both prompts, not because I’m lazy, but because I used this week’s project to help me make progress on my final project.

Sound and Light:

FULL SCREEN: https://editor.p5js.org/totallyhypnosquid/full/B1zhm_xam

EDITOR:
https://editor.p5js.org/totallyhypnosquid/sketches/B1zhm_xam

Obviously the picture was just used for show and to add to this cute little project, but essentially, once you click on it, a song starts playing, and every time you click, a little drum sound plays, letting you kind of “jam out” with the music. Of course it is not very in-depth, but my final project will include much more depth.
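In code, that interaction boils down to two loadSound() calls and a mousePressed() handler. A hedged sketch with placeholder file names:

let song, drum;
let started = false;

function preload() {
  // placeholder file names
  song = loadSound('backing-track.mp3');
  drum = loadSound('drum-hit.mp3');
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);
  fill(0);
  text('click to jam', 150, height / 2);
}

function mousePressed() {
  if (!started) {
    // first click starts the backing track
    song.loop();
    started = true;
  }
  // every click (including the first) triggers a drum hit
  drum.play();
}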

Progress:

So this ties into my final project: I now know how to make sound play in p5 and how to trigger it from an input, so I will use this as a base to let the user’s input create sound in p5. This is a very basic interaction, but the goal is eventually to have these inputs affect the rhythm game on the screen and go along with music that helps people understand rhythm.

Bongos

I have purchased two containers that I will be customizing to work with an Arduino, to create a controller that allows for two inputs when the user hits the top. I will be trying a variety of sensors to determine which creates the best effect for what I am going for. I am looking forward to making more progress!

Many

For this week, I decided to do something I could also use in my final project: a joystick that changes sounds when the user pushes it in different directions. Originally, the sound was supposed to come from an Arduino buzzer, but for some unknown reason it didn’t want to work. So I decided to connect it to p5 and have the sound play from my laptop. After some trial and error, it finally worked!

Click the link below to see!

Arduino:

#define PIN_ANALOG_X 0
#define PIN_ANALOG_Y 1

void setup() {
  Serial.begin(9600);
  pinMode(A0, INPUT_PULLUP);
  pinMode(A1, INPUT_PULLUP);
}

void loop() {
  int x = analogRead(PIN_ANALOG_X);
  int y = analogRead(PIN_ANALOG_Y);

  // Print a different number for each joystick direction
  if (x < 30 && y < 30) {
    Serial.println(0);
  } else if (x < 30 && y > 93) {
    Serial.println(1);
  } else if (y < 30 && x > 93) {
    Serial.println(2);
  } else if (y > 93 && x > 93) {
    Serial.println(3);
  }

  delay(200);
}


P5:

https://editor.p5js.org/ach549@nyu.edu/sketches/BJe9woL2m
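For reference, the p5 side of this kind of setup is often handled with the p5.serialport library (which needs the p5.serialcontrol app running alongside the browser). A hedged sketch that assumes the Arduino prints 0 to 3 as above, with a placeholder port name and placeholder sound files:

let serial;
let sounds = [];

function preload() {
  // placeholder file names, one per joystick direction
  sounds[0] = loadSound('up.mp3');
  sounds[1] = loadSound('down.mp3');
  sounds[2] = loadSound('left.mp3');
  sounds[3] = loadSound('right.mp3');
}

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem14101'); // placeholder port name
  serial.on('data', gotData);
}

function gotData() {
  let line = serial.readLine().trim();
  if (line.length > 0) {
    let index = int(line);
    // play the sound that matches the direction the Arduino reported
    if (sounds[index] && !sounds[index].isPlaying()) {
      sounds[index].play();
    }
  }
}

function draw() {
  background(0);
}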