Here is what has been done so far for my PComp project (recently re-titled “Look At Me Now”).
First, there was a green screen shoot.
The talent performed great physical feats:
(Worry not, the crutches in the shot are mine, not his!)
The footage was collected and edited*:
And now two sets of code have been written. One is in Processing:
And the other is in Max**:
Now, I just need to connect parameters coming in from a Kinect to one of the two sets of code above so that the videos are triggered by motion. Then theoretically I should be able to hook up my computer to a projector and have (gasp) a working project!
You know that squirmy sensation you get when you know that someone is watching you? You lift your eyes to meet theirs, and suddenly they turn away; or else you meet their unflinching stare and feel your own head turning as you back away from the eye contact.
There’s so much power in eye contact and in staring; the very ability to gaze at someone says something about power itself. I want to explore this idea of looking and gazing and staring and glaring in my Phys Comp final (albeit in a less straightforward and more fun sort of way).
Here’s my plan: a projector on the wall will show silhouettes or shadows of people walking across it, but only as long as other people are walking in front of the wall. When an observer stops and tries to look at the shadow-silhouette people, the figures will also stop and turn to look back at that person. In this way, they function as art pieces that are instantly self-aware. After all, art is meant to be gazed at. What happens when the artwork revolts against its function?
I’m going to need some motion detectors that can tell whether people are walking or standing still AND, if they are standing, can accurately locate them. I’ve also got to figure out how to feed all of the right data into Processing so that it can trigger the right behavior. Plus, I need to determine what format the shadow-silhouette people are in (perhaps movies that loop?).
In short, there’s still a bit more thinking and a lot more work to be done.
(I also have a hand-drawn sketch to go with this post that I will upload soon.)
Bruna, Annelie, and I wanted to create something fun and engaging. We eventually settled on a hula hoop, with an accelerometer attached to it, that would hook up to a Processing sketch. The whole theme was the circus: once you started hula hooping, circus music would start to play and the Processing sketch would show a magical image of sparks that followed the movement of the hoop. Supposedly.
Halfway through the circuitry and programming, Bruna, Annelie, and I realized that for our project to function properly, it needed to be wireless. We figured that out with help from Lisa, ITP’s resident hula hoop champion:
And the problems that we faced:
First, trying to set up the wireless communication using XBees turned out to be WAY more complicated than we initially expected. Though we managed to get the XBees talking to each other, we could not get the accelerometer data to pass through them correctly from one end to the other (there’s a bare-bones sketch of the approach we were going for right after this list).
Second, the accelerometer we ordered didn’t arrive until the week our project was due. Had it shipped faster, we could have had a full two weeks to do substantive work on the project. We also didn’t think to just borrow an accelerometer from someone for those two weeks, which in retrospect seems mind-numbingly obvious. Sigh.
Third, we changed our idea too many times. We spent a TON of time brainstorming ideas, and even after we settled on one, we ended up changing our minds a couple more times. This was a mistake. It’s great to brainstorm, but if we had known from the beginning that we’d be using the hula hoop, we could have ordered a different sensor altogether (maybe used the much simpler FSRs instead!).
And finally, in all caps because this was the single most limiting factor: TIME. We didn’t give ourselves enough of it. Between the idea changing and the late shipping, we really crippled ourselves and underestimated how long everything would take; it all worked against us.
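For anyone curious what the wireless link was supposed to look like, below is a bare-bones sketch of the approach we were attempting on the hoop side. It assumes the XBees are configured in transparent (AT) mode, so whatever the hoop’s Arduino prints to serial simply comes out of the radio on the other end, and it assumes an analog accelerometer wired to pins A0, A1, and A2 (those pin choices are mine for illustration, not necessarily how we wired it).

// Hoop side: read an analog accelerometer and stream the readings out
// over serial. With the XBees in transparent (AT) mode, the radio simply
// forwards whatever is written to Serial.
// Pin assignments (A0, A1, A2) are assumptions for illustration.

const int xPin = A0;
const int yPin = A1;
const int zPin = A2;

void setup() {
  Serial.begin(9600);  // must match the baud rate the XBee is set to
}

void loop() {
  int x = analogRead(xPin);
  int y = analogRead(yPin);
  int z = analogRead(zPin);

  // One comma-separated reading per line, so the receiving side can
  // split on commas (or use Serial.parseInt()).
  Serial.print(x);
  Serial.print(",");
  Serial.print(y);
  Serial.print(",");
  Serial.println(z);

  delay(50);  // roughly 20 readings per second
}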
However, I still enjoyed working with Bruna and Annelie, and I learned a lot in the process, too.
I wish I had documented a bit more during the actual process (add that to the list of things I should have done), but there are some photos of the accelerometer, hula hoop, and XBee wiring below.
The other day, I was walking with fellow ITPer Naliaka along 57th when we happened upon this little gem of interactive technology:
There was music playing, the thing was called the Super Cruze Dance Party (SCDP), and the screen was taunting us with the challenge, “Can you outdance the Chevy Cruze Eco?” Obviously we had to try.
Unfortunately, we soon discovered that the answer was that we could not outdance said Chevy Cruze Eco, and after a few attempts and the realization that we actually had places to be and things to do and whatnot, we decided to leave.
But of course I came back to check out the SCDP again and yes, perhaps to secretly confirm my suspicion that the reason we were unable to beat it was really the way the whole piece was set up. You’ll have to believe me when I promise that I’m not being bitter right now, but after a day of watching multiple people experience the SCDP, I can pretty confidently say that the set-up and interface of this public advertisement (it may be disguised as a game, but that’s still what it is) are flawed.
When most people first walked by the large screen, they heard the music before they noticed exactly what was going on. I noticed that people in groups tended to stop more often to check out the screen than those who were walking alone. A large number of those people would dance along or nod their heads to the music and leave it at that.
It was another subset who actually realized that the screen was demanding more of them than just to dance on their own. They were supposed to dance according to the rules of the game.
So they did. Or at least, they tried to. Except that the instructions were small and poorly located, so barely 30 seconds after a participant had pressed the start button and started dancing to his or her heart’s content, a message appeared on the screen telling them that they had just lost. Confused, they would try one more time, and only then would they notice that there were arrows on the far sides of the screen (far as in out of one’s direct arm reach). So then they would try to follow the arrows a la DDR, because that at least made sense. Participants would dance or shimmy to the left or the right as the arrow demanded. And then they would get a message that they had lost again.
It would take another try or two for them to discover that the arrows actually needed to be touched! So that whole thing about this being a dance party? Not quite true, unless “dance” actually means “jump from left to right, frantically trying to slam your hand against an arrow”. It was at this point that participants would be faced with a sad reality: the touch screen didn’t actually work quite right. No matter how quickly you pressed an arrow, it was never quick enough for the touch screen to register that you had hit it. In other words, even if you played the game right, you still lost. It was an unwinnable game.
Of course, not everybody stuck around long enough to even figure out how to play the game. A lot of people lost interest once they realized that a simple two-step wasn’t enough for the SCDP. More people lost interest when they couldn’t figure the game out after a few tries. And some of the people who figured out the rules of the game and still couldn’t win appeared visibly frustrated.
At the end of the day, it seemed like the people who had the most fun were the ones who didn’t even bother to really engage with it, or who engaged with the SCDP according to their own rules: by dancing to the music and continuing on their merry ways.
This lab was a little weird for me. I collaborated (once again) with Alessandra, and our main issue was that we couldn’t get our values to map correctly. You’ll see in the code below that we could only get the photoresistors to read from about 400 to 600. Our melody sounded a bit off, and I think our odd mapping may have had something to do with it.
Here’s a picture of our board with the speaker and the photoresistors.
And this was when we first hooked things up so that different tones would play as light was added to and taken away from the photocells. The sound coming from them is the annoying squealy one; the background chatter is just there because I was too lazy to find an empty room to film in.
And here’s the video where we programmed it to play a little melody. It’s a little hard to hear, but exciting nonetheless.
I’d like to mess around with this lab some more. The first thing I’ll do is try to see if I can get the proper values to map, and the second will be to figure out how to program my Arduino to play this. Once again, stay tuned, guys.
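For reference, here is a minimal sketch of the kind of mapping I have in mind: it constrains the photocell reading to the roughly 400-to-600 range we actually measured and maps that onto a frequency range for tone(). The pin numbers (A0 for the photocell, pin 8 for the speaker) are assumptions for illustration, not necessarily how we wired it.

// Map a photocell reading onto a tone frequency.
// The 400-600 range is roughly what our photoresistors actually gave us;
// the pin numbers (A0 and 8) are assumptions for illustration.

const int photoPin = A0;
const int speakerPin = 8;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(photoPin);

  // Clamp to the range we observed, then map it onto audible frequencies.
  int clamped = constrain(reading, 400, 600);
  int freq = map(clamped, 400, 600, 120, 1500);

  tone(speakerPin, freq);
  Serial.println(reading);  // handy for checking what range you really get
  delay(20);
}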
The construction of our little handometer was definitely the more challenging part, but ultimately it proved pretty satisfying. There were two pieces that we focused on. First, the stuffing (and FSR-attachment) of the glove (materials pictured below):
And second, the construction of the actual device: the boxy part that would give the strength readings and also give our hand nice, steady support. We focused first on the board that would show the reading. We found some nice slabs of scrap wood that could function as the base and sides of the structure, and then found another piece to be the top. On that last piece, we used the laser cutter to write in the three different levels: “Ouch”, “Firm!” and “Wimp”.
Those three ping-pong-looking balls functioned as our diffusers. They were hot-glued to the wood and covered the holes we drilled for our LEDs. With the balls on top of them, the light from the LEDs would be more nicely diffused, giving us the light-up effect that we were going for.
We stuck our little Arduino and breadboard into the box that we had created, making sure that all of the wires fit through (a bit of soldering was needed to keep everything well connected, just FYI)…
…and were pretty happy when we discovered that the wiring still worked:
(It’s impossible to show all of the troubleshooting that went on, but we ended up needing super-bright LEDs that would fill up the entire plastic ball, so we said sayonara to the ones that I showed in the first post).
Confident that the circuitry would still function correctly within our woodwork, we got another slab of wood and laser-cut the phrase “Nice to Meet You” onto it. This was the board that the hand would be stuck to…
So by the end of that day, we had the basic outline of our final product, minus all the bells and whistles:
Niiiice. So the next day was all about filling in the details. Like the color on the words….
And the filling in of the hand (we named him Ferdinhand, silent “h” of course), so that it (he?) would feel realistic…
Buying a larger FSR, drilling a hole for it through the board, and then connecting the FSR…
and finally, drilling all of the boards together.
And all of that work came together into one final product:
Nice to meet you, indeed.
Check out our final video to see little Ferdinhand working successfully from start to finish (or actually, finish to start). Credit to Alessandra for making it.
And for the curious, here’s a shot of the code. Click to enlarge.
You’ll have to excuse the pun in the entry title, but I really couldn’t help it, it was just too easy. You’ll see why in a minute.
The Stupid Pet Trick is a physical computing assignment that every student has to complete. It is essentially a simple device that responds to a physical action and shows that we understand the basics of digital and analog inputs and outputs. I teamed up with Alessandra to create a handshake meter–a device that measures and tells you how strong your handshake is. As someone who has had to shake a lot of hands, I can tell you that this device should be well-appreciated…in the past, it has taken my hand a good five minutes to recover from the pain of a bone-crushing shaker or the creepiness that a limp fish handshake brings. Ugh…I get shivers just thinking about it.
Our project would require a force sensor inside the glove that could read different amounts of force and translate those readings into three respective LEDs (so that if someone gave a weak squeeze/shake, one LED would light up, a firm squeeze would light up the second one, and a crazy, super-hard shake would cause the third to light up). So step one was figuring out the circuitry of the project:
As you can tell, we began by wiring simply, with just one LED. We really wanted to make sure that we could get our FSR to light that first LED only within the input range we wanted. Once that worked, we stuck in more LEDs and mapped out more values.
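To give a sense of the logic, here is a minimal sketch of the thresholding we were after: one analog FSR reading, three LEDs, and cutoff values separating a wimpy squeeze from a firm one from a bone-crusher. The pin numbers and threshold values below are placeholders for illustration, not the values we finally calibrated.

// Handshake meter logic: light one of three LEDs depending on how hard
// the FSR is squeezed. Pin numbers and thresholds are placeholders,
// not our final calibrated values.

const int fsrPin = A0;
const int wimpLed = 2;
const int firmLed = 3;
const int ouchLed = 4;

void setup() {
  pinMode(wimpLed, OUTPUT);
  pinMode(firmLed, OUTPUT);
  pinMode(ouchLed, OUTPUT);
}

void loop() {
  int force = analogRead(fsrPin);  // 0-1023

  // Turn everything off, then light the LED for the current range.
  digitalWrite(wimpLed, LOW);
  digitalWrite(firmLed, LOW);
  digitalWrite(ouchLed, LOW);

  if (force > 800) {
    digitalWrite(ouchLed, HIGH);   // bone-crusher
  } else if (force > 400) {
    digitalWrite(firmLed, HIGH);   // nice and firm
  } else if (force > 100) {
    digitalWrite(wimpLed, HIGH);   // limp fish
  }

  delay(50);
}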
These spindly little guys were the lights we were initially planning on using (along with the FSR we originally wanted). As the project expanded, you probably won’t be too surprised to learn that we ended up having to change those plans.
But the surprise of our first day of work was how relatively simple the coding and wiring turned out to be. Stay tuned for the next post, which documents the part of the project that was really more difficult to get a grip on: the actual construction and presentation of our handy little device.
This post documents my first foray into the world of using Arduino for more than blinking LEDs! We’re finally getting into the art of making things move, and this week we focused on motors. Servomotors, to be more specific. The beauty of servos is that they can be moved to specific angular positions; all you have to do is send the servo a signal based on the reading from a variable resistor (I used a potentiometer, just to break up the FSR rut that my Stupid Pet Trick plunged me into).
I decided to actually provide some good documentation this week, so I grabbed a Panasonic LX5 from the ER and even (gasp!) a tripod. But even though I recorded some nice video, my devices revolted and refused to read any of the stuff that I shot. The best that I could do was one screenshot.
You can see in this one how the servomotor and potentiometer are connected to output and input pins (respectively). The white tip on the servomotor would move (up to 280 degrees) according to how I adjusted the potentiometer, and when I turned the potentiometer up to its highest reading of 1023, the servo started to buzz and shake.
See, even when I try to set things up all nicely, the end result is still a blurry photo. Sigh.
Here’s the code (in italics) to make up for the lack of video documentation and prove that I did it:
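A bare-bones version of it, assuming the standard Servo library with the potentiometer on A0 and the servo signal on pin 9 (the pins are placeholders; adjust them to match your wiring):

// Turn the potentiometer and the servo follows: read the pot, map its
// 0-1023 range onto a servo angle, and write that angle to the servo.
// Pins (A0 for the pot, 9 for the servo) are placeholders.

#include <Servo.h>

Servo myServo;
const int potPin = A0;
const int servoPin = 9;

void setup() {
  myServo.attach(servoPin);
}

void loop() {
  int potValue = analogRead(potPin);           // 0-1023
  int angle = map(potValue, 0, 1023, 0, 180);  // standard servo range
  myServo.write(angle);
  delay(15);  // give the servo a moment to reach the new position
}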