Augmented Reality Doll
Inspiration: Polly Pocket
* Theme idea: a combination of Disney princesses and female superheroes; we plan to settle on about 3-4 different characters
* Possible Superheroes:
* Jem and the Holograms
* GI Jane
* Storm (X-men)
* Possible Princesses:
* Snow White
* Wendy (Peter Pan)
* Little Mermaid
* Lion King
* Possible scenes: will change based on characters
* Will be designed using Photoshop or Illustrator
AR code: we’ll develop it using openFrameworks
Our group decided that our physical interface will look like a bamboo drum set, with each bamboo pole sounding like a Chinese bamboo instrument. In this way, we are trying to create a natural, organic sound through digital interactions. The picture below is not the final outcome, but the piece will look something like it.
We plan to embed an ultrasonic distance sensor in each bamboo pole, and we will put a breadboard and an Arduino inside the bottom stand.
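As a rough sketch of the sensing math (the 58 µs/cm constant is standard for common ultrasonic range sensors like the HC-SR04; the function names and the trigger range are our own assumptions, not final code):

```cpp
#include <cassert>

// Convert an ultrasonic echo pulse width (microseconds) into a
// distance in centimeters. Sound covers ~1 cm in ~29 microseconds,
// and the echo pulse measures the round trip, so divide by ~58.
double pulseToCm(double pulseMicros) {
    return pulseMicros / 58.0;
}

// Decide whether a hand is "playing" this pole: the reading must be
// a valid echo (positive) and closer than the chosen trigger range.
bool handInRange(double cm, double triggerCm) {
    return cm > 0.0 && cm < triggerCm;
}
```

On the Arduino itself, the pulse width would come from `pulseIn()` on the sensor’s echo pin; the conversion and thresholding stay the same.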
Sensors from Amazon (x2): $13.36
Drum Set using Arduino and Range sensor
Play test questions:
- What are your thoughts on the functionality of this “drum set”?
- Is it easy to use?
- What suggestions do you have for improving the functionality of this piece?
- How do you feel about the use of a loop vs. an individual sound?
- Overall, respondents were fine with how the drums functioned, since they are activated by motion rather than by touch
- One suggested change included adjusting the distance/range of the motion sensor so that motion is detected a little further
- Respondents generally preferred the looped sounds for the purpose of this piece
- Easy to use
- Another suggestion was to make the sound bigger the farther away your hand is, so that the volume increases or decreases with distance from the sensors
- A third suggestion was to have the loop remember its position when the hand is removed, resuming where it left off once the hand is back in range
- Can we ensure that the sensors stay in unison in terms of speed?
- Is there a way to make sure that someone doesn’t hit the instrument?
- Can we create a visual that indicates hand motion?
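One way the resume-the-loop suggestion could work is to track the playback position separately from whether a hand is present, so pausing never resets it. A minimal sketch (the struct and its fields are hypothetical, not from our actual code):

```cpp
// Playback state for one pole's sound loop. Position advances only
// while a hand is detected, so it is preserved across pauses and the
// loop resumes where it left off.
struct LoopState {
    int position = 0;   // current sample index within the loop
    int length = 1;     // total samples in the loop
    bool playing = false;
};

// Advance the loop by `samples` samples only while a hand is detected.
void update(LoopState& s, bool handDetected, int samples) {
    s.playing = handDetected;
    if (s.playing) {
        s.position = (s.position + samples) % s.length;
    }
}
```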
Possible revisions to the design based on feedback:
- Name the device so that users are less inclined to touch it instead of using motion to activate the sound
- Motion drum
- Movement drum
- Create a visual to accompany the piece indicating how it is used
- Adjust the sensors so that motion affects the volume of each loop and is detected from further away
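A possible mapping for the volume revision, as a sketch: here the hand gets louder as it gets closer, though the feedback also suggested the opposite direction, and both the 5–60 cm playable range and the function name are placeholders we made up.

```cpp
// Map a measured hand distance (cm) to a loop volume (0-255),
// assuming a playable range of MIN_CM..MAX_CM. In this version a
// closer hand means a louder loop; flipping the mapping would give
// the "bigger sound the farther away" variant instead.
int distanceToVolume(double cm) {
    const double MIN_CM = 5.0;
    const double MAX_CM = 60.0;
    if (cm <= MIN_CM) return 255;
    if (cm >= MAX_CM) return 0;
    return static_cast<int>(255.0 * (MAX_CM - cm) / (MAX_CM - MIN_CM));
}
```

Widening `MAX_CM` would also address the earlier note about detecting motion from a little farther away.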
Playtest sample video:
I have seen projects like this video mirror before. I’ve been interested in creating projects using the Xbox Kinect and point clouds, and I’ve also seen Romy Achituv’s actual project, Text Rain. Computer vision is a simple and accessible way to communicate with computers.
After reading the blog post, hand-as-cursor projects seem the most compelling to me. Before I came to ITP, I made several mini-projects related to augmented reality, and I had a hard time writing code that detects human finger and hand movements. As Tom Igoe wrote in the post, it is very difficult to track and compute human fingers and hands against a colorful background. Blacking out the background is one solution, but searching for a better one still sounds compelling. If it works out, hand tracking offers many opportunities for interactive artworks: hands are our main medium for gestures, and many people expect future interfaces to rely on hand tracking. On the other hand, the approach is weak on detail. When people gesture, they often express the details with their fingers, and those finger movements have to be computed and simplified during tracking. This weakness narrows what interactive artists can do with it.
We all know that interactive artworks use the human sensory systems (sight, voice, hearing, touch, etc.) as their main mediums. Interactive artworks share similar patterns: they react to those sensory systems and use them to explore virtual, graphic worlds. However, as Tom mentions on the blog, human emotional states are subtle signals that computers often misunderstand and miscalculate, so reading human emotions and moods is one of the weak points of interactive art. This is the particular area I hope to investigate in more depth; as I study at ITP, I want to develop interactive art that visualizes or reacts to those subtle signals.
For this week’s lab, I decided to review the basics of using LEDs and a potentiometer. I have used an Arduino before, but I hadn’t touched one for six months, so this lab was a good opportunity to revisit the basics of how electricity flows.
My task was to connect several LEDs and a potentiometer on a breadboard, making a light-up smiley face. The potentiometer dims the connected LEDs.
The video below clearly shows how I used the potentiometer to dim and brighten the LED smiley face. The change in the yellow LEDs is hard to see on camera, but they really do fade out.
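The dimming comes down to scaling the potentiometer’s 10-bit `analogRead()` value to the 8-bit range `analogWrite()` expects; a sketch of that scaling (the function name is ours, and the arithmetic matches Arduino’s `map(value, 0, 1023, 0, 255)`):

```cpp
// Scale a 10-bit potentiometer reading (0-1023) down to an 8-bit
// PWM brightness (0-255). Turning the pot sweeps the LEDs from
// fully off to fully on.
int potToBrightness(int adc) {
    return adc * 255 / 1023;
}
```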
This is a public interactive technology project called Hand From Above by Chris O’Shea. Using the BBC’s Big Screens, which are installed in various UK city centres, Hand From Above playfully detects and transforms passers-by. People can interact with the giant hand on the screen, and they seem to enjoy this surprising interaction.
Passers-by are detected by a hidden camera, and the screen draws a giant hand that follows a randomly chosen person. The image below shows how the camera picks someone out.
Detecting and computing a walker’s moving body seems to be the biggest challenge in this project. I wonder how Chris developed an algorithm to choose a random person out of the crowd.
This project also offers a very easy way to interact with people. In the busy rush of daily life, this giant hand on a screen gives people not only a laugh but also a moment of pause. Since the giant hand tracks only one person, the audience doesn’t need to participate as a group; a single user can be fully engaged.
While observing this project, I reflected on this week’s reading. Norman emphasizes the balance between beauty/design and usability: it is important to evoke positive emotional responses in users. I agree deeply with Norman here, because I believe design plays a significant role in engaging subtle human senses such as emotion and mood, and that future interfaces should employ attractive designs that visualize human emotional states. In this project, users are engaged and affected emotionally: with the very minimal design of a giant hand, people are delighted and influenced positively.
You can read more information HERE
Reading Chris Crawford’s definition was a great opportunity to think about what physical interaction means. By his definition, interaction is like a conversation: we have to listen, think, and respond. As a developer with a programming background, I saw Crawford’s definition as a new approach to interactive design, since he brings a more humanities-oriented perspective than I do. He essentially defines interaction as communication, and I had never thought about interaction that way.
From my perspective, interactive design is like a duck swimming on a pond. The scene looks serene, but beneath the surface the duck is paddling hard. I believe that paddling represents the programming in interactive design: to build an engaging design, solid algorithms and code are required. This also shows that I had never considered interactive design from the user’s perspective, with user needs in mind.
Crawford emphasizes what humans need in order to communicate. Human needs vary with the situation; in each one, people demand different things, and designers must account for those varying demands and wishes to create good interactive design. I agree with his thoughts, and I would add that good interactive design should have a simple, easy-to-use interface that visualizes any data or information clearly.
With this in mind, I could see that Bret Victor’s rant emphasizes good interaction design for future interfaces. In the video, every interface looks simple and minimal and visualizes information graphically.
From the readings, therefore, I now believe that good interaction should include:
- a function that communicates with users
- a function that is simple and easy to use
- a function that understands and accounts for human needs in each situation
- a function that visualizes data/information received from users