Black = our project; Blue = example study
Question & purpose: how to show your work in public areas (locations), and how to reach the ideal situation you want (how to show it better, in a suitable way, quantified)
Analyze personal emotion and group emotion (self)
Think about this project from: User Experience Design Aspect
"From User-Centered to Participatory Design Approaches" gives me some basic information about user experience design and psychology
http://www.maketools.com/articles-papers/FromUsercenteredtoParticipatory_Sanders_%2002.pdf
Useful sentences:
In the user-centered design process, we are focused on the thing being designed (e.g., the object, communication, space, interface, service, etc.), looking for ways to ensure that it meets the needs of the user.
The application in the user-testing area:
The social scientist/researcher serves as the interface between the user and the designer. The researcher collects primary data or uses secondary sources to learn about the needs of the user. The researcher interprets this information, often in the form of design criteria. The designer interprets these criteria, typically through concept sketches or scenarios. The focus continues then on the design development of the thing. The researcher and user may or may not come back into the process for usability testing.
Design for Experiencing
Today we are beginning to hear about “Experience Design,” whose aim is to design users’ experiences of things, events and places. This influence on design can be attributed to a significant literature being written in the social sciences that has begun to acknowledge the role of emotions in human experience (see Jensen, 1999 for example).
Furthermore, as we know, a user's behavior is affected by the awareness of being observed. And if we ask them to fill in questionnaires or sit for interviews, they often tell us only what they want us to hear. So unobtrusive observation is also a good way to get results that are closest to people's natural, personal feelings.
The conclusion: why is our project meaningful?
It is about the recognition that all people have something to offer and that they, when given the means to express themselves, can be both articulate and creative.
How can this system be used in data analysis?
1. Get to know personal interests and tastes
2. Get to know public interests and tastes
3. Help the decision-making process, by providing the collected data results to a person
4. Do research about exhibition or showcase spaces, to better plan and organize an exhibition (for example: which position or wall do visitors notice or pay attention to first? see the sketch after this list)
5. Get first-hand data about users' feelings; it is also a good user-testing method (in some shops, for example)
6. Offer potentially interesting options based on facial expression and personal interests
7. Humans interact with each other mainly through speech, but also through body gestures, to emphasize a certain part of the speech and display of emotions. Emotions are displayed by visual, vocal, and other physiological means. There is a growing amount of evidence showing that emotional skills are part of what is called ‘‘intelligence’’ [16,36]. One of the important ways humans display emotions is through facial expressions.
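To make point 4 concrete, here is a hypothetical sketch (in Python) of how logged observations could be aggregated per exhibition position to see which spot draws attention first. The record format (position id, seconds of attention, emotion label) is invented for illustration; our real system would define its own.

    from collections import defaultdict

    # Each record: (wall_or_position_id, seconds_of_attention, emotion_label)
    # These values are made up for the example.
    log = [
        ("wall_A", 12.0, "interested"),
        ("wall_B", 3.5, "neutral"),
        ("wall_A", 8.0, "happy"),
        ("entrance_case", 1.0, "neutral"),
    ]

    attention = defaultdict(float)                     # total seconds per position
    emotions = defaultdict(lambda: defaultdict(int))   # emotion counts per position
    for spot, seconds, emotion in log:
        attention[spot] += seconds
        emotions[spot][emotion] += 1

    # Rank positions by total attention to inform the exhibition layout.
    for spot, total in sorted(attention.items(), key=lambda kv: -kv[1]):
        print(spot, total, dict(emotions[spot]))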
Related example from design perspective:
"The influence of prototype fidelity and aesthetics of design in usability tests: Effects on user behaviour, subjective evaluation and emotion" gives me some basic information about user testing
http://www.sciencedirect.com/science/article/pii/S0003687008001129
Think about this project from: Technology Aspect
How to track human beings' emotions from facial expressions
Mostly related:
Facial expression recognition from video sequences: temporal and static modeling
In this work we report on several advances we have made in building a system for classification of facial expressions from continuous video input.
http://ac.els-cdn.com/S107731420300081X/1-s2.0-S107731420300081X-main.pdf?_tid=9acf5e8e-b522-11e3-aab0-00000aacb361&acdnat=1395864778_d5e622fa00c01bd8c6a62aa22df7d562
Dynamics of facial expression extracted automatically from video
http://ac.els-cdn.com/S0262885605001654/1-s2.0-S0262885605001654-main.pdf?_tid=a4ba168c-b522-11e3-9ef4-00000aab0f26&acdnat=1395864794_c1fce58c8f5ba828748698e6a9710d3a
Three-Dimensional Head Tracking and Facial Expression Recovery Using an Anthropometric Muscle-Based Active Appearance Model
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4522536
Measuring emotion: The self-assessment manikin and the semantic differential
http://ac.els-cdn.com/0005791694900639/1-s2.0-0005791694900639-main.pdf?_tid=ea81a798-b522-11e3-8a70-00000aacb360&acdnat=1395864912_5a69046afb2509d13a4bfdb0fa9e4f4c
Concept of Ubiquitous Stereo Vision and Applications for Human Sensing
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1222176
Great examples:
1. Design of a Social Mobile Robot Using Emotion-Based Decision Mechanisms
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4058870&tag=1
I think this is a good example for us to learn from: it shares a similar foundational idea and intention with ours, but realizes it in another way.
The paper describes a robot that interacts with humans in a crowded conference environment. The robot detects faces, determines the shirt color of onlooking conference attendants, and reacts with a combination of speech, musical, and movement responses. It continuously updates an internal emotional state, modeled realistically after human psychology research. Using empirically determined mapping functions, the robot's state in the emotion space is translated into a particular set of sound and movement responses. The robot's goal is to show the potential of emotional modeling to improve human-robot interaction.
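A minimal sketch of that last idea, mapping a point in a continuous emotion space to discrete sound/movement responses. The two axes (valence, arousal), the decay rate, and the thresholds below are all placeholders of mine; the paper's actual model and mapping functions are determined empirically.

    from dataclasses import dataclass

    @dataclass
    class EmotionState:
        valence: float = 0.0  # unpleasant (-1) .. pleasant (+1)
        arousal: float = 0.0  # calm (-1) .. excited (+1)

        def update(self, d_valence, d_arousal, decay=0.95):
            # New stimuli nudge the state; the old state decays toward neutral.
            self.valence = max(-1.0, min(1.0, self.valence * decay + d_valence))
            self.arousal = max(-1.0, min(1.0, self.arousal * decay + d_arousal))

    def choose_response(state):
        # Placeholder mapping: the quadrant of the emotion space selects a
        # sound/movement behavior (the paper fits these functions empirically).
        if state.arousal > 0.3:
            if state.valence > 0:
                return "play upbeat music, approach"
            return "back away, play alarm sound"
        if state.valence > 0.3:
            return "gentle greeting speech"
        return "idle wandering"

    robot = EmotionState()
    robot.update(d_valence=0.4, d_arousal=0.5)  # e.g. a face was just detected
    print(choose_response(robot))               # -> "play upbeat music, approach"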
Using an onboard camera, it detects faces and determines the presence of onlooking people. It does not detect expressions directly; instead, it uses other input (like the color of the user's shirt) to help it interact with users (which makes the project less meaningful in this case). I think a camera-based expression detection system would be very difficult for us to build.
Face recognition tech: OpenCV is used here as well
Details here:
To do face detection, OpenCV’s [5] object detection function was used. This function is based on the Viola-Jones face detector [10], which was later improved upon by Rainer Lienhart [7]. It uses a large number of simple Haar-like features, trained using a boost algorithm to return a 1 in the presence of a face, and a 0 otherwise. The OpenCV object detector takes a cascade of Haar classifiers specific to the object being detected, such as a frontal face or a profile face, and returns the bounding box if a face is found. An included cascade for frontal faces was used for this system.
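As a rough sketch, this is what that pipeline looks like with OpenCV's Python bindings. The frontal-face cascade is the one shipped with OpenCV (in recent opencv-python builds it lives under cv2.data.haarcascades); the camera loop around it is my own assumption, not from the paper.

    import cv2

    # Load the bundled frontal-face Haar cascade shipped with OpenCV.
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_cascade = cv2.CascadeClassifier(cascade_path)

    cap = cv2.VideoCapture(0)  # default camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Returns one bounding box (x, y, w, h) per detected face.
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()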
To differentiate actual faces from pictures and other “face-like” stationary objects, we added a motion check based on a difference filter. Whenever the Haar detector reports a face, the robot stops, and waits for a set time interval to eliminate any oscillations in the camera boom. Once the camera is perfectly still, the difference operator is executed over a few frames in the bounding box of the face, and the area under it, where the body of the person is supposedly located. If sufficient motion is found (defined by an empirical threshold), the robot transitions from the Wander to the Person state. The motion check coupled with the Haar cascade proved reliable and accurate in all situations where sufficient lighting was present.
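A minimal sketch of that motion check, assuming the same OpenCV setup as above: difference a few consecutive frames inside the face's bounding box and compare the count of changed pixels to a threshold. The frame count and both thresholds here are placeholders; the paper only says its threshold was determined empirically.

    import cv2
    import numpy as np

    def is_real_face(cap, box, frames=5, pixel_delta=25, motion_ratio=0.02):
        """Return True if enough motion is seen inside box = (x, y, w, h)."""
        x, y, w, h = box
        ok, prev = cap.read()
        if not ok:
            return False
        prev = cv2.cvtColor(prev[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        moving = 0
        for _ in range(frames):
            ok, cur = cap.read()
            if not ok:
                break
            cur = cv2.cvtColor(cur[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(cur, prev)  # per-pixel difference filter
            moving += int(np.count_nonzero(diff > pixel_delta))
            prev = cur
        # "Sufficient motion" = changed pixels exceed an empirical fraction
        # of the region's area across the sampled frames.
        return moving > motion_ratio * w * h * frames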
Another great example:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=729538
It also mentions combining the system with GPS and a head-tracker
Another example, a little too science- and tech-heavy for me:
LAFTER: a real-time face and lips tracker with facial expression recognition
http://ac.els-cdn.com/S0031320399001132/1-s2.0-S0031320399001132-main.pdf?_tid=1f37f24e-b523-11e3-920e-00000aacb35f&acdnat=1395865000_d8cf4d16341b1580fefa5f5244391d6d