I was inspired by the Virtual Reality talk from the class syllabus. Specifically, I was intrigued by face detection and face tracking, and by how one perceives others in relation to oneself physically. I decided to take FaceShift for a spin and create my own avatar.
After an initial five-minute calibration, FaceShift not only recognizes your distinct face but also tracks 30 different facial features over time and sends them out via OSC. Below is a chart tracking smile and eyebrow movement over time.
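To get a feel for what that OSC stream looks like on the wire, here is a minimal sketch of an OSC message decoder in plain Python, handling only float arguments (which is what the tracked feature coefficients are). The address pattern `/blendshape/smile` is a hypothetical example, not FaceShift's actual address scheme; in practice you would point a library like python-osc at the port FaceShift broadcasts on rather than parsing bytes by hand.

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    return s, (end + 4) & ~3  # skip past the null padding

def parse_osc_message(data):
    """Parse a simple OSC message whose arguments are all float32."""
    address, offset = _read_padded_string(data, 0)
    type_tags, offset = _read_padded_string(data, offset)  # e.g. ",f"
    args = []
    for tag in type_tags[1:]:  # first character is always ","
        if tag == "f":
            (value,) = struct.unpack_from(">f", data, offset)  # big-endian float
            args.append(value)
            offset += 4
    return address, args

def _pad(s):
    # Helper to build a test packet: null-terminate and pad to 4 bytes.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

# Simulated packet: one smile coefficient (address is a made-up example).
packet = _pad("/blendshape/smile") + _pad(",f") + struct.pack(">f", 0.42)
address, args = parse_osc_message(packet)
print(address, args)
```

Logging these per-frame values with timestamps is all it takes to reproduce a chart like the one above.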
Yet, in the end, these are only bits of data that don't give any insight into attitudinal or emotional change. Coincidentally, I received an email offer for YouEye, which uses your computer's webcam to track changes in mood.
I also ran into an article in which researchers were using MATLAB to track changes in a user's emotion over time.