After understanding how data comes out of Faceshift, it was time to put it to work. The data is very granular, so analyzing a full night's sleep would mean hundreds of thousands of data points. Initially I wanted to track my own facial gestures in response to given inputs, but tracking myself was bound to be influenced by my own knowledge of the test. After reading and watching some of the homework material, I wanted to test the idea of mimicry. Todd was keen to play around with Faceshift, so I helped him get set up and tried to test mimicry through a staring contest, subtly attempting to influence his gestures. I tracked smiling, blinking, eyebrows up, eyebrows down, and lateral and vertical head movements. It worked to a certain extent, but Todd was far too good at staring to be influenced much, and he had a pretty good idea of what I was up to.
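Because the data is so dense, a single blink shows up as many consecutive frames with an elevated blink coefficient rather than as one data point. A minimal sketch of how a per-frame series like that could be collapsed into countable events, assuming Faceshift-style blendshape coefficients in the 0 to 1 range (the threshold values here are hypothetical, not something Faceshift prescribes):

```python
def count_events(series, on=0.5, off=0.3):
    """Collapse a dense per-frame coefficient series into discrete events.

    Uses hysteresis: an event starts when the value rises above `on`
    and ends when it falls back below `off`, so jitter around a single
    threshold is not counted as many separate events.
    """
    events = 0
    active = False
    for value in series:
        if not active and value > on:
            active = True
            events += 1
        elif active and value < off:
            active = False
    return events

# Example: a noisy blink trace, one value per frame.
trace = [0.0, 0.1, 0.6, 0.9, 0.45, 0.8, 0.2, 0.0, 0.7, 0.1]
print(count_events(trace))  # -> 2
```

The same function works for any of the tracked channels; only the thresholds would need tuning per gesture, since a smile coefficient behaves very differently from a blink.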
Here is the blink data.
And here is the smile data. I was obviously smiling way more than he was.