Update on Eye Tracking:
For the midterm I want to finish the eye tracking glasses I have been working on and test out some of the concepts I think they may be interesting for. One of those is comparing eye location to GSR readings. The other is simply representing eye movement and what the user is looking at, to see whether exposing this information changes regular eye contact and conversation in any way.
I am using the instructions from the eyewriter project to create the set of glasses, but adding an additional outward-facing camera to track what is being seen. So far I have the camera set up on the glasses (removed in the pic), but I still need to work on the software for tracking pupil movement... not sure how that is going to work out.
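I haven't settled on the pupil-tracking software yet, but a minimal sketch of one common starting point, assuming a grayscale frame from the eye camera as a NumPy array: under the eyewriter-style IR illumination the pupil is usually the darkest region, so threshold the dark pixels and take their centroid. The frame below is synthetic; a real version would grab frames from the camera and need a tuned threshold.

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Estimate the pupil center as the centroid of the darkest pixels.

    gray: 2D array of grayscale intensities (0 = black).
    threshold: intensities below this count as "pupil-dark" (needs tuning).
    Returns (row, col) of the centroid, or None if nothing is dark enough.
    """
    rows, cols = np.nonzero(gray < threshold)
    if len(rows) == 0:
        return None
    return rows.mean(), cols.mean()

# Synthetic eye frame: light background with a dark "pupil" disk at (60, 80).
frame = np.full((120, 160), 200.0)
yy, xx = np.mgrid[0:120, 0:160]
frame[(yy - 60) ** 2 + (xx - 80) ** 2 < 10 ** 2] = 10.0

print(pupil_center(frame))  # roughly (60.0, 80.0)
```

Tracking pupil movement would then just be watching how this centroid shifts frame to frame.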
I would love to be able to stream this info wirelessly, so that the glasses can be mobile, and to connect it to a GSR sensor, but any of these pieces working would do.
Another thing I would like to experiment with is whether depth of focus can be tracked by using a camera on each eye, comparing how much the eyes rotate toward each other as well as side to side and up and down. When we look at something closer to us, our eyes converge more, and I wonder if this is easily trackable. The applications I have seen so far focus on viewing a two-dimensional surface, where this doesn't matter as much. It might also be worth combining this with a wide-ranging proximity sensor of some sort that can indicate how much to narrow the focus of the outward camera.
One of the things I’m really inspired by is the attention research done using eye tracking and a controlled subject.
I have been talking to Kimi about working on some Bluetooth sensors. I'm not sure Bluetooth is the best approach for what I would like to make (possibly for the final), but I am interested in sensing proximity and linking wearable devices through phones or other means. I would like to create a set of wearable devices, activated by proximity, that either light up or join forces via speakers to amplify the user's voice. This will go along nicely with my vibrant technology project, which is really similar except from the angle of the technology not being a tool: being useless, or even contrary.
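One lightweight way to sense proximity over Bluetooth is to use received signal strength (RSSI) with the standard log-distance path-loss model. The calibration values below (the expected RSSI at 1 m, the path-loss exponent, and the activation threshold) are hypothetical and would need tuning on real hardware; RSSI is also noisy, so real readings would want smoothing.

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Rough distance estimate from RSSI via the log-distance path-loss model.

    tx_power_dbm: expected RSSI at 1 m (a per-device calibration value).
    path_loss_exp: 2.0 in free space; typically higher indoors.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def should_activate(rssi_dbm, activate_within_m=2.0):
    """Light up / join speakers when the other wearable seems close enough."""
    return estimate_distance_m(rssi_dbm) <= activate_within_m

print(estimate_distance_m(-59))  # reading equal to the 1 m calibration -> 1.0
print(should_activate(-75))
```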
Kimi and I were talking about experimenting with a lightweight version of proximity sensing for the midterm, and I hope to have time to do this in addition to the eye tracking.