All posts by Salem Al-Mansoori

Rivet: Hardware + Software

 


Hardware:

The basic setup we opted for is the EEG sensor board contained in the MindFlex game device, or alternatively the MindWave headset, since the latter comes with Bluetooth connectivity built in. We decided to use an iPhone as the data collection device, and we plan to make use of its onboard sensors, mainly GPS, along with the light and sound sensors.

The reasons for opting to go with the iPhone are:

 

  1. We were not able to get the GPS shield to work reliably with the EEG sensor, but we were able to get the data from the iPhone.
  2. The MindWave headset comes with an accessible iOS SDK.
  3. Our vision for the project is to have the user receive feedback, both visual and possibly vibration-based, to help them better understand their attention patterns.

 

 

Software:

We are in the process of building the required pieces of functionality as a number of small demo applications. We will then put all the pieces together in the final prototype.

 

We have put together a basic mockup that we are following, and intend to improve as we go along:

[Screenshot: app mockup]

 

 

In terms of progress, we were able to both retrieve and map the user's location through two demo applications that we made, shown below.

[Screenshots: the two location demo applications, in the iOS Simulator and on the device]
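For reference, here is a minimal sketch of the approach behind the location demos, assuming a simple view controller that owns a CLLocationManager and an MKMapView. The demos themselves predate this code, so treat it as an illustration of the idea rather than the code we actually wrote.

```swift
import UIKit
import CoreLocation
import MapKit

// Illustrative sketch: stream GPS fixes and center a map on the latest one.
class LocationDemoViewController: UIViewController, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        view.addSubview(mapView)

        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()   // needs a usage string in Info.plist
        manager.startUpdatingLocation()
    }

    // Called whenever the device reports new GPS fixes.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        let region = MKCoordinateRegion(center: latest.coordinate,
                                        latitudinalMeters: 500,
                                        longitudinalMeters: 500)
        mapView.setRegion(region, animated: true)
    }
}
```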

 

Quantifying Cognition

In their article Activity Recognition for the Mind: Toward a Cognitive “Quantified Self,” researchers Kunze, Iwamura, and Kise argue that cognitive activities can be tracked (much as physical activity is) through means other than EEG, which measures differences in electrical potential produced by brain activity and manifested as brain waves. They propose an alternative to EEG because of what they see as its disadvantages: its invasiveness, its noisiness compared to muscle-movement trackers, and the time and effort its signal processing requires.

The alternative the authors present is eye-movement tracking, or EOG (electrooculography), which is arguably meaningfully correlated with cognition. For instance, a reader's comprehension of a word is proportional to the amount of time the eye stays fixated on it. Other information that can be collected through EOG includes the reader's degree of fatigue and level of attentiveness.

In addition, measuring the distance traversed by the eye's saccades can be used to determine parameters of the consumed media itself, including its type (a magazine page produces a different saccade pattern than, say, an article laid out in the standard scientific-paper format) and the number of words read.
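To make this concrete, here is a toy sketch (my own illustration, not from the paper) of how fixation durations and saccade lengths could be rolled up into rough reading metrics:

```swift
import Foundation

// Toy model of an EOG-derived reading session (illustrative only).
struct Fixation {
    let duration: TimeInterval      // seconds the eye rested on one spot
    let saccadeLength: Double       // degrees of visual angle travelled to reach it
}

struct ReadingMetrics {
    let estimatedWordsRead: Int
    let meanFixationDuration: TimeInterval
    let meanSaccadeLength: Double   // hints at layout/document type
}

// Assumes (very roughly) one fixation per word, and treats longer mean
// fixations as a proxy for more effortful reading, per the paper's argument.
func summarize(_ fixations: [Fixation]) -> ReadingMetrics {
    guard !fixations.isEmpty else {
        return ReadingMetrics(estimatedWordsRead: 0, meanFixationDuration: 0, meanSaccadeLength: 0)
    }
    let totalDuration = fixations.reduce(0) { $0 + $1.duration }
    let totalSaccade = fixations.reduce(0) { $0 + $1.saccadeLength }
    return ReadingMetrics(estimatedWordsRead: fixations.count,
                          meanFixationDuration: totalDuration / Double(fixations.count),
                          meanSaccadeLength: totalSaccade / Double(fixations.count))
}
```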

EOG, though cheaper than EEG methods, can also be intrusive. For instance, measuring the difference in electrical activity between the retina and the cornea requires constant physical contact between the device and the skin around the user's eye. An alternative would be to use infrared light to measure the shape of the iris, which can even be combined with the measurement and tracking of facial features to get more accurate data.

EEG, while invasive, is still a good way to measure a person's response to phenomena. According to NeuroPlace: Making Sense of a Place, a paper authored by Lulwa Al-Barrak and Eiman Kanjo, EEG has relevant applications such as quantifying cognition and mental state during learning, determining whether a driver is fatigued, and even tagging multimedia according to emotion.

To that end, the authors combined mobile technology (for GPS location tracking and session marking) with the EEG parameters readily accessible through the MindWave Mobile, an EEG headset made by NeuroSky, to measure a person's mental “response” to a place. The headset readily measures the intensity of various brainwave bands, including delta, theta, and multiple bands of alpha, beta, and gamma, and accordingly derives the user's levels of “attention” and “meditation.”
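As a rough illustration of the kind of data this exposes (the type and field names below are my own, not NeuroSky's SDK types), each reading can be thought of as a sample of band intensities plus the derived attention and meditation values, which can then be smoothed before driving any feedback to the user:

```swift
import Foundation

// Illustrative shape of one headset reading (not the actual SDK types).
struct EEGSample {
    let timestamp: Date
    let attention: Int              // derived "attention" value, 0–100
    let meditation: Int             // derived "meditation" value, 0–100
    let delta, theta: Double        // band intensities
    let lowAlpha, highAlpha: Double
    let lowBeta, highBeta: Double
    let lowGamma, midGamma: Double
    let poorSignal: Int             // reported signal-quality / noise level
}

// Simple exponential smoothing of attention, e.g. to drive visual or
// haptic feedback without reacting to every noisy sample.
func smoothAttention(samples: [EEGSample], alpha: Double = 0.2) -> [Double] {
    var smoothed: [Double] = []
    var current = 0.0
    for (i, s) in samples.enumerated() {
        current = i == 0 ? Double(s.attention)
                         : alpha * Double(s.attention) + (1 - alpha) * current
        smoothed.append(current)
    }
    return smoothed
}
```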

An interesting piece of methodology in the authors' work is the use of correlation with environmental noise to remove noise from the signal. While no further details are provided on how they achieved this, the possibility offers a remedy to the long-standing noise problem in EEG-based apparatus. (NeuroSky reportedly provides information on the level of captured noise as well.) Furthermore, the authors were able to categorize places using statistical methods, including logistic regression. Each data point was also labeled with a timestamp as well as GPS coordinates.
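For a sense of how the categorization step could look (a sketch with made-up feature choices and labels, not the authors' actual pipeline), each timestamped, geotagged data point becomes a feature vector fed to a logistic model:

```swift
import Foundation
import CoreLocation

// One labeled observation: EEG-derived features plus time and place.
struct PlaceSample {
    let timestamp: Date
    let coordinate: CLLocationCoordinate2D
    let features: [Double]      // e.g. [attention, meditation, ambient noise]
    let isQuietPlace: Bool      // example binary label
}

// Binary logistic regression trained with plain gradient descent.
// Returns the learned weights, with the bias term stored last.
func trainLogistic(samples: [PlaceSample], steps: Int = 1000, rate: Double = 0.1) -> [Double] {
    guard let dim = samples.first?.features.count else { return [] }
    var weights = [Double](repeating: 0, count: dim + 1)
    for _ in 0..<steps {
        var gradient = [Double](repeating: 0, count: dim + 1)
        for s in samples {
            let z = zip(s.features, weights).reduce(weights[dim]) { $0 + $1.0 * $1.1 }
            let prediction = 1.0 / (1.0 + exp(-z))
            let error = prediction - (s.isQuietPlace ? 1.0 : 0.0)
            for j in 0..<dim { gradient[j] += error * s.features[j] }
            gradient[dim] += error
        }
        for j in 0...dim { weights[j] -= rate * gradient[j] / Double(samples.count) }
    }
    return weights
}
```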

My main takeaway from both papers is that phenomena related to the brain, whether cognition or mental states, can be tracked and monitored either directly, through EEG, or indirectly, through secondary channels such as gaze analysis or facial features. Each method comes with its own shortcomings: EEG is prone to environmental noise, whereas gaze analysis is limited in the kind of data it can capture. In addition, both technologies have varying levels of invasiveness, which I believe is the main hurdle for this class of QS devices. It is a multifaceted problem influenced by both technology (how small can the underlying circuitry be made without sacrificing functionality?) and design (how can we best hide it all?).

 

Hello, I am Salem

Hey there.

I am Salem Al-Mansoori, and I am a first-year student. My background is a bit eccentric: my bachelor's degree was in computer engineering; however, prior to joining ITP, I worked in a marketing position at a telecommunications company back home in Dubai for seven years. Since I had to learn what it takes to manage and market products with no prior relevant experience, I pursued an MBA between 2009 and 2011. All the while, I remained an active designer (multidisciplinary, generative, digital fabrication “products”) in my free time.

My portfolio is here: http://www.rbknrbkn.com/ and my blog is at http://www.rbknrbkn.com/workandprogress/.

I have chosen this class because of my interest in the quantified self movement, which I have been following for a while now. Specifically, I am interested in ways we can improve facets of our lives through measuring and monitoring specific parameters.

My goal for this class is to come up with a working prototype to measure aspects of cognition and mood in a way that empowers the user to combat stress and anxiety. For instance, what triggers an episode of anxiety (social, generalized, etc.) or depression? (I go through such episodes occasionally, so I have a personal investment in this topic.)

In terms of my usage of QS devices, I wear (or used to wear, at least) a BodyMedia Fit (http://www.bodymedia.com/?whence=), which measures activity more accurately than pedometer-based devices because it employs an array of biometric sensors, including a pulse monitor and a galvanic skin response sensor. It did help me be more active when I was using it regularly, as it comes with an expansive web-based analytics and goal-setting software suite.