Our basic setup uses the EEG sensor board contained in the MindFlex game device, or alternatively the MindWave headset, which comes with built-in Bluetooth connectivity. We chose the iPhone as the data collection device, and we plan to make use of its onboard sensors, mainly GPS, along with the light and sound sensors.
The reasons for opting to go with the iPhone are:
- We could not get the GPS shield to work reliably with the EEG sensor, but we could read location data from the iPhone without issue.
- The MindWave headset comes with an accessible iOS SDK.
- Our vision for the project is to give the user feedback, both visual and possibly vibration, to help them better understand their attention patterns.
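To illustrate the feedback idea above, here is a minimal sketch of how attention readings could be mapped to feedback tiers. It assumes the headset reports an attention value on a 0–100 scale; the tier cutoffs, type names, and function signature are placeholder choices of ours, not anything from the MindWave SDK.

```swift
// Hypothetical feedback tiers for the prototype. A low attention
// reading triggers the strongest cue; a high one stays silent.
enum Feedback {
    case silent              // high attention: do not interrupt the user
    case visual              // middling attention: on-screen cue only
    case visualAndVibration  // low attention: visual cue plus vibration
}

// Map a 0–100 attention value to a feedback tier.
// The cutoffs (30, 70) are our own placeholders, to be tuned later.
func feedback(forAttention value: Int) -> Feedback {
    switch value {
    case ..<30: return .visualAndVibration
    case ..<70: return .visual
    default:    return .silent
    }
}
```

On the device itself, the vibration tier would be played through the system vibration sound; the decision logic above is kept separate so it can be tested without hardware.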
We are currently building out the required functionality in a number of small demo applications, and will then combine the pieces into the final prototype.
We have put together a basic mockup that we are following, and intend to improve as we go along:
In terms of progress, we were able to both retrieve and map the user's location through two demo applications that we made, shown below.
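As a sketch of what the location work can build on, here is a plain-Swift helper that computes the great-circle distance between two GPS fixes using the haversine formula, e.g. to measure how far the user has moved between readings. The function name and signature are our own, not part of any SDK; on the device the coordinates themselves would come from Core Location.

```swift
import Foundation

// Great-circle distance between two GPS fixes (haversine formula).
// Inputs are latitude/longitude in degrees; the result is in meters.
func distanceMeters(lat1: Double, lon1: Double,
                    lat2: Double, lon2: Double) -> Double {
    let earthRadius = 6_371_000.0        // mean Earth radius, meters
    let toRad = Double.pi / 180.0
    let dLat = (lat2 - lat1) * toRad
    let dLon = (lon2 - lon1) * toRad
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * toRad) * cos(lat2 * toRad)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * earthRadius * asin(sqrt(a))
}
```

Keeping this logic free of UIKit/MapKit dependencies means it can be unit-tested directly, while the demo apps feed it live coordinates.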