I am very keen to build a helmet that combines EEG data with GPS data, allowing me to redesign the urban environment based on my emotive behaviour. Emotive 3D mapping is something I want to do more than just present digitally: I will pass the information through Python into Rhino and 3D print the resulting objects. I will begin small, perhaps redesigning the furniture I am sitting on, before extending the project from objects to architectural structures in the urban environment.
I took a stroll through Central Park to gather GPS data, which I then passed through Python into Rhino to produce a staircase. The design is fairly basic, which is why I am keen to add the EEG data to give it a more flamboyant, crisp edge. Fortunately both data sets carry a timestamp, allowing me to sync the two fluidly.
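As a rough sketch of how that timestamp sync might work in Python (the data shapes and sample rates here are assumptions, not my actual logs): since the GPS shield and the EEG headset run on separate clocks, each GPS fix can simply be paired with the EEG reading whose timestamp is closest.

```python
from bisect import bisect_left

# Hypothetical samples: (timestamp_in_seconds, value) pairs.
# GPS fixes as (lat, lon); EEG values as a single attention figure.
gps = [(0.0, (40.7711, -73.9742)),
       (1.0, (40.7712, -73.9741)),
       (2.1, (40.7713, -73.9740))]
eeg = [(0.1, 55), (1.2, 60), (1.9, 48)]

def nearest_eeg(ts, eeg_samples):
    """Return the EEG sample whose timestamp is closest to ts.

    Assumes eeg_samples is sorted by timestamp.
    """
    times = [t for t, _ in eeg_samples]
    i = bisect_left(times, ts)
    # Only the neighbours on either side of the insertion point can be nearest.
    candidates = eeg_samples[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - ts))

# Merge: each GPS fix gets the EEG value recorded nearest in time.
merged = [(t, latlon, nearest_eeg(t, eeg)[1]) for t, latlon in gps]
for t, (lat, lon), attention in merged:
    print(f"t={t:.1f}s  lat={lat}  lon={lon}  attention={attention}")
```

The merged list of (time, position, EEG value) triples is then ready to hand to Rhino as a point stream with an extra attribute to drive the geometry.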
I have successfully hacked the MindFlex EEG headband and have gathered strong data. I especially enjoy visualising it using the Braingrapher in Processing. Below is a sample of data gathered from the GPS shield and the EEG sensor.