I am very keen to build a helmet that combines EEG data with GPS data, allowing me to redesign the urban environment based on my emotive behaviour. Emotive 3D Mapping is something I do not want to present only digitally: I will pass the information through Python into Rhino and 3D print the resulting objects. I will begin small, perhaps redesigning the furniture I am sitting on, before extending the project into the urban environment, with the hope of scaling my data from objects up to architectural structures.



I took a stroll through Central Park to gather GPS data, which I then passed through Python into Rhino to produce a staircase. The design is pretty basic, which is why I am keen to add the EEG data to give it a more flamboyant, crisp edge. Fortunately, both data sets carry timestamps, so I can sync them fluidly.
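A minimal sketch of that timestamp sync, assuming both streams have been logged as `(unix_timestamp, value)` pairs (the sample coordinates and attention values below are illustrative, not my recorded data): each GPS fix is paired with the EEG sample closest to it in time, since the two devices tick on slightly offset clocks.

```python
import bisect

# Hypothetical logged samples: (unix_timestamp, value) pairs.
gps_fixes = [(100.0, (40.7712, -73.9742)),
             (101.0, (40.7713, -73.9741)),
             (102.0, (40.7714, -73.9740))]
eeg_samples = [(100.2, 55), (101.1, 63), (102.3, 48)]

def sync_by_timestamp(gps, eeg):
    """Pair each GPS fix with the EEG sample nearest in time."""
    eeg_times = [t for t, _ in eeg]
    paired = []
    for t, latlon in gps:
        i = bisect.bisect_left(eeg_times, t)
        # Candidates: the EEG sample just before and just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(eeg)]
        best = min(candidates, key=lambda j: abs(eeg[j][0] - t))
        paired.append((t, latlon, eeg[best][1]))
    return paired

merged = sync_by_timestamp(gps_fixes, eeg_samples)
```

Each merged record now carries a location and a brain-state value, ready to drive a point in Rhino.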



I have successfully hacked the MindFlex EEG headsets and am gathering strong data. I especially enjoy visualising it with the BrainGrapher sketch in Processing. Below is a sample of data gathered from the GPS shield and the EEG sensor.
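For anyone following the same hack: the Arduino Brain Library commonly used with the MindFlex streams one CSV line per second over serial, which (as I understand the format) holds signal quality, attention, meditation, and eight EEG power bands. A small hedged sketch of parsing that stream in Python; the sample line and field names are assumptions for illustration, not real readings.

```python
# Assumed field order for one Brain Library packet: signal quality
# (0 = good contact, 200 = off-head), attention, meditation, then
# eight power bands. Check your own serial output to confirm.
FIELDS = ["signal", "attention", "meditation",
          "delta", "theta", "low_alpha", "high_alpha",
          "low_beta", "high_beta", "low_gamma", "mid_gamma"]

def parse_brain_line(line):
    """Turn one CSV line from the headset into a labelled dict."""
    values = [int(v) for v in line.strip().split(",")]
    if len(values) != len(FIELDS):
        raise ValueError("unexpected packet length: %d" % len(values))
    return dict(zip(FIELDS, values))

# Illustrative packet, not recorded data.
sample = "0,57,61,123456,98765,45678,34567,23456,12345,6789,4567"
packet = parse_brain_line(sample)

# In the live setup this would read from the serial port instead,
# e.g. with pyserial:
#   ser = serial.Serial("/dev/ttyUSB0", 9600)
#   packet = parse_brain_line(ser.readline().decode())
```

Labelled dicts like this make it easy to timestamp each packet on arrival and merge it with the GPS log.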



1 thought on “#3”

  1. Great Zander, an EEG-sculpted staircase (or other 3D object) sounds very cool! For the midterm, it would be good to see your plan for translating EEG+GPS data into the 3D object. I’d also like to know about your choice of the staircase as the form factor of the physical object.

    Looking ahead to your final presentation, I would be curious to see how your subjects respond to having a physical object that represents their brain state and location. I imagine that unexpected results will arise.
