I have mainly focused this week on the design of my wearable. I have successfully gathered EEG data and passed it through Python into Rhino. The patches I have created are reasonably exciting, but I am keen to link them directly to my GPS data. I have added Python code that raises points which sit closer to each other. Both examples vary in the depth of their fluctuations, and it is this that most excites me; I want to test my design skills against the limitations of the data. The only question I have at this point is which section of EEG data to sculpt.
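The point-raising step could be sketched roughly like this: a minimal, plain-Python version that lifts each point in z according to how many neighbours fall within a chosen radius (the function name, radius, and lift factor are all assumptions for illustration; inside Rhino this would typically run through a GHPython or rhinoscriptsyntax component instead).

```python
import math

def raise_points(points, radius=1.0, lift=0.5):
    """Lift each (x, y, z) point in z in proportion to how many
    other points fall within `radius` of it in the xy-plane,
    so that denser clusters of points rise higher."""
    raised = []
    for i, (x, y, z) in enumerate(points):
        neighbours = sum(
            1
            for j, (px, py, _) in enumerate(points)
            if j != i and math.hypot(px - x, py - y) <= radius
        )
        raised.append((x, y, z + lift * neighbours))
    return raised

# Two points near each other get lifted; the isolated one stays flat.
pts = [(0, 0, 0), (0.5, 0, 0), (5, 5, 0)]
print(raise_points(pts))
```

The same idea scales to the GPS track: denser clusters of recorded fixes (places where the walk slowed or paused) would rise out of the surface.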
I have visualised my GPS routes and placed them onto Google Maps. However, I am more fascinated by physical visualisation, which is why I sculpted the walk into these objects. The choice of objects is somewhat arbitrary at this stage, and this could be refined with further filtering.
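Before sculpting, one quantity worth extracting from each route is its overall length, which could then drive a property of the object (its scale, say). A minimal sketch using the standard haversine great-circle formula on (latitude, longitude) fixes — the function names and the idea of mapping length to scale are my assumptions, not the author's pipeline:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes,
    using the haversine formula on a spherical Earth (R = 6371 km)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def route_length_km(route):
    """Total walked distance: sum of leg lengths between consecutive fixes."""
    return sum(haversine_km(*a, *b) for a, b in zip(route, route[1:]))

# A short two-leg walk expressed as (lat, lon) pairs.
walk = [(51.5074, -0.1278), (51.5080, -0.1290), (51.5090, -0.1300)]
print(round(route_length_km(walk), 3), "km")
```

Feeding a number like this back into Rhino would give the filtering the objects currently lack: the form of each piece would follow directly from a measurable property of the walk.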