All posts by WoonYung Choi

User Study : Inpyo and Woonyung

Thanks to Diana, Billy, and Salem for a great user study!

POSSIBLE AUDIENCES

  • Instagram and other social media users; tech-savvy smartphone users
  • Trendsetters
  • People interested in tech and new things; people interested in the Quantified Self

USER STORIES

  • As a 15-to-25-year-old urbanite, I want to be able to use this wherever I go.
  • As an active social media user, I want to be able to connect with my friends and share my emotions with them, especially as we hang out around the city.
  • As a young, male user, I want to use this device to express my emotions without coming across as an overly emotional person.
  • As someone who is interested in quantifying their mood, I want regular feedback and insights on improving my mood.

Based on the user study, we created the UX/UI.

 

Inpyo / Woonyung : Spatial Visualization

 


We sketched a spatial visualization to imagine what the final piece will look like. We are mostly going to map color values based on the data we get from the facial tracker (in this case: webcam –> OSC –> Max/MSP). Each dot will be placed where the user took an individual photo, and each dot will be clickable, opening a small pop-up window that shows the photograph.
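As a minimal sketch of that color mapping, here is one way a single emotion score could be turned into a dot color by interpolating between two endpoint colors. The score range of [-1, 1] and the two endpoint colors are assumptions for illustration, not values from our Max/MSP patch:

```python
def lerp_color(c1, c2, t):
    """Linearly interpolate between two RGB colors; t in [0, 1]."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

def emotion_to_color(score, lo=-1.0, hi=1.0,
                     negative=(70, 70, 200), positive=(250, 200, 60)):
    """Map an emotion score to a dot color.

    The [-1, 1] range and the blue/amber endpoints are placeholder
    choices, not values from the actual facial-tracker pipeline.
    """
    t = (min(max(score, lo), hi) - lo) / (hi - lo)  # clamp, then normalize to [0, 1]
    return lerp_color(negative, positive, t)
```

A score at either end of the range returns the corresponding endpoint color, and anything in between blends the two, so nearby dots with similar moods read as similar hues on the map.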

We are thinking of starting with a self-journaling system and growing it into a public recommendation site.

WEEK3 : WORK PROGRESS

Since we got the GPS data last week, we developed our idea for the final project further. ( getting GPS data : Work Progress : week 2 ) What we want to do for the final project is to integrate emotional data, GPS data, and pulse data, so that we can visualize emotional data depending on location. We have not decided yet whether we are going to record our facial expressions while riding a bicycle or while walking around.
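As a sketch of what "integrating" the three streams could mean in code, here is one way to attach the nearest pulse and emotion reading to each GPS fix by timestamp. The record shapes and field names are our own assumptions for illustration, not a format any of the sensors actually produces:

```python
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t.
    `timestamps` must be sorted ascending."""
    i = bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

def merge_streams(gps, pulse, emotion):
    """Attach the nearest pulse and emotion reading to each GPS fix.

    gps is a list of (t, lat, lon) tuples; pulse and emotion are
    lists of (t, value) pairs, each sorted by time.
    """
    p_ts = [t for t, _ in pulse]
    p_v = [v for _, v in pulse]
    e_ts = [t for t, _ in emotion]
    e_v = [v for _, v in emotion]
    return [{"t": t, "lat": lat, "lon": lon,
             "bpm": nearest_sample(p_ts, p_v, t),
             "emotion": nearest_sample(e_ts, e_v, t)}
            for t, lat, lon in gps]
```

Because the sensors sample at different rates, a nearest-timestamp join like this is a simple way to line them up before visualizing.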

For facial expressions, we used a face tracker in Processing to parse the data (how wide the mouth is open, the width of the mouth, the eyebrows, and the eyes).

We divided the data into three major facial expressions. (The purple dashed line marks what we are going to focus on.)

1. happy face

[Graph: face-tracker data for a happy face]
As you can see in the graphs, the eyebrow values and the mouth width and height all increase.

2. surprise face

[Graph: face-tracker data for a surprised face]

3. exhausted face

[Graph: face-tracker data for an exhausted face]
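A rule-based classifier over the tracked features might look like the sketch below. Only the "happy" pattern (eyebrows, mouth width, and mouth height all above neutral) follows our graphs; the rules for "surprised" and "exhausted", the 10% margin, and the baseline values are all placeholder assumptions:

```python
def classify_expression(eyebrow, mouth_w, mouth_h, baseline):
    """Classify one face-tracker frame against a neutral baseline.

    `baseline` maps the same feature names to their neutral values.
    Only the 'happy' rule is based on our recorded graphs; the
    'surprised' and 'exhausted' rules are guesses to be tuned.
    """
    def up(key, value):
        return value > baseline[key] * 1.1  # 10% margin, arbitrary

    if up("eyebrow", eyebrow) and up("mouth_w", mouth_w) and up("mouth_h", mouth_h):
        return "happy"        # all three features rise together
    if up("eyebrow", eyebrow) and up("mouth_h", mouth_h):
        return "surprised"    # raised brows, open mouth, width unchanged
    if eyebrow < baseline["eyebrow"] * 0.9:
        return "exhausted"    # drooping brows
    return "neutral"
```

In practice the baseline would be captured from a few seconds of a neutral face before classifying, and the margins tuned per user.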

For next week (or over spring break), we are going to integrate the facial data, pulse data, and GPS data, and we will keep narrowing our idea down to a more specific subject.

week2 : Work Progress

Since Inpyo and I are going to work together, we did the sensor tests together as well. To start, we tested the pulse sensor first.

1. Pulse sensor test


We put an LED on the Arduino so that we can see how it responds to the pulse.

It seems to be working fine.
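The LED-blink logic is just edge detection on the sensor reading. Here is a minimal Python version of the same idea; the threshold of 550 on a 0-1023 analogRead scale is an assumption, not a value from our actual Arduino sketch:

```python
def count_beats(samples, threshold=550):
    """Count pulses the way the blinking LED does: register one beat
    on each rising edge through the threshold.

    `samples` are raw sensor readings (assumed 10-bit, 0-1023);
    the threshold is a placeholder to be tuned per sensor placement.
    """
    beats = 0
    above = False
    for v in samples:
        if v > threshold and not above:
            beats += 1          # rising edge: the LED would turn on here
        above = v > threshold   # remember state so one pulse counts once
    return beats
```

On the Arduino the same rising-edge check would drive `digitalWrite` on the LED pin instead of incrementing a counter.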

2. GPS Logger test

We connected the GPS shield to the Arduino, and the sensor seemed to be working okay, yet we were not able to get our actual location data. In order to get a GPS fix, we took the setup out to Washington Square Park, where there are fewer buildings.

When we came back and looked through all the data in the text file, it seemed to be working fine.

[Screenshot: logged GPS data in the text file]

Lines with more characters show where the logger started to get a fix.

[Screenshot: logged lines containing coordinate data]
Because we had walked through Washington Square Park, we wondered whether the GPS got correct data. The latitude and longitude of Washington Square Park are 40.730937 and -73.997319, which seem very close to our data (mostly around 40.4382 and -73.59).

[Screenshots: a logged coordinate plotted on a map]
However, when we typed in one of the coordinates we got from the GPS, it showed up in the middle of the river. We assume the GPS did not have enough information to track the correct location, but we are not quite sure of the exact reason.
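One thing worth double-checking before blaming the fix itself is the coordinate format: NMEA sentences from GPS modules encode latitude as ddmm.mmmm and longitude as dddmm.mmmm (degrees followed by decimal minutes), not as decimal degrees. If a raw value like 4043.82 N, 7359.84 W (both values illustrative, not taken from our log) is pasted into a map as a plain decimal, it lands in the wrong place, while converting it properly gives roughly 40.7303, -73.9973, right by Washington Square Park. A minimal converter:

```python
def nmea_to_decimal(raw, hemisphere):
    """Convert an NMEA ddmm.mmmm (or dddmm.mmmm) value to decimal degrees.

    `hemisphere` is 'N', 'S', 'E', or 'W'; south and west are negative.
    """
    degrees = int(raw // 100)          # everything above the minutes field
    minutes = raw - degrees * 100      # remaining decimal minutes
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal
```

Running the logged values through a conversion like this before plotting them would tell us whether the "middle of the river" point was a bad fix or just a misread format.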

You can check out footage that shows our whole process:

WEEK 1 : POST 2 Prior art, future work

I entered the Interactive Telecommunications Program (ITP) at NYU to study interaction, whether among people or between people and environments, machines, and so on, and I would like to work on visualizing the relationship between geographical data and our responses to it. The city we live in has many kinds of environmental factors, such as sidewalks, buildings, roads, bridges, and parks. I use location data to find where I am, but I have not looked at what is inside that location on the map. This is one of the reasons I chose this class, which looks at biometric data, especially the relationship between location data and the biological reactions that we cannot see on a regular basis. In most of the cases I would like to explore, my future users would be pedestrians responding to the stimulation of the spaces around them. This is my first attempt at working with geographical data, so I am not quite sure yet how far I will get or what I can and cannot do. However, here is a rough list of ideas I would like to explore so far.

  • A heat map (or another visual form) of data in response to heart rate
  • The relationship between heart rate and emotional stimuli, or vice versa
  • A comparison between heart rate and factors in the external world
  • (If I can get data from other people) the relationship between the environment and people's emotions (macro data)
  • (If I can get data from other people) a comparison between an individual and a group of people

The first idea is a heat map (or another form of data visualization) that visualizes data in response to heart rate. First of all, I would like to start with how a human heart rate is related to environmental factors. I am curious about which factors affect heart rate, and how. I guess it depends on the situation and the person, but in my personal experience, I feel very nervous when I am surrounded by too many people in a small space, and very comfortable when I am in a spacious park. I would like to look into how my heart rate differs by location or situation. The second thing I want to look at is the relationship between heart rate and emotional stimuli, and a comparison between heart rate and factors in the external world (such as the concentration of CO2, or the density of people and buildings). I assume that heart rate can be changed by emotional stimulation. In the paper Emotions and heart rate while sitting on a chair, Jenni Anttonen and Veikko Surakka analyze how heart rate is related to positive and negative emotions. The authors report that heart rate decelerates more when people are in a negative mood than in a neutral or positive mood. I think this implies great potential for visualizing what is inside people. I also think heart rate is linked not only to emotional responses but also to other environmental factors, such as the density of the crowd around the user, noise, the concentration of CO2, and so on. Judging from Bio Mapping, what I would like to work on seems to be one form of bio-mapping, a method for analyzing people's reactions to the external world. That project has been carried out in over twenty-five cities across the world with more than two thousand participants.
In Assessing contextual mood in public transport: a pilot study, the author also points out that measuring the quality of the experiences around us may define a state that responds to the environment, and that this information could help enhance people's lives. Right now I am leaning towards visualizing subjective data, but I still cannot be sure what the exact path for my future project will be. I would like to try whatever shows interesting results.

Week 1 : post 1


  • Woonyung Choi
  • First year
  • Graphic design,
    http://woonyungchoi.com
  • I would like to work on geolocation data (or other data) and visualize it. I think working with GPS will give me a clue about what to do with geolocation data.
  • Working with geolocation data and visualizing it (maybe; I am still thinking about it)
  • For a geolocation device, maybe the closest ones would be Google Maps and HopStop.