#2

Hardware Update

I have been working with the GPS and EEG sensors. I have also been experimenting with the solar panels and gauging how many volts the setup needs at any given moment. Putting the Arduino to sleep has been difficult, but I have successfully hacked together the EEG and GPS shield. It does not work indoors, but since my project is outdoors this should not matter!
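Putting an AVR-based Arduino to sleep usually comes down to the power-down sleep mode plus a watchdog-timer wake-up. Here is a minimal sketch of that pattern; it is illustrative only, not the project's actual code:

```cpp
// Minimal sketch (illustrative): power-down sleep with a watchdog wake-up.
#include <avr/interrupt.h>
#include <avr/sleep.h>
#include <avr/wdt.h>

ISR(WDT_vect) {}  // watchdog wake-up interrupt: nothing to do, just wake

void sleepFor8s() {
  cli();
  wdt_reset();
  MCUSR &= ~(1 << WDRF);                             // clear watchdog reset flag
  WDTCSR = (1 << WDCE) | (1 << WDE);                 // unlock watchdog config
  WDTCSR = (1 << WDIE) | (1 << WDP3) | (1 << WDP0);  // interrupt-only mode, ~8 s
  sei();
  set_sleep_mode(SLEEP_MODE_PWR_DOWN);               // deepest AVR sleep mode
  sleep_enable();
  sleep_cpu();                                       // stops here until the WDT fires
  sleep_disable();
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println("awake: read GPS/EEG, log, then sleep");
  delay(50);        // let serial output finish before powering down
  sleepFor8s();
}
```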

[Photos: GPS test with solar panel]


Above are photographs documenting my GPS test with the solar panel. The tests show that the solar panel can provide enough power to run the GPS without a lithium battery. Hacking the EEG has proved more difficult, but I am working on removing the sensors from their casing!

[Screenshots: GPS and EEG data logged to CSV]

Above is proof of GPS data gathering into CSV. I have also successfully gathered EEG data into CSV, and I am working on passing both streams simultaneously through Python into Rhino!


Session 2, Assignment 3

Session 2, February 27, was devoted to hardware tinkering with GPS and sensors. My sample hardware post is corrected and updated from our class discussion, including a link to the SdFat library.

For your next post, please do the following and add a link to your post on the A#3 (Assignment #3) page.

  • Finalize your technical setup as much as you can, and post a photo. You can do this with your collaborator.
  • On that note, let me know your progress on collaboration. You can have one collaborator, a group of collaborators, or just a work/study group if you decide to do your projects on your own.
  • Output some data, and post a text file or photo. Your data should include some values for the Quantified Self (e.g. biosensor levels), and some values for Location (e.g. latitude/longitude).

Here is a sample CSV file of the kind of data you should output. It contains 2 quantified self variables (in this case, EEG) and a few location variables (in this case, satellite fix quality and lat/long). There are other variables you will eventually need to collect (e.g. timestamps), but we can worry about that in another assignment.
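For illustration only, a few rows of such a file might look like this (hypothetical column names and made-up values, not the contents of the linked file):

```
attention,meditation,fixquality,latitude,longitude
52,60,1,40.729517,-73.993475
48,63,1,40.729602,-73.993561
```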

Don’t stress too much if your data is not accurate, for instance, if latitude and longitude aren’t correct. Just output something, and we can talk more about issues in the next class.

Week 2: Experimenting with Temperature Sensors

My Progress:  I played with temperature sensors and different microcontrollers to see what is best for my project. So far, I have tried the Arduino, the Arduino with a WiFi shield, the Teensy, and the LilyPad. I really loved the size of the Teensy. It was also perfect for my simple temperature sensor, but it was difficult to log all the data. One way to do this could be connecting the Teensy to my Android with a micro-USB to USB OTG adapter cable. Luckily, while I was searching for one, one of the residents lent me an RFduino, which is super tiny and has built-in Bluetooth 4.0. This is perfect for making a connection with my Android, and once it's all set up, I will combine my temperature data with Foursquare data (thank you for a great suggestion, Arlene!). I will post any progress I make with the RFduino soon!
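For reference, reading a basic analog temperature sensor takes only a few lines on any of these boards. A minimal sketch, assuming a TMP36 on analog pin 0 (the post doesn't name the exact sensor, so this is purely illustrative):

```cpp
// Minimal sketch: read a TMP36 analog temperature sensor on A0 (assumed part).
const int sensorPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(sensorPin);     // 0-1023 on a 10-bit ADC
  float voltage = reading * 5.0 / 1024.0;  // convert to volts on a 5V board
  float tempC = (voltage - 0.5) * 100.0;   // TMP36: 10 mV/degC with 500 mV offset
  Serial.println(tempC);                   // log one reading per second
  delay(1000);
}
```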


My Questions/Issues:  I am not sure what the best spot is to place my temperature sensor. I have been documenting the way people use their phones.

It seems like the sides are better than the back of the phone, but how many sensors should I place to get a seamless reading? I need to make a phone case soon and start experimenting with the placement of the sensor. Also, making wearables such as a ring or thimble for the hands, or an earplug for the ears, might be interesting possibilities.

Week 2: Work Progress

Since Inpyo and I are going to work together, we did our sensor tests together as well. To start, we tested the pulse sensor first.

1. Pulse sensor test


We put an LED on the Arduino so that we are able to see how it acts in correspondence with the pulse.

It seems to be working fine.
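A rough sketch of how a test like this can be wired up in code, assuming the pulse sensor's analog output on A0 and an LED on pin 13 (the threshold is a hypothetical value that needs tuning against the real signal):

```cpp
// Blink an LED in time with the pulse sensor's analog waveform.
const int pulsePin = A0;     // pulse sensor signal (assumed wiring)
const int ledPin = 13;       // built-in LED
const int threshold = 550;   // hypothetical; tune while watching the Serial Monitor

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int signal = analogRead(pulsePin);         // raw waveform, 0-1023
  digitalWrite(ledPin, signal > threshold);  // LED on during each beat's peak
  Serial.println(signal);                    // plot this to pick a threshold
  delay(20);
}
```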

2. GPS Logger test

We connected the GPS shield to the Arduino, and the sensors seemed to be working okay, yet we were not able to get our actual location data. In order to get GPS information, we took the setup out to Washington Square Park, which has fewer buildings around it.

When we came back and looked through all the data in the text file, it seemed to be working fine.

[Screenshot: logged GPS data]

The lines with more characters show where it started to get the data.

[Screenshot: logged latitude/longitude values]
Because we had walked through Washington Square Park, we wondered whether the GPS got the correct data. The latitude and longitude of Washington Square Park are 40.730937 and -73.997319, which seems close to our data (mostly around 40.4382 and -73.59).

[Screenshots: the logged coordinates plotted on a map]
However, when we typed in one of the coordinates we got from the GPS, it showed up in the middle of the river. We assume the GPS did not have enough information to track the correct location, but we are not quite sure of the exact reason.
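One likely explanation, assuming the values were read straight out of the raw NMEA sentences: GPS modules report latitude and longitude as ddmm.mmmm (degrees and minutes), not decimal degrees. Read as a plain decimal, 4043.82 looks like 40.4382, but it actually means 40° 43.82′ = 40 + 43.82/60 ≈ 40.7303°, which is almost exactly Washington Square Park; the same conversion turns 7359.84 into ≈ 73.9973° W. A minimal conversion sketch:

```cpp
// Convert a raw NMEA ddmm.mmmm (or dddmm.mmmm) value to decimal degrees.
// Illustrative sketch; libraries like TinyGPS do this conversion for you.
float nmeaToDecimalDegrees(float nmeaValue) {
  int degrees = (int)(nmeaValue / 100);       // 4043.82 -> 40
  float minutes = nmeaValue - degrees * 100;  // 4043.82 -> 43.82
  return degrees + minutes / 60.0;            // 40 + 43.82/60 = ~40.7303
}
// Example: nmeaToDecimalDegrees(4043.82) => ~40.7303 (Washington Square Park);
// remember to negate western longitudes and southern latitudes.
```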

You can check out footage that shows our whole process:

“Hardware”


“Hardware” is a strong word for me right now. I’ve focused on getting a webcam to (a) detect faces (easy!) and (b) detect a variety of states about the face: pulse, mood, etc. (not easy!). This was all done using OpenCV and Node.js, which I hope will make my life easier if I need to network a bunch of PCs together to coordinate the data I collect.

  • Heartbeat seems to be possible using an FFT of the forehead region (see the sketch after this list). Right now I’m using Numeric.JS to pull this data; no results yet.
  • Mood is a bit harder. Training sets for image-based recognition are hard to come by, and I don’t feel like I can put one together myself. Right now I’m looking into relative positions of features, e.g. positions of the corners of the mouth and eyes.
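The post's actual pipeline is Node.js with Numeric.JS, but the idea is easier to show with OpenCV's C++ API. A minimal sketch, under the assumptions that the forehead is the upper band of the detected face rectangle and that capture runs at a steady ~30 FPS:

```cpp
// Illustrative sketch (not the post's Node.js code): estimate pulse by tracking
// the mean green-channel brightness of the forehead over time, then taking an
// FFT and picking the dominant frequency between 0.75 and 4 Hz (45-240 BPM).
// Assumes OpenCV, a Haar cascade file on disk, and a steady ~30 FPS camera.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap(0);
    cv::CascadeClassifier detector;
    if (!cap.isOpened() || !detector.load("haarcascade_frontalface_default.xml"))
        return 1;

    const double fps = 30.0;   // assumption; real code should measure this
    const int N = 256;         // ~8.5 seconds of samples at 30 FPS
    std::vector<float> samples;
    cv::Mat frame, gray;

    while ((int)samples.size() < N && cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        detector.detectMultiScale(gray, faces, 1.2, 4);
        if (faces.empty()) continue;
        // Crude forehead ROI: the upper-middle band of the face rectangle.
        cv::Rect f = faces[0];
        cv::Rect forehead(f.x + f.width / 4, f.y, f.width / 2, f.height / 5);
        samples.push_back((float)cv::mean(frame(forehead))[1]);  // green channel
    }
    if ((int)samples.size() < N) return 1;

    // FFT of the mean-subtracted signal; pick the strongest in-band frequency.
    cv::Mat sig(samples), spectrum;
    sig -= (float)cv::mean(sig)[0];
    cv::dft(sig, spectrum, cv::DFT_COMPLEX_OUTPUT);
    double bestMag = 0.0, bestHz = 0.0;
    for (int k = 1; k < N / 2; ++k) {
        double hz = k * fps / N;
        if (hz < 0.75 || hz > 4.0) continue;
        cv::Vec2f c = spectrum.at<cv::Vec2f>(k);
        double mag = (double)c[0] * c[0] + (double)c[1] * c[1];
        if (mag > bestMag) { bestMag = mag; bestHz = hz; }
    }
    std::printf("Estimated pulse: %.0f BPM\n", bestHz * 60.0);
    return 0;
}
```

Real code would measure the actual frame rate and smooth the ROI position between frames; both strongly affect the frequency estimate.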

Questions/Issues

The key issue I’m facing is that all of these problems are easily solved when given a baseline, e.g. 10 seconds of a face to detect pulse, neutral face state for mood, etc.

How can I make something that pulls just enough interesting data from faces, without taking forever to do it? I think once I figure this piece out, the possibilities just explode.


Sample Hardware Post: Pulse Sensor + GPS Shield + Arduino

Since we didn’t have class last week, I don’t want to give you a big assignment, but I do want to know what kind of hardware you are starting to tinker with, and more importantly, I’d like to know your questions. To get things started, I’ll make a sample post myself. For your post, please include:

  • A PHOTO of your hardware setup
  • a few sentences on YOUR PROGRESS
  • a few sentences on YOUR QUESTIONS.

——

This week I tinkered with setting up a pulse sensor with the Adafruit GPS logger shield on the Arduino Uno.

[Photo: pulse sensor + GPS shield setup]

My Progress: The pulse sensor and GPS shield are all working together and outputting Quantified Self (sensor) and Geolocation (lat/long) data to both the Serial Monitor and the microSD card. Even though I’m using the “Soft Serial” switch on the GPS shield, I had to set up a jumper cable (see the yellow jumper cable in my photo above) from TX to the 0 pin, as in this Adafruit image.  My (not yet thoroughly commented) sketches:
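In outline, such a sketch looks something like the following (an illustrative sample, not the linked sketches, assuming a recent Adafruit_GPS library, GPS on soft-serial pins 8/7, and the pulse sensor on A0):

```cpp
// Illustrative sample (not the sketches above): log pulse + GPS as CSV lines.
#include <SoftwareSerial.h>
#include <Adafruit_GPS.h>

SoftwareSerial gpsSerial(8, 7);   // RX, TX for the shield's soft-serial mode
Adafruit_GPS GPS(&gpsSerial);

const int pulsePin = A0;          // pulse sensor signal (assumed wiring)

void setup() {
  Serial.begin(115200);
  GPS.begin(9600);
  GPS.sendCommand(PMTK_SET_NMEA_OUTPUT_RMCGGA);  // request RMC + GGA sentences
  GPS.sendCommand(PMTK_SET_NMEA_UPDATE_1HZ);     // one fix per second
}

void loop() {
  GPS.read();                                    // feed the NMEA parser
  if (GPS.newNMEAreceived() && GPS.parse(GPS.lastNMEA())) {
    // CSV: pulse level, fix quality, decimal-degree latitude and longitude
    Serial.print(analogRead(pulsePin));   Serial.print(",");
    Serial.print((int)GPS.fixquality);    Serial.print(",");
    Serial.print(GPS.latitudeDegrees, 6); Serial.print(",");
    Serial.println(GPS.longitudeDegrees, 6);
  }
}
```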

My Questions/Issues: The pulse sensor seems to take a few seconds to stabilize and indicate a reasonable BPM of 60-65. I’m not sure if this is normal. Also, I don’t seem to get a decent pulse reading from my fingers or ears; my lip seems to be the only place where I get a decent reading! This must be resolved. Also, the GPS shield, which I set up inside, doesn’t seem to maintain a consistent fix to satellites, so the lat/long values are not consistent. But this tends to resolve with outdoor or antenna use.

Supplemental: Here is a quick video demonstrating this setup. [Note to class: a video is optional.]

And here is a follow-up video with GPS output.


Quantifying Cognition

In their article Activity Recognition for the Mind: Toward a Cognitive “Quantified Self,” researchers Kunze, Iwamura, and Kise argue that it is possible to track cognitive activities (as one would physical activity) through means other than EEG, which uses the differences in electrical potential that result from brain activity, as manifested in brain waves. They present an alternative to EEG because of what they see as its disadvantages: its invasiveness, its noisiness compared to muscle-movement trackers, and the effort and time required for the accompanying signal processing.

The alternative the authors present is the tracking of eye movement, or EOG (electrooculography), which is arguably meaningfully correlated with cognition. For instance, a reader’s comprehension of a word is proportionally related to the amount of time the eye fixates on it. Other information that can be collected through EOG includes the reader’s degree of fatigue and level of attentiveness.

In addition, measuring the distance traversed by the eye’s saccade can be used to determine parameters related to the consumed media itself, including its type (a page in a magazine has a different saccade pattern than, say, a published article that follows the general scientific paper format), and the number of words read.

EOG, though cheaper than EEG methods, can also be intrusive. For instance, measuring the difference in electrical potential between the retina and the cornea requires some kind of physical contact between the device and the skin around the user’s eye at all times. An alternative would be to utilize infrared rays to measure the shape of the iris, which can even be combined with the measurement and tracking of facial features to get more accurate data.

EEG, while invasive, is still a good way to measure a person’s response to phenomena. According to NeuroPlace: Making Sense of a Place, a paper authored by Lulwa Al-Barrak and Eiman Kanjo, EEG has relevant applications such as quantifying cognition and state while learning, determining whether a driver is fatigued, and even tagging multimedia according to emotion.

To that effect, the authors combined mobile technology (for GPS location tracking and session marking) with the EEG parameters readily accessible through the MindWave Mobile, an EEG headset made by NeuroSky, to measure a person’s mental “response” to a place. The headset readily measures the intensities of various brainwave bands (delta, theta, and multiple bands of alpha, beta, and gamma) and accordingly derives the user’s state of “attention” or “meditation.”

An interesting piece of methodology in the authors’ work is the use of correlation with environmental noise to remove noise from the signal. While no further details were provided on how they achieved this, the possibility offers a remedy to the long-standing noise problem in EEG-based apparatus. (NeuroSky does reportedly provide information on the level of captured noise too.) Furthermore, the authors were able to categorize places using statistical methods, including logistic regression. Each data point was also coupled with a timestamp as well as GPS coordinates.

My main takeaway from both papers is that phenomena related to the brain, whether cognition or mental states, can be tracked and monitored either directly, through EEG, or indirectly, through secondary channels such as gaze analysis or facial features. Each method comes with its own shortcomings. For instance, EEG is prone to environmental noise, whereas gaze analysis is limited in the kind of data it can capture. In addition, both technologies have varying levels of invasiveness, which I believe is the main hurdle for this class of QS devices. It is a multifaceted problem influenced by both technology (how small can the underlying circuitry be made without sacrificing functionality?) and design (how can we best hide it all?).


Hello, I am Salem

Hey there.

I am Salem Al-Mansoori, and I am a first-year student. My background is a bit eccentric: my bachelor’s was in computer engineering, but prior to joining ITP I worked in a marketing position at a telecommunications company back home in Dubai for 7 years. As I had to learn what it takes to manage and market products with no prior relevant experience, I pursued an MBA between 2009 and 2011. All the while, I was an active designer (multidisciplinary, generative, digital fabrication “products”) in my free time.

My portfolio is here: http://www.rbknrbkn.com/ and my blog is on http://www.rbknrbkn.com/workandprogress/.

I have chosen this class because of my interest in the quantified self movement, which I have been following for a while now. Specifically, I am interested in ways we can improve facets of our lives through measuring and monitoring specific parameters.

My goal for this class is to come up with a working prototype to measure aspects of cognition and mood in a way that empowers the user to combat stress and anxiety. For instance, what triggers an episode of anxiety (social, generalized, etc.) or depression? (I do go through such episodes occasionally, so I have a personal investment in the topic.)

In terms of my usage of QS devices, I wear (or used to wear, at least) a BodyMedia Fit (http://www.bodymedia.com/?whence=), which is a more accurate activity-measuring device than pedometer-based trackers, as it employs an array of biometric sensors, including a pulse monitor and a galvanic response sensor. It did help me be more active when I was using it regularly, as it comes with an expansive web-based software suite for analytics monitoring and goal setting.

Session 2: Prototyping, and Teaming Up

This week’s class was cancelled due to snow, but since we were prepared to do some prototyping, let’s spend the next two weeks sharing some tinkering.

Here is the slideshow #2 that I planned to present this week. I will present it briefly in our next class. It opens with

  • a diagram on the scope of our class, in response to some student questions
  • a diagram on the iterative prototyping cycle, and preparing “Alpha V.1”
  • and a checklist for preparing the midterm.

The rest of the slideshow focuses on some of the hardware, particularly biosensors, that could be appropriate for this class. Most of the slides have a link to a relevant URL. If you’re still looking for a sensor to use in your project, the rest of the slideshow could be useful to you.

Before the next class, put together a quick blog post with a photo of your hardware setup, your progress, and your questions/issues. You can use the recommended hardware (Arduino, GPS shield, bio-sensor), or anything else of your choosing. Ideally, you will show how you are both QUANTIFYING the self (or someone else) and LOCATING them. I will make a sample post for you in the coming days. If you’re having technical trouble, don’t worry– you can demonstrate that in your post, and talk about what you’d like your prototype to do if it was working correctly!

Also, please take a look at the responses to Assignment #1 and start thinking about how you’d like to team up.  Please comment on the posts that resonate with your own work, and perhaps contact students offline to form teams. I will also provide comments and team suggestions in the coming days. Again, you are not required to work in a group, but I encourage you to find at least one person to work with, especially if that person’s skill-set can complement your own.

Blog Post #2 : Prior Art, Future Work

Nowadays, the “Internet of Things” is an emerging topic.

Big companies are now jumping into the Internet of Things market. Samsung launched a smartwatch, the Galaxy Gear. Google bought Nest, a company that makes an intelligent thermostat and smoke alarm system. Also, there is a rumor that Apple will launch an iWatch this year.

Why are big companies watching the Internet of Things with such great interest? How will these things affect us?

The answer is closely related to quantified self devices.

Quantified self devices like the Fitbit Flex, Jawbone UP, and Nike FuelBand have gained in popularity over recent years. These devices make it easy to track activity, sync stats, see trends, and reach goals. Each company has its own mobile app and web service so that people can check their stats and results anywhere, anytime, and a feature like competing with friends is a good way to motivate people to exercise more. I am using the Fitbit Flex and Aria (a smart scale that measures weight and BMI and uploads them to a server wirelessly), and I think these devices are very helpful for maintaining a healthy lifestyle.

Among these kinds of quantified self devices, the most brilliant use is during aerobic exercise like cycling and running.

Since I am a big fan of cycling, I use a heart rate sensor and a speed-cadence meter. I connect these sensors to a smartphone app like Cyclemeter or Wahoo Fitness, and thanks to the GPS sensor in my smartphone, I can track and see my location data, heart rate, speed, and pedaling cadence on a Google map, so I can check my condition and cycling record on the map.

Even though I am not a competitive cyclist, I need the right exercise method, and these sensors are greatly helpful for this.

According to the article ‘Heart Rate Monitoring’ by Juul Achten and Asker E. Jeukendrup, one use of monitoring exercise with an HRM is detecting and preventing overtraining. Another article, ‘Heart rate monitoring during training and competition in cyclists,’ says that heart rate can be used to evaluate a cyclist after training or competition, or to determine the exercise intensity during training.

After reading these articles, I understand that heart rate monitoring is crucial when it comes to cycling and other cardio workouts. I also found another relationship between HRM and cardio workouts.

There is a chart about heart rate zone training.

As shown in the chart, there are five heart rate zones, closely related to fat or glycogen burning. It seems that the harder you work out, the more glycogen you burn relative to fat. This is where heart rate plays a crucial role. According to this chart and report, the most effective heart rate zone for fat burning is between 65% and 75% of your maximum heart rate (somewhere between zone 2 and zone 3).
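The arithmetic is simple. A minimal sketch, assuming the common 220-minus-age rule of thumb for maximum heart rate (the chart's exact formula may differ):

```cpp
// Compute the fat-burning heart rate zone (65-75% of estimated max HR).
#include <cstdio>

int main() {
    int age = 30;                        // hypothetical rider
    double maxHR = 220.0 - age;          // rough rule-of-thumb estimate
    double fatBurnLow  = 0.65 * maxHR;   // bottom of the fat-burning zone
    double fatBurnHigh = 0.75 * maxHR;   // top of the fat-burning zone
    std::printf("Fat-burning zone: %.0f-%.0f BPM (max ~%.0f BPM)\n",
                fatBurnLow, fatBurnHigh, maxHR);  // roughly 123-143 BPM at age 30
    return 0;
}
```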

Also, there is a relationship between bike speed and pedaling cadence. The makeup of your leg muscles (the ratio of fast-twitch to slow-twitch fibers), combined with your fitness, will determine your ideal cadence. A low cadence requires pushing harder on the pedals, and to generate this force your leg muscles must recruit more fast-twitch fibers than slow-twitch ones. I have heard that elite cyclists typically pedal at a cadence between 80 and 120 revolutions per minute, while recreational cyclists pedal at about 60-80.

So far, I have explored the relationship between exercise and quantified self devices with biosensors. When I first encountered them, I was amazed by their potential, and now I think there can be something more. What if quantified self devices combined bio-data with emotional data? Could that improve the workout experience? Could it make workouts more efficient? I want to explore the correlation between emotional data and cardio workouts in this class.