After discussing last week’s user persona exercises and your responses to them, we’ll spend a bit of time thinking about indoor location. Our guest speaker is Gershon Dublon, a PhD candidate at MIT who was a teaching assistant for a class on Location-Based Technologies that I took a few years ago. Gershon does a lot with environmental and somatic sensor networks, location-based tracking, and the arts, as you can see from his site.
We will also talk about final presentations in two weeks, and then we’ll end class early so you have extra work time. I’d like to see evidence that you’ve been through the whole prototyping cycle:
Specifically, please present:
– A DEMO (don’t stress if it doesn’t work perfectly), including QS data collection and spatial visualization
– ANALYSIS of a small user study (preferably 2 or more people outside of class)
– REFLECTION on the prototyping cycle and on how you’d prepare for the next iteration, if you were to continue your project.
Though it’s not required, I’m really interested in how (or if!) considering location has changed your perspective on quantified self. This will help me in planning the class for next year.
After discussing our spatial visualizations in the first part of today’s class, we’ll then spend the rest of class talking about users, co-creators, and everyone else outside our projects and departments. This will culminate in a user persona exercise.
Our guest critic from last week, Becky Hurwitz, will give a short talk about the history and theory of co-design, and how her class at MIT Center for Civic Media has contributed to the field. You can follow along through the not-yet-live Civic Design Toolkit site. After Q&A with Becky, we’ll then try a modified version of this user persona exercise. I’ve set up a shared Google Doc for any links and images that we may find during the exercise.
Many of us have experience working with users through methods like clinical trials or product testing, but co-design is, arguably, rarely applied outside the fields of social justice, social change, and humanitarian work. With that in mind, I think it will be an interesting exercise to apply concepts of co-design to our quantified self location projects. I myself have not applied co-design to any of my quantified self projects, so I’m excited to give it a try based on our findings from this class.
Other Resources: I’ve also found the IDEO Human Centered Design Toolkit to be quite handy. Since my experience with co-design was centered on Indonesia, I used the Solo Kota Kita (Firm Foundation) Social Design Field Guide quite a bit, and I think it can be translated to many other projects and regions. In translating user studies to the academic domain, Creswell’s Research Design is a great, short overview of how to develop academic, user-centered studies.
So far, we’ve been thinking a lot about quantified self. In today’s class, we’ll turn our focus to location, and how adding a location element to our QS projects could give them new kinds of potential. I’ll talk about this emerging overlap and do a tech demo with some sample visualization files. We’ll then spend time in our groups thinking about the locational element in preparation for next week’s midterm presentations. Finally, we have a guest speaker in the last section of class: Jennifer Sta. Ines of NYC’s Department of Transportation.
Arlene’s Slideshow: Location in QS
Jennifer’s Slideshow: Spatial Visualization
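As a taste of the kind of spatial visualization we’ll demo, here is a minimal, hypothetical Python sketch that converts logged QS + location rows into GeoJSON, a format most web mapping tools can load directly. The field name `eeg` and the sample coordinates are my own stand-ins, not from the class files:

```python
import json

def to_geojson(rows):
    """Convert (lat, lon, value) tuples into a GeoJSON FeatureCollection."""
    features = []
    for lat, lon, value in rows:
        features.append({
            "type": "Feature",
            # Note: GeoJSON coordinates are ordered [longitude, latitude]
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"eeg": value},  # hypothetical QS variable
        })
    return {"type": "FeatureCollection", "features": features}

# Made-up sample rows: latitude, longitude, EEG reading
rows = [(42.3601, -71.0942, 512), (42.3605, -71.0950, 498)]
print(json.dumps(to_geojson(rows), indent=2))
```

Paste the printed output into a viewer like geojson.io to see your readings on a map.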
Session 2, February 27, was devoted to hardware tinkering with GPS and sensors. My sample hardware post is corrected and updated from our class discussion, including a link to the SDFat library.
For your next post, please do the following and add a link to your post on the A#3 (Assignment #3) page.
- Finalize your technical setup as much as you can, and post a photo. You can do this with your collaborator.
- On that note, let me know your progress on collaboration. You can have one collaborator, a group of collaborators, or just a work/study group if you decide to do your projects on your own.
- Output some data, and post a text file or photo. Your data should include some values for the Quantified Self (e.g., biosensor levels) and some values for Location (e.g., latitude/longitude).
Here is a sample CSV file of the kind of data you should output. It contains two quantified self variables (in this case, EEG) and a few location variables (in this case, satellite fix quality and lat/long). There are other variables you will eventually need to collect (e.g., timestamps), but we can worry about those in another assignment.
Don’t stress too much if your data is not accurate, for instance, if latitude and longitude aren’t correct. Just output something, and we can talk more about issues in the next class.
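If you want to sanity-check your output, here’s a small Python sketch that reads a CSV like the sample and drops rows without a GPS fix. The column names (`eeg_1`, `eeg_2`, `fix`, `lat`, `lon`) are hypothetical stand-ins for whatever your logger writes, and the fix-quality convention (0 = no fix, as in NMEA GGA sentences) is an assumption about your GPS module:

```python
import csv
import io

def valid_rows(csv_text):
    """Keep only rows where the GPS reported a fix (fix quality > 0)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if int(row["fix"]) > 0]

# Made-up sample in the spirit of the class CSV: two EEG values,
# NMEA-style fix quality, and lat/long in decimal degrees.
sample = """eeg_1,eeg_2,fix,lat,lon
512,430,1,42.3601,-71.0942
0,0,0,0.0,0.0
498,441,1,42.3605,-71.0950
"""

for row in valid_rows(sample):
    print(row["lat"], row["lon"], row["eeg_1"])
```

This is also a gentle way to see the indoor-fix problem in your own data: if most rows disappear, your shield probably never locked onto satellites.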
Since we didn’t have class last week, I don’t want to give you a big assignment, but I do want to know what kind of hardware you are starting to tinker with, and, more importantly, I’d like to know your questions. I’ll make a sample post to get things started. For your post, please include:
- A PHOTO of your hardware setup
- a few sentences on YOUR PROGRESS
- a few sentences on YOUR QUESTIONS.
This week I tinkered with setting up a pulse sensor with the Adafruit GPS logger shield on the Arduino Uno.
My Progress: The pulse sensor and GPS shield are working together, outputting Quantified Self (sensor) and Geolocation (lat/long) data to both the Serial Monitor and the microSD card. Even though I’m using the “Soft Serial” switch on the GPS shield, I had to set up a jumper cable (see the yellow jumper cable in my photo above) from TX to the 0 pin, as in this Adafruit image. My (not yet thoroughly commented) sketches:
My Questions/Issues: The pulse sensor seems to take a few seconds to stabilize and indicate a reasonable BPM of 60-65. I’m not sure if this is normal. Also, I can’t seem to get a decent pulse reading from my fingers or ears; my lip seems to be the only place where I get a reliable signal! This must be resolved. Finally, the GPS shield, which I set up indoors, doesn’t seem to maintain a consistent fix on satellites, so the lat/long values are inconsistent, but this tends to resolve outdoors or with an antenna.
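On the stabilization question: many pulse-sensor sketches smooth the raw signal before estimating BPM, and a smoothing buffer naturally needs a few seconds of samples before it converges, so some settling time is expected. Here is a rough Python illustration of a simple moving average (the window size and the raw readings are made up), just to show why the first few readings look noisy:

```python
from collections import deque

def moving_average(samples, window=10):
    """Smooth a raw sensor stream with a fixed-size moving average."""
    buf = deque(maxlen=window)
    smoothed = []
    for s in samples:
        buf.append(s)
        # Early on, buf holds fewer than `window` samples, so the
        # average swings with every new reading until the buffer fills.
        smoothed.append(sum(buf) / len(buf))
    return smoothed

# Made-up raw BPM estimates: noisy at first, settling around the low 60s.
raw = [90, 40, 75, 58, 66, 61, 63, 62, 62, 63]
print(moving_average(raw, window=5))
```

The same idea applies on the Arduino side: until the averaging buffer fills, the reported BPM will wander, which may be all that the few-second delay is.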
Supplemental: Here is a quick video demonstrating this setup. [Note to class: a video is optional.]
And here is a follow-up video with GPS output: Follow-up video with GPS output
This week’s class was cancelled due to snow, but since we were prepared to do some prototyping, let’s spend the next two weeks sharing some tinkering.
Here is slideshow #2, which I planned to present this week. I will present it briefly in our next class. It opens with:
- a diagram on the scope of our class, in response to some student questions
- a diagram on the iterative prototyping cycle, and preparing “Alpha V.1”
- and a checklist for preparing the midterm.
The rest of the slideshow focuses on some of the hardware, particularly biosensors, that could be appropriate for this class. Most of the slides have a link to a relevant URL. If you’re still looking for a sensor to use in your project, the rest of the slideshow could be useful to you.
Before the next class, put together a quick blog post with a photo of your hardware setup, your progress, and your questions/issues. You can use the recommended hardware (Arduino, GPS shield, bio-sensor), or anything else of your choosing. Ideally, you will show how you are both QUANTIFYING the self (or someone else) and LOCATING them. I will make a sample post for you in the coming days. If you’re having technical trouble, don’t worry; you can demonstrate that in your post and talk about what you’d like your prototype to do if it were working correctly!
Also, please take a look at the responses to Assignment #1 and start thinking about how you’d like to team up. Please comment on the posts that resonate with your own work, and perhaps contact students offline to form teams. I will also provide comments and team suggestions in the coming days. Again, you are not required to work in a group, but I encourage you to find at least one person to work with, especially if that person’s skill-set can complement your own.
Here are the links that we discussed at the end of class today. Before the next class, please solder your headers and make sure your shield works. If you have a chance to set up a sensor with your shield, all the better. I will put some sample code together and we can try it together in two weeks.
Here are the libraries you will need for next class.
I also set up a class repo, but I don’t think we need it yet. https://github.com/arlduc/QuantifiedSelfOnLocation