Ever been asked which subway stop you live by? This map redraws New York City neighborhoods according to the closest station. Let the naming begin.
For our Physical Computing midterm, Max Ma, Harry How, and I set out to create a remote-controlled car that follows a laser beam. Why? Because Max needed a companion to replace his cat, who's living in China.
We’ve got a webcam hanging above the room, wired to a laptop running a Cinder app. The app uses OpenCV to track the position of the car relative to the laser beam and sends steering instructions to the car wirelessly via XBee modules.
We originally tried using IR to transmit the instructions to the car, but we found it much slower and less reliable. Switching to the XBees introduced a couple of additional challenges (e.g. debugging without serial output), but ultimately they were the right tool for the job.
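To give a flavor of what goes over that serial link, here is a hypothetical sketch of packing a steering command into a single byte. The command names and nibble layout are assumptions for illustration; the actual protocol lives in the Laser Cat repo and may differ.

```cpp
#include <cstdint>

// Hypothetical single-byte command encoding for the XBee serial link.
// High nibble carries the command type, low nibble a 0-15 magnitude.
enum Command : uint8_t {
    CMD_STOP    = 0x0,
    CMD_FORWARD = 0x1,
    CMD_LEFT    = 0x2,
    CMD_RIGHT   = 0x3
};

uint8_t encodeCommand(Command cmd, uint8_t magnitude) {
    return static_cast<uint8_t>((cmd << 4) | (magnitude & 0x0F));
}

Command decodeType(uint8_t b)      { return static_cast<Command>(b >> 4); }
uint8_t decodeMagnitude(uint8_t b) { return static_cast<uint8_t>(b & 0x0F); }
```

Keeping each command to one byte is handy when debugging a lossy radio link: there is no multi-byte framing to fall out of sync.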
Here are a few videos of the steps we took to construct the end-product:
This video shows the initial state of the app that’s just doing color tracking. The sliders on the left let us configure the exact color ranges that we’re interested in. We also ended up using an excellent little utility called uvc-ctrl that allows us to control the exposure, focus and white balance of our video feed.
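The core of slider-driven color tracking is a per-pixel range test. The real app does this with OpenCV on the Cinder video feed; the struct and function names below, and the use of RGB rather than HSV channels, are assumptions for illustration.

```cpp
#include <cstdint>

// Assumed representation of the slider-configured color ranges.
struct ColorRange {
    uint8_t minR, maxR;
    uint8_t minG, maxG;
    uint8_t minB, maxB;
};

// A pixel belongs to the tracked blob when every channel falls inside
// the range set by the corresponding pair of sliders.
bool pixelInRange(uint8_t r, uint8_t g, uint8_t b, const ColorRange& c) {
    return r >= c.minR && r <= c.maxR &&
           g >= c.minG && g <= c.maxG &&
           b >= c.minB && b <= c.maxB;
}
```

The tracked position is then typically taken as the centroid of all in-range pixels, which is why locking down exposure and white balance matters so much: any auto-adjustment shifts pixels in and out of the configured ranges.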
We then added a simulated car to the app for play testing and to accommodate the car model in the software design.
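A simulated differential-drive car can be surprisingly small. This is a minimal sketch, not the actual simulation from the Cinder app: equal wheel speeds drive the car straight, opposite speeds pivot it in place.

```cpp
#include <cmath>

// Minimal differential-drive car model for play testing (a sketch;
// names and the unit wheelbase are assumptions).
struct SimCar {
    float x = 0.0f, y = 0.0f, heading = 0.0f; // heading in radians

    // Advance the simulation by dt seconds given left/right wheel speeds.
    void update(float leftSpeed, float rightSpeed, float dt,
                float wheelBase = 1.0f) {
        float forward  = 0.5f * (leftSpeed + rightSpeed);
        float turnRate = (rightSpeed - leftSpeed) / wheelBase;
        heading += turnRate * dt;
        x += std::cos(heading) * forward * dt;
        y += std::sin(heading) * forward * dt;
    }
};
```

Driving the simulated car with the same steering commands the real car receives lets you exercise the tracking-and-steering loop without draining any batteries.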
We made a quick-and-dirty app to test sending steering directions to the car. Steering is accomplished not by pivoting the wheels, but by rotating them in opposition to each other. This gives us a very tight turning radius, which is easier to model in the software.
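The opposite-rotation scheme can be sketched as a simple mapping from steering commands to motor outputs. The command characters and PWM-style speed values below are assumptions for illustration, not the repo's actual constants.

```cpp
// Hypothetical mapping from single-character steering commands to
// motor outputs (PWM-style, -255..255). Pivoting comes from driving
// the wheels in opposition rather than angling them.
struct WheelSpeeds {
    int left;
    int right;
};

WheelSpeeds steer(char cmd) {
    switch (cmd) {
        case 'F': return { 255,  255 };  // straight ahead
        case 'B': return { -255, -255 }; // reverse
        case 'L': return { -255, 255 };  // pivot left: wheels oppose
        case 'R': return { 255, -255 };  // pivot right
        default:  return { 0, 0 };       // stop
    }
}
```

Because a pivot turn rotates the car about its own center, the software can treat position and heading as independent, which keeps the steering model simple.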
Finally we tied it all together and Laser Cat was born. The vehicle is moving a little slowly in this video because the batteries are getting low. We were all a little tired at this point.
All of the code required to run Laser Cat is available on GitHub: https://github.com/wdlindmeier/lasercat
Sandy was one of the most impactful physical events that the US has ever seen, yet my experience of it in NYC was largely a digital one. We weathered the storm in Williamsburg, Brooklyn and tracked its progress with our friends through a digital network of Facebook, Twitter, SMS, YouTube and other news outlets.
The relevance of these networks blossomed into something I’d never experienced before, and yet as the storm passed, the content quickly shifted back to Halloween costumes and the Presidential election as Sandy scrolled off the screen. To preserve a slice of that experience, I created a timeline of Sandy-related activity from my personal network.
Full-screen that browser to replay #Sandy.