For our Physical Computing midterm, Max Ma, Harry How and I set out to create a remote-controlled car that follows a laser beam. Why? Because Max needed a companion to replace his cat that’s living in China.
We’ve got a webcam hanging above the room that’s wired to a laptop running a Cinder app. The app uses OpenCV to track the position of the car in relation to the laser beam and sends steering instructions to the car wirelessly via XBee chips.
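To give a feel for the wireless link, here's a minimal sketch of framing a one-byte steering command with a checksum so the receiver can drop corrupted packets. This is illustrative Python, not Laser Cat's actual wire format (the real app is a C++ Cinder program talking to the XBee over a serial port):

```python
def encode_packet(command):
    """Frame a single-character command: start byte, command byte,
    and a simple additive checksum. Format is illustrative only."""
    start = 0x7E
    cmd = ord(command)
    checksum = (start + cmd) & 0xFF
    return bytes([start, cmd, checksum])

print(encode_packet("F"))  # → b'~F\xc4'
```

On the receiving end, the microcontroller can wait for the start byte, read the next two bytes, and ignore the packet if the checksum doesn't match.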
We originally tried using IR to transmit the instructions to the car but we found it much slower and less reliable. Switching to the XBees introduced a couple of additional challenges (e.g. debugging without the serial output) but ultimately it was the right tool for the job.
Here are a few videos of the steps we took to construct the end-product:
This video shows the initial state of the app, which is just doing color tracking. The sliders on the left let us configure the exact color ranges that we’re interested in. We also ended up using an excellent little utility called uvc-ctrl that lets us control the exposure, focus and white balance of our video feed.
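At its core, color tracking of this sort checks each pixel against a configurable per-channel range and takes the centroid of the matches. Here's a minimal sketch in pure Python (the actual app uses OpenCV in C++, and these function names are illustrative):

```python
def in_color_range(pixel, lo, hi):
    """Return True if every channel of pixel falls within [lo, hi]."""
    return all(l <= p <= h for p, l, h in zip(pixel, lo, hi))

def track_color(frame, lo, hi):
    """Return the centroid (x, y) of all in-range pixels, or None."""
    hits = [(x, y) for y, row in enumerate(frame)
                   for x, px in enumerate(row)
                   if in_color_range(px, lo, hi)]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)

# A tiny 3x3 "frame" with one bright-red pixel at (1, 1):
frame = [
    [(0, 0, 0), (0, 0, 0),     (0, 0, 0)],
    [(0, 0, 0), (255, 10, 10), (0, 0, 0)],
    [(0, 0, 0), (0, 0, 0),     (0, 0, 0)],
]
print(track_color(frame, (200, 0, 0), (255, 50, 50)))  # → (1.0, 1.0)
```

The sliders in the video are essentially adjusting the `lo` and `hi` bounds, which is also why locking down exposure and white balance matters: if the camera keeps auto-correcting, a carefully tuned range stops matching.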
We then added a simulated car to the app, both for play testing and so the software design would accommodate a car model from the start.
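A simulated car like this only needs a simple kinematic update per frame. Here's a sketch of a standard differential-drive step in Python, assuming the two-wheel-speed model the real car uses; the parameter names are illustrative, not from our code:

```python
import math

def step(x, y, heading, v_left, v_right, wheel_base, dt):
    """Advance a differential-drive car one timestep.

    Equal wheel speeds drive the car straight; opposite speeds
    pivot it in place without changing position.
    """
    v = (v_left + v_right) / 2               # forward speed
    omega = (v_right - v_left) / wheel_base  # turn rate
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading

print(step(0.0, 0.0, 0.0, 1.0, 1.0, 0.5, 1.0))  # → (1.0, 0.0, 0.0)
```

With `v_left = -v_right` the forward speed is zero and only the heading changes, which matches the pivot-in-place behavior of the physical car.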
We made a quick-and-dirty app to test sending steering directions to the car. Steering is accomplished not by pivoting the wheels, but by rotating the wheels on each side in opposition to each other, tank-style. This gives us a very tight turning radius, which is easier to model in the software.
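Because the car only pivots or drives straight, the steering controller reduces to three commands: spin left, spin right, or go forward. Here's a hedged sketch of how tracking output could be turned into a command — the command names and tolerance are illustrative, not our actual protocol:

```python
import math

def steer_command(car_heading, car_pos, target_pos, tolerance=0.2):
    """Pivot toward the target until roughly aligned, then drive forward.

    car_heading is in radians; positions are (x, y) in a y-up frame.
    """
    dx = target_pos[0] - car_pos[0]
    dy = target_pos[1] - car_pos[1]
    bearing = math.atan2(dy, dx)
    # Signed heading error, wrapped to (-pi, pi]
    error = (bearing - car_heading + math.pi) % (2 * math.pi) - math.pi
    if error > tolerance:
        return "SPIN_LEFT"
    if error < -tolerance:
        return "SPIN_RIGHT"
    return "FORWARD"

print(steer_command(0.0, (0, 0), (0, 1)))  # → SPIN_LEFT
```

The tight turning radius is what makes this three-command scheme workable: the car can always rotate in place to face the laser before moving, so there's no turning arc to model.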
Finally we tied it all together and Laser Cat was born. The vehicle is moving a little slowly in this video because the batteries are getting low. We were all a little tired at this point.
All of the code required to run Laser Cat is available on GitHub: https://github.com/wdlindmeier/lasercat