Sophie Laffont, Luis Palacios, Alessandra Villaamil
A Background on Camera Traps:
Camera traps have many advantages for tracking wildlife:
- They are non-invasive when using IR flashes, and so have no effect on most animal behavior.
- They require little labor, are easy to deploy, and can function for weeks without attention.
- They produce bonus material (e.g. a record of the animal's behavior, which can be important for scientific questions).
- Over the last few years, wireless sensor networks have been used extensively for ecological monitoring applications.
- Data can be reviewed by other researchers, avoiding biases and limitations in the field.
- Multiple cameras set up at a single site can help with the identification of an animal.
However, camera traps do present some limitations that open the doorway for innovation and redesign:
- Any electronics will suffer in a humid environment, so some care is needed. To minimize the impact of seasonality on camera performance, silica desiccant packets (two if possible) should be used to keep the internal components dry. Key items to inspect and service regularly include the rubber gaskets that keep moisture out of system components, exposed metal contacts and battery leads (check for corrosion and dirt), and external wires.
- Many field studies fail because too little data is gathered at the site, and a camera failure might take considerable time to notice and fix.
- If any single element fails, it can bring down the entire system.
- Large animals may destroy camera traps, and in some areas camera theft can be a problem.
- Some animals are particularly difficult to document (behavior, coloring, speed, habitat, population).
The first thing to think about is camera placement. The position of the animal in the photo depends on the following factors: (a) the size of the detection zone, which in turn depends on how close the camera is to the animal; (b) the trigger speed (or latency): the length of time between detection by the sensor and the camera recording a picture; and (c) the speed of the passing animal. The camera can be placed independently of the sensor and detector, allowing for creative photographs and better use of lighting and composition. Camera traps set along trails require a faster trigger speed, whereas camera traps set at mineral licks or trees can be slower, since the animal is likely to pause in front of the camera. It is suggested to move the camera traps every 15 to 30 days to avoid bias caused by camera trap locations and the populations visiting them, and to sample a larger area.
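The relationship between trigger latency, detection-zone size, and animal speed in (a)-(c) can be sanity-checked with a little arithmetic. The sketch below is illustrative only; the zone widths and animal speeds are assumed numbers, not measurements from any particular camera model.

```python
# Rough feasibility check: the camera must fire before the animal
# leaves the detection zone, so latency must be below width / speed.
# All numbers here are illustrative assumptions.

def max_usable_latency(zone_width_m, animal_speed_ms):
    """Upper bound (seconds) on trigger latency for the animal to
    still be in frame when the shutter fires."""
    return zone_width_m / animal_speed_ms

# Trail camera: animal trotting at 2 m/s through a 2 m wide zone
print(max_usable_latency(2.0, 2.0))  # 1.0 second budget

# A fast animal crossing a narrow zone leaves far less margin:
print(round(max_usable_latency(1.0, 3.0), 2))  # 0.33
```

This is why traps at mineral licks, where the animal pauses, tolerate much slower triggers than traps set along trails.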
Three variables should be considered to assess the cost-effectiveness of camera trap models: (a) the cost of the camera traps, including batteries; (b) the field costs of visiting camera traps for battery/memory replacement; and (c) survey duration and effectiveness. For most surveys, the material needed is as follows: camera housing, camera, cables to attach them to trees, sufficient memory cards and batteries, a hand-held GPS unit for recording camera trap locations, and data forms (camera trap setting/monitoring and description of the camera trap site).
We tested 7 dog toys to measure the rate of false positives. Results depended on the size of the toy, the speed at which it “moves” (i.e. rolls/slides across the floor), and its distance from the camera; overall, our results were inconsistent. Attach:testingfalsepositives.ext Attach:falsepositives.ext
Hacking the SD Card
We used CHDK (the Canon Hack Development Kit) to control the camera automatically without using the camera's physical buttons. This is necessary because the camera will be inside the housing and inaccessible. CHDK uses uBASIC and Lua scripts that enable features such as time lapse, motion detection, advanced bracketing, and more. For the purposes of this project, we used a basic motion-detection script found on the CHDK forum that triggers an exposure in response to motion. By editing the parameters of the script, we were able to tune it for our environment. Here's how it works:
Zones are used to control the sensitivity of motion detection by breaking the field of view into a grid. This way, you can limit the detection of motion to one specific spot in the field of view. The zones that are ignored are known as exclusion zones.
There are different ways of causing a trigger, known as detection modes. You can format the code to detect luminance, blue chrominance, red chrominance or individual R, G or B values.
The sensitivity of detection can be adjusted by setting a threshold.
Thresholds can also be set to determine how long the camera waits before attempting to detect motion, how often the camera checks for motion in the field of view and how many actual pixels are tested at every attempt of motion detection.
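The zone/threshold mechanism described above can be sketched independently of CHDK (whose scripts are actually uBASIC or Lua). This Python toy version assumes grayscale frames as 2-D lists of luminance values; the grid size, threshold, and exclusion zones are illustrative parameters, not CHDK's real ones.

```python
# Minimal sketch of grid-based motion detection: split the frame into
# a grid, average the luminance of each cell, and flag cells whose
# mean changed by more than a threshold between two frames.

def zone_means(frame, rows, cols):
    """Average luminance of each cell in a rows x cols grid."""
    h, w = len(frame), len(frame[0])
    means = []
    for r in range(rows):
        row = []
        for c in range(cols):
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            vals = [frame[y][x] for y in ys for x in xs]
            row.append(sum(vals) / len(vals))
        means.append(row)
    return means

def motion_zones(prev, curr, rows=3, cols=3, threshold=10, exclude=()):
    """Return (row, col) cells whose mean luminance changed by more
    than `threshold`, skipping any exclusion zones."""
    a, b = zone_means(prev, rows, cols), zone_means(curr, rows, cols)
    return [(r, c)
            for r in range(rows) for c in range(cols)
            if (r, c) not in exclude and abs(a[r][c] - b[r][c]) > threshold]

# Two 6x6 frames; only the top-left corner changes:
prev = [[0] * 6 for _ in range(6)]
curr = [row[:] for row in prev]
curr[0][0] = curr[0][1] = curr[1][0] = curr[1][1] = 255
print(motion_zones(prev, curr))  # [(0, 0)]
```

Each ignored cell acts as an exclusion zone, so e.g. a waving branch in one corner of the frame can be masked out.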
The CHDK model of motion detection was a good try, but in our case it proved to be too much of a hassle. To get the camera to open the CHDK menu involves pressing a series of buttons in a precise order. This wouldn't be much of an inconvenience if you were setting up the camera for lightning photography, for example, but it would be much too difficult once the camera is already placed into our camera housing. Right now it works for us on a theoretical basis, but a different method of triggering the camera would probably be a better option for our camera trap.
At first, we made our own SD card breakout connected to wires; the camera's own SD card would then transfer its data onto it. This proved unreliable and fragile, so in the end we sourced ready-made cables for SD card data transfer. However, as the delivery time was long, we continued using the homemade one in the meantime. We tested our SD card and the camera's SD card with motion detectors to see how well the data was transferred, then transferred the data to the computer to make sure the card was getting the readings from the motion sensors. Attach:makingsdcard.ext Attach:testingsdcard.ext Attach:transferingdata.ext
The camera’s internal SD card is connected to wires that lead to the outlets from which we will retrieve the information. The wires and case are protected with a waterproof coating. Attach:caseoutlets.png
We used a Pelican case made especially for protecting cameras. We secured the outlets through carefully measured holes in the box and coated every opening with sealant. It looks a bit messy, but it works. Ideally, we would build a prototype from scratch. Attach:prototype1.ext Attach:cameratypesideview.ext Attach:sideviewclose.ext Attach:prototype2.ext
Nature Calls - Android Arduino Camera trap
Nature Calls is a mobile phone-based camera trap for animal behavioral studies. With researcher workflow and harsh climates in mind, we designed a modular camera trap that transfers photographs remotely, is completely compartmentalized for easy maintenance, is entirely waterproof, and is minimally disruptive to wildlife. The trap is distinct from off-the-shelf models in that it uses a mobile phone, opening up the world of real-time data analysis, remote data transfer, location stamping, and a host of customizable features for various research objectives.
Research: Motion Sensors: Ultrasonic, Beam Breaker, PIR, Twin PIR, Android
This is our documentation of our research on motion sensors that could eventually be used for monkey tracking in camera traps.
Camera traps normally come with a PIR sensor installed near the camera to detect the motion of animals in the jungle or other environments. This kind of sensor has specific characteristics that make it suitable for sensing motion under certain circumstances, such as a space without constant movement, so that it can discriminate a change in the environment.
Given these constraints, we decided to test other kinds of sensors that approach motion sensing differently, such as detecting specific distances (ultrasonic), lighting changes, or color/brightness/contrast changes.
We chose motion sensors that were inexpensive and used different ways to detect motion, or to detect the presence of something: in this particular case, a monkey.
- Ultrasonic MaxBotix LV-MaxSonar-EZ1
- PIR Parallax Sensor Rev B (Nº 555-280-27)
- Twin PIR Bravo 6 Dual PIR Motion Detector Sensor
- Laser Beam Breaker
- Android + Computer Vision
Using RFID to monitor social interactions
- One monkey is selected (high status animal?) to wear an RFID reader and data transmitter.
- Other monkeys are given passive RFID tags that will register with reader and log amount of time spent in the same vicinity.
- Transmitter will send proximity information back to researchers for study.
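As a rough illustration of the logging idea, and not a description of any real RFID hardware, the reader's poll log could be reduced to time-in-vicinity totals like this (tag IDs and the poll interval are invented):

```python
# Hypothetical sketch: the reader polls at a fixed interval and records
# which passive tags it saw on each poll; total time near the reader is
# then just (number of polls seen) x (poll interval).

POLL_INTERVAL_S = 5  # assumed polling interval

def time_in_vicinity(poll_log):
    """poll_log: list of sets of tag IDs seen on each poll.
    Returns {tag_id: total seconds spent near the reader}."""
    totals = {}
    for seen in poll_log:
        for tag in seen:
            totals[tag] = totals.get(tag, 0) + POLL_INTERVAL_S
    return totals

log = [{"M2"}, {"M2", "M3"}, {"M3"}, set(), {"M2"}]
print(time_in_vicinity(log))  # {'M2': 15, 'M3': 10}
```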
Using IRDA to monitor social interactions
- Several monkeys will be given uniquely pulsing infrared devices that can sense their counterparts and record which devices were contacted and for how long they remained in proximity.
- Data stored in the devices may be transmitted to researchers holding portable infrared devices.
- Range: between 3-6 meters, possibly up to 10.
- Other limitations: Sensors may need a direct line of sight. Therefore, not all proximity interactions will be recorded.
Monkey is in the p-comp lab
- Proof of concept for identifying individuals via IR LED Senders and receivers in the collar.
- Small brooches with a 555 timer make LEDs blink at different speeds (so we can distinguish individuals); the blinks are detected by an IR receiver connected to an Arduino.
- When a collar is detected, a message is sent to a website: Monkey X is in the p-comp lab!
- Users can be notified of the monkey's location via e-mail or SMS.
- Possibly an android app will be developed as well.
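A sketch of the identification logic on the receiving side: since the 555 timer on each brooch blinks at a distinct period, the receiver can classify pulse timestamps by their average gap. The periods and tolerance below are made-up values, not the ones used on our actual brooches.

```python
# Identify an individual from the blink rate of its collar LED, given
# rising-edge timestamps from the IR receiver. Known periods (seconds)
# and the matching tolerance are illustrative assumptions.

KNOWN_PERIODS = {"Monkey A": 0.25, "Monkey B": 0.50, "Monkey C": 1.00}
TOLERANCE = 0.05  # seconds

def identify(pulse_times):
    """pulse_times: rising-edge timestamps (s) from the IR receiver.
    Returns the monkey whose known blink period matches, else None."""
    if len(pulse_times) < 2:
        return None
    gaps = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    period = sum(gaps) / len(gaps)
    for name, known in KNOWN_PERIODS.items():
        if abs(period - known) < TOLERANCE:
            return name
    return None

# Slightly jittery pulses, averaging a 0.5 s period:
print(identify([0.0, 0.51, 0.99, 1.50]))  # Monkey B
```

Averaging over several gaps makes the classification tolerant of the timing jitter a real 555 circuit and receiver would introduce.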
Using accelerometers to infer positional/behavioral states
Morgen & Christina
- Project Status Updates:
Using Computer Vision to identify individuals
- Free face detection/recognition software such as iPhoto or Picasa won't do the job: the background is usually too noisy, which leads the software to identify faces where there are none, and the fact that the monkey's face is so dark may confuse traditional face-detection algorithms, which look for shadows under the eyes and mouth.
- A simple CV program written in openFrameworks didn't give better results. One workaround may be to use the IR LEDs in the collars to illuminate the faces of the individuals, but we haven't tried that yet.
- As a non-invasive tracking method, cameras are ideal, so hopefully we can hack an SDK to use for face detection and recognition. Things don't look promising on that front, though: most SDKs give you access to remote triggering but not to the face-detection libraries.
Arturo Vidich, Michael Knuepfel
Remote control of Canon cameras for camera traps
Sync from the PC socket + main hot shoe contact = a voltage difference between them that appears right after the shutter moves, with second-curtain sync. A shutter has two curtains: the first opens (first-curtain sync), then the second closes (second-curtain sync).
With a long exposure, the time between the two is quite large. (Using a laser depends on the shutter time needed to grab the photo, just something to consider.) Somewhere in the camera's menu, the flash sync has to be set to first-curtain sync. A way to test the trigger: wire the sync signal to a transistor driving an LED. When the camera fires, the voltage should light the LED, but only for as long as is needed to trigger the lasers through the flash. If you can get a picture while the LED is on: YAY!! You have what you need. In aperture-priority and manual modes, we get a positive signal on the center pin of the sync cable. In shutter-priority mode: NOTHING.
Cheaper sync cord link: http://www.meritline.com/canon-ttl-camera-remote-extension-sync-cord---p-39233.aspx?source=fghdac
Hot shoe links: http://photography-on-the.net/forum/search.php?searchid=16908199
Camera Trap Investigations
Corrie Van Sice, Matthew Richard
- Is there a more efficient way to both sense and analyze motion in front of the camera to reduce junk photos?
- The dead idea: What if we build a pin-hole camera and a simple matrix of photo sensors and do some low-tech image processing on a micro-controller?
- The other plan: What if we put an Android phone (or some other mini-computer) in the box and do live image processing with more robust software?
- How can we maximize the amount of information in each photo?
- image quality, area of focus, lighting, scale, identity, time, frequency of visits, etc.
- What can be learned from the Canon Hack?
- What are the many wonderful ways that lasers can be implemented to get a more precise measurement of the scene and/or the subject of the photograph?
Instructable for cheap camera trap: http://www.instructables.com/id/Cheap-Motion-Detection-Wildlife-Camera/?ALLSTEPS
Lily Szajnberg, Diana Huang, Gabriela Gútierrez, Natalie Be'er
Paul Rothman, Morgen Fleisig & Carolina Vallejo
- Power is probably the biggest problem faced after weight and data retrieval.
- Batteries have been explored elsewhere:
- Lithium-Thionyl Chloride Batteries (Li-SOCl2). Note: Shorting and rapid discharge can lead to explosion.
- Eric Rosenthal recommends putting a resistor in series to act as a fuse and placing desiccants in the sealed case to combat humidity.
- What is the humidity in the region?
- Kinetic Motion Harvesters are a potential resource - Morgen & Carolina
- We did look at an energy harvester by Advanced Linear Devices (the EH300-Kit) but it was more of a large capacitor, designed to even out power spikes and dips from other variable energy capture systems such as solar panels or wind turbines.
- "Pico Hydro" - Paul
- Pico-Hydro - Could be used to power WiFi or other electronics if research area is relatively close to the river.
- Off the shelf Low RPM DC generator
- Research into pico-power is ongoing. Preliminary testing on a home-made generator produced 5 microvolts at high RPMs. The design is being refined, and further research is being done on low-RPM generators so that stream flows and rainfall can be harnessed. DIY or purchased generators can be used to power field equipment such as camera traps and radio devices, reducing maintenance and increasing "on-time". These generators can harness water flow in a local stream or rainfall collected in a container.
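For a rough sense of what a pico-hydro setup could deliver, the standard hydropower formula P = ρ·g·Q·h·η applies. The flow rate, head, and efficiency below are assumed values for a small stream, not measurements from the field site.

```python
# Back-of-the-envelope hydro power estimate: P = rho * g * Q * h * eta.
# All input values are illustrative assumptions.

RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydro_power_w(flow_m3s, head_m, efficiency=0.5):
    """Electrical power (watts) from flow (m^3/s) over a head (m)."""
    return RHO * G * flow_m3s * head_m * efficiency

# 1 L/s over a 0.5 m drop at 50% overall efficiency:
print(hydro_power_w(0.001, 0.5))  # about 2.45 W
```

Even a couple of watts, if continuous, would be enough to keep a battery bank for a camera trap or radio topped up.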
- Extensive research into suitable collars was done and a list of manufacturers is located here. The ideal collar would allow researchers to get a GPS fix on a subject every half hour and have a battery that can last up to 18 months.
- The Telemetry Solutions collar was the leading candidate, as it possesses GPS tracking, an onboard accelerometer, remote data downloading, a 125 g weight, and good battery life. It uses the SiRFstar III GPS chip, which is the industry leader and is also used in the Garmin GPSmap76CSx used in the field. The unit has been ordered but has not yet been delivered.
- An attempt was made to break open a radio frequency collar but the effort was thwarted by a ruptured battery. Be careful of applying pressure to the casing as the battery may rupture.
RFID and PIT Tags
- Getting proximity data between two subjects was of interest to the researchers. RFID was a considered option: by placing an RFID reader on one key subject and tags on the others, one could conceivably record the proximity between subjects. PIT tags (Passive Integrated Transponders) were considered because they require no power and could be implanted in various places. Unfortunately, the sociable distance of most woolly monkeys is between 1 and 2 meters, greater than the read range achievable with small, wearable readers and tags.
- Leading manufacturers of the technology include Trovan and Destron. It should be noted that their technologies are incompatible with each other. See the links page for manufacturers websites.
Getting Started with GIS
- Download shapefiles (vector layers) of New York City
- The GPS unit currently used in Tiputini is the Garmin GPSmap76CSx which uses the SiRFstar III GPS chip known as the leading GPS module for getting fixes even in dense foliage and city environments. The chip draws only 50-500 microAmps. The SiRFstar IV is set to be released in Q1, 2010. See the links page for more information on the SiRF star.
GPS Receiver tests from a few embedded receivers
- Researchers make audio recordings of the monkey subjects for later analysis and identification in the lab. We were attempting to run FFT (Fast Fourier Transform) analysis live in the field to identify specific monkey subjects. C code was found online and it is in the process of being converted for use in the Arduino environment. Current code can be found here: Attach:Fft.zip
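To illustrate what the FFT analysis is meant to do (the actual code is C being ported to the Arduino environment), here is a pure-Python naive DFT that picks out the dominant frequency of a signal. The sample rate and test tone are invented values, not real monkey vocalizations.

```python
# Find the dominant frequency of a sampled signal with a naive O(N^2)
# DFT over the first N/2 bins. A real implementation would use an FFT;
# this is just to show the analysis step.

import math

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the bin with the most energy."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n

# 64 samples of a 100 Hz tone at an 800 Hz sample rate:
rate = 800
tone = [math.sin(2 * math.pi * 100 * t / rate) for t in range(64)]
print(dominant_frequency(tone, rate))  # 100.0
```

Matching a detected dominant frequency (or a small spectrum fingerprint) against known call profiles is the basis for identifying individuals from their vocalizations.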
Behavior Observation Tools
Data Acquisition and Storage
Neil Hickey & Nien Lam
- Currently, the researchers need several steps to get the field data into the final Access db (*.mdb), which is stored locally. First, data is recorded on the Palm z22 in HanDBase forms; these are then exported as Excel files on the computer and added to the Access db.
- From my analysis, the most efficient way to streamline this procedure is to port the whole db to a SQL format and host it on a server that the local dbs reference. Additionally, communication between the Garmin GPS receiver and the Palm z22 would be nice. The Palm also has IR capabilities (not sure about Bluetooth yet), which could be used for wireless transfer to the IT infrastructure. Also, consider that some of the projects might produce data that is not currently being observed, so the whole process and db format should be designed to be modular, meaning that changes to either the format or the data structure should be easy for average computer users to make.
- - automate the date and observer settings to guarantee unique keys for the automated db (palm)
- - write a Perl script that parses all the data into a SQL db (pc/web)
- - write an Access macro that references the remote files (pc/web)
- - figure out the Palm's IR protocol and how to connect it to the script (palm)
- - (write a fancy toolkit for visualizing data dependencies, etc. ...)
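The parsing step is proposed above as a Perl script; the same idea is sketched here in Python with the stdlib sqlite3 module instead. The column names (date, observer, subject, behavior) are hypothetical, since the real HanDBase/Excel export format would dictate the schema; the UNIQUE constraint reflects the first task above, with date + observer (+ subject/behavior) acting as a uniqueness key.

```python
# Sketch: parse a CSV export of field observations into a SQL db,
# deduplicating on a uniqueness key. Schema and column names are
# hypothetical placeholders for the real HanDBase export format.

import csv, io, sqlite3

def load_observations(conn, csv_text):
    conn.execute("""CREATE TABLE IF NOT EXISTS observations
                    (date TEXT, observer TEXT, subject TEXT, behavior TEXT,
                     UNIQUE(date, observer, subject, behavior))""")
    rows = csv.DictReader(io.StringIO(csv_text))
    conn.executemany(
        "INSERT OR IGNORE INTO observations VALUES (?, ?, ?, ?)",
        [(r["date"], r["observer"], r["subject"], r["behavior"]) for r in rows])
    conn.commit()

conn = sqlite3.connect(":memory:")
load_observations(conn, "date,observer,subject,behavior\n"
                        "2010-02-01,KC,M2,feeding\n"
                        "2010-02-01,KC,M2,feeding\n")  # duplicate row ignored
print(conn.execute("SELECT COUNT(*) FROM observations").fetchone()[0])  # 1
```

Re-running the import over the same files is then harmless, which matters if the sync between Palm, PC, and server happens repeatedly.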
- Note: The interface of the HanDBase forms is somewhat unintuitive. I tried some customizing within HanDBase but wasn't really happy. However, I am no interface designer, so feel free to go for that part.
Handheld User Interface
Kenny Chiou
Lisa Maria, Russ, Sonaar, Zeven
900MHZ XBee radio Tests for GPS testing
DIY Radio Trackers and GSM networks
Another possible DIY transmitter; I'm not sure if it can reach the range of standard wildlife collars, though.
- Any tracking device is by definition a transmitter; i.e., a tracker most commonly transmits a signal (a pulse or "ping") on a specified radio frequency, which is picked up by a receiver. Whoever or whatever is on the receiving end can thereby infer something about the spatial distance between the two positions.
- A tracking system can be described as a network that follows a centralized topology (many-to-one) which is appropriate given the particular application (find monkeys close by, establish physical contact/line of sight, observe them).
- Given that there are many DIY implementations of wireless devices, it seems possible to build our own.
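One common way a receiver infers distance from such a signal is the log-distance path-loss model. The reference power at 1 m and the path-loss exponent below are pure assumptions; real values depend on frequency, antennas, and forest density, and would need calibration in the field.

```python
# Log-distance path-loss model: distance grows exponentially as the
# received signal strength (RSSI) drops. Parameter defaults are
# illustrative assumptions, not calibrated values.

def estimate_distance_m(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exp=3.0):
    """Estimate transmitter distance (m) from received power (dBm)."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exp))

# A reading 30 dB below the 1 m reference, with exponent 3:
print(estimate_distance_m(-70.0))  # 10.0
```

Because the exponent sits in the denominator of a power of ten, small calibration errors produce large distance errors, which is why such systems are usually treated as proximity detectors rather than precise rangefinders.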
By their very nature, monkeys are incredibly mobile. They are a self-organizing network. They are the perfect self-forming “mobile nodes”. I propose designing an asynchronous mobile data mesh network and communications protocol for monkey radio collars that will tell primate biologists 1) where the monkeys travel, 2) when and 3) how often monkeys come into proximity with other collared monkeys, and 4) where and 5) for how long these social encounters occur.
In the “Mobile Monkey Mesh”, each collared monkey carries a recording, storage, and communications device (in this case, a radio collar) attached to its body that serves as a “mobile data node”. Each device records and stores GPS data about the monkey’s location and proximity event data whenever that monkey comes into contact with another collared monkey. The camera traps and salt licks also each house a recording, storage, and communications device inside the camera casing that serves as a “stationary data node”. Each of these devices also records and stores proximity event data whenever a monkey comes near the camera trap. So, for example, if we have 4 collared monkeys and 2 camera traps, then we have a total of 6 data nodes in our mesh.
Whenever one of these storage devices comes in proximity of another storage device, the data being locally stored on each device is shared across devices. If the data has already been synced before in the past, then only the changes made in each device since the last synchronization will be exchanged between devices, and information about the date / time of each sync is recorded to both devices. In this way, the data captured about many monkeys is shared and distributed across multiple devices, mobile and stationary, in an asynchronous, opportunistic fashion.
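A minimal sketch of this delta-sync idea, with an invented record format (the real collars would exchange binary GPS/proximity logs, not Python tuples):

```python
# Toy model of the asynchronous mesh sync described above. Each node
# stores timestamped records and the time of its last sync with each
# peer; on contact, only records newer than that time are exchanged.

class Node:
    def __init__(self, name):
        self.name = name
        self.records = []    # (timestamp, payload) tuples
        self.last_sync = {}  # peer name -> time of last sync

    def log(self, t, payload):
        self.records.append((t, payload))

    def sync_with(self, other, now):
        since_me = self.last_sync.get(other.name, -1)
        since_peer = other.last_sync.get(self.name, -1)
        # pull the peer's records that are newer than our last sync
        for rec in other.records:
            if rec[0] > since_me and rec not in self.records:
                self.records.append(rec)
        # push our records that are newer than the peer's last sync
        for rec in list(self.records):
            if rec[0] > since_peer and rec not in other.records:
                other.records.append(rec)
        self.last_sync[other.name] = other.last_sync[self.name] = now

monkey, trap = Node("monkey1"), Node("trap1")
monkey.log(1, "gps fix")
trap.log(2, "proximity event")
monkey.sync_with(trap, now=3)
print(sorted(monkey.records) == sorted(trap.records))  # True
```

Each node remembers when it last synced with each peer, so repeated encounters exchange only the records logged since the previous sync.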
By syncing data from each other on-the-fly, the monkeys are actually doing most of the legwork required for data logging and collection in the depths of the jungle, which hopefully would save biologists a lot of time and energy otherwise spent gathering this data from one tranquilized monkey at a time.
Read more about this concept and design on Suzanne's blog.
Automated GPS Data Download
At present, Tony and his team are using the Telemetry Solutions RS 4000 GPS collars to track the monkeys’ locations at specific times during the day / night. The biggest difficulty regarding the retrieval of GPS data from these collars at present is finding the monkeys in the forest and positioning the antenna for a successful download. Currently, Tony and his research assistants must track the collared monkey via radio telemetry, and then stand within 10-20 meters of the monkey in a heavily wooded area with dense foliage and moisture, point the antenna towards the direction of the animal, and then manually press a button on the Telemetry Solutions software to activate download of the GPS data.
We want to make this data gathering process easier for primate biologists who track monkeys in the field. We know that the monkeys come to the salt lick a few times per week, and we know that the camera trap could have a motion sensor connected to a microcontroller to sense when an animal has passed by the salt lick.
Our project idea is to set up an Automated GPS Data Downloading base station inside the camera trap that will automatically download GPS data from a monkey’s collar when it passes by the mineral lick.
Solution 1: USB Bus Pirate
Read about this solution on Suzanne Kirkpatrick's blog for more information on the USB Bus Pirate.
Solution 2: Batch Script & Robot Class in Processing
Read about this solution on Tali Blankfeld's blog for more information on the Batch Script method and the Robot Class in Processing.
Both of our solutions are reasonable attempts at a custom approach, and we were able to definitively prove that the Robot class can wake the computer from sleep when motion is detected and put it back to sleep after each automated download has completed. For now, this seems like the most viable solution to pursue, and it deserves further investigation on Windows.
In sum, we were fairly limited in our experimentation by gaps in our knowledge about how the Telemetry Solutions automated base station is configured. Had we known more about the product's power system and general power consumption, and the specifics of the base station software, we could have gone deeper in building a solution that would really integrate with the Telemetry Solutions RS 4000 GPS collars.