- Remote wildlife cameras capture millions of images that humans struggle to organize
- Machine learning saves time by accurately identifying and classifying animals
- Automatically sorted images advance research in ecology, farming, and other fields
Facing a daunting, digital pile millions of photographs high, how would you find the few images you need? Could you, even? It’s a classic needle-in-a-haystack conundrum, and it’s a glaring reality for ecologists, resource managers, and farmers who use camera traps to track wildlife.
Consider Sedgwick Ranch Reserve, part of UC Santa Barbara’s Natural Reserve System. A sprawling 6,000 acres, the protected land used for research and teaching is a veritable nirvana for animals of all kinds. Mountain lions and black bears and deer, oh my.
And they are all represented many times over in the reserve’s massive image archive — millions of pictures of thousands of animals, captured by camera traps and dating back more than a decade. But who has time to sort them all?
Cue the computer scientists. More specifically, UCSB computer science professors Chandra Krintz and Rich Wolski. Their new endeavor — aptly named “Where’s the Bear?” — is training technology to wrangle those images, leaving researchers more time for, well, research. The pair conceived and co-lead the effort that employs machine learning to identify and classify animals caught on camera.
By assigning to computers a vexing task that until now was the sole purview of people, the project saves major manpower. But the innovation also holds potential beyond ecology, reaching into agriculture and even medical imaging.
“Not only does ‘Where’s the Bear’ work, but it works really well,” says Krintz. “We don’t get any coyote wrong. We don’t get any bears wrong. We get about 12 percent error on deer — there are lots of deer — and we are trying to improve on that. Now all the ecologists are saying, ‘Tell me if the bear is healthy. Is it the same bear? How many deer are there with antlers?’”
“The low-hanging fruit was easy, and we can save people thousands and thousands of hours, literally,” Krintz continues. “So now we can start asking those harder questions by working with ecologists and scientists on how to design that system.”
“Where’s the Bear?” is a multitier system spanning sensing, edge, and cloud layers that integrates recent advances in machine learning-based image processing to automatically classify animals in images captured by remote, motion-triggered camera traps.
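The tiered flow described above can be sketched in miniature: a motion-triggered camera supplies frames, an edge node discards near-identical frames with simple frame differencing, and only candidate frames reach the cloud-side classifier. The threshold value and the stand-in classifier below are illustrative assumptions, not the project's actual code.

```python
# Minimal sketch of a sensing -> edge -> cloud pipeline, assuming Pillow
# for image handling. The real system uses a trained model in the cloud tier.
from PIL import Image, ImageChops

MOTION_THRESHOLD = 10.0  # mean per-pixel grayscale difference; tune per site


def has_motion(prev: Image.Image, curr: Image.Image) -> bool:
    """Edge-side filter: keep a frame only if it differs enough from the last."""
    diff = ImageChops.difference(prev.convert("L"), curr.convert("L"))
    pixels = list(diff.getdata())
    return sum(pixels) / len(pixels) > MOTION_THRESHOLD


def classify(frame: Image.Image) -> str:
    """Cloud-side stand-in; a trained CNN would produce the species label here."""
    return "bear"  # placeholder label


def pipeline(frames: list[Image.Image]) -> list[str]:
    """Run the edge filter over a frame sequence, classifying what survives."""
    labels = []
    prev = frames[0]
    for curr in frames[1:]:
        if has_motion(prev, curr):          # edge tier: drop empty frames
            labels.append(classify(curr))   # cloud tier: classify the rest
        prev = curr
    return labels
```

Filtering at the edge is what drives the savings in communication and energy the article mentions: empty frames never leave the reserve.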
To build a training set, Krintz and Wolski overlaid thousands of stock photos of wildlife onto background images from Sedgwick watering holes.
They used the new, synthetic images to teach the system how to accurately identify and classify those animals. So far, it has helped the Sedgwick team aggregate and analyze more than 1 million images dating back years.
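The compositing step described above can be sketched as an alpha-masked paste: a stock-photo cutout is placed on a watering-hole background frame, and the result is saved under its class label. The sizes, colors, and label scheme here are illustrative assumptions standing in for real imagery.

```python
# Sketch of synthetic training-image construction, assuming Pillow.
# An RGBA cutout's alpha channel serves as the paste mask, so transparent
# pixels leave the background visible.
from PIL import Image


def composite(background: Image.Image, cutout: Image.Image,
              position: tuple[int, int]) -> Image.Image:
    """Paste an RGBA animal cutout onto an RGB background frame."""
    frame = background.convert("RGB").copy()
    frame.paste(cutout, position, mask=cutout.split()[-1])  # alpha as mask
    return frame


# Stand-ins for a camera-trap background and a stock-photo cutout.
background = Image.new("RGB", (640, 480), (90, 120, 60))   # grassy scene
cutout = Image.new("RGBA", (100, 80), (50, 35, 20, 255))   # opaque "bear"

sample = composite(background, cutout, (270, 200))
# Each synthetic frame would then be saved under its class label, e.g.
# sample.save("train/bear/synthetic_0001.jpg")
```

Varying the cutout's position, scale, and background yields many labeled frames per stock photo, which is what makes a usable training set out of a modest image library.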
“Everybody uses images to do analysis now,” Krintz says. “It doesn’t have to be animals. It could be people or how many cars pass by a particular street corner.”
“This is hugely powerful technology and we want to bring it to bear on important problems,” she adds. “It saves tons of time — what once took 14 days we can do in 3 hours — and it saves money, communications, energy, and carbon footprint.”
According to Wolski, the project can inform research in endless ways, from identifying which species are present, to illuminating the effects of drought by revealing how animals respond when there is more, or less, water available.
Further, the technology can enhance understanding of the health of, say, the bear population, or enable more accurate deer counts to better inform the number of hunting licenses issued each year. Ranchers can use it to monitor livestock; farmers can use it to monitor crops.
Eventually, the system could even include a notification function, using real-time image analysis to alert property managers or authorities to the presence of poachers, or alert visitors — hikers, researchers, students — that there may be wildlife nearby.
“‘Where’s the Bear’ is innovative both in the remoteness of where the images are being captured, and also in the software architecture that we have developed to manage the process of gathering the images, classifying them accurately, then distributing them to interested scientists,” Wolski explains.
For Kate McCurdy, longtime resident director at Sedgwick Reserve, it’s coming none too soon. She was all in from the moment “Where’s the Bear” was first proposed, and she quickly involved her volunteer research assistant, Grant Canova Parker, who manages the cameras in the field.
“We’ve done data handoffs before to researchers, but it’s pretty unwieldy to give someone five hard drives of digital images and say, ‘Good luck finding the thing that you’re looking for,’” McCurdy says.
“We need to do better, to be able to hand off a spreadsheet that’s been processed and say, ‘Here’s the data. Here are all the pictures of the deer we’ve taken over the last five years — you can crunch your own numbers and find the trends.’”
Wolski and Krintz intend to make the project software open-source and accessible to all who are interested — starting with some of Sedgwick’s counterparts.
“Ecology and agriculture are related areas where this technology could have high impact,” says Wolski. “We would love to see the pervasiveness of this work throughout the UC system.”