Bird’s-AI View: How Deep Learning Helps Ornithologists Track Migration Patterns
Billions of birds in North America make the trek south each fall, migrating in pursuit of warmer winter temperatures. But at least a quarter of them don’t make it back to northern breeding grounds in the spring, falling victim to predators, weather or man-made hazards like oil pits and cell towers.
Many of these migratory birds fly under the cover of night, making it challenging for birdwatchers and ornithologists to observe them and track long-term trends. But the need to monitor avian population levels is critical.
Recent research estimates that the number of birds in North America has fallen by 3 billion in the past 50 years, driven by climate change, habitat loss, hunting and pesticides. The volume of spring migration alone has declined by 14 percent in the past decade.
To better understand how and why bird populations are changing over time, researchers at the University of Massachusetts Amherst are using AI to analyze more than two decades of data from the national weather radar network. These insights can also improve forecasts of future bird migration and aid conservation efforts.
Two Birds with One Dataset
A network of more than 100 weather radars has been online in the U.S. since the mid-’90s, scanning the atmosphere day and night, adding new measurements roughly every 10 minutes to a public data archive in the cloud.
While the radar network’s original purpose was to inform meteorologists, the instruments also capture flocks of birds (and even patches of insects) in flight, creating a vast trove of data for ornithologists.
Traditional methods for avian monitoring include observing and counting birds in the wild, weighing and measuring them, or tagging them with identification numbers or GPS trackers.
Radar, on the other hand, provides a detailed view of migration trends on a continental scale — giving ornithologists a way to track bird populations as they migrate thousands of miles year after year. But it’s hard to separate the signal from the noise.
When a radar image captures a flock of birds migrating across the skies, an untrained viewer may mistake the pattern for rain or snow. While both humans and AI can learn to tell the difference between birds and precipitation in radar images, deep learning methods dramatically accelerate the analysis of an ever-growing dataset of more than 200 million images.
Flocking to AI
Led by Daniel Sheldon, an associate professor of computer science, researchers at UMass Amherst used transfer learning and a dataset of 200,000 radar images from the National Weather Service to develop a neural network that could differentiate between migrating birds and precipitation.
Ph.D. student Tsung-Yu Lin (lead author on the paper) and assistant professor Subhransu Maji developed the model with support from the Cornell Lab of Ornithology.
The team used a cluster of four NVIDIA GPUs to train the deep learning model, which provides an estimate of how much biomass is present in a given radar image. From that figure, ornithologists can approximate the number of birds migrating. Named MistNet, the tool correctly identifies at least 96 percent of the birds within a test set of radar images, the researchers found.
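The step from biomass to bird count can be illustrated with a toy calculation. The sketch below is not MistNet's actual code; the mask, reflectivity values, pixel volume and per-bird cross-section are all hypothetical stand-ins. The idea is standard in radar ornithology: sum the scattering cross-section over pixels labeled as biology, then divide by an assumed cross-section per bird.

```python
import numpy as np

# Mocked-up MistNet-style output: a per-pixel mask where True marks
# biological scatterers (birds) and False marks precipitation, plus a
# grid of radar reflectivity eta in cm^2 per km^3. All numbers here are
# illustrative assumptions, not real radar data.
mask = np.array([[1, 1, 0],
                 [1, 0, 0],
                 [1, 1, 1]], dtype=bool)
eta = np.full((3, 3), 2200.0)   # reflectivity, cm^2/km^3, uniform for simplicity
pixel_volume_km3 = 0.05         # assumed atmospheric volume per pixel

# Total scattering cross-section contributed by the bird-labeled pixels.
total_cross_section = (eta[mask] * pixel_volume_km3).sum()  # cm^2

# Dividing by an assumed per-bird radar cross-section (values around
# 11 cm^2 are often cited for small songbirds) yields a rough bird count.
PER_BIRD_CROSS_SECTION_CM2 = 11.0
bird_count = total_cross_section / PER_BIRD_CROSS_SECTION_CM2
print(int(bird_count))  # → 60
```

In practice the per-bird cross-section depends on species and radar wavelength, so counts derived this way are approximations rather than exact tallies.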
MistNet can be run on every radar image in the public archive to summarize how much migration is occurring at different elevation levels, the direction of the birds and how fast they’re flying. Additional data sources like observations from birdwatchers or the geographic coordinates of the radar image can be used to determine which species of bird corresponds to a radar data trail.
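One classic way to recover flight direction and speed from Doppler radar, sketched below under simplifying assumptions, is a velocity-azimuth display (VAD) style fit: if the birds move uniformly, the radial velocity measured around an azimuth sweep traces a sinusoid, and a least-squares fit recovers the east and north velocity components. This is an illustration of the general technique, not the researchers' actual pipeline; the velocities here are synthetic.

```python
import numpy as np

# One radar sweep: azimuth angles and the radial velocity a uniformly
# moving layer of birds would produce, v_r(az) = u*sin(az) + v*cos(az).
az = np.deg2rad(np.arange(0, 360, 10))
u_true, v_true = 3.0, -7.0          # hypothetical east/north components, m/s
v_radial = u_true * np.sin(az) + v_true * np.cos(az)

# Least-squares fit of the two-term sinusoid recovers the components.
A = np.column_stack([np.sin(az), np.cos(az)])
(u, v), *_ = np.linalg.lstsq(A, v_radial, rcond=None)

speed = np.hypot(u, v)                        # ground speed, m/s
heading = np.rad2deg(np.arctan2(u, v)) % 360  # degrees clockwise from north
print(round(speed, 1), round(heading, 1))     # → 7.6 156.8
```

Repeating such a fit at each elevation angle is what lets radar data describe not just how much migration is happening, but in which direction and how fast at each altitude.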
Insights on the Horizon
The researchers have so far analyzed around 28 million scans and found that a large proportion of migration happens in a very concentrated time span. Just one night accounted for 10 percent of migration over Houston last spring.
Looking at these migration spikes over the two decades of available data could help scientists track how bird migration patterns are changing in response to climate change. The team discovered that as food becomes available earlier in the spring, bird migration dates are shifting earlier, particularly for flocks that settle in breeding grounds farther north.
Since radar data is updated every few minutes, this work can also be used to project bird migration in the near term. Sheldon works with BirdCast, a collaboration among the Cornell Lab of Ornithology, UMass Amherst and Oregon State University that uses radar data to provide a real-time bird migration map, as well as three-day forecasts.
“These forecasts are exciting because they allow bird watchers to look out and see what’s going to happen, and get excited about big migration events,” he said. “But it also has significant uses in conservation.”
Artificial light from skyscrapers and radio towers can distract and disorient migrating birds, disrupting their ability to navigate. When major migrations are forecast, for example, cities could help birds flying through the night by turning off these distracting light sources.
Main image by Frank Boston, used via Flickr under CC BY 2.0.
The post Bird’s-AI View: How Deep Learning Helps Ornithologists Track Migration Patterns appeared first on The Official NVIDIA Blog.