
Technology Application

AI-Enabled Wildlife Monitoring System: Automatic Bird Detection and Behavior Classification at Military Installations with Solar Farms

Presenters/Co-Authors:

Yukie Hamada, Biophysical Remote Sensing Scientist, Argonne National Laboratory: yhamada@anl.gov


Bryan O'Sullivan, Environmental Protection Assistant, Luke Air Force Base: bryan.osullivan.1@us.af.mil


Diane M. Walsh, Natural Resources Manager, Marine Corps Air Station (MCAS) Camp Pendleton: diane.walsh@usmc.mil


Abstract:

Data collection is the critical first step for understanding wildlife activities on and around military installations, both to mitigate risks to military missions (such as Bird/wildlife Aircraft Strike Hazard [BASH] events) and to protect wildlife and infrastructure while developing sound natural resource conservation plans. At the 2021 NMFWA meeting, we presented the early results of our computer vision and machine learning (ML) model, which achieved bird detection accuracy of 92% or higher. Since then, in continued partnership with Department of Defense (DoD) Partners In Flight, we have been advancing our model to classify bird activities around solar energy panels by incorporating a deep learning (DL) approach. More than 5,000 hours of daytime video have been collected at Luke Air Force Base, and nearly 21,000 tracks (or image sequences) of birds were labeled for model training. Our initial activity classification model, which used the tracked bird path alone, classified bird activities with 72% accuracy, indicating great potential for automated data collection on bird activities and behavior. In this presentation, we will report the overall performance of the latest model, share our vision for the AI-enabled avian monitoring system, which is planned to be ready for a systematic field trial after March 2023, discuss possible applications for various DoD activities, and seek audience feedback.
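For readers curious what classification from a tracked bird path alone can look like, the sketch below is a minimal, hypothetical illustration, not the presenters' model. It assumes each track is a sequence of (x, y) coordinates from a detector/tracker, derives simple motion statistics (speed, turning, straightness), and fits a generic classifier; the behavior labels, features, and synthetic data are placeholders.

    # Hypothetical sketch: classifying bird behavior from tracked (x, y) paths.
    # This is NOT the presenters' model; labels, features, and data are placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def track_features(track: np.ndarray) -> np.ndarray:
        """Summarize a track of shape (n_frames, 2) with simple motion statistics."""
        steps = np.diff(track, axis=0)                   # per-frame displacement vectors
        speeds = np.linalg.norm(steps, axis=1)           # per-frame speed (pixels/frame)
        headings = np.arctan2(steps[:, 1], steps[:, 0])  # per-frame heading angle
        turns = np.abs(np.diff(headings))                # heading change between frames
        net_disp = np.linalg.norm(track[-1] - track[0])  # straight-line start-to-end distance
        path_len = speeds.sum() + 1e-9                   # total distance traveled
        return np.array([
            speeds.mean(), speeds.std(),                 # how fast / how variable
            turns.mean() if turns.size else 0.0,         # how much the bird turns
            net_disp / path_len,                         # straightness (1 = straight flight)
        ])

    # Placeholder behavior labels (illustrative only).
    LABELS = ["fly_through", "perch", "forage"]

    # Synthetic random-walk tracks standing in for labeled tracker output.
    rng = np.random.default_rng(0)
    tracks = [np.cumsum(rng.normal(size=(60, 2)), axis=0) for _ in range(300)]
    y = rng.integers(0, len(LABELS), size=300)           # random labels, for shape only

    X = np.stack([track_features(t) for t in tracks])
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    new_track = np.cumsum(rng.normal(size=(60, 2)), axis=0)
    print(LABELS[clf.predict(track_features(new_track)[None, :])[0]])

A deep learning approach such as the one described in the abstract would learn its own representation of the track (and of the surrounding imagery) rather than rely on hand-crafted features, but the overall input/output structure, a bird track in and a behavior class out, is the same.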
