Monitoring the Spatial Movement of Multiple Individuals Using Sound
Acoustic identification and localization of individuals is a long-standing goal in bioacoustics. It allows researchers and managers to quantify individual-based conservation efforts, infer habitat use, understand population dynamics, and study individual interactions and collective behavior in the wild. However, identifying multiple individuals and tracking their positions using sound alone remains challenging. Here, we present a deep-learning-based approach that simultaneously localizes the spatial positions and determines the identities of multiple individuals. Our method works with species of varied vocal complexity, recognizes and locates individuals of the same or different species within the same area, scales automatically to any number of recorders and any grid size, and works with off-the-shelf bioacoustic recorders (AudioMoth). We used 1654 calls from 20 individuals of two bird species, the grey warbler (Gerygone igata) and the kea (Nestor notabilis), to demonstrate the capabilities of our approach. Data were collected in an 80 m by 80 m grid using 25 GPS-synchronized AudioMoth recorders. We conducted seven experiments to test our approach, including spatial localization of multiple individuals of one and multiple species, single- and multiple-individual tracking, and phase synchrony and asynchrony scenarios. Our method automates two labor-intensive tasks common in ecology and animal conservation (individual identification and tracking), significantly reducing the time and effort required to gather information on individual animals and populations.
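To illustrate the geometry underlying acoustic localization on a synchronized recorder grid, the sketch below implements a classic time-difference-of-arrival (TDOA) grid search. This is not the paper's method (which is deep-learning based); it is a minimal, assumption-laden baseline. The recorder layout (a 5 x 5 grid over 80 m x 80 m), the speed of sound, and all function names are illustrative assumptions.

```python
import numpy as np

# Hypothetical TDOA baseline, NOT the deep-learning method described above.
# Assumes perfectly synchronized recorders (as with GPS-synced AudioMoths)
# and a single vocalizing individual per call.

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at ~20 degrees C


def simulate_arrival_times(source, recorders, c=SPEED_OF_SOUND):
    """Time (s) for a call at `source` to reach each recorder position."""
    return np.linalg.norm(recorders - source, axis=1) / c


def localize_grid_search(arrival_times, recorders, area=80.0, step=0.5,
                         c=SPEED_OF_SOUND):
    """Brute-force search over an `area` x `area` plot for the point whose
    predicted TDOAs (relative to recorder 0) best match the observed ones."""
    observed_tdoa = arrival_times - arrival_times[0]
    xs = np.arange(0.0, area + step, step)
    best_point, best_err = None, np.inf
    for x in xs:
        for y in xs:
            cand = np.array([x, y])
            predicted = np.linalg.norm(recorders - cand, axis=1) / c
            predicted_tdoa = predicted - predicted[0]
            err = np.sum((predicted_tdoa - observed_tdoa) ** 2)
            if err < best_err:
                best_err, best_point = err, cand
    return best_point


# A 5 x 5 grid of 25 recorders over an 80 m x 80 m plot, mirroring the
# study layout (the actual recorder coordinates are not given here).
coords = np.linspace(0.0, 80.0, 5)
recorders = np.array([[x, y] for x in coords for y in coords])

true_source = np.array([31.0, 54.0])
times = simulate_arrival_times(true_source, recorders)
print(localize_grid_search(times, recorders))  # recovers ~[31. 54.]
```

A grid search like this degrades with noisy onsets, overlapping calls, and multiple simultaneous individuals, which is precisely the regime the paper's learned approach is designed to handle.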