Abstract
Many animals produce distinct sounds or substrate-borne vibrations, but these signals have proved challenging to segment with automated algorithms. We have developed SongExplorer, a web browser-based interface wrapped around a deep-learning algorithm that supports an interactive workflow for (1) discovery of animal sounds, (2) manual annotation, (3) supervised training of a deep convolutional neural network, and (4) automated segmentation of recordings. Raw data can be explored by simultaneously examining song events, both individually and in the context of the entire recording, watching synced video, and listening to song. We provide a simple way to visualize many song events from large datasets within an interactive low-dimensional visualization, which facilitates the detection and correction of incorrectly labelled song events. The machine learning model we implemented is more accurate than existing heuristic algorithms and comparable in accuracy to two expert human annotators. We show that SongExplorer allows rapid detection of all song types from new species and of novel song types in previously well-studied species.