Abstract: Dan Stowell, 22 October 2018

Machine learning for bird song learning: Asking birds about sounds

Dan Stowell and Lies Zandberg


In this talk we will introduce our newly started BBSRC research project, in which we are developing sound-comparison methods grounded in birds' own perception: bird listening tests.

Vocal learning is an unusual behavioural trait: central to human communication, it is found in no other primate species, yet has evolved independently in a disparate set of around seven mammalian and bird taxa. The thousands of studies investigating the neurobiology, development and neurogenomics of bird song learning all depend on our ability to measure the similarity of different songs from their sound spectrograms. Surprisingly, although computational methods for comparing songs have become the new standard in the field, they have been validated only by comparing their output against subjective human judgments of spectrographic similarity. The next wave of bird song research increasingly relies on reliable, quantitative measurements of song similarity, so the problem of how to assess it is an urgent one.
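To make the measurement problem concrete, here is a deliberately naive baseline of the kind such methods improve on: compare two recordings by the cosine similarity of their time-averaged spectrograms. This sketch is our illustration only, not any method from the project or the literature; the function names and the 22.05 kHz sample-rate default are assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def mean_spectrum(audio, sample_rate=22050):
    """Time-averaged magnitude spectrum of a 1-D audio signal."""
    _, _, sxx = spectrogram(audio, fs=sample_rate, nperseg=1024)
    return sxx.mean(axis=1)  # average over time frames

def naive_song_similarity(audio_a, audio_b, sample_rate=22050):
    """Cosine similarity between two songs' average spectra."""
    a = mean_spectrum(audio_a, sample_rate)
    b = mean_spectrum(audio_b, sample_rate)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

Note what this discards: all temporal structure, which is precisely what makes real bird song comparison hard, and nothing here reflects how a bird actually perceives the sounds.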

We will describe our listening tests, based on training birds to perform A/B tests in modified bird feeders, and our plans to apply machine learning to these data so as to learn audio "embeddings" from the birds' responses (a rough sketch of this idea is given below). There are parallels with other work at QMUL, e.g. NLP work on "word embeddings", and we hope to explore these in discussion.
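As a rough sketch of how A/B responses might drive an embedding (our illustration under assumed framing, not the project's actual model): treat each trial as a triplet of a reference playback, the option the bird chose, and the one it rejected, then train a network so the chosen sound sits closer to the reference in embedding space. The PyTorch setup, the network architecture, and the 128-bin-by-64-frame spectrogram patches below are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class SongEmbedder(nn.Module):
    """Map a fixed-size spectrogram patch to a low-dimensional embedding."""
    def __init__(self, n_freq_bins=128, n_frames=64, embed_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_freq_bins * n_frames, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )

    def forward(self, spec):  # spec: (batch, n_freq_bins, n_frames)
        return nn.functional.normalize(self.net(spec), dim=-1)

model = SongEmbedder()
loss_fn = nn.TripletMarginLoss(margin=0.2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One hypothetical training step with random stand-in data:
# `anchor` is the reference playback; `chosen` and `rejected` are
# the two A/B options, labelled by the bird's response.
anchor, chosen, rejected = (torch.randn(32, 128, 64) for _ in range(3))
loss = loss_fn(model(anchor), model(chosen), model(rejected))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The appeal of such a formulation is that the similarity measure is learned directly from the birds' own choices rather than from human judgments of spectrogram appearance.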