Predicting Pollinator Sounds from Morphology: AI-driven integration of photogrammetry and bioacoustics for scalable insect monitoring

[Illustration: a photogrammetry-derived 3D model of a honeybee linked by a neural-network diagram to a sound waveform, wireframe bee, and a spectrogram labelled "Bioacoustic Data", depicting the project's aim of using digital models and deep learning to predict insect flight sounds.]
Project Description

Pollinators are vital to global biodiversity and food security but are declining due to climate change, habitat loss, and pesticide exposure. Acoustic monitoring offers a non-invasive way to track pollinators in the field, yet most insect species lack recorded sounds, limiting the reach of this method. This PhD will develop an AI-powered framework to predict and identify insect flight and pollination sounds from morphology, enabling scalable, automated biodiversity monitoring.

Using high-resolution 3D models of insect specimens generated through photogrammetry, the student will quantify morphological traits such as wing shape, venation, and body size. These data will be linked to flight and pollination sounds collected under controlled and field conditions. Deep learning models will then be trained to predict acoustic signatures from morphology, allowing inference of sounds for species with no existing recordings.

The student will explore:
    •    Building and analysing a 3D morphological dataset of diverse bee and hoverfly species.
    •    Recording and characterising insect flight and buzz pollination acoustics.
    •    Developing and validating AI models linking morphology and sound production.
    •    Applying models to improve species detection in passive acoustic monitoring networks.
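As a toy illustration of the morphology-to-acoustics mapping described above, a simple allometric regression can predict a flight-tone frequency for an unrecorded species from a measured trait. The numbers below are hypothetical placeholders, not project data, and the project itself would use deep learning over richer trait sets (wing shape, venation, body size) rather than a single-variable fit:

```python
import math

# Hypothetical measurements (illustrative only): wing length in mm and
# dominant flight-tone frequency in Hz. In the project, morphology would
# come from photogrammetry models and frequencies from recordings.
wing_length_mm = [6.0, 8.0, 10.0, 12.0, 15.0, 20.0]
flight_tone_hz = [290.0, 240.0, 210.0, 190.0, 165.0, 135.0]

def fit_loglog(x, y):
    """Ordinary least squares on log-transformed data: log(y) = a + b*log(x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) \
        / sum((u - mx) ** 2 for u in lx)
    a = my - b * mx
    return a, b

def predict_hz(a, b, wing_mm):
    """Infer a flight-tone frequency for a species with no recordings."""
    return math.exp(a + b * math.log(wing_mm))

a, b = fit_loglog(wing_length_mm, flight_tone_hz)
estimate = predict_hz(a, b, 9.0)  # species with 9 mm wings, no recordings
```

The negative fitted exponent reflects the expected trend that larger-winged insects beat their wings more slowly, producing lower flight tones; the deep learning models in the project generalise this idea to many traits and full acoustic signatures.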

This project sits at the intersection of ecology, bioacoustics, and machine learning, offering training in computational modelling, 3D imaging, and AI. The outcomes will advance scalable monitoring tools for pollinator conservation, enabling us to “hear” the species that science has yet to record.

Project Specific Training

The student will receive interdisciplinary training in photogrammetry, insect bioacoustics, and artificial intelligence. They will learn 3D reconstruction, acoustic data collection and analysis, and machine learning for predicting bioacoustic features from morphology. Training will be delivered through one-to-one supervision, hands-on lab sessions in insect imaging and sound recording, and computational tutorials on Python-based deep learning workflows. Collaboration with museum curators and AI researchers will provide additional specialist training in specimen handling, data management, and model validation. The student will also attend external workshops on bioacoustics and ecological AI to strengthen technical and analytical skills.

Potential Career Trajectory

This project will equip the student with skills in AI, bioacoustics, and 3D imaging—highly transferable across academia, industry, and conservation. Academic pathways include research and teaching in ecology, computational biology, machine learning, or environmental data science. Beyond academia, the student will be well-prepared for roles in biodiversity monitoring, environmental consultancy, agricultural technology, and ecological data analysis. The integration of photogrammetry, AI, and sound analysis also provides a strong foundation for careers in computer vision, robotics, and sensor-based monitoring industries, as well as in governmental and NGO sectors focused on pollinator conservation and environmental policy development.

Project supervisor/s
Rachel Parkinson
Biology
Queen Mary University of London
r.parkinson@qmul.ac.uk
Madeleine Ostwald
Biology
Queen Mary University of London
m.ostwald@qmul.ac.uk