Dronespot is a high-precision 3D mapping app helping marine researchers geolocate sea turtles using drone footage and GPS logs with automated synchronisation.
Design and production by Wolf in Motion for Queen Mary University of London, 2023.
Streamline the tedious, error-prone process of manually linking drone video to GPS data for turtle surveys.
A specialised desktop application that synchronises telemetry with video for accurate 3D geolocation of marine life.
Increased detection counts by 3x and reduced training time, allowing junior personnel to conduct surveys at scale.
Researchers at Queen Mary University of London (QMUL) developed an innovative method for surveying sea turtles using drone footage. While flying drones on grid paths allowed for rapid data collection, the subsequent analysis presented a manual bottleneck. Dronespot was created to bridge the gap between raw video and GPS logs, transforming a tedious spreadsheet-based task into a streamlined, high-precision 3D workflow. By automating synchronisation and geolocation, the tool enables NGOs and students to process vast amounts of footage with scientific accuracy. The software provides a robust bridge between field-collected telemetry and the vital conservation data required to protect endangered species.
The primary obstacle facing the QMUL team was a total lack of automation. In the initial research phase, analysts had to play back video, pause on specific frames where a turtle was spotted, and then manually calculate GPS coordinates using complex pixel-measure formulas within spreadsheets. This workflow was not only slow but also heavily prone to human error. It required a deep understanding of camera sensor specifications, monitor resolutions, and frame rates — knowledge that made it impossible to scale the project beyond a few highly specialised researchers.
Technological hurdles were equally daunting. Consumer drones rarely record a precise ‘synchronisation frame’ that matches the video start time to the telemetry log. Furthermore, GPS data is typically logged at intervals of one to seven points per second, whereas video is recorded at 30 or 60 frames per second. This temporal mismatch meant that any manual calculation was, at best, a rough estimate. For scientists attempting to track animals across overlapping flight paths, this lack of precision led to frequent double-counting, undermining the integrity of the population census. The challenge was to create a tool that could resolve these technical discrepancies while being intuitive enough for non-experts to use in the field.
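The temporal mismatch described above is typically bridged by interpolating between GPS fixes. Dronespot's actual code is not public; purely as an illustrative sketch, the following Python snippet estimates the drone's position at any frame time by linearly interpolating between the two surrounding log points (function and variable names are hypothetical):

```python
from bisect import bisect_right

def interpolate_position(log, t):
    """Estimate a (lat, lon) position at video time t (seconds) from a GPS
    log sampled at a lower rate than the video frame rate.
    `log` is a list of (timestamp, lat, lon) tuples sorted by timestamp."""
    times = [p[0] for p in log]
    i = bisect_right(times, t)
    if i == 0:
        return log[0][1], log[0][2]          # before the first fix: clamp
    if i == len(log):
        return log[-1][1], log[-1][2]        # after the last fix: clamp
    (t0, lat0, lon0), (t1, lat1, lon1) = log[i - 1], log[i]
    f = (t - t0) / (t1 - t0)                 # fraction between the two fixes
    return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)

# A 1 Hz log queried at a 30 fps frame time (0.5 s = frame 15):
log = [(0.0, 16.000, -22.900), (1.0, 16.001, -22.901)]
print(interpolate_position(log, 0.5))   # halfway between the two fixes
```

A real implementation would also have to account for the synchronisation offset between the video clock and the telemetry clock, which is exactly what the Sync step addresses.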
Our intent was to transform this complex data problem into a reliable, professional tool that prioritised scientific rigour without sacrificing user experience. We began by mapping the intricate ‘data flows’ required to unify telemetry from various drone models. Our point of view was that the software should not just be a calculator, but a virtual environment that mirrors the physical flight.
To address the technical lag inherent in H.264 video playback, we developed an automatic resynchronisation feature. This corrected timing errors often caused by GPU performance fluctuations, ensuring the telemetry stayed ‘locked’ to the visual frame. We reverse-engineered the drone’s lens projections, effectively allowing the software to recreate the drone’s flight path in a 3D space at a 1:1 scale with reality. This spatial approach meant that when a user saw a turtle on screen, the software already ‘knew’ exactly where that point was in the physical world.
Iteration was key to refining the interface. We worked closely with the researchers to understand the biological data they needed to capture. This led to the development of ‘User Defined Attributes Templates’, allowing the software to be flexible for different types of marine surveys. We also implemented 3D camera tracking algorithms to interpolate missing GPS points, removing the ‘noise’ often found in raw drone logs and providing a smooth, accurate representation of the drone’s position at every millisecond of the video.
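The 3D camera tracking algorithms themselves are not described here, but the simplest form of the de-noising step they replace can be sketched. Purely as an illustration (this is not Dronespot's method), a centred moving average over one telemetry channel suppresses the jitter found in raw drone logs:

```python
def smooth(values, window=5):
    """Centred moving average over a telemetry channel (e.g. a list of
    latitudes). The window is clamped at the ends of the series so no
    points are dropped."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# A single-sample spike is spread out and damped:
print(smooth([0, 0, 10, 0, 0], window=3))
```

A production tool would more likely use a model-based filter (a Kalman filter is the common choice for flight telemetry), since a plain moving average also blurs genuine manoeuvres.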
Dronespot is a desktop application designed around a linear, four-step workflow: Set Flight, Select Data, Sync Video, and Locate. This structure ensures that even users with minimal technical training can produce high-quality scientific data. The application includes a comprehensive database of drone profiles, enabling the software to automatically account for specific sensor dimensions and field of view (FOV) settings.
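The field of view a drone profile stores can be derived from its sensor geometry via the standard pinhole relation. As a sketch (the profile values below are hypothetical examples, not entries from Dronespot's database):

```python
import math

def horizontal_fov(sensor_width_mm, focal_length_mm):
    """Horizontal field of view (degrees) from a drone profile's sensor
    width and lens focal length: FOV = 2 * atan(w / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A hypothetical 1-inch-class sensor (13.2 mm wide) behind an 8.8 mm lens:
print(round(horizontal_fov(13.2, 8.8), 1))   # → 73.7
```

Storing sensor dimensions and focal length per profile, rather than a single FOV number, lets the software derive horizontal, vertical, and diagonal FOV consistently for any aspect ratio.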
The core of the application is the Sync Video interface. Here, analysts use a dual-curve graph that overlays telemetry data — such as altitude, compass heading, or gimbal pitch — against the video timeline. By aligning a visual cue in the video (like the moment of take-off) with a spike in the telemetry data, the user creates a perfect temporal link.
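The alignment step can be reduced to finding a single time offset. As a simplified sketch (not Dronespot's implementation), the snippet below matches the first altitude sample above a threshold, i.e. take-off in the log, against the frame where the user sees the drone lift off:

```python
def sync_offset(telemetry, takeoff_frame, fps, alt_threshold=0.5):
    """Offset (seconds) to add to video time to get telemetry time.
    `telemetry` is a list of (timestamp, altitude_m) tuples; `takeoff_frame`
    is the frame where the user marks the visual take-off cue."""
    takeoff_log_time = next(t for t, alt in telemetry if alt > alt_threshold)
    takeoff_video_time = takeoff_frame / fps
    return takeoff_log_time - takeoff_video_time

# Drone lifts off at t = 2 s in the log, seen at frame 30 of a 30 fps video:
telemetry = [(0, 0.0), (1, 0.0), (2, 1.2), (3, 5.0)]
print(sync_offset(telemetry, takeoff_frame=30, fps=30))   # → 1.0
```

With that offset, every frame time maps to a telemetry time, which is what makes the interpolated per-frame positions trustworthy.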
In the final Locate phase, the user simply clicks on a turtle within the video frame. Dronespot then calculates the exact latitude and longitude, the animal’s orientation, and its relative offset. To ensure data integrity, we implemented a ‘sticky pin’ system: if a pin remains perfectly placed over a turtle as the video plays, the user has immediate visual confirmation that the synchronisation is accurate. This removes the guesswork that defined the previous manual methods.
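The pixel-to-coordinate step can be illustrated with a deliberately simplified model: a nadir (straight-down) camera over flat ground, with a small-offset metres-to-degrees conversion. Dronespot's real projection handles gimbal pitch and full lens geometry; everything below, including the function name, is an illustrative assumption:

```python
import math

def locate(px, py, width, height, fov_h_deg, altitude_m,
           drone_lat, drone_lon, heading_deg):
    """Project a clicked pixel in a nadir video frame to latitude/longitude.
    Flat-earth sketch only: assumes the camera points straight down."""
    # Ground width covered by the frame, from altitude and horizontal FOV.
    ground_w = 2 * altitude_m * math.tan(math.radians(fov_h_deg) / 2)
    metres_per_px = ground_w / width
    # Offset from the frame centre, in metres (image y grows downward).
    dx = (px - width / 2) * metres_per_px     # to the drone's right
    dy = (height / 2 - py) * metres_per_px    # ahead of the drone
    # Rotate by the compass heading into east/north components.
    h = math.radians(heading_deg)
    east = dx * math.cos(h) + dy * math.sin(h)
    north = dy * math.cos(h) - dx * math.sin(h)
    # Metres to degrees, valid for small offsets.
    lat = drone_lat + north / 111_320
    lon = drone_lon + east / (111_320 * math.cos(math.radians(drone_lat)))
    return lat, lon
```

Clicking the exact frame centre returns the drone's own position; clicking above centre with a northward heading yields a point north of the drone, which is also the intuition behind the ‘sticky pin’ check: if the projection is right, the pin tracks the turtle as both move through the frame.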
The shift from manual spreadsheets to Dronespot has fundamentally changed the capacity for marine research at QMUL. The impact is most visible in the efficiency of data collection. In controlled trials, participants using Dronespot detected up to 3 times as many turtles as those using manual methods within the same timeframe. Specifically, Dronespot users reached an average detection rate of 2.4 turtles per minute, compared to the labour-intensive manual process, which struggled to record coordinates at even a fraction of that speed.
The software maintained a high precision metric of 0.87, ensuring that the increased speed did not compromise the accuracy required for peer-reviewed research. Perhaps most importantly, the training time for the software was significantly reduced. Training for Dronespot took, on average, six minutes and 14 seconds — nearly three minutes faster than the manual protocol. This allowed the task of data analysis to be delegated to Bachelor and Master students, effectively removing the specialist bottleneck.
The qualitative impact of this project is felt most strongly by the NGOs working on the ground in regions like Cape Verde. By providing a tool that makes drone surveys accessible, Dronespot empowers local conservationists to monitor endangered populations with professional-grade tools. The results of these trials were submitted for scientific publication in early 2024, marking Dronespot as a validated, paradigm-shifting tool in the field of marine biology. Through this combination of 3D technology and ecological research, we have helped create a sustainable path for protecting marine life across the globe.
Images: Kevin Neves