Think about every decades-old war movie where the guys on submarines send out sonar pings. They then listen intently for a single echo to come back, using their ears to locate enemy targets they can’t see with their eyes. Now, think about layering the power of modern machine learning atop that technology, and then using it to protect and rebuild the world’s coral reefs.

That’s the essence of what marine researchers are doing right now—and they hope that boaters will help them do a whole lot more of it in the future.

Marine researchers are working with citizen scientists all around the planet to combine what we know about underwater life with the capabilities of Google Research and DeepMind. The work is creating an unprecedented understanding of how to protect and rebuild coral reefs.

“We can actually listen to these habitats and learn about them,” says marine biologist Ben Williams at University College London. “We’re taking what other generations discovered and applying AI to see what we can learn.”

For the past 30 years or so, researchers have been working with bioacoustics, says Steve Simpson, a professor at the University of Bristol School of Biological Sciences. They’ve been listening to individual animals, working out how they communicate, what sounds they might make to attract a mate or scare away a predator—focusing on specifics like whale songs. That research led to an understanding that many animals use the whole soundscape to interpret their environment.

Underwater, sound travels farther than light. It travels in all directions, irrespective of currents. “The ecological soundscape carries huge amounts of information for these animals,” Simpson says. “We realized that we should be tapping into this encyclopedia of life ourselves.”

Having the idea, though, is entirely different from executing it. The first problem was the recording devices themselves, which used to be cumbersome and pricey. “They’ve gotten cheaper and easier to deploy, and we can leave them down for weeks or months, so we can start to see the patterns,” Simpson says. “We know what a healthy reef sounds like. We know what nighttime sounds like.”

Until fairly recently, listening to all those audio recordings was the work of students with headphones. But that takes an enormous amount of time—as long as a day to analyze a single hour of audio accurately. “Humans can only analyze a fraction of the data that we’re able to collect,” Williams says. But AI? Researchers fed the system six months’ worth of recordings. It was “far more than any human could listen to in a lifetime of work,” he says. “AI found 100,000 fish sounds.”
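To make that scale concrete, here is a minimal sketch of automated sound-event detection in a reef recording. It is only an illustration: the file name, frequency band, and threshold are assumptions, and it uses a simple energy test where the researchers’ system relies on a trained model (SurfPerch, described below).

```python
# Minimal sketch of automated sound-event detection in reef audio.
# The file name, frequency band, and threshold are illustrative
# assumptions, not the researchers' actual pipeline.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

rate, audio = wavfile.read("reef.wav")  # hypothetical recording
if audio.ndim > 1:                      # keep one channel if stereo
    audio = audio[:, 0]
audio = audio.astype(np.float64)
audio /= np.max(np.abs(audio))          # normalize to [-1, 1]

# Band-pass to roughly 100-1000 Hz, where many fish calls sit.
sos = butter(4, [100, 1000], btype="bandpass", fs=rate, output="sos")
band = sosfilt(sos, audio)

# Slice into 50 ms frames and flag frames whose energy jumps well
# above the recording's median: a crude stand-in for a classifier.
win = int(0.05 * rate)
frames = band[: len(band) // win * win].reshape(-1, win)
energy = (frames ** 2).mean(axis=1)
threshold = 5 * np.median(energy)

for i, e in enumerate(energy):
    if e > threshold:
        print(f"candidate fish sound at {i * 0.05:.2f} s")
```

Even this crude version shows why the machine wins on volume: a script can sweep months of audio in minutes, flagging candidate moments for a person, or a trained classifier, to confirm.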

With that capability sorted out, researchers next started to build a collective of people all around the world to help train AI. Google helped the researchers create a platform named Calling in our Corals, so volunteers can log on and learn how to listen, then click every time they hear a fish sound. Having all kinds of people participate from all around the world is important, Simpson says, because no two humans are the same when it comes to hearing.

“Older people hear less of the high-frequency sounds,” he says. “And the language we speak, the environment we live in—all of our hearing is different. By bringing in everybody’s ears, we’ve created this uber-human collective where we can hear things that individuals might miss.”
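One way to picture that collective at work: when several volunteers mark the same moment in a clip, their clicks can be pooled so a sound becomes a training label only once enough ears agree. The sketch below is hypothetical (the article doesn’t describe the platform’s actual aggregation rules), and the one-second bins and three-vote threshold are assumed parameters.

```python
# Hypothetical sketch of pooling volunteer clicks into training labels.
# Each volunteer submits timestamps (in seconds) where they heard a
# fish sound; a moment counts only if enough listeners agree on it.
from collections import Counter

def aggregate_labels(volunteer_clicks, bin_size=1.0, min_votes=3):
    """Group clicks into time bins and keep bins with enough agreement."""
    votes = Counter()
    for clicks in volunteer_clicks:
        # Count each volunteer at most once per bin.
        for b in {int(t // bin_size) for t in clicks}:
            votes[b] += 1
    return sorted(b * bin_size for b, n in votes.items() if n >= min_votes)

clicks = [
    [2.1, 2.3, 14.8],   # volunteer A
    [2.4, 14.7, 30.2],  # volunteer B
    [2.2, 14.9],        # volunteer C
]
print(aggregate_labels(clicks))  # -> [2.0, 14.0]
```

Requiring agreement cuts both ways: it filters out one listener’s stray misclick, while pooling many listeners catches sounds that any single pair of ears, young or old, might miss.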

Calling in our Corals is still an active project seeking volunteers, while the AI layer that researchers have added atop it is called SurfPerch.

“SurfPerch is giving us an opportunity to ask questions that we’ve never even considered before,” Simpson says. “That is the power of AI for interpreting soundscapes.”

At this point, researchers have discovered that the sounds a coral reef emits are like an early-warning system. Audio of a reef can signal that something is going wrong, or going right, long before human eyes can see it.

“So much of restoration is with animals that are difficult to find, to spot, to identify, or that are active only at night, but they make up a lot of that biodiversity recovery,” Simpson says. “Not only are these animals there and making sounds, but those sounds are the natural cues to other animals—the crabs and the clams. They say it’s OK to come and live and make this their home.”

What the researchers need next is more reef recordings, and more people to help train the AI. For the former, boaters can spend $150 to purchase a HydroMoth recorder (about the size of a GoPro) and deploy it on reefs they return to year after year, or on reefs in remote places that otherwise would never be recorded.

Boaters also can help train the AI by logging in and listening through the Calling in our Corals website.

“We find that recreational and professional sailors have that interest, but also generally a pretty good problem-solving mind,” Simpson says. “When you put those two things together, it’s a great community of people who can contribute.”

This article was originally published in the September 2024 issue.