How scientists could use AI to understand whales

From PBS NewsHour.

As artificial intelligence programs like ChatGPT and Midjourney take off for personal use, scientists are also looking to machine learning to aid in their research. One example is Project CETI (Cetacean Translation Initiative), which aims to decode the communication of whales with help from AI.

Sperm whales have the largest brains of any animal on Earth and are among the most socially intelligent species, producing creaking and clicking sounds that can reach roughly 230 decibels. In addition to using these vocalizations for echolocation while hunting, experts think sperm whales make shorter, distinctly rhythmic patterns of clicks, called “codas,” to socialize with one another.

Large language models require huge datasets to function. The creators of GPT-3 scraped countless books and websites for human language samples, but the CETI team needed to start from scratch. Focusing on a few hundred sperm whales off the coast of Dominica in the eastern Caribbean Sea, researchers have deployed an array of autonomous buoys, underwater robots and cameras to track whale sounds alongside environmental and behavioral data.

With enough data, the team hopes their proposed “whale language model” will be able to synchronize all of this information and detect which codas are heard in conjunction with specific behaviors.
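At its simplest, detecting which codas co-occur with which behaviors is a matter of tallying time-aligned observations. The sketch below is a minimal illustration of that idea, not Project CETI's actual method; the coda and behavior labels are hypothetical stand-ins for whatever categories the researchers annotate.

```python
from collections import Counter

def coda_behavior_counts(observations):
    """Tally how often each coda type co-occurs with each behavior.

    `observations` is a list of (coda_label, behavior_label) pairs,
    assumed to come from time-aligned audio and behavioral records.
    """
    counts = Counter()
    for coda, behavior in observations:
        counts[(coda, behavior)] += 1
    return counts

# Hypothetical annotated observations
obs = [
    ("coda_A", "diving"),
    ("coda_A", "diving"),
    ("coda_B", "socializing"),
]
counts = coda_behavior_counts(obs)
```

A real pipeline would of course work from raw acoustic and sensor streams and use statistical models rather than raw counts, but the underlying question is the same: which sound patterns reliably accompany which behaviors.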

“Can technology bring us closer to nature? That’s the hypothesis that I’m leaving open,” marine biologist David Gruber, founder of Project CETI, told the PBS NewsHour. The team’s goal is not necessarily to talk back to whales, but to learn what they might have to say about themselves, their families and the world we share with them.
