

Introduction by Carl Safina
When we speak of language, we usually picture words—letters on a page, or sounds strung together in conversation. But at its heart, language is not about words at all. It is about connection. It is the bridge between one mind and another, a way to share what we know, what we feel, and what we dream.
For much of human history, we have believed that bridge belongs to us alone. Animals, we thought, could cry out in alarm or signal hunger, but they lacked the structured, symbolic systems we call “language.” We believed ourselves set apart. Yet that certainty is now eroding, not because animals have changed, but because our tools for listening have.
Artificial intelligence is revealing patterns in animal communication that we had long overlooked. Dolphins use signature whistles like names, and non-signature whistles to express surprise or caution. Whales weave songs that ripple across oceans and generations. Parrots identify objects and concepts with startling clarity. Even cuttlefish, solitary and ancient, use a repertoire of gestures that look remarkably like a visual code.
These discoveries invite us into a new way of seeing the world. They ask us to stop assuming silence where there is only difference. They remind us that communication need not resemble ours to be meaningful. Color, scent, gesture, rhythm—these too are languages, just ones tuned to senses and environments not our own.
To listen in this way requires humility. It means acknowledging that the living world is not a backdrop for human drama, but a chorus of voices, many of which we are only now beginning to hear. It means accepting that the line we drew between “us” and “them” may not be a wall at all, but a thin veil—one technology and compassion can begin to lift.
This roundtable you are about to enter explores five dimensions of that unfolding conversation: the very definition of language, the role of AI, the moral responsibilities of listening, the richness of multisensory communication, and the question of who we will “talk to” first. Together, they offer a glimpse of what may be the most profound transformation of our time: learning that we are not the only storytellers on Earth.
(Note: This is an imaginary conversation, a creative exploration of an idea, and not a real speech or event.)

Topic 1: What Counts as Language?

Moderator (Nick Sasaki):
For centuries, humans have insisted that language belongs to us alone. But with AI now revealing complex signals in dolphins, whales, cuttlefish, and birds, the question has grown urgent: what exactly counts as language? Let’s start there.
First Question:
Is animal communication truly “language,” or are we simply stretching the definition when we compare it to human speech?
Irene Pepperberg:
When I worked with Alex, my African grey parrot, he didn’t just mimic sounds—he demonstrated understanding. He could identify colors, numbers, and even concepts like “same” and “different.” If that isn’t language, then we need to admit our definition is too narrow. No, he wasn’t building Shakespearean sonnets, but he was using symbolic representation and syntax. I believe animals are showing us fragments of what we call language, though not in the human sense of infinite recombination.
Steven Pinker:
I’ll push back here. Language is not simply communication—it’s a computational system of symbols and rules that allows humans to generate an infinite variety of expressions, including abstractions, hypotheticals, and counterfactuals. While Alex was remarkable, and dolphins may use whistles to identify each other, these are closer to communication codes, not true language. Calling it “language” risks confusing analogy with equivalence.
Laela Sayigh:
I understand Steven’s caution, but when I listen to dolphins, I can’t help but think we’re witnessing something more than simple signaling. Their signature whistles function like names. Their non-signature whistles can express surprise or warning. When combined with pitch-shifting and vocal learning, you start to see a repertoire that resembles components of language. Perhaps not complete in the human sense, but certainly more than “codes.”
Noam Chomsky:
The distinction here is essential. Human language is unique because it exhibits recursion—the ability to embed phrases within phrases, creating infinite possibilities. So far, no animal system has shown recursion. However, Irene is right that by narrowing our definition too strictly, we may blind ourselves to fascinating communication systems that, while not “language” per se, reveal cognition. We must appreciate both the uniqueness of human language and the richness of animal signals without collapsing them into one.
Yossi Yovel:
I think we’re debating less about animals and more about our own pride. If you define language as exactly human grammar, then no animal has it. But if you see language as a continuum—sounds, gestures, colors, timing—then animals clearly participate in it. The bats I study use specific calls in social coordination that mirror aspects of conversation. Maybe it’s not about proving animals have “our” language, but about redefining what language can mean.
Second Question:
If we broaden the definition, what elements are necessary before we can call something language rather than just communication?
Noam Chomsky:
You need structure. Vocabulary alone isn’t enough. The power of human language lies in syntax, in rules that allow generative creation. Without it, you’re left with signals. For animals, we’ve yet to see grammar in this sense. That’s why I remain firm: communication, yes—language, no.
Laela Sayigh:
But dolphins do show structure. In Sarasota Bay, we see patterns in whistles that repeat across individuals and contexts. When we analyze them with AI, the statistical patterns resemble linguistic organization. Maybe not recursion, but rules nonetheless. If a whistle conveys “What was that?” and another conveys a warning, and they combine in sequences, is that not edging toward syntax?
Irene Pepperberg:
I’d add semantics: the mapping of signals to meanings. Alex wasn’t just saying sounds, he was linking words to objects, numbers, even intentions. When he said “wanna go back,” it wasn’t mimicry—it was expressing desire. That is a level of abstraction, a cornerstone of language.
Yossi Yovel:
Theory of mind is critical. If an animal communicates with the understanding that another individual has different knowledge—and tries to change that knowledge—that’s language-like. Some animals clearly do this. Orangutans that delay alarm calls to “refer” to a past predator event are showing displacement, a core feature of human language.
Steven Pinker:
But we mustn’t dilute the concept. Otherwise, any complex signaling system becomes “language.” Bee dances? Electric fish shocks? These are fascinating, yes, but we risk trivializing the uniqueness of human language if we keep expanding the definition. I think it’s more honest to say animals have communication systems with language-like features—but not language.
Third Question:
Suppose one day AI proves that dolphins, whales, or birds do have full-fledged language. How would that change our relationship with them—and our view of ourselves?
Irene Pepperberg:
It would transform ethics overnight. Once you admit animals speak in structured ways, ignoring their voices becomes morally untenable. Imagine dolphins pleading about noise pollution or whales protesting captivity. We’d be forced to recognize them as fellow communicators, not commodities.
Laela Sayigh:
For me, it would mean respect. We’d have to acknowledge that dolphins have cultural knowledge, traditions, even conversations that matter. It would force humans to confront our arrogance—that we alone have voices worth hearing.
Steven Pinker:
I’d caution again: let’s not romanticize. Even if dolphins have rich communication, it doesn’t necessarily mean they share human-like consciousness or morality. We could discover structure, yes, but that doesn’t make them little philosophers. The challenge is not to anthropomorphize.
Yossi Yovel:
But Steven, if we find that dolphins or whales refer to absent individuals, or share information about the past and future, that implies mental time travel, culture, and shared intentionality. Those are deeply human-like. AI is pushing us to confront possibilities we once dismissed as fantasy.
Noam Chomsky:
If animals do have language in any comparable sense, it would not diminish humanity—it would enrich our understanding of life. We would see intelligence not as a single peak with humans on top, but as a range of mountains. Each species may have its own form of “language,” shaped by evolution for its environment. That recognition would expand, not reduce, what it means to be human.
Moderator (Nick Sasaki):
What I hear from you all is a spectrum of possibility. Some hold fast that human language, with recursion and infinite generativity, remains unique. Others argue animals show fragments—semantics, displacement, theory of mind—that demand we widen the definition. AI is blurring the line, forcing us to ask whether the barrier is real or just our pride.
And perhaps the bigger truth is this: whether or not we call it “language,” the act of listening—truly listening—to other beings may reshape our ethics, our science, and our place in the world.
Topic 2: The AI Translator — Hope or Illusion?

Moderator (Nick Sasaki):
Artificial intelligence is transforming the study of animal communication. We’ve seen sperm whale clicks decoded into “phonetic alphabets,” dolphins’ whistles analyzed across decades, and even cuttlefish gestures classified. But the central question remains: is AI really the key to translating animal language, or are we chasing illusions?
First Question:
What is AI doing for animal communication research that humans simply could not achieve on their own?
Frants Jensen:
Scale, plain and simple. When we study whales, we’re dealing with tens of thousands of hours of recordings. Humans can’t parse that volume, but AI thrives on it. We can now detect repeating features, classify them, and look for statistical structure. AI doesn’t tire or overlook patterns. Without it, projects like long-term dolphin datasets or sperm whale clicks would be impossible to analyze meaningfully.
Fei-Fei Li:
I’d add another dimension: AI can reveal hidden patterns. In computer vision, deep learning uncovered features no human had coded for, and the same holds for acoustic data. Sometimes, AI highlights structures we didn’t even suspect were there. That opens doors to hypotheses we’d never imagine on our own.
David Gruber:
With Project CETI, we’ve seen this first-hand. Sperm whale clicks sounded like noise for decades. But using machine learning, we found 156 distinct patterns that behave like a phonetic alphabet. AI made it visible. More recently, we’ve identified vowel-like qualities in whale clicks, something only possible through advanced computational analysis.
Peter Neri:
The same is true for cuttlefish. We filmed hours of interactions and couldn’t reliably detect patterns. Only when we ran computer vision algorithms did the four main gestures—“up,” “side,” “roll,” and “crown”—emerge clearly. AI gave us a systematic way to categorize what had been too subtle for human eyes.
Yossi Yovel:
And importantly, AI democratizes the process. A graduate student with the right dataset can now contribute insights once limited to massive teams. It’s not just about speed, it’s about accessibility. But—and this is crucial—AI only gives us patterns. Meaning still requires humans.
Second Question:
What are the dangers of relying too heavily on AI for translation? Could it fool us into thinking we’ve cracked the code when we haven’t?
David Gruber:
Absolutely. AI can generate illusions of meaning. You give it enough data, it will always find patterns—even if they’re meaningless. The risk is mistaking correlation for communication. Just because two signals co-occur doesn’t mean they mean the same thing to the animals.
Peter Neri:
Exactly. For instance, our cuttlefish “crown” sign appears in contexts of unease. AI picked it up as a recurring pattern, but it was our interpretation—watching the animals change color, back away—that gave it meaning. Without context, the gesture is just geometry. AI doesn’t understand context.
Fei-Fei Li:
And context is the Achilles’ heel of all machine learning. These models are brilliant statistical mirrors, but they don’t “understand.” They reflect patterns. If our input is biased, or if we strip away environmental context, AI will give us elegant nonsense. That’s why collaboration between AI researchers and biologists is essential.
Yossi Yovel:
There’s also the issue of generalization. AI models often fail when conditions shift slightly—different water acoustics, new recording equipment, a new population. What works in Sarasota dolphins might collapse elsewhere. Without robustness, we risk thinking we’ve “translated” dolphin when we’ve only mapped Sarasota’s dataset.
Frants Jensen:
And let’s not forget the temptation to anthropomorphize. Once AI spits out a structure resembling a word or a sentence, people want to say, “Ah, the dolphins are speaking English!” That’s a fantasy. We need humility. AI is a microscope, not a magic translator.
Third Question:
If AI does succeed in producing a real “translator,” what would that look like, and how close are we?
Fei-Fei Li:
It won’t look like Google Translate. That analogy misleads people. Instead, imagine a probabilistic interface—a system that takes in signals and suggests possible meanings with confidence scores. It might tell us: “This dolphin whistle probably means ‘alert’ with 70% likelihood.” That’s closer to how machine learning operates.
Frants Jensen:
For whales, I see it as layered. First, AI helps us identify basic units—the clicks, whistles, or gestures. Next, it uncovers syntax-like rules. Finally, with long-term behavioral observation, we link signals to context. That’s decades of work. So yes, we’re closer, but not “a translator in five years” close.
David Gruber:
Project CETI’s dream is to get there, though. Imagine standing by a ship and hearing a sperm whale click sequence. The AI, cross-referenced with decades of data, tells you: “This is a greeting between males” or “This signals a nearby predator.” It’s not about perfect translation—it’s about crossing a threshold of comprehension where dialogue becomes possible.
Yossi Yovel:
My bet is that the first true AI translator will be for birds, not marine mammals. Jays, parrots, or nightingales—they’re social, vocal learners, easier to study in controlled conditions. Dolphins and whales are simply harder: deep oceans, long lifespans, complex societies. Birds may be our first breakthrough.
Peter Neri:
I’d broaden it even further. “Translator” might mean multisensory systems. For cuttlefish, AI might one day read a gesture, color change, and body ripple simultaneously and output, “Expression of alarm.” That’s translation in its own right, just not in words.
Moderator (Nick Sasaki):
So the picture is nuanced. AI is a microscope, revealing hidden order in the chaos of sound and gesture. But without human interpretation and context, the risk is mistaking patterns for meaning. Still, the dream persists: that one day AI will serve as our bridge into another species’ world. Whether it’s whales in the deep, dolphins in bays, or birds in forests, the translator may not “speak” for them—but it could finally let us listen.
Topic 3: Morality and Responsibility of Understanding Animals

Moderator (Nick Sasaki):
If AI helps us finally understand animal communication—whether from dolphins, whales, parrots, or primates—what happens next? The implications go far beyond science. Do we have new moral responsibilities once we can hear their voices? Let’s dive in.
First Question:
If we truly decode animal communication, how would it change the way humans treat them?
Peter Singer:
It would change everything. If animals can articulate thoughts, desires, or suffering in structured ways, we’d be forced to rethink their moral status. The case for extending rights—to freedom from captivity, from exploitation—would become undeniable. It would no longer be abstract suffering; it would be articulated by the animals themselves.
Jane Goodall:
I agree. Imagine hearing a chimpanzee describe the loss of a child, or a dolphin expressing fear of noise pollution. We already know they suffer, but hearing it in their “own words” would pierce the human conscience. At least, I hope it would. The tragedy, of course, is that even without translation, we already know enough to act with compassion—and yet we don’t.
Laela Sayigh:
With dolphins, I worry about captivity most. If we learn they have names, gossip, cultural traditions, how can we justify keeping them in tanks where their communication is stifled? It would be like locking a human in solitary confinement for life. Translation would magnify the injustice.
Carl Safina:
I’d emphasize continuity. Decoding language wouldn’t suddenly give animals moral worth; they already have it. What it might do is open more human hearts. People respect what they can relate to. Hearing dolphins “talk” or whales “sing” in decipherable ways could trigger a new era of empathy.
Irene Pepperberg:
For me, the Alex experience showed this power. When people saw a parrot label colors and shapes, it shattered their assumptions. If translation technology amplifies that effect across species, society will have to change. But as Jane says, the question is: will humans have the courage to act on what they hear?
Second Question:
Some argue that giving animals “language” elevates them to our level. Others worry that this is human arrogance—that animals don’t need to “speak” like us to have value. How do you see it?
Carl Safina:
We must be careful here. The danger is making human-style language the gold standard. Dolphins don’t need to speak in paragraphs to be intelligent, nor do elephants. They have their own ways of knowing. Translation should expand our respect, not become a test they must pass to “deserve” dignity.
Irene Pepperberg:
Exactly. Alex didn’t need to master English grammar to show intelligence. Even fragments of communication reveal inner life. Forcing animals into our mold is unfair. We must appreciate the uniqueness of their systems—songs, whistles, gestures, colors.
Peter Singer:
But there is a pragmatic dimension. While I agree value doesn’t depend on human-style language, society often responds only when it sees reflection of itself. Translation could be that mirror. It shouldn’t be necessary, but it may be the lever for change.
Jane Goodall:
I’ve seen this for decades. When people look into a chimpanzee’s eyes, they often feel kinship. Words may deepen that, but they aren’t required. Yet, as Peter says, words could finally compel policymakers, who too often ignore empathy without data.
Laela Sayigh:
And sometimes it’s not about elevating animals to “our level,” but about lowering the wall. If AI shows us that dolphins combine whistles into phrases, or refer to absent individuals, then the wall separating “us” from “them” begins to crumble. That’s the power of this work.
Third Question:
If animals can “speak,” what responsibilities do we as humans have in listening—and how should society prepare?
Jane Goodall:
Responsibility begins with humility. We must approach this not as conquerors unlocking a code, but as guests entering another culture. Listening means respecting their context, their needs, and their right to exist free from exploitation. If we ignore their voices after hearing them, the betrayal would be immense.
Peter Singer:
I would argue for legal frameworks. If dolphins or whales can be shown to have language-like systems, we should recognize them as “non-human persons.” This means protection under law, just as we extend rights to humans who cannot speak for themselves.
Carl Safina:
And I’d stress education. Imagine schoolchildren learning not just “the water cycle” but also “the whale songs.” If society grows up hearing animals as communicators, not resources, then exploitation will wither. Responsibility must be cultural, not just legal.
Laela Sayigh:
Practically, responsibility means changing how we study and interact. For dolphins, it may mean banning sonar that drowns out their conversations, or ending dolphinariums. If we discover their communication is as rich as we suspect, continuing such practices would be unconscionable.
Irene Pepperberg:
And responsibility also includes not misrepresenting them. AI could tempt us into easy soundbites—“the dolphin said hello.” But we must resist simplification. Our duty is accuracy, even when it’s messy or incomplete. Listening means patience, nuance, and honesty.
Moderator (Nick Sasaki):
So it seems the consensus is this: decoding animal communication is not merely a scientific achievement—it’s a moral turning point. Whether through parrots naming colors, dolphins calling each other by signature whistles, or whales synchronizing clicks, each breakthrough forces us to see animals not as resources, but as fellow communicators.
And the burden shifts to us. Will we grant them dignity, protection, and freedom, or will we, having finally heard their voices, still choose to silence them? That may be the greatest ethical test of our generation.
Topic 4: Multisensory Communication — Beyond Words and Sounds

Moderator (Nick Sasaki):
We often think of language as sound—words, whistles, clicks. But animals communicate in far richer ways: cuttlefish with colors and gestures, orangutans with timing of calls, birds with song sequences, even fish with electric pulses. Could multisensory systems expand our definition of communication—and even reshape our own ideas of language?
First Question:
How do non-vocal signals—like gestures, colors, or timing—change the way we think about animal communication?
Sophie Cohen-Bodénès:
Cuttlefish taught me this lesson. They wave arms, shift colors, and ripple skin all at once. When we discovered their four main gestures—“up,” “side,” “roll,” and “crown”—it became clear they’re layering messages visually, not vocally. If we only listened for sound, we’d miss their entire language. Non-vocal signals remind us that communication is embodied, not just acoustic.
Marc Hauser:
I’d push that further. Even in primates, gestures often carry the weight of meaning. A chimpanzee pointing, or a monkey using a facial expression, can shift the context of a vocalization. These signals are combinatorial. They may lack syntax, but they reveal intentionality. To ignore them is to miss the heart of their communicative world.
Barbara Finlay:
And it’s not just about adding channels—it’s about brain architecture. Species like birds and primates map vocalizations in ways similar to humans. But when you add color shifts in cuttlefish or scent trails in mammals, you see how evolution builds multimodal systems. Their brains aren’t mimicking ours—they’re constructing parallel communication logics.
Adriano Lameira:
Orangutans showed me that timing itself can be a signal. When mothers delayed their alarm calls until after a predator had passed, they were referring to the past. That’s displacement, a linguistic feature we thought uniquely human. But here it came through timing, not words. Multisensory systems reveal that language-like traits may hide in unexpected modalities.
Irene Pepperberg:
Yes, and even in Alex’s case, tone of voice mattered. If he said “wanna go back” in a plaintive way, it carried different weight than a flat utterance. Animals weave meaning across sound, gesture, and emotion. If we reduce “language” to sound alone, we amputate half the message.
Second Question:
Do we need to broaden the definition of language itself to include these multisensory systems, or should they remain separate categories?
Marc Hauser:
I’d argue for broadening. Language is fundamentally about symbolic representation and intentional influence. Whether that symbol is a whistle, a gesture, or a flash of color, the function is the same. Restricting language to sound is arbitrary.
Irene Pepperberg:
I’m cautious. There’s value in distinguishing categories. Otherwise, everything becomes language—bee dances, scent trails, electric pulses. But we can still recognize these as “language-like” systems that expand the boundaries. It’s a matter of precision versus inclusiveness.
Barbara Finlay:
I think neuroscience pushes us to broaden. The brain doesn’t silo senses; it integrates them. For humans, too, language is not purely vocal—we gesture, we change tone, we use facial cues. Why deny animals the same complexity?
Adriano Lameira:
For me, displacement is key. If an orangutan delays a call to signal the past, that qualifies as language-like regardless of modality. Multisensory systems should be considered legitimate if they carry features like displacement, abstraction, or intentionality.
Sophie Cohen-Bodénès:
And it’s pragmatic. When I watch cuttlefish, separating gesture from color from ripple is meaningless—they’re all part of one package. If we want to decode their world, we must embrace multisensory “language.” Otherwise, we impose artificial silos.
Third Question:
If humans succeed in decoding multisensory communication, how might it reshape not just our view of animals, but our own understanding of language?
Barbara Finlay:
It would remind us that our own language is embodied. We already rely on gesture, gaze, rhythm, prosody. AI may one day show us that animals’ multisensory codes mirror our own reliance on layers beyond words. It would dissolve the illusion of “purely verbal” language.
Irene Pepperberg:
It could also humble us. If cuttlefish use a gesture-plus-color sequence to mean “unease,” or orangutans shift timing to signal the past, we’d see that animals have found parallel solutions to communication challenges. Our system is not the only one—just one among many.
Adriano Lameira:
And it could expand our imagination. Imagine designing human communication tools inspired by orangutan timing or cuttlefish colors. Language technology could evolve, not just translate. Perhaps one day we humans will “speak” in multisensory modes ourselves—visual, tactile, sonic—learning from animals.
Marc Hauser:
But it would also force us to confront the question of theory of mind. Do animals understand that their gestures or colors alter another’s knowledge state? If so, we’re no longer unique. That reshapes philosophy as much as science.
Sophie Cohen-Bodénès:
I’ll add this: once we learn to see multisensory communication, we’ll never look at animals the same way again. Every ripple in a cuttlefish, every delayed orangutan call, every coordinated bird note—suddenly, the world is alive with hidden messages. And we’ll wonder: how much have we been missing all along?
Moderator (Nick Sasaki):
So the message is clear: language may not be sound alone, but a symphony of signals—gestures, colors, timing, and tones. Whether we broaden the definition of language or hold it apart, the truth is undeniable: animals are speaking in more ways than we’ve ever imagined.
And perhaps the most profound shift will not be what we learn about them, but what we learn about ourselves—that we, too, are multisensory communicators who have forgotten how rich communication can be.
Topic 5: Who Will We Talk to First?

Moderator (Nick Sasaki):
We’ve explored definitions, AI tools, moral implications, and multisensory signals. But the question everyone secretly asks is the most exciting: which species will we actually speak with first? Let’s open that debate.
First Question:
Based on current knowledge and technology, which species is most likely to be the first we “talk” with—and why?
Laela Sayigh:
I’d argue for dolphins. We’ve studied them for decades, with incredibly detailed data from Sarasota Bay. We know individuals, family lines, and generations. Their whistles have signature functions, and AI has revealed shared non-signature signals too. The combination of rich data and social complexity makes dolphins prime candidates.
David Gruber:
I’ll put my stake on sperm whales. With Project CETI, we’ve already found their “phonetic alphabet” of click patterns, and now vowel-like qualities. These animals coordinate across oceans, synchronizing clicks like cultural transmissions. Decades of recordings exist, and with AI’s help, we’re edging closer. I believe whales will be our first breakthrough.
Irene Pepperberg:
I’d say parrots—particularly African greys and budgerigars. We already know they can mimic human words, but more importantly, they use them meaningfully. Alex, for instance, could express desire and even concepts like “same” and “different.” Birds are vocal learners, social, and accessible in labs—something whales and dolphins are not. That practicality may make them the first.
Luke Rendell:
From my perspective, humpback whales are contenders. We have decades of recordings tracking how songs spread across oceans like cultural waves. AI has shown statistical structures akin to human language in their songs. With so much longitudinal data, we may crack the grammar of whale song sooner than we think.
Yossi Yovel:
My money is on birds, but not parrots—jays or corvids. They’re social, cooperative, and incredibly intelligent. Unlike marine mammals, we can observe them in real-time, manipulate conditions, and collect precise datasets. Dolphins and whales are inspiring, but practically, a terrestrial bird may be the first to “talk back.”
Second Question:
What would that “first conversation” look like? Are we imagining words, or something very different?
David Gruber:
With whales, it won’t be words—it’ll be patterns of clicks. Perhaps AI will decode sequences like, “Predator nearby” or “Let’s move north.” The first “conversation” may be simple exchanges of coordination, not philosophy. But even that would be historic—proof we can engage.
Irene Pepperberg:
With parrots, it could look astonishingly familiar. Imagine a parrot asking for an object, or naming colors correctly. Alex already did this. The difference would be scaling it up to show it’s not just one gifted bird, but a species-wide capability. That would feel conversational to us.
Laela Sayigh:
For dolphins, I imagine something like shared whistles. They might respond when we mimic their signature whistle, or answer with a non-signature whistle meaning “What was that?” The first conversation might be us asking a question—clumsily—and them recognizing our attempt. It may feel more like play than dialogue.
Yossi Yovel:
Birds may surprise us. A jay could respond to a call we simulate, altering pitch or timing to match. The “conversation” might be turn-taking—call, response, variation. It may not carry propositional content, but it would show reciprocal exchange, which is the essence of dialogue.
Luke Rendell:
With whales, I think the first conversation may be cultural. Their songs spread across populations like memes. If we could insert a sound pattern and see it spread, that would be profound—a conversation not in words, but in music. Humanity would be participating in whale culture.
Third Question:
What will it mean for humanity if we do succeed in having that first conversation with another species?
Luke Rendell:
It will transform our sense of isolation. For millennia, we assumed language set us apart. To find ourselves in dialogue with whales, dolphins, or birds would expand the circle of “who belongs.” It would be like discovering neighbors we never knew we had.
Laela Sayigh:
It could also drive policy change. Imagine dolphins “telling” us about stress from sonar or captivity. Their voices would carry moral weight. Translation could make invisible suffering visible—and undeniable.
Irene Pepperberg:
For me, it’s about humility. Alex showed that a bird’s mind was not empty mimicry but thoughtful and curious. If other species join that chorus, humans will have to finally accept we are not the only intelligent voices on Earth.
Yossi Yovel:
It would also reshape science. If AI cracks communication in one species, it will encourage attempts across many—bats, elephants, corvids, even cuttlefish. We may discover a chorus of intelligences, each with its own form of language. That would fundamentally change biology, psychology, even philosophy.
David Gruber:
And ultimately, it may redefine what it means to be human. Language has always been our crown jewel. If whales or dolphins show they share in it—even partly—it doesn’t diminish us. It expands us. Humanity would no longer be a species apart, but part of a planetary conversation.
Moderator (Nick Sasaki):
So, who will we talk to first? Whales with their clicks, dolphins with their whistles, parrots with their words, or birds with their calls? Perhaps the answer matters less than the act of listening.
Because whether it comes through a parrot’s word, a dolphin’s whistle, or a whale’s song, the first true conversation will remind us of something simple yet profound: that Earth has always been alive with voices—we just haven’t learned how to listen.
Final Thoughts By Carl Safina

If we succeed in decoding the voices of animals, what will it mean? The answer goes beyond science. It touches the very core of our relationship with life.
Imagine hearing a dolphin complain of the noise that disrupts its sonar. Imagine a whale’s song, translated, revealing grief or joy. Imagine parrots describing the world in words we recognize, or orangutans recalling events long past. These are not fantasies; they are possibilities, inching closer as AI sharpens its ear to the symphony of the nonhuman world.
But translation is not the end of the story. It is the beginning of responsibility. Once we hear their voices, we cannot pretend they are silent. Once we know their songs, we cannot continue to drown them out. Once we glimpse their minds, we must decide how to honor them.
The greatest danger is not that we fail to understand animals. It is that we succeed, but do not care.
To listen is to be changed. We will see that intelligence is not a single peak with humans on top, but a vast landscape of minds, each shaped by evolution’s hand. We will learn that language is not owned by any one species, but shared in countless forms—clicks and whistles, songs and gestures, colors and scents.
And perhaps, in listening, we will learn something about ourselves: that the human story is not a solitary tale of dominance, but a chapter in the great conversation of life.
So the question is not just whether we can talk with animals. The question is whether we are ready for what they might say—and whether we are willing to let their voices change us.
Because the conversation has already begun. All that remains is for us to finally listen.
Short Bios:
Carl Safina – Ecologist and author of Beyond Words: What Animals Think and Feel, Safina bridges science and storytelling to reveal the inner lives of animals and the ethical responsibilities humans bear toward them.
David Attenborough – Legendary naturalist and broadcaster, Attenborough has brought the wonders of the natural world to global audiences, inspiring generations to value and protect biodiversity.
Temple Grandin – Animal behaviorist and professor, Grandin revolutionized livestock welfare with her humane designs and brings a unique perspective on non-verbal communication through her work and lived experience with autism.
Noam Chomsky – Linguist, philosopher, and cognitive scientist, Chomsky transformed our understanding of human language with his theories of universal grammar and recursion.
Irene Pepperberg – Cognitive scientist best known for her work with Alex the African grey parrot, Pepperberg demonstrated that parrots can understand concepts and use symbolic communication.
Laela Sayigh – Marine biologist at Woods Hole Oceanographic Institution, Sayigh is a leading authority on dolphin communication, especially signature whistles and long-term acoustic studies.
Yossi Yovel – Neuroecologist at Tel Aviv University, Yovel studies bat communication and cognition, and chairs the Coller Dolittle Challenge to decode non-human languages with AI.
Steven Pinker – Cognitive psychologist and linguist, Pinker is known for his work on language and the mind, arguing that human language is a unique evolutionary adaptation.
David Gruber – Marine biologist and founder of Project CETI, Gruber leads pioneering efforts to decode sperm whale communication using artificial intelligence and linguistics.
Frants Jensen – Bioacoustics researcher at Aarhus University, Jensen applies AI and advanced tools to analyze whale and dolphin communication across massive datasets.
Peter Neri – Researcher at the Italian Institute of Technology, Neri co-discovered cuttlefish “sign language,” showing non-vocal gesture systems in marine animals.
Fei-Fei Li – Computer scientist at Stanford University, Li is a leading figure in artificial intelligence, known for advancing machine learning and computer vision.
Jane Goodall – Primatologist and UN Messenger of Peace, Goodall transformed science and ethics with her groundbreaking research on chimpanzees and her advocacy for conservation.
Peter Singer – Philosopher at Princeton University, Singer is one of the most influential voices on animal rights and ethics, author of Animal Liberation.
Sophie Cohen-Bodénès – Researcher at Washington University in St. Louis, Cohen-Bodénès studies cuttlefish communication and discovered their use of four distinct gestural signs.
Adriano Lameira – Primatologist at the University of Warwick, Lameira revealed that orangutans can communicate about past events, challenging assumptions about time in animal language.
Barbara Finlay – Neuroscientist at Cornell University, Finlay explores brain evolution and vocal learning across species, linking animal communication with human neurological structures.
Marc Hauser – Former Harvard researcher on primate cognition, Hauser studied theory of mind and the roots of communication in non-human primates.
Luke Rendell – Marine biologist at the University of St Andrews, Rendell researches cultural transmission in whales, particularly how humpback songs spread between populations.
Nick Sasaki – Creator of ImaginaryTalks.com, Nick designs and moderates thought-provoking dialogues that bring together visionaries, scientists, and historical figures to explore the frontiers of human knowledge, spirituality, and imagination.