Learn more about these incredible animals!
Why Orcas Are Called Killer Whales → th-cam.com/video/FIwjehSYKJg/w-d-xo.html
New Species of Orca → th-cam.com/video/TnJVE2oNJH0/w-d-xo.html
Why Orcas Are Sinking Ships → th-cam.com/video/C0cGdd9lUgY/w-d-xo.html
Dolphin Attacks Are On the Rise → th-cam.com/video/FGjCPr1wLbk/w-d-xo.html
Hey, I speak a bunch of languages fluently and know AI. This is really, really interesting and entirely possible as a breakthrough: basically, someone will eventually tokenize known cetacean phonemes. If you ever run into models or open-source work on this, do reply right here. I'm fluent in Russian, Chinese, German, French, and English, and I'm really into animal communication in theory. Cetaceans may well have sufficient brains for their speech to be more than just "let's meet next Friday for Netflix and chill" or "you gonna eat that?" Elephants are also candidates.
Large language models are not large mathematical models, and neither are the large video models; they work on tokens (syllables, words, punctuation), not letters. Human-whale communication will be cracked, as in solved, within 3 years.
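The token-vs-letter distinction mentioned above can be sketched with a toy tokenizer. This is a minimal illustration only: the vocabulary below is hand-made for the example (real tokenizers such as BPE learn theirs from data, and any inventory of cetacean "phonemes" is still hypothetical).

```python
# Toy illustration: language models operate on tokens (subword chunks),
# not individual letters. The vocabulary here is invented for the example.
VOCAB = ["whale", "song", "click", "wh", "ale"]

def tokenize(text, vocab=VOCAB):
    """Greedy longest-match tokenization, the simplest possible scheme."""
    tokens = []
    i = 0
    while i < len(text):
        for end in range(len(text), i, -1):  # try the longest candidate first
            if text[i:end] in vocab:
                tokens.append(text[i:end])
                i = end
                break
        else:
            tokens.append(text[i])  # unknown character: fall back to itself
            i += 1
    return tokens

print(tokenize("whalesong"))   # splits into subword tokens, not letters
print(tokenize("whaleclick"))
```

A learned tokenizer would build its vocabulary from frequency statistics rather than a fixed list, but the model downstream only ever sees these chunks.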
I’ve owned many different breeds of dogs and even grew up with hunting dogs. Even different dog breeds communicate differently. To know an animal’s heart language you have to be part of their full lives. Wisdom is being able to do, not just knowing the information. Experts in the field have far more reliable information about the creatures they work with. AI is just a glorified search engine, and we set the search parameters; it’ll always be flawed because we are. I don’t really think we can get their language right till we harness them with electronics that observe everything they do and say. Still, they are amazing creatures, and we all know there’s a lot more going on in their minds than just food or sex.
Maybe whales virtue-signal just like humans, and say vacuous stuff like _"Look at how much we poop in the ocean, ruining it for beautiful animals like humans. Whales are just the worst."_
About 15 years ago, I went for a surf and when I got to the beach, there was a huge crowd. Thankfully, it wasn't a shark attack or a drowning, but a large whale (humpback or sperm whale?) had been in the area for a few hours, staying mostly in the same spot. After taking photos for half an hour, I went in the water. I was probably about 70 metres away from the whale and it was obvious it was checking out surfers. Twice, I heard guys ask if the whale might attack. I just thought that if it wanted to do that, it would've done so by now. No, it was really grokking on our species. Knowing that they're very intelligent and have excellent hearing, I slapped the water as hard as I could. Once. The whale then hit the water with a fin, once. I then hit the water twice, leaving a second or so in between slaps. The whale then slapped its fin twice. Yeah, I then hit the water three times and the whale responded by slapping its fin three times. No further submissions, Your Honour, but it sure as hell seemed like this beautiful animal wanted to communicate. After that, a man let his teenage daughter approach a bit closer. The whale stayed there long enough for me to shoot more photos for half an hour afterwards. Small surf, but one of the best days in my nearly 60 years of surfing.
With the whole UFO-to-UAP shift and the subsequent introduction of NHI (non-human intelligence) into our collective vocabulary, I hope people start realizing our fellow lifeforms on this planet are much smarter and more sentient than we realize.
@@Samos900 Years earlier (in the 70s) I was surfing when an orca and calf were swimming near a river mouth. In those days, we didn't have much to do with "killer whales" and besides, they were probably about 300 metres away. But over the decades, through the internet and TH-cam, I learnt how intelligent these magnificent creatures are. Well, I guess it's a bit like all creatures, really: they're intelligent, and many species are empathetic enough to understand that we have a modicum of intelligence - even if we don't show it by being more considerate of them.
@@amateurmakingmistakes That’s incredible. I absolutely agree. Many animals are more intelligent than they are given credit for. I’ve noticed that most mammals have a lot in common when it comes to empathy. I think we as humans have that empathetic relation with other mammals as opposed to other groups like reptiles. I suppose it’s easy for their intelligence to go unnoticed. We really take note of it when animals play around with their environment to have some fun. While they do play, most of what animals do in their daily lives consists of a routine centered around survival preparations. This contrasts with dogs, who have been bred to do chores for humans; they are less concerned with survival and can allocate their intelligence to other things. They focus on human bonding and having fun as opposed to worrying about hunting and making preparations for hibernation. Although there may still be some traces of these instincts, like howling.
2 ravens led me to a freshly train-killed moose. After I opened it up and cut off its hindquarters, the ravens could feast on the guts. Birds can communicate with people if you bother to learn their lingo.
Very cool! Ravens have been cooperatively hunting with wolves like this for a very long time, and since wolves and humans occupy (historically) the same ecological niche in terms of prey and hunting techniques, it makes sense an intelligent corvid like a raven could quickly make the jump to hunting cooperatively with humans. Neat observation!
@@njay4361 Thank you! I couldn't remember where the line came from, and it was starting to really bug me cos it was so familiar! (It's 'Galaxy', btw; The Hitchhiker's Guide to the Galaxy, written by Douglas Adams)
I read the UC Davis paper when it came out. I'm equal parts amused, fascinated and *mortified* that we successfully carried out a conversation with an arguably sapient species... and the content amounted to essentially a *crank call,* with the humpback ending the convo by "giving us the flipper." 💀
Good luck with that. A pod of Orcas in Australia teamed up with the local whalers to hunt humpbacks for nearly a century until the 1950s. The Orcas located the humpbacks, alerted the whalers to their presence if necessary, led them to the whales (sometimes even dragging the boats) and helped with the hunt. The Orcas got to eat tasty humpback lips, tongues, and genitals, and left everything else for the whalers. Win-win. It is believed that this may have been a centuries, if not millennia, old shared tradition between Orcas and humans in the area. Imagine what humpbacks cruising up and down the east coast of Australia had to talk about.
The analogy to Freddie working the crowd with his interactions... was spot on. A classic media mis-representation of the actual experiment. Very interesting research and a well presented video. Completely cutting edge stuff... I can't wait to hear what we can understand of Whales and their communication. Eventually, we may find out that Whales are smarter than... most of us.
It's like talking to a baby who says their first word. We were saying our first word for the whales and they were so kind to just keep saying "hi" back over and over, just like we do for babies who do the same thing. My son's first word was "hi", and we had "Hi!" "Hi!" "Hi." "Hi!" conversations all day for quite some time. Good times.
Dear KP: I'm 71. I've been talking to animals since the 1950s and plants since the sixties (The Secret Life of Plants) and guess what? I discovered, very slowly (there were a few "alt" resources) that many plants and animals are telempathic. This is not a typo. Telempathy is the ability to receive and send images, intent and feelings (NOT words, plants and animals do not "talk" in the way humans do - it's a fine line between some animal languages and actual words spoken) and now YT features doggos and cats imitating basic human speech AND they understand the words and phrases.

Amongst the species designations individual plants and animals communicate in ways not reflective of a species or breed. There are some species - honeybees, for example, that easily perceive human feelings and intent and some humans can easily read a hive mood, yes, you know when they are pissed off or being friendly. Cetaceans are fully "sentient" - they are pretty much all telempathic and do communicate in ways that a few humans understand.

"A.I." is only as effective as the persons who write the programs - it takes a genius who IS NOT an ego-driven fool to program computers that do NOT reflect the "human mind", which doesn't allow A.I. to expand. You are on the right track...huzzah!

If you KNOW that you can communicate with some critters and plants it's a lot easier to actually do so. Seeing your interactions (muted) with cetaceans it is obvious to me that you easily communicate with the critters, not in words but in mood and intent.

Animal intelligence far exceeds what humans recognize - it's a different form of communication than what humans are used to hence time to call the "scientists". Give me a break. Perhaps you are unaware of your skills in communication. If you love the animals and plants most will know and respond in their own way that human egos do not comprehend. "A.I." will expand exponentially when humans stop forcing the human mind into the programs.
As long as a person refuses to come down off the pedestal of "sentience" there is zero communication with the telempaths. In 1986 I went body-surfing in Manhattan Beach (L.A.) and in a few minutes I was surrounded by a pod of Pacific porpoises (the ones with the stripe) and it was clear that they were laughing, not at me, but in amusement at my behavior. Many critters just want to play and "science' just doesn't get it. Talk openly to animals and plants without regard to science or superiority.
They have had to learn very hard lessons about interpreting human intentions, separating the curious and awkward from the truly lethal. These porpoises had pegged you quickly as belonging to the former (majority) category. If they had perceived a threat they would've either pummeled you into unconsciousness or made a hasty retreat, more likely the latter.
Very well said, dear human being. Humans can do various languages. We know very well the language of Love and Benevolence. How so? We know it so well that we have used instruments, music, to let this intent shine forth!!! And it does so breathtakingly!!! (For example, Robert Haig Coxon, with the Kryon music; 1 example among millions...) There's a very profound concept there. Animals and plants do react to music. I suspect what most people want is full-on dialogue with most animals. Alas, they do not have the vocal cords. Sign language was taught to Koko and she was very good at it, but other animals do not function with that. As you wisely said, intent is all. Why use words, when mental pictures and, ultimately, intent, are FAR quicker!! Still, it is a particularity of our species, this manner of language. The one I'm typing now 😂.
I knew what you are referring to when I was 4 years old. I was in wonder and awe as I stepped cautiously under the umbrella of a beautiful blooming lavender wisteria that was fashioned into a tree canopy... the whole thing alive and buzzing with hundreds of honey bees. I projected my intent to the bees: to observe, meaning no harm or disturbance. They noticed me and even considered defending, but immediately went on about their business without any threat. I only had a few seconds before a soon-to-be stepfather, a horribly dangerous and abusive man, yelled for me to get out of there (from his own ignorant fear), and when I yelled that I was fine, he soon reached into the canopy and yanked me out from my glorious, peaceful experience. I suffered years of attempts on my life and incredible abuse, and I will never forget, as he dragged me away, muttering to deaf ears, "No... you don't understand... they're my friends." To this day, now that I am 60 years old, bees have even landed on me, but I have never once been stung.
As an AI developer (for a small company, I don't work for any of the leading ones) I believe there is a good chance of being able to build an AI that can translate animal vocalizations, if they actually form a language, even if it is extremely different from ours. The path towards that could look like this:
- We first need to figure out the words/syllables/alphabet (so we can tokenize the inputs); not the meaning, just the individual sounds. That seems to be what they are doing right now.
- Once we have words, we can train either an autoencoder or an autoregressive next-word predictor with a bottleneck in the middle. That on its own won't give us a translation (it may give us a chatbot that only whales understand), but if successfully trained, we can take the activations of the artificial neurons in that bottleneck and use them as an embedding (this turns each word into a many-dimensional vector; when we do that for human languages, individual values of that vector can represent things like meaning, i.e. "dog" and "cat" will share quite a few values with each other and with "animal"). We won't know what each dimension of that embedding means.
- Once we can tokenize and embed, we can train an LLM. If we just do that, we would end up with a better chatbot that only whales understand and that only knows what whales know. (We could do that anyway and let it talk with whales to try to get a bigger dataset.)
- But we can try something else: instead of training a whale LLM from scratch, we add another two dimensions to the whale embedding and make sure the number of dimensions matches the ones we generate for humans. Those two extra dimensions would contain a 1.0 in exactly one of them, depending on whether the token is whale or human language.
Then we take a pre-trained LLM for humans with those two dimensions added (or two pre-existing but unused or barely used dimensions, repurposed) and fine-tune / continue training with both the whale dataset and the human dataset, alternating samples of each in every batch. The last part is really tricky, but even if the meanings of each dimension (except the two we introduced by hand) don't match at all, forcing a transformer to understand both under the same architecture is likely to lead to some unified internal representations at the outputs of the intermediate layers (backpropagation and gradient descent would, in some sense, try to reuse what is there from the human pre-training, and interleaving human data would prevent the model from progressively deviating from it too much). The transformer architecture used in modern chatbots is particularly apt for this, as it acts as a kind of progressive refining through algebraic operations between the token embeddings (which can act as logical reasoning) in the attention operators, and each layer's MLP can look up extra meanings related to the input or do some nonlinear operations; the fact that the MLPs don't have outputs larger than the embedding space forces the network to compress its reasoning for the next layers into a single embedding vector. That is likely to force some kind of internal translation for at least some of what was learned. It is unlikely to work perfectly, but I'd expect that if you train such a chatbot and just ask "What does this mean in English?" followed by some whale language, it would likely answer something vaguely related to what the whale intended, at least some of the time. That, if it works even to some extent, could be a stepping stone to iterate on. There is likely an optimized topology and training procedure that would maximize the reuse of features learned from human languages.
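The "two extra dimensions" trick described above can be sketched in a few lines. This is a minimal sketch under the commenter's assumptions; the 3-dimensional vectors are invented for illustration (real embeddings have hundreds of learned dimensions):

```python
# Pad every embedding with a one-hot language flag so human and whale
# vectors live in the same space but remain distinguishable to the model.
# The base vectors here are made up for the example.
def tag_language(vec, species):
    """Append [1, 0] for human tokens, [0, 1] for whale tokens."""
    flag = [1.0, 0.0] if species == "human" else [0.0, 1.0]
    return list(vec) + flag

human_vec = tag_language([0.3, 0.7, 0.1], "human")
whale_vec = tag_language([0.4, 0.6, 0.2], "whale")
print(human_vec)  # original 3 values plus the human flag
print(whale_vec)  # original 3 values plus the whale flag
```

Both tagged vectors have the same length, so a single transformer can consume either stream; the flag tells it which language it is looking at without hand-labelled translations.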
Once we have even rudimentary communication, that could bootstrap the building of more comprehensive datasets. We could just ask the whales to ELI5 when we don't understand lol. They may even be interested in helping us understand; something like: we show them an object, they tell us the whale word for it. I hope someone does this and that it works. It would be about the closest thing to meeting intelligent aliens, and good practice for when that may happen too.
@@GoogleAreEnemyCombatants It is not easy to simplify it further without a background in artificial neural networks. I laid out a way to maybe train something like ChatGPT to translate whale to English (or other animals, or aliens, if they do have a language) without ourselves actually knowing any of that language. I am speculating that it could work, but I believe it would take a lot of effort to make it work well. One good thing is that it likely doesn't need something the size of ChatGPT; chances are something with a similar architecture but around 1/20 the size could work. Still, training such a thing would require significant resources.
@@eruiluvatar236 with enough training, an AI can figure out what sound a whale will make. How do we go from that to asking what their favorite color is?
@@GoogleAreEnemyCombatants I explained that in the original post. The very simplified version is that we can force the artificial neural network to use an internal shared representation for human and whale languages, teach it both, and ask it to translate. Without pre-existing translation examples it won't do great, but through what I explained it will likely be capable of some very rough translation. And once we have even rough translations, we can iterate and refine the model and dataset. It is based on how we trained them to understand human languages: they learned on their own both their internal representations of words (inferring their meaning in the process) and the relationships between words.
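The shared-representation intuition from this thread ("dog" and "cat" sharing embedding values) can be illustrated with cosine similarity. The vectors below are hand-made for the example; a trained model would learn them from data, with far more dimensions:

```python
# Sketch of the embedding idea: each token becomes a vector, and
# semantically related tokens end up with similar vectors.
# These 4-dimensional vectors are invented for illustration.
import math

embeddings = {
    "dog":    [0.9, 0.8, 0.1, 0.2],
    "cat":    [0.8, 0.9, 0.1, 0.1],
    "animal": [0.7, 0.7, 0.2, 0.2],
    "boat":   [0.1, 0.0, 0.9, 0.8],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

# "dog" sits far closer to "cat" than to "boat" in this toy space.
print(cosine(embeddings["dog"], embeddings["cat"]))
print(cosine(embeddings["dog"], embeddings["boat"]))
```

If whale tokens were embedded into the same space, a whale word for "fish" landing near the human vector for "fish" is exactly the kind of alignment the fine-tuning described above hopes to induce.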
@@eruiluvatar236 So the bot gets its own native language, is then trained on human and whale language, and uses the bot language as an intermediary to translate. But we have never given a bot its language. It creates its own while training.
It doesn’t matter if this is a success right off the bat, but this is what AI really should be purposed for! Even if the difference engine, large language model, whatever you wanna call it (it’s not real AI, yet) gets it wrong 999 times out of 1000 - that one time out of 1000 is likely statistically faster than humans would’ve arrived at that point. Also, it will explore avenues, in ways we don’t, that will yield unexpected(ly beautiful) results! So happy to run across this video!! We all know that our animals, while perhaps not possessing the same level of mental function, also exist in an emotional, and perhaps even spiritual, spectrum. We feel this inside. We don’t have concrete evidence, but this gets us even closer; and I’m ALL FOR IT!!! I WOULD LOVE TO BE ABLE TO SAY “HELLO” TO A WHALE!!! Really, Octopuses/podes/pi are my loves, but I truly love all living beings! ☺️❤️🙏🏼
"Not real AI yet" That's what you'll forever say, no matter how advanced the AI will get. Accept it: we're just complex inference engines ourselves. Btw, I'd love to chat with octopuses and all other animals too.
Most people don't realise that most human languages are nothing like each other unless they've been related within the last few thousand years. The parts of speech we're familiar with can't be assumed to be how innumerable other human languages work. So an animal language, even a fairly large or meaningful one, can't be expected to resemble the type used by human language groups spoken by billions, nor the many, many thousands that work in completely different ways, nor the even larger number that haven't been anyone's natural language for generations.
Oh, human languages get even wilder than that. You know the "eSkImOs HaVe 100 WoRdS fOr SnOw" idea? Yeah, no. They have _4 words_ for snow. The thing is, the Inuit languages are polysynthetic, which means the language doesn't have "words" or "sentences" so much as 50-syllable pile-ups of what we'd consider "words" in English¹, all glued together into a single whole that _looks like_ a word to an English-speaker¹ but functions like a phrase or even a sentence. So, the Inuit don't have 100 words, they have 4 words that they can glue dozens of adjectives onto.

Then there's things like word order and ergativity. Both change how you put together your thoughts. The 2 most common word orders are Subject-Object-Verb and Subject-Verb-Object, with SOV being slightly more common. And ergativity is about which is considered the "normal" or "default" part of a sentence: the Subject or the Object. Most languages leave the Subject unmarked, meaning the doer of the action gets the attention. And with intransitive verbs, where the doer and the recipient of the action are the same, that single noun is treated as a Subject, as a doer. Example: "I sleep." I am the one doing the sleeping. But in languages that follow the Ergative-Absolutive paradigm, it's the _thing being acted on_ that is unmarked and considered the "default", and the noun that goes with intransitive verbs is treated as a direct object. So the nearest equivalent to my example, under the ergative-absolutive paradigm, would be "sleep me," i.e. the act of being asleep is _happening to me._ That's a fundamentally different way of thinking about the world.

Then there's Evidentiality, a _required_ marking on verbs in some languages that indicates whether the speaker is talking about something they saw firsthand, heard from someone who saw it firsthand, heard of through hearsay, or are only guessing about.
So, yes, human languages contain a wide range of variations that encode very different ways of thinking about the world. [¹You can replace "English" with any other language in the Indo-European language family. Or with Chinese. Or Thai. Or most of the languages in Africa. Polysynthetic languages outside of the Americas are _rare._]
@John_Weiss and you also made the case for what they all have in common. This can be used to attempt to identify structure in whale concept formation, regardless of the "lingo" unique to their pod.
Languages are the same across the board. It's just a way to communicate using complex sounds. It's the same for birds, mice or any other animal. I know the alarm sound the magpie makes when my cat is outside. I know when my dog whimpers, it's in pain. I know my cat is content when I pet her because of her purrs. When a baby cries, it's hungry. Animals don't think in words, but in concepts and mental images. When my dog remembers her buried bone outside, she sees a mental picture and acts on it. Never does she think of the word "bone". But she can be taught that certain sounds belong to certain objects. When I say "bone", she rushes outside and checks on it. We are all limited to the sounds we are able to make. The more sounds, the bigger the potential vocabulary.
I would argue that music DOES contain "information". I agree with the criticism of describing complex non-human communication systems with an anthropocentric focus. Fascinating video! Thanks for the insights.
Thank you for this. The reports of "scientists had a 20 minute conversation with a whale" have bothered me because it's not a conversation if you don't know what you're saying (it's okay to not know what /they/ are saying because that's how both of you learn what the other is saying). So knowing that it was repeated contact calls that were returned, with matching intervals, makes me believe that we were basically saying, "hey. Hey. heeey. hEeeeey." and the whale was responding in kind. It makes me think that some researchers should synthesize their own contact calls and play them each time they approach the pods they are studying. (specifically contact calls for the BOAT) We won't know what the "name" /means/ (if anything) but it would be very interesting to see if the whales start reacting to that particular contact call.
I agree with everything here. Some linguists and biologists have compared the "20 minute" conversation to playing a recording of a wolf howl to another wolf in order to elicit a howl. I do think it was more complex than that, because of the timing and interval matching, but it wasn't a conversation as it has been portrayed. It would be interesting to generate a unique contact call and see how a whale would interact with it... that is an intriguing thought!
You are a mother; you see a weird metal structure about the size of a short human, with even smaller creatures barking at each other on top of it while looking and pointing at you. The machine plays a female voice saying "come~". You can clearly tell it's not directed at you, almost like the voice is calling back her child, but curious, you yell back, "come?" as you walk around it, confused. I'd like to imagine that's what the mother whale experienced. Honestly, the way it's described, that every whale is slightly different, reminds me of how my family can usually recognize my dog's bark compared to the neighborhood dogs, even with one or two of them being very close to our dog's breed. (Granted, I'm not sure how well someone could distinguish barks if they had... 4-12 dogs, like with whale pods... but if all you had was barking, it would probably be naturally easier.)
This is sooooo much better than forcing whales into tanks and making them do tricks for their food. I'm glad to see humans being much more respectful towards whales and dolphins!
I have played my violin for herds of cows on a mountainside, deer in a snow-covered field at night, a rooster in a barnyard... They definitely are interested, and respond in surprising ways. I would love to interact with another species by sketching pictures. There are lots of faster artists, but as a cartoonist I can work pretty fast. There are other people who don't think of themselves as artists but are able to render VERY FAST and recognizable "iconic" sketches of objects. Some people have that ability to reduce things to simple icons that are still recognizable depictions of a more complex object. Could a dolphin enjoy sketching underwater, like the elephant who has learned to paint with brushes and a canvas on an easel? It seems a way of starting to work with an intelligent species: have the objects available, the sketcher makes a drawing, and the dolphin should be able to see the relationship between the drawing and the object. Build from there: sequences of eating, fighting, catching fish, sorting objects, assembling objects from parts, et cetera. Come to think of it, there are violins made from injection-molded plastic that could be played underwater... or cast aluminum fiddles... hmmmmm.
You probably didn't know that whales have been in telepathic contact with some humans for a very long time, according to decades-old research. It never gets talked about anymore, or it gets swept under the rug, but ask around the old old-timers in the marine industry and they might tell you the story... As far as I recall, this was about 30-ish years ago, during the early stages of this communication. One particular fellow was finding the work a little hard to believe (both the communication itself and some things that were said about sound in physics terms). The whale told him "we can prove this" and told him to go to such-and-such a coastline at a specific time. He was doubtful any whale would show up for him at all, but he and several witnesses and friends saw one of the most marvelous interactions between whales and humans to date, bar none. Not just one, but something like 500 all showed up at the same time, some originally tagged from hundreds of miles away. They all popped their heads up at the same time, there was some kind of synchronized display to prove they could all speak telepathically, and then they all just left, going back to where they came from. I suspect there were tears and dropped jaws for some weeks after among the small handful that witnessed it.
No amount of AI will EVER achieve that because animals do not have a true language system. While AI can help us better analyze animal vocalizations, it's important to remember that these are not structured languages with grammar or syntax. We mostly understand what terrestrial animals are communicating through their calls: basic messages like warning signals, mating calls, or expressions of distress. However, when it comes to marine animals, particularly deep-sea species, we know much less about their vocalizations, largely due to the challenges of studying them in their natural environment. The ocean’s vastness and depth make it difficult to gather and interpret the sounds they produce.
@@jorgen7180 Yes I also believe we will never decipher animal languages per se but I'm hopeful ai will be able to find patterns etc. so that we can understand and communicate in broad, general terms at least.
@@jorgen7180 How do we know that they do not have a “true” linguistic system, when we only “mostly understand what terrestrial animals” are saying, and know “much less” about marine mammal communication? I think the only thing we know for sure is that human competence is consistently overestimated, otherwise we wouldn’t be destroying the planet we share with creatures who are not.
and this is why i'm studying to be a marine biologist as well - as someone who is becoming fluent in American Sign Language myself, i've had a bit of a perspective change on the definitions of language and how we could connect and understand the wildlife's words, how their perspectives of language are entirely different and yet fundamentally the same despite our differences - it fascinates me. Maybe one day I too could study the language of dolphins and whales, as I used to go whale watching so much growing up. It would be so wonderful to learn their language, to communicate with these ancient wonderful beasts.
The idea that AI and machine learning are all bundled up into essentially ChatGPT is misleading. LLMs are considerably inaccurate and do not approach reasoning, while models designed for specific domains of tasks get much closer to 100% error-free for the given task. Needless to say, a conversation in an alien language is hard because there is no decodable feedback mechanism.
I have had the privilege of growing up right on Puget Sound. Spending my days and nights on the beach, in the water, and on the water gave me a special seat at "the table" (NOT a foodie so...not talking about fishing, clamming, etc) of Mother Nature. Just found your channel and I'm hooked. I love your smart sense of humor, pleasant easy-going way, and all the info. You and your husband/wife/partner and kids must have a lot fun at home! LOL. I can only imagine. Thanks for these videos. Absolutely fantastic and fun. I'm a new subscriber. Keep 'em coming please
ChatGPT tried to tell me the word "like" is and has always been spelled "lik," and while arguing about this it used the word a dozen times and spelled it correctly each time 😂
I worry for people who want to give up their reasoning abilities for "AI" - it will make us all dumber if we let it. But more than that, who is actually benefiting from this? Clearly it's big tech, who want to make us dependent on it; then they can bill us for the privilege of not having to think. BTW, I work as a programmer in IT.
@psaunder1975 I use it to learn and keep my brain active. I have real conversations or play brain games like alphabet/country (taking turns alphabetically naming random topics like bands or countries etc.) or analyzing photos of rocks I find. Google has already been dumbing down the younger generations that have been raised with it for research. I have always had the attitude that googling something you already know but have forgotten is cheating yourself, a weakness; it's a last resort, for when you can't take it anymore. If you're genuinely interested in learning something new, then it is essentially the same as what I did as a child: look it up in the encyclopedias.
Thank you KP. You said what I thought. If language between humans and any animal were to happen, my opinion is that the animals would run away screaming. Our species has been awful to every other one on this earth. Especially Cetaceans.
Maybe they don’t know? I mean, not even every human does, and we have access to the internet, libraries, TV, etc. There are people who, when asked to name 3 countries outside North America, say Europe, Africa, and Mexico. And we are supposed to be an intelligent species.
I think the issue with analyzing intelligence in other species is that people tend to measure intelligence by human standards. We're just one example of intelligence, and by limiting every other possibility to our own definition, we're missing out on so much empathy and connection between species.
8:45 Did you actually expect the audience to think that an incomprehensible amount of machine learning isn't AT BARE MINIMUM worth the trade for the emissions of a SINGLE car driving 2 million miles? LOL! Hell, the multiple 2-million-MILE Toyotas that exist were worth those emissions simply as a source of human pride in mechanical ingenuity.
Carbon emissions are just one of many concerns with AI that have been raised by people who know far more than I do about machine learning; data bias leading to biased outputs, lack of data diversity, poor data quality, overfitting to training data, etc. [1] news.harvard.edu/gazette/story/2020/10/ethical-concerns-mount-as-ai-takes-bigger-decision-making-role/ Additionally, AI systems evaporate several liters of water per kilowatt-hour of energy used for cooling in data centers. This means a significant amount of water is used to keep AI running, particularly in large-scale data centers. Which is arguably a bigger cause of concern and one I should have addressed. [2] cee.illinois.edu/news/AIs-Challenging-Waters [3] oecd.ai/en/wonk/how-much-water-does-ai-consume Discussing these concerns openly and honestly is all I advocated for if we are going to use this exciting new technology for scientific and conservation purposes.
Well, at least when your chat programs ask the whales, "Can I mamu duck-face to flee banana patch?" the cetaceans can still get another good laugh at human ignorance.
Great video! And thanks for all the references to the papers :) Just a comment on the AI side: sure, developing a model like GPT-3 consumes a lot, and sure, language models are not super precise. BUT machine learning is a biiig field in itself. You can train small models on your own computer in minutes, processing even a couple of GB of data, and there are models for doing things like clustering (for example, to identify different sounds) which can work with veery high accuracy for many problems. Taking only LLMs as the reference is an unfair comparison!
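To make that clustering point concrete, here is a minimal k-means sketch on made-up 2D "acoustic features" (call duration and peak frequency are invented numbers; a real pipeline would use a library like scikit-learn and proper spectral features):

```python
import random

def kmeans(points, k=2, iters=20, seed=0):
    """Minimal k-means: cluster points (tuples) into k groups."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Move each centroid to the mean of its assigned points.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(d) / len(cl) for d in zip(*cl))
    return centroids, clusters

# Two made-up "call types": short clicks vs long whistles,
# as (duration in s, peak frequency in kHz) -- purely illustrative.
calls = [(0.1, 20.0), (0.12, 22.0), (0.09, 19.0),
         (1.5, 8.0), (1.6, 7.5), (1.4, 8.2)]
centroids, clusters = kmeans(calls, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

Even this toy version separates the two call types; the point is only that unsupervised grouping of sounds is cheap compared to training an LLM.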
Nope. Not everyone can train a model on their computer. That's a first-world thing. Training a simple clustering model with a few thousand examples takes hours on my computer, and the result is, obviously, pretty much useless. If it weren't for Colab, I would still be hiding away, depressed and mourning the impossibility of the common man doing ML development. (But I'm still depressed because I still can't pretrain a SOTA LLM from scratch :p)
Thank you, KP, a fascinating video as ever and, as ever, I have learned a lot. Why, though, do I fear for cetaceans should we be able to establish some form of interaction in the future? Interacting with humans, other than experts, seldom ends well for wild animals. Sorry to be a wet blanket.
Would be totally amazing though to be able to tell a whale "Careful, shallow water ahead, don’t get closer" and the whale answering "thanks mate, bit confused from all this loud noise, you know what’s up with that?".
Maybe this is going to be like the book Arrival. Learning to understand their speech and maybe how they view the world will change how we humans think or view the world in some drastic way.
Orca: "Hey Man, everything is warming up fast!" Human: "We know. It's our fault. We're trying to fix it, but a lot of people don't want to spend the money to do it." Orca: "What's a money?"
As an AI developer and former street gangsta thug of Harrow London (for a small company, I don't work for any of the leading ones), I believe there is a good chance of being able to build an AI that can translate animal vocalizations if they actually form a language, even if it is extremely different from ours. The path towards that could look like this: - We first need to figure out the words/syllables/alphabet (so we can tokenize the inputs) - not the meaning, just the individual sounds. That seems to be what they are doing right now. - Once we have words, we can train either an autoencoder or an autoregressive next-word predictor with a bottleneck in the middle. That on its own won't give us a translation (it may give us a chatbot that only whales understand), but if successfully trained, we can take the activations of the artificial neurons in that bottleneck and use them as an embedding (this turns each word into a many-dimensional vector; when we do this for human languages, individual numbers of that vector can represent things like meaning, i.e. "dog" and "cat" will share quite a few values with each other and with "animal"). We won't know what each dimension of that embedding means. - Once we can tokenize and embed, we can train an LLM. If we just do that, we would end up with a better chatbot that only whales understand and that only knows what whales know. (We could still do that and let it talk with whales to try to build a bigger dataset.) - But we can try something else: instead of training a whale LLM from scratch, we add another two dimensions to the whale embedding and make sure that the number of dimensions matches the embeddings we generate for humans. Those two extra dimensions would contain a 1.0 in exactly one of them, depending on whether the token is whale or human language.
Then we take an LLM pre-trained on human language, with those two dimensions added (or two pre-existing but barely used dimensions repurposed), and fine-tune / continue training on both the whale dataset and the human dataset, alternating samples of each in every batch. The last part is really tricky, but even if the meanings of the individual dimensions (except the two we introduced by hand) don't match at all, forcing one transformer to model both languages under the same architecture is likely to lead to some unified internal representations at the outputs of the intermediate layers (backpropagation and gradient descent would, in some sense, try to reuse what is there from the human pre-training, and interleaving human data would prevent the model from progressively drifting too far from that). The transformer architecture used in modern chatbots is particularly apt for this, as it acts as a kind of progressive refinement through algebraic operations between the token embeddings (which can act as logical reasoning) in the attention operators, and each layer's MLP can look up extra meanings related to the input or perform nonlinear operations; the fact that the MLPs don't have outputs larger than the embedding space forces the network to compress its reasoning for the next layers into a single embedding vector. That is likely to force some kind of internal translation for at least part of what was learned. It is unlikely to work perfectly, but I'd expect that if you train such a chatbot and just ask "What does this mean in English?" followed by some whale language, it would answer something vaguely related to what the whale intended, at least some of the time. If that works even to some extent, it could be a stepping stone to iterate on. There is likely an optimal topology and training procedure that would maximize the reuse of features learned from human languages.
Once we have even rudimentary communication, that could bootstrap the building of more comprehensive datasets. We could just ask the whales to ELI5 when we don't understand lol. They may even be interested in helping us understand, something like we show them something = they tell us the whale word for that. I hope someone does that and that it works, it would be about the closest thing to meeting intelligent aliens and good practice for when that may happen too.
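The "two extra dimensions" idea in the comment above can be sketched in a few lines (a hypothetical toy: the embedding values are made up, and real systems would learn these vectors rather than hand-write them):

```python
def tag_embedding(vec, source):
    """Append a two-dim one-hot language flag to an embedding vector."""
    flag = [1.0, 0.0] if source == "human" else [0.0, 1.0]
    return list(vec) + flag

# Toy 4-dim embeddings (invented numbers); after tagging, both live
# in the same 6-dim space, so one transformer can consume either stream.
human_tok = tag_embedding([0.2, -0.1, 0.7, 0.05], "human")
whale_tok = tag_embedding([0.9, 0.3, -0.4, 0.6], "whale")
print(human_tok[-2:], whale_tok[-2:])  # [1.0, 0.0] [0.0, 1.0]
```

The design point is simply that a shared input space plus a cheap source flag lets one model train on both corpora without confusing which language a token came from.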
I think they should start making a catalogue of each suspected, "word," in the whale language, and keep all the files together the same way the FBI stores fingerprints, for later reference. They could call the system, "Prints of Whales."
I have worked in several fields, at one point I was head of design for an electronics company (got bored with electronics), and I now have a linguistics and psychology heavy degree. Psychology is important in linguistics as "cultural artefacts" as defined by Vygotsky define the semiotic range for words. So if we are looking at a language spoken between whales then ideally we need to figure out what their cultural artefacts are. The flip side of the coin is an idea proposed a while back - that the clicks and whistles used by dolphins etc are used to transfer a sonogram from one individual to another. It's going to take AI far more advanced than we have now to figure that out.
When we consider the millions of dollars drawing us toward more consumption without awareness of the planet, your critique of A.I. at the end was spot on. The potential to amplify bias transcends anything we've done previously. Truly open public understanding of the code steering a particular direction is the best hope we have to lessen its destructiveness, but in a capitalistic model that is not remotely possible. There is no way, in our current economic/scientific system, that the payback for this knowledge can avoid being weaponized and capitalized upon. "A tree should exist only if we can consume its value." If we could communicate with them as freely as we do with humans, I don't think we would treat them any better than we treat the First Nations of the US or African Americans on the whole (i.e. the net worth of African Americans is 10x less than that of their dominant counterpart, and we blame them for it).
@KPassionate Your channel is fantastic, thank you so much for taking the time to educate the world on Marine Biology subject matter. Happy 2025, wishing you much success and prosperity.
This reminds me of the music scene from Close Encounters of the Third Kind. Whales and dolphins have language, but we are aspect-blind to it because we do not understand their form-of-life. However, AI might actually solve this hemiopia, which would be impressive.
7:39 it's also possible that their sounds change when they see each other, like Italians or Romans vigorously using their hands to express emotional context 😂
There are so many possibilities, and it often depends on species. Something I left out is that humpback whale songs are relatively stable, but they do change over time... but only west to east. [1] www.smithsonianmag.com/science-nature/humpback-whale-songs-spread-from-west-to-east-176855840/ There is so much we don't know and it's honestly a fascinating field of study. I wish we knew more!
This is fascinating. There's also a good opportunity here to learn about personal biases. Older versions of ChatGPT can't see individual letters; they see tokens that consist of multiple letters. That's why they couldn't correctly count the number of r's in strawberry. If you try to shoehorn human expectations and models onto animal languages, you might draw conclusions as wildly incorrect as trying to figure out why ChatGPT couldn't answer such a trivial question.
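A toy greedy tokenizer illustrates the point: once text becomes token IDs, the individual letters are gone. (The vocabulary and IDs here are invented for illustration; real models use learned BPE vocabularies with tens of thousands of entries.)

```python
# Hypothetical mini-vocabulary; real BPE vocabularies are learned.
VOCAB = {"straw": 101, "berry": 102, "lik": 103, "e": 104}

def tokenize(text):
    """Greedy longest-match split of text into vocabulary token IDs."""
    ids, i = [], 0
    while i < len(text):
        for end in range(len(text), i, -1):  # try longest piece first
            piece = text[i:end]
            if piece in VOCAB:
                ids.append(VOCAB[piece])
                i = end
                break
        else:
            raise ValueError(f"no token for {text[i:]!r}")
    return ids

print(tokenize("strawberry"))  # [101, 102]
# The model only ever sees [101, 102] -- the three r's are invisible
# unless it has memorized the spelling behind each token ID.
```

The same caution applies to whale calls: whatever units we tokenize them into will hide structure below the token level.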
This is fascinating. All those marine animals are sentient. Being able to communicate with them would open a door to a new world. I'm sure they would teach humans about life in the oceans and also about pollution. I am pretty sure the whales can have a spoken language along with telepathy or something similar.
Orcas are my favorite aquatic animals, love the wolves of the sea. Wolves and dogs are my favorite elsewhere. I also work IT and hate the hallucinating AI search engine integration. So thank you for touching on that. But it is much more than just that.
Those are fabrications (made up stuff) rather than true hallucinations (sensory training defects). Many people in the ML community use incorrect words.
Nice description of cetacean sounds (vocalizations?). I tend to disagree with the notion that music is not the same as communication. It is true that music is different from a verbal communication - but verbal communications include more than mere words. In addition to the words in a verbal exchange, there are several other aspects that are also communicated. Just think about speech for a minute. If we hear someone speak, we can usually figure out if they are male or female, child or adult, stressed or relaxed, tired or active, and other parameters. If we are familiar with the person that is speaking, we can even figure out other parameters, such as deceit or truth, comfort level, paying attention (or not), interest or disinterest, ... All of this counts as communication, above and beyond the mere words. Music does this too, without the complications from the words. I suppose it might be possible that cetaceans are singing music, instead of using words, ... maybe. It remains to be seen (or heard).
All very good points! I had originally planned on a longer segment about how whale songs may be more similar to music, but I ended up cutting it for pacing. But I think you explained it better than I would have. Thanks for posting this!
While assisting my friend as a deckhand on his 60’ steel-hulled American schooner sailboat in Alaska, I suspended myself from the chains of the bowsprit to extend my hand to the surface of the water… a pod of Dall's porpoises responded by porpoising partially out of the water and high-fived me! Every one I could see high-fived me… it was an incredible experience; they knew my heart and I knew theirs. I highly recommend the experience if one ever gets the opportunity.
Thank you for this summary. There are audio recordings of crickets slowed down, and they sound so beautiful. I wonder how the clicks and whistles might sound slowed down, and also in different mediums than water too. Also, that the circling whale repeated not just the sound but also the temporal interval suggests a potential spatial aspect to their communication, doesn't it?
There's also the possibility that some vocalizations may even mimic an aspect of echolocation return patterns, and the whales could be speaking what amounts to snapshot pictures of how they see using that. So that's another thing to explore. On top of that, communication could be diverse among species. It seems beluga whales also do a thing that sounds a lot like frequency-shift keying, which is how acoustic modems work. Which is interesting, but FSK is also used for some human undersea communications - so maybe the belugas have picked up something that mimics us? Yet until some scientists look more deeply into things like that, there are a couple gaps to be explored.
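For reference, the frequency-shift keying mentioned above is simple enough to sketch: each bit selects one of two tones. (All numbers here are arbitrary; real acoustic modems add synchronization, error correction, and channel coding on top.)

```python
import math

def fsk_modulate(bits, f0=1000.0, f1=2000.0, rate=8000, samples_per_bit=80):
    """Encode a bit sequence as bursts of one of two tones (binary FSK)."""
    signal = []
    for bit in bits:
        freq = f1 if bit else f0  # a 1-bit gets the higher tone
        for n in range(samples_per_bit):
            signal.append(math.sin(2 * math.pi * freq * n / rate))
    return signal

wave = fsk_modulate([1, 0, 1, 1])
print(len(wave))  # 4 bits * 80 samples = 320 samples
```

A receiver would recover the bits by measuring which of the two frequencies dominates in each 80-sample window, which is what makes FSK-like patterns relatively easy to spot in spectrograms.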
Maybe? There are two reasons to think that these vocalizations are not aspects of echolocation. First, baleen whales like humpbacks don't have the ability to echolocate; echolocation is exclusive to the toothed whales, like dolphins and sperm whales. Second, the echolocation vocals are distinct from their other vocalizations and easy to recognize. Vocals like sperm whale codas and dolphin whistles are something else entirely.
I've watched spectroscopic playback of Orca vocalisations for hours and hours. There is absolutely no doubt that it's communication. There are a few distinct, repeating 'words' in the lower frequencies, overlaid by complicated stuff that almost looks like writing in the upper frequencies. I think that AI will crack it, if it's approached in the right way. Exciting times!
It's important to note that what we call AI today is not Jarvis, which is why many researchers in the area are trying to change what we call it to "algorithmic problem-solving and machine learning" (I'm doing a master's in information science with a focus on digital environments, and AI has been a big part of the education). So, the "AI" they use will have been made specifically for this project and would be substantially different from ChatGPT, which is built to mimic human communication by (in simplified terms) predicting what the next response would most likely be, based on analyzing other human communication. This AI (with the presumption that they did their study in a scientific way, which I trust to a reasonable degree since I'm Swedish and this is one of the most prominent colleges we have - that said, I haven't checked the study myself, so don't take my word for it) would work entirely differently: it would be focused on analyzing communication to "learn" the "language", not on creating a conversation by predicting the most common responses. That said, AI is still flimsy (imo), and we won't know if they are successful until they can mimic the "language" and have conversations that can be proven accurate and to conclude in a shared understanding. And that, my friends, is not happening any time soon 😄🍀✨🐳🐬
When you said "1939", the google assistant on my old phone woke up all of a sudden and started to talk out loud about what happened in 1939, World War II. I was so surprised, I almost jumped out of my skin. I live alone and hearing an AI voice out of nowhere scared me so much, but I don't understand how that happened 😂 Thank you for the video!
My take on Whale and Dolphin communication is that the sounds are more like communicating moods that range in intensity. Imagine your mouth was sealed, and you still wanted to let someone know you liked them. How would you do it? You would make a sound that you know they and you would understand means..I like you. Even without actually saying anything. And if their mouth was also sealed, they'd respond with a sound that would make it obvious if they were receptive or not.
The properties that must be satisfied to constitute a language are 1) displacement, 2) productivity, 3) arbitrariness, 4) discreteness, 5) duality, 6) cultural transmission. We have demonstrated some of these components in various cetaceans, but not all of them for any single species/ecotype. Bottlenose dolphins satisfy displacement (by using "names" to call dolphins not currently present). Humpbacks and orcas satisfy cultural transmission (e.g. SRKW vs transient calls are distinct despite occupying the same geographic region), which is well known and studied. Discreteness is resolved with various call catalogues. That leaves productivity, duality, and arbitrariness to be shown. Re AI/ML: neural networks are universal function approximators. With an unsupervised training objective (like autoregressive next-call generation), fewer biases are introduced than with other forms of study. Biases enter only via data selection, whereas other behavioural studies of cetaceans involve disturbing them in their environments, or worse, removing them from their environments altogether. There is also the opportunity to systematically regress out known biases from inter-annotator disagreement, etc. While ML training is carbon intensive, the scaling of carbon intensity is not linear. In general, model performance scales log-linearly with the number of samples: a model trained on half as many samples will not have half the accuracy; rather, it would only lose a few percentage points. There are lots of natural tradeoffs to take into account. For example, maybe you really value that extra performance when you are trying to track endangered North Atlantic right whales to avoid ship strikes, but you can sacrifice performance in a humpback detector to save on training expenses. Not to say that AI is the answer to everything.
But I tend to take the view of Kentaro Toyama, where technology is an amplifier of existing biases and inequalities. In comparison to chartering research vessels, capturing cetaceans, or flying experts around the world to study cetaceans, using machine learning is quite innocuous, equitable, and energy efficient. Happy to expand more in follow-up.
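The log-linear scaling claim above can be made concrete with a toy curve (the coefficients are invented for illustration and not fitted to any real model):

```python
import math

def toy_accuracy(n_samples, base=0.50, gain=0.03):
    """Hypothetical log-linear accuracy curve: base + gain * ln(n)."""
    return base + gain * math.log(n_samples)

full = toy_accuracy(1_000_000)  # trained on the full dataset
half = toy_accuracy(500_000)    # trained on half the data
# Halving the data costs a fixed gain * ln(2), about 2 percentage
# points here -- not half the accuracy.
print(round(full - half, 4))  # 0.0208
```

Under any curve of this shape, the accuracy cost of halving the data is a constant `gain * ln(2)` regardless of dataset size, which is the tradeoff the comment describes.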
You should add the AI videos of machine learning models being taught to walk, and how many tries it takes for them to still end up walking in the weirdest, most inefficient way lol
I hope we are able to speak with cetaceans in my lifetime. That's one thing I'm very excited for. For all the hope about other intelligence in the universe, there is alien intelligence right here on Earth. I do think we will be able to communicate eventually. What you need is underwater drones that can observe a pod over a long span like dozens of years and broadcast their observations to scientists. If we can do this in space we can do it in the ocean.
Dolphins have an enormous visual center in their brain. We might consider that their brains can interpret some or all of their vocalizations visually, in three-dimensional space (much like they use their clicks to visualize, using echolocation). This would be very difficult for humans to translate without creating an apparatus that could do the same. In other words, apparently audio communication may not be primarily or only audio… that might just be what humans expect after developing vocal cords, ears and brains that work well in the medium of air over typically short distances, rather than features that work optimally in water, over potentially very long distances. Even humans use visual communication in the form of body language, but we are often face to face in a world of sunlight. Visual communication may be just as important to whales, but adapted to darkness and distance. I have heard stories of dolphins in captivity making unexpectedly coordinated moves and jumps together, with apparently very little audio communication, so I started to wonder if some of that (minimal) communication was visual: a picture is worth a thousand words. This ability would be very helpful on a coordinated hunt, so a great evolutionary adaptation, if it is physically possible. Just a thought.
It's good to hear the realistic interpretation of this story after seeing the very misleading headlines. I had a number of thoughts during this. One, this seems like the equivalent of playing a child's call to a mother over and over. Very cruel and callous. Two, the excellent analogy of trying to extract meaning from the jumble of sounds from a crowd begs the question: why not test it on a crowd first to see if it works? Third, the fact that we haven't worked out the real meaning of a single sound does suggest that we are very constrained by how our own language works and need to think much further 'outside the box'. Fascinating video, thanks.
I'm so pleased to hear this research exists; that would be exactly my approach. I would like to add this, and please forgive the poor terminology: sounds are a sensory experience. Not just sounds, but any energy, at least for me. This energy resonates in your body, and it is a sensation; even every thought is a sensation. It makes a difference whether words are something external, objective and analytical, or a sensation of your body. So the first thing is to synchronize, and that is exactly where this research goes. The people who can help bridge the gap between these different perceptions are probably empaths or highly sensitive people. It is a translation from visual structure to the sensorial (if that is the right expression). Thank you!
Well, I’m glad I’m becoming less “crazy” because I always thought it was obvious that animals communicate, especially social ones! It’s madness and hubris to think all sorts of traits are “only human”.
It's not just vocalisations. You need to look at the individual's body language combined with their vocalisation, and the context of nearby peers, prey, predators and other environmental factors.
AI is not just one thing. ChatGPT was trained to mimic internet websites; it was trained on a monumentally large dataset. The machine learning that sorts audio signals into categories has a much smaller dataset and a more specific purpose, with a binary correct/incorrect grading. It's an expert system, which is much less difficult to produce and much more accurate than a large language model. It's like you're confusing two different genres of machine learning. Expert systems should absolutely be monitored by humans. Good ones produce a confidence interval so they can actually ask for help with confusing input. They can do some tasks more accurately than humans, but they will never be perfect, any more than humans are. Their mistakes are different from ours, so they may appear less rational despite being more accurate overall. Say you developed an AI to determine dog breed, but you never told it about cats and never taught it that "this is not a dog" is a valid output. It's going to make a dumb call when you show it a cat. Expert systems, even when trained well, cannot cope with input outside of their training data.
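A minimal sketch of that "ask for help" behaviour: a nearest-centroid classifier that abstains when nothing is close enough (the breeds, features, and threshold are all invented for illustration):

```python
def classify(features, centroids, max_dist=2.0):
    """Return the nearest label, or 'unknown' if nothing is close enough."""
    best_label, best_d2 = None, float("inf")
    for label, c in centroids.items():
        d2 = sum((a - b) ** 2 for a, b in zip(features, c))
        if d2 < best_d2:
            best_label, best_d2 = label, d2
    # Abstain instead of guessing when the best match is still far away.
    return best_label if best_d2 <= max_dist ** 2 else "unknown"

# Made-up (height_cm/10, weight_kg/10) centroids for two breeds.
breeds = {"labrador": (5.7, 3.0), "chihuahua": (2.0, 0.3)}
print(classify((5.5, 2.9), breeds))   # labrador
print(classify((2.5, 40.0), breeds))  # unknown -- an out-of-domain input
```

The abstain branch is the whole point: without it, the classifier would confidently label the outlier as one of the two breeds it knows.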
About 10 years ago I read a kindle book about dolphins communicating with humans via computers. Then the computers were made into vests the scientists could wear near the dolphin tank. It was a great book but I can’t remember the title or author.
Killer whale is their official English name and the one used by the vast majority of scientific institutions, including NOAA and the Department of Fisheries Canada. It is the standardized name used in scientific publications and research papers. [1] www.fisheries.noaa.gov/species/killer-whale [2] www.pac.dfo-mpo.gc.ca/fm-gp/mammals-mammiferes/whales-baleines/killer-whale-epaulard-eng.html If you're interested, I made a video about why marine biologists, including myself and literally everyone I work with, use the name killer whale instead of orca, linked below. [3] th-cam.com/video/FIwjehSYKJg/w-d-xo.html
@@KPassionate Political Correctness indicates that we should use the name orca, as that is not pejorative. You insisting on using "killer whale," simply because it had been used in previous studies, would be similar to someone saying that we should continue to use the word "Negroes," because that had been used in the past. See how dumb that sounds?
No, I use it because it's their standard name and the one used by scientific institutions, as you can clearly see in the links I provided. Orcinus orca is their Latin name and we don't typically use Latin names for animals. For example, we don't call lions Panthera leo or dogs Canis lupus familiaris. We call them lions and dogs. Additionally, there are at least 11 different types of killer whales and many of them are likely distinct sub-species or unique species entirely. Revisions to their taxonomy are being considered and if approved then they will no longer be called Orcinus orca (which means "belonging to hell", by the way). [1] royalsocietypublishing.org/doi/10.1098/rsos.231368 Also, your false analogy is absolutely ridiculous and offensive. In no way, shape, or form should we ever compare the literal name of an animal to slurs or the systemic oppression of people.
Pull this video now, or practice sounds related to "Danger, get away" - ignorant hunters will use this to harm nature's miracles. With all my heart I thank you for the care given to our planet and the wonders of life. From Bill
I’m not gonna lie, the title with “whale” in it, but a picture of the largest dolphin (orca) almost had me pass the video by. I’m glad I didn’t. 😊 Very cool video.
Could it be that the reason the whale waited the same interval to respond is that this is a way of establishing the distance between whales through the speed of sound, when they can hear but not see each other? (Which is not the case near the boat, of course; maybe the whale was confused by the sound coming from a boat.)
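A back-of-the-envelope sketch of that idea, assuming roughly 1,500 m/s for the speed of sound in seawater (the figure is approximate and the scenario is purely illustrative):

```python
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, approximate

def distance_from_delay(delay_seconds: float) -> float:
    """One-way distance sound covers in `delay_seconds` of seawater."""
    return SPEED_OF_SOUND_SEAWATER * delay_seconds

# If a reply comes back 4 s after a call, and the responder answered
# immediately, the round trip bounds the caller at ~3 km; a one-way
# 4 s travel time would put it at ~6 km.
print(distance_from_delay(4.0))  # 6000.0 metres one-way
```

Whether whales actually use timing this way is speculation, but the physics makes the guess at least plausible: echo-ranging by delay is exactly how sonar works.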
Figuring out how to decode the language of other species on earth is the first step to being able to communicate with species off world. I hope the people working on this understand how profoundly important the endeavor is.
You're absolutely right. Even though that sounds like the plot of a sci-fi channel movie, it's the truth. "By learning how to communicate with humpback whales, scientists hope to develop tools that could one day help us recognize and understand messages from extraterrestrial beings." [1] nautil.us/how-whales-could-help-us-speak-to-aliens-559443/ [2] www.earth.com/news/ai-helps-humans-have-20-minute-conversation-with-humpback-whale-named-twain/
I honestly believe whales and dolphins have their own language. It may not be as complex as our own, but enough to keep in contact with each other. I also believe lots of animals communicate with each other, even if it's just basic stuff. They have more going on up top than we give them credit for.
Scientists translated the language of my cat with supercomputers after 5 years of research and a cost of about $300,000! Now the sensational result: Meow.
AI getting the spelling of strawberry wrong is actually a consequence of the "biology" of the AI: the models are not trained directly on the letters of the texts they read; the letters are first grouped into "tokens" (frequently occurring groups of symbols) before the AI gets to see and learn from them. These spelling problems won't last long, because the cause is well known and strategies can be built into the AI to give it perceptual access to individual letters. Other problems of inaccuracy, as well as the problem that AI "doesn't know when it doesn't know something" and therefore hallucinates, are harder to fix.
Orcas are dolphins.
When the Whales start saying "stop dumping that crap in my house" - you know its been cracked.
We think it said something along the lines of...
And also: stop making sooooo much noise!!!
About 15 years ago, I went for a surf and when I got to the beach, there was a huge crowd. Thankfully, it wasn't a shark attack or a drowning, but a large whale (humpback or sperm whale?) had been in the area for a few hours, staying mostly in the same spot. After taking photos for half an hour, I went in the water. I was probably about 70 metres away from the whale and it was obvious it was checking out surfers. Twice, I heard guys ask if the whale might attack. I just thought that if it wanted to do that, it would've done so by now. No, it was really grokking on our species.
Knowing that they're very intelligent and have excellent hearing, I slapped the water as hard as I could. Once. The whale then hit the water with a fin, once. I then hit the water twice, leaving a second or so in between slaps. The whale then slapped its fin twice. Yeah, I then hit the water three times and the whale responded by slapping its fin three times. No further submissions, Your Honour, but it sure as hell seemed like this beautiful animal wanted to communicate. After that, a man let his teenage daughter approach a bit closer. The whale stayed there long enough for me to shoot more photos for half an hour afterwards. Small surf, but one of the best days in my nearly 60 years of surfing.
With the whole UFO-becoming-UAP thing and the subsequent introduction of NHI (non-human intelligence) into our collective vocabulary, I hope people start realizing our fellow lifeforms on this planet are much smarter and more sentient than we realize.
That is so interesting and cool.
Thanks for sharing, that water pat thing was a good idea. It’s cool to think they understand that we are also intelligent
@@Samos900 Years earlier (in the 70s) I was surfing when an orca and calf were swimming near a river mouth. In those days, we didn't have much to do with "killer whales," and besides, they were probably about 300 metres away. But over the decades, through the internet and TH-cam, I learnt how intelligent these magnificent creatures are. Well, I guess it's a bit like all creatures, really: they're intelligent, and many species are empathetic enough to understand that we have a modicum of intelligence, even if we don't show it by being more considerate of them.
@@amateurmakingmistakes that’s incredible. I absolutely agree. Many animals are more intelligent than they are given credit for. I’ve noticed that most mammals have a lot in common when it comes to empathy. I think we as humans have that empathetic relation with other mammals as opposed to other groups like reptiles.
I suppose it’s easy for their intelligence to go unnoticed. We really seem to take note of it when animals play around with their environment to have some fun. While they do play, most of what animals do in their daily lives consists of a routine centered around survival.
This contrasts with dogs, who have been bred to do chores for humans. They are therefore less concerned with survival and can allocate their intelligence to other things. They focus on human bonding and having fun, as opposed to worrying about hunting and making preparations for winter. Although there may still be some traces of these instincts, like howling.
Two ravens led me to a freshly train-killed moose. After I opened it up and cut off its hindquarters, the ravens could feast on the guts. Birds can communicate with people if you bother to learn their lingo.
I think most animals can, but they have to want to and the person has to be willing to listen, not with their ears but with their whole being.
You're spending way too much time in the forest 😅
@@GunshinzeroThe antithesis to “touch grass”
Very cool! Ravens have been cooperatively hunting with wolves like this for a very long time, and since wolves and humans occupy (historically) the same ecological niche in terms of prey and hunting techniques, it makes sense an intelligent corvid like a raven could quickly make the jump to hunting cooperatively with humans. Neat observation!
The dolphins are saying, 'So long and thanks for all the fish' 😄 Another excellent and thought provoking video, enjoyed it!
Not the first time dolphins have been taught English
It's a reference to Hitchhiker's Guide to the Galaxy (or Universe... can't remember right now)
@@njay4361 Thank you! I couldn't remember where the line came from and It was starting to really bug me cos it was so familiar!
(It's 'Galaxy' btw, Hitchhiker's Guide to the Galaxy, written by Douglas Adams)
Check my name ;)
At the current rate it could be any day now. Pretty sure a guy I know is actually Slartibartfast, and the Vogons are very definitely in control.
I read the UC Davis paper when it came out. I'm equal parts amused, fascinated, and *mortified* that we successfully carried out a conversation with an arguably sapient species... and the content amounted to essentially a *crank call,* with the humpback ending the convo by "giving us the flipper." 💀
Honestly not that surprising though when you consider humans 😂
@@KPassionate Gracie is pregnant.
Closer to us than we realise 😂
@@DissociatedWomenIncorporatedComment of the day
What's surprising is we hadn't figured that out 150 yrs ago.
If the only reason we created AI was to be able to talk to other animals, then it was worth it, great stuff )))
If whales have a spoken language, I hope someone teaches them to defend themselves as a group against whalers.
Good luck with that.
A pod of Orcas in Australia teamed up with the local whalers to hunt humpbacks for nearly a century until the 1950s. The Orcas located the humpbacks, alerted the whalers to their presence if necessary, led them to the whales (sometimes even dragging the boats) and helped with the hunt. The Orcas got to eat tasty humpback lips, tongues, and genitals, and left everything else for the whalers. Win-win. It is believed that this may have been a centuries, if not millennia, old shared tradition between Orcas and humans in the area.
Imagine what humpbacks cruising up and down the east coast of Australia had to talk about.
And big cargo ships
Lmao tf are they going to do grow laser beams?
I'm their atty
@@iggymcgeek730 Yes. They will also grow larger flippers to fly into the air and sink ships.
The analogy of Freddie working the crowd with his interactions was spot on. A classic media misrepresentation of the actual experiment. Very interesting research and a well-presented video. Completely cutting-edge stuff... I can't wait to hear what we can understand of whales and their communication. Eventually, we may find out that whales are smarter than... most of us.
I can really appreciate how a clip of Freddy was woven into a video about whales.
I love it.
He’s just a little silhouette of a man! 😂
@ 😆
Maybe the whales are singing "scaramouche scaramouche!"
@@KPassionate
"Scaramouche, Scaramouche, will you do the Fandango?"
The thought of being able to communicate with cetaceans is magnifico to me. 🐋
5:05 It's like we said "Hi" and the other whale said "hi" back and we just kept saying "hi" to each other 😂
Ikr
I think they were probably saying "wtf‽" lol
It's like talking to a baby who says their first word. We were saying our first word for the whales and they were so kind to just keep saying "hi" back over and over, just like we do for babies who do the same thing. My son's first word was "hi", and we had "Hi!" "Hi!" "Hi." "Hi!" conversations all day for quite some time. Good times.
The interaction with Twain seems like seeing a cat, and meowing back at it every time it does.
Exactly! The linguist I cited compared it to recording a barking dog and playing it back to them to see if it elicits a bark.
Dear KP: I'm 71. I've been talking to animals since the 1950s and plants since the sixties (The Secret Life of Plants) and guess what? I discovered, very slowly (there were a few "alt" resources), that many plants and animals are telempathic. This is not a typo. Telempathy is the ability to receive and send images, intent and feelings (NOT words; plants and animals do not "talk" in the way humans do - it's a fine line between some animal languages and actual words spoken) and now YT features doggos and cats imitating basic human speech AND they understand the words and phrases. Amongst the species designations, individual plants and animals communicate in ways not reflective of a species or breed.
There are some species - honeybees, for example - that easily perceive human feelings and intent, and some humans can easily read a hive mood; yes, you know when they are pissed off or being friendly. Cetaceans are fully "sentient" - they are pretty much all telempathic and do communicate in ways that a few humans understand.
"A.I." is only as effective as the persons who write the programs - it takes a genius who IS NOT an ego-driven fool to program computers that do NOT reflect the "human mind", which doesn't allow A.I. to expand. You are on the right track... huzzah!
If you KNOW that you can communicate with some critters and plants it's a lot easier to actually do so. Seeing your interactions (muted) with cetaceans, it is obvious to me that you easily communicate with the critters, not in words but in mood and intent. Animal intelligence far exceeds what humans recognize - it's a different form of communication than what humans are used to, hence time to call the "scientists". Give me a break. Perhaps you are unaware of your skills in communication. If you love the animals and plants, most will know and respond in their own way that human egos do not comprehend. "A.I." will expand exponentially when humans stop forcing the human mind into the programs.
As long as a person refuses to come down off the pedestal of "sentience" there is zero communication with the telempaths. In 1986 I went body-surfing in Manhattan Beach (L.A.) and in a few minutes I was surrounded by a pod of Pacific porpoises (the ones with the stripe) and it was clear that they were laughing, not at me, but in amusement at my behavior. Many critters just want to play and "science' just doesn't get it. Talk openly to animals and plants without regard to science or superiority.
Very well put in to words. Thank you for that.
🧡⚓️
They have had to learn very hard lessons about interpreting human intentions, separating the curious and awkward from the truly lethal. These porpoises had pegged you quickly as belonging to the former (majority) category. If they had perceived a threat they would've either pummeled you into unconsciousness or made a hasty retreat, more likely the latter.
👏❤
Very well said, dear human being. Humans can do various languages. We know very well the language of Love and Benevolence. How so? We know it so well that we have used instruments, music, to allow this intent to shine forth!!! And it does so breathtakingly!!! (For example, Robert Haig Coxon, with the Kryon music. One example among millions...) There's a very profound concept there. Animals and plants do react to music.
I suspect what most people want is full-on dialogue with most animals. Alas, they do not have the vocal cords. Sign language was taught to Koko and she was very good at it, but other animals do not function with that. As you wisely said, intent is all. Why use words, when mental pictures and, ultimately, intent are FAR quicker!!
Still, it is a particularity of our species, this manner of language. The one I'm typing now 😂.
I knew what you are referring to when I was 4 years old. I was in wonder and awe as I stepped cautiously under the umbrella of a beautiful blooming lavender wisteria that was fashioned into a tree canopy... the whole thing alive and buzzing with hundreds of honey bees. I projected to the bees my intent to observe, meaning no harm or disturbance. They noticed me and even considered defending, but immediately went on about their business without any threat. I only had a few seconds before a soon-to-be stepfather, a horrible, dangerous, and abusive man, yelled for me to get out of there (from his own ignorant fear), and when I yelled that I was fine he reached into the canopy and yanked me out of my glorious, peaceful experience. I suffered years of attempts on my life and incredible abuse, and I will never forget, as he dragged me away, muttering to deaf ears, "No... you don't understand... they're my friends". To this day, now that I am 60 years old, bees have even landed on me, but I have never once been stung.
As an AI developer (for a small company; I don't work for any of the leading ones), I believe there is a good chance of building an AI that can translate animal vocalizations, if they actually form a language, even one extremely different from ours. The path towards that could look like this:
- We first need to figure out the words/syllables/alphabet (so we can tokenize the inputs); not the meaning, just the individual sounds. That seems to be what they are doing right now.
- Once we have words, we can train either an autoencoder or an autoregressive next-word predictor with a bottleneck in the middle. That on its own won't give us a translation (it may give us a chatbot that only whales understand), but if successfully trained, we can take the activations of the artificial neurons in that bottleneck and use them as an embedding (this turns each word into a many-dimensional vector; when we do that for human languages, individual values of that vector can represent things like meaning, i.e. "dog" and "cat" will share quite a few values with each other and with "animal"). We won't know what each dimension of that embedding means.
- Once we can tokenize and embed, we can train an LLM. If we just do that, we would end up with a better chatbot that only whales understand and that only knows what whales know. (We could still do that and let it talk with whales to try to build a bigger dataset.)
- But we can try something else: instead of training a whale LLM from scratch, we add two extra dimensions to the whale embedding and make sure its total number of dimensions matches the embeddings we generate for humans. Those two extra dimensions would contain a 1.0 in exactly one of them, depending on whether the token is whale or human language. Then we take a pre-trained LLM for humans with those two dimensions added (or two pre-existing but unused dimensions, repurposed) and fine-tune / continue training on both the whale dataset and the human dataset, alternating samples of each in every batch.
The last part is really tricky, but even if the meanings of the individual dimensions (except the two we introduced by hand) don't match at all, forcing a transformer to understand both languages under the same architecture is likely to lead to some unified internal representations at the outputs of the intermediate layers (backpropagation and gradient descent would, in some sense, try to reuse what is there from the human pre-training, and interleaving human data would prevent drifting too far from it). The transformer architecture used in modern chatbots is particularly apt for this: the attention operators act as a kind of progressive refinement through algebraic operations between token embeddings (which can act as logical reasoning), and each layer's MLP can look up extra meanings related to the input or perform nonlinear operations. Since the MLPs' outputs are no larger than the embedding space, the network is forced to compress its reasoning for the next layers into a single embedding vector. That is likely to force some kind of internal translation for at least some of what was learned.
It is unlikely to work perfectly, but I'd expect that if you train such a chatbot and just ask "What does this mean in English?" followed by some whale language, it would answer something at least vaguely related to what the whale intended, at least some of the time. If that works even to some extent, it could be a stepping stone to iterate upon. There is likely an optimal topology and training procedure that would maximize the reuse of features learned from human languages.
Once we have even rudimentary communication, that could bootstrap the building of more comprehensive datasets. We could just ask the whales to ELI5 when we don't understand, lol. They may even be interested in helping us understand: something like, we show them something and they tell us the whale word for it.
I hope someone does that and that it works, it would be about the closest thing to meeting intelligent aliens and good practice for when that may happen too.
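The "two extra language-flag dimensions" step above is just bookkeeping on vectors, and can be sketched in a few lines. Everything here (vocabularies, dimension count, the random embeddings) is invented for illustration; this shows only the shared-space mechanics, not any training:

```python
import random

random.seed(0)
EMB_DIM = 6  # toy size; real models use hundreds or thousands of dimensions

def toy_embedding(dim=EMB_DIM):
    """Stand-in for a learned embedding: just a random vector."""
    return [random.uniform(-1, 1) for _ in range(dim)]

# Invented vocabularies purely for illustration.
human_vocab = {w: toy_embedding() for w in ["hello", "fish", "ocean"]}
whale_vocab = {w: toy_embedding() for w in ["upsweep", "moan", "click"]}

def tag_language(vec, is_whale):
    """Append the two language-flag dimensions the comment describes:
    [1, 0] marks a human token, [0, 1] marks a whale token."""
    return vec + ([0.0, 1.0] if is_whale else [1.0, 0.0])

human_seq = [tag_language(human_vocab[w], False) for w in ["hello", "ocean"]]
whale_seq = [tag_language(whale_vocab[w], True) for w in ["upsweep", "click"]]

# Both sequences now live in the same (EMB_DIM + 2)-dimensional space,
# so a single shared transformer could be fine-tuned on interleaved batches.
assert all(len(v) == EMB_DIM + 2 for v in human_seq + whale_seq)
print(len(human_seq[0]))  # 8
```

The point of the shared space is that gradient descent then has no choice but to fit whale tokens into the same internal representations it already uses for human tokens.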
In English?
@@GoogleAreEnemyCombatants It is not easy to simplify it further without a background in artificial neural networks. I laid out a way to maybe train something like ChatGPT to translate whale into English (or other animals, or aliens, if they do have a language) without ourselves actually knowing any of that language. I am speculating that it could work, but I believe it would take a lot of effort to make it work well.
One good thing is that it likely doesn't take something the size of ChatGPT; chances are something with a similar architecture but around 1/20 the size could work. Still, training such a thing would require significant resources.
@@eruiluvatar236 with enough training, an AI can figure out what sound a whale will make.
How do we go from that to asking what their favorite color is?
@@GoogleAreEnemyCombatants I explained that in the original post. The very simplified version is that we can force the artificial neural network to use an internal shared representation for human and whale languages, teach it both, and ask it to translate. Without pre-existing translation examples it won't do great, but through what I explained it will likely manage some very rough translation. And once we have even rough translations, we can iterate and refine the model and dataset.
It is based on how we trained them to understand human languages: they learned on their own both their internal representations of words, inferring their meaning in the process, and the relationships to other words.
@@eruiluvatar236 So the bot gets its own native language, is then trained on human and whale language, and uses bot language as an intermediary to translate.
But we have never given a bot its language. It creates its own while training.
Never did I expect to see Freddie Mercury in a marine biologist’s channel. Love it and Liked!
It doesn’t matter if this is a success right off the bat, but this is what AI really should be purposed for! Even if the difference engine, large language model, whatever you want to call it (it’s not real AI, yet) gets it wrong 999 times out of 1000, that one time out of 1000 is likely statistically faster than humans would have arrived at the same point. Also, it will explore avenues, in ways we don’t, that will yield unexpected(ly beautiful) results!
So happy to run across this video!! We all know that our animals, while perhaps not possessing the same level of mental function, also exist in an emotional, and perhaps even spiritual, spectrum. We feel this inside. We don’t have concrete evidence, but this gets us even closer; and I’m ALL FOR IT!!!
I WOULD LOVE TO BE ABLE TO SAY “HELLO” TO A WHALE!!! Really, Octopuses/podes/pi are my loves, but I truly love all living beings! ☺️❤️🙏🏼
"Not real AI yet"
That's what you'll forever say, no matter how advanced the AI will get.
Accept it: we're just complex inference engines ourselves.
Btw, I'd love to chat with octopuses and all other animals too.
Most people don't realise that most human languages are nothing like each other unless they have been related within the last few thousand years. The parts of speech we're familiar with can't be expected to appear in innumerable other human languages. An animal language can't be expected to resemble the type used by human language groups spoken by billions, nor the many thousands of languages that work in completely different ways, nor the even larger number that haven't been anyone's natural language for more than a few generations. If most human language families are not at all like each other, you can't expect an animal language, even a fairly large or meaningful one, to resemble a common human one.
Well said! Thank you
Oh, human languages get even wilder than that.
You know the "eSkImOs HaVe 100 WoRdS fOr SnOw" idea? Yeah, no. They have _4 words_ for snow. The thing is, the Inuit languages are polysynthetic, which means the language doesn't have "words" or "sentences" so much as 50-syllable pile-ups of what we'd consider "words" in English¹, all glued together into a single whole that _looks like_ a word to an English speaker¹ but functions like a phrase or even a sentence.
So, the Inuit don't have 100 words, they have 4 words that they can glue dozens of adjectives on.
Then there are things like word order and ergativity. Both change how you put together your thoughts. The 2 most common word orders are Subject-Object-Verb and Subject-Verb-Object, with SOV being slightly more common. And ergativity is about which is considered the "normal" or "default" part of a sentence: the Subject or the Object. Most languages leave the Subject unmarked, meaning the doer of the action gets the attention. And with intransitive verbs, where the doer and the recipient of the action are the same, the one noun is treated as a Subject, as a doer. Example: "I sleep." I am the one doing the sleeping.
But in languages that follow the Ergative-Absolutive paradigm, it's the _thing being acted on_ that is unmarked and considered the "default," and the noun that goes with intransitive verbs is treated as the direct object. So the nearest equivalent to my example, under the ergative-absolutive paradigm, would be "sleep me," i.e. the act of being asleep is _happening to me._ That's a fundamentally different way of thinking about the world.
Then there's Evidentiality, a _required_ marking on verbs that some languages have, indicating whether the speaker is talking about something they saw firsthand, heard from someone who saw it firsthand, heard through hearsay, or are only guessing about.
So, yes, human languages contain a wide range of variations that encode very different ways of thinking about the world.
[¹You can replace "English" with any other language in the Indo-European language family. Or with Chinese. Or Thai. Or most of the languages of Africa. Polysynthetic languages outside of the Americas are _rare._]
@John_Weiss and you also made the case for what they all have in common. This can be used to attempt to identify structure in whale concept formation, regardless of the "lingo" unique to their pod.
Languages are the same across the board. It's just a way to communicate using complex sounds. It's the same for birds, mice or any other animal.
I know the alarm sound the magpie makes when my cat is outside. I know when my dog whimpers, it's in pain. I know my cat is content when i pet her because of her purrs. When a baby cries, it's hungry.
Animals don't think in words, but in concepts and mental images.
When my dog remembers her buried bone outside, she sees a mental picture and reacts on it. Never does she think of the word "Bone". But she can be taught that certain sounds belong to certain objects. When I say Bone, she rushes outside and checks on it.
We are all limited to the sounds we are able to make. The more sounds, the bigger the potential for more vocabulary.
@@noctisilva6457 so you think we’re plants?
I would argue that music DOES contain "information".
I agree with the criticism of describing complex non-human communication systems with an anthropocentric focus.
Fascinating video! Thanks for the insights.
I’m glad you liked it! Thanks for watching
Very well said
I believe they communicate with puns
I would say it is far better at communicating emotion than data. Try having someone read you today's headlines using only a flute.
Thank you for this. The reports of "scientists had a 20 minute conversation with a whale" have bothered me because it's not a conversation if you don't know what you're saying (it's okay to not know what /they/ are saying because that's how both of you learn what the other is saying). So knowing that it was repeated contact calls that were returned, with matching intervals, makes me believe that we were basically saying, "hey. Hey. heeey. hEeeeey." and the whale was responding in kind.
It makes me think that some researchers should synthesize their own contact calls and play them each time they approach the pods they are studying. (specifically contact calls for the BOAT) We won't know what the "name" /means/ (if anything) but it would be very interesting to see if the whales start reacting to that particular contact call.
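Synthesizing a playable contact call is cheap to prototype. A minimal sketch using only Python's standard library; the frequency range, duration, and filename are invented placeholders, not a real humpback contact call:

```python
import math
import struct
import wave

SAMPLE_RATE = 16000  # Hz; toy value, not a field-recording rate
DURATION = 1.5       # seconds
F_START, F_END = 200.0, 800.0  # an invented "upsweep"; real calls differ

def upsweep_samples():
    """Generate a linear frequency sweep as 16-bit PCM sample values."""
    n = int(SAMPLE_RATE * DURATION)
    samples = []
    phase = 0.0
    for i in range(n):
        t = i / SAMPLE_RATE
        # Frequency rises linearly from F_START to F_END over the call.
        freq = F_START + (F_END - F_START) * t / DURATION
        phase += 2 * math.pi * freq / SAMPLE_RATE
        samples.append(int(32767 * 0.5 * math.sin(phase)))
    return samples

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

write_wav("synthetic_contact_call.wav", upsweep_samples())
```

A researcher would of course use recorded or carefully modeled call structure rather than a plain sine sweep, but the playback side really is this simple.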
I agree with everything here. Some linguists and biologists have compared the "20 minute" conversation to playing a recording of a wolf howl to another wolf in order to elicit a howl. I do think it was more complex than that, because of the timing and interval matching, but it wasn't a conversation as it has been portrayed. It would be interesting to generate a unique contact call and see how a whale would interact with it... that is an intriguing thought!
Yep, the press portrayed this as a full on Dr Doolittle moment when it's clearly not!
Always love it when someone provides their sources!
You know I won’t be publishing anything without my sources!!!
I can’t help but think of when the Predator called out “help” in the jungle to lure the humans.
Fantastic video, packed with info and I love how you explain everything beyond just "they had a conversation." I subscribed.
I’m glad you liked it!!! Welcome in
Scientists: Marko
Whale: Polo
Scientist: Marko
Whale: Polo
AI learns dolphin slurs
Now we need a dolphin dictionary
Muh Porpoises!
Dolphins are truly evil animals. I'm scared to hear what these monsters are saying
Thank you KP. Always great information and I trust your findings. Happy Thanksgiving! 🐬🐬🐬
Thanks for watching! Happy Thanksgiving to you, too!
A Dolphin makes a sound.
AI: "Humans, leave Earth or die."
You are a mother; you see a weird metal structure about the size of a short human, with even smaller creatures barking at each other on top of it while looking and pointing at you. The machine plays a female voice saying "come~". You can clearly tell it's not directed at you, almost like the voice is calling back her child, but curious, you yell back "come?" as you swim around it, confused.
I'd like to imagine that's what the mother whale experienced.
Honestly, the way it's described, that every whale is slightly different, reminds me of how my family can usually recognize my dog's bark compared to the neighborhood dogs, even with one or two of them being very close to our dog's breed. (Granted, I'm not sure how well someone could distinguish barks if they had 4-12 dogs, like with whale pods... but if all you had was barking, it would probably come naturally.)
This is sooooo much better than forcing whales into tanks and making them do tricks for their food. I'm glad to see humans being much more respectful towards whales and dolphins!
I have played my violin for herds of cows on a mountainside, deer in a snow-covered field at night, a rooster in a barnyard... They definitely are interested, and respond in surprising ways. I would love to interact with another species by sketching pictures. There are lots of faster artists, but as a cartoonist I can work pretty fast. There are other people who don't think of themselves as artists but are able to render VERY FAST and recognizable "iconic" sketches of objects. => Some people have that ability to reduce things to simple icons that are still recognizable depictions of a more complex object.
Could a Dolphin enjoy sketching underwater, like the Elephant who has learned to paint with brushes and a canvas on an Easel?
Seems a way of starting to work with an intelligent species, Have the objects available, the sketcher makes a drawing and the dolphin should be able to see the relationship between the drawing and the object. Build from there, sequences of eating, fighting, catching fish, sorting objects, assembling objects from parts. et cetera.
Come to think of it, There are violins made from injection molded plastic that could be played underwater... Or cast aluminum fiddles... hmmmmm.
You're so right about AI. A radio station where I live switched to using AI for its weather forecasts, which immediately became wildly inaccurate.
ChatGPT told me that sharks are marine mammals 😅
You probably didn't know that whales have been in telepathic contact with some humans for a very long time, going back to decades-old research. It never gets talked about anymore, or it gets swept under the rug, but ask around the old old-timers in the marine industry and they might tell you the story...
As far as I recall, this was about 30-ish years ago, during the early stages of this communication, by one particular fellow. He was finding the work a little hard to believe (both the communication itself and some things that were said about sound in physics terms). The whale told him "we can prove this" and told him to go to such-and-such a coastline at a specific time. He was doubtful any whale would show up for him at all, but he and several witnesses and friends saw one of the most marvelous interactions between whales and humans to date, bar none.
Not just one, but something like 500 all showed up at the same time, some originally tagged from hundreds of miles away.
They all popped their heads up at the same time; there was some kind of synchronized display of some sort to prove they could all speak telepathically, and then they all just left, going back to where they came from. I suspect there were tears and dropped jaws for some weeks after among the small handful that witnessed it.
This is the kind of stuff I dream of ai making possible. Imagine if we could even basically understand what other creatures are "saying".
No amount of AI will EVER achieve that, because animals do not have a true language system. While AI can help us better analyze animal vocalizations, it's important to remember that these are not structured languages with grammar or syntax. We mostly understand what terrestrial animals are communicating through their calls: basic messages like warning signals, mating calls, or expressions of distress. However, when it comes to marine animals, particularly deep-sea species, we know much less about their vocalizations, largely due to the challenges of studying them in their natural environment. The ocean’s vastness and depth make it difficult to gather and interpret the sounds they produce.
@@jorgen7180 Yes I also believe we will never decipher animal languages per se but I'm hopeful ai will be able to find patterns etc. so that we can understand and communicate in broad, general terms at least.
Right…the same AI that thinks the founding fathers of the USA were all black women. What can go wrong 😑
@@jorgen7180 How do we know that they do not have a “true” linguistic system, when we only “mostly understand what terrestrial animals” are saying, and know “much less” about marine mammal communication? I think the only thing we know for sure is that human competence is consistently overestimated, otherwise we wouldn’t be destroying the planet we share with creatures who are not.
@@slartibartfast7921 And yet, here you are, doing something that no other animal can even comprehend
and this is why i'm studying to be a marine biologist as well - as someone who is becoming fluent in American Sign Language myself, i've had a bit of a perspective change on the definitions of language and how we could connect and understand the wildlife's words, how their perspectives of language are entirely different and yet fundamentally the same despite our differences - it fascinates me. Maybe one day I too could study the language of dolphins and whales, as I used to go whale watching so much growing up. It would be so wonderful to learn their language, to communicate with these ancient wonderful beasts.
The idea that AI and machine learning are all bundled up into essentially ChatGPT is misleading. LLMs are considerably inaccurate and do not approach reasoning, while models designed for specific domains of tasks come much closer to 100% error-free for the given task. Needless to say, a conversation in an alien language is hard because there is no decodable feedback mechanism.
Thanks for the information! As a marine biologist, AI is admittedly outside my area of expertise but I am interested in learning more.
I have had the privilege of growing up right on Puget Sound. Spending my days and nights on the beach, in the water, and on the water gave me a special seat at "the table" (NOT a foodie so...not talking about fishing, clamming, etc) of Mother Nature. Just found your channel and I'm hooked. I love your smart sense of humor, pleasant easy-going way, and all the info. You and your husband/wife/partner and kids must have a lot fun at home! LOL. I can only imagine.
Thanks for these videos. Absolutely fantastic and fun. I'm a new subscriber. Keep 'em coming please
Welcome in! I live in Tacoma WA! Love the vibes ❤️
ChatGPT tried to tell me the word "like" is and has always been spelled "lik", and while arguing about this it used the word a dozen times and spelled it correctly each time 😂
Ha! Incredible
I worry for people who want to give up their reasoning abilities for "AI" - it will make us all dumber if we let it. But more than that, who is actually benefiting from this? Clearly it's big tech, who want to make us dependent upon it; then they can bill us for the privilege of not having to think. BTW I work as a programmer in IT.
@psaunder1975 I use it to learn and keep my brain active. I have real conversations or play brain games like alphabet/country (take turns alphabetically naming random topics like bands or countries, etc.) or analyze photos of rocks I find. Google has already been dumbing down the younger generations that were raised with it for research. I have always had the attitude that googling something that you already know but have forgotten is cheating yourself, a weakness; it is a last resort for when you can't recall it at all. If you're genuinely interested in learning something new, then it is essentially the same as what I did as a child: look it up in the encyclopedias.
It's like my early morning whale communication.
Thank you KP. You said what I thought. If language between humans and any animal were to happen, my opinion is that the animals would run away screaming. Our species has been awful to every other one on this earth. Especially Cetaceans.
Maybe they don’t know? I mean, not even every human does, and we have access to the internet, libraries, TV, etc. There are people who, when asked to name three countries outside North America, say Europe, Africa, and Mexico. And we are supposed to be an intelligent species.
My dog understands me. We communicate fine. I understand about 3/4 of her vocalizations. She still digs in the backyard though.
I think the issue with analyzing intelligence in other species is that people tend to measure intelligence by human standards. We're just one example of intelligence, and by limiting every other possibility to our own definition, we're missing out on so much empathy and connection between species.
I couldn't agree more!
If intelligence is the ability to figure things out, especially in one’s environment, then it is a useful measure for any species
8:45 Did you actually expect the audience to think that an incomprehensible amount of machine learning isn't AT BARE MINIMUM worth the trade for the emissions of a SINGLE car driving 2 million miles?????? LOL! Hell, the multiple 2 million MILE Toyotas that exist were worth those emissions simply as a source of human pride in mechanical ingenuity.
Carbon emissions are just one of many concerns with AI that have been raised by people who know far more than I do about machine learning; data bias leading to biased outputs, lack of data diversity, poor data quality, overfitting to training data, etc.
[1] news.harvard.edu/gazette/story/2020/10/ethical-concerns-mount-as-ai-takes-bigger-decision-making-role/
Additionally, AI systems evaporate several liters of water per kilowatt-hour of energy used for cooling in data centers. This means a significant amount of water is used to keep AI running, particularly in large-scale data centers. Which is arguably a bigger cause of concern and one I should have addressed.
[2] cee.illinois.edu/news/AIs-Challenging-Waters
[3] oecd.ai/en/wonk/how-much-water-does-ai-consume
Discussing these concerns openly and honestly is all I advocated for if we are going to use this exciting new technology for scientific and conservation purposes.
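To put the water figure in rough perspective, here is a back-of-the-envelope sketch of the unit conversion; both input numbers are hypothetical placeholders, not measurements from any real data center.

```python
# Back-of-the-envelope water estimate; both inputs are hypothetical.
energy_mwh = 1_000          # assumed energy for a large training run (MWh)
liters_per_kwh = 2.0        # assumed evaporative cooling rate (L/kWh)

liters = energy_mwh * 1_000 * liters_per_kwh  # MWh -> kWh, then L
print(f"{liters:,.0f} L")   # 2,000,000 L for this hypothetical run
```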
Ma'am thank you so much for covering and exposing this research to the world in a highly comprehensive video
Well, at least when your chat programs ask the whales, "Can I mamu duck-face to flee banana patch?" the cetaceans can still get another good laugh at human ignorance.
They will deduce that humans are not sentient if we just play chatbot sounds at them
Nice to hear that AI has been used to analyse sounds from whales instead of just creating bullsh*t.
Great video! And thanks for all the references to the papers :) Just a comment on the AI things, sure developing a model like GPT3 consumes a lot, and sure, language models are not super precise. BUT, machine learning is a biiig field itself. You can train small models in your computer in minutes processing even a couple of GB of data and there are models for doing things like clustering (for example, to identify different sounds) which can work with a veery high accuracy for many problems. Just taking LLMs as reference is an unfair comparison!
Thanks for this! AI and machine learning is outside my wheelhouse but I'm interested in learning more.
Nope. Not everyone can train a model on their computer. That's a first world thing.
Training a simple clustering model with a few thousand examples takes hours on my computer and the result is, obviously, pretty much useless.
If it wasn't for Colab, I would still be hiding away, depressed and mourning for the impossibility of the common man to do ML development.
(But I'm still depressed because I still can't pretrain a SOTA LLM from scratch :p)
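For anyone curious, the kind of small clustering model mentioned a few comments up can be sketched in plain NumPy: k-means on synthetic stand-in "sound features" (real pipelines would cluster spectrogram features; the cluster centers and sizes here are made up). This runs in well under a second on an ordinary laptop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for acoustic features: three well-separated clusters.
data = np.concatenate([
    rng.normal(loc=c, scale=0.2, size=(200, 2))
    for c in ([0, 0], [3, 3], [0, 3])
])

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centroid, recompute."""
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([
            points[labels == i].mean(axis=0) if np.any(labels == i) else centroids[i]
            for i in range(k)
        ])
    return labels, centroids

labels, centroids = kmeans(data, k=3)
print(sorted(set(labels.tolist())))  # the recovered sound categories
```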
I like the idea that our first communication with Whales is the equivalent of "heyyyyyyyyyyyy!" And they're like "heyyyyyyyyyyy!"
Thank you, KP, a fascinating video as ever and, as ever, I have learned a lot. Why, though, do I fear for cetaceans should we be able to establish some form of interaction in the future? Interacting with humans, other than experts, seldom ends well for wild animals. Sorry to be a wet blanket.
Would be totally amazing though to be able to tell a whale "Careful, shallow water ahead, don’t get closer" and the whale answering "thanks mate, bit confused from all this loud noise, you know what’s up with that?".
Maybe this is going to be like the movie Arrival. Learning to understand their speech, and maybe how they view the world, will change how we humans think or view the world in some drastic way.
Orca: "Hey Man, everything is warming up fast!" Human: "We know. It's our fault. We're trying to fix it, but a lot of people don't want to spend the money to do it." Orca: "What's a money?"
Finally somebody says "Let's dive right in" and then really does it.
Clearly, you are a well trained scientist and practitioner. I truly appreciate your perspective on these subjects which are fascinating.
As an AI developer and former street gangsta thug of Harrow London (for a small company, I don't work for any of the leading ones) I believe that there is a good chance of being able to build an AI that can translate animal vocalizations if they actually form a language even if it is extremely different from ours. The path towards that could look like this:
- We first need to figure out the words/syllables/alphabet (so we can tokenize the inputs), not the meaning, just the individual sounds. And that seems like what they are doing right now.
- Once we have words, we can either train an autoencoder or an autoregressive next-word predictor with a bottleneck in the middle. That on its own won't give us a translation (it may give us a chatbot that only whales understand), but if successfully trained, we can take the activations of the artificial neurons in that bottleneck and use those as an embedding (this turns each word into a many-dimensional vector; when we do that for human languages, individual numbers of that vector can represent things like meaning, i.e. "dog" and "cat" will share quite a few values with each other and with "animal"). We won't know what each dimension of that embedding means.
- Once we can tokenize and embed, we can train an LLM. If we just do that, we would end up with a better chatbot that only whales understand and that only knows what whales know. (We could do that and let it talk with whales to try to get a bigger dataset.)
- But we can try something else: instead of training a whale LLM from scratch, we add another two dimensions to the whale embedding and make sure that the number of dimensions of the embedding we have generated matches the ones we generate for humans. Those two extra dimensions would contain a 1.0 in only one of them, depending on whether it is whale or human language. Then we take a pre-trained LLM for humans with those two dimensions added (or two pre-existing but unused or barely used ones, repurposed) and fine-tune / continue training with both the whale dataset and the human dataset, alternating samples of each in each batch.
The last part is really tricky, but even if the meaning of each dimension except the two we introduced by hand won't match at all, forcing a transformer to understand both under the same architecture is likely to lead to some unified internal representations at the outputs of the intermediate layers (backpropagation and gradient descent would, in some sense, try to reuse what is there from the human pre-training; interleaving with human data would prevent progressively deviating from that too much). The transformer architecture used in modern chatbots would be particularly apt for this, as it acts as a kind of progressive refining through algebraic operations between the embeddings of the tokens (which can act as logical reasoning) in the attention operators, and each layer's MLP can look up extra meanings related to the input or do some non-linear operations; the fact that the MLPs don't have outputs larger than the embedding space forces the network to compress the reasoning for the next layers into a single embedding vector. That is likely to force some kind of internal translation for at least some of what was learned.
It is unlikely to work perfectly, but I'd expect that if you train such a chatbot and just ask "What does this mean in English?" followed by some whale language, it would likely answer something vaguely related to what the whale intended, at least some of the time. If it works even to some extent, that could be a stepping stone to iterate upon. There is likely an optimized topology and training procedure that would maximize the reuse of features learned from human languages.
Once we have even rudimentary communication, that could bootstrap the building of more comprehensive datasets. We could just ask the whales to ELI5 when we don't understand lol. They may even be interested in helping us understand, something like we show them something = they tell us the whale word for that.
I hope someone does that and that it works, it would be about the closest thing to meeting intelligent aliens and good practice for when that may happen too.
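The two extra "which language is this" dimensions described above can be sketched very simply; the embedding width, the flag convention, and the random vectors here are all hypothetical illustration, not a working translator.

```python
import numpy as np

EMB_DIM = 8  # hypothetical embedding width shared by both vocabularies

def add_language_flags(embeddings, is_whale):
    """Append two indicator dimensions: [1, 0] for whale, [0, 1] for human."""
    n = embeddings.shape[0]
    flags = np.tile([1.0, 0.0] if is_whale else [0.0, 1.0], (n, 1))
    return np.concatenate([embeddings, flags], axis=1)

whale_emb = add_language_flags(np.random.randn(5, EMB_DIM), is_whale=True)
human_emb = add_language_flags(np.random.randn(5, EMB_DIM), is_whale=False)

# Both now live in the same (EMB_DIM + 2)-dimensional space, so one
# transformer could be fine-tuned on interleaved batches of each.
batch = np.concatenate([whale_emb, human_emb], axis=0)
print(batch.shape)  # (10, 10)
```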
Figuring out what dolphins and whales are talking about, on a tropical island, is a good job to have
I think they should start making a catalogue of each suspected "word" in the whale language, and keep all the files together the same way the FBI stores fingerprints, for later reference. They could call the system "Prints of Whales."
I have worked in several fields, at one point I was head of design for an electronics company (got bored with electronics), and I now have a linguistics and psychology heavy degree. Psychology is important in linguistics as "cultural artefacts" as defined by Vygotsky define the semiotic range for words. So if we are looking at a language spoken between whales then ideally we need to figure out what their cultural artefacts are. The flip side of the coin is an idea proposed a while back - that the clicks and whistles used by dolphins etc are used to transfer a sonogram from one individual to another. It's going to take AI far more advanced than we have now to figure that out.
When we consider the millions of dollars drawing us toward more consumption without awareness of the planet, your critique of A.I. at the end was spot on. The potential to amplify bias transcends anything we've done previously. Truly open public understanding of the code steering it in a particular direction is the best hope we have to lessen its destructiveness, but in a capitalistic model that is not remotely possible. There is no way in our current economic/scientific system that the payback for this knowledge can avoid being weaponized and capitalized upon. "A tree should exist only if we can consume its value." If we were to communicate as freely as we do with humans, I don't think we would treat them any better than we treat the First Nations of the US or African Americans on the whole; i.e. the net worth of AAs is 10x less than their dominant counterparts, and we blame AAs for it.
@KPassionate Your channel is fantastic, thank you so much for taking the time to educate the world on marine biology subject matter. Happy 2025, wishing you much success and prosperity.
I’m glad you enjoy it!! Thank you 😊
This reminds me of the music scene from Close Encounters of the Third Kind. Whales and dolphins have language, but we are aspect-blind to it because we do not understand their form-of-life. However, AI might actually solve this hemiopia, which would be impressive.
Google AI now understands where 1919 is located in a calendar.
7:39 it's also possible that their sounds change when they see each other, like Italians or Romans vigorously using their hands to express emotional context 😂
There are so many possibilities, and it often depends on species. Something I left out is that humpback whale songs are relatively stable, but they do change over time... but only west to east.
[1] www.smithsonianmag.com/science-nature/humpback-whale-songs-spread-from-west-to-east-176855840/
There is so much we don't know and it's honestly a fascinating field of study. I wish we knew more!
This is fascinating. There's also a good opportunity to learn about personal biases.
Older versions of ChatGPT can't see individual letters. They see tokens that consist of multiple letters. That's why it couldn't correctly count the number of r's in strawberry.
If you try to shoehorn human expectations and models onto animal languages, you might draw conclusions as wildly incorrect as trying to figure out why ChatGPT couldn't answer such a trivial question.
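The token-vs-letter point is easy to demonstrate with a toy greedy tokenizer; the vocabulary below is invented for illustration and is not any real model's token table.

```python
def toy_tokenize(text, vocab):
    """Greedy longest-match tokenization over a fixed vocabulary."""
    tokens = []
    i = 0
    while i < len(text):
        for size in range(len(text) - i, 0, -1):  # longest match first
            piece = text[i:i + size]
            if piece in vocab or size == 1:       # fall back to single chars
                tokens.append(piece)
                i += size
                break
    return tokens

vocab = {"straw", "berry", "st", "raw"}
tokens = toy_tokenize("strawberry", vocab)
print(tokens)  # ['straw', 'berry']

# The letter 'r' never appears as its own symbol, so a model seeing only
# these tokens must memorize spellings rather than inspect letters.
print(sum(t.count("r") for t in tokens))  # 3
```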
This is fascinating. All those marine animals are sentient. Being able to communicate with them would open a door to a new world. I'm sure they would teach humans about life in the oceans and also about pollution. I am pretty sure the whales can have a spoken language along with telepathy or something similar.
Orcas are my favorite aquatic animals, love the wolves of the sea. Wolves and dogs are my favorite elsewhere.
I also work in IT and hate the hallucinating AI search engine integration. So thank you for touching on that. But it is much more than just that.
Those are fabrications (made up stuff) rather than true hallucinations (sensory training defects). Many people in the ML community use incorrect words.
20 minutes of the funniest humpback fart jokes that whale had heard in a while.
Nice description of cetacean sounds (vocalizations?). I tend to disagree with the notion that music is not the same as communication. It is true that music is different from verbal communication - but verbal communication includes more than mere words. In addition to the words in a verbal exchange, there are several other aspects that are also communicated. Just think about speech for a minute. If we hear someone speak, we can usually figure out if they are male or female, child or adult, stressed or relaxed, tired or active, and other parameters. If we are familiar with the person speaking, we can even figure out other things, such as deceit or truth, comfort level, paying attention (or not), interest or disinterest, ... All of this counts as communication, above and beyond the mere words. Music does this too, without the complications from the words. I suppose it might be possible that cetaceans are singing music, instead of using words, ... maybe. It remains to be seen (or heard).
All very good points! I had originally planned on a longer segment about how whale songs may be more similar to music, but I ended up cutting it for pacing. But I think you explained it better than I would have. Thanks for posting this!
"So long, and thanks for all the laughs!" XD
While assisting my friend as a deckhand on his 60’ steel-hulled American schooner sailboat in Alaska, I suspended myself from the chains of the bowsprit to extend my hand to the surface of the water… a pod of Dall's porpoises responded by porpoising partially out of the water and high-fived me! Every one I could see high-fived me… it was an incredible experience; they knew my heart and I knew theirs. I highly recommend the experience if one ever gets the opportunity.
Thank you for this summary. There are audio recordings of crickets slowed down, and they sound so beautiful. I wonder how the clicks and whistles might sound slowed down, and also in different mediums than water too. Also, that the circling whale repeated not just the sound but also the temporal interval suggests a potential spatial aspect to their communication, doesn't it?
@@rpstoval2328 it does seem like that is the case!
That would seem to make sense with a species that uses echolocation, which must use some kind of delay-sensing.
There's also the possibility that some vocalizations may even mimic an aspect of echolocation return patterns, and the whales could be speaking what amounts to snapshot pictures of how they see using that. So that's another thing to explore. On top of that, communication could be diverse among species. It seems beluga whales also do a thing that sounds a lot like frequency-shift keying, which is how acoustic modems work. Which is interesting, but FSK is also used for some human undersea communications - so maybe the belugas have picked up something that mimics us? Yet until some scientists look more deeply into things like that, there are a couple gaps to be explored.
Maybe? There are two reasons to think that the vocalizations are not aspects of echolocation. First, baleen whales like humpbacks don't have the ability to echolocate; echolocation is exclusive to the toothed whales, like dolphins and sperm whales. Second, the echolocation vocals are distinct from their other vocalizations and easy to recognize. Vocals like sperm whale codas and dolphin whistles are something else entirely.
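The frequency-shift keying mentioned above (two tones standing for binary 0 and 1, as in acoustic modems) is simple enough to sketch; the sample rate, bit length, and tone frequencies below are arbitrary illustrative values, not anything measured from belugas.

```python
import numpy as np

FS = 8000            # sample rate in Hz (illustrative)
BIT_LEN = 400        # samples per bit (0.05 s per symbol)
F0, F1 = 1200, 2200  # tone frequencies for bit 0 / bit 1 (modem-style)

def fsk_modulate(bits):
    """Emit one tone per bit: F0 for a 0, F1 for a 1."""
    t = np.arange(BIT_LEN) / FS
    return np.concatenate([np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def fsk_demodulate(signal):
    """For each bit slot, pick the tone whose correlation is stronger."""
    t = np.arange(BIT_LEN) / FS
    ref0, ref1 = np.sin(2 * np.pi * F0 * t), np.sin(2 * np.pi * F1 * t)
    bits = []
    for i in range(0, len(signal), BIT_LEN):
        chunk = signal[i:i + BIT_LEN]
        bits.append(int(abs(chunk @ ref1) > abs(chunk @ ref0)))
    return bits

msg = [1, 0, 1, 1, 0, 0, 1]
print(fsk_demodulate(fsk_modulate(msg)))  # round-trips back to msg
```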
Awesome! Can’t wait until humanity deciphers completely what they are chatting about. I want in on the convo😍👍🏽
I've watched spectroscopic playback of Orca vocalisations for hours and hours. There is absolutely no doubt that it's communication. There are a few distinct, repeating 'words' in the lower frequencies, overlayed by complicated stuff that almost looks like writing, in the upper frequencies. I think that AI will crack it, if it's approached in the right way. Exciting times!
It's important to note that what we call AI today is not Jarvis, which is why many researchers within the area are trying to change what we call it to "algorithmic problem-solving and machine learning" (I'm doing a master's in information science with a focus on digital environments, and AI has been a big part of the education). So, the "AI" they use will have been made specifically for this project and would be substantially different from ChatGPT, which is built to mimic human communication by (in simplified terms) predicting what the next response would most likely be by analyzing other human communication. This AI (with the presumption that they did their study in a scientific way, which I trust to a reasonable degree since I'm Swedish and this is one of the most prominent colleges we have - that said, I haven't checked the study myself, so don't take my word for it) would work entirely differently. It would be focused on analyzing communication to "learn" the "language", not on creating a conversation by predicting what the most common responses would be. That said, AI is still flimsy (imo), and we won't know if they are successful until they are able to mimic the "language" and have conversations that could be proven to be accurate and to conclude in a shared understanding. And that, my friends, is not happening any time soon 😄🍀✨🐳🐬
When you said "1939", the google assistant on my old phone woke up all of a sudden and started to talk out loud about what happened in 1939, World War II. I was so surprised, I almost jumped out of my skin. I live alone and hearing an AI voice out of nowhere scared me so much, but I don't understand how that happened 😂
Thank you for the video!
4:30 - The timing also suggests that humpback whales have a sort of conscious timing/reckoning.
Sounds like we just pissed off a whale.
Even human language started as small vocalizations.
My take on Whale and Dolphin communication is that the sounds are more like communicating moods that range in intensity. Imagine your mouth was sealed, and you still wanted to let someone know you liked them. How would you do it? You would make a sound that you know they and you would understand means..I like you. Even without actually saying anything. And if their mouth was also sealed, they'd respond with a sound that would make it obvious if they were receptive or not.
The properties that must be satisfied to constitute a language are 1) displacement, 2) productivity, 3) arbitrariness, 4) discreteness, 5) duality, 6) cultural transmission. We have satisfied some of these components in various cetaceans, but not all of them for any single species/ecotype. Bottlenose dolphins satisfy displacement (by using "names" to call dolphins not currently present). Humpbacks and orcas satisfy cultural transmission (e.g. SRKW vs. transient calls are distinct despite the populations occupying the same geographic region), and it is well known and studied. Discreteness is resolved with various call catalogues. We are shifting gears to show productivity, duality, and arbitrariness.
Re AI/ML: neural networks are universal function approximators. With an unsupervised training objective (like autoregressive next-call generation), less bias is introduced than with other forms of study. Biases are only introduced via data selection, whereas other behavioural studies of cetaceans involve disturbing them in their environments, or worse, removing them from their environment altogether. There is also the opportunity to systematically regress out the known biases from inter-annotator disagreement, etc. While ML training is carbon intensive, the scaling of carbon intensity is not linear. In general, model performance scales log-linearly with the number of samples: a model with half as many samples will not have half the accuracy; rather, it would only lose a few percentage points. There are lots of natural tradeoffs to be taken into account. For example, maybe you really value that extra performance when you are trying to track endangered North Atlantic right whales to avoid ship strikes, but you can sacrifice performance in a detector for humpbacks to save on training expenses. Not to say that AI is the answer to everything. But I tend to take the view of Kentaro Toyama, that technology is an amplifier of existing biases and inequalities. In comparison to chartering research vessels, capturing cetaceans, or flying experts around the world to study them, using machine learning is quite innocuous, equitable, and energy efficient. Happy to expand more in a follow-up.
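The log-linear scaling point above is easy to make concrete with invented coefficients (the real curve depends on the task, model, and data; `a` and `b` here are made up):

```python
import math

# Hypothetical scaling curve: accuracy improves log-linearly with data.
a, b = 55.0, 8.0  # accuracy (%) = a + b * log10(num_samples); illustrative only

def accuracy(n):
    return a + b * math.log10(n)

full = accuracy(1_000_000)
half = accuracy(500_000)
print(round(full - half, 2))  # halving the data costs ~2.41 points, not 50%
```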
You should add the ai video of trying to teach machine learning to walk and how many tries it takes to still walk in the most weird inefficient way lol
I hope we are able to speak with cetaceans in my lifetime. That's one thing I'm very excited for. For all the hope about other intelligence in the universe, there is alien intelligence right here on Earth. I do think we will be able to communicate eventually. What you need is underwater drones that can observe a pod over a long span like dozens of years and broadcast their observations to scientists. If we can do this in space we can do it in the ocean.
Dolphins have an enormous visual center in their brain. We might consider that their brains can interpret some or all of their vocalizations visually, in three-dimensional space (much like they use their clicks to visualize, using echolocation). This would be very difficult for humans to translate without creating an apparatus that could do the same. In other words, apparent audio communication may not be primarily or only audio… that might just be what humans expect after developing vocal cords, ears, and brains that work well in the medium of air over typically short distances, rather than features that work optimally in water, over potentially very long distances. Even humans use visual communication in the form of body language, but we are often face to face in a world of sunlight. Visual communication may be just as important to whales, but adapted to darkness and distance. I have heard stories of dolphins in captivity making unexpectedly coordinated moves and jumps together, with apparently very little audio communication, so I started to wonder if some of that (minimal) communication was visual: a picture is worth a thousand words. This ability would be very helpful on a coordinated hunt, so a great evolutionary adaptation, if it is physically possible. Just a thought.
It's good to hear the realistic interpretation of this story after seeing the very misleading headlines. I had a number of thoughts during this. One, this seems like the equivalent of playing a child's call to its mother over and over - very cruel and callous. Two, the excellent analogy of trying to extract meaning from the jumble of sounds from a crowd raises the question: why not test it on a crowd first to see if it works? Third, the fact that we haven't worked out the real meaning of a single sound does suggest that we are very constrained by how our own language works and need to think much further 'outside the box'. Fascinating video, thanks.
Loved this !!! And I appreciated a lot when you mentioned that AI 🤖 is often problematic
I'm so pleased to hear this research exists. That would be exactly my approach. I would like to add this, but please forgive the poor terminology: sounds are sensory experience. Not just sounds, but any energy, at least for me. This energy resonates in your body, and it is a sensation; even every thought is a sensation. It is different if words are something external, objective, and analytical, rather than a sensation of your body. So the first thing is to synchronize, and that is exactly where this research is going. The people who can help bridge the gap between these different perceptions are probably empaths or highly sensitive people. It is a translation from visual structure to the sensorial (if this is the right expression). Thank you!
Well, I’m glad I’m becoming less “crazy” because I always thought it was obvious that animals communicate, especially social ones!
It’s madness and hubris to think all sorts of traits are “only human”.
It's not just vocalisations. You need to look at the individual's body language combined with their vocalisations, and the context of nearby peers, prey, predators, and other environmental factors.
Your channel is very positive. One of the few who educate with objective presentation of published research. And, yah, am a human, not a google bot.
@@raoultesla2292 I’m glad you enjoy the content!
Can I just say that THE cutest animal ever is a newborn dolphin. They look like an American football with fins.
AI is not just one thing. ChatGPT was trained to mimic internet websites; it was trained on a monumentally large dataset. The machine learning that segregates audio signals into categories has a much smaller dataset and a more specific purpose with a binary correct/incorrect grading. It's an expert system, which is much less difficult to produce and much more accurate than a large language model. It's like you're confusing two different genres of machine learning.
Expert systems should absolutely be monitored by humans. Good ones produce a confidence interval so they can actually ask for help with confusing input. They can do some tasks more accurately than humans, but they will never be perfect any more than humans are perfect. Their mistakes are different from ours so may appear to be less rational despite being overall more accurate.
Say you developed an AI to determine dog breed, but you never told it about cats and you never taught it "this is not a dog" is an optional output. It's going to make a dumb call when you show it a cat. Expert systems, even when trained well, can not cope with input outside of their training data.
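The "ask for help with confusing input" behavior described above can be sketched as a toy classifier that returns a label only when its softmax confidence clears a threshold; the labels, logits, and the 0.8 cutoff are all invented for illustration.

```python
import numpy as np

THRESHOLD = 0.80  # hypothetical cutoff; below this, defer to a human

def softmax(logits):
    e = np.exp(logits - logits.max())  # shift for numerical stability
    return e / e.sum()

def classify(logits, labels):
    """Return a label only when the model is confident; otherwise escalate."""
    probs = softmax(np.asarray(logits, dtype=float))
    best = int(probs.argmax())
    if probs[best] < THRESHOLD:
        return "needs human review"
    return labels[best]

labels = ["labrador", "poodle", "beagle"]
print(classify([6.0, 1.0, 0.5], labels))  # clear dog photo: confident answer
print(classify([1.1, 1.0, 0.9], labels))  # a cat photo, say: nearly flat
                                          # logits, so it defers instead of
                                          # making a dumb call
```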
About 10 years ago I read a kindle book about dolphins communicating with humans via computers. Then the computers were made into vests the scientists could wear near the dolphin tank. It was a great book but I can’t remember the title or author.
The question about "what is their language" reminds me of the movie arrival
I have never heard a marine biologist call an orca a killer whale when speaking in a teaching mode.
Killer whale is their official English name and the one used by the vast majority of scientific institutions, including NOAA and Fisheries and Oceans Canada. It is the standardized name used in scientific publications and research papers.
[1] www.fisheries.noaa.gov/species/killer-whale
[2] www.pac.dfo-mpo.gc.ca/fm-gp/mammals-mammiferes/whales-baleines/killer-whale-epaulard-eng.html
If you're interested, I made a video about why marine biologists, including myself and literally everyone I work with, use the name killer whale instead of orca linked below.
[3] th-cam.com/video/FIwjehSYKJg/w-d-xo.html
@@KPassionate Political Correctness indicates that we should use the name orca, as that is not pejorative. You insisting on using "killer whale," simply because it had been used in previous studies, would be similar to someone saying that we should continue to use the word "Negroes," because that had been used in the past. See how dumb that sounds?
No, I use it because it's their standard name and the one used by scientific institutions, as you can clearly see in the links I provided. Orcinus orca is their Latin name and we don't typically use Latin names for animals. For example, we don't call lions Panthera leo or dogs Canis lupus familiaris. We call them lions and dogs. Additionally, there are at least 11 different types of killer whales and many of them are likely distinct sub-species or unique species entirely. Revisions to their taxonomy are being considered and if approved then they will no longer be called Orcinus orca (which means "belonging to hell", by the way).
[1] royalsocietypublishing.org/doi/10.1098/rsos.231368
Also, your false analogy is absolutely ridiculous and offensive. In no way, shape, or form should we ever compare the literal name of an animal to slurs or the systemic oppression of people.
Pull this video now, or practice sounds related to "Danger, get away." Ignorant hunters will use this to harm nature's miracle. With all my heart I thank you for the care given to our planet and the wonders of life. From Bill
Hopefully they tell us to grow up
I’m not gonna lie, the title with “whale” in it but a picture of the largest dolphin (an orca) almost had me pass the video by. I’m glad I didn’t. 😊 Very cool video.
Could it be that the reason the whale waited the same interval to respond is that it's a way of establishing the distance between the whales through the speed of sound, when they hear the sound but cannot see the other whale? (Which is not the case near the boat, of course. Maybe the whale was confused by the sound coming from a boat.)
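A rough back-of-envelope sketch of that idea, assuming a typical speed of sound in seawater of about 1,500 m/s (the real value varies with temperature, depth, and salinity, and this says nothing about whether whales actually do this):

```python
# Illustrative only: if a listener could relate a call-and-response
# delay to sound travel time, the delay would map to a distance.
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, approximate

def one_way_distance(delay_seconds: float) -> float:
    """Distance sound travels one way during the given delay."""
    return SPEED_OF_SOUND_SEAWATER * delay_seconds

print(one_way_distance(2.0))  # 3000.0 — a 2 s delay spans ~3 km
```

So even multi-second gaps between calls would correspond to whales only a few kilometers apart, well within humpback hearing range.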
Figuring out how to decode the language of other species on earth is the first step to being able to communicate with species off world. I hope the people working on this understand how profoundly important the endeavor is.
You're absolutely right. Even though that sounds like the plot of a sci-fi channel movie, it's the truth. "By learning how to communicate with humpback whales, scientists hope to develop tools that could one day help us recognize and understand messages from extraterrestrial beings."
[1] nautil.us/how-whales-could-help-us-speak-to-aliens-559443/
[2] www.earth.com/news/ai-helps-humans-have-20-minute-conversation-with-humpback-whale-named-twain/
I honestly believe whales and dolphins do have their own language. It may not be as complex as our own, but enough to keep in contact with each other. I also believe lots of animals communicate with each other, even if it's just basic stuff. They have more going on up top than we give them credit for.
Scientists translated the language of my cat with supercomputers after 5 years of research at a cost of about $300,000! Now the sensational result: Meow.
AI getting the spelling of strawberry wrong is actually a consequence of the “biology” of the AI: the models are not trained directly on the letters of the texts they read. Letters are first grouped into “tokens” (frequently occurring groups of symbols) before the AI gets to see them and learn from them. These spelling problems won’t last long, because the cause is well known and strategies can be built into the AI to give it perceptual access to individual letters. Other problems of inaccuracy, as well as the problem that AI “doesn’t know when it doesn’t know something” and therefore hallucinates, are harder to fix.
As a marine biologist, AI is admittedly not my area of expertise. I am interested in learning more, however, so I appreciate this explanation!