Cat got your tongue? How AI is on the cusp of a breakthrough that’d allow people and ANIMALS to talk to each other in ’12 to 36 months’

It sounds like the plot of a new Disney movie, but experts predict that AI will allow people to communicate with pets and even wild animals.

Researchers around the world use “digital bioacoustics” – small, portable digital recorders – to capture sounds, calls and behaviors that are too quiet or subtle for humans to notice.

These databases will be used to train artificial intelligence to decipher these subtle communications and translate them into something we can understand – almost like a ‘ChatGPT for animals’.

Projects such as the Earth Species Project expect a breakthrough in the next 12 to 36 months.

One researcher hopes to unravel the language of dogs

Founded in 2017, the non-profit organization uses AI to capture, understand and ‘talk back’ to animals – from cats and dogs to more unusual species such as whales and crows.

The Earth Species Project’s current experiments include attempts to map the vocal repertoire of crows – and another experiment that aims to generate new vocalizations that birds can understand.

Aza Raskin, one of the co-founders of the Earth Species Project, believes that generating animal vocalizations can happen within a year.

Raskin told Google: ‘Can we do generative, novel animal vocalizations? We think we’ll probably be able to do this for animal communication in the next 12 to 36 months.

‘You could imagine that we could build a synthetic whale or crow that speaks whale or crow in a way that they can’t tell they’re not talking to one of their own.

‘The plot twist is that we may be able to start a conversation before we understand what we’re saying.’

Below are some of the other projects aimed at achieving understandable communication between humans and animals:

Can AI help us understand what cats say? (Getty)

Cat got your tongue?

Artificial intelligence could finally unravel a mystery that has haunted humanity for centuries: what do cats actually think?

Researchers at the University of Lincoln are using AI to categorize and understand cats’ expressions.

Professor Daniel Mills said: ‘We could use AI to teach us a lot about what animals are trying to tell us.’

AI can learn to identify features such as cats’ ear positions, which could help classify and understand the hundreds of expressions cats use to communicate.
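In spirit, that kind of classification can be sketched very simply. The toy example below is not the Lincoln team’s method: it classifies an invented cat ‘expression’ from two hand-labelled geometric features (ear angle and eye openness), using a nearest-neighbour rule; all feature values and labels are made up for illustration.

```python
import math

# Invented training examples: (ear_angle_degrees, eye_openness_0_to_1) -> expression
TRAINING_DATA = [
    ((80.0, 0.9), "alert"),
    ((75.0, 0.8), "alert"),
    ((30.0, 0.5), "fearful"),
    ((25.0, 0.4), "fearful"),
    ((60.0, 0.3), "relaxed"),
    ((55.0, 0.2), "relaxed"),
]

def classify(features):
    """Return the label of the closest training example (1-nearest-neighbour)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(TRAINING_DATA, key=lambda item: dist(item[0], features))[1]

print(classify((78.0, 0.85)))  # closest to the "alert" examples -> "alert"
```

A real system would extract such features automatically from images and train on thousands of labelled examples, but the principle – mapping measured facial features to expression categories – is the same.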

Similarly, a new AI model aims to translate the facial expressions and barks of dogs.

Its creator, Con Slobodchikoff, author of Chasing Doctor Dolittle: Learning the Language of Animals, told Scientific American that understanding animals can reveal surprising facts.

‘Animals have their own thoughts, hopes and perhaps dreams.’

Chatting with bats

Bats have a much more complex language than people thought: they have names, they argue over food, and mother bats use “baby talk” when communicating with their pups.

That’s the conclusion of a groundbreaking AI study that used a voice recognition program to analyze 15,000 bat calls, with an algorithm correlating the sounds with videos of what the bats were doing.
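The correlation step described above can be sketched as a simple co-occurrence count. This is not the study’s actual code, and the call and behaviour labels are invented: it pairs each classified call with the behaviour seen on video at the same moment, then counts which pairings are most common.

```python
from collections import Counter

# Invented parallel records: the i-th call was classified as calls[i]
# while the video showed behaviour behaviours[i].
calls      = ["squabble", "squabble", "greeting", "squabble", "greeting"]
behaviours = ["feeding",  "feeding",  "roosting", "feeding",  "roosting"]

# Count how often each (call type, behaviour) pairing occurs.
cooccurrence = Counter(zip(calls, behaviours))

# The most frequent pairing hints at the context a call type belongs to.
print(cooccurrence.most_common(1))  # [(('squabble', 'feeding'), 3)]
```

The real analysis used machine learning over 15,000 calls rather than a hand count, but the underlying idea – tying sound categories to observed behaviour – is the same.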

Yossi Yovel from Tel Aviv University told the BBC: ‘I’ve always dreamed of a Doolittle machine that would allow me to talk to animals. Specifically, I am interested in vocal communication.

‘We teach the computer how to define the different sounds and how to recognize what each sound means when you hear it. We teach the AI to distinguish between the different sounds.’

‘Eventually the computer will be able to speak the language and understand what they are saying to each other.’

Researchers now know that bats “fight” over food, and that baby bats repeat what their mothers “say” to learn language.

Deep learning is able to decipher bat language, which is largely ultrasonic and much faster than human speech – people can’t hear it, but computers can.

Yovel remains skeptical that a “decoder” that can instantly translate bats will emerge “in his lifetime,” but is now trying to understand bats’ longer-term social interactions.

Clicking with whales

Microphones on buoys and robotic fish try to unravel one of the animal kingdom’s most famous ‘voices’: the whale song.

Sperm whales are the world’s largest toothed predators and locate their food using clicks, but they also use shorter series of clicks, called ‘codas’, to communicate with each other.

The Project CETI team is attaching microphones to whales to capture vast amounts of data, with the aim of using machine learning to unravel what the enormous animals are saying.

One project aims to unravel the clicking ‘codas’ of sperm whales (Getty)

To attach the microphones, the team uses a ten-meter-long pole.

The AI team is already able to predict whale codas (sequences of clicks) with up to 95 percent accuracy, and now hopes to gather more data to establish further patterns and figure out what the whales are saying.
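Predicting the next coda in a sequence can be sketched with the simplest possible sequence model. This is not Project CETI’s model, and the coda labels below are invented: a bigram counter learns which coda tends to follow which, then predicts the most frequent successor.

```python
from collections import Counter, defaultdict

# Invented sequence of coda labels recorded from one exchange.
observed = ["1+3", "1+3", "5R", "1+3", "5R", "1+3", "1+3", "5R"]

# Count which coda follows which (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(observed, observed[1:]):
    follows[prev][nxt] += 1

def predict_next(coda):
    """Return the most frequent successor of the given coda in the data."""
    return follows[coda].most_common(1)[0][0]

print(predict_next("5R"))  # in this toy data, "5R" is always followed by "1+3"
```

Real coda prediction uses far richer models over far more data, but the task is the same shape: learn the statistics of click sequences well enough to guess what comes next.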
