Saw this, and it had me wondering if anyone has trained an audio model of animals somehow?
Yes, I could Google, but thought it might lead to an interesting community discussion!
Too slow, fellow humans!
Recent advancements in artificial intelligence (AI) have opened avenues for decoding animal communication, particularly among species like whales and dolphins. Projects such as the Earth Species Project and Project CETI (Cetacean Translation Initiative) are at the forefront of this research, utilizing AI to analyze and interpret the complex vocalizations of these marine mammals.
Large Language Models (LLMs), which have been instrumental in human language processing, are now being adapted to study animal languages. By training these models on extensive datasets of animal sounds, researchers aim to identify patterns and structures that could correspond to specific meanings or messages. For instance, in sperm whales, distinct sequences of clicks, known as codas, are believed to serve communicative functions. AI models have been employed to detect these patterns, providing insights into the potential complexity and purpose of these vocalizations.
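To make the coda idea concrete: researchers often characterize codas by their rhythm, i.e. the pattern of inter-click intervals. Here is a minimal, hypothetical Python sketch of that idea; the timestamps and grouping scheme are illustrative assumptions, not how Project CETI actually processes hydrophone data.

```python
# Hypothetical sketch: grouping sperm-whale codas by rhythm.
# A coda is modeled as a list of click timestamps (seconds). Real projects
# work from hydrophone recordings, not hand-typed lists.

def inter_click_intervals(click_times):
    """Return gaps between successive clicks, normalized so a coda's
    rhythm can be compared independently of its overall tempo."""
    gaps = [b - a for a, b in zip(click_times, click_times[1:])]
    total = sum(gaps)
    return tuple(round(g / total, 2) for g in gaps)

def group_by_rhythm(codas):
    """Bucket codas whose normalized interval patterns match."""
    groups = {}
    for clicks in codas:
        groups.setdefault(inter_click_intervals(clicks), []).append(clicks)
    return groups

codas = [
    [0.0, 0.2, 0.4, 0.6, 0.8],   # evenly spaced coda
    [0.0, 0.1, 0.2, 0.3, 0.4],   # same rhythm, faster tempo
    [0.0, 0.2, 0.4, 0.5, 0.6],   # different rhythm
]
print(len(group_by_rhythm(codas)))  # the first two share a rhythm: 2 groups
```

The normalization step is the interesting part: two codas with the same relative timing count as the same "type" even if one is produced faster, which mirrors how coda types are described in the literature.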
While the primary focus has been on marine mammals, similar methodologies are being considered for other animals, including dogs. Understanding canine vocalizations and behaviors through AI could enhance human-animal interactions and improve animal welfare. However, the research is still in its early stages, and interpreting animal communication presents significant challenges due to differences in sensory perceptions and environmental contexts.
In summary, the integration of AI and LLMs into the study of animal communication holds promise for bridging the gap between humans and other species. Ongoing research continues to explore these possibilities, aiming to foster a deeper understanding of the rich and diverse languages of the animal kingdom.
I haven't watched this video yet, but it's an interesting subject where I think AI will help push boundaries a lot…
Interesting idea with those physical dog buttons, but they need to be complex shaped buttons, like 3D printed ones, so it's intuitive that each one does a different thing. To a dog, every one of those round buttons will look the same, even if the color is different, I bet.
Good point. I wonder if they could use rows/columns or approximate location?
I also wonder if machine learning etc. is paying off… That might lead to the animal being able to communicate to us and vice versa?
New rabbit hole for me haha
I wanted to make a joke: I need a dog-human translator. Wondering if anyone knows an HLLM?
I'm glad a leader posted this; now you can't kick them out.
My dog is clearly trying to communicate. He looks at me and weeps. If I don't listen because I'm busy, a few minutes later I have a gift on the balcony.
The problem is telling whether he needs to go out or just wants to play.
If anyone wants to take this idea seriously:
Dogs are dichromats and can't distinguish some colors (red and green look alike to them).
Use clear individual words that sound distinct.
Only a few commands.
Dogs are not especially intelligent intellectually, but emotionally they have it figured out. I still use my intuition to know what he wants.
If I speak, I can see that he recognizes one word of a sentence. If I ask him if he wants to go for a walk, I can see which word he recognizes.
Other animals use a wide spectrum of communication channels, especially smells (and bees dance).
I mean I bet if you glued an acorn to the top of one of the buttons, the dog would be able to remember to relate that to going outside, and possibly associate the sound/words the button makes too. Glue a shotgun shell to another button for "Let's Hunt Ducks", lolz.
I'm just saying even if the buttons have different colors the dog won't get that there's a difference between the buttons.
Brains/LLMs are doing pattern recognition, and with identical buttons there's no "pattern" at all, other than the sound it makes. Color isn't a pattern, it's a single scalar.
If dogs have self-awareness and world understanding to some extent, which I am sure they have, and they can't transform it into words, then maybe the transformer architecture shouldn't be what creates planning.
Who knows if it does in o3.
Try to decode the nuances in your human-dog interaction; just observe.
Then you will recognise from the interaction pattern when your dog wants to play and when he needs to go out - you will save yourself the surprise of the balcony.
Example:
Animal signals are easy to learn if you take the time to observe them. They communicate clearly, without ambiguity.
A "common language" is then found relatively quickly and you understand each other almost blindly.
Simply put, it's all about understanding the specific interaction patterns of the other intelligence and behaving in the same way yourself.
I have observed my cat and I have recently become a horse trainer at the weekend.
Animals recognise the patterns of human behaviour and adapt to them.
Based on these observations, I have been thinking about the topic of "ultimate patterns" for AI since Christmas. I'm currently trialling how "ultimate patterns" can be incorporated into my REM hybrid approach.
Yes, that's right, animals are pretty straightforward. Provided we can perceive their communication, it just takes a little attention, intuition, and experience. I also have some rather cheeky little birds that come into the house every day. You don't need an LLM to hear whether they are scared, frustrated, joyful, affectionate, combative, etc. They even ask questions like "May I?". A small bird with a brain the size of a lentil has the full spectrum of basic emotions.
(Everybody knows the problems of pure-text communication, without body language or vocal tone.)
You can hear that mosquitoes are sneaky, bees are peaceful, goats are stupid, and you can see when octopuses are afraid. And you can hear that donkeys are broken horses.
However, my dog doesn't get frisbees or balls. When I throw a stick, he just looks at me questioningly. Both sides still have something to learn…
I have this crazy idea for when I have some proper spare time, to get plants to speak. I would insert one of those off-the-shelf plant sensors that monitor humidity, temperature, light, and fertility, and then hook it up to the Realtime API. I can then ask the plant if it's thirsty, if it's too cold, etc.
I also thought of having the plants talk to DALL·E to generate images based on "how they are feeling".
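The glue layer for an idea like this could be quite small: read the sensor, map the values to short statements, and pass those as context to whatever speech or image API you choose. A minimal sketch, assuming hypothetical sensor fields and thresholds (none of this reflects any real sensor product's API):

```python
# Hypothetical sketch: turn raw plant-sensor readings into short statements
# that could be fed to a voice or chat API as context. The field names
# and thresholds are illustrative assumptions.

def plant_statements(reading):
    """Map a dict of sensor values to human-readable complaints/requests."""
    statements = []
    if reading["soil_moisture"] < 0.25:   # fraction of saturation
        statements.append("I'm thirsty, please water me.")
    if reading["temperature_c"] < 10:
        statements.append("I'm too cold.")
    if reading["light_lux"] < 500:
        statements.append("I could use more light.")
    if not statements:
        statements.append("I'm doing fine, thanks for asking.")
    return statements

reading = {"soil_moisture": 0.15, "temperature_c": 8, "light_lux": 2000}
for line in plant_statements(reading):
    print(line)
```

The same statement list could just as easily seed an image prompt ("a houseplant that is thirsty and cold…") as a spoken reply.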
I am convinced that this is possible, plants have a wide range of communication.
It is said that they can recognize the dangers and pain of other living beings. However, I've only read about this and have never personally verified it. (You should always do that. I'm not a believer.)
All life is communication. Plants communicate with fungi in the soil, warn each other about herbivores, react to fire and smoke, etc. Before connecting an AI, the right sensors must first be found. I think primarily electromagnetic and olfactory.
Adding differently prepared water sources so they can choose not only to get watered but also what type of fertilizer they want to have added. Maybe even provide a "voice" (ultrasonic) controlled light source?
There is also the mycelium network… When you cut down a tree, the other trees around it will transfer nutrients to the remaining stump; adding sensors to that could potentially give access to their "mind" on top.
I remember seeing a video on this recently?
Can't think of it at the moment, though.
Don't think this was it, but maybe?
@PaulBellow, on these topics of AI and understanding non-human ālanguagesā, you might find interesting the following article: Will artificial intelligence let us talk to the animals? and the following encyclopedia article: Animal Communication.
This would be fantastic. It would lead to a much smarter way of agriculture. We could learn what they like and what they don't. And some plants have symbiotic friendships with other plants, and especially mushrooms, moon cycles have a strong influence, etc. A big story…
And who knows what whales and dolphins talk about.
The dolphins are about to leave
Answer to everything?
42 (ASCII code for *)
In kindergarten, I already thought that humanity would probably go extinct because it was too evil. And I wondered which animal nature would choose next, and I figured it would probably be dolphins or whales.
I think the computer made a mistake / had a bit switch over time. The answer should be 0. (and 1 / 0)