Can artificial intelligence really help us talk to the animals?

2022-04-29 00:44
英语周报 (English Weekly) · Senior High Teacher's Edition, Issue 15, 2022

Understanding animal vocalisations has long been the subject of human fascination and study. Various primates give alarm calls that differ according to predator; dolphins address one another with signature whistles; and some songbirds can take elements of their calls and rearrange them to communicate different messages. But most experts stop short of calling it a language, as no animal communication meets all the criteria.

Until recently, decoding has mostly relied on painstaking observation. But interest has burgeoned in applying machine learning to deal with the huge amounts of data that can now be collected by modern animal-borne sensors. "People are starting to use it," says Elodie Briefer, an associate professor at the University of Copenhagen who studies vocal communication in mammals and birds. "But we don't really understand yet how much we can do."

Briefer co-developed an algorithm that analyses pig grunts to tell whether the animal is experiencing a positive or negative emotion. Another, called DeepSqueak, judges whether rodents are in a stressed state based on their ultrasonic calls. A further initiative — Project CETI (which stands for the Cetacean Translation Initiative) — plans to use machine learning to translate the communication of sperm whales.

Another project involves using AI to generate novel animal calls, with humpback whales as a test species. The novel calls — made by splitting vocalisations into micro-phonemes (distinct units of sound lasting a hundredth of a second) and using a language model to "speak" something whale-like — can then be played back to the animals to see how they respond. If the AI can identify what makes a random change versus a semantically meaningful one, it brings us closer to meaningful communication, explains Aza Raskin of the Earth Species Project. "It is having the AI speak the language, even though we don't know what it means yet."

From The Guardian