AI Enables Humans to Talk to Animals

Until recently, the idea that animals might have languages of their own was dismissed by much of the scientific community. Now scientists around the world are using modern technology to listen in on animal “conversations” and even to speak back.

Professor Karen Bakker of the University of British Columbia describes some of the most innovative studies on animal and plant communication in her new book The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants.

Bakker, a director at the UBC Institute for Resources, Environment, and Sustainability, writes: “Digital tools, so frequently associated with our alienation from environment, are affording us an opportunity to listen to nonhumans in compelling ways, revitalizing our connection to the natural world.”

She observes that modern computerized listening posts are being used to continuously record the sounds of ecosystems all around the world, from rainforests to the ocean floor. Thanks to advances in miniaturization, scientists have even been able to attach microphones to animals as small as honeybees.

Together, these digital tools work like a planetary-scale hearing aid, Bakker writes, allowing people to hear and analyze sounds in the natural world beyond the range of their own senses. For many scientists, the next step is to use artificial intelligence to sort through these recordings and give machines the ability to “understand animal languages and effectively transcend the barrier of interspecies communication.”
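The article does not say which tools any particular team uses, but a common pattern in machine-assisted bioacoustics is to convert recordings into spectral features and train a classifier to label the calls. The following is a minimal sketch of that idea; the file paths, species labels, and the choice of MFCC features with a random forest are illustrative assumptions, not a description of any method in the book.

```python
# Minimal bioacoustic-classification sketch (illustrative only).
# File paths and species labels below are hypothetical.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def clip_features(path):
    """Summarize a clip as the mean and spread of its MFCCs."""
    y, sr = librosa.load(path, sr=None)               # keep the native sample rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical labeled recordings gathered by field listening posts.
clips = [
    ("clips/honeybee_001.wav", "honeybee"),
    ("clips/elephant_001.wav", "elephant_rumble"),
    # ... many more labeled clips ...
]

X = np.array([clip_features(path) for path, _ in clips])
labels = np.array([label for _, label in clips])

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, labels)

# Label a new, unlabeled recording.
print(model.predict([clip_features("clips/unknown_001.wav")]))
```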

She references a German research team that programmed miniature robots to perform the honeybee waggle dance. Using these robotic dancers, the scientists were able to tell honeybees to stop flying and to direct them where to fly to harvest a particular nectar. To get the honeybees to accept the robots as part of their community, the researchers intend to test implanting them directly into hives.
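The book does not spell out how the robotic dancers encode their instructions, but the waggle dance itself follows a well-known scheme: the angle of the waggle run relative to vertical on the comb gives the bearing of the target relative to the sun, and the duration of the run scales with distance. Below is a rough sketch of that mapping; the calibration of roughly one kilometre per second of waggling is an assumed, illustrative figure.

```python
# Sketch: decoding a waggle dance into a flight instruction.
# The angle of the waggle run relative to vertical on the comb maps to the
# bearing of the target relative to the sun's azimuth; the run's duration
# scales with distance. The ~1 km-per-second calibration is an assumed figure.
from dataclasses import dataclass

@dataclass
class WaggleDance:
    angle_from_vertical_deg: float   # clockwise angle of the waggle run on the comb
    waggle_duration_s: float         # duration of the waggle run

def decode(dance, sun_azimuth_deg, metres_per_second=1000.0):
    """Return (compass bearing in degrees, distance in metres) to the target."""
    bearing = (sun_azimuth_deg + dance.angle_from_vertical_deg) % 360.0
    distance = dance.waggle_duration_s * metres_per_second
    return bearing, distance

# Example: sun due south (azimuth 180 deg), waggle run 40 deg clockwise of vertical, 0.8 s.
print(decode(WaggleDance(40.0, 0.8), sun_azimuth_deg=180.0))   # ~(220.0, 800.0)
```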

Bakker also discusses the work of bioacoustics researcher Katy Payne and her insights into elephant communication. Payne was the first to discover that elephants emit infrasonic signals, sounds below the range of human hearing. These signals travel as vibrations through soil and stone, letting elephants communicate over great distances. Researchers have since found that elephants can distinguish the signals for “honeybee” and “human,” and even between “threatening human” and “nonthreatening human.” If the power of AI could be harnessed to communicate with elephant herds, researchers might be able to help protect elephant populations without removing them from their natural habitats.
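“Below the human hearing range” conventionally means under roughly 20 Hz, so spotting a possible rumble in a recording amounts to checking for energy in that very low band. The sketch below is a simplified, assumed check, not the method Payne or later researchers used; the cutoff and the synthetic test signal are illustrative.

```python
# Rough infrasound check (illustrative): how much of a recording's energy
# sits below ~20 Hz, the conventional lower edge of human hearing?
import numpy as np
from scipy.signal import butter, sosfiltfilt

def infrasound_fraction(signal, sample_rate, cutoff_hz=20.0):
    """Fraction of total signal energy below the cutoff frequency."""
    # 4th-order Butterworth low-pass, run forward and backward for zero phase shift.
    sos = butter(4, cutoff_hz, btype="lowpass", fs=sample_rate, output="sos")
    low = sosfiltfilt(sos, signal)
    return float(np.sum(low**2) / np.sum(signal**2))

# Synthetic test: a 15 Hz "rumble" buried in broadband noise (assumed, not real data).
fs = 1000                                   # samples per second
t = np.arange(0, 10, 1 / fs)
recording = np.sin(2 * np.pi * 15 * t) + 0.5 * np.random.randn(t.size)

print(f"energy below 20 Hz: {infrasound_fraction(recording, fs):.0%}")
```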

Bakker’s book also takes up coral reefs. “A healthy coral reef kind of sounds like an underwater symphony,” she says. The reef, its residents, and even whales hundreds of kilometres away produce cracks, burbles, hisses, and clicks; if you could hear in the ultrasonic range, the coral itself might be audible. By playing the sounds of a “healthy coral reef” to coral larvae, scientists may eventually be able to encourage coral to repopulate particular locations.

Though the prospect of a zoological version of Google Translate seems immensely promising, there is concern that bad actors could use the technology to manipulate animal populations for their own gain. Bakker warns that the idea of exploiting animals for profit raises a lot of alarm bells, and that we should never use our newfound powers to assert dominion over animals and plants.
