Archive for the ‘Artificial Intelligence’ Category

AI, Data Analytics Help Create ‘Smart Rainforest’ in Australia

Via Environment & Energy Leader, an article on the world’s first “smart rainforest,” where artificial intelligence and data are used to advance sustainable and cost-effective environmental restoration models across the globe:

In an era where technological innovation meets environmental stewardship, NTT Group has joined forces with ClimateForce, embarking on an ambitious journey to breathe new life into the Daintree Rainforest.

This partnership is set to unveil what it calls the world’s first “smart rainforest,” using artificial intelligence and data to advance sustainable and cost-effective environmental restoration models across the globe.

Advancing Forest Restoration with Technology

At the heart of this initiative lies the Smart Management Platform (SMP) technology, developed by NTT. This innovative platform is designed to rejuvenate a section of Australia’s Daintree Rainforest, previously compromised by agricultural activities and invasive plant species. The technology’s integration promises not only to regenerate the land but also to safeguard it against future ecological threats.

ClimateForce, a leader in environmental regeneration, has taken up the mantle to restore this section of the rainforest, located adjacent to the Great Barrier Reef. With NTT’s support, the project will use advanced AI, data analytics, and predictive analytics to evaluate and implement organic reforestation techniques.

This strategic approach aims to protect biodiversity, mitigate climate change effects, and bolster resilient local economies.

A Shared Vision for a Sustainable Future

The collaboration between NTT and ClimateForce is built on a shared vision of sustainable development and improved biodiversity.

Barney Swan, the CEO and co-founder of ClimateForce, expressed gratitude for the support from NTT and NTT DATA, emphasizing the project’s potential to accelerate their goals and develop replicable models for ecosystem regeneration worldwide.

NTT DATA’s involvement extends beyond technological support, contributing to operational and fundraising efforts. This collaboration was sparked by a previous sponsorship of an expedition advocating for sustainable practices in Antarctica, highlighting the long-standing commitment of both organizations to environmental sustainability.

By using advanced technology and fostering international cooperation through the creation of the smart rainforest in the Daintree, NTT and ClimateForce said they hope to set a precedent for global environmental restoration efforts. The project aims not only to restore an important ecosystem but also to inspire similar initiatives in other areas across the world, according to the organizers.

“NTT DATA met Barney through our sponsorship of his father’s Undaunted: South Pole 2023 expedition, which advocated for sustainable practices and long-term protections for Antarctica,” said Bob Pryor, CEO, NTT DATA Services. “We’re excited to extend this relationship and help ClimateForce with its mission in the tropics, which perfectly aligns with our own vision for realizing a sustainable future.”


Read More »



Decoding Animal Communication with A.I.

Via Aspen Ideas, an interesting podcast on efforts to decode animal communication using A.I.:

Scientists could actually be close to being able to decode animal communication and figure out what animals are saying to each other. And more astonishingly, we might even find ways to talk back. The study of sonic communication in animals is relatively new, and researchers have made a lot of headway over the past few decades with recordings and human analysis. But recent advancements in artificial intelligence are opening doors to parsing animal communication in ways that haven’t been close to possible until now. In this talk from the 2023 Aspen Ideas Festival in partnership with Vox’s “Unexplainable” podcast, two experts on animal communication and the digital world come together to explain what may come next.

Tragically, a few months after this conversation was recorded in June, one of the panelists, Karen Bakker, passed away unexpectedly. Bakker was a professor at the University of British Columbia who looked at ways digital tools can address our most pressing problems. She also wrote the book “The Sounds of Life: How Digital Technology is Bringing Us Closer to the World of Animals and Plants.” The UBC Geography department wrote of Bakker: “We will remember Karen as multi-faceted and superbly talented in all realms.”

Aza Raskin, the co-founder of the Earth Species Project, a nonprofit trying to decode animal communication using A.I., joined Bakker for this discussion. The host of “Unexplainable,” Noam Hassenfeld, interviewed Bakker and Raskin.


Read More »



Five Ways AI Is Saving Wildlife

Via The Guardian, a look at why artificial intelligence has been identified as one of the top three emerging technologies in conservation, helping protect species around the world:

There’s a strand of thinking, from sci-fi films to Stephen Hawking, that suggests artificial intelligence (AI) could spell doom for humans. But conservationists are increasingly turning to AI as an innovative tech solution to tackle the biodiversity crisis and mitigate climate change.

A recent report by Wildlabs.net found that AI was one of the top three emerging technologies in conservation. From camera trap and satellite images to audio recordings, the report notes: “AI can learn how to identify which photos out of thousands contain rare species; or pinpoint an animal call out of hours of field recordings – hugely reducing the manual labour required to collect vital conservation data.”

AI is helping to protect species as diverse as humpback whales, koalas and snow leopards, supporting the work of scientists, researchers and rangers in vital tasks, from anti-poaching patrols to monitoring species. With machine learning (ML), in which computer systems use algorithms and models to learn, understand and adapt, AI is often able to do the job of hundreds of people, getting faster, cheaper and more effective results.

Here are five AI projects contributing to our understanding of biodiversity and species:

1. Stopping poachers
Zambia’s Kafue national park is home to more than 6,600 African savanna elephants and covers 22,400 sq km, so stopping poaching is a big logistical challenge. Illegal fishing in Lake Itezhi-Tezhi on the park’s border is also a problem, and poachers masquerade as fishers to enter and exit the park undetected, often under the cover of darkness.

Automated alerts mean that just a handful of rangers are needed to provide around-the-clock surveillance. Photograph: Game Rangers International

The Connected Conservation Initiative, from Game Rangers International (GRI), Zambia’s Department of National Parks and Wildlife and other partners, is using AI to enhance conventional anti-poaching efforts, creating a 19km-long virtual fence across Lake Itezhi-Tezhi. Forward-looking infrared (FLIR) thermal cameras record every boat crossing in and out of the park, day and night.

Installed in 2019, the cameras were monitored manually by rangers, who could then respond to signs of illegal activity. FLIR AI has now been trained to automatically detect boats entering the park, increasing effectiveness and reducing the need for constant manual surveillance. Waves and flying birds can also trigger alerts, so the AI is being taught to eliminate these false readings.
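For readers curious how such a detection-and-alert pipeline fits together, here is a minimal sketch in Python. It is purely illustrative, not GRI's or FLIR's actual system: the `classify_frame` and `send_alert` callables, the confidence threshold, and the persistence rule for suppressing waves and birds are all assumptions.

```python
# Illustrative sketch only: not GRI's or FLIR's actual pipeline.
# `classify_frame` and `send_alert` are assumed callables supplied by the caller.
from collections import deque

ALERT_CLASS = "boat"
CONFIDENCE_THRESHOLD = 0.8   # hypothetical tuning value
PERSISTENCE_FRAMES = 5       # a detection must persist to rule out waves and birds

recent_hits = deque(maxlen=PERSISTENCE_FRAMES)

def process_frame(frame, classify_frame, send_alert):
    """Alert rangers only when a boat detection persists across several frames."""
    scores = classify_frame(frame)   # e.g. {"boat": 0.92, "bird": 0.03, "wave": 0.01}
    recent_hits.append(scores.get(ALERT_CLASS, 0.0) >= CONFIDENCE_THRESHOLD)
    # Waves and flying birds tend to trigger brief, isolated detections, so
    # require a full run of positive frames before raising an alert.
    if len(recent_hits) == PERSISTENCE_FRAMES and all(recent_hits):
        send_alert(f"Possible boat crossing (confidence {scores[ALERT_CLASS]:.2f})")
```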

“There have long been insufficient resources to secure protected areas, and having people watch multiple cameras 24/7 doesn’t scale,” says Ian Hoad, special technical adviser at GRI. “AI can be a gamechanger, as it can monitor for illegal boat crossings and alert ranger teams immediately. The technology has enabled a handful of rangers to provide around-the-clock surveillance of a massive illegal entry point across Lake Itezhi-Tezhi.”

2. Tracking water loss
Brazil has lost more than 15% of its surface water in the past 30 years, a crisis that has only come to light with the help of AI. The country’s rivers, lakes and wetlands have been facing increasing pressure from a growing population, economic development, deforestation, and the worsening effects of the climate crisis. But no one knew the scale of the problem until last August, when, using ML, the MapBiomas water project released its results after processing more than 150,000 images generated by Nasa’s Landsat 5, 7 and 8 satellites from 1985 to 2020 across the 8.5m sq km of Brazilian territory. Without AI, researchers could not have analysed water changes across the country at the scale and level of detail needed. AI can also distinguish between natural and human-created water bodies.
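To make the kind of computation involved concrete, the sketch below estimates surface-water loss from yearly satellite composites. It is not the MapBiomas pipeline: the `is_water` per-pixel classifier and the array layout are assumed, and only the 30 m Landsat pixel size comes from the public sensor specification.

```python
# Illustrative sketch only, not the MapBiomas pipeline. `is_water` is an assumed
# per-pixel classifier; composites are yearly Landsat mosaics as NumPy arrays
# of shape (rows, cols, bands).
import numpy as np

PIXEL_AREA_KM2 = 0.0009  # one 30 m x 30 m Landsat pixel, in square kilometres

def surface_water_area(composite, is_water):
    """Surface-water area (sq km) in a single yearly composite."""
    water_mask = np.apply_along_axis(is_water, -1, composite)  # True where water
    return water_mask.sum() * PIXEL_AREA_KM2

def water_loss_percent(composites_by_year, is_water):
    """Compare the earliest and latest years to estimate percentage loss."""
    years = sorted(composites_by_year)
    first = surface_water_area(composites_by_year[years[0]], is_water)
    last = surface_water_area(composites_by_year[years[-1]], is_water)
    return 100.0 * (first - last) / first
```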

The Negro River, a major tributary of the Amazon and one of the world’s 10 largest rivers by volume, has lost 22% of its surface water. The Brazilian portion of the Pantanal, the world’s largest tropical wetland, has lost 74% of its surface water. Such losses are devastating for wildlife (4,000 species of plants and animals live in the Pantanal, including jaguars, tapirs and anacondas), people and nature.

“AI technology provided us with a shockingly clear picture,” says Cássio Bernardino, WWF-Brasil’s MapBiomas water project lead. “Without AI and ML technology, we would never have known how serious the situation was, let alone had the data to convince people. Now we can take steps to tackle the challenges this loss of surface water poses to Brazil’s incredible biodiversity and communities.”

3. Finding whales
Knowing where whales are is the first step in putting measures such as marine protected areas in place to protect them. Locating humpbacks visually across vast oceans is difficult, but their distinctive singing can travel hundreds of miles underwater. At National Oceanic and Atmospheric Administration (Noaa) fisheries in the Pacific islands, acoustic recorders are used to monitor marine mammal populations at remote and hard-to-access islands, says Ann Allen, Noaa research oceanographer. “In 14 years, we’ve accumulated around 190,000 hours of acoustic recordings. It would take an exorbitant amount of time for an individual to manually identify whale vocalisations.”

In 2018, Noaa partnered with Google AI for Social Good’s bioacoustics team to create an ML model that could recognise humpback whale song. “We were very successful in identifying humpback song through our entire dataset, establishing patterns of their presence in the Hawaiian islands and Mariana islands,” says Allen. “We also found a new occurrence of humpback song at Kingman reef, a site that’s never before had documented humpback presence. This comprehensive analysis of our data wouldn’t have been possible without AI.”
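A minimal sketch of how song detection over a long acoustic archive might be structured is shown below. It is not the Noaa/Google model; the `song_probability` classifier, the `spectrogram` function, and the window length, sample rate and threshold are all hypothetical.

```python
# Illustrative sketch only, not the Noaa/Google model. `song_probability` and
# `spectrogram` are assumed callables; SR, window length and threshold are hypothetical.
SR = 10_000          # sample rate in Hz
WINDOW_S = 75        # seconds of audio scored at a time
THRESHOLD = 0.9

def find_song(audio, song_probability, spectrogram):
    """Return start times (in seconds) of windows likely to contain humpback song."""
    hop = WINDOW_S * SR
    hits = []
    for start in range(0, max(len(audio) - hop + 1, 0), hop):
        window = audio[start:start + hop]
        if song_probability(spectrogram(window)) >= THRESHOLD:
            hits.append(start // SR)
    return hits
```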

4. Protecting koalas
Australia’s koala populations are in serious decline due to habitat destruction, domestic dog attacks, road accidents and bushfires. Without knowledge of their numbers and whereabouts, saving them is challenging. Grant Hamilton, associate professor of ecology at Queensland University of Technology (QUT), has created a conservation AI hub with federal and Landcare Australia funding to count koalas and other endangered animals. Using drones and infrared imaging, an AI algorithm rapidly analyses infrared footage and determines whether a heat signature is a koala or another animal. Hamilton used the system after Australia’s devastating bushfires in 2019 and 2020 to identify surviving koala populations, particularly on Kangaroo Island.
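The sketch below illustrates the general two-step idea of finding warm blobs in a thermal frame and classifying each one. It is an illustrative toy, not the QUT system: the `classify_blob` model and the temperature threshold are invented for the example.

```python
# Illustrative sketch, not the QUT system. `classify_blob` is an assumed model;
# the temperature threshold is invented. Expects a 2-D array of temperatures.
import numpy as np
from scipy import ndimage

HEAT_THRESHOLD_C = 30.0   # hypothetical: warm bodies stand out against cool bush at night

def detect_animals(thermal_frame, classify_blob):
    """Find warm blobs in one thermal frame and classify each as koala or other."""
    labels, n = ndimage.label(thermal_frame > HEAT_THRESHOLD_C)  # connected warm regions
    detections = []
    for region in range(1, n + 1):
        ys, xs = np.where(labels == region)
        crop = thermal_frame[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
        detections.append((classify_blob(crop), (int(ys.mean()), int(xs.mean()))))
    return detections   # e.g. [("koala", (412, 98)), ("possum", (87, 530))]
```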

“This is a gamechanger project to protect koalas,” says Hamilton. “Powerful AI algorithms are able to analyse countless hours of video footage and identify koalas from many other animals in the thick bushland. This system will allow Landcare groups, conservation groups and organisations working on protecting and monitoring species to survey large areas anywhere in Australia and send the data back to us at QUT to process it.

“We will increasingly see AI used in conservation,” he adds. “In this current project, we simply couldn’t do this as rapidly or as accurately without AI.”

5. Counting species
Saving species on the brink of extinction in the Congo basin, the world’s second-largest rainforest, is a huge task. In 2020, data science company Appsilon teamed up with the University of Stirling in Scotland and Gabon’s national parks agency (ANPN) to develop the Mbaza AI image classification algorithm for large-scale biodiversity monitoring in Gabon’s Lopé and Waka national parks.

Conservationists had been using automated cameras to capture species, including African forest elephants, gorillas, chimpanzees and pangolins, which then had to be manually identified. Millions of pictures could take months or years to classify, and in a country that is losing about 150 elephants each month to poachers, time matters.

The Mbaza AI algorithm was used in 2020 to analyse more than 50,000 images collected from 200 camera traps spread across 7,000 sq km of forest. Mbaza AI classifies up to 3,000 images an hour and is up to 96% accurate. Conservationists can monitor and track animals and quickly spot anomalies or warning signs, enabling them to act swiftly when needed. The algorithm also works offline on an ordinary laptop, which is helpful in locations with no or poor internet connectivity.
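To give a sense of what an offline, batch camera-trap workflow looks like, here is a minimal sketch. It is not the Mbaza AI app itself: the `model.predict` interface, the `load_image` helper, the species names and the confidence threshold are hypothetical.

```python
# Illustrative sketch, not the Mbaza AI app. `model.predict` and `load_image`
# are assumed; species names and the threshold are hypothetical. Everything
# runs locally, so no internet connection is needed.
from pathlib import Path

PRIORITY_SPECIES = {"african_forest_elephant", "pangolin"}

def classify_camera_trap_folder(folder, model, load_image):
    """Label every camera-trap image in a folder and flag priority sightings."""
    results = {}
    for path in sorted(Path(folder).glob("*.jpg")):
        species, confidence = model.predict(load_image(path))
        results[path.name] = (species, confidence)
        if species in PRIORITY_SPECIES and confidence > 0.9:
            print(f"priority sighting: {species} in {path.name}")
    return results
```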

“Many central African forest mammals are threatened by unsustainable trade, land-use changes and the global climate crisis,” says Dr Robin Whytock, post-doctoral research fellow at the University of Stirling. “Appsilon’s work on the Mbaza AI app enables conservationists to rapidly identify and respond to threats to biodiversity. The project started with 200 camera traps in Lopé and Waka national parks in Gabon but, since then, hundreds more have been deployed by different organisations across west and central Africa. In Gabon, the government and national parks agency are aiming to deploy cameras across the entire country. Mbaza AI can help all these projects speed up data analysis.”


Read More »



How to Use AI to Talk to Whales—and Save Life on Earth

Via Wired, a look at how – with ecosystems in crisis – engineers and scientists are using AI to decipher what animals are saying, with the hope that – by truly listening to nature – humans will decide to protect it:

BEFORE MICHELLE FOURNET moved to Alaska on a whim in her early twenties, she’d never seen a whale. She took a job on a whale watching boat and, each day she was out on the water, gazed at the grand shapes moving under the surface. For her entire life, she realized, the natural world had been out there, and she’d been missing it. “I didn’t even know I was bereft,” she recalls. Later, as a graduate student in marine biology, Fournet wondered what else she was missing. The humpbacks she was getting to know revealed themselves in partial glimpses. What if she could hear what they were saying? She dropped a hydrophone in the water—but the only sound that came through was the mechanical churn of boats. The whales had fallen silent amid the racket. Just as Fournet had discovered nature, then, she was witnessing it recede. She resolved to help the whales. To do that, she needed to learn how to listen to them.

Fournet, now a professor at the University of New Hampshire and the director of a collective of conservation scientists, has spent the past decade building a catalog of the various chirps, shrieks, and groans that humpbacks make in daily life. The whales have huge and diverse vocabularies, but there is one thing they all say, whether male or female, young or old. To our meager human ears, it sounds something like a belly rumble punctuated by a water droplet: whup.

Fournet thinks the whup call is how the whales announce their presence to one another. A way of saying, “I’m here.” Last year, as part of a series of experiments to test her theory, Fournet piloted a skiff out into Alaska’s Frederick Sound, where humpbacks gather to feed on clouds of krill. She broadcast a sequence of whup calls and recorded what the whales did in response. Then, back on the beach, she put on headphones and listened to the audio. Her calls went out. The whales’ voices returned through the water: whup, whup, whup. Fournet describes it like this: The whales heard a voice say, “I am, I am here, I am me.” And they replied, “I also am, I am here, I am me.”

Biologists use this type of experiment, called a playback, to study what prompts an animal to speak. Fournet’s playbacks have so far used recordings of real whups. The method is imperfect, though, because humpbacks are highly attentive to who they’re talking to. If a whale recognizes the voice of the whale in the recording, how does that affect its response? Does it talk to a buddy differently than it would to a stranger? As a biologist, how do you ensure you’re sending out a neutral whup?

One answer is to create your own. Fournet has shared her catalog of humpback calls with the Earth Species Project, a group of technologists and engineers who, with the help of AI, are aiming to develop a synthetic whup. And they’re not just planning to emulate a humpback’s voice. The nonprofit’s mission is to open human ears to the chatter of the entire animal kingdom. In 30 years, they say, nature documentaries won’t need soothing Attenborough-style narration, because the dialog of the animals onscreen will be subtitled. And just as engineers today don’t need to know Mandarin or Turkish to build a chatbot in those languages, it will soon be possible to build one that speaks Humpback—or Hummingbird, or Bat, or Bee.

The idea of “decoding” animal communication is bold, maybe unbelievable, but a time of crisis calls for bold and unbelievable measures. Everywhere that humans are, which is everywhere, animals are vanishing. Wildlife populations across the planet have dropped an average of nearly 70 percent in the past 50 years, according to one estimate—and that’s just the portion of the crisis that scientists have measured. Thousands of species could disappear without humans knowing anything about them at all.

To decarbonize the economy and preserve ecosystems, we certainly don’t need to talk to animals. But the more we know about the lives of other creatures, the better we can care for those lives. And humans, being human, pay more attention to those who speak our language. The interaction that Earth Species wants to make possible, Fournet says, “helps a society that is disconnected from nature to reconnect with it.” The best technology gives humans a way to inhabit the world more fully. In that light, talking to animals could be its most natural application yet.

HUMANS HAVE ALWAYS known how to listen to other species, of course. Fishers throughout history collaborated with whales and dolphins to mutual benefit: a fish for them, a fish for us. In 19th-century Australia, a pod of killer whales was known to herd baleen whales into a bay near a whalers’ settlement, then slap their tails to alert the humans to ready the harpoons. (In exchange for their help, the orcas got first dibs on their favorite cuts, the lips and tongue.) Meanwhile, in the icy waters of Beringia, Inupiat people listened and spoke to bowhead whales before their hunts. As the environmental historian Bathsheba Demuth writes in her book Floating Coast, the Inupiat thought of the whales as neighbors occupying “their own country” who chose at times to offer their lives to humans—if humans deserved it.

Commercial whalers had a different approach. They saw whales as floating containers of blubber and baleen. The American whaling industry in the mid-19th century, and then the global whaling industry in the following century, very nearly obliterated several species, resulting in one of the largest-ever losses of wild animal life caused by humans. In the 1960s, 700,000 whales were killed, marking the peak of cetacean death. Then, something remarkable happened: We heard whales sing. On a trip to Bermuda, the biologists Roger and Katy Payne met a US naval engineer named Frank Watlington, who gave them recordings he’d made of strange melodies captured deep underwater. For centuries, sailors had recounted tales of eerie songs that emanated from their boats’ wooden hulls, whether from monsters or sirens they didn’t know. Watlington thought the sounds were from humpback whales. Go save them, he told the Paynes. They did, by releasing an album, Songs of the Humpback Whale, that made these singing whales famous. The Save the Whales movement took off soon after. In 1972, the US passed the Marine Mammal Protection Act; in 1986, commercial whaling was banned by the International Whaling Commission. In barely two decades, whales had transformed in the public eye into cognitively complex and gentle giants of the sea.

Roger Payne, who died earlier this year, spoke frequently about his belief that the more the public could know “curious and fascinating things” about whales, the more people would care what happened to them. In his opinion, science alone would never change the world, because humans don’t respond to data; they respond to emotion—to things that make them weep in awe or shiver with delight. He was in favor of wildlife tourism, zoos, and captive dolphin shows. However compromised the treatment of individual animals might be in these places, he believed, the extinction of a species is far worse. Conservationists have since held on to the idea that contact with animals can save them.

From this premise, Earth Species is taking the imaginative leap that AI can help us make first contact with animals. The organization’s founders, Aza Raskin and Britt Selvitelle, are both architects of our digital age. Raskin grew up in Silicon Valley; his father started Apple’s Macintosh project in the 1970s. Early in his career, Raskin helped to build Firefox, and in 2006 he created the infinite scroll, arguably his greatest and most dubious legacy. Repentant, he later calculated the collective human hours that his invention had wasted and arrived at a figure surpassing 100,000 lifetimes per week.

Raskin would sometimes hang out at a startup called Twitter, where he met Selvitelle, a founding employee. They stayed in touch. In 2013, Raskin heard a news story on the radio about gelada monkeys in Ethiopia whose communication had similar cadences to human speech. So similar, in fact, that the lead scientist would sometimes hear a voice talking to him, turn around, and be surprised to find a monkey there. The interviewer asked whether there was any way of knowing what they were trying to say. There wasn’t—but Raskin wondered if it might be possible to arrive at an answer with machine learning. He brought the idea up with Selvitelle, who had an interest in animal welfare.

For a while the idea was just an idea. Then, in 2017, new research showed that machines could translate between two languages without first being trained on bilingual texts. Google Translate had always mimicked the way a human might use a dictionary, just faster and at scale. But these new machine learning methods bypassed semantics altogether. They treated languages as geometric shapes and found where the shapes overlapped. If a machine could translate any language into English without needing to understand it first, Raskin thought, could it do the same with a gelada monkey’s wobble, an elephant’s infrasound, a bee’s waggle dance? A year later, Raskin and Selvitelle formed Earth Species.
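The "shapes that overlap" idea can be made concrete with a small sketch: given embeddings from two languages that are roughly in correspondence, an orthogonal rotation (the Procrustes solution) maps one cloud onto the other, and nearest neighbours in the shared space act as candidate translations. This is a simplified illustration of the general technique, not Earth Species' method; fully unsupervised alignment, with no seed pairs at all, relies on further machinery such as adversarial training.

```python
# Illustrative sketch of embedding alignment, not Earth Species' method.
# Assumes (n, d) arrays of embeddings for seed pairs that roughly correspond.
import numpy as np

def align_embeddings(source_vecs, target_vecs):
    """Learn an orthogonal map W so that source_vecs @ W lies close to target_vecs."""
    # Orthogonal Procrustes solution: W = U @ Vt from the SVD of X^T Y.
    u, _, vt = np.linalg.svd(source_vecs.T @ target_vecs)
    return u @ vt

def nearest_translation(vec, W, target_vecs, target_words):
    """Map one source vector into the target space and return the nearest word."""
    mapped = vec @ W
    sims = target_vecs @ mapped / (
        np.linalg.norm(target_vecs, axis=1) * np.linalg.norm(mapped)
    )
    return target_words[int(np.argmax(sims))]
```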

Raskin believes that the ability to eavesdrop on animals will spur nothing less than a paradigm shift as historically significant as the Copernican revolution. He is fond of saying that “AI is the invention of modern optics.” By this he means that just as improvements to the telescope allowed 17th-century astronomers to perceive newfound stars and finally displace the Earth from the center of the cosmos, AI will help scientists hear what their ears alone cannot: that animals speak meaningfully, and in more ways than we can imagine. That their abilities, and their lives, are not less than ours. “This time we’re going to look out to the universe and discover humanity is not the center,” Raskin says.

Raskin and Selvitelle spent their first few years meeting with biologists and tagging along on fieldwork. They soon realized that the most obvious and immediate need in front of them wasn’t inciting revolution. It was sorting data. Two decades ago, a primate researcher would stand under a tree and hold a microphone in the air until her arm got tired. Now researchers can stick a portable biologger to a tree and collect a continuous stream of audio for a year. The many terabytes of data that result is more than any army of grad students could hope to tackle. But feed all this material to trained machine learning algorithms, and the computer can scan the data and flag the animal calls. It can distinguish a whup from a whistle. It can tell a gibbon’s voice from her brother’s. At least, that’s the hope. These tools need more data, research, and funding. Earth Species has a workforce of 15 people and a budget of a few million dollars. They’ve teamed up with several dozen biologists to start making headway on these practical tasks.

An early project took on one of the most significant challenges in animal communication research, known as the cocktail party problem: When a group of animals are talking to one another, how can you tell who’s saying what? In the open sea, schools of dolphins a thousand strong chatter all at once; scientists who record them end up with audio as dense with whistles and clicks as a stadium is with cheers. Even audio of just two or three animals is often unusable, says Laela Sayigh, an expert in bottlenose dolphin whistles, because you can’t tell where one dolphin stops talking and another starts. (Video doesn’t help, because dolphins don’t open their mouths when they speak.) Earth Species used Sayigh’s extensive database of signature whistles—the ones likened to names—to develop a neural network model that could separate overlapping animal voices. That model was useful only in lab conditions, but research is meant to be built on. A couple of months later, Google AI published a model for untangling wild birdsong.
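One common way to train such a separation model, sketched below, is to build the training data synthetically: overlay isolated calls that are already labelled, then ask a network to recover the originals from the mixture, scoring it with a permutation-invariant loss. This is a generic illustration of the approach, not Earth Species' or Google's actual model; the functions and offsets are hypothetical.

```python
# Illustrative training-data construction for source separation, not Earth
# Species' actual model. `call_a` and `call_b` are 1-D NumPy arrays of two
# isolated, labelled calls; `rng` is a numpy.random.Generator.
import numpy as np

def make_training_pair(call_a, call_b, rng):
    """Overlay two isolated calls at a random offset to simulate an overlap."""
    n = max(len(call_a), len(call_b))
    offset = int(rng.integers(0, n - len(call_b) + 1))
    mix = np.zeros(n)
    mix[: len(call_a)] += call_a
    mix[offset : offset + len(call_b)] += call_b
    # The separation model's job: given `mix`, output two tracks matching the
    # originals (scored with a permutation-invariant loss, since output order
    # is arbitrary).
    targets = np.zeros((2, n))
    targets[0, : len(call_a)] = call_a
    targets[1, offset : offset + len(call_b)] = call_b
    return mix, targets
```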

Sayigh has proposed a tool that can serve as an emergency alert for dolphin mass strandings, which tend to recur in certain places around the globe. She lives in Cape Cod, Massachusetts, one such hot spot, where as often as a dozen times a year groups of dolphins get disoriented, inadvertently swim onto shore, and perish. Fortunately, there might be a way to predict this before it happens, Sayigh says. She hypothesizes that when the dolphins are stressed, they emit signature whistles more than usual, just as someone lost in a snowstorm might call out in panic. A computer trained to listen for these whistles could send an alert that prompts rescuers to reroute the dolphins before they hit the beach. In the Salish Sea—where, in 2018, a mother orca towing the body of her starved calf attracted global sympathy—there is an alert system, built by Google AI, that listens for resident killer whales and diverts ships out of their way.
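The alert logic Sayigh envisions could be as simple as rate monitoring, as in the hypothetical sketch below: count detected signature whistles over a rolling window and notify rescuers when the rate far exceeds a baseline. The thresholds, window length and `notify` hook are assumptions, not part of any deployed system.

```python
# Hypothetical sketch of a stranding early-warning rule, not a deployed system.
# `notify` is an assumed callable; the baseline rate and multiplier are invented.
import time
from collections import deque

BASELINE_WHISTLES_PER_HOUR = 20
ALERT_MULTIPLIER = 5           # alert when the hourly rate is 5x the baseline

recent_whistles = deque()

def on_signature_whistle(notify, now=None):
    """Record one detected signature whistle and alert if the rate spikes."""
    now = time.time() if now is None else now
    recent_whistles.append(now)
    while recent_whistles and recent_whistles[0] < now - 3600:  # one-hour window
        recent_whistles.popleft()
    if len(recent_whistles) >= BASELINE_WHISTLES_PER_HOUR * ALERT_MULTIPLIER:
        notify(f"{len(recent_whistles)} signature whistles in the last hour; possible stranding risk")
```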

For researchers and conservationists alike, the potential applications of machine learning are basically limitless. And Earth Species is not the only group working on decoding animal communication. Payne spent the last months of his life advising for Project CETI, a nonprofit that built a base in Dominica this year for the study of sperm whale communication. “Just imagine what would be possible if we understood what animals are saying to each other; what occupies their thoughts; what they love, fear, desire, avoid, hate, are intrigued by, and treasure,” he wrote in Time in June.

Many of the tools that Earth Species has developed so far offer more in the way of groundwork than immediate utility. Still, there’s a lot of optimism in this nascent field. With enough resources, several biologists told me, decoding is scientifically achievable. That’s only the beginning. The real hope is to bridge the gulf in understanding between an animal’s experience and ours, however vast—or narrow—that might be.

ARI FRIEDLAENDER HAS something that Earth Species needs: lots and lots of data. Friedlaender researches whale behavior at UC Santa Cruz. He got started as a tag guy: the person who balances at the edge of a boat as it chases a whale, holds out a long pole with a suction-cupped biologging tag attached to the end, and slaps the tag on a whale’s back as it rounds the surface. This is harder than it seems. Friedlaender proved himself adept—“I played sports in college,” he explains—and was soon traveling the seas on tagging expeditions.

The tags Friedlaender uses capture a remarkable amount of data. Each records not only GPS location, temperature, pressure, and sound, but also high-definition video and three-axis accelerometer data, the same tech that a Fitbit uses to count your steps or measure how deeply you’re sleeping. Taken together, the data illustrates, in cinematic detail, a day in the life of a whale: its every breath and every dive, its traverses through fields of sea nettles and jellyfish, its encounters with twirling sea lions.
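To give a sense of what one moment of such a tag record might contain, here is a hypothetical sketch of a single sample; the field names, units and the simple derived-behaviour check are assumptions, not the tag's real data format.

```python
# Hypothetical sketch of one sample from a biologging tag; field names and
# units are assumptions, not the tag's real data format.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TagSample:
    timestamp_s: float
    latitude: float
    longitude: float
    depth_m: float                          # derived from the pressure sensor
    temperature_c: float
    accel_xyz: Tuple[float, float, float]   # three-axis accelerometer reading, in g
    audio: bytes                            # hydrophone samples for this instant
    video_frame: bytes                      # HD video frame, when the camera is recording

def is_diving(prev: TagSample, cur: TagSample) -> bool:
    """A simple derived behaviour: the whale is descending if depth is increasing."""
    return cur.depth_m > prev.depth_m
```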

Friedlaender shows me an animation he has made from one tag’s data. In it, a whale descends and loops through the water, traveling a multicolored three-dimensional course as if on an undersea Mario Kart track. Another animation depicts several whales blowing bubble nets, a feeding strategy in which they swim in circles around groups of fish, trap the fish in the center with a wall of bubbles, then lunge through, mouths gaping. Looking at the whales’ movements, I notice that while most of them have traced a neat spiral, one whale has produced a tangle of clumsy zigzags. “Probably a young animal,” Friedlaender says. “That one hasn’t figured things out yet.”

Friedlaender’s multifaceted data is especially useful for Earth Species because, as any biologist will tell you, animal communication isn’t purely verbal. It involves gestures and movement just as often as vocalizations. Diverse data sets get Earth Species closer to developing algorithms that can work across the full spectrum of the animal kingdom. The organization’s most recent work focuses on foundation models, the same kind of computation that powers generative AI like ChatGPT. Earlier this year, Earth Species published the first foundation model for animal communication. The model can already accurately sort beluga whale calls, and Earth Species plans to apply it to species as disparate as orangutans (who bellow), elephants (who send seismic rumbles through the ground), and jumping spiders (who vibrate their legs). Katie Zacarian, Earth Species’ CEO, describes the model this way: “Everything’s a nail, and it’s a hammer.”

Another application of Earth Species’ AI is generating animal calls, like an audio version of GPT. Raskin has made a few-second chirp of a chiffchaff bird. If this sounds like it’s getting ahead of decoding, it is—AI, as it turns out, is better at speaking than understanding. Earth Species is finding that the tools it is developing will likely have the ability to talk to animals even before they can decode. It may soon be possible, for example, to prompt an AI with a whup and have it continue a conversation in Humpback—without human observers knowing what either the machine or the whale is saying.

No one is expecting such a scenario to actually take place; that would be scientifically irresponsible, for one thing. The biologists working with Earth Species are motivated by knowledge, not dialog for the sake of it. Felix Effenberger, a senior AI research adviser for Earth Species, told me: “I don’t believe that we will have an English-Dolphin translator, OK? Where you put English into your smartphone and then it makes dolphin sounds and the dolphin goes off and fetches you some sea urchin. The goal is to first discover basic patterns of communication.”

So what will talking to animals look—sound—like? It needn’t be a free-form conversation to be astonishing. Speaking to animals in a controlled way, as with Fournet’s playback whups, is probably essential for scientists to try to understand them. After all, you wouldn’t try to learn German by going to a party in Berlin and sitting mutely in a corner.

Bird enthusiasts already use apps to snatch melodies out of the air and identify which species is singing. With an AI as your animal interpreter, imagine what more you could learn. You prompt it to make the sound of two humpbacks meeting, and it produces a whup. You prompt it to make the sound of a calf talking to its mother, and it produces a whisper. You prompt it to make the sound of a lovelorn male, and it produces a song.

NO SPECIES OF whale has ever been driven extinct by humans. This is hardly a victory. Numbers are only one measure of biodiversity. Animal lives are rich with all that they are saying and doing—with culture. While humpback populations have rebounded since their lowest point a half-century ago, what songs, what practices, did they lose in the meantime? Blue whales, hunted down to a mere 1 percent of their population, might have lost almost everything.

Christian Rutz, a biologist at the University of St. Andrews, believes that one of the essential tasks of conservation is to preserve nonhuman ways of being. “You’re not asking, ‘Are you there or are you not there?’” he says. “You are asking, ‘Are you there and happy, or unhappy?’”

Rutz is studying how the communication of Hawaiian crows has changed since 2002, when they went extinct in the wild. About 100 of these remarkable birds—one of few species known to use tools—are alive in protective captivity, and conservationists hope to eventually reintroduce them to the wild. But these crows may not yet be prepared. There is some evidence that the captive birds have forgotten useful vocabulary, including calls to defend their territory and warn of predators. Rutz is working with Earth Species to build an algorithm to sift through historical recordings of the extinct wild crows, pull out all the crows’ calls, and label them. If they find that calls were indeed lost, conservationists might generate those calls to teach them to the captive birds.

Rutz is careful to say that generating calls will be a decision made thoughtfully, when the time requires it. In a paper published in Science in July, he praised the extraordinary usefulness of machine learning. But he cautions that humans should think hard before intervening in animal lives. Just as AI’s potential remains unknown, it may carry risks that extend beyond what we can imagine. Rutz cites as an example the new songs composed each year by humpback whales that spread across the world like hit singles. Should these whales pick up on an AI-generated phrase and incorporate that into their routine, humans would be altering a million-year-old culture. “I think that is one of the systems that should be off-limits, at least for now,” he told me. “Who has the right to have a chat with a humpback whale?”

It’s not hard to imagine how AI that speaks to animals could be misused. Twentieth-century whalers employed the new technology of their day, too, emitting sonar at a frequency that drove whales to the surface in panic. But AI tools are only as good or bad as the things humans do with them. Tom Mustill, a conservation documentarian and the author of How to Speak Whale, suggests giving animal-decoding research the same resources as the most championed of scientific endeavors, like the Large Hadron Collider, the Human Genome Project, and the James Webb Space Telescope. “With so many technologies,” he told me, “it’s just left to the people who have developed it to do what they like until the rest of the world catches up. This is too important to let that happen.”

Billions of dollars are being funneled into AI companies, much of it in service of corporate profits: writing emails more quickly, creating stock photos more efficiently, delivering ads more effectively. Meanwhile, the mysteries of the natural world remain. One of the few things scientists know with certainty is how much they don’t know. When I ask Friedlaender whether spending so much time chasing whales has taught him much about them, he tells me he sometimes gives himself a simple test: After a whale goes under the surface, he tries to predict where it will come up next. “I close my eyes and say, ‘OK, I’ve put out 1,000 tags in my life, I’ve seen all this data. The whale is going to be over here.’ And the whale’s always over there,” he says. “I have no idea what these animals are doing.”

IF YOU COULD speak to a whale, what would you say? Would you ask White Gladis, the killer whale elevated to meme status this summer for sinking yachts off the Iberian coast, what motivated her rampage—fun, delusion, revenge? Would you tell Tahlequah, the mother orca grieving the death of her calf, that you, too, lost a child? Payne once said that if given the chance to speak to a whale, he’d like to hear its normal gossip: loves, feuds, infidelities. Also: “Sorry would be a good word to say.”

Then there is that thorny old philosophical problem. The question of umwelt, and what it’s like to be a bat, or a whale, or you. Even if we could speak to a whale, would we understand what it says? Or would its perception of the world, its entire ordering of consciousness, be so alien as to be unintelligible? If machines render human languages as shapes that overlap, perhaps English is a doughnut and Whalish is the hole.

Maybe, before you can speak to a whale, you must know what it is like to have a whale’s body. It is a body 50 million years older than our body. A body shaped to the sea, to move effortlessly through crushing depths, to counter the cold with sheer mass. As a whale, you choose when to breathe, or not. Mostly you are holding your breath. Because of this, you cannot smell or taste. You do not have hands to reach out and touch things with. Your eyes are functional, but sunlight penetrates water poorly. Usually you can’t even make out your own tail through the fog.

You would live in a cloud of hopeless obscurity were it not for your ears. Sound travels farther and faster through water than through air, and your world is illuminated by it. For you, every dark corner of the ocean rings with sound. You hear the patter of rain on the surface, the swish of krill, the blasts of oil drills. If you’re a sperm whale, you spend half your life in the pitch black of the deep sea, hunting squid by ear. You use sound to speak, too, just as humans do. But your voice, rather than dissipating instantly in the thin substance of air, sustains. Some whales can shout louder than a jet engine, their calls carrying 10,000 miles across the ocean floor.

But what is it like to be you, a whale? What thoughts do you think, what feelings do you feel? These are much harder things for scientists to know. A few clues come from observing how you talk to your own kind. If you’re born into a pod of killer whales, close-knit and xenophobic, one of the first things your mother and your grandmother teach you is your clan name. To belong must feel essential. (Remember Keiko, the orca who starred in the film Free Willy: When he was released to his native waters late in life, he failed to rejoin the company of wild whales and instead returned to die among humans.) If you’re a female sperm whale, you click to your clanmates to coordinate who’s watching whose baby; meanwhile, the babies babble back. You live on the go, constantly swimming to new waters, cultivating a disposition that is nervous and watchful. If you’re a male humpback, you spend your time singing alone in icy polar waters, far from your nearest companion. To infer loneliness, though, would be a human’s mistake. For a whale whose voice reaches across oceans, perhaps distance does not mean solitude. Perhaps, as you sing, you are always in conversation.

MICHELLE FOURNET WONDERS: How do we know whales would want to talk to us anyway? What she loves most about humpbacks is their indifference. “This animal is 40 feet long and weighs 75,000 pounds, and it doesn’t give a shit about you,” she told me. “Every breath it takes is grander than my entire existence.” Roger Payne observed something similar. He considered whales the only animal capable of an otherwise impossible feat: making humans feel small.

Early one morning in Monterey, California, I boarded a whale watching boat. The water was slate gray with white peaks. Flocks of small birds skittered across the surface. Three humpbacks appeared, backs rounding neatly out of the water. They flashed some tail, which was good for the group’s photographers. The fluke’s craggy ridge-line can be used, like a fingerprint, to distinguish individual whales.

Later, I uploaded a photo of one of the whales to Happywhale. The site identifies whales using a facial recognition algorithm modified for flukes. The humpback I submitted, one with a barnacle-encrusted tail, came back as CRC-19494. Seventeen years ago, this whale had been spotted off the west coast of Mexico. Since then, it had made its way up and down the Pacific between Baja and Monterey Bay. For a moment, I was impressed that this site could so easily fish an animal out of the ocean and deliver me a name. But then again, what did I know about this whale? Was it a mother, a father? Was this whale on Happywhale actually happy? The AI had no answers. I searched the whale’s profile and found a gallery of photos, from different angles, of a barnacled fluke. For now, that was all I could know.


Read More »



Digital Reefs

Via Woods Hole Oceanographic Institution, a look at efforts to create the first coral reef digital twin, a multidisciplinary effort led by Woods Hole Oceanographic Institution:

The National Science Foundation (NSF) has awarded Woods Hole Oceanographic Institution (WHOI) $5 million to participate in NSF’s groundbreaking Convergence Accelerator Program. The project, led by WHOI scientist Anne Cohen, builds the world’s first Coral Reef Digital Twin, a 4-dimensional virtual replica of a living coral reef powered by state-of-the-art data and models. “Digital Reefs” will be accessible and usable by coral reef stakeholders around the world who are making critical decisions every day to manage and protect these valuable ocean ecosystems.

The Phase 2 team includes: Siemens Technology, The Nature Conservancy (TNC), Scripps Institution of Oceanography at UC San Diego, and Stanford University, in addition to WHOI. Also on the team are Mote Marine Laboratory, the Marshall Islands Conservation Society, University of Guam, National Oceanic and Atmospheric Administration (NOAA), National Academy of Marine Research (NAMR) Taiwan, and Ebiil Society, Palau, whose major role will be to develop user-inspired modules, lead training workshops and advance the use of Digital Reefs in reef restoration.

“Globally, coral reefs support almost one billion people and are one of the most diverse ecosystems on the planet. But coral reefs continue to decline at an unprecedented rate, despite a surge in data, funding, and political will in the last couple of decades,” said Cohen. “We found the problem is not lack of access to data and information. It is a lack of access, by decision makers—fishermen, managers, risk assessors, government agencies—to intuitive, interactive, actionable data. Almost everyone nowadays has a cellphone, there is a lot of data out there, people just can’t use it.”

“Our goal is to facilitate universal access to data and information that right now are available to just a handful of people. Democratization of information is the only way we can truly ensure that coral reefs have a future on our planet, and Digital Reefs is how we get there,” said Cohen.

The 21st century has brought with it unprecedented challenges for coral reefs, mainly from climate change, demanding new and innovative approaches to management, conservation, and restoration. Fundamental to effective decision-making is access to science-based data, information, and decision making tools.

“As reefs around the world suffer, so do the diverse and often vulnerable coastal ecosystems and humans that depend upon them,” said Joe Pollock, Ph.D., Senior Coral Reef Resilience Scientist at TNC. “We work to empower local communities with the tools, information, and partnerships needed to better safeguard reefs and the lives and livelihoods they sustain. Digital Reefs has the potential to revolutionize reef management and restoration by providing fine-scale, actionable information in an immersive, engaging, and highly visual format.”

Digital Twins are already widely used in industry and healthcare, where exact virtual replicas of engines, railway networks, and even human bodies are used to understand and test what-if scenarios, facilitate collaboration amongst different teams, and assist with decision making. The WHOI-led project, Digital Reefs: A Globally Coordinated, Universally Accessible Digital Twin Network for the Coral Reef Blue Economy, will develop the Digital Reefs prototype of Palmyra Atoll, and apply the prototype technology to reefs in the Marshall Islands and Taiwan as a test of their scaling model to build a global Digital Reefs network.

The Coral Reef Digital Twin is a virtual representation of a real reef, with all its features, consistently updated with new data from sensors and satellites. The digital twin gives users access to the dynamic, 3-dimensional system from a laptop or cellphone anywhere in the world, providing in real time the information needed for sustainable harvesting of reef resources. Image credit: Cohen Lab © Woods Hole Oceanographic Institution

Digital Reefs translates complex data and numerical model output into a digital replica of a coral reef. Users accessing the Digital Reef from their computer or cell phone will be immersed in a dynamic 4-D visualization of currents and corals, rather than a spreadsheet of numbers, although the supporting data will be downloadable. Users can move across the reef, and scuba dive into its depths to access information about water flow, temperatures, and reef inhabitants. The Digital Reefs platform will offer users the opportunity to visualize the reef in years past, and in the future as sea level rises and the ocean warms. Decision making tools enable users to change the reef —virtually— and examine the impact of the proposed changes, such as building a hotel or dredging a channel for boat access. Stakeholders can visualize how future climate change will affect their reef and the specific areas where they fish and farm. Restoration practitioners will be able to test which areas of the reef are optimal for restoration and to visualize larval dispersal patterns from restored areas.

“We’re in a crisis of survival for coral reefs,” said Stanford’s Steve Palumbi, “and we’re also in a crisis of data complexity. Everything about a reef is complex, so how do you make a decision about saving your local reef for your community and family when the answer is complicated? Digital Reefs takes data from the physics, biology, physiology, and ecology of a complex reef and shows you what its future is likely to be, depending on what you do, and decisions you make. We are all excited about contributing what we do best, to make this effort work.”

“Our work with NSF and WHOI highlights the boundless opportunities we now have available to us through digital twin technology,” said Virginie Maillard, Head of Siemens Technology US. “As we enter the next phase of this project, Siemens Technology will leverage its expertise in industrial digital twin to create a tangible digital twin of the coral reef that can be utilized by all, no matter background or expertise, for the greater purpose of collaboration to save our planet’s marine ecosystem.”

“There is a unique opportunity in offering people immersive access to underwater habitats,” said Stuart Sandin, professor of marine ecology at UC San Diego’s Scripps Institution of Oceanography. “The growth of digital data is a first step in understanding coral reefs, but the critical next step is in providing easy access to these data; the Digital Reefs project will build the tools to provide just this access.”

In September 2021, the NSF Convergence Accelerator Program launched the 2021 cohort and awarded $21 million across 28 multidisciplinary teams, focused on Phase 1 of the Track E: Networked Blue Economy and Track F: Trust & Authenticity in Communication Systems. In September 2022, NSF made an additional $30 million investment in the Networked Blue Economy to advance use-inspired solutions addressing national-scale societal challenges. Track E: The Networked Blue Economy aims to create a smart, integrated, connected, and open ecosystem for ocean innovation, exploration, and sustainable utilization. The Track E cohort comprised 16 teams in Phase 1 and now comprises six teams in Phase 2.

This work was funded by a grant from the National Science Foundation, and leveraged previous research that was supported by the Arthur Vining Davis Foundation.


Read More »



AI-Powered Marine Mammal Spotting App

An innovative new app, SeaSpotter, capitalizes on both AI and citizen scientists to support marine research by allowing anyone to simply take a picture of any marine mammal they encounter. SeaSpotter then identifies which animal was spotted and logs the finding to its open-source database, ready for scientific research.
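A rough sketch of that flow, from photo to database record, might look like the following; the function names, record fields and confidence handling are hypothetical, not SeaSpotter's actual API.

```python
# Hypothetical sketch of the photo-to-database flow; function names, record
# fields and the return message are assumptions, not SeaSpotter's real API.
from datetime import datetime, timezone

def handle_sighting(photo, location, classify, database):
    """Identify the animal in a submitted photo and log the sighting."""
    species, confidence = classify(photo)
    record = {
        "species": species,
        "confidence": round(confidence, 2),
        "location": location,                       # (lat, lon) from the phone
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    database.insert(record)                         # open-access sightings log
    return f"You spotted a {species} ({confidence:.0%} confidence)"
```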

Read More »


ABOUT
Networked Nature
New technical innovations such as location-tracking devices, GPS and satellite communications, remote sensors, laser-imaging technologies, light detection and ranging (LIDAR) sensing, high-resolution satellite imagery, digital mapping, advanced statistical analytical software and even biotechnology and synthetic biology are revolutionizing conservation in two key ways: first, by revealing the state of our world in unprecedented detail; and, second, by making available more data to more people in more places. The mission of this blog is to track these technical innovations that may give conservation the chance – for the first time – to keep up with, and even get ahead of, the planet’s most intractable environmental challenges. It will also examine the unintended consequences and moral hazards that the use of these new tools may cause.