Archive for the ‘Artificial Intelligence’ Category

How to Use AI to Talk to Whales—and Save Life on Earth

Via Wired, a look at how – with ecosystems in crisis – engineers and scientists are using AI to decipher what animals are saying, with the hope that – by truly listening to nature – humans will decide to protect it:

BEFORE MICHELLE FOURNET moved to Alaska on a whim in her early twenties, she’d never seen a whale. She took a job on a whale watching boat and, each day she was out on the water, gazed at the grand shapes moving under the surface. For her entire life, she realized, the natural world had been out there, and she’d been missing it. “I didn’t even know I was bereft,” she recalls. Later, as a graduate student in marine biology, Fournet wondered what else she was missing. The humpbacks she was getting to know revealed themselves in partial glimpses. What if she could hear what they were saying? She dropped a hydrophone in the water—but the only sound that came through was the mechanical churn of boats. The whales had fallen silent amid the racket. Just as Fournet had discovered nature, then, she was witnessing it recede. She resolved to help the whales. To do that, she needed to learn how to listen to them.

Fournet, now a professor at the University of New Hampshire and the director of a collective of conservation scientists, has spent the past decade building a catalog of the various chirps, shrieks, and groans that humpbacks make in daily life. The whales have huge and diverse vocabularies, but there is one thing they all say, whether male or female, young or old. To our meager human ears, it sounds something like a belly rumble punctuated by a water droplet: whup.

Fournet thinks the whup call is how the whales announce their presence to one another. A way of saying, “I’m here.” Last year, as part of a series of experiments to test her theory, Fournet piloted a skiff out into Alaska’s Frederick Sound, where humpbacks gather to feed on clouds of krill. She broadcast a sequence of whup calls and recorded what the whales did in response. Then, back on the beach, she put on headphones and listened to the audio. Her calls went out. The whales’ voices returned through the water: whup, whup, whup. Fournet describes it like this: The whales heard a voice say, “I am, I am here, I am me.” And they replied, “I also am, I am here, I am me.”

Biologists use this type of experiment, called a playback, to study what prompts an animal to speak. Fournet’s playbacks have so far used recordings of real whups. The method is imperfect, though, because humpbacks are highly attentive to who they’re talking to. If a whale recognizes the voice of the whale in the recording, how does that affect its response? Does it talk to a buddy differently than it would to a stranger? As a biologist, how do you ensure you’re sending out a neutral whup?

One answer is to create your own. Fournet has shared her catalog of humpback calls with the Earth Species Project, a group of technologists and engineers who, with the help of AI, are aiming to develop a synthetic whup. And they’re not just planning to emulate a humpback’s voice. The nonprofit’s mission is to open human ears to the chatter of the entire animal kingdom. In 30 years, they say, nature documentaries won’t need soothing Attenborough-style narration, because the dialog of the animals onscreen will be subtitled. And just as engineers today don’t need to know Mandarin or Turkish to build a chatbot in those languages, it will soon be possible to build one that speaks Humpback—or Hummingbird, or Bat, or Bee.

The idea of “decoding” animal communication is bold, maybe unbelievable, but a time of crisis calls for bold and unbelievable measures. Everywhere that humans are, which is everywhere, animals are vanishing. Wildlife populations across the planet have dropped an average of nearly 70 percent in the past 50 years, according to one estimate—and that’s just the portion of the crisis that scientists have measured. Thousands of species could disappear without humans knowing anything about them at all.

To decarbonize the economy and preserve ecosystems, we certainly don’t need to talk to animals. But the more we know about the lives of other creatures, the better we can care for those lives. And humans, being human, pay more attention to those who speak our language. The interaction that Earth Species wants to make possible, Fournet says, “helps a society that is disconnected from nature to reconnect with it.” The best technology gives humans a way to inhabit the world more fully. In that light, talking to animals could be its most natural application yet.

HUMANS HAVE ALWAYS known how to listen to other species, of course. Fishers throughout history collaborated with whales and dolphins to mutual benefit: a fish for them, a fish for us. In 19th-century Australia, a pod of killer whales was known to herd baleen whales into a bay near a whalers’ settlement, then slap their tails to alert the humans to ready the harpoons. (In exchange for their help, the orcas got first dibs on their favorite cuts, the lips and tongue.) Meanwhile, in the icy waters of Beringia, Inupiat people listened and spoke to bowhead whales before their hunts. As the environmental historian Bathsheba Demuth writes in her book Floating Coast, the Inupiat thought of the whales as neighbors occupying “their own country” who chose at times to offer their lives to humans—if humans deserved it.

Commercial whalers had a different approach. They saw whales as floating containers of blubber and baleen. The American whaling industry in the mid-19th century, and then the global whaling industry in the following century, very nearly obliterated several species, resulting in one of the largest-ever losses of wild animal life caused by humans. In the 1960s, 700,000 whales were killed, marking the peak of cetacean death. Then, something remarkable happened: We heard whales sing. On a trip to Bermuda, the biologists Roger and Katy Payne met a US naval engineer named Frank Watlington, who gave them recordings he’d made of strange melodies captured deep underwater. For centuries, sailors had recounted tales of eerie songs that emanated from their boats’ wooden hulls, whether from monsters or sirens they didn’t know. Watlington thought the sounds were from humpback whales. Go save them, he told the Paynes. They did, by releasing an album, Songs of the Humpback Whale, that made these singing whales famous. The Save the Whales movement took off soon after. In 1972, the US passed the Marine Mammal Protection Act; in 1986, commercial whaling was banned by the International Whaling Commission. In barely two decades, whales had transformed in the public eye into cognitively complex and gentle giants of the sea.

Roger Payne, who died earlier this year, spoke frequently about his belief that the more the public could know “curious and fascinating things” about whales, the more people would care what happened to them. In his opinion, science alone would never change the world, because humans don’t respond to data; they respond to emotion—to things that make them weep in awe or shiver with delight. He was in favor of wildlife tourism, zoos, and captive dolphin shows. However compromised the treatment of individual animals might be in these places, he believed, the extinction of a species is far worse. Conservationists have since held on to the idea that contact with animals can save them.

From this premise, Earth Species is taking the imaginative leap that AI can help us make first contact with animals. The organization’s founders, Aza Raskin and Britt Selvitelle, are both architects of our digital age. Raskin grew up in Silicon Valley; his father started Apple’s Macintosh project in the 1970s. Early in his career, Raskin helped to build Firefox, and in 2006 he created the infinite scroll, arguably his greatest and most dubious legacy. Repentant, he later calculated the collective human hours that his invention had wasted and arrived at a figure surpassing 100,000 lifetimes per week.

Raskin would sometimes hang out at a startup called Twitter, where he met Selvitelle, a founding employee. They stayed in touch. In 2013, Raskin heard a news story on the radio about gelada monkeys in Ethiopia whose communication had similar cadences to human speech. So similar, in fact, that the lead scientist would sometimes hear a voice talking to him, turn around, and be surprised to find a monkey there. The interviewer asked whether there was any way of knowing what they were trying to say. There wasn’t—but Raskin wondered if it might be possible to arrive at an answer with machine learning. He brought the idea up with Selvitelle, who had an interest in animal welfare.

For a while the idea was just an idea. Then, in 2017, new research showed that machines could translate between two languages without first being trained on bilingual texts. Google Translate had always mimicked the way a human might use a dictionary, just faster and at scale. But these new machine learning methods bypassed semantics altogether. They treated languages as geometric shapes and found where the shapes overlapped. If a machine could translate any language into English without needing to understand it first, Raskin thought, could it do the same with a gelada monkey’s wobble, an elephant’s infrasound, a bee’s waggle dance? A year later, Raskin and Selvitelle formed Earth Species.
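The "geometric shapes" idea can be made concrete with a small sketch. The 2017 results Raskin refers to align two embedding spaces with no bilingual data at all; the toy below cheats by assuming a handful of matched anchor pairs and uses orthogonal Procrustes, a standard alignment technique rather than the specific method in that research, purely to show how one vocabulary's shape can be rotated onto another's.

```python
# Minimal sketch of aligning two embedding "shapes" with orthogonal Procrustes.
# Unlike the unsupervised 2017 methods, this toy assumes a few matched anchor
# pairs; all names, dimensions, and data here are illustrative.
import numpy as np

def procrustes_align(src: np.ndarray, tgt: np.ndarray) -> np.ndarray:
    """Return the orthogonal map W such that src @ W.T best approximates tgt."""
    u, _, vt = np.linalg.svd(tgt.T @ src)
    return u @ vt

rng = np.random.default_rng(0)
dim = 8
hidden_rotation = np.linalg.qr(rng.normal(size=(dim, dim)))[0]   # the unknown "translation"
language_a = rng.normal(size=(100, dim))                         # embeddings for language A
language_b = language_a @ hidden_rotation.T                      # same concepts, rotated: language B

W = procrustes_align(language_a[:20], language_b[:20])           # learn the map from 20 anchors
print("max alignment error:", float(np.abs(language_a @ W.T - language_b).max()))
```

Once the spaces are aligned, "translation" reduces to nearest-neighbor lookup across them, which is why the approach is, in principle, indifferent to whether the shapes came from Turkish, Mandarin, or something stranger.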

Raskin believes that the ability to eavesdrop on animals will spur nothing less than a paradigm shift as historically significant as the Copernican revolution. He is fond of saying that “AI is the invention of modern optics.” By this he means that just as improvements to the telescope allowed 17th-century astronomers to perceive newfound stars and finally displace the Earth from the center of the cosmos, AI will help scientists hear what their ears alone cannot: that animals speak meaningfully, and in more ways than we can imagine. That their abilities, and their lives, are not less than ours. “This time we’re going to look out to the universe and discover humanity is not the center,” Raskin says.

Raskin and Selvitelle spent their first few years meeting with biologists and tagging along on fieldwork. They soon realized that the most obvious and immediate need in front of them wasn’t inciting revolution. It was sorting data. Two decades ago, a primate researcher would stand under a tree and hold a microphone in the air until her arm got tired. Now researchers can stick a portable biologger to a tree and collect a continuous stream of audio for a year. The many terabytes of data that result are more than any army of grad students could hope to tackle. But feed all this material to trained machine learning algorithms, and the computer can scan the data and flag the animal calls. It can distinguish a whup from a whistle. It can tell a gibbon’s voice from her brother’s. At least, that’s the hope. These tools need more data, research, and funding. Earth Species has a workforce of 15 people and a budget of a few million dollars. They’ve teamed up with several dozen biologists to start making headway on these practical tasks.
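That scanning-and-flagging step is the unglamorous core of the work. As a rough illustration only, the sketch below marks stretches of a recording where energy in a low-frequency band jumps above the background; the band edges, window sizes, and threshold are arbitrary assumptions, and production pipelines use trained neural networks rather than a simple energy rule.

```python
# Toy call detector: flag windows where energy in a low-frequency band rises
# well above the recording's background level. Thresholds are illustrative.
import numpy as np
from scipy import signal

def flag_candidate_calls(audio, sample_rate, band=(50.0, 1000.0), threshold_db=10.0):
    """Return (start_s, end_s) spans whose in-band energy exceeds the median by threshold_db."""
    freqs, times, sxx = signal.spectrogram(audio, fs=sample_rate, nperseg=1024, noverlap=512)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_db = 10 * np.log10(sxx[in_band].sum(axis=0) + 1e-12)
    hot = band_db > np.median(band_db) + threshold_db
    spans, start = [], None
    for time, is_hot in zip(times, hot):
        if is_hot and start is None:
            start = time
        elif not is_hot and start is not None:
            spans.append((round(start, 2), round(time, 2)))
            start = None
    if start is not None:
        spans.append((round(start, 2), round(times[-1], 2)))
    return spans

# Synthetic demo: 60 s of faint noise with a 2 s, 300 Hz "whup"-like tone at t = 20 s.
sample_rate = 8000
t = np.arange(60 * sample_rate) / sample_rate
audio = 0.01 * np.random.default_rng(1).normal(size=t.size)
audio[20 * sample_rate:22 * sample_rate] += 0.5 * np.sin(2 * np.pi * 300 * t[:2 * sample_rate])
print(flag_candidate_calls(audio, sample_rate))   # roughly [(20.0, 22.0)]
```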

An early project took on one of the most significant challenges in animal communication research, known as the cocktail party problem: When a group of animals are talking to one another, how can you tell who’s saying what? In the open sea, schools of dolphins a thousand strong chatter all at once; scientists who record them end up with audio as dense with whistles and clicks as a stadium is with cheers. Even audio of just two or three animals is often unusable, says Laela Sayigh, an expert in bottlenose dolphin whistles, because you can’t tell where one dolphin stops talking and another starts. (Video doesn’t help, because dolphins don’t open their mouths when they speak.) Earth Species used Sayigh’s extensive database of signature whistles—the ones likened to names—to develop a neural network model that could separate overlapping animal voices. That model was useful only in lab conditions, but research is meant to be built on. A couple of months later, Google AI published a model for untangling wild birdsong.
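The cocktail party problem has a classic textbook illustration: independent component analysis, which can unmix sources when several sensors each hear a different blend of them. Earth Species' separation model is a neural network that works on harder, single-channel recordings, so the sketch below shows the shape of the problem rather than their method.

```python
# Classic cocktail-party illustration: FastICA unmixing two sources recorded
# on two channels. This is the textbook version of the problem, not the
# single-channel neural separation described in the article.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 4000)
whistle_a = np.sin(2 * np.pi * 7 * t)              # caller 1
whistle_b = np.sign(np.sin(2 * np.pi * 3 * t))     # caller 2
sources = np.c_[whistle_a, whistle_b]
mixing = np.array([[1.0, 0.6],                     # each hydrophone hears both callers,
                   [0.4, 1.0]])                    # at different relative levels
recorded = sources @ mixing.T                      # what the two hydrophones actually capture

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(recorded)            # unmixed estimates, up to scale and order
print(recovered.shape)                             # (4000, 2): one column per recovered caller
```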

Sayigh has proposed a tool that can serve as an emergency alert for dolphin mass strandings, which tend to recur in certain places around the globe. She lives in Cape Cod, Massachusetts, one such hot spot, where as often as a dozen times a year groups of dolphins get disoriented, inadvertently swim onto shore, and perish. Fortunately, there might be a way to predict this before it happens, Sayigh says. She hypothesizes that when the dolphins are stressed, they emit signature whistles more than usual, just as someone lost in a snowstorm might call out in panic. A computer trained to listen for these whistles could send an alert that prompts rescuers to reroute the dolphins before they hit the beach. In the Salish Sea—where, in 2018, a mother orca towing the body of her starved calf attracted global sympathy—there is an alert system, built by Google AI, that listens for resident killer whales and diverts ships out of their way.
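The alerting idea Sayigh describes is simple to state: if signature whistles per minute climb well above a local baseline, notify responders. A minimal sketch, with the whistle detector, baseline rate, and multiplier all standing in as hypothetical values:

```python
# Sketch of a stranding early-warning rule: alert when the rolling
# signature-whistle rate exceeds a multiple of the local baseline.
from collections import deque

class StrandingAlert:
    def __init__(self, baseline_per_min: float, factor: float = 3.0, window_min: int = 10):
        self.threshold = baseline_per_min * factor
        self.recent = deque(maxlen=window_min)     # whistle counts for the last N minutes

    def update(self, whistles_this_minute: int) -> bool:
        self.recent.append(whistles_this_minute)
        rate = sum(self.recent) / len(self.recent)
        return rate > self.threshold               # True means "notify the response network"

monitor = StrandingAlert(baseline_per_min=2.0)
for count in [1, 3, 2, 9, 14, 12]:                 # simulated per-minute detections
    if monitor.update(count):
        print("Alert: elevated signature-whistle rate near the hot spot")
```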

For researchers and conservationists alike, the potential applications of machine learning are basically limitless. And Earth Species is not the only group working on decoding animal communication. Payne spent the last months of his life advising for Project CETI, a nonprofit that built a base in Dominica this year for the study of sperm whale communication. “Just imagine what would be possible if we understood what animals are saying to each other; what occupies their thoughts; what they love, fear, desire, avoid, hate, are intrigued by, and treasure,” he wrote in Time in June.

Many of the tools that Earth Species has developed so far offer more in the way of groundwork than immediate utility. Still, there’s a lot of optimism in this nascent field. With enough resources, several biologists told me, decoding is scientifically achievable. That’s only the beginning. The real hope is to bridge the gulf in understanding between an animal’s experience and ours, however vast—or narrow—that might be.

ARI FRIEDLAENDER HAS something that Earth Species needs: lots and lots of data. Friedlaender researches whale behavior at UC Santa Cruz. He got started as a tag guy: the person who balances at the edge of a boat as it chases a whale, holds out a long pole with a suction-cupped biologging tag attached to the end, and slaps the tag on a whale’s back as it rounds the surface. This is harder than it seems. Friedlaender proved himself adept—“I played sports in college,” he explains—and was soon traveling the seas on tagging expeditions.

The tags Friedlaender uses capture a remarkable amount of data. Each records not only GPS location, temperature, pressure, and sound, but also high-definition video and three-axis accelerometer data, the same tech that a Fitbit uses to count your steps or measure how deeply you’re sleeping. Taken together, the data illustrates, in cinematic detail, a day in the life of a whale: its every breath and every dive, its traverses through fields of sea nettles and jellyfish, its encounters with twirling sea lions.
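One routine step in working with such tags is turning the raw pressure channel into a list of discrete dives. A simplified sketch, assuming a clean depth series and an arbitrary 2-meter surface threshold (actual tag formats and processing vary by vendor and study):

```python
# Sketch: convert a tag's depth (pressure) channel into discrete dives.
import numpy as np

def extract_dives(depth_m, sample_hz, surface_m=2.0):
    """Return (start_s, end_s, max_depth_m) for each excursion below surface_m."""
    dives, start = [], None
    for i, depth in enumerate(depth_m):
        if depth > surface_m and start is None:
            start = i
        elif depth <= surface_m and start is not None:
            dives.append((start / sample_hz, i / sample_hz, float(depth_m[start:i].max())))
            start = None
    if start is not None:
        dives.append((start / sample_hz, len(depth_m) / sample_hz, float(depth_m[start:].max())))
    return dives

# A shallow dive followed by a deeper one, sampled at 1 Hz.
profile = np.concatenate([np.zeros(10),
                          np.linspace(0, 30, 60), np.linspace(30, 0, 60),
                          np.zeros(20),
                          np.linspace(0, 80, 120), np.linspace(80, 0, 120)])
print(extract_dives(profile, sample_hz=1.0))
```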

Friedlaender shows me an animation he has made from one tag’s data. In it, a whale descends and loops through the water, traveling a multicolored three-dimensional course as if on an undersea Mario Kart track. Another animation depicts several whales blowing bubble nets, a feeding strategy in which they swim in circles around groups of fish, trap the fish in the center with a wall of bubbles, then lunge through, mouths gaping. Looking at the whales’ movements, I notice that while most of them have traced a neat spiral, one whale has produced a tangle of clumsy zigzags. “Probably a young animal,” Friedlaender says. “That one hasn’t figured things out yet.”

Friedlaender’s multifaceted data is especially useful for Earth Species because, as any biologist will tell you, animal communication isn’t purely verbal. It involves gestures and movement just as often as vocalizations. Diverse data sets get Earth Species closer to developing algorithms that can work across the full spectrum of the animal kingdom. The organization’s most recent work focuses on foundation models, the same kind of computation that powers generative AI like ChatGPT. Earlier this year, Earth Species published the first foundation model for animal communication. The model can already accurately sort beluga whale calls, and Earth Species plans to apply it to species as disparate as orangutans (who bellow), elephants (who send seismic rumbles through the ground), and jumping spiders (who vibrate their legs). Katie Zacarian, Earth Species’ CEO, describes the model this way: “Everything’s a nail, and it’s a hammer.”
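The pattern behind "one foundation model, many species" is a pretrained encoder that turns any call into a fixed-length vector, with small task-specific classifiers trained on top of those vectors. The sketch below fakes the encoder with a random projection just to show the workflow; Earth Species' actual model and training data are not described here in enough detail to reproduce.

```python
# Sketch of the "pretrained encoder + small classifier" workflow. The encoder
# is faked with a fixed random projection; a real system would use a
# pretrained bioacoustic foundation model instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
projection = rng.normal(size=(4000, 64))

def embed(call_audio: np.ndarray) -> np.ndarray:
    """Stand-in encoder: map a 4,000-sample call to a 64-dimensional vector."""
    return call_audio[:4000] @ projection

# Toy labeled catalog: two synthetic call types at different pitches.
t = np.arange(4000) / 8000
calls, labels = [], []
for _ in range(40):
    pitch_hz = rng.choice([300, 800])
    calls.append(np.sin(2 * np.pi * pitch_hz * t) + 0.1 * rng.normal(size=t.size))
    labels.append("call_type_A" if pitch_hz == 300 else "call_type_B")

features = np.stack([embed(c) for c in calls])
classifier = LogisticRegression(max_iter=1000).fit(features[:30], labels[:30])
print("held-out accuracy:", classifier.score(features[30:], labels[30:]))
```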

Another application of Earth Species’ AI is generating animal calls, like an audio version of GPT. Raskin has made a few-second chirp of a chiffchaff bird. If this sounds like it’s getting ahead of decoding, it is—AI, as it turns out, is better at speaking than understanding. Earth Species is finding that the tools it is developing will likely have the ability to talk to animals even before they can decode. It may soon be possible, for example, to prompt an AI with a whup and have it continue a conversation in Humpback—without human observers knowing what either the machine or the whale is saying.

No one is expecting such a scenario to actually take place; that would be scientifically irresponsible, for one thing. The biologists working with Earth Species are motivated by knowledge, not dialog for the sake of it. Felix Effenberger, a senior AI research adviser for Earth Species, told me: “I don’t believe that we will have an English-Dolphin translator, OK? Where you put English into your smartphone and then it makes dolphin sounds and the dolphin goes off and fetches you some sea urchin. The goal is to first discover basic patterns of communication.”

So what will talking to animals look—sound—like? It needn’t be a free-form conversation to be astonishing. Speaking to animals in a controlled way, as with Fournet’s playback whups, is probably essential for scientists to try to understand them. After all, you wouldn’t try to learn German by going to a party in Berlin and sitting mutely in a corner.

Bird enthusiasts already use apps to snatch melodies out of the air and identify which species is singing. With an AI as your animal interpreter, imagine what more you could learn. You prompt it to make the sound of two humpbacks meeting, and it produces a whup. You prompt it to make the sound of a calf talking to its mother, and it produces a whisper. You prompt it to make the sound of a lovelorn male, and it produces a song.

NO SPECIES OF whale has ever been driven extinct by humans. This is hardly a victory. Numbers are only one measure of biodiversity. Animal lives are rich with all that they are saying and doing—with culture. While humpback populations have rebounded since their lowest point a half-century ago, what songs, what practices, did they lose in the meantime? Blue whales, hunted down to a mere 1 percent of their population, might have lost almost everything.

Christian Rutz, a biologist at the University of St. Andrews, believes that one of the essential tasks of conservation is to preserve nonhuman ways of being. “You’re not asking, ‘Are you there or are you not there?’” he says. “You are asking, ‘Are you there and happy, or unhappy?’”

Rutz is studying how the communication of Hawaiian crows has changed since 2002, when they went extinct in the wild. About 100 of these remarkable birds—one of few species known to use tools—are alive in protective captivity, and conservationists hope to eventually reintroduce them to the wild. But these crows may not yet be prepared. There is some evidence that the captive birds have forgotten useful vocabulary, including calls to defend their territory and warn of predators. Rutz is working with Earth Species to build an algorithm to sift through historical recordings of the extinct wild crows, pull out all the crows’ calls, and label them. If they find that calls were indeed lost, conservationists might generate those calls to teach them to the captive birds.

Rutz is careful to say that generating calls will be a decision made thoughtfully, when the time requires it. In a paper published in Science in July, he praised the extraordinary usefulness of machine learning. But he cautions that humans should think hard before intervening in animal lives. Just as AI’s potential remains unknown, it may carry risks that extend beyond what we can imagine. Rutz cites as an example the new songs composed each year by humpback whales that spread across the world like hit singles. Should these whales pick up on an AI-generated phrase and incorporate that into their routine, humans would be altering a million-year-old culture. “I think that is one of the systems that should be off-limits, at least for now,” he told me. “Who has the right to have a chat with a humpback whale?”

It’s not hard to imagine how AI that speaks to animals could be misused. Twentieth-century whalers employed the new technology of their day, too, emitting sonar at a frequency that drove whales to the surface in panic. But AI tools are only as good or bad as the things humans do with them. Tom Mustill, a conservation documentarian and the author of How to Speak Whale, suggests giving animal-decoding research the same resources as the most championed of scientific endeavors, like the Large Hadron Collider, the Human Genome Project, and the James Webb Space Telescope. “With so many technologies,” he told me, “it’s just left to the people who have developed it to do what they like until the rest of the world catches up. This is too important to let that happen.”

Billions of dollars are being funneled into AI companies, much of it in service of corporate profits: writing emails more quickly, creating stock photos more efficiently, delivering ads more effectively. Meanwhile, the mysteries of the natural world remain. One of the few things scientists know with certainty is how much they don’t know. When I ask Friedlaender whether spending so much time chasing whales has taught him much about them, he tells me he sometimes gives himself a simple test: After a whale goes under the surface, he tries to predict where it will come up next. “I close my eyes and say, ‘OK, I’ve put out 1,000 tags in my life, I’ve seen all this data. The whale is going to be over here.’ And the whale’s always over there,” he says. “I have no idea what these animals are doing.”

IF YOU COULD speak to a whale, what would you say? Would you ask White Gladis, the killer whale elevated to meme status this summer for sinking yachts off the Iberian coast, what motivated her rampage—fun, delusion, revenge? Would you tell Tahlequah, the mother orca grieving the death of her calf, that you, too, lost a child? Payne once said that if given the chance to speak to a whale, he’d like to hear its normal gossip: loves, feuds, infidelities. Also: “Sorry would be a good word to say.”

Then there is that thorny old philosophical problem. The question of umwelt, and what it’s like to be a bat, or a whale, or you. Even if we could speak to a whale, would we understand what it says? Or would its perception of the world, its entire ordering of consciousness, be so alien as to be unintelligible? If machines render human languages as shapes that overlap, perhaps English is a doughnut and Whalish is the hole.

Maybe, before you can speak to a whale, you must know what it is like to have a whale’s body. It is a body 50 million years older than our body. A body shaped to the sea, to move effortlessly through crushing depths, to counter the cold with sheer mass. As a whale, you choose when to breathe, or not. Mostly you are holding your breath. Because of this, you cannot smell or taste. You do not have hands to reach out and touch things with. Your eyes are functional, but sunlight penetrates water poorly. Usually you can’t even make out your own tail through the fog.

You would live in a cloud of hopeless obscurity were it not for your ears. Sound travels farther and faster through water than through air, and your world is illuminated by it. For you, every dark corner of the ocean rings with sound. You hear the patter of rain on the surface, the swish of krill, the blasts of oil drills. If you’re a sperm whale, you spend half your life in the pitch black of the deep sea, hunting squid by ear. You use sound to speak, too, just as humans do. But your voice, rather than dissipating instantly in the thin substance of air, sustains. Some whales can shout louder than a jet engine, their calls carrying 10,000 miles across the ocean floor.

But what is it like to be you, a whale? What thoughts do you think, what feelings do you feel? These are much harder things for scientists to know. A few clues come from observing how you talk to your own kind. If you’re born into a pod of killer whales, close-knit and xenophobic, one of the first things your mother and your grandmother teach you is your clan name. To belong must feel essential. (Remember Keiko, the orca who starred in the film Free Willy: When he was released to his native waters late in life, he failed to rejoin the company of wild whales and instead returned to die among humans.) If you’re a female sperm whale, you click to your clanmates to coordinate who’s watching whose baby; meanwhile, the babies babble back. You live on the go, constantly swimming to new waters, cultivating a disposition that is nervous and watchful. If you’re a male humpback, you spend your time singing alone in icy polar waters, far from your nearest companion. To infer loneliness, though, would be a human’s mistake. For a whale whose voice reaches across oceans, perhaps distance does not mean solitude. Perhaps, as you sing, you are always in conversation.

MICHELLE FOURNET WONDERS: How do we know whales would want to talk to us anyway? What she loves most about humpbacks is their indifference. “This animal is 40 feet long and weighs 75,000 pounds, and it doesn’t give a shit about you,” she told me. “Every breath it takes is grander than my entire existence.” Roger Payne observed something similar. He considered whales the only animal capable of an otherwise impossible feat: making humans feel small.

Early one morning in Monterey, California, I boarded a whale watching boat. The water was slate gray with white peaks. Flocks of small birds skittered across the surface. Three humpbacks appeared, backs rounding neatly out of the water. They flashed some tail, which was good for the group’s photographers. The fluke’s craggy ridge-line can be used, like a fingerprint, to distinguish individual whales.

Later, I uploaded a photo of one of the whales to Happywhale. The site identifies whales using a facial recognition algorithm modified for flukes. The humpback I submitted, one with a barnacle-encrusted tail, came back as CRC-19494. Seventeen years ago, this whale had been spotted off the west coast of Mexico. Since then, it had made its way up and down the Pacific between Baja and Monterey Bay. For a moment, I was impressed that this site could so easily fish an animal out of the ocean and deliver me a name. But then again, what did I know about this whale? Was it a mother, a father? Was this whale on Happywhale actually happy? The AI had no answers. I searched the whale’s profile and found a gallery of photos, from different angles, of a barnacled fluke. For now, that was all I could know.


Read More »



Digital Reefs

Via Woods Hole Oceanographic Institution, a look at a multidisciplinary effort the institution is leading to create the first coral reef digital twin:

The National Science Foundation (NSF) has awarded Woods Hole Oceanographic Institution (WHOI) $5 million to participate in NSF’s groundbreaking Convergence Accelerator Program. The project, led by WHOI scientist Anne Cohen, will build the world’s first Coral Reef Digital Twin, a 4-dimensional virtual replica of a living coral reef powered by state-of-the-art data and models. “Digital Reefs” will be accessible and usable by coral reef stakeholders around the world who are making critical decisions every day to manage and protect these valuable ocean ecosystems.

The Phase 2 team includes: Siemens Technology, The Nature Conservancy (TNC), Scripps Institution of Oceanography at UC San Diego, and Stanford University, in addition to WHOI. Also on the team are Mote Marine Laboratory, the Marshall Islands Conservation Society, University of Guam, National Oceanic and Atmospheric Administration (NOAA), National Academy of Marine Research (NAMR) Taiwan, and Ebiil Society, Palau, whose major role will be to develop user-inspired modules, lead training workshops and advance the use of Digital Reefs in reef restoration.

“Globally, coral reefs support almost one billion people and are one of the most diverse ecosystems on the planet. But coral reefs continue to decline at an unprecedented rate, despite a surge in data, funding, and political will in the last couple of decades,” said Cohen. “We found the problem is not lack of access to data and information. It is a lack of access, by decision makers—fishermen, managers, risk assessors, government agencies—to intuitive, interactive, actionable data. Almost everyone nowadays has a cellphone, there is a lot of data out there, people just can’t use it.”

“Our goal is to facilitate universal access to data and information that right now are available to just a handful of people. Democratization of information is the only way we can truly ensure that coral reefs have a future on our planet, and Digital Reefs is how we get there,” said Cohen.

The 21st century has brought with it unprecedented challenges for coral reefs, mainly from climate change, demanding new and innovative approaches to management, conservation, and restoration. Fundamental to effective decision-making is access to science-based data, information, and decision-making tools.

“As reefs around the world suffer, so do the diverse and often vulnerable coastal ecosystems and humans that depend upon them,” said Joe Pollock, Ph.D., Senior Coral Reef Resilience Scientist at TNC. “We work to empower local communities with the tools, information, and partnerships needed to better safeguard reefs and the lives and livelihoods they sustain. Digital Reefs has the potential to revolutionize reef management and restoration by providing fine-scale, actionable information in an immersive, engaging, and highly visual format.”

Digital Twins are already widely used in industry and healthcare, where exact virtual replicas of engines, railway networks, and even human bodies are used to understand and test what-if scenarios, facilitate collaboration amongst different teams, and assist with decision making. The WHOI-led project, Digital Reefs: A Globally Coordinated, Universally Accessible Digital Twin Network for the Coral Reef Blue Economy, will develop the Digital Reefs prototype of Palmyra Atoll, and apply the prototype technology to reefs in the Marshall Islands and Taiwan as a test of their scaling model to build a global Digital Reefs network.

The Coral Reef Digital Twin is a virtual representation of a real reef, with all its features, continually updated with new data from sensors and satellites. The digital twin allows users to access the dynamic, 3-dimensional system from a laptop or cellphone anywhere in the world and get, in real time, the information needed for sustainable harvesting of reef resources. Image credit: Cohen Lab © Woods Hole Oceanographic Institution

Digital Reefs translates complex data and numerical model output into a digital replica of a coral reef. Users accessing the Digital Reef from their computer or cell phone will be immersed in a dynamic 4-D visualization of currents and corals, rather than a spreadsheet of numbers, although the supporting data will be downloadable. Users can move across the reef and scuba dive into its depths to access information about water flow, temperatures, and reef inhabitants. The Digital Reefs platform will offer users the opportunity to visualize the reef in years past, and in the future as sea level rises and the ocean warms. Decision-making tools enable users to change the reef, virtually, and examine the impact of the proposed changes, such as building a hotel or dredging a channel for boat access. Stakeholders can visualize how future climate change will affect their reef and the specific areas where they fish and farm. Restoration practitioners will be able to test which areas of the reef are optimal for restoration and to visualize larval dispersal patterns from restored areas.
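One concrete what-if a reef twin could answer is thermal stress under a warming scenario. The sketch below computes NOAA-style degree heating weeks from a temperature series and reruns the calculation with hypothetical offsets; the site climatology, temperatures, and thresholds are stand-in values, not Digital Reefs data or code.

```python
# Illustrative "what-if" query: degree heating weeks (DHW) from a temperature
# series, rerun under hypothetical warming offsets. All values are assumptions.
import numpy as np

def degree_heating_weeks(sst_c: np.ndarray, max_monthly_mean_c: float) -> float:
    """Sum daily hotspots of at least 1 C above the climatological max, over the last 12 weeks."""
    hotspots = np.clip(sst_c - max_monthly_mean_c, 0.0, None)
    hotspots[hotspots < 1.0] = 0.0
    return float(hotspots[-84:].sum() / 7.0)       # 84 days -> degree-C-weeks

site_max_monthly_mean = 28.5                       # assumed climatology for the site
days = np.arange(120)
observed_sst = 28.0 + 1.2 * np.sin(days / 40)      # synthetic observed temperatures

for offset in (0.0, 0.5, 1.0):                     # what-if warming scenarios
    dhw = degree_heating_weeks(observed_sst + offset, site_max_monthly_mean)
    risk = "severe" if dhw >= 8 else "bleaching likely" if dhw >= 4 else "low"
    print(f"+{offset:.1f} C scenario: DHW = {dhw:.1f} -> {risk}")
```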

“We’re in a crisis of survival for coral reefs,” said Stanford’s Steve Palumbi, “and we’re also in a crisis of data complexity. Everything about a reef is complex, so how do you make a decision about saving your local reef for your community and family when the answer is complicated? Digital Reefs takes data from the physics, biology, physiology, and ecology of a complex reef and shows you what its future is likely to be, depending on what you do, and decisions you make. We are all excited about contributing what we do best, to make this effort work.”

“Our work with NSF and WHOI highlights the boundless opportunities we now have available to us through digital twin technology,” said Virginie Maillard, Head of Siemens Technology US. “As we enter the next phase of this project, Siemens Technology will leverage its expertise in industrial digital twin to create a tangible digital twin of the coral reef that can be utilized by all, no matter background or expertise, for the greater purpose of collaboration to save our planet’s marine ecosystem.”

“There is a unique opportunity in offering people immersive access to underwater habitats,” said Stuart Sandin, professor of marine ecology, at UC San Diego’s Scripps Institution of Oceanography. “The growth of digital data is a first step in understanding coral reefs, but the critical next step is in providing easy access to these data; the Digital Reefs project will build the tools to provide just this access.”

In September 2021, the NSF Convergence Accelerator Program launched the 2021 cohort and awarded $21 million across 28 multidisciplinary teams, focused on Phase 1 of Track E: Networked Blue Economy and Track F: Trust & Authenticity in Communication Systems. In September 2022, NSF made an additional $30 million investment in the Networked Blue Economy to advance use-inspired solutions addressing national-scale societal challenges. Track E: The Networked Blue Economy aims to create a smart, integrated, connected, and open ecosystem for ocean innovation, exploration, and sustainable utilization. The Track E cohort was 16 teams in Phase 1 and is now six teams in Phase 2.

This work was funded by a grant from the National Science Foundation, and leveraged previous research that was supported by the Arthur Vining Davis Foundation.


Read More »



AI-Powered Marine Mammal Spotting App

An innovative new app, SeaSpotter, draws on both AI and citizen scientists to support marine research by allowing anyone to simply take a picture of any marine mammal they encounter. SeaSpotter then identifies which animal was spotted and logs the finding to an open-source database ready for scientific research.

Read More »



Five Revolutionary Technologies Helping Scientists Study Polar Bears

Via Smithsonian Magazine, a look at how researchers are using novel technologies to study polar bears, which live in the rapidly warming Arctic:

When they’re born, polar bears are toothless and blind, and they weigh roughly a pound. But over time—thanks to lots of fat-rich milk and protection from their mother—these helpless cubs grow to become large, powerful predators that are perfectly adapted for their Arctic environment. Though temperatures can dip to minus 50 degrees Fahrenheit in the winter, the massive marine mammals—which live in Canada, Norway, Russia, Greenland and Alaska—stay warm with a thick layer of body fat and two coats of fur. Their huge paws help them paddle through the icy water and gently walk across sea ice in search of their favorite meal, seals.

Their size, power, intelligence and environmental adaptations have long intrigued humans living in the north, including many Indigenous communities, such as the Inuit, the Ket and the Sámi. Biologists are curious about Ursus maritimus for many of the same reasons.

“Bears are fascinating,” says B.J. Kirschhoffer, director of conservation technology at Polar Bears International. “For me, when standing on a prominent point overlooking sea ice, I want to know how any animal can make a living in that environment. I am curious about everything that makes them able to grow to be the biggest bear by living in one of the harshest places on this planet. There is still so much to learn about the species—how they use energy, how they navigate their world and how they are responding to a rapidly changing environment.”

Today, researchers and conservationists want to know about these highly specialized marine mammals because human-caused climate change is reshaping their Arctic habitat. The bears spend much of their time on sea ice hunting for seals. But as temperatures in the Arctic rise, sea ice is getting thinner, melting earlier in the spring and forming later in the fall. Pollution and commercial activity also threaten the bears and their environment. An estimated 26,000 polar bears roam the northern reaches of the world, and conservationists worry they could disappear entirely by 2100 because of global warming.

But investigating mostly solitary creatures who spend much of their time wandering around sea ice, in some of the most remote and rugged places on the planet, is expensive, logistically challenging and dangerous to researchers. For help, scientists are turning to technology. These five innovations are changing the way they study polar bears.

Sticky tracking devices

Researchers can twist three black bottle brushes into a sedated bear’s fur to attach a triangular plate equipped with a tracking device. Image credit: 3M

Much of what scientists know about polar bears comes from tracking female members of the species. This is largely due to anatomical differences between the sexes: Males have small heads and thick necks, which means tracking collars can easily slip right off. Females, on the other hand, have larger heads and thinner necks.

Neck collars are out of the question for males, and they’re not ideal for young bears, which can quickly outgrow the devices. Other options—like implants—require the bears to undergo minor surgery, which can be potentially risky to their health. Ear tags don’t require surgery, but they are still invasive. They’re also permanent, and polar bear researchers strive to make as minimal an impact on the bears as possible. How, then, can scientists attach tracking devices to young bears and male polar bears?

This was the challenge put to innovators at 3M, the Minnesota-based company that makes everything from medical devices to cleaning supplies to building materials. 3M is particularly good at making things sticky—its flagship products include Post-it Notes and Scotch Tape.

Jon Kirschhoffer spent his nearly 40-year career at 3M as an industrial designer, developing novel solutions to complex problems just like this one. So when B.J. Kirschhoffer, his son, started chatting about the need for a new, noninvasive way of attaching trackers to polar bears, Jon’s wheels started turning. He brought the problem to his colleagues, who set to work studying polar bear fur and building prototypes.

Crimping device for polar bear fur: one of the most promising designs draws inspiration from the human process of attaching hair extensions. Image credit: 3M

In the end, they landed on two promising “burr on fur” approaches. One device uses three bottle brushes—small, tubular brushes with a long handle made of twisted metal wire that could fit inside the neck of a skinny bottle—to grab onto clumps of a sedated bear’s fur. They also have the option of applying a two-part epoxy to the bottle brushes to help hold the bear’s fur more securely. Scientists and wildlife managers can use the brushes to firmly attach a triangular plate that contains a tracking device between the animal’s shoulder blades. In tests, the researchers have sedated the animals before attaching the trackers, but some zoos are training their bears to accept the tags while fully alert.

“It’s like a burr: You twist and entangle the fur in the bottle brush, then bend over the handle so it doesn’t untwist,” Jon says. “We do that on three sides and put a little protective cap over it so it’s less likely to get snagged on willows and brush and other things that bears walk through.”

The other option draws inspiration from the process hair stylists use to attach hair extensions to their human clients’ heads. This pentagonal design involves extending a loop of a fishing leader down through five metal ferrules, or tubes; lassoing some hair on a sedated polar bear; and pulling it back through. Scientists can then use pliers to squeeze and crimp the hair in place.

Researchers are testing both devices on wild bears in Churchill, Manitoba, and on bears housed at zoos and aquariums. The verdict is still out on which option is better, and Polar Bears International expects the testing phase to last several more years. Ultimately, by making design modifications based on their experimental learnings, they hope to tweak the devices so they will stick to the bears’ fur for at least 270 days, which is the lifespan of the tracking devices themselves.

But even if they can’t get the sticky devices to stay attached to bears for the full 270 days, the gadgets will still be useful for gathering some amount of data on males and young bears, which is currently lacking. They’re also promising for short-term tracking situations, such as “when a bear has entered a community, been captured and released, and we want to monitor the animal to ensure it doesn’t re-enter the community,” says B.J.

“Bear-dar” detection systems

Scientists are testing several radar systems designed to detect approaching polar bears. Image credit: Erinn Hermsen / Polar Bears International

When humans and polar bears meet, the encounters can often end in tragedy—for either the bear, the human or both. Conflict doesn’t happen often, but global warming is complicating the issue. Because climate change is causing sea ice to form later in the fall and melt earlier in the spring, the bears are fasting longer. And, with nowhere else to go, they’re also spending more time on land in the Arctic, where an estimated four million humans live. Some are even seeking out easy calories from garbage dumps or piles of butchered whale remains.

Scientists counted 73 reports of wild polar bears attacking humans around the world between 1870 and 2014, which resulted in 20 human deaths and 63 human injuries. (They didn’t include bear outcomes in the study.) After analyzing the encounters, researchers determined that thin or skinny adult male bears in below-average body condition posed the greatest threats to humans. Female bears, meanwhile, rarely attacked and typically only did so while defending their cubs.

To prevent human-bear encounters, scientists are developing early-warning radar detection systems they’ve nicknamed “bear-dar” to help alert northern communities when a bear is getting close. A handful of promising prototypes are in the works: Some teams of researchers are building the systems from scratch, while others are riffing off technologies that are already in use by the military. They all use artificial intelligence models that may be able to discern approaching bears. Scientists have tested the systems in Churchill, Manitoba, and are now tweaking the A.I. models to be more accurate.

“We’ve already established that the radar sees everything,” B.J. Kirschhoffer says in a statement. “Being able to see is not the problem. Filtering out the noise is the problem. … Ideally, we can train them to identify polar bears with a high degree of certainty.”

As the systems are still in testing, they do not alert members of the community or professional responders. But, eventually, communities may develop custom responses depending on the alerts, says Kirschhoffer.

“For instance, if a bear-like target is identified 200 meters out, send a text message,” he says. “If a bear-like target is identified 50 meters out, blink a red light and sound a siren.”
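That tiered response is straightforward to express once the radar classifier supplies a distance and a confidence for each track. A sketch, with the confidence gate and the specific actions as assumptions rather than the deployed system's configuration:

```python
# The tiered response described above, as a function of track distance and the
# classifier's confidence that the target is a bear. Values are illustrative.
def respond_to_track(distance_m: float, bear_probability: float, min_confidence: float = 0.8) -> str:
    if bear_probability < min_confidence:
        return "ignore (likely clutter)"
    if distance_m <= 50:
        return "blink red light and sound siren"
    if distance_m <= 200:
        return "send text message to responders"
    return "keep watching"

for distance, probability in [(300, 0.9), (180, 0.95), (40, 0.9), (60, 0.4)]:
    print(f"{distance} m -> {respond_to_track(distance, probability)}")
```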

Synthetic aperture radar

Scientists are highly interested in polar bear dens—that is, the cozy nooks female bears dig under the snow to give birth to cubs—for several reasons. Denning, which occurs each year from December to early April, is the most vulnerable time in the life of youngsters and mothers. Though they’re accustomed to covering huge amounts of territory to find prey, mother bears hunker down for the entire denning period to protect their cubs from the Arctic elements and predators. Studying bears at den sites allows researchers to gather important behavioral and population insights, such as the body condition of mothers and cubs or how long they spend inside the den before emerging.

Scientists also want to know where dens are located because oil and gas companies can inadvertently disturb the dens—and, thus, potentially harm the bears—when they search for new sources of fossil fuels. If researchers and land managers know where polar bear dens are located, they can tell energy companies to steer clear.

But finding polar bear dens on the snowy, white, blustery tundra is a lot like finding a needle in a haystack. Historically, scientists have used low-tech methods to find dens, such as heading out on cross-country skis with a pair of binoculars or using dogs to sniff them out. But those options were often inefficient and ineffective, not to mention rough on the researchers. For the last few years, scientists have been using a technology known as forward-looking infrared imagery, or FLIR, which involves using heat-sensing cameras attached to an aircraft to detect the warm bodies of bears under the snow. But FLIR is finicky and only works in near-perfect weather—too much wind, sun or blowing snow basically renders it useless. What’s more, if the den roof is too thick, the technology can’t pick up the heat inside. Tom Smith, a plant and wildlife scientist at Brigham Young University, estimates that aerial FLIR surveys are 45 percent effective, which is far from ideal.

But a promising new technology is on the horizon: synthetic aperture radar (SAR). Affixed to an aircraft, SAR is a sophisticated remote-sensing technology that sends out electromagnetic waves, then records the bounce back, to produce a radar image of the landscape below. SAR is not constrained by the same weather-related issues as FLIR, and it can capture a huge swath of land, up to half a mile wide, at a time, according to Smith.

Scientists are still testing SAR, but, in theory, they hope to use it to create a baseline map of an area during the summer or early fall, then do another flyover during denning season. They can then compare the two images to see what’s changed.

“You can imagine, with massive computing power, it goes through and says, ‘These objects were not in this image before,’” says Smith.
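At its core, that comparison is change detection: difference the denning-season image against the summer baseline and keep the connected blobs that were not there before. The toy below assumes the two images are already co-registered and speckle-free, which real SAR processing has to earn first:

```python
# Toy change detection: subtract the summer baseline from the denning-season
# image and report connected regions that appear only in the revisit.
import numpy as np
from scipy import ndimage

def new_objects(baseline, revisit, threshold=0.5, min_pixels=4):
    """Return centroids of connected regions present in `revisit` but not `baseline`."""
    change = (revisit - baseline) > threshold
    labels, n_regions = ndimage.label(change)
    indices = list(range(1, n_regions + 1))
    centroids = ndimage.center_of_mass(change, labels, indices)
    sizes = ndimage.sum(change, labels, indices)
    return [c for c, size in zip(centroids, sizes) if size >= min_pixels]

rng = np.random.default_rng(5)
summer = rng.normal(0.0, 0.1, size=(64, 64))       # baseline backscatter
winter = summer + rng.normal(0.0, 0.05, size=(64, 64))
winter[30:34, 40:44] += 1.0                        # a den-like feature not in the baseline
print(new_objects(summer, winter))                 # roughly [(31.5, 41.5)]
```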

Artificial intelligence

Getting an accurate headcount of polar bears over time gives scientists valuable insights into the species’ well-being amid environmental changes spurred by climate change. But polar bears roam far and wide, traveling across huge expanses of sea ice and rugged, hard-to-reach terrain in very cold environments, which makes it challenging, as well as potentially dangerous and expensive, for scientists to try to count them in the field. As a result, researchers have taken to the skies, looking for the bears while aboard aircraft or via satellites flying over their habitat. After snapping thousands of aerial photos or satellite images taken from space, they can painstakingly pore over the pictures in search of bears.

A.I. may eventually help them count the animals. Scientists are now training A.I. models to quickly and accurately recognize polar bears, as well as other species of marine mammals, in photos captured from above. For researchers who conduct aerial surveys, which produce hundreds of thousands of photos that scientists sift through, this new technology is a game-changer.

“If you’re spending eight hours a day looking through images, the amount of attention that a human brain is going to pay to those images is going to fluctuate, whereas when you have a computer do something … it’s going to do that consistently,” Erin Moreland, a research zoologist with the National Oceanic and Atmospheric Administration, told Alaska Public Media’s Casey Grove in 2020. “People are good at this, but they’re not as good at it as a machine, and it’s not necessarily the best use of a human mind.”

To that same end, researchers are also now testing whether drones work to capture high-resolution images and gather other relevant data. Since they don’t require onboard human pilots, drones are a safer, more affordable alternative to helicopters; they’re also smaller and nimbler, and tend to be less disruptive to wildlife.

Treadmill and swim chamber

Researchers want to understand how much polar bears exert themselves while walking across the tundra or swimming through the Arctic Ocean. To get a handle on the marine mammals’ energy output on land, Anthony Pagano, a biologist with the United States Geological Survey, built a special heavy-duty polar bear treadmill. Study collaborators at the San Diego Zoo and the Oregon Zoo then trained captive polar bears to walk on it. Using shatterproof plastic and reinforced steel, the team constructed a 10-foot-long chamber that encased a treadmill typically used by horses. The 4,400-pound contraption also included a circular opening where researchers could tempt the bears into walking with fish and other tasty treats.

As a follow-up to the walking study, Pagano and biologists at the Oregon Zoo also measured the energy output of the bears while swimming. To do so, they developed a polar bear-sized swim chamber, complete with a small motor that generated waves to simulate the conditions the bears might encounter in the ocean.

Together, the two technologies helped scientists learn that bears expend more energy swimming than walking. Polar bears are good swimmers, but they’re not very efficient ones, thanks to their relatively short arms, their non-aerodynamic body shape and their propensity for swimming at the water’s surface, where drag is greatest. In a world with shrinking sea ice, polar bears likely need to swim more to find food and, thus, will burn precious calories, which could cause them to lose weight and lower their chances of reproducing—decreasing the species’ chances of survival.

Together, these and other technologies are helping researchers learn how polar bears are faring as the climate evolves. This knowledge, in turn, informs conservation decisions to help protect the bears and their environment—and the health of the planet more broadly.

“We need to understand more about how the Arctic ecosystem is changing and how polar bears are responding to loss of habitat if we are going to keep them in the wild,” says B.J. Kirschhoffer. “Ultimately, our fate is tied to the polar bear’s. Whatever actions we take to help polar bears keep their sea ice habitat intact are actions that will help humans protect our own future.”


Read More »



How A.I. Helps Protect Ecosystems In The Galápagos Islands

Via Fortune, a look at how A.I. is helping protect ecosystems in the Galápagos Islands:

Within the Galápagos Marine Reserve, one of the world’s largest and most biologically diverse marine protected areas, more than 2,900 species depend on the interconnectedness of healthy ecosystems. Globally, humans also live off the many benefits of a thriving ocean.

Underwater technology is expanding to safeguard the environments needed for the ocean to release the earth’s oxygen, sequester carbon, provide sustenance, and so much more. And data is essential to track the development of these ecosystems as threats, such as illegal fishing and climate change, fluctuate. Collecting and analyzing ocean data at a large scale—including that of rare and endangered species—gives researchers useful insight to help conserve the vital diversity that makes up such a valued part of the earth.

“Wildlife research has, until the past few years, been a world of ‘small data,’” says Jason Holmberg, executive director at Wildbook Image Analysis (WBIA), a tech company that builds open software and A.I. solutions for conservationists. “Observations of rare or endangered species, whether via collars, camera traps, DNA samples, or photos, have been expensive to obtain and often required research teams to go directly into the field for short durations to obtain even small volumes of data.”

But with the use of A.I., the data and capabilities are expanding.

Observing whale sharks through passive tracking
In 2016, the International Union for Conservation of Nature (IUCN) reclassified whale sharks from vulnerable to endangered because of the many anthropogenic impacts they face, including industrial fisheries (both targeted and bycatch), vessel strikes, marine pollution, and climate change.

Whale sharks, which can grow up to 60 feet long, provide significant ecological importance, according to Sofia Green, marine biologist and research scientist at the Galápagos Whale Shark Project, who studies the behavior, ecology, and migration patterns of the species. As predators on the top of the food chain, whale sharks feed off the weak and sick. By doing so, they keep the food chain in order. They also serve as carbon sequesters and nutrient-dense fertilizers by bringing nutrients (via defecation) to unhealthy ecosystems.

“If you protect sharks, you protect the ocean. And if you protect the ocean, you have a healthy planet,” she explains, while emphasizing the incredible importance of preserving the existence of all sharks. “And you cannot save a species if you don’t understand how they behave.”

In the northernmost point of the Galápagos, near the island of Darwin, Green and her team tag, video, photograph, do ultrasounds and blood draws, and take tissue samples to track the health and whereabouts of whale sharks that migrate through the Galápagos reserve. Observations like this are extremely valuable. But when made from limited studies, such as these, they may provide information at a small scale.

This is where technology like “passive tracking”—via photo identification and A.I.—broadens the data set. Whale sharks have unique spots, like human thumbprints, that can be referenced as identification. The Galápagos Whale Shark Project uses sharkbook.ai, a system run by WBIA that taps into A.I. to aggregate images of individual whale sharks uploaded by the likes of Green as well as images and videos posted on social media platforms around the world.

“Consider how whale sharks, the world’s biggest fish, were once rarely observed and even less frequently photographed,” Holmberg says. With the advent of cell phones, numerous sightings show up on YouTube and Instagram. “With this new wealth of data emerging from more cost-effective technology and public observations of wildlife, there is now an overwhelming amount of wildlife data available.”

A.I. helps sort and curate images and videos. Then sharkbook.ai classifies each whale shark based on the spot patterns and uses the photos for “capture/recapture.” For Green, this information shows where sharks appear elsewhere on the planet, when the individual returns to Galápagos, or if one has ended up on land or in an area it typically wouldn’t go (usually a result of being illegally fished).

“Modified Groth algorithms like this are used by NASA to track stars,” Green explains. “It’s now being used to track the underwater constellations found on the bodies of these whale sharks.” Before this technology, researchers would obtain identification only from sharks seen on their expeditions. Through new, more expansive data collection, they now track almost 700 identified whale sharks that have migrated through the Galápagos.
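The underlying idea of spot-pattern matching can be sketched without the astronomy machinery: describe each shark's constellation of spots in a way that ignores position, scale, and spot order, then match a new sighting to the nearest catalog entry. The fingerprint below, built from sorted pairwise spot distances, is a simplification for illustration and not the algorithm sharkbook.ai actually runs.

```python
# Simplified spot-pattern matching: fingerprint each shark by its sorted
# pairwise spot distances and match a new sighting to the nearest catalog entry.
import numpy as np

def fingerprint(spots: np.ndarray, length: int = 20) -> np.ndarray:
    """spots: (n, 2) spot coordinates -> fixed-length, order-invariant descriptor."""
    dists = np.linalg.norm(spots[:, None, :] - spots[None, :, :], axis=-1)
    pairwise = np.sort(dists[np.triu_indices(len(spots), 1)])
    pairwise = pairwise / pairwise.max()                       # scale invariance
    return np.interp(np.linspace(0, 1, length), np.linspace(0, 1, pairwise.size), pairwise)

rng = np.random.default_rng(4)
catalog = {f"shark_{i}": rng.uniform(size=(30, 2)) for i in range(5)}
catalog_prints = {name: fingerprint(spots) for name, spots in catalog.items()}

# A new sighting of shark_2: same spots, photographed at a different scale with slight jitter.
sighting = catalog["shark_2"] * 1.7 + rng.normal(scale=0.005, size=(30, 2))
query = fingerprint(sighting)
best_match = min(catalog_prints, key=lambda name: np.linalg.norm(catalog_prints[name] - query))
print("best match:", best_match)                               # expected: shark_2
```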

When a photo is inputted for the first time, it goes into a newly created profile page. There, a summary of its sightings will build over time based on contributions. “All of this is aimed at a single goal,” Holmberg says: “Identifying the best ways to prevent extinction.”

A.I. to resolve the Galápagos “longline dilemma”
A threat to whale sharks and other species within the Galápagos Marine Reserve is the illegal longline fishing of tuna, which was banned in 2000 as a precautionary measure to prevent unintentional bycatch of endangered, threatened, and protected (ETP) species. The practice remains common, however, and for the past two decades people in the Galápagos have debated whether longlining is the biggest threat to the reserve’s biodiversity. Management authorities and environmental organizations want to maintain the ban; local fishers, on the other hand, argue that longline fishing should be authorized because it is the most cost-efficient way to catch tuna.

According to a study done by marine biologist Mauricio Castrejón, Ph.D., as the local population (approximately 30,000 people spread across three habitable islands) grows, so does tuna consumption. Between 1997 and 2017, yellowfin tuna landings increased from 41.1 tons to 196.8 tons per year. Castrejón, who has led small-scale fisheries research and development projects in the Galápagos and the Eastern Tropical Pacific region (Costa Rica, Panama, Colombia, and Ecuador) for more than 18 years, dubs this fishing debate the “longline dilemma.”

The technique trails a long line studded with hooks behind a boat; if not properly monitored, it can result in the unintended bycatch of ETP species. Many contend the bycatch rate isn’t as high as reported, and as of now the Galápagos doesn’t have the data collection needed to track the true rate of bycatch. Through science, technology, and innovation, Castrejón hopes to build better monitoring and practices, and to end the debate.

One such pathway is installing video cameras made by Shellcatch on fishing vessels, an initiative that began in 2021 after being selected by the Charles Darwin Foundation and WildAid. The cameras capture high-resolution footage of fishing activity (technique, bycatch rate, and more) to hold fishers accountable for how and what they catch, which in turn creates market incentives, such as selling their catch at a premium.

Shellcatch uses an A.I. algorithm in its electronic monitoring sensor to quickly detect bycatch of nontarget species. “A.I. is critical in validating sustainable fishing practices and connecting supply and demand in a cost-effective way,” says Alfredo Sfeir, founder and president of Shellcatch, who also cofounded Frescapesca, a seafood marketplace that likewise uses A.I. With Shellcatch, fishers record data in their logbooks (per usual practice), and scientists and A.I. use the video and accumulated data to validate that information. This shows consumers that the product is fresh, legal, and sustainably caught, with a low incidental catch of protected species, and therefore earns a premium market price.
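As a rough illustration of that logbook-versus-video cross-check (not Shellcatch’s actual pipeline, which is proprietary), the validation step can be pictured as comparing species counts detected on camera against what the logbook records. The species list, data shapes, and function names below are hypothetical.

```python
# A rough, hypothetical sketch of a logbook-versus-video cross-check; species
# names and data structures are invented and do not reflect Shellcatch's system.
from collections import Counter

ETP_SPECIES = {"whale shark", "sea turtle", "hammerhead shark"}  # illustrative list

def validate_trip(logbook_counts, video_detections):
    """Flag ETP bycatch seen on video and any counts the logbook understates."""
    detected = Counter(video_detections)
    report = {"etp_bycatch_on_video": sorted(set(detected) & ETP_SPECIES),
              "underreported": {}}
    for species, seen in detected.items():
        logged = logbook_counts.get(species, 0)
        if seen > logged:
            report["underreported"][species] = {"logged": logged, "on_video": seen}
    return report

logbook = {"yellowfin tuna": 42}                   # what the fisher recorded
video = ["yellowfin tuna"] * 42 + ["sea turtle"]   # what the camera/A.I. saw
print(validate_trip(logbook, video))
```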

Sfeir believes these technologies, including Frescapesca, will be a game changer. “With A.I., the online seafood market platform can convert a growing number of fishing events into data that optimizes logistics and creates higher margin transactions,” he says. “The end result is a growing number of fishermen and women supplying seafood and buyers ready to purchase when vessels arrive with product anywhere along the shoreline.”

Furthermore, this new data can feed into science-based evidence and advice on the best techniques, lower-risk gear, and other solutions to protect non-target species while keeping tuna fishing sustainable for the people of the Galápagos and beyond.





Earning Its Stripes: Tech Used To Crack Tiger Trade

Via Terra Daily, an article on the use of artificial intelligence to help stop poaching:

In a town in northeastern Scotland, Debbie Banks looks for clues to track down criminals as she clicks through a database of tiger skins.

There are thousands of photographs, including images of rugs, carcasses, and taxidermy specimens.

Banks, the crime campaign leader for the Environmental Investigation Agency (EIA), a London-based charity, tries to identify individual big cats from their stripes.

Once a tiger is identified, an investigator can pinpoint where it comes from.

“A tiger’s stripes are as unique as human fingerprints,” Banks told AFP.

“We can use the images to cross-reference against images of captive tigers that might have been farmed.”

Currently, this is slow, painstaking work.

But a new artificial intelligence tool, being developed by The Alan Turing Institute, a centre in the UK for data science and artificial intelligence, should make life much easier for Banks and law enforcement officials.

The project aims to develop and test AI technology that can analyse the tigers’ stripes in order to identify them.

“We have a database of images of tigers that have been offered for sale or have been seized,” Banks said.

“When our investigators get new images, we need to scan those against the database.

“At the moment we are doing that manually, looking at the individual stripe patterns of each new image that we get and cross-referencing it against the ones we have in our database.”

It is hoped that the new technology will help law enforcement agencies determine where tiger skins come from and allow them to investigate the transnational networks involved in trafficking tigers.
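For a sense of what automating that cross-referencing might look like, here is a small, hypothetical sketch: each stripe image is reduced to a feature vector, and the catalog is ranked by similarity to a new image. The embed() stub stands in for a trained re-identification model of the kind the Turing Institute project is developing; nothing here reflects their actual code.

```python
# A hypothetical sketch of automated stripe cross-referencing: embed each image
# as a feature vector and rank catalog tigers by cosine similarity to the query.
# The embed() stub stands in for a trained re-identification model.
import numpy as np

def embed(image):
    """Placeholder feature extractor; a trained CNN would replace this."""
    vec = image.astype(float).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)

def rank_matches(query_image, catalog):
    """Return catalog tiger IDs sorted by similarity to the query image."""
    q = embed(query_image)
    scores = {tiger_id: float(embed(img) @ q) for tiger_id, img in catalog.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy 8x8 "stripe crops" standing in for real photographs.
rng = np.random.default_rng(0)
catalog = {f"tiger_{i:03d}": rng.random((8, 8)) for i in range(5)}
query = catalog["tiger_003"] + rng.normal(0, 0.05, (8, 8))  # a noisier re-sighting
print(rank_matches(query, catalog)[0])  # expected top match: tiger_003
```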

Once the officials know the origins of confiscated tiger skins and products, they will be able to tell whether the animal was farmed or poached from a protected area.

Poaching, fuelled by consumer demand, remains a major threat to the survival of the species, according to the EIA.

Tiger skins and body parts are sought after, partly due to their use in traditional Chinese medicine.

An estimated 4,500 tigers remain in the wild across Asia.

“Tigers faced a massive population decline in the last 120 years, so we want to do everything we can to help end the trade in their parts and products, including tiger skins,” Banks said.

Anyone with photographs of tigers is invited to submit them to the EIA to help bolster the AI database.

“We are inviting individuals — whether they are photographers or researchers and academics — who may have images of tigers where their stripe patterns are clear,” Banks said.

“They could be live tigers, dead tigers or tiger parts.

“If they can share those with us, the data scientists can then develop, train and test the algorithm,” she said.

“We need thousands of images just to do that phase of the project.”




ABOUT
Networked Nature
New technical innovations such as location-tracking devices, GPS and satellite communications, remote sensors, laser-imaging technologies, light detection and ranging (LIDAR) sensing, high-resolution satellite imagery, digital mapping, advanced statistical analytical software, and even biotechnology and synthetic biology are revolutionizing conservation in two key ways: first, by revealing the state of our world in unprecedented detail; and, second, by making available more data to more people in more places. The mission of this blog is to track these technical innovations that may give conservation the chance – for the first time – to keep up with, and even get ahead of, the planet’s most intractable environmental challenges. It will also examine the unintended consequences and moral hazards that the use of these new tools may cause.