Via Woods Hole Oceanographic Institution, a look at a multidisciplinary effort, led by WHOI, to create the first coral reef digital twin:
The National Science Foundation (NSF) has awarded Woods Hole Oceanographic Institution (WHOI) $5 million to participate in NSF’s groundbreaking Convergence Accelerator Program. The project, led by WHOI scientist Anne Cohen, will build the world’s first Coral Reef Digital Twin, a 4-dimensional virtual replica of a living coral reef powered by state-of-the-art data and models. “Digital Reefs” will be accessible and usable by coral reef stakeholders around the world who are making critical decisions every day to manage and protect these valuable ocean ecosystems.
The Phase 2 team includes: Siemens Technology, The Nature Conservancy (TNC), Scripps Institution of Oceanography at UC San Diego, and Stanford University, in addition to WHOI. Also on the team are Mote Marine Laboratory, the Marshall Islands Conservation Society, University of Guam, National Oceanic and Atmospheric Administration (NOAA), National Academy of Marine Research (NAMR) Taiwan, and Ebiil Society, Palau, whose major role will be to develop user-inspired modules, lead training workshops and advance the use of Digital Reefs in reef restoration.
“Globally, coral reefs support almost one billion people and are one of the most diverse ecosystems on the planet. But coral reefs continue to decline at an unprecedented rate, despite a surge in data, funding, and political will in the last couple of decades,” said Cohen. “We found the problem is not lack of access to data and information. It is a lack of access, by decision makers—fishermen, managers, risk assessors, government agencies—to intuitive, interactive, actionable data. Almost everyone nowadays has a cellphone, there is a lot of data out there, people just can’t use it.”
“Our goal is to facilitate universal access to data and information that right now are available to just a handful of people. Democratization of information is the only way we can truly ensure that coral reefs have a future on our planet, and Digital Reefs is how we get there,” said Cohen.
The 21st century has brought with it unprecedented challenges for coral reefs, mainly from climate change, demanding new and innovative approaches to management, conservation, and restoration. Fundamental to effective decision-making is access to science-based data, information, and decision-making tools.
“As reefs around the world suffer, so do the diverse and often vulnerable coastal ecosystems and humans that depend upon them,” said Joe Pollock, Ph.D., Senior Coral Reef Resilience Scientist at TNC. “We work to empower local communities with the tools, information, and partnerships needed to better safeguard reefs and the lives and livelihoods they sustain. Digital Reefs has the potential to revolutionize reef management and restoration by providing fine-scale, actionable information in an immersive, engaging, and highly visual format.”
Digital Twins are already widely used in industry and healthcare, where exact virtual replicas of engines, railway networks, and even human bodies are used to understand and test what-if scenarios, facilitate collaboration amongst different teams, and assist with decision making. The WHOI-led project, Digital Reefs: A Globally Coordinated, Universally Accessible Digital Twin Network for the Coral Reef Blue Economy, will develop the Digital Reefs prototype of Palmyra Atoll, and apply the prototype technology to reefs in the Marshall Islands and Taiwan as a test of their scaling model to build a global Digital Reefs network.
The Coral Reef Digital Twin is a virtual representation of a real reef, with all its features, consistently updated with new data from sensors and satellites. The digital twin allows users anywhere in the world to access the dynamic, 3-dimensional system from a laptop or cellphone and get the real-time information needed for sustainable harvesting of reef resources. Image credit: Cohen Lab © Woods Hole Oceanographic Institution
Digital Reefs translates complex data and numerical model output into a digital replica of a coral reef. Users accessing the Digital Reef from their computer or cell phone will be immersed in a dynamic 4-D visualization of currents and corals, rather than a spreadsheet of numbers, although the supporting data will be downloadable. Users can move across the reef, and scuba dive into its depths to access information about water flow, temperatures, and reef inhabitants. The Digital Reefs platform will offer users the opportunity to visualize the reef in years past, and in the future as sea level rises and the ocean warms. Decision-making tools enable users to change the reef virtually and examine the impact of the proposed changes, such as building a hotel or dredging a channel for boat access. Stakeholders can visualize how future climate change will affect their reef and the specific areas where they fish and farm. Restoration practitioners will be able to test which areas of the reef are optimal for restoration and to visualize larval dispersal patterns from restored areas.
“We’re in a crisis of survival for coral reefs,” said Stanford’s Steve Palumbi, “and we’re also in a crisis of data complexity. Everything about a reef is complex, so how do you make a decision about saving your local reef for your community and family when the answer is complicated? Digital Reefs takes data from the physics, biology, physiology, and ecology of a complex reef and shows you what its future is likely to be, depending on what you do, and decisions you make. We are all excited about contributing what we do best, to make this effort work.”
“Our work with NSF and WHOI highlights the boundless opportunities we now have available to us through digital twin technology,” said Virginie Maillard, Head of Siemens Technology US. “As we enter the next phase of this project, Siemens Technology will leverage its expertise in industrial digital twin to create a tangible digital twin of the coral reef that can be utilized by all, no matter background or expertise, for the greater purpose of collaboration to save our planet’s marine ecosystem.”
“There is a unique opportunity in offering people immersive access to underwater habitats,” said Stuart Sandin, professor of marine ecology, at UC San Diego’s Scripps Institution of Oceanography. “The growth of digital data is a first step in understanding coral reefs, but the critical next step is in providing easy access to these data; the Digital Reefs project will build the tools to provide just this access.”
In September 2021, the NSF Convergence Accelerator Program launched its 2021 cohort and awarded $21 million across 28 multidisciplinary teams for Phase 1 of Track E: Networked Blue Economy and Track F: Trust & Authenticity in Communication Systems. In September 2022, NSF made an additional $30 million investment in the Networked Blue Economy to advance use-inspired solutions addressing national-scale societal challenges. Track E, the Networked Blue Economy, aims to create a smart, integrated, connected, and open ecosystem for ocean innovation, exploration, and sustainable utilization. The Track E cohort comprised 16 teams in Phase 1 and is now six teams in Phase 2.
This work was funded by a grant from the National Science Foundation, and leveraged previous research that was supported by the Arthur Vining Davis Foundation.
An innovative new app, SeaSpotter, capitalizes on both AI and citizen scientists to support marine research by allowing anyone to simply take a picture of any marine mammal they encounter. SeaSpotter then identifies which animal was spotted and logs the finding to an open-source database, ready for scientific research.
Via Smithsonian Magazine, a look at how researchers are using novel technologies to study polar bears, which live in the rapidly warming Arctic:
When they’re born, polar bears are toothless and blind, and they weigh roughly a pound. But over time—thanks to lots of fat-rich milk and protection from their mother—these helpless cubs grow to become large, powerful predators that are perfectly adapted for their Arctic environment. Though temperatures can dip to minus 50 degrees Fahrenheit in the winter, the massive marine mammals—which live in Canada, Norway, Russia, Greenland and Alaska—stay warm with a thick layer of body fat and two coats of fur. Their huge paws help them paddle through the icy water and gently walk across sea ice in search of their favorite meal, seals.
Their size, power, intelligence and environmental adaptions have long intrigued humans living in the north, including many Indigenous communities, such as the Inuit, the Ket and the Sámi. Biologists are curious about Ursus maritimus for many of the same reasons.
“Bears are fascinating,” says B.J. Kirschhoffer, director of conservation technology at Polar Bears International. “For me, when standing on a prominent point overlooking sea ice, I want to know how any animal can make a living in that environment. I am curious about everything that makes them able to grow to be the biggest bear by living in one of the harshest places on this planet. There is still so much to learn about the species—how they use energy, how they navigate their world and how they are responding to a rapidly changing environment.”
Today, researchers and conservationists want to know about these highly specialized marine mammals because human-caused climate change is reshaping their Arctic habitat. The bears spend much of their time on sea ice hunting for seals. But as temperatures in the Arctic rise, sea ice is getting thinner, melting earlier in the spring and forming later in the fall. Pollution and commercial activity also threaten the bears and their environment. An estimated 26,000 polar bears roam the northern reaches of the world, and conservationists worry they could disappear entirely by 2100 because of global warming.
But investigating mostly solitary creatures who spend much of their time wandering around sea ice, in some of the most remote and rugged places on the planet, is expensive, logistically challenging and dangerous to researchers. For help, scientists are turning to technology. These five innovations are changing the way they study polar bears.
Sticky tracking devices
Researchers can twist three black bottle brushes into a sedated bear’s fur to attach a triangular plate equipped with a tracking device. 3M
Much of what scientists know about polar bears comes from tracking female members of the species. This is largely due to anatomical differences between the sexes: Males have small heads and thick necks, which means tracking collars can easily slip right off. Females, on the other hand, have larger heads and thinner necks. Neck collars are out of the question for males, and they’re not ideal for young bears, which can quickly outgrow the devices. Other options—like implants—require the bears to undergo minor surgery, which can be potentially risky to their health. Ear tags don’t require surgery, but they are still invasive. They’re also permanent, and polar bear researchers strive to make as minimal an impact on the bears as possible. How, then, can scientists attach tracking devices to young bears and male polar bears?
This was the challenge put to innovators at 3M, the Minnesota-based company that makes everything from medical devices to cleaning supplies to building materials. 3M is particularly good at making things sticky—its flagship products include Post-it Notes and Scotch Tape.
Jon Kirschhoffer spent his nearly 40-year career at 3M as an industrial designer, developing novel solutions to complex problems just like this one. So when B.J. Kirschhoffer, his son, started chatting about the need for a new, noninvasive way of attaching trackers to polar bears, Jon’s wheels started turning. He brought the problem to his colleagues, who set to work studying polar bear fur and building prototypes.
One of the most promising designs draws inspiration from the human process of attaching hair extensions. 3M
In the end, they landed on two promising “burr on fur” approaches. One device uses three bottle brushes—small, tubular brushes with a long handle made of twisted metal wire that could fit inside the neck of a skinny bottle—to grab onto clumps of a sedated bear’s fur. They also have the option of applying a two-part epoxy to the bottle brushes to help hold the bear’s fur more securely. Scientists and wildlife managers can use the brushes to firmly attach a triangular plate that contains a tracking device between the animal’s shoulder blades. In tests, the researchers have sedated the animals before attaching the trackers, but some zoos are training their bears to accept the tags while fully alert. “It’s like a burr: You twist and entangle the fur in the bottle brush, then bend over the handle so it doesn’t untwist,” Jon says. “We do that on three sides and put a little protective cap over it so it’s less likely to get snagged on willows and brush and other things that bears walk through.”
The other option draws inspiration from the process hair stylists use to attach hair extensions to their human clients’ heads. This pentagonal design involves extending a loop of a fishing leader down through five metal ferrules, or tubes; lassoing some hair on a sedated polar bear; and pulling it back through. Scientists can then use pliers to squeeze and crimp the hair in place.
Researchers are testing both devices on wild bears in Churchill, Manitoba, and on bears housed at zoos and aquariums. The jury is still out on which option is better, and Polar Bears International expects the testing phase to last several more years. Ultimately, by making design modifications based on their experimental findings, they hope to tweak the devices so they will stick to the bears’ fur for at least 270 days, which is the lifespan of the tracking devices themselves.
But even if they can’t get the sticky devices to stay attached to bears for the full 270 days, the gadgets will still be useful for gathering some amount of data on males and young bears, which is currently lacking. They’re also promising for short-term tracking situations, such as “when a bear has entered a community, been captured and released, and we want to monitor the animal to ensure it doesn’t re-enter the community,” says B.J.
“Bear-dar” detection systems
Scientists are testing several radar systems designed to detect approaching polar bears. Erinn Hermsen / Polar Bears International
When humans and polar bears meet, the encounters can often end in tragedy—for the bear, the human or both. Conflict doesn’t happen often, but global warming is complicating the issue. Because climate change is causing sea ice to form later in the fall and melt earlier in the spring, the bears are fasting longer. And, with nowhere else to go, they’re also spending more time on land in the Arctic, where an estimated four million humans live. Some are even seeking out easy calories from garbage dumps or piles of butchered whale remains. Scientists counted 73 reports of wild polar bears attacking humans around the world between 1870 and 2014, which resulted in 20 human deaths and 63 human injuries. (They didn’t include bear outcomes in the study.) After analyzing the encounters, researchers determined that thin or skinny adult male bears in below-average body condition posed the greatest threats to humans. Female bears, meanwhile, rarely attacked and typically only did so while defending their cubs.
To prevent human-bear encounters, scientists are developing early-warning radar detection systems they’ve nicknamed “bear-dar” to help alert northern communities when a bear is getting close. A handful of promising prototypes are in the works: Some teams of researchers are building the systems from scratch, while others are riffing off technologies that are already in use by the military. They all use artificial intelligence models that may be able to discern approaching bears. Scientists have tested the systems in Churchill, Manitoba, and are now tweaking the A.I. models to be more accurate.
“We’ve already established that the radar sees everything,” B.J. Kirschhoffer says in a statement. “Being able to see is not the problem. Filtering out the noise is the problem. … Ideally, we can train them to identify polar bears with a high degree of certainty.”
As the systems are still in testing, they do not alert members of the community or professional responders. But, eventually, communities may develop custom responses depending on the alerts, says Kirschhoffer.
“For instance, if a bear-like target is identified 200 meters out, send a text message,” he says. “If a bear-like target is identified 50 meters out, blink a red light and sound a siren.”
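The tiered-response idea Kirschhoffer describes can be sketched as a simple rule table. The distances, action names, and confidence threshold below are illustrative assumptions, not details of any deployed bear-dar system:

```python
# Hypothetical sketch of a tiered "bear-dar" alert policy.
# Thresholds, actions, and the confidence cutoff are invented for illustration.

ALERT_TIERS = [
    (50, "siren"),          # bear-like target within 50 m: blink light, sound siren
    (200, "text_message"),  # within 200 m: notify responders by text
]

def choose_response(distance_m, confidence, min_confidence=0.9):
    """Return the action for a radar detection, or None if no alert is warranted."""
    if confidence < min_confidence:
        return None  # filter out low-certainty returns (the "noise" problem)
    for max_range, action in ALERT_TIERS:
        if distance_m <= max_range:
            return action
    return None  # target too far away to act on

print(choose_response(40, 0.95))   # -> siren
print(choose_response(150, 0.95))  # -> text_message
print(choose_response(150, 0.50))  # -> None (below confidence threshold)
```

The ordering of the tiers matters: the closest (most urgent) range is checked first, so a near target never falls through to the milder response.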
Synthetic aperture radar
Scientists are highly interested in polar bear dens—that is, the cozy nooks female bears dig under the snow to give birth to cubs—for several reasons. Denning, which occurs each year from December to early April, is the most vulnerable time in the life of youngsters and mothers. Though they’re accustomed to covering huge amounts of territory to find prey, mother bears hunker down for the entire denning period to protect their cubs from the Arctic elements and predators. Studying bears at den sites allows researchers to gather important behavioral and population insights, such as the body condition of mothers and cubs or how long they spend inside the den before emerging.
Scientists also want to know where dens are located because oil and gas companies can inadvertently disturb the dens—and, thus, potentially harm the bears—when they search for new sources of fossil fuels. If researchers and land managers know where polar bear dens are located, they can tell energy companies to steer clear.
But finding polar bear dens on the snowy, white, blustery tundra is a lot like finding a needle in a haystack. Historically, scientists have used low-tech methods to find dens, such as heading out on cross-country skis with a pair of binoculars or using dogs to sniff them out. But those options were often inefficient and ineffective, not to mention rough on the researchers. For the last few years, scientists have been using a technology known as forward-looking infrared imagery, or FLIR, which involves using heat-sensing cameras attached to an aircraft to detect the warm bodies of bears under the snow. But FLIR is finicky and only works in near-perfect weather—too much wind, sun or blowing snow basically renders it useless. What’s more, if the den roof is too thick, the technology can’t pick up the heat inside. Tom Smith, a plant and wildlife scientist at Brigham Young University, estimates that aerial FLIR surveys are 45 percent effective, which is far from ideal.
But a promising new technology is on the horizon: synthetic aperture radar (SAR). Affixed to an aircraft, SAR is a sophisticated remote-sensing technology that sends out electromagnetic waves, then records the bounce back, to produce a radar image of the landscape below. SAR is not constrained by the same weather-related issues as FLIR, and it can capture a huge swath of land, up to half a mile wide, at a time, according to Smith.
Scientists are still testing SAR, but, in theory, they hope to use it to create a baseline map of an area during the summer or early fall, then do another flyover during denning season. They can then compare the two images to see what’s changed.
“You can imagine, with massive computing power, it goes through and says, ‘These objects were not in this image before,’” says Smith.
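The comparison Smith describes amounts to change detection between two co-registered radar images. The NumPy sketch below is a toy stand-in with made-up arrays and an arbitrary threshold, not actual SAR processing:

```python
import numpy as np

# Toy change-detection sketch: compare a summer baseline "radar image"
# with a denning-season image and flag pixels that newly stand out.
# The arrays and threshold are invented for illustration.

baseline = np.zeros((6, 6))      # bare tundra, imaged in early fall
winter = baseline.copy()
winter[2:4, 2:4] = 5.0           # a new mound-like return under the snow

difference = winter - baseline
candidates = difference > 3.0    # "objects not in this image before"

print(int(candidates.sum()), "changed pixels")
print(np.argwhere(candidates).tolist())  # row/col coordinates of the candidate den
```

Real SAR workflows would add image co-registration, speckle filtering, and far more conservative screening, but the core idea is the same per-pixel comparison.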
Artificial intelligence
Getting an accurate headcount of polar bears over time gives scientists valuable insights into the species’ well-being amid environmental changes spurred by climate change. But polar bears roam far and wide, traveling across huge expanses of sea ice and rugged, hard-to-reach terrain in very cold environments, which makes it challenging, as well as potentially dangerous and expensive, for scientists to try to count them in the field. As a result, researchers have taken to the skies, looking for the bears while aboard aircraft or via satellites flying over their habitat. After snapping thousands of aerial photos or satellite images taken from space, they can painstakingly pore over the pictures in search of bears.
A.I. may eventually help them count the animals. Scientists are now training A.I. models to quickly and accurately recognize polar bears, as well as other species of marine mammals, in photos captured from above. For researchers who conduct aerial surveys, which produce hundreds of thousands of photos that scientists sift through, this new technology is a game-changer.
“If you’re spending eight hours a day looking through images, the amount of attention that a human brain is going to pay to those images is going to fluctuate, whereas when you have a computer do something … it’s going to do that consistently,” Erin Moreland, a research zoologist with the National Oceanic and Atmospheric Administration, told Alaska Public Media’s Casey Grove in 2020. “People are good at this, but they’re not as good at it as a machine, and it’s not necessarily the best use of a human mind.”
To that same end, researchers are also now testing whether drones work to capture high-resolution images and gather other relevant data. Since they don’t require onboard human pilots, drones are a safer, more affordable alternative to helicopters; they’re also smaller and nimbler, and tend to be less disruptive to wildlife.
Treadmill and swim chamber
Researchers want to understand how much polar bears exert themselves while walking across the tundra or swimming through the Arctic Ocean. To get a handle on the marine mammals’ energy output on land, Anthony Pagano, a biologist with the United States Geological Survey, built a special heavy-duty polar bear treadmill. Study collaborators at the San Diego Zoo and the Oregon Zoo then trained captive polar bears to walk on it. Using shatterproof plastic and reinforced steel, the team constructed a 10-foot-long chamber that encased a treadmill typically used by horses. The 4,400-pound contraption also included a circular opening where researchers could tempt the bears into walking with fish and other tasty treats.
As a follow-up to the walking study, Pagano and biologists at the Oregon Zoo also measured the energy output of the bears while swimming. To do so, they developed a polar bear-sized swim chamber, complete with a small motor that generated waves to simulate the conditions the bears might encounter in the ocean.
Together, the two technologies helped scientists learn that bears expend more energy swimming than walking. Polar bears are good swimmers, but they’re not very efficient ones, thanks to their relatively short arms, their non-aerodynamic body shape and their propensity for swimming at the water’s surface, where drag is greatest. In a world with shrinking sea ice, polar bears likely need to swim more to find food and, thus, will burn precious calories, which could cause them to lose weight and lower their chances of reproducing—decreasing the species’ chances of survival.
Together, these and other technologies are helping researchers learn how polar bears are faring as the climate evolves. This knowledge, in turn, informs conservation decisions to help protect the bears and their environment—and the health of the planet more broadly.
“We need to understand more about how the Arctic ecosystem is changing and how polar bears are responding to loss of habitat if we are going to keep them in the wild,” says B.J. Kirschhoffer. “Ultimately, our fate is tied to the polar bear’s. Whatever actions we take to help polar bears keep their sea ice habitat intact are actions that will help humans protect our own future.”
Via Fortune, a look at how A.I. is helping protect ecosystems in the Galápagos Islands:
Within the Galápagos Marine Reserve, one of the world’s largest and most biologically diverse marine protected areas, more than 2,900 species depend on the interconnectedness of healthy ecosystems. Globally, humans, too, live off the many benefits of a thriving ocean.
Underwater technology is increasingly being deployed to safeguard the environments the ocean needs to release the earth’s oxygen, sequester carbon, provide sustenance, and so much more. And data is essential to track the state of these ecosystems as threats, such as illegal fishing and climate change, fluctuate. Collecting and analyzing ocean data at a large scale—including data on rare and endangered species—gives researchers useful insight to help conserve the vital diversity that makes up such a valued part of the earth.
“Wildlife research has, until the past few years, been a world of ‘small data,’” says Jason Holmberg, executive director at Wildbook Image Analysis (WBIA), a tech company that builds open software and A.I. solutions for conservationists. “Observations of rare or endangered species, whether via collars, camera traps, DNA samples, or photos, have been expensive to obtain and often required research teams to go directly into the field for short durations to obtain even small volumes of data.”
But with the use of A.I., the data and capabilities are expanding.
Observing whale sharks through passive tracking
In 2016, the International Union for Conservation of Nature (IUCN) reclassified whale sharks from vulnerable to endangered because of the many anthropogenic impacts they face, including industrial fisheries (both targeted and bycatch), vessel strikes, marine pollution, and climate change. Whale sharks, which can grow up to 60 feet long, are of significant ecological importance, according to Sofia Green, marine biologist and research scientist at the Galápagos Whale Shark Project, who studies the behavior, ecology, and migration patterns of the species. As predators at the top of the food chain, whale sharks feed on the weak and sick, keeping the food chain in order. They also serve as carbon sequesters and nutrient-dense fertilizers by bringing nutrients (via defecation) to unhealthy ecosystems.
“If you protect sharks, you protect the ocean. And if you protect the ocean, you have a healthy planet,” she explains, while emphasizing the incredible importance of preserving the existence of all sharks. “And you cannot save a species if you don’t understand how they behave.”
At the northernmost point of the Galápagos, near the island of Darwin, Green and her team tag, film, and photograph whale sharks, perform ultrasounds and blood draws, and take tissue samples to track the health and whereabouts of the animals that migrate through the Galápagos reserve. Observations like these are extremely valuable, but because they come from limited studies, they provide information only at a small scale.
This is where technology like “passive tracking”—via photo identification and A.I.—broadens the data set. Whale sharks have unique spots, like human thumbprints, that can be referenced as identification. The Galápagos Whale Shark Project uses sharkbook.ai, a system run by WBIA that taps into A.I. to aggregate images of individual whale sharks uploaded by the likes of Green as well as images and videos posted on social media platforms around the world.
“Consider how whale sharks, the world’s biggest fish, were once rarely observed and even less frequently photographed,” Holmberg says. With the advent of cell phones, numerous sightings show up on YouTube and Instagram. “With this new wealth of data emerging from more cost-effective technology and public observations of wildlife, there is now an overwhelming amount of wildlife data available.”
A.I. helps sort and curate images and videos. Then sharkbook.ai classifies each whale shark based on the spot patterns and uses the photos for “capture/recapture.” For Green, this information shows where sharks appear elsewhere on the planet, when the individual returns to Galápagos, or if one has ended up on land or in an area it typically wouldn’t go (usually a result of being illegally fished).
“Modified Groth algorithms like this are used by NASA to track stars,” Green explains. “It’s now being used to track the underwater constellations found on the bodies of these whale sharks.” Before this technology, researchers could obtain identifications only from sharks seen on their own expeditions. Through new, more expansive data collection, they now track almost 700 identified whale sharks that have migrated through the Galápagos.
When a photo is inputted for the first time, it goes into a newly created profile page. There, a summary of its sightings will build over time based on contributions. “All of this is aimed at a single goal,” Holmberg says: “Identifying the best ways to prevent extinction.”
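The “underwater constellation” matching Green describes can be illustrated with a toy point-pattern matcher. The sketch below ranks database individuals by a crude, translation- and scale-invariant signature built from pairwise spot distances; it is a simplified stand-in, not the algorithm sharkbook.ai actually runs, and all names and data are invented:

```python
import numpy as np

def spot_signature(points, bins=8):
    """Histogram of scale-normalized pairwise distances: a crude signature
    of a spot pattern that ignores where in the photo the spots appear."""
    pts = np.asarray(points, dtype=float)
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    d = d[np.triu_indices(len(pts), k=1)]  # unique pairwise distances
    d = d / d.max()                        # normalize for scale invariance
    hist, _ = np.histogram(d, bins=bins, range=(0, 1), density=True)
    return hist

def best_match(query, database):
    """Return the database ID whose signature is closest to the query's."""
    q = spot_signature(query)
    scores = {shark_id: np.abs(spot_signature(p) - q).sum()
              for shark_id, p in database.items()}
    return min(scores, key=scores.get)

# Toy database of two "individuals"; the query is shark_A's pattern, shifted.
db = {
    "shark_A": [(0, 0), (1, 0), (0, 2), (3, 1)],
    "shark_B": [(0, 0), (1, 1), (2, 2), (3, 3)],
}
query = [(10, 10), (11, 10), (10, 12), (13, 11)]  # shark_A translated by (10, 10)
print(best_match(query, db))  # -> shark_A
```

Production systems handle rotation, partial occlusion, and thousands of candidates, but the capture/recapture idea is the same: reduce each photo to a comparable signature, then search for the nearest known individual.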
A.I. to resolve the Galápagos “longline dilemma”
A threat to whale sharks and other species within the Galápagos Marine Reserve is the illegal longline fishing of tuna, which was banned in 2000 as a precautionary measure to prevent unintentional bycatch of endangered, threatened, and protected (ETP) species. However, it is still a common practice. Over the past two decades, people in the Galápagos have debated whether longlining is the biggest threat to the reserve’s biodiversity. Management authorities and environmental organizations hope to maintain the ban; local fishers, on the other hand, argue that longline fishing should be authorized because it is the most cost-efficient way to catch tuna. According to a study by marine biologist Mauricio Castrejón, Ph.D., as the local population (approximately 30,000 people spread across three habitable islands) grows, so does tuna consumption. Between 1997 and 2017, yellowfin tuna landings increased from 41.1 tons to 196.8 tons per year. Castrejón, who has led small-scale fisheries research and development projects in the Galápagos and the Eastern Tropical Pacific region (Costa Rica, Panama, Colombia, and Ecuador) for more than 18 years, dubs this fishing debate the “longline dilemma.”
This fishing technique trails a long line with numerous hooks behind a boat. If not properly monitored, it can result in the unintended bycatch of ETP species. Many believe the rate of bycatch isn’t as high as reported, but as of now, the Galápagos doesn’t have the data collection needed to track and survey the true rate. Through science, technology, and innovation, Castrejón hopes to build better monitoring and practices, and to end the debate.
One such pathway is installing video cameras created by Shellcatch onto fishing vessels, an initiative that began in 2021 after being chosen by the Charles Darwin Foundation and WildAid. The cameras capture high-resolution footage of fishing activities (technique, bycatch rate, and more) to hold fishers accountable for how and what they are catching, which creates market incentives, like selling their catch at a premium.
Shellcatch uses an A.I. algorithm in its electronic monitoring sensor to quickly detect bycatch of nontarget species. “A.I. is critical in validating sustainable fishing practices and connecting supply and demand in a cost-effective way,” says Alfredo Sfeir, founder and president of Shellcatch. He also cofounded Frescapesca, a seafood marketplace that also uses A.I. With Shellcatch, fishers input data into their logbooks (per usual practice), and scientists and A.I. use the video and accumulated data to validate the information. This proves to consumers that their product is fresh, legal, and sustainable, with a low incidental catch of protected species, and hence earns a premium market price.
Sfeir believes these technologies, including Frescapesca, will be a game changer. “With A.I., the online seafood market platform can convert a growing number of fishing events into data that optimizes logistics and creates higher margin transactions,” he says. “The end result is a growing number of fishermen and women supplying seafood and buyers ready to purchase when vessels arrive with product anywhere along the shoreline.”
Furthermore, this new data can feed into science-based evidence and advice to validate the most effective techniques, lower-risk gear, and solutions that protect other species and keep tuna fishing sustainable for the people of Galápagos and beyond.
Via Terra Daily, an article on the use of artificial intelligence to help stop poaching:
In a town in northeastern Scotland, Debbie Banks looks for clues to track down criminals as she clicks through a database of tiger skins.
There are thousands of photographs, including of rugs, carcasses and taxidermy specimens.
Banks, the crime campaign leader for the Environmental Investigation Agency (EIA), a London-based charity, tries to identify individual big cats from their stripes.
Once a tiger is identified, an investigator can pinpoint where it comes from.
“A tiger’s stripes are as unique as human fingerprints,” Banks told AFP.
“We can use the images to cross-reference against images of captive tigers that might have been farmed.”
Currently, this is slow, painstaking work.
But a new artificial intelligence tool, being developed by The Alan Turing Institute, a centre in the UK for data science and artificial intelligence, should make life much easier for Banks and law enforcement officials.
The project aims to develop and test AI technology that can analyse the tigers’ stripes in order to identify them.
“We have a database of images of tigers that have been offered for sale or have been seized,” Banks said.
“When our investigators get new images, we need to scan those against the database.
“At the moment we are doing that manually, looking at the individual stripe patterns of each new image that we get and cross-referencing it against the ones we have in our database.”
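The manual cross-referencing Banks describes is, in essence, a nearest-neighbour search over stripe patterns. One common way to automate such matching (a sketch only; the Alan Turing Institute has not published its exact method here) is to embed each image as a feature vector and compare embeddings by cosine similarity:

```python
import numpy as np

# Sketch of database matching by cosine similarity. In practice the
# feature vectors would come from a neural network trained on stripe
# patterns; here they are random stand-ins.

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query, database):
    """Return the (tiger_id, score) of the closest stripe embedding."""
    scores = {tid: cosine_similarity(query, vec) for tid, vec in database.items()}
    tid = max(scores, key=scores.get)
    return tid, scores[tid]

rng = np.random.default_rng(0)
database = {f"tiger_{i}": rng.normal(size=128) for i in range(100)}
# A query that is a slightly noisy view of tiger_42 should match tiger_42.
query = database["tiger_42"] + 0.05 * rng.normal(size=128)
tid, score = best_match(query, database)
```

A new seizure photo then becomes one embedding comparison against the whole database, rather than a stripe-by-stripe visual check of every record.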
It is hoped that the new technology will help law enforcement agencies determine where tiger skins come from and allow them to investigate the transnational networks involved in trafficking tigers.
Once the officials know the origins of confiscated tiger skins and products, they will be able to tell whether the animal was farmed or poached from a protected area.
Poaching, fuelled by consumer demand, remains a major threat to the survival of the species, according to the EIA.
Tiger skins and body parts are sought after, partly due to their use in traditional Chinese medicine.
An estimated 4,500 tigers remain in the wild across Asia.
“Tigers faced a massive population decline in the last 120 years, so we want to do everything we can to help end the trade in their parts and products, including tiger skins,” Banks said.
Anyone with photographs of tigers is invited to submit them to the EIA to help bolster the AI database.
“We are inviting individuals — whether they are photographers or researchers and academics — who may have images of tigers where their stripe patterns are clear,” Banks said.
“They could be live tigers, dead tigers or tiger parts.
“If they can share those with us, the data scientists can then develop, train and test the algorithm,” she said.
“We need thousands of images just to do that phase of the project.”
Via Fast Company, an article on a new tool from Google that shows how the planet is changing in near real time:
The planet changes quickly: More than half a million acres are burning in New Mexico. A megadrought is shrinking Lake Mead. The Alps are turning from white to green. Development continues to expand, from cities to massive solar farms. All of these changes impact the Earth’s climate and biodiversity. But in the past, such changes have been difficult to track in detail as they’re happening.
A new tool from Google Earth Engine and the nonprofit World Resources Institute pulls from satellite data to build detailed maps in near real time. Called Dynamic World, it zooms in on the planet in 10-by-10-meter squares from satellite images collected every two to five days. The program uses artificial intelligence to classify each pixel based on nine categories that range from bare ground to trees, crops, and buildings.
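The core of the per-pixel step is an argmax over class probabilities: for each 10-by-10-meter square, the model outputs a probability for each of the nine land-cover classes, and the pixel is labelled with the most probable one. A minimal sketch of that final step (the probabilities here are random stand-ins, and the class names follow the published Dynamic World dataset):

```python
import numpy as np

# Sketch of Dynamic World-style per-pixel labelling: each pixel gets the
# most probable of nine land-cover classes. Real probabilities come from
# a deep model applied to satellite imagery; these are random stand-ins.

CLASSES = ["water", "trees", "grass", "flooded_vegetation", "crops",
           "shrub_and_scrub", "built", "bare", "snow_and_ice"]

def classify(prob_stack):
    """prob_stack: array of shape (9, H, W) of per-class probabilities.
    Returns an (H, W) array of class indices (argmax per pixel)."""
    return np.argmax(prob_stack, axis=0)

rng = np.random.default_rng(1)
probs = rng.random((9, 4, 4))
probs /= probs.sum(axis=0)            # normalize to valid probabilities
labels = classify(probs)              # (4, 4) grid of class indices
names = np.array(CLASSES)[labels]     # human-readable labels per pixel
```

Re-running this labelling on imagery collected every few days is what lets a burned pixel flip from “trees” to “shrub_and_scrub” within days of a fire, as in the Caldor example below.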
Researchers, nonprofits, and other users can “explore and track and monitor changes in these terrestrial ecosystems over time,” says Tanya Birch, senior program manager for Google Earth Outreach. As the tool was being built last year, Birch used it in the days after the Caldor Fire, a wildfire that burned more than 200,000 acres in California. The pixels in satellite images quickly changed from being classified as “trees” to “shrub and scrub.”
Scientists used to rely on statistical tables that were sometimes released only every five years, says Fred Stolle, deputy director of the World Resources Institute’s Forests Program. “That’s clearly not good enough anymore,” he says. “We’re changing so fast, and the impact is so fast, that satellites are now the way to go.”
Researchers and planners already use satellite data in some applications—the World Resources Institute, for example, previously worked with Google to build Global Forest Watch, a tool that can track deforestation using satellite images. But the new data is much more detailed; now it’s sometimes possible to see if one or two trees are cut down in a tropical forest, even when a larger area is intact, Stolle says.
In cities, planners could use the data to easily see which neighborhoods don’t have enough green space. Researchers studying smallholder farms in Africa could use it to see the impacts of drought and when crops are being harvested. Because the data is continuously updated, it’s also possible to watch the seasons change throughout the year across the entire planet. The data goes back five years, and using the new tool, anyone can enter date ranges to see how a location has changed over time.
“I encourage people to dive into it and explore,” Birch says. “There’s a lot of depth and a lot of richness in Dynamic World… I feel like this is really pushing the frontier of mapmaking powered by AI in an incredibly novel way.”