An AI-Powered Robot and Gaming Are Helping Scientists Identify New Deep-Sea Species

Via Bloomberg, an article on how an AI-powered robot and gaming are helping scientists identify new deep-sea species:

Nearly a dozen miles off the California coast on a foggy October morning, a crane lifts a boxy yellow robot off the deck of the research vessel Rachel Carson and lowers it into Monterey Bay’s choppy gunmetal-gray waters. The remotely operated vehicle, bristling with cameras and lights, remains tethered to the ship by an unspooling cable, but artificial intelligence has given it a mind of its own.

Descending to a depth of 200 meters (656 feet), the robot, called MiniROV, beams back images of a rarely seen jellyfish to a wall of video screens lining a cramped control room aboard the ship. Monterey Bay Aquarium Research Institute principal engineer Kakani Katija lightly grips a pair of joysticks, maneuvering MiniROV closer to the tiny and translucent critter swimming through a marine snowstorm of organic particles falling to the seafloor.

Then senior electrical engineer Paul Roberts presses a key on a laptop and announces, “We’ve started agent tracking.”

The MiniROV aboard the RV Rachel Carson off the California Coast in October 2024.
The MiniROV is lowered into the Pacific Ocean.
The MiniROV in the water, tethered to the Rachel Carson.

The “agent” is an AI program integrated into MiniROV’s control algorithms. It’s being deployed for the first time in the ocean, allowing the robot to autonomously locate and track marine organisms. Katija takes her hands off the joysticks and smiles as the robot begins following the square-shaped jellyfish of its own volition, firing thrusters to maintain pace with the animal as it rapidly swims away. MiniROV keeps up the chase for five minutes before the agent disengages after losing the trail.
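
MBARI hasn’t published the agent’s internals, but the behavior on display (lock onto a detection, fire thrusters to stay with the animal, disengage after losing the trail) maps onto a classic visual-servoing loop. Here is a minimal Python sketch of that idea; the camera, thrusters and detect_target interfaces are hypothetical stand-ins, not MBARI’s actual software:

```python
# Hypothetical sketch of an animal-tracking control loop for an ROV.
# `camera`, `thrusters` and `detect_target` are assumed interfaces,
# not MBARI's actual API.

LOST_FRAME_LIMIT = 90  # consecutive frames without a detection before giving up


def track_animal(camera, thrusters, detect_target, gain=0.4):
    """Steer the vehicle to keep one detected animal centered in frame."""
    frames_lost = 0
    while frames_lost < LOST_FRAME_LIMIT:
        frame = camera.read()
        detection = detect_target(frame)  # e.g. a bounding box from a neural net
        if detection is None:
            frames_lost += 1
            continue
        frames_lost = 0
        # Error signal: how far the animal sits from the center of the image.
        cx, cy = detection.centroid
        err_x = cx - frame.width / 2
        err_y = cy - frame.height / 2
        # Proportional control: command thrusters to re-center the target.
        thrusters.command(yaw=-gain * err_x, heave=-gain * err_y)
    thrusters.stop()  # trail lost: disengage, as the agent did after five minutes
```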

“The goal is to track individual animals for up to 24 hours so we can answer questions about the animal’s behavior and ecology,” says Katija. “Imagine having autonomous vehicles that are constantly monitoring the ocean, changing behavior when it comes across something that we aren’t familiar with or haven’t seen before.”

To build that AI-powered future, MBARI researchers are enlisting citizen scientists to play a game that helps train the machines at a rate much faster than a small group of researchers ever could. The need for speed is clear: The ocean is the world’s bulwark against climate change and marine organisms play key roles in cycling carbon out of the atmosphere.

“We’re relying on these biological systems to help address our climate challenges, but there’s a massive gap in our understanding of these animals,” Katija says.

Kakani Katija aboard the Rachel Carson.

Thanks to advances in robotics, underwater cameras and sensor technology, researchers have accumulated millions of images of deep-sea life, yet most creatures remain unknown. Only about 8% of the 5,580 species detected in the region of deep ocean targeted for seabed mining, for instance, have been identified. That’s because classifying unknown organisms can be a years-long endeavor constrained by the availability of overtaxed taxonomists. The jellyfish that MiniROV is tracking was first observed in 1990 but wasn’t identified as a new species, Stellamedusa ventana, until 2003.

The game for phones and tablets — dubbed FathomVerse — populates a virtual ocean with images of marine critters in their deep-sea habitats stored in a sprawling database known as FathomNet. Some photos are of ocean animals whose identity has been verified by scientists. Others are organisms labeled by the AI or that have yet to be classified.
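
The article distinguishes three kinds of images in that database: scientist-verified, AI-labeled and not yet classified. A toy Python sketch of how such label provenance might be recorded (the field names are illustrative, not FathomNet’s actual schema):

```python
# Hypothetical record schema for a FathomNet-style image database;
# field names are illustrative, not FathomNet's actual data model.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class LabelSource(Enum):
    SCIENTIST_VERIFIED = "scientist"  # identity confirmed by taxonomists
    AI_PREDICTED = "ai"               # labeled by a model, not yet verified
    UNCLASSIFIED = "none"             # awaiting any label


@dataclass
class OceanImage:
    image_id: str
    url: str
    depth_m: float            # capture depth in meters
    label: Optional[str]      # e.g. "Stellamedusa ventana"
    source: LabelSource
```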

Players first embark on training dives where they’re taught to distinguish the characteristics of 47 different types of marine animals. Some are familiar — jellyfish and octopuses — while others, such as a predatory pelagic worm called a chaetognath, look positively extraterrestrial.

One of the cameras on the MiniROV.
An MBARI researcher controls the camera on the MiniROV from the control room.
MBARI researchers monitor the MiniROV from the control room.

Once players are trained up as amateur marine biologists, they take on missions drifting along ocean currents, looking for pulsating dots that indicate where marine life has been recorded. Players tap the screen to see the animal and either identify it or tag it as unknown. The game then reveals whether their choice matches the consensus of other players or whether the creature remains undetermined. Players win points for correct classifications and for the number of organisms they spot, plus bonus points for correctly labeling a previously unidentified life form once consensus is reached. When players accumulate enough points, they move up to the next of the game’s 15 levels.
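
The article doesn’t give FathomVerse’s exact thresholds or point values, but the mechanics it describes reduce to a simple consensus-and-scoring routine. A sketch, with all numbers invented for illustration:

```python
# Sketch of player-consensus labeling and scoring;
# thresholds and point values are invented, not FathomVerse's.
from collections import Counter

CONSENSUS_FRACTION = 0.7  # assumed agreement threshold
MIN_VOTES = 10            # assumed minimum players per image


def consensus_label(votes):
    """Return (label, reached) for one image, given all players' labels."""
    if len(votes) < MIN_VOTES:
        return None, False
    label, count = Counter(votes).most_common(1)[0]
    if count / len(votes) >= CONSENSUS_FRACTION:
        return label, True
    return None, False


def score_players(player_votes, final_label, was_unidentified):
    """Yield (player, points) once consensus is reached on an image."""
    for player, vote in player_votes.items():
        points = 0
        if vote == final_label:
            points = 10       # correct classification
            if was_unidentified:
                points += 25  # bonus: previously unidentified life form
        yield player, points
```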

On the backend, researchers verify the players’ consensus findings and compare them to the AI’s classifications. The collaboration between players and researchers allows the scientists to “train better and better algorithms” that can operate in the real-world ocean under a variety of conditions, says Katija, who leads MBARI’s Bioinspiration Lab, which develops imaging technology to observe marine life.

Chris Jackett, a research scientist at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO), says games like FathomVerse can play a significant role in training AI. “The human ability to recognize patterns and identify unusual features still remains unmatched, and having multiple observers examine the same imagery helps build robust training datasets,” says Jackett, who works on AI detection and identification of corals and other marine species.

He notes that human annotation of deep-sea images is particularly important for training as organisms can look different under various ocean conditions, confounding AI. Light levels, water clarity, camera angle and turbulence all alter animals’ appearance.
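
One standard way to harden a model against that variability (my illustration, not something the article attributes to MBARI or CSIRO) is to augment training images so the network sees each animal under many simulated conditions. A sketch using torchvision:

```python
# Illustrative augmentation pipeline (a sketch, not MBARI's or CSIRO's method):
# expose the model to varied light, blur and orientation during training.
from torchvision import transforms

ocean_augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.5, contrast=0.4),      # variable light levels
    transforms.GaussianBlur(kernel_size=9, sigma=(0.1, 4.0)),  # murky, low-clarity water
    transforms.RandomRotation(degrees=25),                     # camera angle, turbulence
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
```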

The value of a pair of human eyes becomes apparent as you play FathomVerse and encounter a vast array of otherworldly but real lifeforms that look like nothing found on land.

A virtual ocean populated with images of marine critters in the FathomVerse game. Source: FathomVerse

Asking AI to figure out whether that elongated bioluminescent blob is a physonect siphonophore, a larvacean or some deep-sea denizen never seen before is like asking it to identify the alien species in the Star Wars bar scene.

Human eyes — especially many sets of them — have a far better chance at landing on the right answer, at least until AI becomes far more proficient. For instance, AI was stumped by an orange critter with yellow spots, green eyes and little bristles that look like three-day stubble. Players eventually determined it was a species of bony fish.

A healthy dose of competition and game rewards are also distinctly human incentives driving engagement. “I was up half the night playing,” wrote one enthusiast in a post on the game’s Discord server, a platform where players discuss their finds and trade tips.

So, too, do an appreciation of nature and a drive to detect new life forms. “Every time I discover something I’ve never seen before, and [it] has beauty and complexity, it blows my mind,” posted one player, while another so-called Fathomnaut wrote on Discord that they were “SO EXCITED because I’m absolutely obsessed with cephalopods.”

Nearly 17,500 people have downloaded FathomVerse, and as of Jan. 21, players had reached consensus on 47,799 unidentified images of marine organisms. That represents about 14% of the labeled images in the FathomNet database.

The players’ contributions will train AI underwater bots to recognize different animals on their deep sea missions. For instance, the AI has consistently mistaken black coral, anemones and other animals for sea fans, a type of coral. The players’ findings will retrain the AI to correctly identify those organisms.
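
In practice that retraining step can be as simple as overriding the model’s labels with verified player consensus before the next training run. A hypothetical sketch; the function name and data shapes are invented:

```python
# Hypothetical sketch: override the model's mistaken labels with
# player-consensus labels before the next training run.
def build_retraining_set(ai_records, consensus):
    """ai_records: list of (image_id, ai_label); consensus: image_id -> verified label."""
    return [
        (image_id, consensus.get(image_id, ai_label))
        for image_id, ai_label in ai_records
    ]


# For example, an image the AI tagged "sea fan" that players identified as black coral:
corrected = build_retraining_set(
    ai_records=[("img-001", "sea fan")],
    consensus={"img-001": "black coral"},
)
# corrected == [("img-001", "black coral")]
```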

The images of marine life that AI-enabled robots gather in the future will be added to FathomNet and the game, ensuring players have a reason to keep coming back.

“I see a huge potential for this kind of gamification approach to AI,” says Linda See, a principal research scholar at the International Institute for Applied Systems Analysis, a global research institute based in Austria that has created several citizen science gaming apps. Game players have helped create datasets used to improve tree cover maps and precipitation prediction. Some players have contributed so much data that they’ve been named as co-authors on scientific papers, she adds.

In the days before MiniROV’s test deployment in October, researchers taught its AI to tail jellyfish by having the robot chase fake ones made of bottle caps and zip ties around a 10-meter-deep (33-foot) tank at MBARI’s laboratory in Moss Landing, California.

It’s apparently a fast learner. In the ocean, it locks on and tracks Stellamedusa ventana jellyfish, which are less than 10 inches long, as well as a giant larvacean, a gelatinous organism that surrounds its body with a three-foot-wide “snot palace” made of mucus to trap its dinner.

MiniROV proves distractible, though. The robot breaks off following one Stella to track another after the two jellyfish cross paths. “That’s something we expect partly because I don’t think we’ve figured out how to reward it for staying on the same object,” says Katija.

When the robot remains locked on an organism while maintaining a proper distance for at least a minute, it receives algorithmic “points” to reinforce the behavior. If it strays away from the target by, say, misfiring its thrusters, it loses points.
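
That is textbook reward shaping. A minimal sketch of the scheme the article describes, with the distances and point values invented:

```python
# Sketch of the reward scheme described above; the numbers are invented.
LOCK_SECONDS = 60         # minimum continuous lock before a reward
NEAR_M, FAR_M = 0.5, 2.0  # assumed acceptable standoff band, in meters


def tracking_reward(lock_duration_s, distance_m, target_lost):
    """Reinforce a sustained, well-spaced lock; penalize straying off target."""
    if target_lost:
        return -1.0  # e.g. a thruster misfire pushed the animal out of frame
    if lock_duration_s >= LOCK_SECONDS and NEAR_M <= distance_m <= FAR_M:
        return 1.0   # held a proper distance for at least a minute
    return 0.0
```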

“These AI-enabled vehicles could provide valuable insights into daily movement patterns, feeding behaviors and species interactions that are difficult to capture through traditional observation methods,” says the CSIRO’s Jackett. Today, most marine animals are randomly observed during relatively short expeditions where human-controlled ROVs and other technology are deployed. For instance, the Stellamedusa ventana jellyfish had been seen only seven times between 1990 and 2003. Autonomous AI robots would allow persistent monitoring of marine life.

Tommy Knowles puts collection canisters back onto the MiniROV after the expedition.
Tommy Knowles holds a jellyfish collected by the MiniROV.

At the University of Minnesota, scientists are developing an open-source, AI-enabled underwater robot called MeCO to autonomously identify and map invasive freshwater species, which are particularly hard to detect when lakes are frozen over in winter.

“Our biologists here have very little data on what’s actually growing underneath the ice and sending someone to dive in those times is extremely hazardous,” says Junaed Sattar, director of the Minnesota Interactive Robotics and Vision Lab.

One of the biggest challenges of deploying computationally intensive AI to track and tag marine species is that the cloud doesn’t work underwater. “The problem right now is those algorithms require so much compute that we can’t put them on a vehicle yet,” says Katija. To overcome that obstacle, researchers have to develop AI architecture that can run onboard an ocean-going autonomous robot. (MiniROV’s AI currently runs on computers aboard the ship, connected to the vehicle through its tether cable.)

MiniROV returned to the ocean last week for more AI trials, and in March, the next version of FathomVerse is set to be released.

“FathomVerse players are identifying animals that didn’t have an identity before and they’re probably the first people to even look at these animals,” says Katija.



