Your gaming skills could help teach an AI to identify jellyfish and whales
Today, there are more ways to take photos of the underwater world than anyone could have imagined at the start of the millennium, thanks to ever-improving designs for aquatic cameras. They have given us a glimpse into the marine world. However, they have also presented marine biologists with mountains of visual data that are tedious and time-consuming to sort through.

The Monterey Bay Aquarium Research Institute (MBARI) in California has suggested a solution: a machine-learning platform, called Ocean Vision AI, that can process images and videos. It combines human-made annotations with artificial intelligence. Think of it like the eBird or iNaturalist apps, but modified for marine life.

The project is a multidisciplinary collaboration among data scientists, oceanographers, game developers, and human-computer interaction specialists. On Tuesday, the National Science Foundation showed its support for the two-year project by awarding it $5 million in funding.

“Only one percent of the hundreds of thousands of hours of ocean video footage and imagery have been viewed and analyzed in their entirety, and even less has been shared with the global scientific community,” Katy Croff, founder and president of the Ocean Discovery League and co-principal investigator of Ocean Vision AI, said in a press statement. Analyzing the complex interactions between organisms and their environment requires manual labeling by experts, an approach that is resource-intensive and difficult to scale.

[Related: Why ocean researchers want to create a global library of undersea sounds]

“As more industries and institutions seek to use the ocean, it is becoming more important to understand the space where their activities intersect. Growing the blue economy requires understand[ing] its impact on the ocean environment, particularly the life that lives there,” Kakani Katija, a principal engineer at MBARI and the lead principal investigator for Ocean Vision AI, wrote in a Twitter post.

This is where artificial intelligence comes in. Marine biologists have already been experimenting with AI software to classify sounds in the ocean, like whale songs. The idea behind Ocean Vision AI is to create a central hub that collects new and existing underwater visuals from research groups, uses them to train an organism-identifying artificial intelligence algorithm that can tell apart, say, the crab from the sponge in a frame, and shares the annotated images with the public and the wider scientific community as a source of open data.

[Related: Jacques Cousteau’s grandson is building a network of ocean floor research stations]

A key part of the equation is an open-source image database called FathomNet. According to NSF’s 2022 Convergence Accelerator portfolio, “the data in FathomNet are being used to inform the design of the OVAI [Ocean Vision AI] Portal, our interface for ocean professionals to select concepts of interest, acquire relevant training data from FathomNet, and tune machine learning models.” OVAI’s ultimate goal is to make ocean imagery accessible to all.

Ocean Vision AI will also include a video game component to engage the public. The team is creating a game that “will educate players while generating annotations” that can improve the accuracy of the AI models.

Although the game is still in prototype testing, a sneak peek can be seen in a video NSF posted to YouTube, showing an interface that asks users whether a photo they saw contained a jellyfish (reference images of what a jellyfish looks like appear at the top of the screen).

Here is the current timeline: the team expects the first version of FathomNet, which is currently in beta, to be fully functional by next summer, along with a preliminary data set. In 2024, the team will begin exporting machine learning-labeled ecological survey data to repositories like the Global Biodiversity Information Facility and look into building a potential subscription model for institutions. The video game’s modules will be integrated into popular games as well as museum and aquarium experiences. After field testing different versions, the team will finalize their design and release a standalone, multiplatform game in late 2024.

“The health of our planet’s oceans is vital, but we only have a small amount of it,” Katija said in a press statement. “Together we are developing tools that are urgently required to help us better understand our blue planet.”


Charlotte Hu
