Researchers from NASA Ames Research Center have developed a way to use virtual reality to assess the health of coral reefs.
Ocean researchers have been collecting image libraries of underwater environments for years: 3D imagery is collected by divers and snorkelers, while 2D images are gathered from satellites. These libraries amount to a large volume of data, and the team wanted a way to quickly analyze patterns and classifications in the images. The team automated this process using a convolutional neural network (CNN), a type of artificial intelligence (AI).
Classification allows researchers to see how coral reef ecosystems change over time. CNNs scan an image, search for features, and record where those features sit relative to the rest of the image. While this type of classification is very helpful, it requires a large labeled dataset to train the system to analyze complex 3D images of coral reefs.
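The feature-search behavior described above can be illustrated with a minimal NumPy sketch. This is not the NeMO-Net model itself; the toy image, the hand-picked edge kernel, and all sizes are invented for illustration. It shows the core operation a CNN's convolutional layers repeat many times: sliding a small kernel over an image to produce a feature map whose strongest responses locate a feature.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a kernel across the image, producing a feature map (valid padding)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "image": all zeros except a bright vertical stripe at column 3.
image = np.zeros((6, 6))
image[:, 3] = 1.0

# Hand-crafted vertical-edge kernel: responds where intensity jumps
# from dark (left) to bright (right). In a real CNN, kernel weights
# are learned from labeled training data, not written by hand.
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

fmap = conv2d(image, kernel)
# The strongest response in the first row of the feature map sits at
# column 2, where the window first straddles the dark-to-bright edge.
print(int(np.argmax(fmap[0])))  # prints 2
```

A trained CNN stacks many such learned kernels in successive layers, so that later layers respond to progressively more complex features, which is what makes the large training dataset necessary.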
To build the training dataset, the team used a citizen science approach in the form of a video game called NeMO-Net, which relies on users to create training data. In the game, players explore a virtual underwater world while learning about and classifying coral species. Player-produced classification labels were then used to train NeMO-Net's CNN.
Not only does the video game collect data, but it also serves as an educational tool that gives people a better understanding of coral reefs. Since its release, the game has seen over 300 million players. Researchers are hopeful that this tool and their CNN will be valuable for other conservation and mapping projects.
A paper on this research was published in Frontiers in Marine Science.