Comfort Eboigbe hopes to use AI to make a time-intensive task easier for students and researchers studying aquatic life.
The ocean data specialist at the Marine Institute says that graduate students and young researchers typically spend months watching sometimes murky underwater footage to document the fish, crustaceans or corals they see.
“It’s a long and tedious process,” said Ms. Eboigbe, who manages the data collected by oceanographic buoys and the Holyrood observatory operated by the Marine Institute’s Centre for Applied Ocean Technology. “Artificial intelligence can be used for the preliminary processing of these videos, and it would allow for keyword searches of the annotated video.”
Originally from Nigeria, Ms. Eboigbe received her master of applied science (environmental systems engineering and management) degree from Memorial in 2020.
She also holds a bachelor’s degree in chemical engineering from Covenant University in Ota, Nigeria.
Open-source model
Late last year, Ms. Eboigbe started training an open-source AI model that uses a deep learning algorithm to detect objects and animals in videos.
She says she got the idea from a presentation at the Oceans Research in Canada Alliance conference in St. John’s last summer.
“The researchers used bathymetry data to train an AI model to be able to recognize certain things in the ocean and I thought it was interesting. Turns out, they used a model called YOLOv5 for their project.”
Although she does not have an AI background, Ms. Eboigbe researched the YOLO model and found an updated version written in Python, a language with which she is familiar.
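For readers curious what that looks like in practice, here is a minimal sketch of running a pretrained YOLO detector on a single video frame with the open-source ultralytics Python package. The checkpoint and file names are illustrative assumptions, not the Marine Institute's actual setup.

```python
# Minimal sketch: run a pretrained YOLO detector on one extracted frame.
# "yolov8n.pt" is a small general-purpose checkpoint; a marine-life
# detector would be fine-tuned on annotated underwater images instead.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")
results = model("frame_0001.jpg")  # hypothetical frame file

for result in results:
    for box in result.boxes:
        label = result.names[int(box.cls)]        # predicted class name
        print(f"{label}: {float(box.conf):.2f}")  # detection confidence
```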
There is, however, plenty of work to do.
The model still requires a human to convert all the videos into still images and annotate them; there is no automatic way to do it. Preparing 1,278 images took almost three weeks.
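For the curious, here is a minimal sketch of what the video-to-stills step might look like with the OpenCV Python library; labelling what each resulting image shows still falls to a person. The file name and sampling interval are illustrative assumptions.

```python
# Minimal sketch: sample still frames from an underwater video so they
# can be annotated by hand. File names and interval are illustrative.
import os
import cv2

os.makedirs("frames", exist_ok=True)
video = cv2.VideoCapture("holyrood_dive_01.mp4")  # hypothetical file
fps = video.get(cv2.CAP_PROP_FPS) or 30.0         # fall back if unknown
step = int(fps * 5)                               # one frame every 5 s

frame_idx = saved = 0
while True:
    ok, frame = video.read()
    if not ok:
        break                                     # end of footage
    if frame_idx % step == 0:
        cv2.imwrite(f"frames/frame_{saved:05d}.jpg", frame)
        saved += 1
    frame_idx += 1

video.release()
print(f"Saved {saved} stills for manual annotation")
```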
Many hours of video
To train the AI model to detect marine life, Ms. Eboigbe is using underwater video from the Holyrood subsea observatory and the Marine Institute’s marine conservation area project that started in 2022.
The five-year collaborative project is monitoring and gathering data in federal conservation areas off Newfoundland and Labrador.
Among the many tools they’re using are baited cameras on the seafloor and drop cameras to collect video of marine species and habitat.
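Training, in this context, typically means fine-tuning a pretrained detector on frames annotated from footage like this. A minimal sketch with the ultralytics package follows; the dataset file marine_life.yaml and the hyperparameters are assumptions for illustration, not the project's settings.

```python
# Minimal sketch: fine-tune a pretrained YOLO detector on annotated
# underwater frames. "marine_life.yaml" would list the image/label
# folders and the class names (e.g. skate, sculpin, coral) produced
# during manual annotation; it is hypothetical here.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # start from a pretrained checkpoint
model.train(data="marine_life.yaml", epochs=100, imgsz=640)

metrics = model.val()       # evaluate on the held-out validation split
print(metrics.box.map50)    # mean average precision at IoU 0.50
```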
So far, the project has collected 249 hours of video footage — or 3.35 terabytes of data — and plans are underway to gather more data this summer.
Ms. Eboigbe says the AI model will speed up the preliminary review process.
Humans, however, will still play a role in identifying variations within the same species of fish, such as skate or sculpin.
“The model can identify that there are skate in the video, but not the different sub-groups within the skate family,” she said. “It’s really difficult to tell them apart in video. Some fish you have to handle to tell the difference.”
Hours vs. minutes
Anything that speeds up the video reviewing process will be a welcome tool for Robyn Whelan and Hannah Steele.
Both women are working on the marine conservation area project.
“One of the goals of the project is to collect baseline information to provide an overview of the biodiversity within these areas,” said Ms. Whelan, a fisheries technologist with the Centre for Fisheries Ecosystems Research at the Marine Institute.
She says it takes about an hour to analyze and annotate 10 minutes of video. At that rate, the project's 249 hours of footage would take roughly 1,500 hours to review by hand.
“It could take many hours to go through all the footage. With this model, it could take minutes to do the same review.”
Ms. Steele, a part-time research assistant and project blogger with the Centre for Fisheries Ecosystems Research who is working on her bachelor of technology degree at the Marine Institute, agrees.
“There is a lot of marine life biodiversity in the oceans off Newfoundland and Labrador,” she said. “You spend a lot of time flipping through textbooks to identify corals and sponges.”