
Deep learning

Marine Institute AI model may make fish and coral detection a breeze

Research

By Moira Baird

Comfort Eboigbe hopes to use AI to make a time-intensive task easier for students and researchers studying aquatic life.

A woman stands with her arm on a rail in an atrium with a Marine Institute banner hanging behind her.
Comfort Eboigbe is an ocean data specialist with the Centre for Applied Ocean Technology at the Marine Institute.
Photo: Angie White

The ocean data specialist at the Marine Institute says that graduate students and young researchers typically spend months watching sometimes murky underwater footage to document the fish, crustaceans or corals they see.

“It’s a long and tedious process,” said Ms. Eboigbe, who manages the data collected by oceanographic buoys and the Holyrood observatory operated by the Marine Institute’s Centre for Applied Ocean Technology. “Artificial intelligence can be used for the preliminary processing of these videos, and it would allow for keyword searches of the annotated video.”

Originally from Nigeria, Ms. Eboigbe received her master of applied science (environmental systems engineering and management) degree from Memorial in 2020.

She also holds a bachelor’s degree in chemical engineering from Covenant University in Ota, Nigeria.

Open-source model

Late last year, Ms. Eboigbe started training an open-source AI model that uses a deep learning algorithm to detect objects and animals in videos.

She says she got the idea from a presentation at the Oceans Research in Canada Alliance conference in St. John’s last summer.

“The researchers used bathymetry data to train an AI model to be able to recognize certain things in the ocean and I thought it was interesting. Turns out, they used a model called YOLOv5 for their project.”

Although she does not have an AI background, Ms. Eboigbe researched the YOLO model and found an updated version written in Python, a programming language with which she is familiar.

“Some fish you have to handle to tell the difference.” — Comfort Eboigbe

There is, however, plenty of work to do.

The model still requires a human to convert the videos into still images and annotate them; there is no automatic way to do it.

Converting 1,278 images took almost three weeks.

Many hours of video

To train the AI model to detect marine life, Ms. Eboigbe is using underwater video from the Holyrood subsea observatory and the Marine Institute’s marine conservation area project that started in 2022.

The five-year collaborative project is monitoring and gathering data in federal conservation areas off Newfoundland and Labrador.

Among the many tools they’re using are baited cameras on the seafloor and drop cameras to collect video of marine species and habitat.

“It could take many hours to go through all the footage. With this model, it could take minutes.” — Robyn Whelan

So far, the project has collected 249 hours of video footage — or 3.35 terabytes of data — and plans are underway to gather more data this summer.

Ms. Eboigbe says the AI model will speed up the preliminary review process.

Humans, however, will still play a role in identifying variations within the same species of fish, such as skate or sculpin.

“The model can identify that there are skate in the video, but not the different sub-groups within the skate family,” she said. “It’s really difficult to tell them apart in video. Some fish you have to handle to tell the difference.”

Hours vs. minutes

Anything that speeds up the video reviewing process will be a welcome tool for Robyn Whelan and Hannah Steele.

Both women are working on the marine conservation area project.

A woman wearing a life jacket and high visibility hat stands on the deck of a boat on the water in front of a hillside.
Robyn Whelan is a fisheries technologist with the Centre for Fisheries Ecosystems Research.
Photo: Submitted

“One of the goals of the project is to collect baseline information to provide an overview of the biodiversity within these areas,” said Ms. Whelan, a fisheries technologist with the Centre for Fisheries Ecosystems Research at the Marine Institute.

She says it takes about an hour to analyze and annotate 10 minutes of video.

“It could take many hours to go through all the footage. With this model, it could take minutes to do the same review.”
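Ms. Whelan's estimate can be checked with quick arithmetic. A back-of-envelope sketch using only the figures quoted in this article (one hour of analysis per 10 minutes of footage, 249 hours collected so far):

```python
# Figures quoted in the article.
footage_hours = 249            # video collected by the project so far
analysis_hours_per_10_min = 1  # ~1 hour to analyze 10 minutes of video

footage_minutes = footage_hours * 60
manual_review_hours = footage_minutes / 10 * analysis_hours_per_10_min

print(round(manual_review_hours))  # 1494 hours of manual review
```

At that rate, fully reviewing the existing footage by hand would take roughly 1,500 person-hours, which is why even a rough automated first pass is attractive.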

Ms. Steele, a part-time research assistant and project blogger with the Centre for Fisheries Ecosystems Research who is working on her bachelor of technology degree at the Marine Institute, agrees.

“There is a lot of marine life biodiversity in the oceans off Newfoundland and Labrador,” she said. “You spend a lot of time flipping through textbooks to identify corals and sponges.”

