Fish Detection AI, sonar image-trained detection, counting, tracking models


The Fish Detection AI project aims to improve the efficiency of fish monitoring around marine energy facilities in order to comply with regulatory requirements. Despite advancements in computer vision, there has been limited focus on sonar imagery, on identifying small fish with unlabeled data, and on underwater fish monitoring methods for marine energy.
A Faster R-CNN (Region-based Convolutional Neural Network) was developed using sonar images from the Alaska Department of Fish and Game to identify, track, and count fish in underwater environments. The Faster R-CNN was trained in a supervised manner on labeled fish data. Customized filters were applied to detect and count small fish when labeled datasets were unavailable. Unsupervised domain adaptation techniques were implemented so that trained models could be applied to unseen datasets, reducing the need to label data and train new models for each location. Additionally, elastic shape analysis (ESA), hyper-image analysis, and various image preprocessing methods were explored to enhance fish detection.
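The models and configurations released with this submission come from the ALDI code described below; purely as an illustrative sketch of what a single-class ("fish") Faster R-CNN looks like, the snippet below uses torchvision, with a placeholder frame size and score threshold that are assumptions rather than the project's settings.

# Illustrative sketch only: a single-class ("fish") Faster R-CNN via torchvision.
# The experiments in this submission use the ALDI code and configs provided in
# SG_aldi_addons / Summaries_Dir; this is not that code.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_fish_detector(num_classes=2):  # background + fish
    # Start from a COCO-pretrained Faster R-CNN and replace the box head so it
    # predicts only the "fish" class. The new head must be fine-tuned on
    # labeled sonar frames before it produces meaningful detections.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

model = build_fish_detector()
model.eval()

# A sonar frame converted to a 3-channel float tensor in [0, 1]; the size and
# score threshold below are placeholders.
frame = torch.rand(3, 512, 512)
with torch.no_grad():
    preds = model([frame])[0]
boxes = preds["boxes"][preds["scores"] > 0.5]  # (N, 4) xyxy boxes, one per detected fish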
In this research, we achieved:
1. Faster R-CNN for Sonar Images
- The applied Faster R-CNN reached > 0.85 average precision (AP) for large fish detection, providing robust results on higher-quality sonar images.
- Integrated Norfair tracking to reduce double-counting of fish across video frames, enabling more accurate population estimates (a minimal tracking sketch follows this list).
2. Small Fish Identification
- Established customized filtering methods for small, often unlabeled fish in noisy acoustic images.
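As a rough illustration of the double-counting point above, the sketch below counts fish by unique Norfair track IDs rather than by per-frame detections; the distance function, threshold, and box-center representation are placeholder assumptions, not the tuned settings from these experiments.

# Illustrative sketch only: count fish by unique Norfair track IDs so a fish
# that appears in many consecutive frames is counted once, not once per frame.
import numpy as np
from norfair import Detection, Tracker

tracker = Tracker(distance_function="euclidean", distance_threshold=30)
seen_ids = set()

def update_fish_count(frame_boxes):
    """frame_boxes: iterable of (x1, y1, x2, y2) detector outputs for one frame."""
    detections = [
        Detection(points=np.array([[(x1 + x2) / 2.0, (y1 + y2) / 2.0]]))  # box center
        for x1, y1, x2, y2 in frame_boxes
    ]
    for tracked in tracker.update(detections=detections):
        seen_ids.add(tracked.id)
    return len(seen_ids)  # running estimate of distinct fish seen so far

Feeding the thresholded detector boxes through a tracker like this, frame by frame, is what turns per-frame detections into a population estimate.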

This data submission includes several sub-directories:
- FryCounting: contains information on how to count small fish (i.e., fry) in the sonar image data
- SG_aldi_addons: contains additions to the ALDI code (SG = Steven Gutstein, primary author), including the trained models used in this experiment (which should match the models obtained by following the training instructions) and code for turning the sonar images into movies (a generic sketch follows this list)
- Summaries_Dir: contains information on how to set up the foundation for these experiments, such as installing all required packages and versions and creating the PyTorch and ALDI environments
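The SG_aldi_addons movie-making code mentioned above is included in the submission itself; the snippet below is only a generic sketch of stitching a directory of sonar frames into a video with OpenCV, with hypothetical paths and frame rate.

# Illustrative sketch only: stitch a directory of sonar frame images into a
# movie with OpenCV. Paths and frame rate are placeholders; the submission's
# own movie-making code is in SG_aldi_addons.
import glob
import cv2

frame_paths = sorted(glob.glob("sonar_frames/*.png"))  # hypothetical directory
height, width = cv2.imread(frame_paths[0]).shape[:2]

writer = cv2.VideoWriter(
    "sonar_clip.mp4",
    cv2.VideoWriter_fourcc(*"mp4v"),
    10,                # frames per second (placeholder)
    (width, height),
)
for path in frame_paths:
    writer.write(cv2.imread(path))
writer.release()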
These experiments follow a two-part structure, as described in the uploaded README file:
Part I: Installing and Using ALDI & Norfair Code
- This is used for tracking and counting fish, and it replicates the linked article, the Align and Distill (ALDI) work by Justin Kay and others
- This part relates to the Summaries_Dir and SG_aldi_addons sub-directories
Part II: Installing and Using Fry Code
- This is used to track and count smaller fish (i.e., fry); a generic counting sketch follows this list
- This relates to the FryCounting sub-directory
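The customized fry filters are documented in the FryCounting sub-directory itself; the sketch below is only an assumed, generic example of counting small bright echoes in a noisy grayscale frame (median denoise, Otsu threshold, connected components, size gate), not the submission's method.

# Illustrative sketch only: count small bright echoes (fry-sized blobs) in a
# noisy grayscale sonar frame. The pipeline and area bounds here are assumptions,
# not the customized filters documented in FryCounting.
import cv2
import numpy as np

def count_fry(gray_frame, min_area=3, max_area=60):
    denoised = cv2.medianBlur(gray_frame, 3)          # suppress speckle noise
    _, binary = cv2.threshold(denoised, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Label connected blobs and keep only fry-sized ones.
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    areas = stats[1:, cv2.CC_STAT_AREA]               # skip the background label
    return int(np.sum((areas >= min_area) & (areas <= max_area)))

# Usage: count_fry(cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE))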
Also included here are links to the downloadable sonar data and the article that was replicated in this study.

Citation Formats

Water Power Technology Office. (2024). Fish Detection AI, sonar image-trained detection, counting, tracking models [data set]. Retrieved from https://mhkdr.openei.org/submissions/604.
Gutstein, Steven, Slater, Katherine, and Scott, Brett. Fish Detection AI, sonar image-trained detection, counting, tracking models. United States: N.p., 25 Aug, 2024. Web. https://mhkdr.openei.org/submissions/604.
Gutstein, Steven, Slater, Katherine, & Scott, Brett. Fish Detection AI, sonar image-trained detection, counting, tracking models. United States. https://mhkdr.openei.org/submissions/604
Gutstein, Steven, Slater, Katherine, and Scott, Brett. 2024. "Fish Detection AI, sonar image-trained detection, counting, tracking models". United States. https://mhkdr.openei.org/submissions/604.
@div{oedi_604,
  title = {Fish Detection AI, sonar image-trained detection, counting, tracking models},
  author = {Gutstein, Steven and Slater, Katherine and Scott, Brett},
  url = {https://mhkdr.openei.org/submissions/604},
  place = {United States},
  year = {2024},
  month = {08}
}

Details

- Data from: Aug 25, 2024
- Last updated: Mar 11, 2025
- Status: Submission in progress
- Organization: Water Power Technology Office
- Contact: Victoria Sabo
- Authors: Steven Gutstein, Katherine Slater, and Brett Scott (all Water Power Technology Office)

DOE Project Details

- Project Name: Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), Water Power Technologies Office (WPTO)
- Project Lead: Samantha Eaves
- Project Number: EERE T 540.210-09
