Marine pollution is a critical issue impacting the global community, with underwater waste posing a particularly daunting challenge. While autonomous detection and collection of underwater waste are highly desirable, both are extremely difficult tasks. This difficulty arises from the intrinsic complexities of the aquatic environment, including variable lighting conditions, reduced visibility, and the complex nature of water currents. This paper focuses on novel approaches for the autonomous detection of underwater waste and proposes incorporating domain knowledge to refine deep-learning-based underwater object detection techniques. More specifically, the domain knowledge is represented by state-space models that describe the motion of the target objects and assist in classifying them. Moreover, optical flow is combined with a k-means clustering algorithm to extract the trajectories of the target objects from videos. These trajectories are subsequently fed into a neural network trained using knowledge distillation enhanced with domain knowledge. For our experiments, a simulator is developed to facilitate the creation of a dataset for developing and testing the proposed architecture. The experimental results demonstrate that incorporating domain knowledge into the neural-network-based object detection approach offers several substantial advantages, including enhanced robustness against noisy and poorly labelled data, support for semi-supervised learning, and consistently higher accuracy than the baseline. Additionally, combining the domain knowledge with a neural network significantly increases the computational speed of object detection compared to using the domain knowledge module alone.
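To make the trajectory-extraction step concrete, the following is a minimal sketch of how optical flow and k-means clustering could be combined to recover a per-frame position of a moving object, whose sequence of positions forms a trajectory. It assumes OpenCV's Farneback dense optical flow and scikit-learn's KMeans; the function name, thresholds, and the heuristic of keeping the cluster with the strongest average motion are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch (assumptions): dense optical flow + k-means over moving pixels;
# the centroid of the most strongly moving cluster is taken as the target
# position in each frame, and the list of centroids is the trajectory.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def extract_trajectory(video_path, n_clusters=2, motion_thresh=1.0):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    trajectory = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense optical flow between consecutive frames (Farneback method).
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)
        ys, xs = np.where(mag > motion_thresh)  # pixels with noticeable motion
        if len(xs) >= n_clusters:
            pts = np.column_stack([xs, ys]).astype(np.float32)
            km = KMeans(n_clusters=n_clusters, n_init=5).fit(pts)
            # Keep the cluster whose pixels move fastest on average.
            strengths = [mag[ys[km.labels_ == k], xs[km.labels_ == k]].mean()
                         for k in range(n_clusters)]
            trajectory.append(tuple(km.cluster_centers_[int(np.argmax(strengths))]))
        prev_gray = gray
    cap.release()
    return trajectory  # list of (x, y) centroids, one per frame with motion
```

In the proposed architecture, a trajectory of this kind would then be passed to the neural network trained with knowledge distillation, with the state-space motion models providing the domain knowledge against which candidate trajectories are assessed.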