Multilevel Inverse Patchmatch Network with Local and Global Refinement for Underwater Stereo Matching

Vision-based underwater autonomous systems play a significant role in marine exploration. Stereo matching is one of the most popular applications for such systems, as it recovers the geometric information of underwater scenes via stereo disparity estimation. While stereo matching in the air has achieved great progress with the development of neural networks, it generalizes poorly to underwater scenarios due to challenging underwater degradation. In this paper, we propose a novel Multilevel Inverse Patchmatch Network (MIPNet) to iteratively model pair-wise correlations under underwater degradation and estimate stereo disparity with both local and global refinements. Specifically, we first utilize an inverse Patchmatch module in a novel multilevel pyramid structure to recover detailed stereo disparity from the input stereo images.
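The excerpt does not spell out the inverse Patchmatch module itself, but the general idea of coarse-to-fine disparity estimation with a Patchmatch-style random candidate search can be sketched as follows. Every function name (build_pyramid, patchmatch_level, multilevel_disparity) and the overall structure are illustrative assumptions rather than the authors' implementation.

```python
# Hypothetical sketch of coarse-to-fine disparity estimation with a
# Patchmatch-style candidate search; names and structure are assumptions,
# not the paper's actual inverse Patchmatch module.
import torch
import torch.nn.functional as F

def build_pyramid(feat, levels=3):
    """Return a list of feature maps ordered from coarsest to finest."""
    pyramid = [feat]
    for _ in range(levels - 1):
        pyramid.append(F.avg_pool2d(pyramid[-1], kernel_size=2))
    return pyramid[::-1]

def matching_cost(left, right, disp):
    """Warp right features by the current disparity and correlate with left."""
    b, c, h, w = left.shape
    xs = torch.arange(w, device=left.device, dtype=left.dtype).view(1, 1, 1, w).expand(b, 1, h, w)
    ys = torch.arange(h, device=left.device, dtype=left.dtype).view(1, 1, h, 1).expand(b, 1, h, w)
    grid_x = (xs - disp) / max(w - 1, 1) * 2 - 1
    grid_y = ys / max(h - 1, 1) * 2 - 1
    grid = torch.cat([grid_x, grid_y], dim=1).permute(0, 2, 3, 1)
    warped = F.grid_sample(right, grid, align_corners=True)
    return (left * warped).sum(dim=1, keepdim=True)  # per-pixel correlation score

def patchmatch_level(left, right, disp, iters=4, radius=1.0):
    """Test random perturbations of the current disparity and keep the best."""
    best_score = matching_cost(left, right, disp)
    for _ in range(iters):
        candidate = disp + (torch.rand_like(disp) * 2 - 1) * radius
        score = matching_cost(left, right, candidate)
        better = score > best_score
        disp = torch.where(better, candidate, disp)
        best_score = torch.where(better, score, best_score)
    return disp

def multilevel_disparity(left_feat, right_feat, levels=3):
    """Refine disparity level by level, from the coarsest pyramid level up."""
    left_pyr = build_pyramid(left_feat, levels)
    right_pyr = build_pyramid(right_feat, levels)
    disp = torch.zeros_like(left_pyr[0][:, :1])
    for lf, rf in zip(left_pyr, right_pyr):
        if disp.shape[-2:] != lf.shape[-2:]:
            # Upsample and rescale: disparity doubles when resolution doubles.
            disp = 2 * F.interpolate(disp, size=lf.shape[-2:],
                                     mode='bilinear', align_corners=True)
        disp = patchmatch_level(lf, rf, disp)
    return disp
```

Starting the search at the coarsest level keeps the random search radius small in pixels while still covering large disparities once the estimate is upsampled to full resolution, which is the usual motivation for a multilevel pyramid structure.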

Secondly, we introduce a powerful Attentional Feature Fusion module to model pair-wise correlations with global context, ensuring high-quality stereo disparity estimation for both in-air and underwater scenarios. We evaluate the proposed method on the popular real-world ETH3D benchmark, and its highly competitive performance against popular baselines demonstrates the effectiveness of the proposed method. Moreover, with its superior performance on our real-world underwater dataset, e.g., outperforming the popular baseline RAFT-Stereo by 27.1%, we show the good generalization ability of our method to underwater scenarios. We finally discuss the potential challenges for underwater stereo matching via our experiments on the impact of water.
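The Attentional Feature Fusion module is only named in this excerpt; a minimal sketch of one plausible design, in which local correlation features are fused with a globally pooled context through a channel-attention gate, is given below. The class name AttentionalFusion and its layer layout are hypothetical assumptions, not the authors' architecture.

```python
# Hypothetical sketch of attention-based fusion of local correlation
# features with global context; an assumption about the general idea,
# not the paper's Attentional Feature Fusion module.
import torch
import torch.nn as nn

class AttentionalFusion(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Local branch: per-pixel channel weighting.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
        )
        # Global branch: image-wide context via global average pooling.
        self.global_ = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
        )

    def forward(self, corr_feat, context_feat):
        """Blend correlation features with context using an attention gate."""
        x = corr_feat + context_feat
        attn = torch.sigmoid(self.local(x) + self.global_(x))  # broadcasts over H, W
        return attn * corr_feat + (1 - attn) * context_feat
```

The sigmoid gate is computed from both branches, so global scene statistics (for example, overall turbidity in an underwater image) can modulate purely local matching evidence before the disparity refinement steps.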
