3D Reconstruction of Underwater Environment Using Acoustic Camera
In recent years, waterfront development, such as construction and reclamation projects for airports, ports, and submarine tunnels, has become increasingly important. At present, such heavy work relies on underwater construction machines operated by divers. However, hazards may prohibit human access, and the limited field of vision caused by turbidity and poor illumination makes underwater operations difficult. To complete tasks such as inspection, removal of hazardous materials, and excavation, a remotely controlled robot equipped with a system for 3D reconstruction of the underwater environment is required, as shown in Fig. 1.
Fig. 1 Example of underwater construction using a remotely controlled underwater crawler-type robot based on dense 3D mapping of the surrounding environment with an acoustic camera.
Recently, the development of forward-looking sonars, also known as acoustic cameras, such as the dual-frequency identification sonar (DIDSON) and the adaptive resolution imaging sonar (ARIS), which generate high-resolution, wide-range images, has facilitated our understanding of underwater situations. This type of sonar is relatively small, can easily be mounted on an underwater robot, and can gather information over a relatively large area quickly. The acoustic camera can also be mounted on the arm of a crawler-type robot, allowing the robot to fulfill complex underwater tasks, such as manipulation, even in turbid water.
In this research, a novel dense 3D mapping paradigm for an acoustic camera in underwater environments is proposed. With this paradigm, a dense 3D map of the underwater environment can be built precisely and robustly, as shown in Mov. 1.
Mov. 1 3D underwater mapping using the acoustic camera mounted on the robot arm.
First, a 3D local map is generated from each viewpoint of the acoustic camera, as shown in Mov. 2. Here, an effective rotation for the probability updates, around the acoustic axis (i.e., the roll rotation of the acoustic camera), is performed at each viewpoint by a rotator mounted on the acoustic camera. Then, the odometry (i.e., the motion of the acoustic camera) is estimated from the transformation matrices between consecutive local maps, without requiring internal sensor data.
Mov. 2 3D local map generation from each viewpoint of the acoustic camera by roll rotation.
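As a rough sketch of the odometry step above, the relative motion between two viewpoints can be obtained by composing the 4×4 homogeneous transforms of consecutive local maps. The function name and the example poses below are hypothetical; in the actual system these transforms come from registering the local maps:

```python
import numpy as np

def relative_motion(T_prev, T_curr):
    """Odometry between two viewpoints, given each local map's pose
    (4x4 homogeneous transform) in a common reference frame."""
    return np.linalg.inv(T_prev) @ T_curr

# Hypothetical example: the camera advances 1 m along x between viewpoints.
T0 = np.eye(4)
T1 = np.eye(4)
T1[0, 3] = 1.0
dT = relative_motion(T0, T1)  # relative transform = pure 1 m translation
```

Chaining such relative transforms yields a dead-reckoned trajectory, which drifts over time; this is what the graph optimization described next corrects.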
Finally, a graph optimization process is performed to realize accurate pose estimation of each viewpoint and, simultaneously, generation of a 3D global map. As shown in Mov. 3, this makes it possible to build a dense 3D map of the underwater environment precisely and robustly.
Mov. 3 Dense 3D global map built by graph optimization process.
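The idea behind graph optimization can be illustrated with a deliberately simplified, one-dimensional pose graph solved by linear least squares. The actual system optimizes 6-DoF poses with a full pose graph SLAM back end, so this is only a toy illustration with made-up measurements:

```python
import numpy as np

# Three poses x0..x2, two odometry edges and one loop-closure-style edge,
# solved as linear least squares with a prior anchoring x0 at the origin.
# Each edge is (i, j, measured displacement x_j - x_i); values are made up.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.9)]

A = np.zeros((len(edges) + 1, 3))
b = np.zeros(len(edges) + 1)
for k, (i, j, z) in enumerate(edges):
    A[k, i], A[k, j], b[k] = -1.0, 1.0, z
A[-1, 0], b[-1] = 1.0, 0.0  # prior: x0 = 0

x, *_ = np.linalg.lstsq(A, b, rcond=None)
# The 0.1 m inconsistency between odometry and the third edge
# is distributed over the trajectory instead of accumulating.
```

In the real 6-DoF setting the error terms are nonlinear in the poses, so the problem is solved iteratively (e.g., Gauss-Newton), but the structure of the cost function is the same.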
Additionally, we propose another novel approach that estimates the missing dimension in 2D acoustic images (i.e., the unknown elevation angle) with a deep neural network for 3D reconstruction, as shown in Mov. 4. The network is trained on simulated images; to mitigate the sim-to-real gap, a neural style transfer method is used to generate a realistic image dataset for training.
Mov. 4 Elevation angle estimation in 2D acoustic images based on a deep neural network for 3D reconstruction.
Forward-looking Sonar Simulator
The difficulty and high cost of acquiring acoustic images in real experiments encourage researchers to consider generating simulated acoustic image datasets. Therefore, we developed a novel simulator to generate realistic acoustic datasets for forward-looking sonars, as shown in Mov. 4 and Fig. 2. We first built a user-friendly acoustic image simulator based on 3D modeling software. Then, CycleGAN is applied to generate realistic acoustic images from the dataset produced by the simulator. [https://github.com/sollynoay/Sonar-simulator-blender]
Fig. 2 Configuration in the simulator which can simulate sound waves in an active sonar system.
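The core of such a forward model can be sketched as projecting scene points into a range-azimuth grid while the elevation axis collapses. The following toy renderer (all parameter values are illustrative) ignores the beam patterns, attenuation, and shadowing that a realistic simulator must model:

```python
import numpy as np

def render_acoustic_image(points, intensities, r_max=5.0, n_r=64, n_th=32,
                          fov=np.deg2rad(30)):
    """Toy forward model: project 3D scene points into a (range, azimuth)
    image, accumulating echo intensity over the collapsed elevation axis."""
    img = np.zeros((n_r, n_th))
    for (x, y, z), s in zip(points, intensities):
        r = np.sqrt(x * x + y * y + z * z)
        th = np.arctan2(y, x)
        if r < r_max and abs(th) < fov / 2:
            i = int(r / r_max * n_r)            # range bin
            j = int((th + fov / 2) / fov * n_th)  # azimuth bin
            img[i, j] += s
    return img

# A single reflector 1 m ahead produces one bright cell.
img = render_acoustic_image(np.array([[1.0, 0.0, 0.0]]), [1.0])
```

Because all elevations within a beam accumulate into the same cell, objects at different heights can produce identical images; this is the ambiguity the elevation-estimation network above is trained to undo.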
ACMarker: Acoustic Camera-Based Fiducial Marker System
ACMarker, shown in Mov. 5, is an acoustic camera-based fiducial marker system designed for underwater environments. Optical camera-based fiducial marker systems are widely used in computer vision and robotics applications such as augmented reality (AR), camera calibration, and robot navigation. In underwater environments, however, the performance of optical cameras is limited by water turbidity and illumination conditions. We propose methods to recognize a simply designed marker and to estimate the relative pose between the acoustic camera and the marker. The proposed system can be applied to various underwater tasks such as object tracking and localization of unmanned underwater vehicles.
The markers can be placed directly on the walls of an underwater structure or on the seabed with a concrete or plaster base. This facilitates the navigation of autonomous underwater vehicles (AUVs), as well as underwater structure inspection. The contributions of the system can be summarized as follows:
• We propose detection and ID identification methods based on simply designed square markers.
• We propose a method to accurately and precisely estimate the 6-DoF relative pose between the acoustic camera and the marker.
• Detection and pose estimation are processed from a single image and work in real time.
Mov. 5 ACMarker system.
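The measurement model underlying the pose estimation can be sketched as follows: each marker corner, transformed by a candidate pose, projects to range and azimuth only (elevation is lost), and a pose solver would minimize the discrepancy between predicted and detected corner measurements. The function and the marker dimensions below are hypothetical:

```python
import numpy as np

def project(points_world, R, t):
    """Acoustic projection assumed here: a 3D point maps to (range, azimuth);
    the elevation angle does not appear in the image."""
    p = (R @ points_world.T).T + t
    r = np.linalg.norm(p, axis=1)
    theta = np.arctan2(p[:, 1], p[:, 0])
    return np.stack([r, theta], axis=1)

# Hypothetical 0.2 m square marker, corners given in the marker frame,
# viewed 1 m in front of the sonar with no rotation.
corners = np.array([[0.1, 0.1, 0.0], [0.1, -0.1, 0.0],
                    [-0.1, -0.1, 0.0], [-0.1, 0.1, 0.0]])
obs = project(corners, np.eye(3), np.array([1.0, 0.0, 0.0]))
```

Since each corner yields only two measurements instead of the three needed for a full 3D fix, several corners (and planarity of the marker) are required to constrain all six degrees of freedom.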
Related Papers
• Yusheng Wang, Yonghoon Ji, Hiroshi Tsuchiya, Hajime Asama, and Atsushi Yamashita, "Learning Pseudo Front Depth for 2D Forward-Looking Sonar-based Multi-view Stereo," Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022), October 2022.
• Yusheng Wang, Yonghoon Ji, Dingyu Liu, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Simulator-aided Edge-based Acoustic Camera Pose Estimation," OCEANS 2022 Chennai, February 2022.
• Dingyu Liu, Yusheng Wang, Yonghoon Ji, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Development of Image Simulator for Forward-looking Sonar Using 3D Rendering," Proceedings of the SPIE 11794, 15th International Conference on Quality Control by Artificial Vision (QCAV2021), Vol. 11794, pp. 117940H, Tokushima, Japan, May 2021. [doi:10.1117/12.2590004]
• Yusheng Wang, Yonghoon Ji, Dingyu Liu, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Elevation Angle Estimation in 2D Acoustic Images Using Pseudo Front View," IEEE Robotics and Automation Letters, Vol. 6, No. 2, pp. 1535-1542, April 2021. (Impact Factor 3.6)[doi:10.1109/LRA.2021.3058911]
• Dingyu Liu, Yusheng Wang, Yonghoon Ji, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "CycleGAN-based Realistic Image Dataset Generation for Forward-looking Sonar," Advanced Robotics, Vol. 35, 2021. (Impact Factor 1.5)[doi:10.1080/01691864.2021.1873845]
• Yusheng Wang, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Acoustic Camera-based Pose Graph SLAM for Dense 3-D Mapping in Underwater Environments," IEEE Journal of Oceanic Engineering, Vol. 46, No. 3, pp. 829-847, July 2021. (Impact Factor 3.005) [doi: 10.1109/JOE.2020.3033036]
• Yusheng Wang, Yonghoon Ji, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "ACMarker: Acoustic Camera-based Fiducial Marker System in Underwater Environment," IEEE Robotics and Automation Letters, Vol. 5, No. 4, pp. 5018-5025, October 2020. (Impact Factor 3.6)(SICE International Young Authors Award for IROS2020 (SIYA-IROS2020) (Yusheng Wang))[doi:10.1109/LRA.2020.3005375]
• Yusheng Wang, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Planar AnP: A Solution to Acoustic-n-Point Problem on Planar Target," Global OCEANS 2020, Singapore, October 2020. [doi:10.1109/IEEECONF38699.2020.9389267]
• Yusheng Wang, Yonghoon Ji, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Development of an Artificial Marker System for Underwater Environments Based on an Acoustic Camera" (in Japanese), Proceedings of the JSME Robotics and Mechatronics Conference 2020 (ROBOMECH2020), Kanazawa, Japan, May 2020.
• Yusheng Wang, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, and Hajime Asama, "Rotation Estimation of Acoustic Camera Based on Illuminated Area in Acoustic Image," Proceedings of the 12th IFAC Conference on Control Applications in Marine Systems, Robotics, and Vehicles (CAMS2019), pp. 217-222, Daejeon, Korea, September 2019.
• Yusheng Wang, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "Three-dimensional Underwater Environment Reconstruction with Graph Optimization Using Acoustic Camera," Proceedings of the 2019 IEEE/SICE International Symposium on System Integration (SII2019), pp. 28-33, Paris, France, January 2019. [Link]
• Yusheng Wang, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "3D Occupancy Mapping Framework Based on Acoustic Camera in Underwater Environment," Proceedings of the 12th IFAC Symposium on Robot Control (SYROCO2018), pp. 1-7, Budapest, Hungary, August 2018. (IFAC-PapersOnLine, Vol. 51, No. 22, pp. 324-339, August 2018) [Link]
• Ngoc Trung Mai, Yonghoon Ji, Hanwool Woo, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "Acoustic Image Simulator Based on Active Sonar Model in Underwater Environment," Proceedings of the 15th International Conference on Ubiquitous Robots (UR2018), pp. 781-786, Hawaii, USA, June 2018. [Link]
• Ngoc Trung Mai, Hanwool Woo, Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "3D Reconstruction of Line Features Using Multi-view Acoustic Images in Underwater Environment," Proceedings of 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI2017), pp. 312-317, Daegu, Korea, November 2017. [Link]
• Ngoc Trung Mai, Hanwool Woo, Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "3-D Reconstruction of Underwater Object Based on Extended Kalman Filter by Using Acoustic Camera Images," Preprints of the 20th World Congress of the International Federation of Automatic Control, pp. 1066-1072, Toulouse, France, July 2017. (IFAC-PapersOnLine, Vol. 50, No. 1, pp. 1043-1049, July 2017) [Link]
• Yonghoon Ji, Seungchul Kwak, Atsushi Yamashita, and Hajime Asama, "Acoustic Camera-based 3D Measurement of Underwater Objects through Automated Extraction and Association of Feature Point," Proceedings of the 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI2016), pp. 224-230, Baden-Baden, Germany, September 2016. [Link]
• Ngoc Trung Mai, Hanwool Woo, Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, and Hajime Asama, "Construction of a 3D Measurement Method for Underwater Objects Based on an Extended Kalman Filter Using Acoustic Camera Images" (in Japanese), Proceedings of the 34th Annual Conference of the Robotics Society of Japan (RSJ2016), RSJ2016AC1C3-06, pp. 1-4, Yamagata, Japan, September 2016.
• Seungchul Kwak, Yonghoon Ji, Atsushi Yamashita, and Hajime Asama, "3-D Reconstruction of Underwater Objects Using Arbitrary Acoustic Views," Proceedings of the 2016 11th France-Japan congress on Mechatronics 9th Europe-Asia congress on Mechatronics 17th International Conference on Research and Education in Mechatronics (MECHATRONICS-REM2016), pp. 74-79, Compiegne, France, June 2016. [Link]
• Seungchul Kwak, Yonghoon Ji, Atsushi Yamashita, and Hajime Asama, "3D Shape Reconstruction of Underwater Objects Using Multi-view Acoustic Camera Images" (in Korean), Proceedings of the 31st Annual Conference of the Institute of Control, Robotics and Systems (ICROS 2016), pp. 1-2, Seoul, Korea, March 2016.
• Seungchul Kwak, Yonghoon Ji, Atsushi Yamashita, and Hajime Asama, "3-D Reconstruction of Underwater Object: Analytical System for Extracting Feature Points Using Two Different Acoustic Views," Proceedings of the 2015 JSME/RMD International Conference on Advanced Mechatronics (ICAM2015), pp. 197-198, Tokyo, Japan, December 2015. [Link]
• Seungchul Kwak, Yonghoon Ji, Atsushi Yamashita, and Hajime Asama, "3D Measurement of Feature Points of Underwater Objects Using Acoustic Camera Images from Two Viewpoints" (in Japanese), Proceedings of the 33rd Annual Conference of the Robotics Society of Japan (RSJ2015), pp. 1-4, Tokyo, Japan, September 2015.
• Seungchul Kwak, Yonghoon Ji, Atsushi Yamashita, and Hajime Asama, "Development of Acoustic Camera-Imaging Simulator Based on Novel Model," Proceedings of the 2015 IEEE International Conference on Environment and Electrical Engineering (EEEIC2015), pp. 1719-1724, Rome, Italy, June 2015. [Link]