The use of robotics in harsh environments, such as nuclear decommissioning, has increased in recent years. Sites such as the Fukushima Daiichi plant, damaged in the 2011 accident, and the Sellafield legacy ponds highlight the need for robotic systems that can be deployed in hazardous environments unsafe for human workers. To characterise these environments, it is important to develop robust and accurate localization systems that can be combined with mapping techniques to create 3D reconstructions of the unknown environment. This paper describes the development and experimental verification of a localization system for an underwater robot, which enabled the collection of sonar data to create 3D images of submerged simulated fuel debris. The system was demonstrated at the Naraha test facility, Fukushima prefecture, Japan. Using a camera with a bird's-eye view of the simulated primary containment vessel, the 3D position and attitude of the robot were obtained from coloured LED markers (active markers) on the robot, landmarks on the test-rig (passive markers), and a depth sensor on the robot. A 3D image was successfully reconstructed in real time using a Robot Operating System (ROS) node.
Keywords: 3D reconstruction; ROV; localization; mapping; nuclear characterization; robotics; submersible; underwater; vision.