ACRS 2025
Conference Management System

Stereo Image-based Relocalization for Robust Visual Odometry
Yusuke Eshima(a*), Masafumi Nakagawa(a)

a) Shibaura Institute of Technology, Japan
ah20034[at]shibaura-it.ac.jp
b) Tokyo University of Marine Science and Technology, Japan


Abstract

Mobile mapping systems (MMS) and unmanned aerial vehicles (UAVs) are widely used to collect 3D data quickly and safely for inspecting infrastructure such as bridges, dams, roads, and railroads. A key technical challenge is that self-position estimation by visual odometry becomes unreliable when images are blurred by camera movement and rotation. We therefore focused on visual odometry issues for autonomous mobile robots such as indoor flying UAVs. Conventional research on seamless indoor-outdoor UAV navigation has focused on visual simultaneous localization and mapping (Visual SLAM) integrated with the Robot Operating System (ROS), as well as on 2D modeling and orthoimage generation of complex structures using UAVs equipped with OpenREALM. Studies using ROS-based frameworks have typically compared sensing modalities, including LiDAR, monocular RGB cameras, and stereo camera systems. In our previous work, we developed flight control algorithms designed specifically for UAV-based infrastructure inspection. We also proposed a method for improving the stability of visual odometry by combining multi-directional inertial measurements with stereo imagery, and introduced a positioning approach that transitions seamlessly between visual odometry and RTK-GNSS modes, ensuring continuous and reliable localization in both GNSS-available and GNSS-denied environments. However, motion blur during camera movement and rotation still makes self-position estimation difficult. We therefore propose a methodology that detects visual odometry errors, directs the user back to the position where odometry can be restarted, and reinitializes visual odometry using image matching over a sequence of images.
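The two steps at the end of the abstract — detecting an odometry failure and reinitializing against a previously seen image — can be illustrated with a minimal NumPy sketch. This is an assumed, simplified stand-in, not the authors' implementation: it uses variance of the Laplacian as a generic motion-blur score and zero-mean normalized cross-correlation against stored keyframes as a generic image-matching criterion; the function names and thresholds are hypothetical.

```python
import numpy as np

def laplacian_variance(img):
    """Sharpness score for a grayscale image (2D float array).
    A low value suggests motion blur, i.e. a frame on which
    visual odometry is likely to fail. (Illustrative heuristic.)"""
    # 5-point discrete Laplacian over the interior pixels.
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def best_keyframe(query, keyframes):
    """Find the stored keyframe most similar to the query image using
    zero-mean normalized cross-correlation. Odometry could then be
    reinitialized at the pose associated with that keyframe."""
    q = (query - query.mean()) / (query.std() + 1e-9)
    scores = []
    for kf in keyframes:
        k = (kf - kf.mean()) / (kf.std() + 1e-9)
        scores.append(float((q * k).mean()))
    return int(np.argmax(scores)), scores
```

In a real system the blur threshold would be tuned per camera, and the correlation step would be replaced by feature-based matching (e.g. keypoint descriptors) over the image sequence, which is robust to viewpoint change in a way raw correlation is not.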

Keywords: visual odometry, image matching, motion blur, odometry reinitialization

Topic: Topic D: Geospatial Data Integration

Corresponding author: Yusuke Eshima

