Bibliographic Information
Sensor fusion of vision and laser range finder for mobile robot localization
Title / Author: Sensor fusion of vision and laser range finder for mobile robot localization / Ahn, Sang-Tae.
Author: Ahn, Sang-Tae (안상태)
Publication: [Daejeon : Korea Advanced Institute of Science and Technology (KAIST), 2004].

Holdings Information

Registration number: 8014941
Location / Call number: Academic Cultural Complex, Preservation Stacks, MME 04043
Status: Available / Loanable
Due date:
Abstract

Although a single sensor is an inexpensive way for a mobile robot to perceive its environment, multi-sensor systems have been studied in recent years to obtain richer environmental information. Combining different types of sensors offers complementarity, and cooperation among multiple sensors is more precise and more robust, so a mobile robot can achieve improved performance through higher-level environmental information. Owing to these advantages, various sensor combinations have been studied so far: an ultrasonic sensor array with a CCD camera; a laser range finder with a stereo camera system; a trinocular vision system; a laser structured-light system with an ultrasonic sensor array; and so on. This thesis investigates map-based localization for a mobile robot, using a camera and a laser structured-light system to acquire environmental data that are then fused for localization. Vertical lines, which are meaningful features in indoor environments, are extracted from the camera images, and geometric landmarks composed of line segments are obtained with the laser structured-light sensor. Although either sensor alone can localize the mobile robot, the robot occasionally gets lost. For successful and reliable localization in general cases, the data obtained by the two sensors are fused. The proposed sensor fusion algorithms are based on the weighted average method, with particular attention to illumination robustness. Illumination conditions affect environment recognition with a camera: under poor illumination it is difficult to extract appropriate landmarks from a camera image, so the extracted landmarks have poor reliability and, in the worst case, can cause the mobile robot to get lost.
To overcome such situations, an illumination-robust sensor system and operation algorithms are proposed in this research. First, the error characteristics of each sensor are analyzed and the sensor reliability is modeled. Second, the localization method using each sensor is described and basic experiments are performed. Third, fusion algorithms based on the weighted average method and the sensor reliability model are proposed. Finally, to evaluate the proposed algorithms, experiments are performed in indoor navigation environments under various illumination conditions. The experimental results are presented and discussed to show the advantage of multi-sensor fusion.
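The thesis's exact reliability model is not reproduced in this record, but the weighted average method it names can be sketched as follows. This is a minimal illustration, assuming each sensor's reliability is expressed as a per-component variance and that weights are taken as inverse variances; the function name `fuse_pose` and the variance values are hypothetical. Under poor illumination, the camera's variances would be inflated, shrinking its weight in the fused estimate.

```python
def fuse_pose(pose_cam, var_cam, pose_laser, var_laser):
    """Fuse two (x, y, theta) pose estimates by inverse-variance
    weighted averaging: a lower variance means a more reliable
    sensor and therefore a larger weight for its estimate.
    (Angle wrap-around is ignored for simplicity.)"""
    fused = []
    for c, vc, l, vl in zip(pose_cam, var_cam, pose_laser, var_laser):
        w_c = 1.0 / vc          # camera reliability weight
        w_l = 1.0 / vl          # laser reliability weight
        fused.append((w_c * c + w_l * l) / (w_c + w_l))
    return tuple(fused)

# Example: the laser is more reliable in position, the camera in heading,
# so the fused pose leans toward the laser's x, y and the camera's theta.
pose = fuse_pose((1.02, 2.05, 0.10), (0.04, 0.04, 0.01),
                 (1.00, 2.00, 0.12), (0.01, 0.01, 0.02))
```

With the variances above, the fused x and y lie close to the laser estimate (weights 100 vs. 25), while the fused heading lies closer to the camera estimate (weights 100 vs. 50).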

Other Bibliographic Information
Call number: MME 04043
Physical description: vi, 94 p. : illustrations ; 26 cm
Language: Korean
General notes: Author's name in English: Sang-Tae Ahn
Advisor's name in Korean: 조형석
Co-advisor's name in Korean: 권대갑
Advisor's name in English: Hyung-Suck Cho
Co-advisor's name in English: Dae-Gab Gweon
Thesis: Thesis (Master's) - Korea Advanced Institute of Science and Technology : Department of Mechanical Engineering
Bibliography note: References: p. 91-94
Subjects: Sensor fusion
Mobile robot localization
Laser vision sensor
Environmental features