For autonomous navigation of mobile robots in the real 3D world, the ability to recognize the 3D environment is essential. In this thesis, an on-line 3D map building method for autonomous mobile robots is proposed. To obtain range data on the environment, a sensor system based on optical triangulation is used, consisting of a structured laser and a CCD camera. The laser is projected as a horizontal stripe onto the scene. The sensor system is mounted on the mobile robot with a pan-tilt and elevation mechanism. As the system scans, laser stripe images of the environment are acquired, and the planes composing the environment are updated through several image processing steps: from the laser stripe in each acquired image, the center point of each column is found, line segments are formed by labeling the center points, and the planes of the environment are then updated. These steps are performed on-line during the scanning phase. Through the proposed method, a 3D map of the environment is built effectively.
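The per-column center-point extraction and labeling steps described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function names, the intensity threshold, and the column-to-column jump tolerance used for labeling are all assumptions introduced here.

```python
# Hypothetical sketch of laser-stripe center extraction and labeling.
# `image` is a grayscale image as a list of rows (row 0 at the top).

def stripe_centers(image, threshold=128):
    """For each column, return the intensity-weighted centroid row of
    pixels brighter than `threshold`, or None where no stripe is seen.
    (Threshold value is an illustrative assumption.)"""
    rows, cols = len(image), len(image[0])
    centers = []
    for c in range(cols):
        total = weighted = 0.0
        for r in range(rows):
            v = image[r][c]
            if v >= threshold:
                total += v
                weighted += r * v
        centers.append(weighted / total if total > 0 else None)
    return centers

def label_segments(centers, max_jump=1.5):
    """Group adjacent columns whose center rows differ by at most
    `max_jump` pixels into line segments (lists of (col, row)).
    A gap or a large jump starts a new segment."""
    segments, current = [], []
    for c, row in enumerate(centers):
        if row is None:
            if current:
                segments.append(current)
                current = []
            continue
        if current and abs(row - current[-1][1]) > max_jump:
            segments.append(current)
            current = []
        current.append((c, row))
    if current:
        segments.append(current)
    return segments
```

In the on-line scheme of the thesis, segments extracted this way would then be merged across scans to update the planes composing the environment.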
In many cases, a mobile robot cannot obtain a complete 3D map of the environment from a single view of the visual sensing system, because occlusions can occur. Changing the position or orientation of the sensor system is necessary to resolve such occlusions. In this thesis, a sensor planning method for resolving occlusions is proposed. The method follows these steps: 1) generating candidates for the next position and orientation of the sensor system from the occlusions, 2) checking whether the sensor system can be placed at each candidate, 3) evaluating the candidates, and 4) selecting the best one as the next viewpoint of the sensor system. For the evaluation, the robot's travel distance and the area of resolved occlusions are considered. This thesis presents 2D simulations and experiments of the sensor planning method, in which the whole 2D environment is acquired without occlusion. Future work is needed to extend the sensor planning to 3D.
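The selection loop in steps 2) to 4) can be sketched as below. This is a simplified illustration under stated assumptions: candidate generation and the placeability check are stubbed out, and the linear trade-off between resolved occlusion area and travel distance (with weight `w`) is a hypothetical scoring rule, not the exact evaluation function of the thesis.

```python
# Hypothetical next-view selection: filter candidates by placeability,
# then score each by resolved occlusion area minus a travel-distance
# penalty, and return the best pose.

def select_next_view(candidates, is_placeable, w=1.0):
    """candidates: list of (pose, move_distance, solved_area) tuples.
    Keep only poses where the sensor system can actually be placed,
    then pick the pose maximizing solved_area - w * move_distance.
    Returns None if no candidate is feasible."""
    feasible = [c for c in candidates if is_placeable(c[0])]
    if not feasible:
        return None
    best = max(feasible, key=lambda c: c[2] - w * c[1])
    return best[0]
```

For example, a candidate that resolves a large occluded area but is blocked by an obstacle is discarded by the placeability check, and among the remaining poses a nearby viewpoint can win over a distant one even if it resolves slightly less area.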