Navigating Mobile Robots: Sensors and Techniques

CD-ROM Edition


J. Borenstein, H. R. Everett, and L. Feng

Contributing authors: S. W. Lee and R. H. Byrne
Publisher: A. K. Peters, Ltd., Wellesley, MA
Ph.: +1-617-235-2210
Fax: +1-617-235-2404


This CD-ROM is the electronic version of a comprehensive survey of the state of the art in the sensors, systems, methods, and technologies that mobile robots use to determine their position in the environment. The many potential "solutions" are roughly categorized into two groups: relative and absolute position measurements. The first includes odometry and inertial navigation; the second comprises active beacons, artificial and natural landmark recognition, and model matching. The authors compare and analyze these different methods based on technical publications and on commercial product and patent information. The comparison centers on the following criteria: accuracy of position and orientation measurements, equipment needed, cost, sampling rate, effective range, computational power required, processing needs, and other special features. No robotics hobbyist or professional should be without this extraordinarily comprehensive look at robot positioning.

This CD-ROM also includes seven 2- to 4-minute video clips that can be played back on Macs, PCs, and possibly UNIX workstations. Also included are 17 background papers and detailed author information. The CD-ROM contains a total of 441 MB of data (of which 36 MB are non-video information).

The CD-ROM is available for $39.95 (price subject to change without further notice) from the publisher, A. K. Peters, Ltd., Wellesley, MA.

NOTE: As of 1999 this CD-ROM is "out of print." However, you can download the CD-ROM publication in its entirety (without the video clips and under the title "Where am I?" Report).





(1) Dr. Johann Borenstein
The University of Michigan
Department of Mechanical
Engineering and Applied Mechanics
Mobile Robotics Laboratory
1101 Beal Avenue
Ann Arbor, MI 48109
Ph.: (313) 763-1560
Fax: (313) 944-1113


(2) Commander H. R. Everett
Naval Command, Control, and Ocean Surveillance Center
RDT&E Division 5303
271 Catalina Boulevard
San Diego CA 92152-5001
Ph.: (619) 553-3672
Fax: (619) 553-6188
Email: Everett@NOSC.MIL


(3) Dr. Liqiang Feng
The University of Michigan
Department of Mechanical
Engineering and Applied Mechanics
Mobile Robotics Laboratory
1101 Beal Avenue
Ann Arbor, MI 48109
Ph.: (313) 936-9362
Fax: (313) 763-1260



This research was sponsored by the Office of Technology Development, U.S. Department of Energy, under contract DE-FG02-86NE37969 with the University of Michigan.

The authors wish to thank the Department of Energy (DOE), and especially Dr. Linton W. Yarbrough, DOE Program Manager, Dr. William R. Hamel, D&D Technical Coordinator, and Dr. Clyde Ward, Landfill Operations Technical Coordinator, for their technical and financial support of the research, which forms the basis of this work.

Parts of the text were adapted from Sensors for Mobile Robots: Theory and Application by H. R. Everett, published by A. K. Peters, Ltd., Wellesley, MA.

Chapter 9 was contributed entirely by Sang W. Lee from the Artificial Intelligence Lab at the University of Michigan.

Significant portions of Chapter 3 were adapted from "Global Positioning System Receiver Evaluation Results" by Raymond H. Byrne, originally published as Sandia Report SAND93-0827, Sandia National Laboratories, 1993.


The authors further wish to thank Professors David K. Wehe and Yoram Koren at the University of Michigan for their support, and Mr. Harry Alter (DOE) who has befriended many of the graduate students and sired several of our robots. Thanks are also due to Todd Ashley Everett for making most of the line-art drawings.



Leonard and Durrant-Whyte [1991] summarized the general problem of mobile robot navigation by three questions: "Where am I?", "Where am I going?", and "How should I get there?" This book surveys the state of the art in the sensors, systems, methods, and technologies that aim at answering the first question, that is, determining the robot's position in its environment.

Perhaps the most important result from surveying the vast body of literature on mobile robot positioning is that to date there is no truly elegant solution for the problem. The many partial solutions can roughly be categorized into two groups: relative and absolute position measurements. Because of the lack of a single, generally good method, developers of automated guided vehicles (AGVs) and mobile robots usually combine two methods, one from each category. The two categories can be further divided into the following subgroups.

Relative Position Measurements

a. Odometry: This method uses encoders to measure wheel rotation and/or steering orientation. Odometry has the advantage that it is totally self-contained, and it is always capable of providing the vehicle with an estimate of its position. The disadvantage of odometry is that the position error grows without bound unless an independent reference is used periodically to reduce the error [Cox, 1991].
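
As an illustration (not taken from the book), the standard differential-drive odometry update can be sketched in a few lines. The per-sample wheel displacements would come from the encoders discussed in Chapter 1; the wheelbase is a vehicle parameter.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, wheelbase):
    """Advance a differential-drive pose estimate by one encoder sample.

    d_left, d_right: distances traveled by each wheel (m), derived from
    encoder ticks; wheelbase: distance between the two drive wheels (m).
    """
    d_center = (d_left + d_right) / 2.0        # translation of the midpoint
    d_theta = (d_right - d_left) / wheelbase   # change in heading
    # First-order (small-step) approximation used in most odometry loops
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta

# Example: equal wheel displacements keep the heading and advance x
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = odometry_update(*pose, d_left=0.01, d_right=0.01, wheelbase=0.5)
```

Any error in the measured wheel displacements (wheel slippage, unequal wheel diameters) is accumulated by this update at every step, which is exactly why the position error grows without bound.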

b. Inertial Navigation: This method uses gyroscopes and sometimes accelerometers to measure rate of rotation and acceleration. Measurements are integrated once (or twice) to yield position. Inertial navigation systems also have the advantage that they are self-contained. On the downside, inertial sensor data drifts with time because of the need to integrate rate data to yield position; any small constant error increases without bound after integration. Inertial sensors are thus unsuitable for accurate positioning over an extended period of time. Another problem with inertial navigation is the high equipment cost. For example, highly accurate gyros of the kind used in airplanes are prohibitively expensive. Very recently, fiber-optic gyros (a type of optical gyroscope), which are said to be very accurate, have fallen dramatically in price and have become a very attractive solution for mobile robot navigation.
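
The unbounded growth of integrated errors is easy to demonstrate. The sketch below (an illustration, not from the book) double-integrates a small constant accelerometer bias and shows that the resulting position error grows quadratically with time.

```python
# Double-integrate a constant accelerometer bias to show how the
# resulting position error grows without bound (quadratically in time).
def drift_after(bias, dt, steps):
    v = 0.0  # integrated velocity error (m/s)
    p = 0.0  # integrated position error (m)
    for _ in range(steps):
        v += bias * dt
        p += v * dt
    return p

# A modest 0.01 m/s^2 bias after 60 s of integration at 100 Hz
# accumulates roughly 0.5 * bias * t^2 = 18 m of position error:
error = drift_after(bias=0.01, dt=0.01, steps=6000)
```

After ten minutes the same bias would produce about 1.8 km of error, which is why inertial sensing alone is unsuitable for extended operation.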

Absolute Position Measurements

c. Active Beacons: This method computes the absolute position of the robot by measuring the directions of incidence of three or more actively transmitted beacons. The transmitters, usually operating at light or radio frequencies, must be located at known sites in the environment.
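
A minimal sketch of such a position fix, under the simplifying assumption that the robot's heading is known so that each measured bearing is an absolute direction (with unknown heading, the three-point triangulation methods of Chapter 6 apply): each bearing constrains the robot to a line through the beacon, and the lines are intersected in the least-squares sense.

```python
import math

def position_from_bearings(beacons, bearings):
    """Least-squares position fix from absolute bearings to known beacons.

    beacons:  list of (x, y) transmitter sites
    bearings: global-frame angles (rad) from the robot to each beacon
    Each measurement constrains the robot to a line through the beacon:
        sin(t)*x - cos(t)*y = sin(t)*bx - cos(t)*by
    """
    # Accumulate the 2x2 normal equations (A^T A) p = A^T c
    a11 = a12 = a22 = r1 = r2 = 0.0
    for (bx, by), t in zip(beacons, bearings):
        s, c = math.sin(t), math.cos(t)
        rhs = s * bx - c * by
        a11 += s * s
        a12 += -s * c
        a22 += c * c
        r1 += s * rhs
        r2 += -c * rhs
    det = a11 * a22 - a12 * a12
    return ((a22 * r1 - a12 * r2) / det,
            (a11 * r2 - a12 * r1) / det)

# A robot at (2, 1) measures bearings to three beacons; the fix
# recovers its position:
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (2.0, 1.0)
bearings = [math.atan2(by - true_pos[1], bx - true_pos[0])
            for bx, by in beacons]
est = position_from_bearings(beacons, bearings)
```

With more than three beacons the same least-squares formulation simply averages out measurement noise, which is why redundant beacons improve accuracy.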

d. Artificial Landmark Recognition: In this method distinctive artificial landmarks are placed at known locations in the environment. The advantage of artificial landmarks is that they can be designed for optimal detectability even under adverse environmental conditions. As with active beacons, three or more landmarks must be "in view" to allow position estimation. Landmark positioning has the advantage that the position errors are bounded, but detection of external landmarks and real-time position fixing may not always be possible. Unlike the usually point-shaped beacons, artificial landmarks may be defined as a set of features, e.g., a shape or an area. Additional information, for example distance, can be derived from measuring the geometric properties of the landmark, but this approach is computationally intensive and not very accurate.
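
When distances to the landmarks are measured (as in the ultrasonic trilateration systems of Chapter 6), the position fix reduces to intersecting circles. A simple sketch, assuming exact ranges to three known, non-collinear landmarks: subtracting the first circle equation from the other two yields a linear system in (x, y).

```python
import math

def trilaterate(landmarks, distances):
    """Position from measured ranges to three known landmarks.

    Subtracting the first circle equation from the other two turns the
    quadratic intersection problem into a linear 2x2 system in (x, y).
    """
    (x0, y0), (x1, y1), (x2, y2) = landmarks
    r0, r1, r2 = distances
    # 2(xi-x0)x + 2(yi-y0)y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A robot at (3, 2) measures its distance to three landmarks:
landmarks = [(0.0, 0.0), (8.0, 0.0), (0.0, 6.0)]
true_pos = (3.0, 2.0)
ranges = [math.hypot(x - true_pos[0], y - true_pos[1])
          for x, y in landmarks]
est = trilaterate(landmarks, ranges)
```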

e. Natural Landmark Recognition: Here the landmarks are distinctive features in the environment. There is no need to prepare the environment, but the environment must be known in advance. The reliability of this method is not as high as with artificial landmarks.

f. Model Matching: In this method information acquired from the robot's onboard sensors is compared to a map or world model of the environment. If features from the sensor-based map and the world model map match, then the vehicle's absolute location can be estimated. Map-based positioning often includes improving global maps based on the new sensory observations in a dynamic environment and integrating local maps into the global map to cover previously unexplored areas. The maps used in navigation include two major types: geometric maps and topological maps. Geometric maps represent the world in a global coordinate system, while topological maps represent the world as a network of nodes and arcs.
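
As one concrete instance of this idea (a sketch, not a method from the book): once sensed features have been matched to map features, the robot's pose is the rigid transform that best aligns the two point sets, which in 2-D has a closed-form least-squares solution. Finding the correct feature correspondences is assumed solved here, and is the hard part of map matching in practice.

```python
import math

def fit_pose(sensed, mapped):
    """Least-squares rigid transform (rotation + translation) mapping
    sensed features (robot frame) onto their matched map features
    (world frame). The returned transform is the robot pose."""
    n = len(sensed)
    sx = sum(p[0] for p in sensed) / n   # centroids of both point sets
    sy = sum(p[1] for p in sensed) / n
    mx = sum(p[0] for p in mapped) / n
    my = sum(p[1] for p in mapped) / n
    sc = ss = 0.0  # cross-covariance accumulators for the rotation
    for (ax, ay), (bx, by) in zip(sensed, mapped):
        ax, ay, bx, by = ax - sx, ay - sy, bx - mx, by - my
        sc += ax * bx + ay * by
        ss += ax * by - ay * bx
    theta = math.atan2(ss, sc)           # robot heading in world frame
    tx = mx - (sx * math.cos(theta) - sy * math.sin(theta))
    ty = my - (sx * math.sin(theta) + sy * math.cos(theta))
    return tx, ty, theta                 # robot pose (x, y, heading)

# Map features at known world positions, re-expressed in the frame of
# a robot at (1, 2) with 90-degree heading; the pose is then recovered:
world = [(3.0, 1.0), (5.0, 4.0), (2.0, 6.0)]
tx0, ty0, th0 = 1.0, 2.0, math.pi / 2
sensed = []
for wx, wy in world:
    dx, dy = wx - tx0, wy - ty0
    sensed.append((dx * math.cos(th0) + dy * math.sin(th0),
                   -dx * math.sin(th0) + dy * math.cos(th0)))
est = fit_pose(sensed, world)
```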

This book presents and discusses the state of the art in each of the above six categories. The material is organized in two parts: Part I deals with the sensors used in mobile robot positioning, and Part II discusses the methods and techniques that make use of these sensors.

Mobile robot navigation is a very diverse area, and a useful comparison of different approaches is difficult because of the lack of commonly accepted test standards and procedures. The research platforms used differ greatly, and so do the key assumptions used in different approaches. Further difficulty arises from the fact that different systems are at different stages in their development. For example, one system may be commercially available, while another system, perhaps with better performance, has been tested only under a limited set of laboratory conditions. For these reasons we generally refrain from comparing or even judging the performance of different systems or techniques. Furthermore, we have not tested most of the systems and techniques ourselves, so the results and specifications given in this book are quoted from the respective research papers or product specification sheets.

Because of the above challenges we have defined the purpose of this book to be a survey of the expanding field of mobile robot positioning. It took well over 1.5 man-years to gather and compile the material for this book; we hope this work will help the reader to gain greater understanding in much less time.

Table of Contents

Introduction 10
Part I Sensors for Mobile Robot Positioning
Chapter 1 Sensors for Dead Reckoning 13
1.1 Optical Encoders 13
1.1.1 Incremental Optical Encoders 14
1.1.2 Absolute Optical Encoders 16
1.2 Doppler Sensors 17
1.2.1 Micro-Trak Trak-Star Ultrasonic Speed Sensor 18
1.2.2 Other Doppler-Effect Systems 19
1.3 Typical Mobility Configurations 19
1.3.1 Differential Drive 19
1.3.2 Tricycle Drive 21
1.3.3 Ackerman Steering 21
1.3.4 Synchro Drive 23
1.3.5 Omnidirectional Drive 25
1.3.6 Multi-Degree-of-Freedom Vehicles 26
1.3.7 MDOF Vehicle with Compliant Linkage 27
1.3.8 Tracked Vehicles 28
Chapter 2 Heading Sensors 30
2.1 Mechanical Gyroscopes 30
2.1.1 Space-Stable Gyroscopes 31
2.1.2 Gyrocompasses 32
2.1.3 Commercially Available Mechanical Gyroscopes 32
Futaba Model Helicopter Gyro 33
Gyration, Inc. 33
2.2 Piezoelectric Gyroscopes 33
2.3 Optical Gyroscopes 34
2.3.1 Active Ring Laser Gyros 36
2.3.2 Passive Ring Resonator Gyros 38
2.3.3 Open-Loop Interferometric Fiber Optic Gyros 39
2.3.4 Closed-Loop Interferometric Fiber Optic Gyros 42
2.3.5 Resonant Fiber Optic Gyros 42
2.3.6 Commercially Available Optical Gyroscopes 43
The Andrew "Autogyro" 43
Hitachi Cable Ltd. OFG-3 44
2.4 Geomagnetic Sensors 45
2.4.1 Mechanical Magnetic Compasses 46
2.4.2 Fluxgate Compasses 47
Zemco Fluxgate Compasses 52
Watson Gyrocompass 55
KVH Fluxgate Compasses 56
2.4.3 Hall-Effect Compasses 57
2.4.4 Magnetoresistive Compasses 59
Philips AMR Compass 59
2.4.5 Magnetoelastic Compasses 60
Chapter 3 Ground-Based RF-Beacons and GPS 65
3.1 Ground-Based RF Systems 65
3.1.1 Loran 65
3.1.2 Kaman Sciences Radio Frequency Navigation Grid 66
3.1.3 Precision Location Tracking and Telemetry System 67
3.1.4 Motorola Mini-Ranger Falcon 68
3.1.5 Harris Infogeometric System 69
3.2 Overview of Global Positioning Systems (GPSs) 70
3.3 Evaluation of Five GPS Receivers by Byrne [1993] 78
3.3.1 Project Goals 78
3.3.2 Test Methodology 78
Parameters tested 79
Test hardware 81
Data post processing 82
3.3.3 Test Results 83
Static test results 84
Dynamic test results 88
Summary of test results 91
3.3.4 Recommendations 91
Summary of problems encountered with the tested GPS receivers 92
Summary of critical integration issues 92
Chapter 4 Sensors for Map-Based Positioning 95
4.1 Time-of-Flight Range Sensors 95
4.1.1 Ultrasonic TOF Systems 97
Massa Products Ultrasonic Ranging Module Subsystems 97
Polaroid Ultrasonic Ranging Modules 99
4.1.2 Laser-Based TOF Systems 101
Schwartz Electro-Optics Laser Rangefinders 101
RIEGL Laser Measurement Systems 107
RVSI Long Optical Ranging and Detection System 109
4.2 Phase-Shift Measurement 112
4.2.1 Odetics Scanning Laser Imaging System 115
4.2.2 ESP Optical Ranging System 116
4.2.3 Acuity Research AccuRange 3000 117
4.2.4 TRC Light Direction and Ranging System 119
4.2.5 Swiss Federal Institute of Technology's "3-D Imaging Scanner" 120
4.2.6 Improving Lidar Performance 121
4.3 Frequency Modulation 123
4.3.1 Eaton VORAD Vehicle Detection and Driver Alert System 125
4.3.2 Safety First Systems Vehicular Obstacle Detection and Warning System 127
Part II Systems and Methods for Mobile Robot Positioning
Chapter 5 Odometry and Other Dead-Reckoning Methods 130
5.1 Systematic and Non-Systematic Odometry Errors 130
5.2 Measurement of Odometry Errors 132
5.2.1 Measurement of Systematic Odometry Errors 132
The Unidirectional Square-Path Test 132
The Bidirectional Square-Path Experiment 134
5.2.2 Measurement of Non-Systematic Errors 136
5.3 Reduction of Odometry Errors 137
5.3.1 Reduction of Systematic Odometry Errors 138
Auxiliary Wheels and Basic Encoder Trailer 138
The Basic Encoder Trailer 139
Systematic Calibration 139
5.3.2 Reducing Non-Systematic Odometry Errors 143
Mutual Referencing 143
Internal Position Error Correction 143
5.4 Inertial Navigation 145
5.4.1 Accelerometers 146
5.4.2 Gyros 146
Barshan and Durrant-Whyte [1993; 1994; 1995] 147
Komoriya and Oyama [1994] 148
5.5 Summary 149
Chapter 6 Active Beacon Navigation Systems 151
6.1 Discussion on Triangulation Methods 152
6.1.1 Three-Point Triangulation 152
6.1.2 Triangulation with More Than Three Landmarks 153
6.2 Ultrasonic Transponder Trilateration 154
6.2.1 IS Robotics 2-D Location System 155
6.2.2 Tulane University 3-D Location System 155
6.3 Optical Positioning Systems 157
6.3.1 Cybermotion Docking Beacon 158
6.3.2 Hilare 159
6.3.3 NAMCO LASERNET 160
U.S. Bureau of Mines' application of the LaserNet sensor 161
6.3.4 Denning Branch International Robotics LaserNav Position Sensor 163
6.3.5 TRC Beacon Navigation System 163
6.3.6 Siman Sensors and Intelligent Machines Ltd., ROBOSENSE 164
6.3.7 Imperial College Beacon Navigation System 165
6.3.8 MTI Research CONAC™ 166
6.3.9 Spatial Positioning Systems, inc.: Odyssey 170
6.4 Summary 172
Chapter 7 Landmark Navigation 173
7.1 Natural Landmarks 174
7.2 Artificial Landmarks 175
7.2.1 Global Vision 176
7.3 Artificial Landmark Navigation Systems 176
7.3.1 MDARS Lateral-Post Sensor 177
7.3.2 Caterpillar Self Guided Vehicle 178
7.3.3 Komatsu Ltd, Z-shaped landmark 179
7.4 Line Navigation 180
7.4.1 Thermal Navigational Marker 181
7.4.2 Volatile Chemicals Navigational Marker 181
7.5 Summary 183
Chapter 8 Map-based Positioning 184
8.1 Map Building 185
8.1.1 Map-Building and Sensor Fusion 186
8.1.2 Phenomenological vs. Geometric Representation, Engelson & McDermott [1992] 186
8.2 Map Matching 187
8.2.1 Schiele and Crowley [1994] 188
8.2.2 Hinkel and Knieriemen [1988]: The Angle Histogram 189
8.2.3 Weiß, Wetzler, and Puttkamer: More on the Angle Histogram 191
8.2.4 Siemens' Roamer 193
8.2.5 Bauer and Rencken: Path Planning for Feature-based Navigation 194
8.3 Geometric and Topological Maps 196
8.3.1 Geometric Maps for Navigation 197
Cox [1991] 198
Crowley [1989] 199
Adams and von Flüe 202
8.3.2 Topological Maps for Navigation 203
Taylor [1991] 203
Courtney and Jain [1994] 203
Kortenkamp and Weymouth [1993] 204
8.4 Summary 206
Chapter 9 Vision-Based Positioning 207
9.1 Camera Model and Localization 207
9.2 Landmark-Based Positioning 209
9.2.1 Two-Dimensional Positioning Using a Single Camera 209
9.2.2 Two-Dimensional Positioning Using Stereo Cameras 211
9.3 Camera-Calibration Approaches 211
9.4 Model-Based Approaches 213
9.4.1 Three-Dimensional Geometric Model-Based Positioning 214
9.4.2 Digital Elevation Map-Based Localization 215
9.5 Feature-Based Visual Map Building 215
9.6 Summary and Discussion 216
Appendix A A Word on Kalman Filters 218
Appendix B Unit Conversions and Abbreviations 219
Appendix C Systems-at-a-Glance Tables 221
References 236
Subject Index 262
Author Index 274
Company Index 278
Bookmark Index 279
Video Index 280
Full-length Papers Index 281