Applications

Editor: Mike Potel

Vehicle Teleoperation Using 3D Maps and GPS Time Synchronization

Taro Suzuki, Tokyo University of Marine Science and Technology
Yoshiharu Amano and Takumi Hashizume, Waseda University
Nobuaki Kubo, Tokyo University of Marine Science and Technology

Systems that use wireless communication to remotely control mobile robots and vehicles1–3 effectively support the operation of semiautomatic systems, thus improving the efficiency of labor-intensive work. Such teleoperation systems are especially required when the semiautomatic systems operate in a vast outdoor environment. Example applications include semiautomatic construction systems, robots that guard and monitor large-scale facilities, and systems for preventing and mitigating emergency situations at atomic-energy facilities. Vehicle teleoperation involves two important problems:

■ providing an interface that helps the operator intuitively comprehend the remote driving environment so that he or she can operate the vehicle safely and effectively; and
■ implementing data communication between the vehicle and operator, including handling (possibly large) delays and failures in communication.

Solving the first problem involves effectively informing the operator about the vehicle's state and environment, including, for example, the vehicle's location, the environment's topography, and the vehicle's distance from obstacles. To solve these problems, we've devised a system that uses 3D maps and GPS time synchronization to provide effective teleoperation using low-rate wireless communication.


System Basics

The system comprises a front-wheel-drive vehicle and a base station for the operator (see Figure 1). For self-localization, the vehicle has odometers on its wheels, a one-axis fiber-optic gyroscope as an attitude sensor, and a dual-frequency GPS receiver. For real-time observation of nearby obstacles, a stereo camera and a laser range finder (LRF) are mounted parallel to the road surface. The base station displays the interface, which shows the vehicle's location superimposed on a 3D map.

Two pairs of 1.2-GHz wireless modems provide full-duplex wireless transmission between the base station and vehicle. The modems can transmit at up to 9,600 bps and have a range of up to 1.5 km; the actual transmission rate might vary, depending on line conditions. Transmitting a large quantity of data with the modems is therefore difficult.

To estimate the vehicle's location, the system measures communication delays directly from the times of the GPS receivers in the vehicle and base station. It measures the delays by performing GPS time synchronization simultaneously with the transmission of GPS correction data from the base station to the vehicle. Furthermore, because the system constructs a control model of the vehicle, it can estimate and interpolate the vehicle's location and show that information to the operator. So, operators can easily perform teleoperation even when communication delays occur.

Providing a Teleoperation Interface



Figure 1. The teleoperation system. Two GPS receivers synchronize the time between the vehicle and the base station.

Figure 2. The mobile mapping system (MMS). The MMS employs a GPS gyroscope and an inertial measurement unit (IMU) to obtain combined estimations of position and attitude. We use an extended Kalman filter to combine the attitude values estimated by the gyroscope using the three GPS antennas. The MMS uses two laser range finders (LRFs) for 3D measurement. (CCD stands for charge-coupled device.)

Using 3D maps obtained in advance reduces the problem of comprehending the environment to that of obtaining real-time information about obstacles around the vehicle. This in turn reduces the quantity of data that must be communicated. Also, the operator can use the system's interface, instead of a camera image, to vary the point from which he or she makes an observation. This simplifies both teleoperation and comprehension of the environment around the vehicle.

Constructing a 3D Map

Recent developments in laser measurement have made it feasible to obtain 3D measurements outdoors. In particular, a mobile mapping system (MMS) for 3D measurements of outdoor environments can precisely acquire wide-ranging 3D outdoor point cloud data.4 So, a 3D map consisting of 3D point clouds serves as our system's interface.

Our MMS (see Figure 2) uses a GPS gyroscope and an inertial measurement unit to obtain combined estimations of position and attitude. We use an extended Kalman filter to combine the attitude values estimated by the gyroscope using three GPS antennas. The MMS allows high-precision estimation of position and attitude in urban or leafy environments. The MMS uses two LRFs for 3D measurement; they measure distances with a catalog precision of ±35 mm through degree-by-degree scanning of a 180-degree range. It can take 3D measurements of outdoor environments with a standard deviation of 10 cm (1σ).4 Using the acquired position, attitude, and distance data, the MMS reconstructs environmental 3D point cloud data. Figure 3 shows a reconstruction of a 3D outdoor scene using the MMS.
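The article doesn't spell out the reconstruction step in code, but the geometry is simple: each LRF range reading is a point in the sensor frame, rotated by the vehicle's attitude and translated by its position. Below is a minimal sketch under assumed conventions (a pose given as position plus yaw-pitch-roll, a 180-degree scan at 1-degree resolution, and a z-y-x rotation order); the function name and interface are ours, not the MMS implementation.

```python
# Minimal sketch: project one planar LRF scan into world coordinates
# using a time-synchronized pose estimate. Pose format, rotation order,
# and scan parameters are illustrative assumptions.
import numpy as np

def scan_to_world(ranges_m, pose_xyz, yaw_pitch_roll,
                  fov_deg=180.0, step_deg=1.0):
    """Convert one LRF scan (181 ranges for a 180-deg, 1-deg scan)
    into world-frame 3D points."""
    angles = np.deg2rad(np.arange(-fov_deg / 2.0,
                                  fov_deg / 2.0 + step_deg, step_deg))
    r = np.asarray(ranges_m, dtype=float)
    # Points in the sensor frame; the LRF scans in a single plane.
    pts = np.stack([r * np.cos(angles), r * np.sin(angles),
                    np.zeros_like(r)], axis=1)
    yaw, pitch, roll = yaw_pitch_roll
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    # World-from-sensor rotation, assuming z-y-x (yaw-pitch-roll) order.
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    R = Rz @ Ry @ Rx
    # Accumulating these points over many scans yields the 3D map.
    return pts @ R.T + np.asarray(pose_xyz, dtype=float)
```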



The Interface

The interface displays the vehicle's location and orientation as a CG model of the vehicle. Information about nearby obstacles appears as polygons on the 3D map, color-coded according to the obstacles' distance from the vehicle (see Figure 4a). Operators can also change the view to a bird's-eye view (see Figure 4b) or driver's view (see Figure 4c). This function, which facilitates an understanding of the spatial relationships among the obstacles and vehicle, would be difficult to implement if the system used a fixed camera.
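As a concrete illustration, such color coding can be a simple thresholding of obstacle distance. The thresholds and colors below are illustrative assumptions, not values given in the article.

```python
# Sketch of a distance-to-color mapping for obstacle polygons.
# Thresholds and RGB values are assumed, not the published ones.
def obstacle_color(distance_m):
    """Map an obstacle's distance from the vehicle to an RGB color."""
    if distance_m < 2.0:       # very close: red
        return (255, 0, 0)
    if distance_m < 5.0:       # nearby: yellow
        return (255, 255, 0)
    return (0, 255, 0)         # far: green
```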

Semiautonomous Vehicle Control

Figure 3. Two 3D maps consisting of color point clouds, reconstructed by the MMS. The MMS allows high-precision estimation of location and orientation in urban or leafy environments.


To control the vehicle, instead of transmitting a speed or steering angle from the base station, our system transmits waypoints (virtual target points). The system takes waypoint coordinates from an operator-controlled joystick. It then automatically generates waypoint coordinates several meters ahead of the vehicle, depending on the joystick input and the vehicle's speed. As the vehicle receives the waypoints, it calculates a target steering angle toward the next waypoint and thus moves semiautonomously. So, the operator can control the vehicle in a manner like conventional steering, without undue concentration.

We denote the vehicle's location and orientation in the 2D plane as (x_v, y_v) and ψ_v. The ith waypoint is (x_wpi, y_wpi). The steering angle δ is

$$\delta = \psi_v - \tan^{-1}\!\left(\frac{y_{\text{wp}i} - y_v}{x_{\text{wp}i} - x_v}\right).$$
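In code, this steering rule is a one-liner. The sketch below substitutes atan2 for the printed arctangent so the bearing stays correct in all quadrants, and adds an angle-wrapping helper; both details are our assumptions rather than specifics from the article.

```python
# Sketch of the steering-angle rule above:
# delta = psi_v - atan2(y_wp - y_v, x_wp - x_v).
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi] so headings don't drift past +/-pi."""
    return math.atan2(math.sin(a), math.cos(a))

def steering_angle(x_v, y_v, psi_v, x_wp, y_wp):
    """Target steering angle toward the next waypoint."""
    return wrap_angle(psi_v - math.atan2(y_wp - y_v, x_wp - x_v))
```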


When a communication failure occurs, the vehicle stops at the last successfully transmitted waypoint, to ensure safety. Furthermore, using the 3D maps, the system transmits to the vehicle the obstacles’ distances from the generated waypoint along with the waypoint, after confirming the vehicle’s safety. This prevents operator input errors. Additionally, to prevent accidents, the vehicle automatically stops if one of its sensors detects an obstacle in its path.
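A minimal sketch of this safety behavior follows; the watchdog timeout, clearance threshold, and function interface are illustrative assumptions, not the system's actual values.

```python
# Sketch of the stop conditions described above: halt on a communication
# failure (no fresh waypoint) or on an obstacle in the vehicle's path.
import time

COMM_TIMEOUT_S = 1.0    # assumed watchdog threshold
MIN_CLEARANCE_M = 1.5   # assumed minimum obstacle clearance

def should_stop(last_waypoint_time_s, obstacle_distances_m):
    """Return True if the vehicle must stop for safety."""
    comm_lost = (time.time() - last_waypoint_time_s) > COMM_TIMEOUT_S
    obstacle_ahead = any(d < MIN_CLEARANCE_M for d in obstacle_distances_m)
    return comm_lost or obstacle_ahead
```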

Figure 4. The system interface. (a) The standard view. (b) The bird's-eye view. (c) The driver's view. The system displays any detected obstacles as polygons, color-coded according to the obstacles' distance from the vehicle.

The Vehicle Motion Model


The system estimates the vehicle’s location using the measured communication delay and the vehicle’s past location, on the basis of our motion model of autonomous driving. In addition, it interpolates the vehicle’s location at high frequency until the base station receives the next data. Figure 5 illustrates this process.

Figure 5. Estimating and interpolating the vehicle's location. (a) With a standard teleoperation system, transmission delays make it impossible for the operator to know the vehicle's exact position. (b) Our system uses the transmission delays to estimate and interpolate the vehicle's location. The system measures the transmission delays through GPS time synchronization. t_b refers to the time of the base station's GPS transmission; t_v refers to the time of the vehicle's GPS transmission.

Measuring Communication Delay

For calculating a position accurate to within meters, time signals must be accurate to within nanoseconds; GPS can provide this accuracy. This, together with the broad availability of cheap GPS devices, makes GPS an ideal means for time synchronization of computers. GPS receivers output an accurate pulse-per-second time signal; we use it to obtain GPS time stamps for measuring communication delay. Our system adds the time stamp during data transmission and measures the delay when the base station receives the data.
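A minimal sketch of this timestamping scheme follows, assuming both ends expose a GPS-disciplined clock (the callable gps_time_now here is hypothetical) and each packet carries its send time as an 8-byte header:

```python
# Sketch: measure one-way delay between two computers whose clocks are
# synchronized to GPS time via the receivers' pulse-per-second output.
import struct

def stamp_packet(payload: bytes, gps_time_now) -> bytes:
    """Prepend the sender's GPS time (seconds, big-endian double)."""
    return struct.pack("!d", gps_time_now()) + payload

def receive_packet(packet: bytes, gps_time_now):
    """Return (payload, one_way_delay_s) using the shared GPS timebase."""
    (t_sent,) = struct.unpack("!d", packet[:8])
    return packet[8:], gps_time_now() - t_sent
```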

Estimating and Interpolating the Vehicle's Location

Figure 6 shows an overview of prediction and interpolation of the vehicle's location using the received vehicle location and the target waypoint. The interpolation time step is Δt; the time t, which we use for the estimation, is t = t_delay + nΔt, where n is the number of interpolations. For autonomous driving, we assume that the vehicle's velocity V is a specified constant value. The received location and orientation are (x_0, y_0) and ψ_0, respectively. The estimated location and orientation at t are (x_est, y_est) and ψ_est:

$$x_{\text{est}}(t) = x_0 + V \int_0^t \cos\bigl(\psi_{\text{est}}(t)\bigr)\,\Delta t,$$
$$y_{\text{est}}(t) = y_0 + V \int_0^t \sin\bigl(\psi_{\text{est}}(t)\bigr)\,\Delta t,$$
$$\psi_{\text{est}}(t) = \psi_0 + \int_0^t \dot{\psi}_{\text{est}}(t)\,\Delta t,$$

where ψ̇_est is the orientation rate at t. We must estimate the orientation rate to estimate and interpolate the vehicle's location. To describe the vehicle's dynamics, we use a bicycle model. This model assumes that the vehicle is rigid and that its wheels can't slip sideways. So, we compute the orientation rate as

$$\dot{\psi}_{\text{est}}(t) = \frac{V}{L}\,\delta_{\text{est}}(t),$$

where L is the vehicle's wheelbase. We can estimate δ from the vehicle's previous orientation and the direction θ from the vehicle to the current waypoint:

$$\delta_{\text{est}}(t) = \psi_{\text{est}}(t - \Delta t) - \theta(t - \Delta t).$$

Figure 7 illustrates the received and estimated vehicle location using our method. The received location is a discrete value because of the low transmission frequency. In contrast, as we mentioned before, the estimated location is successfully determined via interpolation.
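Numerically, the estimation and interpolation amount to dead reckoning under the bicycle model. The sketch below integrates the equations above in fixed steps; the constants (V, L, Δt, step count) are illustrative, and the sign conventions follow the article's definitions of δ and θ.

```python
# Sketch of the dead-reckoning interpolation: starting from the last
# received state (x0, y0, psi0), integrate the bicycle model forward.
import math

def waypoint_direction(x, y, x_wp, y_wp):
    """theta: the direction from the vehicle to the current waypoint."""
    return math.atan2(y_wp - y, x_wp - x)

def interpolate_pose(x0, y0, psi0, x_wp, y_wp,
                     V=1.1, L=1.9, dt=0.05, n_steps=20):
    """Return the poses (x_est, y_est, psi_est) over n_steps of dt."""
    x, y, psi = x0, y0, psi0
    poses = []
    for _ in range(n_steps):
        # delta_est(t) = psi_est(t - dt) - theta(t - dt)
        delta = psi - waypoint_direction(x, y, x_wp, y_wp)
        x += V * math.cos(psi) * dt      # x_est update
        y += V * math.sin(psi) * dt      # y_est update
        psi += (V / L) * delta * dt      # psi_dot = (V / L) * delta_est
        poses.append((x, y, psi))
    return poses
```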


Figure 6. Predicting and interpolating the vehicle's location. The vehicle automatically drives toward the current waypoint. θ is the direction from the vehicle to the current waypoint.

Figure 7. The predicted and interpolated vehicle location. (a) The entire graph. (b) A close-up at point A. (c) A close-up at point B. The received location is a discrete value because of the low transmission frequency. In contrast, the estimated location is successfully determined via interpolation.



Testing the System

We compared our system to a conventional camera-based system in an actual environment. The conventional system used a camera with a horizontal angle of view of 60 degrees. It transmitted images through a wireless LAN to the base station at approximately 15 Hz. In this experiment, each of the 10 participants tried to teleoperate a vehicle approximately 80 m to the destination and then back to the starting point (see Figure 8).

With the camera-based system, obstacle recognition was difficult owing to the narrow angle of view (see Figure 9a). In contrast, with our interface, determining the vehicle's and obstacles' locations was easy (see Figure 9b). Moreover, the obstacle sensors made it easy to determine the obstacles' distances from the vehicle. When using our system, every participant reached the goal. With the camera-based system, on the other hand, the vehicle stopped automatically for every participant when it came too close to an obstacle, so no one reached the goal.

Next, we evaluated maneuverability with and without location estimation (that is, with and without taking measures against communication delays). As in the earlier experiment, we specified a target course and goal. Then, each participant drove the vehicle at 1.1 m/s, three times with location estimation and three times without it. We evaluated maneuverability in terms of

■ the completion rate: whether the vehicle arrived at the destination; and
■ the achievement rate: the extent to which the vehicle followed the course.

Figure 8. Testing our system. (a) The environment and driving path. (b) A participant at the controls. (c) The interface. (d) The vehicle. The participants had to teleoperate the vehicle approximately 80 m to the destination and then back to the starting point.


We defined completion as reaching a point within a radius of 2 m from the goal. We defined achievement as staying within a radius of 0.5 m from the target passage. Furthermore, we calculated the distance error between the target course and the vehicle’s location. To investigate how the experience of teleoperation influenced the achievement rate, five participants first used the system with location estimation and the other five participants first used the system without it. The completion rate was 66.7 percent without location estimation and 86.7 percent with it. The achievement rate was 78.8 percent without location estimation and 92.2 percent with it. In addition, employing location estimation improved the distance error by more than 10 cm.
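For concreteness, the two metrics could be computed from a logged trajectory as follows; representing the trajectory and the target passage as lists of 2D points is our assumption.

```python
# Sketch of the completion and achievement metrics as defined above:
# completion = final position within 2 m of the goal; achievement =
# fraction of logged positions within 0.5 m of the target passage.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def completed(trajectory, goal, radius=2.0):
    """True if the run ended within `radius` meters of the goal."""
    return dist(trajectory[-1], goal) <= radius

def achievement_rate(trajectory, course_points, corridor=0.5):
    """Fraction of positions within `corridor` meters of the course."""
    inside = sum(1 for p in trajectory
                 if min(dist(p, c) for c in course_points) <= corridor)
    return inside / len(trajectory)
```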

Figure 9. Comparing (a) the camera image with (b) our interface. With the camera-based system, obstacle recognition was difficult owing to the narrow angle of view. With our interface, determining the vehicle's and obstacles' locations was easy.

These results confirmed that maneuverability improves if the system takes communication delays into account.



When the vehicle's speed or the communication delay increases, the vehicle moves farther during the delay. So, we expect a significant difference between systems with and without location estimation.


A future challenge is to test this system using a satellite line that enables long-distance communication; data communication using a satellite has a delay of several seconds. Furthermore, we'll evaluate our method in actual applications such as guarding an atomic-energy facility.

References

1. S. Tachi, "Real-Time Remote Robotics—toward Networked Telexistence," IEEE Computer Graphics and Applications, vol. 18, no. 6, 1998, pp. 6–9.
2. D.A. Bowman et al., "Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques," Proc. 1997 Virtual Reality Ann. Int'l Symp., IEEE, 1997, pp. 45–52.
3. L.A. Nguyen et al., "Virtual Reality Interfaces for Visualization and Control of Remote Vehicles," Autonomous Robots, vol. 11, no. 1, 2001, pp. 59–68.
4. K. Ishikawa et al., "A Mobile Mapping System for Precise Road Line Localization Using a Single Camera and 3D Road Model," J. Robotics and Mechatronics, vol. 19, no. 2, 2007, pp. 174–180.

Taro Suzuki is a postdoctoral fellow at the Tokyo University of Marine Science and Technology. Contact him at [email protected].

Yoshiharu Amano is a professor at Waseda University's Research Institute for Science and Engineering. Contact him at [email protected].

Takumi Hashizume is a professor at Waseda University's Research Institute for Science and Engineering. Contact him at [email protected].


Nobuaki Kubo is an associate professor at the Tokyo University of Marine Science and Technology. Contact him at [email protected].

Contact department editor Mike Potel at [email protected].

