Autonomous Unmanned Ground Vehicle and Indirect Driving
ABSTRACT

This paper describes the design and challenges faced in the development of an Autonomous Unmanned Ground Vehicle (AUGV) demonstrator and an Indirect Driving Testbed (IDT). The AUGV has been modified from an M113 Armoured Personnel Carrier and has capabilities such as remote control, autonomous waypoint seeking, obstacle avoidance, road following and vehicle following. The IDT enables the driver of an armoured vehicle to drive by looking at live video feed from cameras mounted on the exterior of the vehicle. These two technologies are complementary, and integration of the two technology enablers could pave the way for the deployment of semi-autonomous unmanned (and even manned) vehicles in battle.

Ng Keok Boon
Tey Hwee Choo
Chan Chun Wah

1. INTRODUCTION

In the last decade, advances in computing and networking technologies have brought battlefield robotics another step closer to reality. Robotics will play an increasingly important role in the future battlefield as the technology matures and allows force projection and multiplication, with minimal exposure of our soldiers to hazardous situations and environments. Besides the maturation of robotics technologies, the battlefield will also evolve into one that will see more unmanned operations, arising from the proliferation of wireless ad hoc networks and non-line-of-sight weapons. With such technologies as robotics and data networks comes a rapid expansion of the boundaries of situational awareness beyond what conventional sensors can give. The future fighting vehicle will have to be reinvented to be more effective in this new environment. The crew will no longer need to be exposed to the dangers of driving "hatch-up", yet will enjoy the same or better situational awareness compared to operating hatch-up. Such a concept, where the crew can drive and operate the fighting vehicle under the protection of the hull, is termed Indirect Driving. By integrating Indirect Driving technology with robotic technologies, we can create a new concept of manned and unmanned platforms operating in synergy with one another.

One of the biggest challenges for autonomous capability is machine perception. Machine perception entails the sensing and understanding (sensing + understanding = perception) of the environment so that the robotic vehicle can make informed and intelligent decisions. Machine perception is difficult because we live in a 3D world, and visual sensors need to collect and process a vast amount of information in real time (or near real time). Modern computer technology has only just begun to make such sensors portable and cheap enough for field application.

This paper is organised into seven sections. Section 2 describes the design of the AUGV. Section 3 focuses on the modes under which the AUGV has been tested. Section 4 describes the Indirect Vision Operations (IVO) and Indirect Driving Demonstrator design. Section 5 lists the lessons learnt from both developments and discusses the possible future concepts that may arise from the integration of the two technologies. Section 6 then reports the synergies between the two technologies and how they can be exploited for the concept of Teaming. Section 7 concludes this paper.

2. AUGV DESCRIPTION

The AUGV was designed to develop autonomous technologies using the M113 as a prototype platform. Under the project, various enabling technologies were explored, and we have developed significant local know-how in areas such as Machine Perception, Autonomous Navigation and Supervisory Control, Vehicle Control and Actuation, and Teleoperation. Such local capabilities would benefit us in the future fielding of unmanned ground vehicles.

Vehicle Platform

The AUGV uses an M113A2 tracked Armoured Personnel Carrier (APC) as the base platform. Besides various sensors, it had to be retrofitted with actuators to control its steering levers, gear change lever and accelerator pedal, along with other actuators for safety. Due to the tracked locomotion and large mass, the control algorithm development was challenging and required many trials in the field. Figure 1 shows the M113 integrated with the sensors and Figure 2 is a CAD drawing of the platform.

[Figure 1. M113 integrated with sensors]
[Figure 2. CAD drawing of robotic M113]

System Architecture

The overall system architecture of the AUGV is shown in Figure 3. The system consists of five subsystems: Visual Guidance System (VGS), Vehicle Piloting System (VPS), Vehicle Control System (VCS), Teleoperation Control System (TCS) and Robust Communications System (RCS). As seen, major subsystems have names ending in "System"; at the next level, the functions that make up each subsystem have names ending in "Module".

[Figure 3. System Architecture of AUGV]

Visual Guidance System

We integrated Commercial-off-the-Shelf (COTS) hardware with custom algorithms to achieve machine perception. The Unstructured Terrain Vision Module (UTVM) uses a COTS stereo vision system (a passive sensor) to provide 3D range data, whilst the Laser Scanner Module (LSM) provides "2 1/2 D" range data (the laser provides 3D range but only in a plane). These two modules are mainly "vision" sensors, as they provide only terrain elevation. The Road Segmentation Module (RSM) and Night Driving Module (NDM) give the vehicle further perception capabilities: through these two modules, the vehicle is able to distinguish a road by day and by night. The RSM makes use of colour information to differentiate between roads and vegetation, and passes the road centre information to the Vehicle Navigation Module (VNM), while the NDM uses an infrared camera to differentiate between the warmer roads and the cooler vegetation. The Vehicle Following Module (VFM) makes use of the laser scanners to follow a known vehicle: it processes the laser scans of the target vehicle and allows the AUGV to follow it. The information from these sensors is fused by the Sensor Fusion Module (SFM) in different ways, depending on the operating mode, to provide a combined map for the VNM. Figure 4 shows the sensors that make up the VGS.

[Figure 4. VGS sensors: GPS receiver; vision sensor suite consisting of stereo vision, laser scanner, IR camera and CCD camera; reverse camera; safety sensors; Inertial Measurement Unit; vehicle actuation]
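The paper does not detail the RSM's segmentation algorithm. As a rough illustration of the idea (colour separates road from vegetation, and a road centre is passed downstream), the sketch below uses a hypothetical green-dominance test on per-pixel (r, g, b) values and takes the mean road column per image row; the thresholds and function names are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of colour-based road segmentation in the spirit of the RSM.
# Assumptions (not from the paper): pixels are (r, g, b) tuples, vegetation
# is green-dominant, and the "road centre" per row is the mean road column.

def is_vegetation(pixel):
    """Crude colour test: vegetation if green clearly dominates red and blue."""
    r, g, b = pixel
    return g > r * 1.2 and g > b * 1.2

def road_centre_per_row(image):
    """For each row, return the mean column index of non-vegetation (road)
    pixels, or None if the row contains no road pixels."""
    centres = []
    for row in image:
        road_cols = [c for c, px in enumerate(row) if not is_vegetation(px)]
        centres.append(sum(road_cols) / len(road_cols) if road_cols else None)
    return centres

# Toy 2x6 "image": grey road in the middle, green vegetation at the edges.
veg, road = (40, 120, 30), (110, 105, 100)
image = [
    [veg, veg, road, road, veg, veg],
    [veg, road, road, road, road, veg],
]
print(road_centre_per_row(image))  # → [2.5, 2.5]
```

A fielded module would of course work on calibrated camera imagery and smooth the centre estimate over time before handing it to a navigation module, but the per-row reduction from pixel labels to a steering cue is the core of the idea.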
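The paper says only that the SFM fuses the modules' outputs "in different ways, depending on the operating mode". As a hedged sketch of what such mode-dependent fusion could look like, the code below represents each module's output as a sparse cost grid and picks the contributing modules per mode; the mode names echo the capabilities listed in the abstract, but the grid representation and fusion rules are illustrative assumptions.

```python
# Illustrative mode-dependent sensor fusion, loosely following the SFM's role:
# each vision module contributes a cost grid (dict of (x, y) -> cost in [0, 1]),
# and the set of contributing modules depends on the operating mode.

def fuse(grids, mode):
    """Fuse per-module cost grids into one combined map for a navigation
    module. Which grids contribute depends on the operating mode."""
    if mode == "waypoint":          # cross-country: range sensors only
        sources = [grids["UTVM"], grids["LSM"]]
    elif mode == "road_following":  # include road-segmentation cues
        sources = [grids["UTVM"], grids["LSM"], grids["RSM"]]
    else:
        raise ValueError(f"unknown mode: {mode}")
    fused = {}
    for grid in sources:
        for cell, cost in grid.items():
            # Conservative rule: keep the worst (highest) cost seen per cell.
            fused[cell] = max(fused.get(cell, 0.0), cost)
    return fused

grids = {
    "UTVM": {(0, 0): 0.1, (0, 1): 0.9},   # stereo vision: obstacle at (0, 1)
    "LSM":  {(0, 0): 0.2},                # laser scanner: (0, 0) fairly clear
    "RSM":  {(0, 0): 0.0, (0, 2): 0.8},   # road module: (0, 2) is off-road
}
print(fuse(grids, "road_following"))  # → {(0, 0): 0.2, (0, 1): 0.9, (0, 2): 0.8}
```

The max rule makes the fused map conservative (any module can veto a cell), which suits obstacle avoidance; a real SFM might instead weight modules by confidence per mode.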