
v.2, n.1, p.2-18, 2019.

CarNotFound: Project and Development of an Autonomous R/C Car

Isabelle Diniz Orlandi, Daniel Pereira Cinalli, Italo Milhomem de Abreu Lanza, Thiago Lima de Almeida, Tito Caco Curimbaba Spadini, Pedro Ivo da Cruz, Filipe Ieda Fazanaro
Federal University of ABC (UFABC)

ABSTRACT

Robots are fascinating mainly because knowledge from different areas - such as electrical engineering, computer engineering, mechanical engineering, physics, mathematics, and biology - must be integrated so that the robot can interact with the environment. In the context of autonomous robots, the capability to localize itself without any human interference, only by processing the information obtained from various types of sensors, is also fundamental. The work described here studies the essential concepts associated with autonomous robots, specifically an autonomous R/C car capable of navigating a closed environment, following a dashed reference line, using only a single camera. The processing of the acquired images and the execution of the control system were performed by a Raspberry Pi 3B+ running code written in Python with OpenCV. The results obtained indicate a strong dependence on the rate of processed frames per second; moreover, a simple PD controller was sufficient to adjust the direction of the car along the trajectory to be followed.

Keywords: Monocular vision, OpenCV, PD controller, PiCamera, Python, Raspberry Pi 3B+.

INTRODUCTION

Since Leonardo da Vinci's first sketches, through the initial decades of the 20th century, and gaining more attention after the publication of Isaac Asimov's works, robotics has become a topic that excites the human imagination. Although we are a long way from having robots as described in science fiction, i.e.
humanoid robots interacting with human beings or assisting pilots on rebel spaceships fighting against the Empire, it is relatively affordable to buy simple structures - compared to fiction - such as a robot vacuum cleaner to help with daily activities.

An incipient view of robotics would associate it with robot manipulators [1] operating cooperatively in an automotive production line. Manipulators transcended this barrier long ago, however, and are nowadays employed in several sectors of society. In medicine, for example, studies have shown how robotics has assisted surgical procedures, diagnoses, and the remote monitoring of patients [2], the Da Vinci medical robotic system being one of the most important examples [3].

Throughout the last decades, the world has contemplated the development of another class of robotics: mobile robots. Although there are studies and research related to different types of such robots, e.g. those capable of moving through the air [4, 5, 6] or in the water [7, 8], their origins date back to the 1920s, when the first radio controlled vehicles were developed [9]. In this context, autonomous cars are probably one of the most emblematic examples of mobile robots. In recent years, it has been possible to see constantly growing interest in research related to autonomous vehicles, led mainly by companies such as Google, Tesla, Amazon, NVidia, and Intel. However, a project of this nature is extremely complex and still has open questions [10]. In this context, the work reported here aims to broadly understand such questions and studies the fundamental techniques necessary to allow a vehicle - in this case, a radio controlled (R/C) 1:14 scale car - to move autonomously within a closed environment, using a single camera and computer vision concepts, respecting the initial motivation of the project: the RoboCar Race [11].
In the next sections, the main aspects related to the vehicle's development are presented, followed by the concepts used to implement its control system and the results obtained during the experiments. Finally, some discussions and conclusions about the project are presented.

AN OVERVIEW OF THE VEHICLE DEVELOPED

The vehicle's development can be analyzed considering the hardware - which comprises the motors, the power system, and all the embedded electronics - and the software, which is associated with the computer vision algorithm and the control system. The main aspects of both are briefly described in the following.

Hardware: the main aspects

The embedded electronics of the vehicle were based on simple and basic components [12] and can be summarized with the help of the block diagram presented in Figure 1.

Figure 1. An overview describing how the embedded electronics are structured (power system: 7.2V battery and powerbank; Raspberry Pi 3B+; PiCamera V2; L298N driver; servomotor; DC motors).

As illustrated in Figure 1, a Raspberry Pi 3B+ (RPi) [13] is responsible for processing the visual information obtained by a PiCamera V2 [14] and for controlling the DC motors. The car contains two DC motors [15] (Motors A and B, installed at the right and left sides of the chassis, respectively), which are driven by an H-bridge (L298N) [16]. In Figure 2, it is possible to observe the signals IN1A, IN1B, IN2A and IN2B, which are directly connected, in pairs, to the RPi's GPIO, i.e. IN1A and IN1B are both connected to GPIO BCM 23, and IN2A and IN2B to GPIO BCM 22. This solution allowed both DC motors to be driven at the same time (e.g. rotate clockwise, rotate counterclockwise, and stop), since it was not necessary to control their rotation separately.

Figure 2. Illustration of how the DC motors were connected to the GPIOs of the Raspberry Pi.
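Because each IN pin of one motor shares a GPIO with the corresponding pin of the other motor, a single pair of logic levels drives both wheels together. The following is a minimal sketch of that logic, assuming the BCM 23/BCM 22 pairing described above; the command names are illustrative, and the actual RPi.GPIO calls are shown as comments since they only run on the Pi itself:

```python
# Sketch of the paired H-bridge drive described in the text: one logic-level
# pair (BCM 23 -> IN1A/IN1B, BCM 22 -> IN2A/IN2B) drives both DC motors at
# once. The command set is an illustrative assumption.

IN1_GPIO = 23  # BCM numbering; feeds IN1A and IN1B of the L298N
IN2_GPIO = 22  # feeds IN2A and IN2B

def motor_levels(command):
    """Map a high-level command to the (IN1, IN2) logic levels."""
    table = {
        "forward": (1, 0),  # both motors rotate in one direction
        "reverse": (0, 1),  # both motors rotate in the opposite direction
        "stop":    (0, 0),  # both inputs low: motors stop
    }
    return table[command]

# On the car itself, the levels would be written with RPi.GPIO, e.g.:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup([IN1_GPIO, IN2_GPIO], GPIO.OUT)
#   in1, in2 = motor_levels("forward")
#   GPIO.output(IN1_GPIO, in1)
#   GPIO.output(IN2_GPIO, in2)

print(motor_levels("forward"))  # (1, 0)
```

Driving both motor inputs from one GPIO pair halves the number of pins used, at the cost of never being able to turn the wheels at different speeds; in this project, steering is handled entirely by the front servomotor.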
The LD-1501MG servomotor [17] was installed at the front of the chassis and used to rotate the front wheels and, consequently, to adjust the direction of the car. This servomotor can be controlled by a PWM signal generated at the RPi either by software - using any GPIO - or by hardware (GPIO BCM 13; dedicated pin) [18]. As can be seen in Figure 3, the PWM signal generated by software is noisier than the one generated by hardware. During some initial experiments, this characteristic caused the servomotor to move slightly when no command was being sent and, therefore, the hardware PWM was used to control the servomotor.

Figure 3. Comparison of the PWM signals generated by the RPi. The curve in blue (at the top) was generated by hardware. In both cases, the duty cycle is 25% and the high voltage is 3.3V.

From Figure 1, it is also possible to observe that the power system is divided into two subsystems. A common NiCd 7.2V/2200mAh battery - of the type used in R/C cars - supplied the necessary energy for the DC motors. A Xiaomi Mi Powerbank 2i (model PLM09ZM), with two 5V/2.4A outputs, was connected to the RPi. The servomotor was connected directly to GPIO 2 (or 4) of the RPi.

Hardware: some information about the camera

The most important sensor employed in this project - in fact, the only one - was the PiCamera V2 [14]. Although it would have been possible to use a standard USB webcam, the advantage of the PiCamera is that it is connected directly to the RPi's CPU (BCM2837 SoC) through a CSI-2 interface with a 2 Gbps bandwidth. The PiCamera captures the image using a rolling shutter mechanism [19], in which each image is built from the pixel values of each line of its sensor, i.e. each line of a single frame is obtained at a different time instant. To avoid possible distortions in the captured frame, i.e.
to ensure that consecutive frames are always read within the same time interval, the associated capture processing is performed by the RPi's GPU, which runs its own real-time operating system (VCOS). For further information, it is strongly recommended to review reference [20, section 6.1.5. - Division of labor].

Hardware: the "404" autonomous R/C car

As previously mentioned, the rules of the RoboCar Race [11] establish that just one camera can be used for sensing the environment and that all the processing must be done on a Raspberry Pi (any model). Additionally, the chassis dimensions must be smaller than 420mm (length) × 200mm (width). The chassis used [21] and all the embedded electronics are shown in Figure 4.

Figure 4. An image of the car. It is possible to see all the embedded electronics and the power system.

Some general considerations about the software

Since a Raspberry Pi was employed as the processing unit, Raspbian Lite (version 2018-10-09; Linux kernel 4.14.79) was used as the operating system. A script was defined which runs during OS boot and is responsible for configuring the PIGPIO library and the hardware PWM [18]. It must be observed that all the programs were developed in the Python 3.5.3 programming language and based on object-oriented concepts - which facilitated the debugging, the development itself and, more importantly, the code portability. Some additional libraries and packages, such as NumPy (mathematical functions and array manipulation) and RPi.GPIO (control of the I/O peripherals), were also employed. Finally, the computer vision algorithms were developed using the OpenCV (version 3.4.3) library, which was installed from source following the tutorial presented in [22]. Further explanations of how the algorithms were developed, and some related discussions, are presented in the following.
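As a concrete illustration of the hardware PWM setup on GPIO BCM 13, the steering command can be expressed as a servo pulse width and converted into the duty-cycle units expected by the PIGPIO library. The sketch below assumes a typical R/C servo timing (50 Hz frame, 1.0-2.0 ms pulse around a 1.5 ms center) and a ±45° steering range; these values are illustrative assumptions, not figures taken from the LD-1501MG datasheet, and the pigpio calls are commented out since they require the daemon running on the Pi:

```python
# Sketch: steering-angle -> hardware-PWM duty cycle for the front servo,
# driven through pigpio's hardware_PWM on the dedicated pin (GPIO BCM 13).
# Servo timing (50 Hz, 1.0-2.0 ms pulses) and the +/-45 degree range are
# assumed typical values, not datasheet figures.

SERVO_GPIO = 13    # dedicated hardware-PWM pin (BCM numbering)
PWM_FREQ_HZ = 50   # standard R/C servo frame rate (20 ms period)

def angle_to_duty(angle_deg, min_us=1000.0, max_us=2000.0):
    """Map a steering angle in [-45, +45] degrees to pigpio's
    hardware_PWM duty-cycle units (0..1_000_000 = 0..100%)."""
    angle_deg = max(-45.0, min(45.0, angle_deg))  # saturate the command
    pulse_us = min_us + (angle_deg + 45.0) / 90.0 * (max_us - min_us)
    period_us = 1_000_000.0 / PWM_FREQ_HZ         # 20_000 us at 50 Hz
    return int(round(pulse_us / period_us * 1_000_000))

# On the car, the value would be applied with pigpio (pigpiod running):
#   import pigpio
#   pi = pigpio.pi()
#   pi.hardware_PWM(SERVO_GPIO, PWM_FREQ_HZ, angle_to_duty(0.0))  # center

print(angle_to_duty(0.0))  # 75000 -> 1.5 ms pulse (7.5% duty)
```

Because the pulse is generated in hardware, its timing does not jitter with CPU load, which is consistent with the quieter trace observed in Figure 3.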
IMAGE PROCESSING AND CAR CONTROL

The main objective of the competition is for an autonomous R/C car to travel along a pre-established course in the shortest possible time.
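As noted in the abstract, a simple PD controller was sufficient to steer the car along the reference line. A minimal sketch of such a controller is shown below, computing a steering command from the horizontal offset between the detected line and the image center; the gains and the error normalization are illustrative assumptions, not the values used in the actual vehicle:

```python
# Minimal PD steering sketch: the error is the normalized horizontal offset
# of the detected reference line from the image center. Gains KP/KD and the
# error definition are illustrative assumptions, not the project's values.

KP = 0.8  # proportional gain (assumed)
KD = 0.3  # derivative gain (assumed)

class PDController:
    def __init__(self, kp, kd):
        self.kp = kp
        self.kd = kd
        self.prev_error = 0.0

    def update(self, error, dt):
        """error: normalized line offset in [-1, 1]; dt: frame period (s).
        Note that the derivative term scales with 1/dt, which is one way the
        processed frames-per-second rate affects the control behavior."""
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

ctrl = PDController(KP, KD)
# e.g. line detected 20% to the right of the image center, at 10 fps:
cmd = ctrl.update(0.2, dt=0.1)
```

The output command would then be mapped to a servo steering angle; the explicit dependence on dt illustrates why the results reported later are strongly tied to the achievable frame-processing rate.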