Frugal Flight: Indoor Stabilization of a Computationally Independent Drone Without GPS
Nikhil Devanathan

Acknowledgements

I wish to acknowledge my parents, Ram Devanathan and Subha Narayanan, for helping me by providing materials and equipment, reviewing my design, providing guidance, answering questions, offering pointers, instructing me on data analysis, taking photographs, and proofreading my written material. I thank Mr. Jonathan Tedeschi at Pacific Northwest National Laboratory for patiently answering my questions about drone sensor systems and stabilization and for speaking to me about his own work on sophisticated drones. I am indebted to Dr. David Atkinson at Pacific Northwest National Laboratory for taking the time to serve as my project's mentor once more and for listening to my progress reports on this project on numerous Sundays. The advice, suggestions, and reviews from both Mr. Tedeschi and Dr. Atkinson were invaluable to my project and prompted me to develop many of the capabilities and systems on the drone. All photographs associated with this project, shown in this report and in poster displays, were taken by me or by my parents.

Contents

Abstract
1. Introduction
2. Literature Review
3. Engineering Goals and Design Specification
4. Design and Development
   4.1 Hardware
      First Prototype
      Second Prototype
      Third Prototype
   4.2 Flight Controller Configuration
      Ground Control Configuration
      Serial Configuration
   4.3 Software
      Computer Vision
      Parallelization
5. Tuning and Calibration
   5.1 Accelerometer and Gyroscope
   5.2 Ultrasonic Sensor
   5.3 Camera
6. Future Direction
7. Conclusions and Outlook
8. Bibliography
9. Appendix

Abstract

Unmanned aerial vehicles (drones) represent an innovative technology with potential indoor applications in warehouse inspection, nuclear reactor and construction sites, decommissioning after industrial accidents, and law enforcement. Current approaches to stabilizing drones using the global positioning system (GPS) limit their application in these practical tasks, where the GPS signal may be lost. GPS-based stabilization is robust for position holding but has limited usefulness for moving a drone precisely within a small indoor area. The goal of this engineering project was to design, prototype, program, and test an inexpensive drone that demonstrates small-scale stabilization and precise motion without the aid of GPS or external computing. This was achieved through the use of efficient computer vision. The drone utilized two interconnected onboard computers. The first computer, a Raspberry Pi, ran Python code that polled a connected camera and ran computer vision algorithms to stabilize the drone. It was connected to the second computer, the flight controller, which controlled the drone's thrust and stabilization using data from the Raspberry Pi. Different drone processes were run concurrently by taking advantage of the Raspberry Pi's multiple cores. The drone was built on an open carbon fiber frame for structural integrity and ease of alteration. It was powered by a 3-cell, 2200 milliamp-hour lithium polymer battery, which easily powered all systems on the drone. The device positioned itself using optical flow, adapted for use on systems with low computational power. The drone can also deliver or move a small payload. This drone serves as a proof of concept that can be augmented to become part of a swarm and can benefit from artificial intelligence with additional processing power. Using this technology, drones could be utilized indoors to inspect, move components, deliver payloads, or conduct cooperative tasks that are unsafe for humans.
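As a concrete illustration of the flight-software structure summarized above, the following minimal Python sketch shows one way to split the vision and control work across processes on a Raspberry Pi. This is not the project's actual code: the camera index, the OpenCV feature-tracking parameters, and the placeholder control step are illustrative assumptions, and the sketch uses plain sparse Lucas-Kanade optical flow rather than the reduced-cost variant developed for the drone.

import multiprocessing as mp

import cv2
import numpy as np


def vision_worker(queue):
    """Poll the camera and push (dx, dy) pixel-drift estimates onto the queue."""
    cap = cv2.VideoCapture(0)  # assumed camera index
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("camera not available")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track a small set of corners to keep the per-frame cost low.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                      qualityLevel=0.3, minDistance=7)
        if pts is not None:
            new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good = status.flatten() == 1
            if good.any():
                flow = (new_pts[good] - pts[good]).reshape(-1, 2)
                dx, dy = np.median(flow, axis=0)  # robust average drift in pixels
                queue.put((float(dx), float(dy)))
        prev_gray = gray


def control_worker(queue):
    """Consume drift estimates on behalf of the flight controller."""
    while True:
        dx, dy = queue.get()
        # Placeholder: on the real drone the drift would be translated into
        # corrections and sent to the flight controller over a serial link.
        print(f"drift estimate: dx={dx:.1f} px, dy={dy:.1f} px")


if __name__ == "__main__":
    q = mp.Queue(maxsize=4)
    mp.Process(target=vision_worker, args=(q,), daemon=True).start()
    control_worker(q)

Keeping the camera polling and flow computation in a process of their own lets the vision loop occupy a separate core, so a slow frame never blocks the loop that feeds the flight controller; this is the motivation for the parallelization described under Section 4.3.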
1. Introduction

Drone (unmanned aerial vehicle, or UAV) popularity and use have skyrocketed in recent years. Drones are expected to have an annual impact of between $31 billion and $46 billion on the Gross Domestic Product of the United States by 2026 (Levick, 2018). This advance can largely be attributed to improvements in battery technology, which have increased drone flight times. Computer technology has evolved in parallel, producing smaller, more efficient, and more powerful computers, and the potential of drones with on-board computers has grown accordingly.

The idea of equipping drones with on-board computers is not new, but previous iterations had their flaws. Drones with on-board computers were typically large, which allowed engineers to compensate for the high power consumption of the on-board computer with a large battery or fuel tank (Allain, 2018). Smaller drones that were computationally independent were very uncommon until recent years. The more common solution for utilizing small drones, especially indoors, was to rely on external computing (Zhou, 2015). Drones were flown in highly structured environments and monitored by motion-capture cameras. The camera systems were connected to a powerful computer that determined the position of the drone in real time and maneuvered or stabilized it. While this technique was useful in extremely structured environments, the drones had limited range and were unable to function independently.

In more recent years, companies such as DJI (https://www.dji.com) and NVIDIA (https://www.nvidia.com) have pioneered putting computers on drones (Smolyanskiy, 2017). By utilizing highly efficient mobile graphics processors, these corporations have been able to equip drones with significant computational power. This can be especially useful for stabilizing drones without the help of GPS.

GPS is currently the primary means of stabilizing drones and is often used in conjunction with data from an accelerometer and gyroscope for added precision. GPS functionality is excellent in open spaces like fields, parks, and tarmacs, but quickly loses accuracy when the sky is obstructed. In most cases, GPS is estimated to have an accuracy of 0.715 m, though the GPS in smartphones can have an error of up to 4.9 m, according to gps.gov. This level of error is not suitable for stabilizing small drones, performing precision tasks, or operating within obstructed areas such as cities or forests. Despite these disadvantages, GPS has remained the preferred method of drone stabilization because of its low computational and power cost.

An alternative to GPS for drone stabilization is Light Detection and Ranging (LIDAR) technology. LIDAR is functionally similar to radar: it emits a pulse of light and measures the frequency and intensity of the returned light in order to collect detailed information about 3-dimensional