Long-range Outdoor Monocular Localization with Active Features for Ship Air Wake Measurement

by

Brandon J. Draper

B.S., Aerospace Engineering, University of Maryland, College Park (2016)

Submitted to the Department of Aeronautics and Astronautics in partial fulfillment of the requirements for the degree of Bachelor of Science in Aerospace Engineering at the MASSACHUSETTS INSTITUTE OF TECHNOLOGY

February 2019

© Massachusetts Institute of Technology 2019. All rights reserved.

Author: Signature redacted
Department of Aeronautics and Astronautics
October 1, 2019

Certified by: Signature redacted
Jonathan P. How
R. C. Maclaurin Professor of Aeronautics and Astronautics, MIT
Thesis Supervisor

Accepted by: Signature redacted
Sertac Karaman
Associate Professor of Aeronautics and Astronautics, MIT
Chair, Graduate Program Committee

Long-range Outdoor Monocular Localization with Active Features for Ship Air Wake Measurement

by Brandon J. Draper

Submitted to the Department of Aeronautics and Astronautics on October 1, 2019, in partial fulfillment of the requirements for the degree of Bachelor of Science in Aerospace Engineering

Abstract

Monocular pose estimation is a well-studied aspect of computer vision with a wide array of applications, including camera calibration, autonomous navigation, object pose tracking, augmented reality, and numerous other areas. However, some unexplored areas of camera pose estimation remain academically interesting. This thesis provides a detailed description of the system hardware and software that permits operation in one application area in particular: long-range, precise monocular pose estimation in feature-starved environments.
The novel approach to pose extraction uses special hardware, including active LED features and a bandpass-interference optical filter, to significantly simplify the image processing step of the Perspective-n-Point (PnP) problem. The PnP problem describes the calculation of pose from n extracted image points corresponding to n known 3D world points. The proposed application method operates in tandem with a tethered unmanned aerial vehicle (UAV) and mobile ground control station (GCS). The integrated localization and flight system serves as a platform for future U.S. Navy air flow research. Indoor tests at the RAVEN flight space of MIT's Aerospace Controls Lab and outdoor tests at a grass strip runway demonstrate the system's efficacy in providing an accurate and precise pose estimate of the UAV relative to the mobile GCS.

Thesis Supervisor: Jonathan P. How
Title: R. C. Maclaurin Professor of Aeronautics and Astronautics, MIT

Acknowledgments

I would first like to thank my advisor, Jon How, both for the opportunity to advance myself here at MIT and for providing insight and advice at each obstacle I faced in the project. Your guidance proved invaluable in the completion of the project and in my personal development. I have learned such a tremendous amount in such a short time. You have helped me to improve both technically and analytically, giving me the skills that provide access to much greater success in the future.

I would like to thank Ben Cameron, John Walthour, Will Fisher, and many others of Creare, LLC. for their technical expertise and roles in the completion of the project.

Thank you to all of my fellow ACL members for their support during my time at MIT. Justin Miller took me under his wing when I first joined the lab, helping me transition into grad school and offering support in the early stages of the project. Brett Lopez has provided extensive hardware support, and has propped me up whenever I felt particularly discouraged.
Michael Everett was always my go-to when it came to Linux and Robot Operating System (ROS) issues, and has played a vital role in the completion of the project. Lastly, my UROP, Ryan Scerbo, helped tremendously in the final stages of the project, implementing software updates and providing much-needed flight test support.

Thanks to my parents, who encourage me to set the bar high and give it my all. You're the reason for my ambition and dedication, and I owe you much of my success.

I want to thank my incredible partner, Dema Tzamaras, for her unending support, patience, and resilience through our long-distance relationship. Thank you for encouraging me when I faltered, being the rock that I relied on, and being a constant source of joy. I hope I bring as much happiness to your life as you do to mine.

Lastly, I would be remiss not to acknowledge the sources of funding for the project and my educational pursuits at MIT: Creare, LLC., the Small Business Technology Transfer (STTR) program, and the Office of Naval Research.

Contents

1 Introduction 15
1.1 Project Overview 15
1.2 Motivation 16
1.3 Related Work 17
1.3.1 Operational Environment Overview 18
1.3.2 Non-camera-based Localization Methods 18
1.3.3 Camera-based Localization Methods 20
1.4 Thesis Contributions 21

2 Monocular Camera Model and Perspective-n-Point Problem 23
2.1 Monocular Camera Model 23
2.1.1 Pinhole Camera Model 24
2.1.2 Real Lens Calibration 25
2.2 Perspective-n-Point Problem 27
2.2.1 PnP Description 27
2.2.2 The OpenCV Iterative PnP Method 29

3 Monocular Localization System 31
3.1 System Requirements 31
3.2 Monocular Localization Choice 32
3.3 Hardware 32
3.3.1 Active Features 33
3.3.2 Camera 34
3.3.3 Power Considerations 35
3.4 Software 36
3.4.1 Initial Centroid Extraction 37
3.4.2 First Centroid Filter 39
3.4.3 Second Centroid Filter 39
3.4.4 Assign Points to Rows 40
3.4.5 Solve PnP 41
3.4.6 Summary 41

4 Flight System 43
4.1 Hardware 43
4.1.1 GCS Hardware 43
4.1.2 UAV Hardware 45
4.1.3 Special Hardware Considerations 46
4.2 Software 48
4.2.1 GCS Software 49
4.2.2 UAV Software 50
4.2.3 Waypoint Commands in a Relative System 52
4.2.4 A Note on Communications 54
4.2.5 Summary 56

5 Testing and Results 57
5.1 Monocular Localization System Ground Tests 57
5.1.1 Vicon 58
5.1.2 Outdoor 66
5.2 Flight System Ground Tests 70
5.2.1 Software-in-the-Loop Simulations 70
5.2.2 Hardware Validation 70
5.3 Integrated System Tests 72
5.3.1 Initial Verification 72
5.3.2 Stationary GCS Flights 73
5.3.3 Mobile GCS Flights 77
5.3.4 Summary 80

6 Conclusion 81
6.1 Summary 81
6.2 Future Work 82

A Equipment 83

List of Figures

1-1 Project Overview 16
2-1 Pinhole Camera Model 25
2-2 Camera Distortion Examples 26
2-3 Camera Calibration Process 27
2-4 P3P Problem Diagram 28
3-1 Sunlight Energy Spectrum at Sea Level 33
3-2 Mobile Ground Station Beacon Mounting 34
3-3 Camera Setting Comparison 35
3-4 Optical Filter Transmission Spectrum 36
3-5 Pose Extraction Block Diagram 37
3-6 Truncated Search Space and Feature Extraction Process 38
4-1 Hardware Interconnect Diagram 44
4-2 Ground Control Station Hardware 45
4-3 Assembled UAV 45
4-4 Anti-vibration Mount and Plot 47
4-5 Electromagnetic Interference Shielding 48
4-6 Tether Tensioner 49
4-7 Ground Control Station Console 50
4-8 ROS Architecture for Flight 51
4-9 Yaw Command Generation Example 52
4-10 Mission Transform Example 53
5-1 1:35 Scale Model 59
5-2 Vicon Ground Test 1: Position 61
5-3 Vicon Ground Test 1: Orientation 62
5-4 Vicon Ground Test 2: Position 63
5-5 Vicon Ground Test 2: Orientation 64
5-6 Noise Filter Example 65
5-7 Briggs Field Beacon Installation 67
5-8 Outdoor Validation: Position 68
5-9 Outdoor Validation:
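As a brief illustration of the abstract's statement of the PnP problem (calculating pose from n image points matching n known 3D world points), the following Python sketch shows the forward pinhole-camera model that a PnP solver inverts: given intrinsics K and a pose (R, t), each known 3D feature projects to an image point. The intrinsics, pose, and coplanar feature layout below are placeholder values chosen for illustration, not the thesis hardware.

```python
import numpy as np

# Illustrative camera intrinsics (pixels): fx, fy, principal point (cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Illustrative pose: camera axes aligned with the world,
# world origin 5 m in front of the camera.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

def project(P):
    """Project a 3D world point P into pixel coordinates via the pinhole model."""
    pc = R @ P + t            # world frame -> camera frame
    uvw = K @ pc              # camera frame -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]   # perspective divide

# Four coplanar LED-like features (meters, world frame), as in a beacon array.
world_pts = [np.array([x, y, 0.0]) for x in (-0.5, 0.5) for y in (-0.5, 0.5)]
image_pts = [project(P) for P in world_pts]
for P, p in zip(world_pts, image_pts):
    print(P, "->", p)
```

A PnP solver runs this mapping in reverse: from the n (world point, image point) pairs it recovers R and t, which is the UAV-to-GCS relative pose sought in Chapter 3.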