Vehicle Detection For Cyclist Safety

Item Type text; Electronic Thesis

Authors Purdy, Ruben

Publisher The University of Arizona.

Rights Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.


Item License http://rightsstatements.org/vocab/InC/1.0/

Link to Item http://hdl.handle.net/10150/632650

VEHICLE DETECTION FOR CYCLIST SAFETY By RUBEN PURDY

Team 18037 Ruben Purdy, Francisco Cervantes, Katherine Cheetham, Eric Cornforth, Thomas Sawyer, Edmund Sheah ______

A Thesis Submitted to The Honors College In Partial Fulfillment of the Bachelor's degree With Honors in Electrical and Computer Engineering

THE UNIVERSITY OF ARIZONA MAY 2019

Approved by:

______Gary Redford Department of Engineering

Abstract This document contains the final results of Senior Design Project 18037: Vehicle Detection for Cyclist Safety. In the following sections, we present a deep neural network enabled vehicle detection system which can be run in real time on self-contained, portable computer hardware, which in turn can be attached to a bicycle. We outline the architecture of our system, and then provide technical details of both the software and mechanical design in a technical data package. Furthermore, we detail the procedure and results of the analyses and tests performed on the system. Lastly, we present our final budget and reflect on the lessons learned during this project.

Contents

1 Scope 6
  1.1 Overview 6
  1.2 Functional Requirements 6
  1.3 System Requirements 6

2 System Block Diagram 8
  2.1 System Architecture and Requirements Traceability 9

3 Technical Design Package 11
  3.1 Detailed Part and Assembly Drawings 11
  3.2 Schematics 12
    3.2.1 Wiring Diagram 12
    3.2.2 Mechanical Assembly 13
    3.2.3 Holes 14
  3.3 Software Design Document 18
    3.3.1 Scope 18
    3.3.2 Referenced Documents 18
    3.3.3 CSCI Design 18
    3.3.4 CSCI Architecture 19
    3.3.5 CSCI Detailed Designs 20
    3.3.6 Camera Driver 20
    3.3.7 Initialization Script 20
    3.3.8 Preprocessor 21
    3.3.9 Sun Interference Detection 21
    3.3.10 Deep Neural Network 21
    3.3.11 System Calls 22

4 Acceptance Test Procedures 23
  4.1 Jitter and Distance 23
    4.1.1 Introduction 23
    4.1.2 Referenced Documents 23
    4.1.3 Required Test Equipment 23
    4.1.4 Test Procedures 23
    4.1.5 Support Requirements 24
    4.1.6 Results 24
  4.2 Smear and Distance 24
    4.2.1 Introduction 24
    4.2.2 Referenced Documents 24
    4.2.3 Required Test Equipment 24
    4.2.4 Test Procedures 25
    4.2.5 Results 25
  4.3 Heat 25
    4.3.1 Introduction 25
    4.3.2 Referenced Documents 25
    4.3.3 Required Test Equipment 25
    4.3.4 Test Procedure 26
  4.4 Water 26
    4.4.1 Introduction 26
    4.4.2 Referenced Documents 26
    4.4.3 Required Test Equipment 26
    4.4.4 Test Procedure 26
  4.5 Battery 27
    4.5.1 Introduction 27
    4.5.2 Referenced Documents 27
    4.5.3 Required Test Equipment 27
    4.5.4 Test Procedure 27

5 Models and Analyses 28
  5.1 Sun Interference 28
    5.1.1 Introduction 28
    5.1.2 Reference Documents 28
    5.1.3 Required Tools 28
    5.1.4 Procedure to set up Sun Interference Model 28
    5.1.5 Results 29
  5.2 Heat 29
    5.2.1 Introduction 29
    5.2.2 Reference Documents 29
    5.2.3 Required Tools 29
    5.2.4 Procedure to set up SolidWorks 29
    5.2.5 Results 30
  5.3 Camera Angle 30
    5.3.1 Introduction 30
    5.3.2 Referenced Documents 30
    5.3.3 Required Tools 30
    5.3.4 Procedure 30
    5.3.5 Results 32
  5.4 Battery 32
    5.4.1 Introduction 32
    5.4.2 Reference Documents 33
    5.4.3 Required Tools 33
    5.4.4 Analysis Procedure 33
    5.4.5 Analysis Results 33
  5.5 Network Speed 34
    5.5.1 Introduction 34
    5.5.2 Referenced Documents 34
    5.5.3 Required Tools 34
    5.5.4 Procedure 35
    5.5.5 Results 35

6 Acceptance Test Results 36
  6.1 Jitter and Distance 36
  6.2 Smear and Distance 37
  6.3 Heat 38
  6.4 Water 39
  6.5 Battery 40

7 Final Budget 41
  7.1 Overall Budget 41
  7.2 Budget for Final Product 41

8 Lessons Learned 43
  8.1 Plan Testing Data 43
  8.2 Value Feedback 43
  8.3 Various Testing and Analysis 43
  8.4 Over Engineering a Design 43
  8.5 Dampening Mechanical Noise Before Using Optical Image Stabilization 43

A Appendix 46
  A.1 Functional Requirements 46
  A.2 System Requirements 46
  A.3 System Requirements Verification Matrix 47
  A.4 Part Pictures 47
  A.5 System Part Diagrams and Specifications 52
    A.5.1 Bike Specifications 52
    A.5.2 Bike Rack Installation Specifications 53
    A.5.3 CA378-AOIS Camera Specifications 54
    A.5.4 System Enclosure Measurements 55
    A.5.5 PowerAdd ChargerCenter 2/Samsung Li-ion 18650-25R Specifications 55
  A.6 JetPack 3.3 Installation Instructions 73
  A.7 Camera Driver Installation Instructions 97
    A.7.1 Step 1: Setting sudo permissions 97
    A.7.2 Step 2: Installing the driver 97
  A.8 Object Detection API and OpenCV Installation Instructions 98
  A.9 Jetson TX2 Installation Instructions 99
  A.10 TensorRT Instructions 99
  A.11 Network Speed Model Code 100
  A.12 Test Data Sheets 102
  A.13 Camera Angle Calculations 106
  A.14 Item Links 107
    A.14.1 Bike & Bike Rack 107
    A.14.2 Developer Board 107
    A.14.3 Camera 107
    A.14.4 Battery 107
    A.14.5 Enclosure Parts 107
    A.14.6 Tools & Test Equipment 108
    A.14.7 Other 108
  A.15 Camera Data Sheets 108
  A.16 Risk Analysis and Mitigation Plan 114

1 Scope

1.1 Overview In this project, we designed and implemented a vehicle detection system to promote cyclist safety. The focus of this project was on the deep neural network (DNN) algorithm used as the vehicle detection engine, as well as an outline of the boundary conditions that constrain the performance of the network in real world situations. As such, issues related to user experience, such as alerting cyclists about vehicles, are considered out of scope. Instead, we present our process for designing a DNN system that is portable and energy efficient, as well as our mechanical design which helps mitigate the jitter associated with a system mounted on a bicycle. The performance of our system is described qualitatively by the accuracy of our network, and quantitatively by the speed, distance, and resistance to jitter and smear that our system shows. Due to the complexity of obtaining consistent and general accuracy results when using a DNN, quantitative analysis of the accuracy of our network is outside of the scope of this project. Additionally, resistance to sun interference is measured qualitatively through observation of the network. Simple quantitative analysis of sun interference is provided through a post-processing, but a more thorough analysis is outside of the scope of this project.

1.2 Functional Requirements Success in this project is defined by the following functional requirements:

1. The system shall use a deep neural network.
2. The system shall attach to a bicycle or bicyclist.
3. The system shall detect vehicles using a camera.
4. The system shall be operable while ambient light is available.
5. The system shall not impede the mobility of the bicycle.

1.3 System Requirements The functional requirements of our system have been broken into the following system requirements:

1. Deep Neural Network Requirements
   1.1 Network Size: The size of the neural network shall be less than or equal to 4 GB.
   1.2 Network Speed: The speed of the neural network shall be at least 8 frames per second.
2. Attachment Requirements
   2.1 Mechanical Interface: The system shall be mounted to the bike.
   2.2 Hardware Enclosure: The system shall contain an enclosure to protect the hardware.
   2.3 Dynamic Environment: The system shall be robust to the variations typical of Tucson roads.
3. Camera Requirements
   3.1 Camera Resolution: The camera shall have a resolution of at least 1920 x 1080.
   3.2 Camera Speed: The camera shall have the necessary frames per second required by the neural network (at least 8 frames per second).
   3.3 Camera FOV: The camera shall have a field of view of at least 60°.
   3.4 Depth of Field: The camera shall be able to capture vehicles at least 20 feet away.
   3.5 Smear: The system shall have no more than one pixel of smear.
   3.6 Jitter: The system shall be robust to jitter.
   3.7 Camera Angle: The camera shall be angled to detect vehicles to the rear-left of the bicycle.
4. Operating Condition Requirements
   4.1 Battery Life: The battery shall last for a minimum of 6 hours.
   4.2 Sun Interference: The system shall be resistant to interference from the sun.
   4.3 Operating Temperature: The system will be able to function up to 110°F.
   4.4 Robustness to Rain: The system shall be operable under light rain (less than 0.1 inches of rain per hour).
5. Mobility Requirements
   5.1 Size: The system enclosure shall be no larger than 22x12x10 inches.
   5.2 Weight: The system shall weigh less than 30 lbs.

2 System Block Diagram

Figure 1 shows the block diagram of our system. The bulk of the system is placed inside the Altelix weatherproof enclosure. The developer board, NVIDIA's Jetson TX2, is what runs the DNN algorithm. It is powered by the Poweradd ChargeCenter II battery through a 19 Volt / 4 Amp connection. Data is captured using the CA378-AOIS camera, which has a custom designed mount and is enclosed in a Supreme Tech Acrylic Dome. The camera is mounted onto the top of the Altelix enclosure. The camera is powered from the Jetson and communicates using the MIPI camera serial interface specification. Data is saved onto an SD card attached to the Jetson. The Altelix enclosure is mounted onto a bike rack using vibration isolators to mitigate jitter.

Figure 1: System Block Diagram.

2.1 System Architecture and Requirements Traceability

Figure 2: System Architecture.

Figure 2 shows three levels of the system architecture for the vehicle detection system. Level 1 is the overall detection system as a whole. Level 2 shows the system’s subsystems: the developer board, the camera, the enclosure, and the battery. Level 3 are the sub-assemblies within each subsystem. The developer board subsystem is made up of the neural network, storage, and a preprocessing unit. The enclosure subsystem has a mount. Finally, the battery subsystem has one sub-assembly: a DC connect cable.

Figure 3: Requirement Flow Down Matrix

The matrix in Figure 3 shows how each system requirement will flow down into the sub-assemblies. The light blue highlighted boxes show at what level each requirement will be verified. Each requirement can have one of three types of flow down: direct flow, derived flow, and allocated flow. A direct flow means the requirement flows down directly to the sub-assembly as it is. For example, requirement 1.1 Network Size flows directly to the developer board, but does not flow to any of the other

sub-assemblies. A derived flow is calculated to either meet a requirement or establish a safety margin. This project does not have any derived requirements in the flow down matrix. Finally, an allocated flow gets divided between all relevant sub-assemblies. In this project, requirement 5.2 Weight is allocated between all of the sub-assemblies because they will all be physical parts of the system, which means they will add to the system’s weight. The enclosure sub-assembly is allocated the majority of the weight budget because it is the heaviest piece of the system while the camera is allocated the least amount of the weight budget, because it will be the smallest part of the system. The system requirements verification matrix (SRVM), which defines how each requirement will be verified, can be found in Appendix A.3.

3 Technical Design Package

3.1 Detailed Part and Assembly Drawings The items that are included in our enclosure system are:

1. CA378-AOIS
2. Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear
3. Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box
4. Poweradd ChargerCenter 2
5. TX2
6. Custom made battery brackets
7. Custom made dome wedge
8. ACC Conduit and Pipe Hangers
9. 2.25 in. x 1 in. Vibration Isolator

Figure 4: The complete assembly of the enclosure.

Item No. | Name of Part | Description | Vendor | QTY
1 | Enclosure | Altelix Vented FRP Fiberglass Weatherproof NEMA Box | Amazon | 1
2 | Isolator | Vibration Isolator | Home Depot | 4
3 | Jetson Board | Jetson TX2 | Amazon | 1
4 | Battery | POWERADD Charger 2 | Amazon | 1
5 | Dome Wedge | Custom 3-D printed wedge made for the Acrylic Dome | Custom | 1
6 | Dome | Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear | Amazon | 1
7 | Camera | CA378-AOIS | Framos | 1
8 | Clamp | ACC Conduit and Pipe Hangers | Home Depot | 4
9 | Battery Brackets | Custom made brackets for the battery | Custom | 2

Pictures of the remaining parts are provided in the appendix for reference.

3.2 Schematics

3.2.1 Wiring Diagram

Figure 5: A Pictorial Wiring Diagram of Vehicle Detection System

Figure 6: A Pictorial Wiring Diagram of the CA378-AOIS camera to the Jetson TX2 developer board

3.2.2 Mechanical Assembly

Here are the instructions for assembling the system in its entirety. By the end, the system shall look like Figure 11.

1. Assemble the Roadmaster 26” Granite Peak Men’s Mountain Bike. 2. Assemble the Lumintrail Bicycle Commuter Carrier Rear Seatpost Frame Mounted Bike Cargo Rack. 3. Attach the Lumintrail Bicycle Commuter Carrier Rear Seatpost Frame Mounted Bike Cargo Rack to the Roadmaster 26” Granite Peak Men’s Mountain Bike.

4. Collect all the materials needed for the enclosure. (The list is in Section 3.1.) 5. Drill all holes into the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box as stated in Section 3.2.3. 6. Take the ACC Conduit and Pipe Hangers and put a 3/8" screw into the top of each. (Do this x4.)

7. Place a 2.25 in. x 1 in. Vibration Isolator on top of that screw. (Do this x4) 8. Now screw these into the bottom of the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box. 9. Open the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box.

10. Assemble the Jetson Board TX2 onto Puget Systems Acrylic Enclosure for NVIDIA Jetson TX2. (The installation instructions can be found in Section A.9)

11. Screw the Puget Systems Acrylic Enclosure for NVIDIA Jetson TX2 into the desired holes inside the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box. 12. Screw the 2-Hole 90° Angle Bracket into the slots for the Poweradd ChargerCenter 2. 13. Place the Poweradd ChargerCenter 2 inside the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box. 14. When placing the AC Infinity MULTIFAN S1 on the roof of the enclosure, make sure the side that is blowing is facing down towards the Poweradd ChargerCenter 2 and the Acrylic Enclosure for NVIDIA Jetson TX2. 15. Use a sticky adhesive and place it on the four corners of the AC Infinity MULTIFAN S1.

16. Place the fan directly over the forward vent. (This allows for more air to flow through the fan). 17. Cut the 1/2 x 6 ft. Non-Metallic Liquidtight Whip to the desired length. 18. Secure the 1/2 x 6 ft. Non-Metallic Liquidtight Whip to the hole on the side of the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box.

19. Seal the 1/2 x 6 ft. Non-Metallic Liquidtight Whip with Flex Seal. 20. Close the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box. 21. Feed the 1/2 x 6 ft. Non-Metallic Liquidtight Whip up to the rear vented box.

22. Feed the wire connecting the CA378-AOIS and the Jetson board through the 1/2 x 6 ft. Non-Metallic Liquidtight Whip. 23. Glue the CA378-AOIS to the 2-Hole 90° Angle Bracket - Silver Galvanized. 24. Screw the 2-Hole 90° Angle Bracket - Silver Galvanized into the top of the rear vent in the center hole.

25. Plug the camera wire into the CA378-AOIS. 26. Place the Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear over the CA378-AOIS. 27. Screw the Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear into the rear vent of the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box.

28. Attach the whole system to the Lumintrail Bicycle Commuter Carrier Rear Seatpost Frame Mounted Bike Cargo Rack. The system is now ready for testing.

3.2.3 Holes

3.2.3.1 Bottom of Enclosure

These are the holes that the ACC Conduit and Pipe Hangers will be screwed into. These holes will be drilled using a 1/4" drill bit.

1. Hole 1 measures at 3 1/4” from the left side and 2 1/4” from the bottom. 2. Hole 2 measures at 3 1/4” from the right side and 2 1/4” from the bottom. 3. Hole 3 measures at 3 1/4” from the left side and 3 1/2” from the top.

4. Hole 4 measures at 3 1/4” from the right side and 3 1/2” from the top.

Figure 7: Measured holes for the bottom of the enclosure.

3.2.3.2 Top of Enclosure These are the holes that will be drilled for the CA378-AOIS and the Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear. These holes will also be drilled using a 1/4” drill bit. These will be located on the rear vent roof.

1. The hole for the CA378-AOIS is 2 1/2" from the side and 2 inches from the bottom. 2. For the measurement of the Acrylic Dome/Plastic Hemisphere - Clear holes, place the dome on the top with its edge against the edge of the vent. Mark where the holes are and drill them from there. (Because it is a circular edge, the holes can be lined up in many different places; this will not affect the outcome of the project.)

Figure 8: Measured holes for the top of the enclosure.

3.2.3.3 Inside the Enclosure

These holes are to attach the Puget Systems Acrylic Enclosure for NVIDIA Jetson TX2. These holes will be drilled with a 9/64" drill bit. These measurements are also taken from the inside wall of the Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box.

1. Hole 1 measures at 9.15” from the top side and .35” from the left side. 2. Hole 2 measures at 9.15” from the top side and 5.60” from the left side.

3. Hole 3 measures at 1.75” from the top side and .35” from the left side. 4. Hole 4 measures at 1.75” from the top side and 5.60” from the left side.

Figure 9: Measured holes for the Jetson Board inside of the enclosure.

These holes will be for the Custom Battery Bracket that will hold the Poweradd ChargerCenter 2 in place.

1. Hole 1 measures at 4” from the right side and 3.50” from the top side.

2. Hole 2 measures at 4” from the right side and 5” from the bottom side. 3. Hole 3 on side wall 4” from the bottom and 3.50” from the right side. 4. Hole 4 on side wall 4” from the bottom and 5” from the left side.

Figure 10: Measured holes for the battery bracket on the bottom of the enclosure.

Figure 11: Measured holes for the battery bracket on the side of the enclosure.

3.3 Software Design Document

3.3.1 Scope

The deep neural network and the software which supports it are an integral part of our system. As such, we will present a detailed description of our software system, beginning with a high level overview in Section 3.3.3, then moving to an examination of the interfaces between components in Section 3.3.4, and ending with an in-depth explanation of each component in Section 3.3.5.

3.3.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

3.3.3 CSCI Design

Figure 12: The top level software diagram for our system.

The overall architecture of the software for our vehicle detection system is given in Figure 12. The majority of the software is the deep neural network. Additionally, some helper code is present to preprocess the data, control the camera, and to correctly store the generated data. At a high level, the flow of the software is as follows:

1. The initialization routine sets the stabilization properties of the camera.

2. The image preprocessor receives data from the camera and prepares it for the DNN. 3. The DNN receives data from the image preprocessor and performs inference. 4. The storage routine stores the output of the preprocessor and DNN. 5. (Optional) Post-processing is run to detect potential sun interference.

Step 1 is performed only once when the system is initialized. Steps 2-4 repeat until the system is shut down.

3.3.4 CSCI Architecture

Figure 13: The interfaces of our software system.

Figure 13 gives a more detailed view into the organization of the software system with a focus on the interfaces between components. The components which are shaded in blue are external components. The rest will be deployed onto our developer board. The components shaded in yellow are third party libraries which are publicly available. The gray components are the components that will be implemented specifically for this system.

3.3.5 CSCI Detailed Designs

Component | Description | Planned Resource | Library
Camera Driver | Receives control information and sends it to the camera. Fetches pixel data from the camera. | 19.4 MB | Provided by camera manufacturer
Initialization Script | Sends initialization information to the camera driver in order to set stabilization properties. | 1 KB* | None
OpenCV | Receives data from the camera driver and abstracts it for use in the preprocessor. | 171.1 MB | Publicly available
Preprocessor | Formats data received by OpenCV so that it can be processed by the neural network. | 100 KB* | Numpy, OpenCV
Deep Neural Network | Receives data from the preprocessor and detects vehicles. | 200 MB* | TensorFlow, Numpy
Sun Interference Detector | Receives data from the DNN and marks potential sun interference. | 200 MB* | Numpy
System Calls | Stores the data from the preprocessor and neural network. | 1 KB* | None

Table 1: The computer software components to be used in this system. Resource numbers marked with * are estimates.

Table 1 gives a description of each internal component shown in Figure 13. Note that the planned resources for all the components added together is well under the 8 GB available on the Jetson. An explanation of the implementation and integration of each component is now given. This explanation assumes that the Jetson is flashed with JetPack 3.3. Instructions for installing this version of JetPack are provided in Appendix A.6.

3.3.6 Camera Driver

The camera driver is publicly available and written specifically for the Jetson TX2 [9]. However, the instructions given online must be modified slightly in order to install the driver on JetPack 3.3. The modified instructions are given in Appendix A.7. Once the driver is installed, communication to and from the camera will be possible.

3.3.7 Initialization Script The initialization script has the simple task of initializing the optical stabilization for the camera. The initialization script will be installed by following the instructions in Appendix A.7 as “demo.sh”. In the script, lens shading correction must be turned on, and the stabilization mode must be set to “Exposure/Shake Eval.”

3.3.7.1 OpenCV

OpenCV is a free computer vision library which will be used to parse the camera data into an object easily usable by the preprocessor [2]. Installing OpenCV on the Jetson requires some modification of the general installation instructions. A specific guide can be found in Appendix A.8. Once OpenCV is installed, loading the camera stream is as simple as the following Python function:

import cv2

# Wrapped in a helper function (the name is our label) so the snippet is runnable as written.
def open_camera(width, height):
    # GStreamer pipeline: capture 1920x1080 from the on-board CSI camera and
    # convert to BGR frames of the requested size for OpenCV.
    gst_str = ('nvcamerasrc ! '
               'video/x-raw(memory:NVMM), '
               'width=(int)1920, height=(int)1080, '
               'format=(string)I420, '
               'framerate=(fraction)60/1 ! '
               'nvvidconv ! '
               'video/x-raw, width=(int){}, height=(int){}, '
               'format=(string)BGRx ! '
               'videoconvert ! appsink').format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

This will return an OpenCV video capture object with a given width and height. For testing, the network was run at size 1920x1080, but decreasing the resolution will increase the speed of the network.
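For reference, a minimal usage sketch is given below; it assumes the open_camera helper shown above, and the reduced 960x540 size is an illustrative choice rather than a fixed design value.

# Assumes the open_camera() helper defined above.
cap = open_camera(960, 540)          # smaller frames trade resolution for speed
while cap.isOpened():
    ok, frame = cap.read()           # frame is a BGR numpy array
    if not ok:
        break
    # ... hand the frame to the preprocessor and network here ...
cap.release()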

3.3.8 Preprocessor

The core functionality of the preprocessor is to prepare the video data so that it can be best analyzed by the neural network. At the simplest level, this involves cropping the image to remove the top portion of the image (which will not contain any cars). This can be done by indexing into the Numpy array returned by OpenCV. For testing, the input was cropped in half, but the cropping can be changed based on need. The more the image is cropped, the faster the network will run.
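As an illustration, a minimal cropping sketch is given below; the crop fraction is a tunable assumption, not a value fixed by our design.

import numpy as np

def crop_top(frame: np.ndarray, crop_fraction: float = 0.5) -> np.ndarray:
    """Remove the top portion of the frame, which should not contain vehicles."""
    top = int(frame.shape[0] * crop_fraction)   # 0.5 drops the top half, as used in testing
    return frame[top:, :, :]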

3.3.9 Sun Interference Detection

Lastly, the sun interference detection is run as a post-processing phase using Python and the OpenCV library. Sun interference is detected by monitoring histogram values. The program first takes in the pixel data from the image and then creates a histogram with 256 bins, where bin 0 corresponds to the darkest (black) pixels and bin 255 corresponds to the brightest (white) pixels; moving from bin 0 up to bin 255 corresponds to increasing pixel brightness. The histogram values of the incoming images are checked for prolonged spikes in the bins associated with sun interference. The bins to be checked should be known beforehand by observing the effect of sun interference on the histogram values in recorded data where sun interference is known to occur. Because this may vary at different times of the day, the bins to check are passed in as a parameter when sun interference detection is started. Histogram calculation can be performed using OpenCV's calcHist function. The neural network should be robust to a fair amount of this noise, so the post-processing stage must also have some notion of network performance in order to accurately detect interference. Because of this, the confidence values of the neural network are also given as an input to the sun interference detection program. If the network experiences a sudden drop in confidence values which can be correlated to a spike in a monitored histogram bin, then the sun interference detection program will mark that image as having sun interference. The thresholds for the confidence values also vary from iteration to iteration of the network as well as at different times of the day, so these are parameters that must be updated as more information is acquired during the morning, noon, and evening.
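A minimal sketch of this post-processing check is shown below. The function and parameter names are illustrative, and the thresholds are placeholders that would be tuned from observed data, as described above.

import cv2
import numpy as np

def flag_sun_interference(frames, confidences, monitored_bins,
                          spike_threshold=0.25, confidence_drop=0.3):
    """Flag frames where a spike in the monitored histogram bins coincides with a
    sudden drop in network confidence (thresholds are illustrative placeholders)."""
    flags, prev_conf = [], None
    for frame, conf in zip(frames, confidences):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
        hist /= hist.sum()                                  # fraction of pixels per bin
        spike = hist[monitored_bins].sum() > spike_threshold
        dropped = prev_conf is not None and (prev_conf - conf) > confidence_drop
        flags.append(bool(spike and dropped))
        prev_conf = conf
    return flags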

3.3.10 Deep Neural Network

The deep neural network is the centerpiece of our software system. Its architecture is based on MobileNetV2-SSDLite [8]. MobileNetV2 is a lightweight residual-based network which is specifically designed for mobile applications. A modification of the Single Shot Detector (SSD) object detection architecture, SSDLite, was created to enable object detection using MobileNet. A pre-trained version of this network, trained on the

COCO dataset [6], is available through the TensorFlow Model Detection Zoo at http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz. Our network is modified so that the output layer only has 4 classes (car, truck, bus, and train). Because our network is tuned for vehicle detection, the rest of the classes are irrelevant. However, the MobileNet portion of the network is held fixed at the pre-trained weights mentioned above; fine tuning is performed only on the object detection portion of the network. Because COCO already contains labels for cars, trucks, buses, and trains, the network has already learned representations of these classes. Training was performed using the TensorFlow Object Detection API [5]. The network was trained on a subsection of the COCO dataset (only the images with the labels we care about) and the Udacity self-driving-car annotated datasets [4]. The procedures for using the TensorFlow Object Detection API are available at https://github.com/tensorflow/models/tree/master/research/object_detection. Once the procedure for reformatting and fine tuning is understood, development is fairly simple. After training, optimization using NVIDIA's TensorRT tool [7] was performed to provide a boost in model throughput. Instructions on the use of TensorRT are provided in Appendix A.10.
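For orientation, a minimal TensorFlow 1.x inference sketch against a frozen detection graph is shown below. The file name is illustrative, and the tensor names follow the general TensorFlow Object Detection API convention rather than anything specific to our exported model.

import numpy as np
import tensorflow as tf

GRAPH_PATH = 'ssdlite_mobilenet_v2_vehicles.pb'    # illustrative path to the frozen graph

graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(GRAPH_PATH, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 300, 300, 3), dtype=np.uint8)    # stand-in for a preprocessed frame
    boxes, scores, classes = sess.run(
        ['detection_boxes:0', 'detection_scores:0', 'detection_classes:0'],
        feed_dict={'image_tensor:0': image})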

3.3.11 System Calls

OpenCV's VideoWriter class was used to save the video to the SD card. The use of the class is straightforward and documented in our code.
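For reference, a minimal sketch of saving frames with OpenCV's VideoWriter is shown below; the codec, frame rate, and output path are illustrative choices.

import cv2

fourcc = cv2.VideoWriter_fourcc(*'MJPG')                          # illustrative codec choice
writer = cv2.VideoWriter('output.avi', fourcc, 8.0, (1920, 1080))

# Inside the capture loop, each frame must match the declared size:
# writer.write(frame)

writer.release()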

4 Acceptance Test Procedures

In this section, we outline the procedures for our acceptance tests.

4.1 Jitter and Distance

4.1.1 Introduction

This procedure outlines the acceptance tests to be performed on the Vehicle Detection System using a bike. This test verifies that the system will be able to detect a car at a minimum of 20 feet, extending out to 70 feet, away from the cyclist. Additionally, this test will also outline how far away the Vehicle Detection System will be able to detect a car with variable and increasing amounts of added noise in the form of jitter. The modulation transfer function used to simulate jitter is given in Section 4.1.4.

4.1.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

4.1.3 Required Test Equipment 1. Vehicle Detection System 2. Python 3. Measuring Tape

4.1.4 Test Procedures

For this test there are two phases: data collection and processing. Data of a car will be collected at a range of different distances, and the raw data will be processed to simulate varying amounts of jitter. The processing step uses a Python script that applies the modulation transfer function of jitter to the original image (a sketch of this step is given after the procedure below). This simulates the blur that occurs when the system experiences jitter on the road. The modulation transfer function associated with jitter is as follows:

MTF_{jitter}(k) = e^{-\frac{1}{2} k^{2} \sigma^{2}}    (1)

where k is the spatial frequency and σ is the amplitude of jitter. We can increase the amount of jitter in the image by increasing the value of σ, the jitter amplitude.

Data Collection Phase
1. Set up the car as outlined in the support requirements, and set up and turn on the vehicle detection system. 2. Set up markers to indicate distances of 20, 30, 40, 50, 60, and 70 feet away from the car. Use the measuring tape to obtain these distances.

3. Begin capturing video on the system using the nvcamerasrc functionality. 4. Capture video at the pre-marked distances. 5. Stop capturing video, and extract screenshots from the captured video at each of the outlined distances to be sent for processing.

Processing Phase
1. Make a file folder named "jitter". 2. Put all of the extracted images into the newly created file folder.

3. Put that folder inside the same directory as the Python code named "jitter.py". 4. Open up a terminal or the equivalent on your computer. 5. In the terminal, navigate to the directory where you have put the jitter code. 6. Once in the correct directory, type: "python3 jitter.py".

7. Now you should have a folder with the jittered images called "jittered images". 8. Run this sequence of images through the neural network. 9. Observe the output.
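A minimal Python sketch of the jitter processing step is given below. It applies the MTF of Equation (1) in the frequency domain; our actual jitter.py may differ in its details, and the frequency units must be chosen consistently with the σ values used.

import cv2
import numpy as np

def apply_jitter_mtf(image_bgr, sigma):
    """Blur an image by multiplying its spectrum with exp(-0.5 * k^2 * sigma^2),
    the jitter MTF from Equation (1); units of k and sigma must be consistent."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float64)
    ky = np.fft.fftfreq(gray.shape[0])[:, None]
    kx = np.fft.fftfreq(gray.shape[1])[None, :]
    mtf = np.exp(-0.5 * (kx ** 2 + ky ** 2) * sigma ** 2)
    blurred = np.fft.ifft2(np.fft.fft2(gray) * mtf).real
    return np.clip(blurred, 0, 255).astype(np.uint8)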

4.1.5 Support Requirements All tests should be performed during daylight hours with a stationary car and enough room to have an unobstructed view of the car from 70 feet away.

4.1.6 Results To see the recorded results, please see Section 6.1. Our Vehicle Detection System was able to detect cars quite well with all the amounts of jitter we tested out to 30 feet. In the 40 foot range, the system was not able to identify the vehicle at all with any amount of jitter. At 70 feet, with extreme amounts of jitter, our system was not able to detect the vehicle at all.

4.2 Smear and Distance 4.2.1 Introduction This procedure outlines the acceptance tests to be performed on the Vehicle Detection System while using a bike. For this test, data of a car will be collected at a range of different distances and the raw data will be processed to simulate varying amounts of smear. The processing step involves inputting the original image into our MATLAB code and then convolving the object of interest within the image with the modulation transfer function of smear:

MTF_{smear}(u) = \frac{\sin(\pi \alpha u)}{\pi \alpha u}    (2)

where u is the spatial frequency and α is the amplitude of smear in pixels. To simulate smear, we utilized a set of MATLAB functions called Image Deblurring, in which the modulation transfer function above is used to smear the images. In the MATLAB code, g = Hf is the equation we use, where g is the blurred image, H is the modulation transfer function of smear, and f is the original image. We chose to use this MATLAB code for simulating smear because of the similarities it has with the smear we had experienced in some of our testing data sets.
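The MATLAB code itself is not reproduced here; as a rough Python equivalent (an illustrative sketch of the g = Hf operation, not our MATLAB implementation), the smear MTF can be applied in the frequency domain:

import numpy as np

def apply_smear_mtf(gray, alpha):
    """Apply the smear MTF sin(pi*alpha*u)/(pi*alpha*u) along the horizontal direction;
    note that np.sinc(x) computes sin(pi*x)/(pi*x)."""
    u = np.fft.fftfreq(gray.shape[1])[None, :]    # horizontal spatial frequencies
    mtf = np.sinc(alpha * u)
    return np.fft.ifft2(np.fft.fft2(gray) * mtf).real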

4.2.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

4.2.3 Required Test Equipment 1. Vehicle Detection System 2. Matlab

3. Matlab Image Processing Toolkit

4.2.4 Test Procedures

This test uses the same raw data captured from the jitter and distance test outlined in Section 4.1, but with different processing protocols.

Processing Phase
1. Make a file folder named "Smear".

2. Put all of the extracted images into the newly created file folder. 3. Put that file inside the same directory as the MATLAB code. 4. Inside the Matlab Graphical User Interface run the smear code.

5. Now you should have a folder with the smeared images called: ”smearedImages” 6. Run this sequence of images through the neural network. 7. Observe the output.

4.2.5 Results To see the recorded results, please see Section 6.2. At 20 feet, our Vehicle Detection System failed with only one of the amounts of smear we tested. In the 30 foot range, the system was able to detect cars quite well with all the amounts of smear we tested. However, from 40 to 60 feet, the system was not able to identify the vehicle with any amount of smear. At 70 feet, our system was able to detect the vehicle with most of the amounts of smear we tested.

4.3 Heat

4.3.1 Introduction

This procedure outlines the acceptance test to be performed on the Vehicle Detection System using a bike. This test verifies that the Vehicle Detection System will be able to work effectively under a certain amount of heat placed on the system. The system will be tested in a variety of different temperatures due to Tucson's varying weather. This test will show that the system is able to run at 110 degrees Fahrenheit.

4.3.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

4.3.3 Required Test Equipment 1. CA378-AOIS

2. Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear 3. Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box 4. Poweradd ChargerCenter 5. NVIDIA Jetson TX2

6. 250W Heat Bulb - 2PK 7. Woods Clamp Lamp with 10 Inch Reflector 8. Digital Laser Infrared Thermometer

4.3.4 Test Procedure 1. Set up enclosure. 2. Turn on battery, Jetson board, camera, and heat lamp.

3. Run all of these items for an hour, taking heat measurements every ten minutes. 4. Use the Digital Laser Infrared Thermometer to get temperature readings of the battery, Jetson board, enclosure as a whole, camera, and the heating lamp. 5. Once the temperature gets to around 110°F, take two more measurements with the digital thermometer of the battery, Jetson board, enclosure, and camera.

6. Since we know the max operating temperatures of each piece of equipment, we will be able to tell if the system will be able to run effectively in the heat.

4.4 Water

4.4.1 Introduction

This procedure outlines the acceptance test to be performed on the Vehicle Detection System using a sprinkler and water detection papers. This test verifies the Vehicle Detection System will be able to operate effectively in a light rain condition. The system will be tested with a sprinkler spraying less than 0.1 inches of water per hour on the system itself. This test will show the system is water resistant.

4.4.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

4.4.3 Required Test Equipment 1. Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear 2. Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box

3. Gilmour Rectangular Pattern Spot Sprinkler 4. Water Detection Papers

4.4.4 Test Procedure 1. Remove all damageable electronics from enclosure.

2. Lay out or tape water detection papers on the inner surface of the enclosure and of the dome. 3. Turn on water sprinkler and let it spray at least 0.1 inches of water drops on the system for an hour. 4. Once the system has been sprayed for an hour, turn off the sprinkler. 5. Open the enclosure of the system and check the color of the water detection paper.

6. Since we know the color of the water detection paper changes if water leaks in or the humidity inside the enclosure is greater than 55%, we will be able to tell if the system will be able to run under light rain.

4.5 Battery

4.5.1 Introduction

This procedure outlines the acceptance tests to be performed on the Vehicle Detection System using a bike. This test verifies that the PowerAdd ChargerCenter 2 will be able to power the entirety of the Vehicle Detection System for a minimum duration of 6 hours while powered and running on the streets of Tucson. This test is to be performed outdoors during the late morning into the afternoon, as per Functional Requirement A.1.4: The system shall be operable while ambient light is available.

4.5.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

4.5.3 Required Test Equipment 1. Bicycle 2. CA378-AOIS Camera 3. Nvidia Jetson TX2 Development Board

4. PowerAdd ChargerCenter 2 5. Atelix Vented Weatherproof Enclosure 6. Stop Clock or Timer

4.5.4 Test Procedure 1. Ensure that the PowerAdd ChargerCenter 2 is fully charged. If the battery pack is not fully charged, stop the test and charge the battery; otherwise, proceed to step 2. 2. Verify the Detection System is strapped in properly onto the bicycle. 3. Next, open the Altelix enclosure, turn on the PowerAdd ChargerCenter 2 and then the Jetson, and close the enclosure. As soon as this is done, start the timer and record the starting and ending times on your data sheet. 4. Keep the timer running for six hours. 5. Once the six hours expire, check to see if the system is still on and running. If it is, it has passed; otherwise, it has failed.

6. Record the remaining battery capacity on the Battery/Power Data Sheet and project how much longer the system would run based on the battery life dissipated. Record this in the Battery data sheet.

5 Models and Analyses

5.1 Sun Interference 5.1.1 Introduction This procedure describes the analysis of the sun interference model of the system and how sunlight may distort any of the images the camera captures such that the network is unable to draw boundary boxes around the vehicles. The system will be tested during a sunny day due to the requirement of being able to perform during a midsummer day in Tucson. The system is tested using inputs such as number of detections and pixel values and converting these to histogram plots to demonstrate when sun interference is high or low. Complete protection from sun interference is not required, but rather boundary conditions are a requirement. Therefore, this analysis of the sun interference model demonstrates when sunlight affects the detection of vehicles and again causes the network to not draw boundary boxes around them.

5.1.2 Reference Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

5.1.3 Required Tools 1. Bicycle 2. Bike Rack

3. CA378-AOIS Camera 4. Nvidia Jetson TX2 Development Board 5. PowerAdd ChargerCenter 2 6. Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box

7. Supreme Tech Acrylic Dome/Plastic Hemisphere - Clear 8. Computer with python, OpenCV, and MatPlotLib software

5.1.4 Procedure to set up Sun Interference Model 1. Ride the bike with the camera facing in the direction of the sun to capture images that have high concentration of sunlight on the camera’s lens.

2. Load the captured footage using OpenCV. 3. Use OpenCV and MatPlotLib to analyze and plot histogram values of sunlight concentration on the camera lens. 4. Use plots to determine boundary conditions of sun interference on the camera to determine when it is present and compare it to when vehicles are being detected. 5. Obtain analysis of when sunlight might affect detecting any vehicles.
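A minimal sketch of steps 2 and 3 is shown below; the file name is illustrative, and only the first frame is plotted for brevity.

import cv2
from matplotlib import pyplot as plt

cap = cv2.VideoCapture('sun_footage.mp4')      # illustrative path to the captured footage
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256])
    plt.plot(hist)
    plt.xlabel('Pixel intensity bin (0 = darkest, 255 = brightest)')
    plt.ylabel('Pixel count')
    plt.show()
cap.release()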

5.1.5 Results

The results obtained from the analysis of the sun interference model show that the neural network draws boundary boxes around vehicles when sunlight does not affect the images in the captured footage. Figure 1 under Section 4.2 (Preprocessor) shows an image without sun interference, while Figure 2 shows one with it. As can be seen in the corresponding histogram plots, there is a noticeable spike from Figure 1 to Figure 2, as shown by the white boundary boxes drawn around them. Boundary conditions are determined by comparing the spikes in sun interference in the histogram plots and correlating them with images that are affected by sunlight.

5.2 Heat

5.2.1 Introduction

This section describes the planned thermal analysis of the system. For the heat analysis, it was planned to use the built-in SolidWorks Thermal Analysis function. The idea was to model all the pieces in the system that give off heat, with the outside air around the system set to 110 degrees Fahrenheit. The main pieces that give off heat in our system are the battery, the Jetson board, and the camera. With the equipment and technology we currently have, the analysis was unsuccessful and unable to run. This poses an issue, as we are unsure of the maximum temperature the system will reach. Testing of the system will commence as soon as the whole assembly is built. This will provide enough analysis to make sure the system will run effectively and be able to operate at 110 degrees Fahrenheit.

5.2.2 Reference Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

5.2.3 Required Tools 1. Computer running SolidWorks

5.2.4 Procedure to set up SolidWorks Know what the max temperature for each component is going to be. This way it will be easy to notice if the system is going to over heat, or is in danger of over heating. The max operating temperature for the Jetson Board TX2 is 178 degrees Fahrenheit. The max operating temperature of the camera is 220 degrees Fahrenheit, and the max operating temperature of the Poweradd ChargerCenter 2 is roughly 205 degrees Fahrenheit. There is also a cooling fan that will be implemented into the system to help dissipate some of the heat. Steps for setting up SolidWorks: 1. Make all the items in your system in SolidWorks. 2. Open up a new Thermal Analysis study. 3. Define the outside temperature to be 110 degrees Fahrenheit. 4. Define the battery with the max wattage that is given during its maximum run time. 5. Define the Jetson Board to be giving off its max heat to see how hot the system could get. 6. Define the max temperature for the camera to make sure that the camera will not overheat. 7. Mesh the whole system together. 8. Run the analysis. 9. Obtain a Thermal Analysis about the system from SolidWorks.

5.2.5 Results

The SolidWorks thermal analysis failed to run, so no simulation results were obtained.

5.3 Camera Angle 5.3.1 Introduction This model describes the analysis procedure for determining the angle at which the system’s camera is placed in order to detect cars to the rear left of the bicycle. The desired horizontal camera angle is one that allows the camera to see a width of at least two lanes at its 20 foot requirement for distance detection. This width allows for the detection of vehicles that are within a notable distance of the cyclist. This model also describes the analysis procedure for determining the vertical angle that the camera would need to be tilted to detect cars (if such an angle was needed). To calculate this, we first calculated if the camera’s height will allow it to detect the full height of a car, and at what distance it will be able to do so. If the camera is high enough to capture a full vehicle, then no vertical tilt will be needed. This ensures that the system requirement of being able to detect vehicles will be met. The inputs for this analysis model are the average car lane width in Pima County, the average bike lane width in Arizona, the camera’s horizontal and vertical field of view, the height of the camera, and the height of an average car. The outputs of this model are the camera angle and height that will meet our requirements. Microsoft Excel will be used to perform this analysis. Before performing this analysis, we needed to gather the values for each input. The average lane width in Pima County, where Tucson is located, is 12 feet [3]. The average bike lane width in Arizona is 5 feet [1]. These two inputs are important because the camera angle will be designed to allow a width of at least two lanes to be seen at 20 feet behind the bicycle, and if the cyclist is on a road with a bike lane, that number will be needed as well. The camera’s horizontal field of view is 68.8◦and its vertical field of view is 54.3◦. The camera will be placed on top of the enclosure, which is attached to a bike rack on the back of the bike, placing it at about 41 inches above the ground. The height of an average car was estimated to be 6.5 feet.

5.3.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

5.3.3 Required Tools 1. Microsoft Excel 2. Pencil and Paper (for drawing diagrams)

5.3.4 Procedure The first analysis performed was determining the horizontal angle for the camera. This angle was calculated in Microsoft Excel using geometry. The desired angle, X, is the difference between where the center of the camera’s field of view is located once the camera is turned and the axis pointing directly behind the bicycle (where the center of the field of view would be if the camera was not angled at all).

Figure 14: Drawing of the geometry used to calculate the desired camera angle.

To determine X, a triangle was drawn using the axis pointing 20 feet directly behind the bicycle (shown in Figure 14 above). The horizontal distance was calculated assuming the cyclist would be riding down the center of the bike lane. This means the horizontal distance of the triangle would be equal to half the width of the bike lane plus the width of two lanes, which is 26.5 feet. Next, the angle of the triangle that is closest to the camera, Y, was calculated using the inverse tangent of the horizontal length, 26.5 feet, over the vertical length, 20 feet. The angle was calculated to be 52.96°. After calculating that angle, another angle, Z, is needed. This angle is the angle created by the left edge of the camera's field of view and the axis directly behind the bike. To calculate Z, we subtracted angle Y from the camera's field of view, 68.8°, and got a value of 15.84°. Finally, X could be calculated because it is the difference between one half of the camera's field of view (34.4°) and angle Z. The final calculated value for X was 18.56°.

Figure 15: Drawing of the geometry used to calculate the distance that a car can be fully captured behind the camera.

Next, the distance at which the full height of a car will be captured by the camera, D, was calculated. First, a right triangle was drawn by bisecting the angle created by the camera's vertical field of view (shown in Figure 15 above). The value of D is the length of the side of the triangle adjacent to the marked 27.15° angle. To determine D, two values were needed: the length of the opposite side of the triangle, which is 3.25 feet (half of the average height of a car), and half of the vertical field of view of the camera (27.15°). Finally, divide 3.25 by the tangent of 27.15° to get the value of D: 6.33 feet. The Excel sheet showing these calculations is included in Appendix A.13.
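The same geometry can be checked quickly in Python; the sketch below simply repeats the Excel calculation using the input values stated above.

import math

hfov = 68.8                     # horizontal field of view, degrees
vfov = 54.3                     # vertical field of view, degrees
lateral = 5.0 / 2 + 2 * 12.0    # half a bike lane plus two car lanes = 26.5 ft
behind = 20.0                   # required detection distance, ft

Y = math.degrees(math.atan(lateral / behind))   # ~52.96 degrees
Z = hfov - Y                                    # ~15.84 degrees
X = hfov / 2 - Z                                # camera pan angle, ~18.56 degrees

D = 3.25 / math.tan(math.radians(vfov / 2))     # ~6.3 ft: full car height visible
print(round(X, 2), round(D, 2))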

5.3.5 Results

After performing the required analysis, the calculated camera angle was 18.56°. To simplify the design, the angle will be rounded up to 19°, which still allows the camera to see at least two lanes at a distance of 20 feet. The analysis also showed that the height at which the camera will be placed allows the full height of a car to be seen at 6.33 feet behind the camera. This value is smaller than 20 feet, meaning the full height of many cars (even vehicles taller than 6.5 feet) will be captured at 20 feet behind the bicycle, so the camera will not need to be tilted in the vertical direction. This analysis has shown that the camera angle requirement has been met at the 20 foot distance requirement.

5.4 Battery 5.4.1 Introduction Battery Life/Power Model: This model describes the analysis procedure for determining the power consumption of the Vehicle Detection System over a period of 6 hours. In order to get an idea about how much power is drawn by the system from the battery, we researched how much power each individual component uses. We extracted the data from the Nvidia Jetson TX2 Power Data Sheet and the CA378 - AOIS Power Data Sheet. This model assumes that each component is drawing its maximum rated power and that the batteries’ capacity will decrease linearly with time. Below are the steps we used to this model.

5.4.2 Reference Documents

Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.
IMX378 Module Design Reference Manual, 05/10/2018.
INR18650-25R Specification of Product, 03/1/2014.

5.4.3 Required Tools Microsoft Excel or Google Sheets

5.4.4 Analysis Procedure 1. Inputs: Jetson, Camera Wattages 2. Pull power data from the Nvidia Jetson TX2 Data Sheet, 15W MAX.

3. Acquire the power specifications of the CA378-AOIS from its Power Data Sheet: 435 mW at maximum. 4. Sum all acquired power data: 15.4 W. 5. Create an Excel spreadsheet with one column labelled Capacity; it will have 10 data entries. 6. Since we know that the initial capacity of the battery is 385 Wh, that will be our first data point in the Capacity column. Each successive entry after 385 Wh is the previous entry minus 15.4 W. 7. Once we have our ten entries, make a line chart with the Capacity column as the Y-axis data.
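A spreadsheet-equivalent sketch of this linear model, using the values from the steps above, is:

capacity_wh = 385.0               # initial battery capacity
draw_w = 15.4                     # summed maximum system power draw
remaining = [capacity_wh - draw_w * hour for hour in range(10)]   # ten hourly entries

print(draw_w * 6)                 # 92.4 Wh consumed after 6 hours at maximum draw
print(remaining[6])               # capacity remaining after 6 hours under this model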

5.4.5 Analysis Results

Output: energy consumed over 6 hours at maximum draw: 15.4 W * 6 h = 92.4 Wh. Note: this assumes linear power consumption.

Figure 16: This model assumes that capacity decreases linearly with time. At 6 hours into the model we can see that the capacity of the battery is at 75% or 286Wh

According to these results, the PowerAdd ChargerCenter 2 has ample capacity to fuel the Vehicle Detection System for 6 hours. Keep in mind that this model neglects both the number of charge cycles the battery

has been through, as well as the temperature the battery will be performing at.

By observing the Samsung Li-ion cell 18650 data sheet, we have constructed the following models to simulate what would happen to the battery pack at varying temperatures.

Figure 17: This model shows us that at lower temperatures we see a smaller capacity. As we get into the 60, 70, 80 Fahrenheit range we see that we get the maximum capacity. Past 80◦F we start to see the performance of the battery decrease again. The lowest capacity occurs at temperatures above 110◦F.

Figure 18: we see that around 40◦F, we start out with 60% of the original capacity. The cells are at optimal operating temperature around 60-70◦F range. After 110◦F we start to see a significant decline in the overall performance of the battery.

5.5 Network Speed 5.5.1 Introduction The neural network speed model is concerned with the rate at which the network can process images. The requirement for network speed is 10 frames per second (fps).

5.5.2 Referenced Documents Vehicle Detection for Cyclist Safety System Requirements Document, 10/01/2018.

5.5.3 Required Tools TensorFlow

5.5.4 Procedure

Because the neural network will be developed iteratively, the procedure for modelling network speed involves an estimation of speed based on certain parameters. The estimation is based on the camera running in a laboratory environment. Although factors such as heat and the number of visible vehicles may produce variations in performance during deployment, this model provides a good starting estimate. The presence of vehicles to detect has a slight effect on the speed of the network, so the testing data includes images with heavy traffic. The procedure is as follows:

1. Download ssdlite mobilenet v2 coco model. 2. Optimize model using TensorRT. 3. Open stream to camera 4. Begin feeding data into network, keeping track of the time elapsed for each frame.

5. Run network for 1000 iterations and then average the time elapsed over the 1000 frames.

The code for this test is available in Appendix A.11.
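The full script appears in Appendix A.11 and is not reproduced here; a minimal sketch of the timing loop it implements, assuming a detect() wrapper around the optimized network and an open camera stream cap, is:

import time

def measure_fps(cap, detect, iterations=1000):
    """Average per-frame inference time over up to `iterations` captured frames."""
    elapsed, frames = 0.0, 0
    for _ in range(iterations):
        ok, frame = cap.read()
        if not ok:
            break
        start = time.time()
        detect(frame)                       # run the network on one frame
        elapsed += time.time() - start
        frames += 1
    return frames / elapsed                 # average frames per second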

5.5.5 Results Using the above procedure, the model was able to run at an average 10.3 fps, showing that our network should be able to meet the required fps.

6 Acceptance Test Results

6.1 Jitter and Distance

Jitter and Distance Acceptance Test

Reference: Vehicle Detection for Cyclist Safety System Requirements Document

Name of Test: Jitter and Distance Test
Unit Under Test: Vehicle Detection System
Date of Test: 4/4/19
Results (Pass/Fail): Pass
Requirement (SRD): The system shall be robust to jitter; the system shall be able to detect cars at a distance of 20 ft

Distance | Jitter 1e-1 | Jitter 9e-2 | Jitter 5e-2 | Jitter 1e-2
20 ft | Detected | Detected | Detected | Detected
30 ft | Detected | Detected | Detected | Detected
40 ft | Not Detected | Not Detected | Not Detected | Not Detected
50 ft | Detected | Detected | Detected | Detected
60 ft | Detected | Detected | Detected | Not Detected
70 ft | Detected | Detected | Not Detected | Not Detected

Signatures: Tester:______

Customer:______

Figure 19: The results of the jitter and distance test.

6.2 Smear and Distance

Smear and Distance Acceptance Test

Reference: Vehicle Detection for Cyclist Safety System Requirements Document

Name of Test: Smear and Distance Test
Unit Under Test: Vehicle Detection System
Date of Test: 4/4/19
Results (Pass/Fail): Pass
Requirement (SRD): The system shall be able to detect cars at a distance of 20 ft

Distance | Smear +5 | Smear +10 | Smear +15 | Smear +20
20 ft | Detected | Detected | Not Detected | Detected
30 ft | Detected | Detected | Detected | Detected
40 ft | Not Detected | Not Detected | Not Detected | Not Detected
50 ft | Not Detected | Not Detected | Not Detected | Not Detected
60 ft | Not Detected | Not Detected | Not Detected | Not Detected
70 ft | Detected | Detected | Detected | Not Detected

Signatures: Tester:______

Customer:______

Figure 20: The results of the smear and distance test.

6.3 Heat

Figure 21: The results of the heat test.

6.4 Water

Figure 22: The results of the water test.

6.5 Battery

Figure 23: The results of the battery test.

7 Final Budget

7.1 Overall Budget Below is the final budget for the entire project. It includes every item purchased by the team.

Figure 24: The final budget for the project.

The final amount spent by the team for this project was $1869.75. This means there was $2130.25 of the $4000 given to the team at the beginning of the project remaining.

7.2 Budget for Final Product Below is the cost for the product itself. This calculation removes anything purchased for prototyping, testing, presenting, etc.

Figure 25: The final budget for the system.

The cost of the detection system itself was $1176.36.

8 Lessons Learned

8.1 Plan Testing Data It is important to have a well thought out plan for obtaining testing data at various points of a project. There should be a way to collect testing data without too much effort, so that different components can be tested before the system as a whole is completely working. If we were to restart this project, we would design a way to collect reasonably accurate testing data without needing the computer hardware to be completely working. It is also very important to order parts and pieces as early as you can. We had planned to use a specific part for our Jetson board, but when it came time to order the part it was no longer available. This meant we had to come up with a new solution to our problem.

8.2 Value Feedback When designing a solution for a problem, it is easy for the team to get caught up in one design, and to not consider other options. This is why presenting to an audience (i.e. the class and mentors) can be very valuable. Before our team presented our Preliminary Design Review, we had considered multiple cameras and thought we had picked out the best one. However, after the presentation our mentors told us to look into cameras with optical image stabilization, which is something we did not consider. We decided to use a camera with optical image stabilization and this helped our project immensely. This moment taught us that it is very important to value the feedback given by classmates, mentors, and other engineers because it may open a new path for discovering a better engineering solution.

8.3 Various Testing and Analysis We have learned that a good design requires multiple iterations of testing and analysis. Repeated iterations expose the boundary conditions of a design so that errors and issues can be corrected, and they help verify that the design fulfills the project requirements. For this reason, we performed multiple acceptance tests on our system. It is also important to ensure the design works in different situations and environments; to accomplish this, we tested our system by cycling on different roads in Tucson every weekend to confirm that it was functioning well. A single test is not enough: there is a high chance the design fails that one test, and even if it passes, one passing test does not prove that the design works or that it is a good design.

8.4 Over Engineering a Design When designing a solution to a problem, it is important to address the problem without spending too much time on it. Spending too long trying to perfect a solution leaves the team less time for testing and analysis, especially if the design is needed immediately. For example, when producing a 3D printed design for the camera and for the battery, too much time was spent researching how these problems are solved in industry. In the end, we found that a simple 3D printed enclosure in which the camera fit comfortably was enough to prevent jitter inside the enclosure, and for the battery we used brackets that effectively eliminated bouncing inside the enclosure box rather than designing a separate enclosure for the battery itself. In conclusion, plan which processes, such as tests or design work, need the most attention before spending too much time on something that is less critical than other tasks.

8.5 Dampening Mechanical Noise Before Using Optical Image Stabilization One of the biggest issues we had with our project was dealing with mechanical noise. Although we used rubber vibration isolators to mitigate the effect of the rough Tucson roads on our system, we still experienced a fair amount of jitter that blurred our images. We had initially planned to rely on the optical image stabilization built into our camera to obtain a clearer picture, but it did not work as well as we had anticipated. What we ended up doing was a thorough inspection of our system: we found the parts of our enclosure that were still rattling under heavy shock or that were not entirely secure. Once we tightened everything down, we experienced much less noise from the road, and the remaining noise could be smoothed out by the optical image stabilization software. In short, find all the sources of mechanical noise in your system before relying on any optical image stabilization routines.

44 References

[1] ArizonaBikeLaw. AASHTO Guide for the Development of Bicycle Facilities. Feb. 2018. URL: http://azbikelaw.org/aashto-guide-for-the-development-of-bicycle-facilities/.
[2] G. Bradski. "The OpenCV Library". In: Dr. Dobb's Journal of Software Tools (2000).
[3] Pima County. Pima County Roadway Design Manual. 4th ed. 2013.
[4] Eric Gonzalez, MacCallister Higgins, and Oliver Cameron. Self-Driving Car. https://github.com/udacity/self-driving-car. 2018.
[5] Jonathan Huang et al. "Speed/accuracy trade-offs for modern convolutional object detectors". In: CoRR abs/1611.10012 (2016). arXiv: 1611.10012. URL: http://arxiv.org/abs/1611.10012.
[6] Tsung-Yi Lin et al. "Microsoft COCO: Common Objects in Context". In: CoRR abs/1405.0312 (2014). arXiv: 1405.0312. URL: http://arxiv.org/abs/1405.0312.
[7] NVIDIA. TensorRT. https://developer.nvidia.com/tensorrt. 2018.
[8] Mark Sandler et al. "Inverted Residuals and Linear Bottlenecks: Mobile Networks for Classification, Detection and Segmentation". In: CoRR abs/1801.04381 (2018). arXiv: 1801.04381. URL: http://arxiv.org/abs/1801.04381.
[9] Yasuo Tanaka. CA378-AOIS for Jetson TX2. https://github.com/centuryarks/CA378-AOIS/tree/master/JetsonTX2. 2018.

45 A Appendix

A.1 Functional Requirements
1. The system shall use a deep neural network.
2. The system shall attach to a bicycle or bicyclist.
3. The system shall detect vehicles using a camera.
4. The system shall be operable while ambient light is available.
5. The system shall not impede the mobility of the bicycle.

A.2 System Requirements
1. Deep Neural Network Requirements
1.1 Network Size: The size of the neural network shall be less than or equal to 4 GB.
1.2 Network Speed: The speed of the neural network shall be at least 8 frames per second (a timing sketch follows this list).
2. Attachment Requirements
2.1 Mechanical Interface: The system shall be mounted to the bike.
2.2 Hardware Enclosure: The system shall contain an enclosure to protect the hardware.
2.3 Dynamic Environment: The system shall be robust to the variations typical of Tucson roads.
3. Camera Requirements
3.1 Camera Resolution: The camera shall have a resolution of at least 1920 x 1080.
3.2 Camera Speed: The camera shall have the frames per second required by the neural network (at least 8 frames per second).
3.3 Camera FOV: The camera shall have a field of view of at least 60°.
3.4 Depth of Field: The camera shall be able to capture vehicles at least 20 feet away.
3.5 Smear: The system shall have no more than one pixel of smear.
3.6 Jitter: The system shall be robust to jitter.
3.7 Camera Angle: The camera shall be angled to detect vehicles to the rear-left of the bicycle.
4. Operating Condition Requirements
4.1 Battery Life: The battery shall last for a minimum of 6 hours.
4.2 Sun Interference: The system shall be resistant to interference from the sun.
4.3 Operating Temperature: The system will be able to function up to 110°F.
4.4 Robustness to Rain: The system shall be operable under light rain (less than 0.1 inches of rain per hour).
5. Mobility Requirements
5.1 Size: The system enclosure shall be no larger than 22x12x10 inches.
5.2 Weight: The system shall weigh less than 30 lbs.
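Several of the requirements above are simple numeric thresholds. As one example, requirement 1.2 (and the matching camera requirement 3.2) fixes a minimum rate of 8 frames per second; a minimal timing harness like the sketch below is one way such a threshold can be checked. It is illustrative only: run_inference stands in for the team's actual TensorRT detection call, and captured_frames for a list of test images.

import time

def measure_fps(run_inference, frames, warmup=5):
    # Run a few untimed warm-up inferences, then time the rest and report frames per second.
    for frame in frames[:warmup]:
        run_inference(frame)
    start = time.perf_counter()
    for frame in frames[warmup:]:
        run_inference(frame)
    elapsed = time.perf_counter() - start
    return (len(frames) - warmup) / elapsed

# Example usage (hypothetical): the requirement is met if measure_fps(detector, captured_frames) >= 8.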

46 A.3 System Requirements Verification Matrix

Figure 26: System Requirement Verification Matrix

A.4 Part Pictures

Figure 27: Roadmaster 26” Granite Peak Men’s Mountain Bike

47 Figure 28: Lumintrail Bicycle Commuter Carrier Rear Seatpost Frame Mounted Bike Cargo Rack for Heavier Top and Side Loads

Figure 29: CA378-AOIS

Figure 30: Interface Board for CA378-AOIS camera

48 Figure 31: Interface Board for Jetson TX2 board

Figure 32: FFC cable

Figure 33: Supreme Tech Acrylic Dome/Plastic Hemisphere

Figure 34: Altelix 14x12x8 Vented FRP Fiberglass Weatherproof NEMA Box

49 Figure 35: Poweradd ChargerCenter 2

Figure 36: Puget Systems Acrylic Enclosure for NVIDIA Jetson TX1

Figure 37: SanDisk Extreme Pro 128GB

50 Figure 38: 1/2 x 6 ft. Non-Metallic Liquidtight Whip

Figure 39: ACC Conduit and Pipe Hangers

51 Figure 40: 2.25 in. x 1 in. Vibration Isolator

A.5 System Part Diagrams and Specifications A.5.1 Bike Specifications

Figure 41: Roadmaster Bike Specifications

52 A.5.2 Bike Rack Installation Specifications

Figure 42: Lumintrail Bicycle Commuter Carrier Rear Seatpost Frame Mounted Bike Cargo Rack Installation Specifications

53 A.5.3 CA378-AOIS Camera Specifications

Figure 43: CA378-AOIS Camera Specifications

54 A.5.4 System Enclosure Measurements

Figure 44: Altelix 14 x 12 x 8 Inch Vented Nema Enclosure

A.5.5 PowerAdd ChargerCenter 2/Samsung Li-ion 18650-25R Specifications

Figure 45: Poweradd ChargerCenter II Battery Specifications

55 -SAMSUNG SDI Confidential Proprietary –

Spec. No. INR18650-25R Version No. 1.0 In-Young Jang

SPECIFICATION OF PRODUCT

Lithium-ion rechargeable cell for power tools

Model name : INR18650-25R

Mar., 2014

Samsung SDI Co., Ltd. Energy Business Division


Contents 1.0. Scope 2.0. Description and model name 2.1. Description 2.2. Model name 3.0. Nominal specification 4.0. Outline dimensions 5.0. Appearance 6.0. Standard test conditions 6.1. Environmental conditions 6.2. Measuring equipments 7.0. Characteristics 7.1. Standard charge 7.2. Rapid charge 7.3 Nominal discharge capacity 7.4. Standard rated discharge capacity 7.5. Initial internal impedance 7.6. Temperature dependence of discharge capacity 7.7. Temperature dependence of charge capacity 7.8. Charge rate capabilities 7.9. Discharge rate capabilities 7.10. Cycle life 7.11. Storage characteristics 7.12. Status of the cell as of ex-factory 8.0. Mechanical Characteristics 8.1. Drop test 8.2. Vibration test 9.0. Safety 9.1 Overcharge test 9.2 External short-circuit test 9.3 Reverse charge test 9.4 Heating test 10.0. Warranty 11.0. Others 11.1 Storage for a long time 11.2 Others 12.0. Packing

Proper use and handling of lithium ion cells Handling precaution and prohibitions of lithium Ion rechargeable cells and batteries Samsung SDI emergency contact information Additional remarks Revision history


1.0. Scope
This product specification has been prepared to specify the rechargeable lithium-ion cell ('cell') to be supplied to the customer by Samsung SDI Co., Ltd.

2.0. Description and model name
2.1 Description: lithium-ion rechargeable cell
2.2 Model name: INR18650-25R

3.0. Nominal specifications
3.1 Nominal discharge capacity: 2,500 mAh (Charge: 1.25 A, 4.20 V, CCCV, 125 mA cut-off; Discharge: 0.2 C, 2.5 V discharge cut-off)
3.2 Nominal voltage: 3.6 V
3.3 Standard charge: CCCV, 1.25 A, 4.20 ± 0.05 V, 125 mA cut-off
3.4 Rapid charge: CCCV, 4 A, 4.20 ± 0.05 V, 100 mA cut-off
3.6 Charging time: Standard charge 180 min / 125 mA cut-off; Rapid charge 60 min (at 25℃) / 100 mA cut-off
3.7 Max. continuous discharge: 20 A (at 25℃), 60% at 250 cycles (continuous)
3.8 Discharge cut-off voltage: 2.5 V end of discharge
3.9 Cell weight: 45.0 g max
3.10 Cell dimensions: Height 64.85 ± 0.15 mm; Diameter 18.33 ± 0.07 mm
3.11 Operating temperature (surface temperature): Charge 0 to 50℃ (recommended recharge release < 45℃); Discharge -20 to 75℃ (recommended re-discharge release < 60℃)
3.12 Storage temperature (recovery 90% after storage): 1.5 years at -30 to 25℃ (1*); 3 months at -30 to 45℃ (1*); 1 month at -30 to 60℃ (1*)
Note (1): If the cell is kept as ex-factory status (50 ± 5% SOC, 25℃), the capacity recovery rate is more than 90% of the 10 A discharge capacity (100% is 2,450 mAh at 25℃ with SOC 100% after formation).
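(Editorial illustration, not part of the Samsung specification: the 180-minute standard charge time above is consistent with simple arithmetic on the constant-current phase, as the sketch below shows. The assumption that roughly 80% of the capacity is delivered during the CC phase is illustrative.)

capacity_mah = 2500.0      # nominal capacity (item 3.1)
cc_current_ma = 1250.0     # standard charge current, 0.5 C (item 3.3)
cc_fraction = 0.8          # assumed share of capacity delivered in the CC phase (illustrative)
cc_minutes = 60.0 * (capacity_mah * cc_fraction) / cc_current_ma
print(f"CC phase estimate: {cc_minutes:.0f} minutes of the 180-minute standard charge")
# The remaining capacity is delivered during the CV taper down to the 125 mA cut-off.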


4.0 Outline dimensions See the attachment (Fig. 1) 5.0. Appearance There shall be no such defects as scratch, rust, discoloration, leakage which may adversely affect commercial value of the cell. 6.0. Standard test conditions 6.1 Environmental conditions Unless otherwise specified, all tests stated in this specification are conducted at temperature 25±5℃ and humidity 65±20%.

6.2 Measuring equipments (1) Amp-meter and volt-meter The amp-meter and volt-meter should have an accuracy of the grade 0.5mA and mV or higher. (2) Slide caliper The slide caliper should have 0.01 mm scale. (3) Impedance meter The impedance meter with AC 1kHz should be used. 7.0. Characteristics 7.1 Standard charge This "Standard charge" means charging the cell CCCV with charge current 0.5CmA (1,250mA), constant voltage 4.2V and 125mA cut-off in CV mode at 25℃ for capacity. . 7.2 Rapid charge Rapid charge means charging the cell CCCV with charge current 4A and 100mA cut-off at 25℃ 7.3 Nominal discharge capacity The standard discharge capacity is the initial discharge capacity of the cell, which is measured with discharge current of 500mA(0.2C) with 2.5V cut-off at 25℃ within 1hour after the standard charge. Nominal discharge capacity ≥ 2,500mAh Which complying to the minimum capacity of IEC61960 standard.

7.4 Standard rated discharge capacity
The standard rated discharge capacity is the discharge capacity of the cell, measured with a discharge current of 10 A with 2.5 V cut-off at 25℃ within 1 hour after the standard charge.
Standard rated discharge capacity ≥ 2,450 mAh
7.5 Initial internal impedance
Initial internal impedance is measured at AC 1 kHz after the standard charge.
Initial internal impedance ≤ 18 mΩ
7.6 Temperature dependence of discharge capacity
Capacity comparison at each temperature, measured with discharge constant current 10 A and 2.5 V cut-off after the standard charge, is as follows.
Discharge temperature: -20℃ / -10℃ / 0℃ / 25℃ / 60℃
Relative capacity: 60% / 75% / 80% / 100% / 100%
Note: If the charge temperature and discharge temperature are not the same, the interval for temperature change is 3 hours. The percentage index of the discharge at 25℃ at 10 A (= 2,450 mAh) is 100%.
7.7 Temperature dependence of charge capacity
Capacity comparison at each temperature, measured with discharge constant current 10 A and 2.5 V cut-off after the standard charge, is as follows.
Charge temperature: 0℃ / 5℃ / 25℃ / 45℃ / 50℃ (discharge temperature: 25℃)
Relative capacity: 80% / 90% / 100% / 95% / 95%
Note: If the charge temperature and discharge temperature are not the same, the interval for temperature change is 3 hours. The percentage index of the discharge at 25℃ at 10 A (= 2,450 mAh) is 100%.
7.8 Charge rate capabilities
Discharge capacity is measured with constant current 10 A and 2.5 V cut-off after the cell is charged to 4.2 V as follows.
Charge condition: Standard (1.25 A, 125 mA cut-off) / Maximum rapid charge (4 A, 100 mA cut-off)
Relative capacity: 100% / 98%
Note: The percentage index of the discharge at 25℃ at 10 A (= 2,450 mAh) is 100%.
7.9 Discharge rate capabilities
Discharge capacity is measured with the various currents in the table below and 2.5 V cut-off after the standard charge.
Discharge current: 0.50 A / 5 A / 10 A / 15 A / 20 A
Relative capacity: 100% / 97% / 100% / 97% / 95%
The percentage index of the discharge at 25℃ at 10 A (= 2,450 mAh) is 100%.
7.10 Cycle life
With standard charge and maximum continuous discharge, capacity after 250 cycles ≥ 1,500 mAh (60% of the nominal capacity at 25℃).
7.11 Storage characteristics
Standard rated discharge capacity after storage for 1 month at 60℃ from the standard charged state is ≥ 90% of the initial 10 A discharge capacity at 25℃.
7.12 Status of the cell as of ex-factory
The cell should be shipped in a 50 ± 5% charged state. In this case, OCV is from 3.600 V to 3.690 V.


8.0. Mechanical Characteristics 8.1 Drop test Test method: Cell(as of shipment or full charged) drop onto a concrete from 1.0m height at 3 sides. Criteria: No leakage, Voltage decrease≦0.025V, AC iR increase ≦1.0mΩ 8.2 Vibration test Test method: As to the UN transportation regulation(UN38.3), for each axis (X and Y axis with cylindrical cells) 7Hz→200Hz→7Hz for 15min, repetition 12 times totally 3hours, the acceleration 1g during 7 to 18Hz and 8g (amplitude 1.6mm) up to 200Hz. Criteria: No leakage, with less than 10mV of OCV drop

9.0. Safety 9.1 Overcharge test Test method: To charge with 20A-20V at 25℃ for 3hr. Criteria: No fire, and no explosion. 9.2 External short-circuit test Test method: To short-circuit the standard charged cell (or 50% discharged cell) by connecting positive and negative terminal by 80mΩ wire for 10min. Criteria: No fire, and no explosion. 9.3 Reverse charge test Test method: To charge the standard charged cell with charge current 10A By 0V for 2.5 hours. Criteria: No fire, and no explosion. 9.4 Heating test Test method: To heat up the standard charged cell at heating rate 5℃ per minute up to 130℃ and keep the cell in oven for 10 minutes. Criteria: No fire, and no explosion. 10.0. Warranty Samsung SDI will be responsible for replacing the cell against defects or poor workmanship for 18months from the date of shipping. Any other problem caused by malfunction of the equipment or mix-use of the cell is not under this warranty. The warranty set forth in proper using and handling conditions described above and excludes in the case of a defect which is not related to manufacturing of the cell. 11.0. Others 11.1 Storage for a long time If the cell is kept for a long time (3 months or more), It is strongly recommended that the cell is preserved at dry and low-temperature. 11.2 Others Any matters that specifications do not have, should be conferred with between the both parties.

12.0. Packing See Fig.2, Package Drawing


Fig.1. Outline dimensions of the INR18650-25R (with tube): diameter 18.33 ± 0.07 mm, height 64.85 ± 0.15 mm. Unit: mm.


Fig.2. Package drawing


Proper use and handling of lithium ion cells See before using lithium-ion cell Supplied by Samsung SDI Co., Ltd.

1.0. General This document has been prepared to describe the appropriate cautions and prohibitions, which the customer should take or employ when the customer uses and handles the lithium ion cell to be manufactured and supplied by Samsung SDI Co., Ltd., in order to obtain optimal performance and safety. 2.0. Charging 2.1 Charging current Charging current shall be less than maximum charge current specified in the product specification. 2.2 Charging voltage Charging shall be done by voltage less than that specified in the product specification. 2.3 Charging time Continuous charging under specified voltage does not cause any loss of performance characteristics. However, the charge timer is recommended to be installed from a safety consideration, which shuts off further charging at time specified in the product specification. 2.4 Charging temperature The cell shall be charged within a range of specified temperatures in the specification. 2.5 Reverse charging The cell shall be connected, confirming that its poles are correctly aligned. Inverse charging shall be strictly prohibited. If the cell is connected improperly, it may be damaged. 3.0. Discharging 3.1 Discharging 3.1.1 The cell shall be discharged continuously at less than maximum discharge current specified in the product specification. In case of the higher discharge current should be set, it shall be discussed together with SDI. 3.2 Discharging temperature 3.2.1 The cell shall be discharged within a range of temperatures specified in the product specification. 3.2.2 Otherwise, it may cause loss of performance characteristics. 3.3 Over-discharging 3.3.1 The system should equip with a device to prevent further discharging exceeding discharging cut-off voltage specified in the product specification. 3.3.2 Over-discharging may cause loss of performance characteristics of battery. 3.3.3 Over-discharging may occur by self-discharge if the battery is left for a very long time without any use. 3.3.4 The charger should equip with a device to detect voltage of cell block and to determine recharging procedures.

4.0. Storage 4.1 Storage conditions


4.1.1 The cell should be stored within a range of temperatures specified in the product specification. 4.1.2 Otherwise, it may cause loss of performance characteristics, leakage and/or rust. 4.2 Long-term storage 4.2.1 The cell should be used within a short period after charging because long-term storage may cause loss of capacity by self-discharging. 4.2.2. If long-term storage is necessary, the cell should be stored at lower voltage within a range specified in the product specification, because storage with higher voltage may cause more loss of performance characteristics. 5.0. Cycle life 5.1 Cycle life performance 5.1.1 The cell can be charged/discharged repeatedly up to times specified in the product specification with a certain level of capacity specified in the product specification. 5.1.2 Cycle life may be determined by conditions of charging, discharging, operating temperature and/or storage. 6.0. Design of system 6.1 Connection between the cell and the battery 6.1.1 The cell should not be soldered directly with other cells. Namely, the cell should be welded with leads on its terminal and then be soldered with wire or leads to solder. 6.1.2 Otherwise, it may cause damage of component, such as separator and insulator, by heat generation. 6.2 Positioning the battery in the system 6.2.1 The battery should be positioned as possible as far from heat sources and high temperature components. 6.2.2 Otherwise, it may cause loss of characteristics. 6.2.3 The recommended spacing between the cells is more than 1mm. 6.3 Mechanical shock protection of the battery 6.3.1 The battery should be equipped with appropriate shock absorbers in the pack in order to minimize shock, which can damage the cells. 6.3.2 Otherwise, it may cause shape distortion, leakage, heat generation and/or rupture and/or open circuit. 6.4 Short-circuit protection of the cell 6.4.1 The cell equips with an insulating sleeve to protect short-circuit which may occur during transportation, battery assembly and /or system operation. 6.4.2 If the cell sleeve is damaged by some cause such as outside impact, it may cause short-circuit with some wiring inside the battery. 6.5 Connection between the battery and charger/system 6.5.1 The battery should be designed to be connected only to the specified charger and system. 6.5.2 A reverse connection of the battery, even in the specified system, should be avoided by employing special battery design such as a special terminals. 6.6 Pack design 6.6.1 The current consumption of the battery pack should be under 10uA at sleep mode. 6.6.2 Cell voltage monitoring system. The system (charger or pack) should be equipped with a device to monitor each - 9/16 - -SAMSUNG SDI Confidential Proprietary –


voltage of cell block to avoid cell imbalance which can cause damage to the cells. 6.6.4 The battery pack or system should have warning system such as over temperature, over voltage, over current, and so on. 7.0. Battery pack assembly 7.1 Prohibition of usage of damaged cell 7.1.1 The cell should be inspected visually before battery assembly. 7.1.2 The cell should not be used if sleeve-damage, can-distorsion and/or electrolyte-smell is detected. 7.2 Terminals handling 7.2.1 Excessive force on the negative terminal should be avoided when external strip terminal is welled. 7.3 Transportation 7.3.1 If the cell is necessary to be transported to such as the battery manufacturer, careful precautions should be taken to avoid damage of cell. 8.0. Others 8.1 Disassembly 8.1.1 The cell should not be dismantled from the battery pack. 8.1.2 Internal short-circuit caused by disassembly may lead to heat generation and/or venting. 8.1.3 When the electrolyte is coming in contact with the skin or eyes, flush immediately with fresh water and seek medical advice. 8.2 Short-circuiting 8.2.1 Short-circuit results in very high current which leads to heat generation. 8.2.3 An appropriate circuitry should be employed to protect accidental short-circuiting. 8.3 Incineration 8.3.1 Incinerating and disposing of the cell in fire are strictly prohibited, because it may cause rupture and explosion. 8.4 Immersion 8.4.1 Soaking the cell in water is strictly prohibited, because it may cause corrosion and leakage of components to be damaged to functions 8.5 Mixing use 8.5.1 Different types of cell, or same types but different cell manufacturer's shall not be used, which may lead to cell imbalance, cell rupture or damage to system due to the different characteristics of cell. 8.6 Battery exchange 8.6.1 Although the cell contains no environmentally hazardous component, such as lead or cadmium, the battery shall be disposed according to the local regulations when it is disposed. 8.6.2 The cell should be disposed with a discharged state to avoid heat generation by an inadvertent short-circuit. 8.7 Caution The Battery used in this device may present a risk of fire or chemical burn if mistreated. Do not disassemble, expose to heat above 100℃ or incinerate it. Replace battery with those of Samsung SDI only. Use of another battery may cause a risk of fire or explosion. Dispose of used battery promptly. - 10 /16 - -SAMSUNG SDI Confidential Proprietary –


Keep battery away from children. Do not disassemble and do not dispose of battery in fire.

8.8 Warning – Attached

Handling precaution and prohibitions of lithium Ion rechargeable cells and batteries Inaccurate handling of lithium ion and lithium ion polymer rechargeable battery may cause leakage, heat, smoke, an explosion, or fire. This could cause deterioration of performance or failure. Please be sure to follow instructions carefully. 1.1 Storage Store the battery at low temperature (below 25℃ is recommended), low humidity, no dust and no corrosive gas atmosphere. 1.2 Safety precaution and prohibitions To assure product safety, describe the following precautions in the instruction manual of the application.

[ Danger! ] ■ Electrical misusage Use stipulated charger. Use or charge the battery only in the stipulated application. Don't charge the battery by an electric outlet directly or a cigarette lighter charger. Don't charge the battery reversely. ■ Environmental misusage Don't leave the battery near the fire or a heated . Don't throw the battery into the fire. Don't leave, charge or use the battery in a car or similar place where inside of temperature may be over 60℃. Don't immerse, throw, wet the battery in water / sea water. ■ others Don't fold the battery cased with laminated film such as pouch and polymer. Don't store the battery in a pocket or a bag together with metallic objects such as keys, necklaces, hairpins, coins, or screws. Don't short circuit (+) and (-) terminals with metallic object intentionally. Don't pierce the battery with a sharp object such as a needle, screw drivers. Don't heat partial area of the battery with heated objects such as soldering iron. Don't hit with heavy objects such as a hammer, weight. Don't step on the battery and throw or drop the battery on the hard floor to avoid mechanical shock.


Don't disassemble the battery or modify the battery design including electric circuit. Don't solder on the battery directly. Don't use seriously scared or deformed battery. Don't put the battery into a microwave oven, dryer or high-pressure container. Don't use or assemble the battery with other makers' batteries, different types and/or models of batteries such as dry batteries, nickel-metal hydride batteries, or nickel-cadmium batteries. Don't use or assemble old and new batteries together.

[ Warning! ] Stop charging the battery if charging isn't completed within the specified time. Stop using the battery if the battery becomes abnormally hot, order, discoloration, deformation, or abnormal conditions is detected during use, charge, or storage. Keep away from fire immediately when leakage or foul odors are detected. If liquid leaks onto your skin or cloths, wash well with fresh water immediately. If liquid leaking from the battery gets into your eyes, don't rub your eyes and wash them with clean water and go to see a doctor immediately. If the terminals of the battery become dirty, wipe with a dry cloth before using the battery. The battery can be used within the following temperature ranges. Don't exceed these ranges. The operating temperature is based on the cell surface temperature in hottest position in pack. Charge temperature ranges : 0℃ ~ 50℃ Discharge Temperature ranges : -20℃ ~ 75℃ Store the battery at temperature below 60℃ Cover terminals with proper insulating tape before disposal.

[ Caution! ] ■ Electrical misusage Battery must be charged with constant current-constant voltage (CC/CV). Charge current must be controlled by specified value in cell specification. Cut-off voltage of charging must be less than 4.2 + 0.05V Charger must stop charging battery by detecting either charging time or current specified in cell’s specification. Discharge current must be controlled by specified value in cell’s specification. Cut-off voltage of full discharging and recharging must be over 2.5V.

■ others Keep the battery away from babies and children to avoid any accidents such as swallow. - 12 /16 - -SAMSUNG SDI Confidential Proprietary –


If younger children use the battery, their guardians should explain the proper handling method and precaution before using. Before using the battery, be sure to read the user's manual and precaution of it's handling. Before using charger, be sure to read the user's manual of the charger. Before installing and removing the battery from application, be sure to read user's manual of the application. Replace the battery when using time of battery becomes much shorter than usual. Cover terminals with insulating tape before proper disposal. If the battery is needed to be stored for an long period, battery should be removed from the application and stored in a place where humidity and temperature are low. While the battery is charged, used and stored, keep it away from object materials with static electric chargers.

Safety handling procedure for the transporter

■ Quarantine Packages that are crushed, punctured or torn open to reveal contents should not be transported. Such packages should be isolated until the shipper has been consulted, provided instructions and, if appropriate, arranged to have the product inspected and repacked. ■ Spilled product In the event that damage to packaging results in the release of cells or batteries, the spilled products should be promptly collected and segregated and the shipper should contact for instructions. Design of positioning the battery pack in application and charger To prevent the deterioration of the battery performance caused by heat, battery shall be positioned away from the area where heat is generated in the application and the charger. Design of the battery pack Be sure adopting proper safe device such as PTC specified type or model in Cell Specification. If you intend to adopt different safety device which is not specified in Cell Specification, please contact Samsung SDI to investigate any potential safety problem. Be sure designing 2nd protective devices such as PCM at the same time to protect cell just in case one protective device is fault. Please contact following offices when you need any help including safety concerns.


Samsung SDI emergency contact information

■ Samsung SDI Cheonan factory CS group 508, Sungsung-dong, Cheonan-si, Chungnam, Korea Tel:(+82)70-7125-1806 Fax:(+82)41-560-3697

■ Samsung SDI America office. 18600 Broadwick Street Rancho Dominguez CA 90220 Tel:(+1)310-900-5205 Fax:(+1)310-537-1033

■ Samsung SDI Taiwan office. Rm. 3010, 30F., 333, Keelung Rd. Sec. 1, Taipei, Taiwan Tel:(+886)2-2728-8469 Fax:(+886)2-2728-8480


Additional remarks

■ Cell package : The bare cell is packed by which packaging material, PET tube.

■ Model and tube marking : there are three lines on the cell tube as follows.

Line 1 : INR18650-25R --- cell model name

Line 2 : SAMSUNG SDI --- cell manufacturer

Line 3 : 2D51 --- date code (Capacity ; “2” is over 2.0Ah, Year, Month, Week)

■ Lot marking : There are three lines on the cell metal can as follows.

Line 1 : J5D5 --- 1st digit: Line number ( “1” for cylindrical line No.1, “J” for cylindrical line No. 8)

2nd digit: Final number of Model Name (“5” is INR18650-25x)

3rd digit: Year ( “D” is 2013)

4th digit: Month ( “5” is May ; A is Oct., B is Nov., C is Dec)

Line 2 : 45221 --- 1st digit: Negative coater number ( “7” is No. 7 coater)

2nd ~ 4th digit: Batch number

5th digit: Serial No. of assembling

Line 3 : 62F1 --- 1st digit: Date (“6” is 6th day ; 10 is A, 11 is B…)

2nd digit: Serial No. of winding in a batch

3rd digit: Reel No ( “F” is F reel ; A is A reel, B is B reel, ... F is F reel)

4th digit: Winding Machine No. ( “1” is No.1 winder)


Revision history
Version 1.0 | Date ('yr-m-d): '14-02-10 | Changes/Author: In-Young Jang | Reason of change: First version

A.6 JetPack 3.3 Installation Instructions The following document contains the installation instructions for JetPack 3.3 from NVIDIA.

73

NVIDIA JETPACK 3.3

Version 3.3 | August 2018 NVIDIA CORPORATION

User Documentation

NVIDIA JetPack Documentation

Table of Contents

JetPack ...... 3 What's Included in JetPack? ...... 3 OS Image ...... 3 Libraries ...... 3 Developer Tools ...... 4 Samples ...... 4 Documentation ...... 4 Release Notes ...... 5 JetPack 3.3 ...... 5 What's New ...... 5 Known Issues ...... 5 Download and Install JetPack ...... 6 System Requirements ...... 6 Download the Latest JetPack Version ...... 6 Installing JetPack ...... 7 Compiling Samples ...... 14 Run Sample Code ...... 14 Copyright & License Notices...... 15 NVIDIA CORPORATION ...... 15

JetPack

NVIDIA JetPack SDK is the most comprehensive solution for building AI applications. Use the JetPack installer to flash your Jetson Developer Kit with the latest OS image, to install developer tools for both the host PC and Developer Kit, and to install the libraries and APIs, samples, and documentation needed to jumpstart your development environment.
What's Included in JetPack?
OS Image
A sample file system derived from Ubuntu for Jetson.
Libraries
• CUDA Toolkit for Host PC (Ubuntu with cross-development support)
• CUDA Toolkit for Jetson
CUDA Toolkit provides a comprehensive development environment for C and C++ developers building GPU-accelerated applications. The toolkit includes a compiler for NVIDIA GPUs, math libraries, and tools for debugging and optimizing the performance of your applications.

• OpenCV
• VisionWorks
VisionWorks is a software development package for Computer Vision (CV) and image processing. It includes VPI (Vision Programming Interface), a set of optimized CV primitives for use by CUDA developers. The NVX library enables direct access to VPI, and the OVX library enables indirect access to VPI via the OpenVX framework.

• cuDNN CUDA Deep Neural Network library provides high-performance primitives for deep learning frameworks. It includes support for convolutions, activation functions and tensor transformations.

• TensorRT TensorRT is a high performance deep learning inference runtime for image classification, segmentation, and object detection neural networks. It speeds up deep learning inference as well as reducing the runtime memory footprint for convolutional and deconv neural networks.

• MultiMedia API The Jetson Multimedia API package provides low level APIs for flexible application development.

Camera application API: libargus offers a low-level frame-synchronous API for camera applications, with per frame camera parameter control, multiple (including synchronized) camera support, and EGL stream outputs. RAW output CSI cameras needing ISP can be used with either libargus or GStreamer plugin. In either case, the V4L2 media-controller sensor driver API is used. Sensor driver API: V4L2 API enables video decode, encode, format conversion and scaling functionality. V4L2 for encode opens up many features like bit rate control, quality presets, low latency encode, temporal tradeoff, motion vector maps, and more.

Developer Tools • NVIDIA System Profiler A multi-core CPU PC sampling profiler that provides an interactive view of captured profiling data, helping improve overall application performance.

Graphics Debugger A console-grade tool that allows developers to debug and profile OpenGL ES 2.0, OpenGL ES 3.0, OpenGL ES 3.1, and OpenGL 4.3-4.6 applications, enabling developers to get the most out of the Jetson platform.

Samples
• NVIDIA GameWorks OpenGL samples
• Multimedia API samples
• CUDA samples
• VisionWorks
Documentation
• Tegra Linux Driver Package Documentation
• JetPack Documentation

Release Notes
JetPack 3.3
What's New

• To increase security and data integrity, all server communications will now use the HTTPS protocol.

Known Issues • Please use the default download and install directories provided by JetPack. Changing the directories may cause an installation error.

• The installation of VisionWorks results in Ubuntu distro OpenCV 2.4.9 packages to be installed, in addition to OpenCV 3.3.1. This does not affect the usage of OpenCV 3.3.1.

Download and Install JetPack

This document is intended to help you get familiar with installing JetPack, using the tools, and running sample code. System Requirements Host Platform:

• Ubuntu Linux x64 v16.04 Note that a valid Internet connection and at least 10GB of disk space is needed for the complete installation of JetPack.

Target Platform:

• One of the following developer kits: o Jetson TX2

o Jetson TX2i

o Jetson TX1

• Additional target requirements: o USB Micro-B cable connecting Jetson to your Linux host for flashing.

o (Not included in the developer kit) To connect USB peripherals such as keyboard, mouse, and [optional] USB/Ethernet adapter (for network connection), a USB hub could be connected to the USB port on the Jetson system.

o An HDMI cable plugged into the HDMI port on Jetson Developer Kit, which is connected to an external HDMI display.

o An Ethernet cable plugged into the on-board Ethernet port, which is connected to either a secondary network card on your Linux host or the same network router providing internet access for the Linux host.

Download the Latest JetPack Version The latest version of JetPack is available in the NVIDIA Embedded Developer Zone at: https://developer.nvidia.com/jetson-development-pack All available JetPack downloads can be found at: https://developer.nvidia.com/jetpack-archive

Downloading JetPack

• On the host machine running Ubuntu, create a new directory to store installation packages.
• Download JetPack-${VERSION}.run into the new directory on the host Ubuntu machine.

Avoid running or installing JetPack in a path that contains a ".". Paths that contain a "." are known to cause installation issues.
Installing JetPack

JetPack runs on the host Ubuntu x86_64 machine and sets up your development environment and Jetson Development Kit target via remote access. Please refer to the System Requirements section for supported hardware configurations. The following instructions assume you have downloaded the latest JetPack version, JetPack${VERSION}.run, where ${VERSION} refers to the version string for the installer you have.

1. Add exec permission for the JetPack-${VERSION}.run

chmod +x JetPack-${VERSION}.run

2. Run JetPack-${VERSION}.run in terminal on your host Ubuntu machine.

3. Next, the JetPack installer will indicate the installation directory. In the Privacy Notice section, select whether or not to enable data collection.


4. Select the development environment to setup.

5. The JetPack installer will pop up a window to ask for permission to use during the installation process; you will need to enter your sudo password here.

6. The Component Manager opens, which allows you to customize which components to install. Select the Jetson Developer Kit you would like to develop for to customize the installation components for each device.

NOTE: To run a standalone Ubuntu install, deselect Jetson target specific entries.

7. Accept the license agreement for the selected components.

8. The Component Manager will proceed with the installation. Once the host installation steps are completed, click the Next button to continue with the installation of target components.

NOTE: JetPack will now proceed with setting up the Jetson Developer Kit target, if the corresponding components were selected (i.e., flashing the OS and pushing components to the Jetson Developer Kit target).

9. If you de-selected Flash OS in the Component Manager, you will need to enter the IP address, user name, and password to set up an ssh connection to the target device.

After you enter the required information and click Next, JetPack will begin installing components on the target device.
10. If you selected Flash OS in the Component Manager, you will need to select the network layout for your specific environment.

11. If you selected the Device access Internet via router/switch layout, you will be asked to select which interface to use for Internet access.

12. If you selected the Device get IP assigned by DHCP server on host and access Internet via host machine layout, you must select which interface is to be used for Internet access, and which is to be used for the target interface.

13. A pop-up window will instruct you to put your device into Force USB Recovery Mode, so you can flash the OS.

14. Next, you will be prompted to install components on the specific target machine, and to compile samples.


15. After the post installation tasks have been completed, the installation will be complete.

Compiling Samples
JetPack automatically compiles all samples, if Compile Samples was checked during the component selection portion of the installation. CUDA samples can be found in the following directory:

/NVIDIA_CUDA-_Samples

You can recompile the samples by running:

SMS=53 EXTRA_LDFLAGS=--unresolved-symbols=ignore-in-shared-libs TARGET_ARCH=aarch64 make

Run Sample Code The CUDA samples directory will be copied to the home directory on your device by JetPack. The built binaries are in the following directory:

/home/ubuntu/NVIDIA_CUDA-_Samples/bin/aarch64/linux/release/

Run them by calling them in terminal, or double-clicking on them in the file browser. For example, when you run the oceanFFT sample, the following screen will be displayed.

Copyright & License Notices
NVIDIA CORPORATION

NVIDIA SOFTWARE LICENSE AGREEMENT IMPORTANT – READ BEFORE DOWNLOADING, INSTALLING, COPYING OR USING THE LICENSED SOFTWARE This Software License Agreement ("SLA”), made and entered into as of the time and date of click through action (“Effective Date”), is a legal agreement between you and NVIDIA Corporation ("NVIDIA") and governs the use of the NVIDIA computer software and the documentation made available for use with such NVIDIA software. By downloading, installing, copying, or otherwise using the NVIDIA software and/or documentation, you agree to be bound by the terms of this SLA. If you do not agree to the terms of this SLA, do not download, install, copy or use the NVIDIA software or documentation. IF YOU ARE ENTERING INTO THIS SLA ON BEHALF OF A COMPANY OR OTHER LEGAL ENTITY, YOU REPRESENT THAT YOU HAVE THE LEGAL AUTHORITY TO BIND THE ENTITY TO THIS SLA, IN WHICH CASE “YOU” WILL MEAN THE ENTITY YOU REPRESENT. IF YOU DON’T HAVE SUCH AUTHORITY, OR IF YOU DON’T ACCEPT ALL THE TERMS AND CONDITIONS OF THIS SLA, THEN NVIDIA DOES NOT AGREE TO LICENSE THE LICENSED SOFTWARE TO YOU, AND YOU MAY NOT DOWNLOAD, INSTALL, COPY OR USE IT. 1. LICENSE. 1.1 License Grant. Subject to the terms of the AGREEMENT, NVIDIA hereby grants you a nonexclusive, non-transferable license, without the right to sublicense (except as expressly set forth in a Supplement), during the applicable license term unless earlier terminated as provided below, to have Authorized Users install and use the Software, including modifications (if expressly permitted in a Supplement), in accordance with the Documentation. You are only licensed to activate and use Licensed Software for which you a have a valid license, even if during the download or installation you are presented with other product options. No Orders are binding on NVIDIA until accepted by NVIDIA. Your Orders are subject to the AGREEMENT. SLA Supplements: Certain Licensed Software licensed under this SLA may be subject to additional terms and conditions that will be presented to you in a Supplement for acceptance prior to the delivery of such Licensed Software under this SLA and the applicable Supplement. Licensed Software will only be delivered to you upon your acceptance of all applicable terms. 1.2 Limited Purpose Licenses. If your license is provided for one of the purposes indicated below, then notwithstanding contrary terms in Section 1.1 or in a Supplement, such licenses are for internal use and do not include any right or license to sub-license and distribute the Licensed Software or its output in any way in any public release, however limited, and/or in any manner that provides third parties with use of or access to the Licensed Software or its functionality or output, including (but not limited to) external alpha or beta testing or development phases. Further: (i) Evaluation License. You may use evaluation licenses solely for your internal evaluation of the Licensed Software for broader adoption within your Enterprise or in connection with a NVIDIA product purchase decision, and such licenses have an expiration date as indicated by NVIDIA in its sole discretion (or ninety days from the date of download if no other duration is indicated). ©2018 NVIDIA Corporation. All Rights Reserved. 15 NVIDIA JetPack Documentation (ii) Educational/Academic License. You may use educational/academic licenses solely for educational purposes and all users must be enrolled or employed by an academic institution. 
If you do not meet NVIDIA’s academic program requirements for educational institutions, you have no rights under this license. (iii) Test/Development License. You may use test/development licenses solely for your internal development, testing and/or debugging of your software applications or for interoperability testing with the Licensed Software, and such licenses have an expiration date as indicated by NVIDIA in its sole discretion (or one year from the date of download if no other duration is indicated). 1.3 Pre-Release Licenses. With respect to alpha, beta, preview, and other pre-release Software and Documentation (“Pre-Release Licensed Software”) delivered to you under the AGREEMENT you acknowledge and agree that such Pre-Release Licensed Software (i) may not be fully functional, may contain errors or design flaws, and may have reduced or different security, privacy, accessibility, availability, and reliability standards relative to commercially provided NVIDIA software and documentation, and (ii) use of such Pre-Release Licensed Software may result in unexpected results, loss of data, project delays or other unpredictable damage or loss. THEREFORE, PRE-RELEASE LICENSED SOFTWARE IS NOT INTENDED FOR USE, AND SHOULD NOT BE USED, IN PRODUCTION OR BUSINESS-CRITICAL SYSTEMS. NVIDIA has no obligation to make available a commercial version of any Pre-Release Licensed Software and NVIDIA has the right to abandon development of Pre-Release Licensed Software at any time without liability. 1.4 Enterprise and Contractor Usage. You may allow your Enterprise employees and Contractors to access and use the Licensed Software pursuant to the terms of the AGREEMENT solely to perform work on your behalf, provided further that with respect to Contractors: (i) you obtain a written agreement from each Contractor which contains terms and obligations with respect to access to and use of Licensed Software no less protective of NVIDIA than those set forth in the AGREEMENT, and (ii) such Contractor’s access and use expressly excludes any sublicensing or distribution rights for the Licensed Software. You are responsible for the compliance with the terms and conditions of the AGREEMENT by your Enterprise and Contractors. Any act or omission that, if committed by you, would constitute a breach of the AGREEMENT shall be deemed to constitute a breach of the AGREEMENT if committed by your Enterprise or Contractors. 1.5 Services. Except as expressly indicated in an Order, NVIDIA is under no obligation to provide support for the Licensed Software or to provide any patches, maintenance, updates or upgrades under the AGREEMENT. Unless patches, maintenance, updates or upgrades are provided with their separate governing terms and conditions, they constitute Licensed Software licensed to you under the AGREEMENT. 2. LIMITATIONS. 2.1 License Restrictions. 
Except as expressly authorized in the AGREEMENT, you agree that you will not (nor authorize third parties to): (i) copy and use Software that was licensed to you for use in one or more NVIDIA hardware products in other unlicensed products (provided that copies solely for backup purposes are allowed); (ii) reverse engineer, decompile, disassemble (except to the extent applicable laws specifically require that such activities be permitted) or attempt to derive the source code, underlying ideas, algorithm or structure of Software provided to you in object code form; (iii) sell, transfer, assign, distribute, rent, loan, lease, sublicense or otherwise make available the Licensed Software

©2018 NVIDIA Corporation. All Rights Reserved. 16 NVIDIA JetPack Documentation or its functionality to third parties (a) as an application services provider or service bureau, (b) by operating hosted/virtual system environments, (c) by hosting, time sharing or providing any other type of services, or (d) otherwise by means of the internet; (iv) modify, translate or otherwise create any derivative works of any Licensed Software; (v) remove, alter, cover or obscure any proprietary notice that appears on or with the Licensed Software or any copies thereof; (vi) use the Licensed Software, or allow its use, transfer, transmission or export in violation of any applicable export control laws, rules or regulations; (vii) distribute, permit access to, or sublicense the Licensed Software as a stand-alone product; (viii) bypass, disable, circumvent or remove any form of copy protection, encryption, security or digital rights management or authentication mechanism used by NVIDIA in connection with the Licensed Software, or use the Licensed Software together with any authorization code, serial number, or other copy protection device not supplied by NVIDIA directly or through an authorized reseller; (ix) use the Licensed Software for the purpose of developing competing products or technologies or assisting a third party in such activities; (x) use the Licensed Software with any system or application where the use or failure of such system or application can reasonably be expected to threaten or result in personal injury, death, or catastrophic loss including, without limitation, use in connection with any nuclear, avionics, navigation, military, medical, life support or other life critical application (“Critical Applications”), unless the parties have entered into a Critical Applications agreement; (xi) distribute any modification or derivative work you make to the Licensed Software under or by reference to the same name as used by NVIDIA; or (xii) use the Licensed Software in any manner that would cause the Licensed Software to become subject to an Excluded License. Nothing in the AGREEMENT shall be construed to give you a right to use, or otherwise obtain access to, any source code from which the Software or any portion thereof is compiled or interpreted. You acknowledge that NVIDIA does not design, test, manufacture or certify the Licensed Software for use in the context of a Critical Application and NVIDIA shall not be liable to you or any third party, in whole or in part, for any claims or damages arising from such use. You agree to defend, indemnify and hold harmless NVIDIA and its Affiliates, and their respective employees, contractors, agents, officers and directors, from and against any and all claims, damages, obligations, losses, liabilities, costs or debt, fines, restitutions and expenses (including but not limited to attorney’s fees and costs incident to establishing the right of indemnification) arising out of or related to you and your Enterprise, and their respective employees, contractors, agents, distributors, resellers, end users, officers and directors use of Licensed Software outside of the scope of the AGREEMENT or any other breach of the terms of the AGREEMENT. 2.2 Third Party License Obligations. The Licensed Software may come bundled with, or otherwise include or be distributed with, third party software licensed by an NVIDIA supplier and/or open source software provided under an open source license (collectively, “Third Party Software”). 
Notwithstanding anything to the contrary herein, Third Party Software is licensed to you subject to the terms and conditions of the software license agreement accompanying such Third Party Software whether in the form of a discrete agreement, click- through license, or electronic license terms accepted at the time of installation and any additional terms or agreements provided by the third party licensor (“Third Party License Terms”). Use of the Third Party Software by you shall be governed by such Third Party License Terms, or if no Third Party License Terms apply, then the Third Party Software is provided to you as-is, without support or warranty or indemnity obligations, for use in or

©2018 NVIDIA Corporation. All Rights Reserved. 17 NVIDIA JetPack Documentation with the Licensed Software and not otherwise used separately. Copyright to Third Party Software is held by the copyright holders indicated in the Third Party License Terms. Audio/Video Encoders and Decoders. You acknowledge and agree that it is your sole responsibility to obtain any additional third party licenses required to make, have made, use, have used, sell, import, and offer for sale your products or services that include or incorporate any Third Party Software and content relating to audio and/or video encoders and decoders from, including but not limited to, Microsoft, Thomson, Fraunhofer IIS, Sisvel S.p.A., MPEG-LA, and Coding Technologies as NVIDIA does not grant to you under the AGREEMENT any necessary patent or other rights with respect to audio and/or video encoders and decoders. 2.3 Limited Rights. Your rights in the Licensed Software are limited to those expressly granted under the AGREEMENT and no other licenses are granted whether by implication, estoppel or otherwise. NVIDIA reserves all rights, title and interest in and to the Licensed Software not expressly granted under the AGREEMENT. 3. CONFIDENTIALITY. Neither party will use the other party’s Confidential Information, except as necessary for the performance of the AGREEMENT, nor will either party disclose such Confidential Information to any third party, except to personnel of NVIDIA and its Affiliates, you, your Enterprise, your Enterprise Contractors, and each party’s legal and financial advisors that have a need to know such Confidential Information for the performance of the AGREEMENT, provided that each such personnel, employee and Contractors are subject to a written agreement that includes confidentiality obligations consistent with those set forth herein. Each party will use all reasonable efforts to maintain the confidentiality of all of the other party’s Confidential Information in its possession or control, but in no event less than the efforts that it ordinarily uses with respect to its own Confidential Information of similar nature and importance. The foregoing obligations will not restrict either party from disclosing the other party’s Confidential Information or the terms and conditions of the AGREEMENT as required under applicable securities regulations or pursuant to the order or requirement of a court, administrative agency, or other governmental body, provided that the party required to make such disclosure (i) gives reasonable notice to the other party to enable it to contest such order or requirement prior to its disclosure (whether through protective orders or otherwise), (ii) uses reasonable effort to obtain confidential treatment or similar protection to the fullest extent possible to avoid such public disclosure, and (iii) discloses only the minimum amount of information necessary to comply with such requirements. NVIDIA Confidential Information under the AGREEMENT includes output from Licensed Software developer tools identified as “Pro” versions, where the output reveals functionality or performance data pertinent to NVIDIA hardware or software products. 4. OWNERSHIP. You are not obligated to disclose to NVIDIA any modifications that you, your Enterprise or your Contractors make to the Licensed Software as permitted under the AGREEMENT. As between the parties, all modifications are owned by NVIDIA and licensed to you under the AGREEMENT unless otherwise expressly provided in a Supplement. 
The Licensed Software and all modifications owned by NVIDIA, and the respective Intellectual Property Rights therein, are and will remain the sole and exclusive property of NVIDIA or its licensors. You shall not engage in any act or omission that would impair NVIDIA’s and/or its licensors’ Intellectual Property Rights in the Licensed Software or any other materials, information, processes or subject matter proprietary to NVIDIA. NVIDIA’s licensors are intended third party beneficiaries with the right to enforce

©2018 NVIDIA Corporation. All Rights Reserved. 18 NVIDIA JetPack Documentation provisions of the AGREEMENT with respect to their Confidential Information and/or Intellectual Property Rights. 5. FEEDBACK. You may, but you are not obligated, to provide Feedback to NVIDIA. You hereby grant NVIDIA and its Affiliates a perpetual, non-exclusive, worldwide, irrevocable license to use, reproduce, modify, license, sublicense (through multiple tiers of sublicensees), distribute (through multiple tiers of distributors) and otherwise commercialize any Feedback that you voluntarily provide without the payment of any royalties or fees to you. NVIDIA has no obligation to respond to Feedback or to incorporate Feedback into the Licensed Software. 6. NO WARRANTIES. THE LICENSED SOFTWARE AND ANY CONFIDENTIAL INFORMATION AND/OR SERVICES ARE PROVIDED BY NVIDIA “AS IS” AND “WITH ALL FAULTS,” AND NVIDIA AND ITS AFFILIATES EXPRESSLY DISCLAIM ALL WARRANTIES OF ANY KIND OR NATURE, WHETHER EXPRESS, IMPLIED OR STATUTORY, INCLUDING, BUT NOT LIMITED TO, ANY WARRANTIES OF OPERABILITY, CONDITION, VALUE, ACCURACY OF DATA, OR QUALITY, AS WELL AS ANY WARRANTIES OF MERCHANTABILITY, SYSTEM INTEGRATION, WORKMANSHIP, SUITABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE, NON-INFRINGEMENT, OR THE ABSENCE OF ANY DEFECTS THEREIN, WHETHER LATENT OR PATENT. NO WARRANTY IS MADE ON THE BASIS OF TRADE USAGE, COURSE OF DEALING OR COURSE OF TRADE. WITHOUT LIMITING THE FOREGOING, NVIDIA AND ITS AFFILIATES DO NOT WARRANT THAT THE LICENSED SOFTWARE OR ANY CONFIDENTIAL INFORMATION AND/OR SERVICES PROVIDED UNDER THE AGREEMENT WILL MEET YOUR REQUIREMENTS OR THAT THE OPERATION THEREOF WILL BE UNINTERRUPTED OR ERROR-FREE, OR THAT ALL ERRORS WILL BE CORRECTED. 7. LIMITATION OF LIABILITY. TO THE MAXIMUM EXTENT PERMITTED BY LAW, NVIDIA AND ITS AFFILIATES SHALL NOT BE LIABLE FOR ANY SPECIAL, INCIDENTAL, PUNITIVE OR CONSEQUENTIAL DAMAGES, OR ANY LOST PROFITS, LOSS OF USE, LOSS OF DATA OR LOSS OF GOODWILL, OR THE COSTS OF PROCURING SUBSTITUTE PRODUCTS, ARISING OUT OF OR IN CONNECTION WITH THE AGREEMENT OR THE USE OR PERFORMANCE OF THE LICENSED SOFTWARE AND ANY CONFIDENTIAL INFORMATION AND/OR SERVICES PROVIDED UNDER THE AGREEMENT, WHETHER SUCH LIABILITY ARISES FROM ANY CLAIM BASED UPON BREACH OF CONTRACT, BREACH OF WARRANTY, TORT (INCLUDING NEGLIGENCE), PRODUCT LIABILITY OR ANY OTHER CAUSE OF ACTION OR THEORY OF LIABILITY. IN NO EVENT WILL NVIDIA’S AND ITS AFFILIATES TOTAL CUMULATIVE LIABILITY UNDER OR ARISING OUT OF THE AGREEMENT EXCEED THE NET AMOUNTS RECEIVED BY NVIDIA OR ITS AFFILIATES FOR YOUR USE OF THE PARTICULAR LICENSED SOFTWARE DURING THE TWELVE (12) MONTHS BEFORE THE LIABILITY AROSE (or up to US$10.00 if you acquired the Licensed Software for no charge). THE NATURE OF THE LIABILITY, THE NUMBER OF CLAIMS OR SUITS OR THE NUMBER OF PARTIES WITHIN YOUR ENTERPRISE THAT ACCEPTED THE TERMS OF THE AGREEMENT SHALL NOT ENLARGE OR EXTEND THIS LIMIT. THE FOREGOING LIMITATIONS SHALL APPLY REGARDLESS OF WHETHER NVIDIA, ITS AFFILIATES OR ITS LICENSORS HAVE BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES AND REGARDLESS OF WHETHER ANY REMEDY FAILS ITS ESSENTIAL PURPOSE. YOU ACKNOWLEDGE THAT NVIDIA’S OBLIGATIONS UNDER THE AGREEMENT ARE FOR THE BENEFIT OF YOU ONLY. 
The disclaimers, exclusions and limitations of liability set forth in the AGREEMENT form an essential basis of the bargain between the parties, and, absent any such disclaimers, exclusions or limitations of liability, the provisions of the AGREEMENT, including, without limitation, the economic terms, would be substantially different. 8. TERM AND TERMINATION.

©2018 NVIDIA Corporation. All Rights Reserved. 19 NVIDIA JetPack Documentation 8.1 AGREEMENT, Licenses and Services. This SLA shall become effective upon the Effective Date, each Supplement upon their acceptance, and both this SLA and Supplements shall continue in effect until your last access or use of the Licensed Software and/or services hereunder, unless earlier terminated as provided in this “Term and Termination” section. Each Licensed Software license ends at the earlier of (a) the expiration of the applicable license term, or (b) termination of such license or the AGREEMENT. Each service ends at the earlier of (x) the expiration of the applicable service term, (y) termination of such service or the AGREEMENT, or (z) expiration or termination of the associated license and no credit or refund will be provided upon the expiration or termination of the associated license for any service fees paid. 8.2 Termination and Effect of Expiration or Termination. NVIDIA may terminate the AGREEMENT in whole or in part: (i) if you breach any term of the AGREEMENT and fail to cure such breach within thirty (30) days following notice thereof from NVIDIA (or immediately if you violate NVIDIA’s Intellectual Property Rights); (ii) if you become the subject of a voluntary or involuntary petition in bankruptcy or any proceeding relating to insolvency, receivership, liquidation or composition for the benefit of creditors, if that petition or proceeding is not dismissed with prejudice within sixty (60) days after filing, or if you cease to do business; or (iii) if you commence or participate in any legal proceeding against NVIDIA, with respect to the Licensed Software that is the subject of the proceeding during the pendency of such legal proceeding. If you or your authorized NVIDIA reseller fail to pay license fees or service fees when due then NVIDIA may, in its sole discretion, suspend or terminate your license grants, services and any other rights provided under the AGREEMENT for the affected Licensed Software, in addition to any other remedies NVIDIA may have at law or equity. Upon any expiration or termination of the AGREEMENT, a license or a service provided hereunder, (a) any amounts owed to NVIDIA become immediately due and payable, (b) you must promptly discontinue use of the affected Licensed Software and/or service, and (c) you must promptly destroy or return to NVIDIA all copies of the affected Licensed Software and all portions thereof in your possession or control, and each party will promptly destroy or return to the other all of the other party’s Confidential Information within its possession or control. Upon written request, you will certify in writing that you have complied with your obligations under this section. Upon expiration or termination of the AGREEMENT all provisions survive except for the license grant provisions. 9. CONSENT TO COLLECTION AND USE OF INFORMATION. You hereby agree and acknowledge that the Software may access and collect data about your Enterprise computer systems as well as configures the systems in order to (a) properly optimize such systems for use with the Software, (b) deliver content through the Software, (c) improve NVIDIA products and services, and (d) deliver marketing communications. 
Data collected by the Software includes, but is not limited to, system (i) hardware configuration and ID, (ii) operating system and driver configuration, (iii) installed applications, (iv) applications settings, performance, and usage data, and (iv) usage metrics of the Software. To the extent that you use the Software, you hereby consent to all of the foregoing, and represent and warrant that you have the right to grant such consent. In addition, you agree that you are solely responsible for maintaining appropriate data backups and system restore points for your Enterprise systems, and that NVIDIA will have no responsibility for any damage or loss to such systems (including loss of data or access) arising from or relating to (a) any changes to the configuration, application settings, environment ©2018 NVIDIA Corporation. All Rights Reserved. 20 NVIDIA JetPack Documentation variables, registry, drivers, BIOS, or other attributes of the systems (or any part of such systems) initiated through the Software; or (b) installation of any Software or third party software patches initiated through the Software. In certain systems you may change your system update preferences by unchecking "Automatically check for updates" in the "Preferences" tab of the control panel for the Software. In connection with the receipt of the Licensed Software or services you may receive access to links to third party websites and services and the availability of those links does not imply any endorsement by NVIDIA. NVIDIA encourages you to review the privacy statements on those sites and services that you choose to visit so that you can understand how they may collect, use and share personal information of individuals. NVIDIA is not responsible or liable for: (i) the availability or accuracy of such links; or (ii) the products, services or information available on or through such links; or (iii) the privacy statements or practices of sites and services controlled by other companies or organizations. To the extent that you or members of your Enterprise provide to NVIDIA during registration or otherwise personal data, you acknowledge that such information will be collected, used and disclosed by NVIDIA in accordance with NVIDIA's privacy policy, available at URL http://www.nvidia.com/object/privacy_policy.html. 10. GENERAL. This SLA, any Supplements incorporated hereto, and Orders constitute the entire agreement of the parties with respect to the subject matter hereto and supersede all prior negotiations, conversations, or discussions between the parties relating to the subject matter hereto, oral or written, and all past dealings or industry custom. Any additional and/or conflicting terms and conditions on purchase order(s) or any other documents issued by you are null, void, and invalid. Any amendment or waiver under the AGREEMENT must be in writing and signed by representatives of both parties. The AGREEMENT and the rights and obligations thereunder may not be assigned by you, in whole or in part, including by merger, consolidation, dissolution, operation of law, or any other manner, without written consent of NVIDIA, and any purported assignment in violation of this provision shall be void and of no effect. NVIDIA may assign, delegate or transfer the AGREEMENT and its rights and obligations hereunder, and if to a non-Affiliate you will be notified. 
Each party acknowledges and agrees that the other is an independent contractor in the performance of the AGREEMENT, and each party is solely responsible for all of its employees, agents, contractors, and labor costs and expenses arising in connection therewith. The parties are not partners, joint ventures or otherwise affiliated, and neither has any authority to make any statements, representations or commitments of any kind to bind the other party without prior written consent. Neither party will be responsible for any failure or delay in its performance under the AGREEMENT (except for any payment obligations) to the extent due to causes beyond its reasonable control for so long as such force majeure event continues in effect. The AGREEMENT will be governed by and construed under the laws of the State of Delaware and the United States without regard to the conflicts of law provisions thereof and without regard to the United Nations Convention on Contracts for the International Sale of Goods. The parties consent to the personal jurisdiction of the federal and state courts located in Santa Clara County, California. You acknowledge and agree that a breach of any of your promises or agreements

©2018 NVIDIA Corporation. All Rights Reserved. 21 NVIDIA JetPack Documentation contained in the AGREEMENT may result in irreparable and continuing injury to NVIDIA for which monetary damages may not be an adequate remedy and therefore NVIDIA is entitled to seek injunctive relief as well as such other and further relief as may be appropriate. If any court of competent jurisdiction determines that any provision of the AGREEMENT is illegal, invalid or unenforceable, the remaining provisions will remain in full force and effect. Unless otherwise specified, remedies are cumulative. The Licensed Software has been developed entirely at private expense and is “commercial items” consisting of “commercial computer software” and “commercial computer software documentation” provided with RESTRICTED RIGHTS. Use, duplication or disclosure by the U.S. Government or a U.S. Government subcontractor is subject to the restrictions set forth in the AGREEMENT pursuant to DFARS 227.7202-3(a) or as set forth in subparagraphs (c)(1) and (2) of the Commercial Computer Software - Restricted Rights clause at FAR 52.227-19, as applicable. Contractor/manufacturer is NVIDIA, 2788 San Tomas Expressway, Santa Clara, CA 95051. You acknowledge that the Licensed Software described under the AGREEMENT is subject to export control under the U.S. Export Administration Regulations (EAR) and economic sanctions regulations administered by the U.S. Department of Treasury’s Office of Foreign Assets Control (OFAC). Therefore, you may not export, reexport or transfer in-country the Licensed Software without first obtaining any license or other approval that may be required by BIS and/or OFAC. You are responsible for any violation of the U.S. or other applicable export control or economic sanctions laws, regulations and requirements related to the Licensed Software. By accepting this SLA, you confirm that you are not a resident or citizen of any country currently embargoed by the U.S. and that you are not otherwise prohibited from receiving the Licensed Software. Any notice delivered by NVIDIA to you under the AGREEMENT will be delivered via mail, email or fax. Please direct your legal notices or other correspondence to NVIDIA Corporation, 2788 San Tomas Expressway, Santa Clara, California 95051, United States of America, Attention: Legal Department. GLOSSARY OF TERMS Certain capitalized terms, if not otherwise defined elsewhere in this SLA, shall have the meanings set forth below: a. “Affiliate” means any legal entity that Owns, is Owned by, or is commonly Owned with a party. “Own” means having more than 50% ownership or the right to direct the management of the entity. b. “AGREEMENT” means this SLA and all associated Supplements entered by the parties referencing this SLA. c. “Authorized Users” means your Enterprise individual employees and any of your Enterprise’s Contractors, subject to the terms of the “Enterprise and Contractors Usage” section. d. 
“Confidential Information” means the Licensed Software (unless made publicly available by NVIDIA without confidentiality obligations), and any NVIDIA business, marketing, pricing, research and development, know-how, technical, scientific, financial status, proposed new products or other information disclosed by NVIDIA to you which, at the time of disclosure, is designated in writing as confidential or proprietary (or like written designation), or orally identified as confidential or proprietary or is otherwise reasonably identifiable by parties exercising reasonable business judgment, as confidential. Confidential Information does not and will not include information that: (i) is or becomes generally known to the public through

©2018 NVIDIA Corporation. All Rights Reserved. 22 NVIDIA JetPack Documentation no fault of or breach of the AGREEMENT by the receiving party; (ii) is rightfully known by the receiving party at the time of disclosure without an obligation of confidentiality; (iii) is independently developed by the receiving party without use of the disclosing party’s Confidential Information; or (iv) is rightfully obtained by the receiving party from a third party without restriction on use or disclosure. e. “Contractor” means an individual who works primarily for your Enterprise on a contractor basis from your secure network. f. “Documentation” means the NVIDIA documentation made available for use with the Software, including (without limitation) user manuals, datasheets, operations instructions, installation guides, release notes and other materials provided to you under the AGREEMENT. g. “Enterprise” means you or any company or legal entity for which you accepted the terms of this SLA, and their subsidiaries of which your company or legal entity owns more than fifty percent (50%) of the issued and outstanding equity. h. “Excluded License” includes, without limitation, a software license that requires as a condition of use, modification, and/or distribution that software be (i) disclosed or distributed in source code form; (ii) licensed for the purpose of making derivative works; or (iii) redistributable at no charge. i. “Feedback” means any and all suggestions, feature requests, comments or other feedback regarding the Licensed Software, including possible enhancements or modifications thereto. j. “Intellectual Property Rights” means all patent, copyright, trademark, trade secret, trade dress, trade names, utility models, mask work, moral rights, rights of attribution or integrity service marks, master recording and music publishing rights, performance rights, author’s rights, database rights, registered design rights and any applications for the protection or registration of these rights, or other intellectual or industrial property rights or proprietary rights, howsoever arising and in whatever media, whether now known or hereafter devised, whether or not registered, (including all claims and causes of action for infringement, misappropriation or violation and all rights in any registrations and renewals), worldwide and whether existing now or in the future. k. “Licensed Software” means Software, Documentation and all modifications owned by NVIDIA. l. “Order” means a purchase order issued by you, a signed purchase agreement with you, or other ordering document issued by you to NVIDIA or a NVIDIA authorized reseller (including any online acceptance process) that references and incorporates the AGREEMENT and is accepted by NVIDIA. m. “Software” means the NVIDIA software programs licensed to you under the AGREEMENT including, without limitation, libraries, sample code, utility programs and programming code. n. “Supplement” means the additional terms and conditions beyond those stated in this SLA that apply to certain Licensed Software licensed hereunder.

A.7 Camera Driver Installation Instructions

The following are instructions for installing the CA378-AOIS camera driver on the Jetson TX2. Most of the information comes from the CenturyArks GitHub repository (https://github.com/centuryarks/CA378-AOIS/tree/master/JetsonTX2), with some modifications made for compatibility with JetPack 3.3.

A.7.1 Step 1: Setting sudo permissions

1. Open a terminal window and execute the following command:

sudo visudo

2. Add the following two lines (beginning with nvidia) to the file:

# User privilege specification
root    ALL=(ALL:ALL) ALL
nvidia  ALL=(ALL:ALL) ALL

# Members of the admin group may gain root privileges
%admin ALL=(ALL) ALL

# Allow members of group sudo to execute any command
%sudo   ALL=(ALL:ALL) ALL
%nvidia ALL=(ALL:ALL) NOPASSWD: ALL

3. Reboot the Jetson using the following command:

sudo reboot

A.7.2 Step 2: Installing the driver

1. Download the CA378-AOIS MIPI-2L v1.1.0 for Jetson TX2 (L4T 28.2) prerelease from the GitHub page using the following command in a terminal window:

wget --no-check-certificate https://github.com/centuryarks/CA378-AOIS/releases/download/v1.1.0/CA378_2L_v1.1.0_L4T28.2.tar.gz

2. Extract the archive:

tar zxvf CA378_2L_v1.1.0_L4T28.2.tar.gz

3. Enter the extracted folder and run the installation script, specifying the number of cameras as 1:

$ cd CA378_2L_v1.1.0_L4T28.2/
$ ./Install.sh
What is the number of camera connections? : 1

4. Download the driver onto the computer running Ubuntu 16.04 which was used to flash the Jetson, using the following link: https://github.com/centuryarks/CA378-AOIS/releases/download/v1.1.0/CA378_2L_v1.1.0_L4T28.2.tar.gz

5. On that computer, extract the download and copy the dtb folder into the $JETSON_INSTALL_LOCATION/64_TX2/Linux_for_Tegra_64_tx2/ folder.

6. Navigate to that folder and run the InstallDTB.sh script.

7. (Optional) Download the demo from https://github.com/centuryarks/Sample/releases/download/v1.1.0/demo_v1.1.0_tx2.tar.gz onto the Jetson and run it to ensure that the camera driver has been installed properly. A shorter sanity check is sketched below.
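As a quick sanity check after installation (in place of, or in addition to, the optional demo), a single frame can be grabbed from the camera through GStreamer. The snippet below is a minimal sketch only: it assumes OpenCV with GStreamer support is already installed (see A.8), and the 640x480, 30 fps pipeline shown here is an assumption that may need to be adjusted to a mode supported by the driver.

import cv2

# GStreamer pipeline for the onboard CSI camera (same form as the one in A.11)
gst_str = ('nvcamerasrc ! '
           'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, '
           'format=(string)I420, framerate=(fraction)30/1 ! '
           'nvvidconv ! '
           'video/x-raw, width=(int)640, height=(int)480, format=(string)BGRx ! '
           'videoconvert ! appsink')

cap = cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
ret, frame = cap.read()
cap.release()

if ret:
    cv2.imwrite('camera_test.png', frame)  # save the captured frame for inspection
    print('Captured a {}x{} frame'.format(frame.shape[1], frame.shape[0]))
else:
    print('Failed to read from the camera; check the driver installation')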

A.8 Object Detection API and OpenCV Installation Instructions

The instructions for installing the TensorFlow Object Detection API are a reproduction of the instructions available at https://github.com/rbnprdy/tf_trt_models.

1. Install dependencies:

sudo apt-get install python-pip python-matplotlib python-pil

2. Install TensorFlow:

pip3 install \
    tensorflow-1.8.0-cp35-cp35m-linux_aarch64.whl \
    --user

3. Clone the repository:

git clone --recursive \
    https://github.com/rbnprdy/tf_trt_models.git
cd tf_trt_models

4. Run the installation script:

./install.sh python3

OpenCV can be installed using the following GitHub repository: https://github.com/jetsonhacks/buildOpenCVTX2. Clone the repository, enter the repository directory, and then run ./buildOpenCV.sh.
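Since the camera pipeline used elsewhere in this document relies on GStreamer, it may be worth confirming that the OpenCV build picked up GStreamer support. A minimal check, assuming the Python bindings were installed by the build script:

import cv2

print(cv2.__version__)
# The build information should report GStreamer support as YES
for line in cv2.getBuildInformation().split('\n'):
    if 'GStreamer' in line:
        print(line)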

A.9 Jetson TX2 Installation Instructions

A.10 TensorRT Instructions

In order to optimize a TensorFlow graph with TensorRT, the following code should be used, where frozen_graph and output_names are the frozen graph and output node names produced by build_detection_graph (see A.11):

trt_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 25,
    precision_mode='FP16',
    minimum_segment_size=50
)

This creates a TensorRT optimized graph from a frozen graph.
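Because building the optimized graph can take several minutes, it is convenient to serialize it once and reload it on later runs (the network speed model in A.11 follows this pattern). The following is a minimal sketch, assuming trt_graph was produced as above; the file name optimized_model.pb is arbitrary.

import tensorflow as tf

# Save the optimized graph to disk
with tf.gfile.GFile('optimized_model.pb', 'wb') as f:
    f.write(trt_graph.SerializeToString())

# Later, reload it and import it into the default graph
trt_graph = tf.GraphDef()
with tf.gfile.GFile('optimized_model.pb', 'rb') as f:
    trt_graph.ParseFromString(f.read())
tf.import_graph_def(trt_graph, name='')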

A.11 Network Speed Model Code

import os
import time
import tensorflow as tf
import numpy as np
import cv2
import tensorflow.contrib.tensorrt as trt
from tf_trt_models.detection import download_detection_model
from tf_trt_models.detection import build_detection_graph
from object_detection.utils import label_map_util
from object_detection.utils import visualization_utils as vis_util

def open_cam_onboard(width, height):
    # On versions of L4T prior to 28.1, add 'flip-method=2' into gst_str
    gst_str = ('nvcamerasrc ! '
               'video/x-raw(memory:NVMM), '
               'width=(int)640, height=(int)480, '
               'format=(string)I420, framerate=(fraction)240/1 ! '
               'nvvidconv ! '
               'video/x-raw, width=(int){}, height=(int){}, '
               'format=(string)BGRx ! '
               'videoconvert ! appsink').format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

def load_image_into_numpy_array(image):
    (im_width, im_height) = image.size
    return np.array(image.getdata()).reshape(
        (im_height, im_width, 3)).astype(np.uint8)

WIDTH = 640
HEIGHT = 480

# Define the video stream
cap = open_cam_onboard(WIDTH, HEIGHT)

# What model to download.
MODEL = 'ssdlite_mobilenet_v2_coco'
DATA_DIR = './data/'
CONFIG_FILE = MODEL + '.config'
CHECKPOINT_FILE = 'model.ckpt'
PATH_TO_LABELS = '../third_party/models/research/object_detection/data/' +\
    'mscoco_label_map.pbtxt'
OPTIMIZED_MODEL_FILE = 'optimized_model.pbtxt'
# Number of classes to detect
NUM_CLASSES = 90

if not os.path.exists(os.path.join(DATA_DIR, OPTIMIZED_MODEL_FILE)):
    print('Creating optimized graph...')
    # Download model and build frozen graph
    config_path, checkpoint_path = download_detection_model(MODEL, 'data')
    frozen_graph, input_names, output_names = build_detection_graph(
        config=config_path,
        checkpoint=checkpoint_path,
        score_threshold=0.3,
        batch_size=1
    )
    # Optimize with TensorRT
    trt_graph = trt.create_inference_graph(
        input_graph_def=frozen_graph,
        outputs=output_names,
        max_batch_size=1,
        max_workspace_size_bytes=1 << 25,
        precision_mode='FP16',
        minimum_segment_size=50
    )
    with tf.gfile.GFile(
            os.path.join(DATA_DIR, OPTIMIZED_MODEL_FILE), 'wb') as f:
        f.write(trt_graph.SerializeToString())
else:
    print('Loaded optimized graph')
    trt_graph = tf.GraphDef()
    with open(os.path.join(DATA_DIR, OPTIMIZED_MODEL_FILE), 'rb') as f:
        trt_graph.ParseFromString(f.read())

# Loading label map
label_map = label_map_util.load_labelmap(PATH_TO_LABELS)
categories = label_map_util.convert_label_map_to_categories(
    label_map, max_num_classes=NUM_CLASSES, use_display_name=True)
category_index = label_map_util.create_category_index(categories)

# Create session and load graph
tf_config = tf.ConfigProto()
tf_config.gpu_options.allow_growth = True
tf_sess = tf.Session(config=tf_config)
tf.import_graph_def(trt_graph, name='')
tf_input = tf_sess.graph.get_tensor_by_name('image_tensor:0')
tf_scores = tf_sess.graph.get_tensor_by_name('detection_scores:0')
tf_boxes = tf_sess.graph.get_tensor_by_name('detection_boxes:0')
tf_classes = tf_sess.graph.get_tensor_by_name('detection_classes:0')
tf_num_detections = tf_sess.graph.get_tensor_by_name('num_detections:0')

# Number of iterations
num_iterations = 1000
total_time = 0
for i in range(0, num_iterations):
    print('{}/{}'.format(i, num_iterations))
    start_time = time.time()
    # Read frame from camera
    ret, image_np = cap.read()
    image_np_expanded = np.expand_dims(image_np, axis=0)
    (boxes, scores, classes, num_detections) = tf_sess.run(
        [tf_boxes, tf_scores, tf_classes, tf_num_detections],
        feed_dict={tf_input: image_np_expanded})
    vis_util.visualize_boxes_and_labels_on_image_array(
        image_np,
        np.squeeze(boxes),
        np.squeeze(classes).astype(np.int32),
        np.squeeze(scores),
        category_index,
        use_normalized_coordinates=True,
        line_thickness=8)
    # Calculate FPS
    elapsed_time = time.time() - start_time
    cv2.imshow('object detection', image_np)
    cv2.waitKey(1)  # needed for the OpenCV window to actually refresh
    total_time += elapsed_time

print('average fps: {}'.format(num_iterations / total_time))
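One caveat when interpreting the reported average: the first few iterations are typically slower than steady state because of one-time initialization (memory allocation, engine setup, camera startup), which pulls the average frame rate down. A small optional change, not part of the original test code, is to run a few untimed warm-up iterations before the timed loop above:

# Optional warm-up, placed before the timed loop (hypothetical addition)
num_warmup = 20
for _ in range(num_warmup):
    ret, image_np = cap.read()
    tf_sess.run([tf_boxes, tf_scores, tf_classes, tf_num_detections],
                feed_dict={tf_input: np.expand_dims(image_np, axis=0)})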

A.12 Test Data Sheets

Figure 46: The sheet used for Jitter Test 1: Distance Detection Test.

Figure 47: The sheet used for Jitter Test 2: Matlab Test.

Figure 48: The sheet used for Jitter Test 3: Tennis Ball Test.

Figure 49: The sheet used for the heat test.

Figure 50: The sheet used for the water test.

Figure 51: The Battery Test Data Sheet.

A.13 Camera Angle Calculations

Figure 52: Excel calculations for camera angles.

A.14 Item Links

A.14.1 Bike & Bike Rack
• Roadmaster 26 Granite-Peak-Mountain-Bike: https://www.walmart.com/ip/Roadmaster-26-Granite-Peak-Mountain-Bike/728826935
• Lumintrail Bicycle Commuter Carrier Seatpost Bike Rack: https://www.amazon.com/Lumintrail-Bicycle-Commuter-Carrier-Seatpost/dp/B075NPJBZP/ref=cm_cr_arp_d_product_top?ie=UTF8

A.14.2 Developer Board
• Jetson TX2 Developer Kit: https://store.nvidia.com/store;jsessionid=7897122959004E3540FDC922606DA79D?Action=DisplayPage&Locale=en_US&SiteID=nvidia&id=QuickBuyCartPage
• SanDisk-Extreme-SD Card 128GB: https://www.amazon.com/SanDisk-Extreme-128GB-UHS-I-SDSDXXG-128G-GN4IN/dp/B01J5RH06K/ref=sr_1_9?ie=UTF8&qid=1542582166&sr=8-9&keywords=sd+card+128GB
• Puget Systems Acrylic Enclosure for NVIDIA Jetson TX1 and TX2 development kits: https://www.amazon.com/Puget-Systems-Acrylic-Enclosure-development/dp/B01AYPGMJS/ref=pd_ybh_a_15?_encoding=UTF8&psc=1&refRID=W6BME395KJJJ6SCQF7BN

A.14.3 Camera
• CA378-AOIS Camera: https://www.framos.com/us/ca378-aois-w.-i/f-board-for-jetsontx1/tx2-21794

A.14.4 Battery
• PowerAdd Charge Center 2 Battery: https://www.amazon.com/Poweradd-ChargerCenter%E2%85%A1-100000mAh-Generator-Smartphone/dp/B071FZRY7H/ref=sr_1_1?ie=UTF8&qid=1542577117&sr=8-1&keywords=poweradd+charging+center

A.14.5 Enclosure Parts
• Altelix-Fiberglass-Weatherproof-Equipment-Enclosure: https://www.amazon.com/Altelix-Fiberglass-Weatherproof-Equipment-Enclosure/dp/B076XZWMLM/ref=sr_1_7?s=industrial&ie=UTF8&qid=1541370055&sr=1-7&keywords=altelix+vented+nema+enclosure#feature-bullets-btf
• Supreme Tech Acrylic Dome Camera Enclosure: https://www.amazon.com/dp/B01MZGX7XY/ref=twister_B01AGZGOEE?_encoding=UTF8&psc=1
• 1/2 x 6 ft. Non-Metallic Liquidtight Whip: https://www.homedepot.com/p/AFC-Cable-Systems-1-2-x-6-ft-Non-Metallic-Liquidtight-Whip-8028/202286682
• 2.25 in. x 1 in. Vibration Isolator: https://www.homedepot.com/p/Powermate-2-25-in-x-1-in-Vibration-Isolator-094-0190RP/202593098
• 2-Hole 90° Angle Bracket - Silver Galvanized: https://www.homedepot.com/p/Superstrut-2-Hole-90-Angle-Bracket-Silver-Galvanized-ZAB201EG-10/100390324?cm_mmc=Shopping%7CG%7CVF%7CD27E%7C27-6_CONDUIT-BOXES-FITTINGS%7CNA%7CPLA%7c71700000033099037%7c58700003867178937%7c92700031090234816&gclid=EAIaIQobChMIserT0cHr3wIVQxx9Ch0fagYxEAQYBSABEgKAP_D_BwE&gclsrc=aw.ds
• #0 ACC Conduit and Pipe Hangers (5-Pack): https://www.homedepot.com/p/0-ACC-Conduit-and-Pipe-Hangers-5-Pack-26780/100130865

A.14.6 Tools & Test Equipment
• Weight Scale: https://www.amazon.com/Digital-Bathroom-GreaterGoods-Precision-Measurements/dp/B01929N69G/ref=sr_1_7_a_it?ie=UTF8&qid=1543180041&sr=8-7&keywords=weight+scale
• Wireless Thermometer: https://www.amazon.com/ECOWITT-WH0280-Thermometer-Hygrometer-Temperature/dp/B078VZHC6C/ref=sr_1_15?ie=UTF8&qid=1543180801&sr=8-15&keywords=small+thermometer+outdoor
• Sprinkler Head: https://www.amazon.com/Gilmour-Rectangular-Pattern-Spot-Sprinkler/dp/B0008IT0HS/ref=sr_1_6?ie=UTF8&qid=1543181951&sr=8-6&keywords=small%2Bsprinkler&th=1
• Woods Clamp Lamp with 10 Inch Reflector and Bulb Guard (300 Watt Bulb, 6 Foot Cord): https://www.amazon.com/Woods-Clamp-Lamp-Reflector-Guard/dp/B003XV8QOU/ref=pd_bxgy_60_2?_encoding=UTF8&pd_rd_i=B003XV8QOU&pd_rd_r=838861e8-0e95-11e9-a176-db5c0092f3eb&pd_rd_w=W4Yce&pd_rd_wg=KMAbX&pf_rd_p=6725dbd6-9917-451d-beba-16af7874e407&pf_rd_r=T3QP9YP13FDXX6PGA1D3&refRID=T3QP9YP13FDXX6PGA1D3&th=1
• 250W Heat Bulb - 2PK: https://www.amazon.com/Woods-Clamp-Lamp-Reflector-Guard/dp/B07FXWBZL1/ref=pd_bxgy_60_2?_encoding=UTF8&pd_rd_i=B003XV8QOU&pd_rd_r=838861e8-0e95-11e9-a176-db5c0092f3eb&pd_rd_w=W4Yce&pd_rd_wg=KMAbX&pf_rd_p=6725dbd6-9917-451d-beba-16af7874e407&pf_rd_r=T3QP9YP13FDXX6PGA1D3&refRID=T3QP9YP13FDXX6PGA1D3&th=1
• 1/2 in x 4 ft. #4 Rebar: https://www.homedepot.com/p/Weyerhaeuser-1-2-in-x-4-ft-4-Rebar-35616/202094286?MERCH=REC-_-PIPHorizontal2_rr-_-202532809-_-202094286-_-N
• Cobalt Chloride detection paper: https://www.indigoinstruments.com/test_strips/leak_detection/water-leak-test-detection-paper-33813-Co8.html

A.14.7 Other
• AC Infinity MULTIFAN S1: https://www.amazon.com/dp/B00G059G86/ref=sspa_dk_detail_5?psc=1&pd_rd_i=B00G059G86&pf_rd_m=ATVPDKIKX0DER&pf_rd_p=21517efd-b385-405b-a405-9a37af61b5b4&pd_rd_wg=PdGma&pf_rd_r=AXT0JAKNXK37E271RVCS&pf_rd_s=desktop-dp-sims&pf_rd_t=40701&pd_rd_w=O1aqi&pf_rd_i=desktop-dp-sims&pd_rd_r=259f6d48-eb7b-11e8-88d2-3b20857d970c

A.15 Camera Data Sheets


9. Electrical Characteristics

Table 9-1 Absolute Maximum Ratings

Item                                 Symbol   Ratings        Unit
Supply voltage (analog)              VANA     -0.3 to +3.3   V
Supply voltage (digital)             VDIG     -0.3 to +1.8   V
Supply voltage (interface)           VIF      -0.3 to +3.3   V
Input voltage (digital)              VI       -0.3 to +3.3   V
Output voltage (digital)             VO       -0.3 to +3.3   V
Guaranteed operating temperature     TOPR     -20 to +70     °C
Guaranteed storage temperature       TSTG     -30 to +80     °C
Guaranteed performance temperature   TSPEC    -20 to +60     °C

Table 9-2 Recommended Operating Voltage

Item                         Symbol   Ratings      Unit
Supply voltage (analog)      VANA     2.8 ± 0.1    V
Supply voltage (digital)     VDIG     1.05 ± 0.1   V
Supply voltage (interface)   VIF      1.8 ± 0.1    V


Copyright 2016, 2017, 2018 Sony Semiconductor Solutions Corporation


10. Power Consumption

Table 10-1 Power Consumption

Item (Normal mode)                        Symbol   Typ(*1)   Max(*2)   Unit

Full resolution, 60fps (H:4056, V:3040)
  Analog current (VANA)                   IANA     47.0      55.0      mA
  Digital current (VDIG)                  IDIG     323.7     446.5     mA
  I/O current (VIF)                       IIF      2.1       2.5       mA
  Total power                                      475.4     677.7     mW

Full resolution, 30fps (H:4056, V:3040)
  Analog current (VANA)                   IANA     25.2      33.4      mA
  Digital current (VDIG)                  IDIG     194.4     290.1     mA
  I/O current (VIF)                       IIF      2.4       2.8       mA
  Total power                                      279.0     435.7     mW

Movie, 1080@240fps (H:2028, V:1128)
  Analog current (VANA)                   IANA     54.2      61.7      mA
  Digital current (VDIG)                  IDIG     273.7     385.7     mA
  I/O current (VIF)                       IIF      2.3       2.8       mA
  Total power                                      444.3     627.6     mW

Movie, 1080@120fps (H:2028, V:1128)
  Analog current (VANA)                   IANA     29.1      37.2      mA
  Digital current (VDIG)                  IDIG     174.9     266.4     mA
  I/O current (VIF)                       IIF      2.5       2.7       mA
  Total power                                      269.5     419.3     mW

SW STB
  Analog current (VANA)                   IANA     1.6       6.7       mA
  Digital current (VDIG)                  IDIG     3.5       65.0      mA
  I/O current (VIF)                       IIF      0.011     0.021     mA
  Total power                                      8.1       94.3      mW

HW STB
  Analog current (VANA)                   IANA     0.004     0.023     mA
  Digital current (VDIG)                  IDIG     3.4       60.0      mA
  I/O current (VIF)                       IIF      0.001     0.003     mA
  Total power                                      3.5       69.1      mW

*1. VANA=2.8V, VDIG=1.05V, VIF=1.8V, Ta=25°C
*2. VANA=2.9V, VDIG=1.15V, VIF=1.9V, Ta=60°C
*3. Typ. & Max values are the results of ES evaluation; they are not the specification value.
*4. The above values are values during stable state, not instantaneous values.
*5. Normal mode: Additional functions are SONY recommended condition.


Table 5-10 Typical image output of main capture modes (1)

Mode                                            Full resolution (10bit)   2 Binning (V:1/2, H:1/2)   SME-HDR 2 Binning (V:1/2, 12bit)
Number of vertical lines in imaging area        3044                      1524                       1524
Number of horizontal pixels in effective area   4056                      2028                       4056

Areas (start position, number of lines):
  Frame start                                   (1, 1)                    (1, 1)                     (1, 1)
  Embedded data lines                           (2, 2)                    (2, 2)                     (2, 2)
  Effective vertical pixel area                 (3, 3040)                 (3, 1520)                  (3, 1520)
  PD data                                       (3043, 1)                 (3043, 1)                  (3043, 1)
  Gyro data                                     (3044, 1)                 (3044, 1)                  (3044, 1)
  Frame end                                     (3044, 1)                 (3044, 1)                  (3044, 1)

Table 5-11 Typical image output of main capture modes (2)


Table 5-12 Typical image output of main capture modes (3)

Table 5-13 Typical image output of main capture modes (4)


Table 5-14 FOV% vs FPS(Crop)

The following table lists the settings for the supported operation modes of capture mode. Other settings are not supported in this sensor.

Table 5-15 Support list of operation modes of capture mode

HDR setting registers: 0x0220 [5:0], 0x3140 [4:0], 0x0221 [7:0] (fields HDR_MODE, HDR_FNCSEL, HDR_RESO_REDU_V, HDR_RESO_REDU_H)
BINNING setting registers: 0x0900 [0], 0x0901 [7:0] (fields BINNING_MODE, BINNING_TYPE_V, BINNING_TYPE_H)
Sub-sampling setting registers: 0x0381 [1:0], 0x0383 [2:0], 0x0385 [1:0], 0x0387 [2:0] (fields Y_EVN_INC, Y_ODD_INC, X_EVN_INC, X_ODD_INC)

Supported operation modes: Full Resolution; V2 Binning; HV2 Binning; V2 Binning + H2 Binning with 2 sub-sampling; HDR Full Resolution; HDR V2 Binning.

*1: X_EVN_INC and X_ODD_INC are fixed values (1) in this sensor.

A.16 Risk Analysis and Mitigation Plan

Figure 53: Risks and mitigation plan table.

Figure 54: Risk chart.
