Case Study: Reimagined Machine Vision with On-Camera Deep Learning

The FLIR® Firefly® camera adds a new level of intelligence to machine vision and image analysis, supporting on-device inference with an Intel® Movidius™ Myriad™ 2 VPU.

Automated analysis of images captured by cameras is a key part of our day-to-day lives that we may not often think about. The quality and affordability of the phones in our pockets, the cars we drive, and the food on our plates are made possible by machines using cameras for process automation, quality inspection, and robot guidance. Without this kind of “machine vision,” the high speed and high volume required for these tasks would make them too tedious and error-prone for humans to handle reliably or affordably.

As the technology has developed, machine vision has progressed from basic inspection and sorting operations to more complex tasks, such as guiding manufacturing robots in automotive factories and enhancing surveillance applications. Still, there has been a hard limit on the capabilities of these systems because they rely on pre-established rules. For example, machine vision has been well suited to reading a standardized barcode or checking a manufactured part against specifications, but not to the subjective judgment of whether a piece of fruit is of export quality.

Neural networks trained using modern deep learning techniques take image analysis to the next level; they can, for example, recognize objects or people with a high degree of accuracy. Trained on large data sets, such networks enable highly accurate decision making, providing greater precision than legacy object-recognition methods while removing the need for painstaking hand-coding of explicit rules. Combined with sophisticated software, the possible applications are open-ended.

The forthcoming FLIR® Firefly® machine vision camera (available in 2019) incorporates an on-camera deep neural network accelerator based on the Intel® Movidius™ Myriad™ 2 Vision Processing Unit (VPU). The Firefly enables sophisticated machine vision applications while remaining more cost-effective, simpler to integrate, and more reliable than discrete systems.

FLIR® Firefly®: Machine Vision + Deep Learning

FLIR engineers accelerated the Firefly’s development cycle using Intel® Movidius™ technology for both prototype development and large-scale commercial production, as shown in Figure 1. Rapid prototyping based on the Intel® Movidius™ Neural Compute Stick (NCS) and Neural Compute SDK streamlined the early development of on-camera inference. The production version of the Firefly uses the tiny, stand-alone Intel Movidius Myriad 2 VPU to do two jobs: image signal processing and open-platform inference.

Once satisfied with neural network performance in the prototyping phase, FLIR engineers took advantage of the VPU’s onboard image signal processor and CPU. Utilizing the chip’s onboard imaging, convolutional neural network (CNN), and programmable compute capabilities allowed FLIR to aggressively minimize size, weight, and power consumption. This approach provides a single hardware and software target that simplifies prototyping, while also enabling the production version of the full camera to be about an inch square, as illustrated in Figure 2.
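The case study does not detail FLIR’s prototyping code, but as an illustration, the following is a minimal sketch of how a developer might run a compiled CNN on the Intel Movidius Neural Compute Stick using the first-generation Neural Compute SDK Python API (mvnc). The graph file name, input shape, and preprocessing are assumptions for the example, not details of FLIR’s implementation.

```python
# Minimal sketch: running a compiled CNN on the Intel Movidius Neural Compute
# Stick with the first-generation Neural Compute SDK Python API (mvnc).
# Assumptions: 'inspect.graph' was produced by the SDK's mvNCCompile tool, and
# the network expects a 224x224 RGB tensor in FP16 -- adjust for your model.
import numpy as np
from mvnc import mvncapi as mvnc

# Find and open the first attached Neural Compute Stick.
devices = mvnc.EnumerateDevices()
if not devices:
    raise RuntimeError('No Neural Compute Stick found')
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load the compiled network graph onto the stick.
with open('inspect.graph', 'rb') as f:
    graph_blob = f.read()
graph = device.AllocateGraph(graph_blob)

# Prepare a dummy input tensor (stand-in for a preprocessed camera frame).
frame = np.random.rand(224, 224, 3).astype(np.float16)

# Run inference and read back the result (e.g., per-class scores).
graph.LoadTensor(frame, 'sample')
scores, _ = graph.GetResult()
print('Top class:', int(np.argmax(scores)))

# Clean up.
graph.DeallocateGraph()
device.CloseDevice()
```

Because the production camera presents the same software target, a network validated this way on the NCS can be carried forward to the on-camera VPU without reworking the model.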

Placing deep neural network acceleration directly on the camera enables inference to be performed at the network edge, rather than having to transmit the raw video stream elsewhere for processing. This approach introduces a number of advantages that improve the overall solution, including the following:

• Real-time operation. Processing in place eliminates the latency associated with transporting data for off-camera computation, allowing detection and subsequent responses to be made in real time.

• Efficiency. Eliminating the need to send raw video data over the network reduces costs related to bandwidth, storage, and power consumption.

• Security. On-camera inference enables a simplified, self-contained architecture that reduces the attack surface, and the relatively small amount of data passed over the wire can be encrypted with minimal impact.

Figure 2. Use of the onboard image signal processor and CPU enables the Firefly® camera, including its onboard processing hardware, to be very small in size.

“The inspiration for Firefly is to subjectively analyze visual information. For example, it can inspect a manufactured part and identify defects that no one has ever seen before or even anticipated seeing. The result will be automation of visual tasks that previously could only be handled by humans.”
– Mike Fussell, Product Marketing Manager, FLIR Systems

The Intel Movidius Myriad 2 VPU is a system-on-chip (SoC) design that enables high-performance, on-camera image processing and inference, as illustrated in Figure 3. Key features of the VPU include the following:

• Hardware accelerators for image processing are purpose-built for imaging and vision tasks.

• Streaming hybrid architecture vector engine (SHAVE) processor cores accelerate on-camera inference based on CNNs. Their very long instruction word (VLIW) architecture, including vector data processing, is better optimized for the branching logic of neural networks than the more general-purpose cores found in graphics processing units (GPUs).

• General-purpose RISC CPU cores support interaction with external systems, parse and schedule workload processing on the SHAVE processor cores, and execute the actual on-camera inferences.

The FLIR Firefly integrates three discrete devices into a single device:
1. Camera
2. Development board
3. Neural compute stick

Figure 1. The FLIR® Firefly® camera was tested (left) with the Intel® Movidius™ Neural Compute Stick and prototyped (right) with the Intel® Movidius™ Myriad™ 2 vision processing unit.

Figure 3. Pass/fail quality inferences during inspection of manufactured parts with the prototype for the FLIR® Firefly® camera.

The FLIR Firefly camera marries machine vision and deep learning by combining excellent image quality with Sony Pregius* sensors, GenICam* compliance for ease of use, and an Intel Movidius Myriad 2 VPU for performing deep neural network inference. Firefly’s ultra-compact footprint and low power consumption make it ideal for implementations with space and power constraints, such as handheld and embedded systems. The camera is also equipped with a USB port for host connectivity as well as four bi-directional general-purpose input/output (GPIO) lines for connection to other systems.

The initial version of the Firefly uses a 1.6 MP Sony Pregius CMOS image sensor. This 60-FPS global shutter sensor features excellent imaging performance, even in challenging lighting conditions. Future iterations of the Firefly camera will offer increased flexibility with additional sensor options.

The advanced firmware that ships with the Firefly adds significant value. Key firmware machine-vision features include the USB3 Vision protocol, eight- and 16-bit raw pixel formats, pixel binning, and selectable region of interest. In addition, the firmware offers control of the four GPIO ports, allowing other systems to trigger the camera, as well as enabling the camera to trigger external equipment such as lighting, actuators, or other cameras.
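These firmware features are exposed through the GenICam interface. As an illustration only, here is a minimal sketch of grabbing a single frame from a GenICam-compliant FLIR camera with the Spinnaker Python bindings (PySpin); the assumption that a camera sits at index 0 is illustrative, and triggering, pixel-format, and region-of-interest settings would be configured through the same nodemap.

```python
# Minimal sketch: acquiring one frame from a GenICam-compliant FLIR camera
# using the Spinnaker Python bindings (PySpin). Illustrative only; trigger,
# pixel-format, and region-of-interest nodes are set through the same nodemap.
import PySpin

system = PySpin.System.GetInstance()
cameras = system.GetCameras()
if cameras.GetSize() == 0:
    cameras.Clear()
    system.ReleaseInstance()
    raise RuntimeError('No camera detected')

cam = cameras.GetByIndex(0)
cam.Init()

# Use the GenICam nodemap to select continuous acquisition.
nodemap = cam.GetNodeMap()
acq_mode = PySpin.CEnumerationPtr(nodemap.GetNode('AcquisitionMode'))
acq_mode.SetIntValue(acq_mode.GetEntryByName('Continuous').GetValue())

# Grab a single frame.
cam.BeginAcquisition()
image = cam.GetNextImage()
if not image.IsIncomplete():
    print('Captured %dx%d frame' % (image.GetWidth(), image.GetHeight()))
image.Release()
cam.EndAcquisition()

# Release the camera and system handles.
cam.DeInit()
del cam
cameras.Clear()
system.ReleaseInstance()
```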


Use Cases

AI is disruptive in the machine vision field because of its ability to answer questions that require judgment, which is to say questions whose answers could not have been specifically defined on the basis of preset rules. Deep neural networks are trained on large amounts of sample data, and the resultant trained model is then uploaded to the Firefly camera. Figure 4 illustrates examples of use cases enabled by on-camera execution of deep neural networks.

• Robotic guidance can help industrial, healthcare, and consumer robots interact in more sophisticated ways with objects, including avoiding obstacles when navigating unfamiliar spaces.

• Quality inspection can be automated and sophisticated, such as gauging whether variations in a pattern are acceptable in a textile manufacturing scenario.

• Biometric recognition based on inputs such as face, thumbprint, or iris scans can be used to govern access authorization for facilities, computer systems, or other resources.

• Precision agriculture can draw on the analysis of crop-condition images taken in the visible and infrared spectrums to guide efficient application of herbicides and pesticides.

• Medical imaging implementations include histology usages to flag anomalies in biopsies as a first-pass screening or as a fail-safe measure to identify false negatives after standard reads by medical personnel.

An Open-Standards Platform for Innovation

The Firefly is part of an open ecosystem, which allows for tremendous flexibility in terms of interactions with other equipment and software, as well as giving developers the flexibility to take advantage of their tools of choice. The FLIR Spinnaker* SDK is the GenICam API library that enables CNNs to be uploaded to the camera with the same familiar tools used across FLIR’s machine-vision product lines. It provides a simple approach to deploying trained networks into the field, with a user experience similar to uploading new firmware.

Developers can use the Intel Movidius NCS to begin work immediately on applications for Firefly that meet specific real-world scenarios, including appropriate business logic, as well as CNNs tuned for an optimal combination of accuracy and speed. The training set size can also be altered experimentally, dialing in the required level of subjective decision making. A sketch of this kind of application-level business logic is shown below.
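The sketch below turns per-class scores from a quality-inspection CNN into a pass/fail decision and a reject signal. Everything here is hypothetical: run_inference and signal_reject stand in for whatever inference and I/O calls a real Firefly application would provide, and the class labels and threshold are examples only.

```python
# Hypothetical pass/fail business logic layered on top of a quality-inspection
# CNN. run_inference() and signal_reject() are placeholders for the real
# inference and GPIO calls an application would supply.
from typing import Sequence

CLASS_LABELS = ('good', 'scratch', 'dent', 'discoloration')  # example classes
REJECT_THRESHOLD = 0.6  # example confidence required to act on a prediction

def classify_part(scores: Sequence[float]) -> str:
    """Map CNN scores to a label; low-confidence results are marked uncertain."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return CLASS_LABELS[best] if scores[best] >= REJECT_THRESHOLD else 'uncertain'

def inspect_part(frame, run_inference, signal_reject) -> bool:
    """Return True if the part passes; otherwise raise the reject signal."""
    label = classify_part(run_inference(frame))
    if label in ('good', 'uncertain'):
        return True          # pass, or route 'uncertain' parts to manual review
    signal_reject(label)     # e.g., pulse a GPIO line to divert the part
    return False
```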

Figure 4. Example use cases for on-camera inference: robotic guidance, quality inspection, biometric recognition, precision agriculture, and medical imaging.

Conclusion

The upcoming FLIR Firefly uses on-camera inference to enable faster, more accurate image analysis than traditional rules-based systems. Running deep neural networks directly on the camera enables edge-based image analysis with ultra-low latency for real-time responses to events. Let the disruption begin that will power the next generation of machine vision and image analysis.

Solution provided by: FLIR Systems

For more information about FLIR machine vision, visit: www.flir.com/mv. For more information about Intel Movidius technology, visit: www.movidius.com.

Intel, the Intel logo, Movidius, and Myriad are trademarks of Intel Corporation in the U.S. and/or other countries. FLIR and Firefly are trademarks of FLIR Systems, Inc. *Other names and brands may be claimed as the property of others. © 2018 Intel Corporation. All rights reserved. 1018/MB/MESH/PDF