Graf Industrial & Velodyne Lidar

Total Pages: 16

File Type: PDF, Size: 1020 KB

SPECIAL SITUATIONS ALERT
Report date: July 6, 2020 | Email: [email protected]

Deal snapshot:
CMP: $20.53
Ticker: GRAF, GRAF.W, GRAF.U
Market cap: $360M
52W Hi/Lo: $22.80 - $9.87
3M ADTV: 460K
IPO date: Oct 16, 2018
Deadline: Sep 30, 2020
IPO proceeds: $244M
Current unit structure: Common + 3/4 Warrant
Shares issued: 17.55M
Warrants issued: 38.5M ($11.50 strike, callable at $18.00, 5-year term)
IPO bankers: EarlyBirdCapital, Oppenheimer, Ladenburg Thalmann, I-Bankers
Target: Velodyne Lidar, Inc.
Target sector: Autonomous vehicles
Deal announced: July 2, 2020
Proxy filed: tbd
Closing vote: Q3 2020
Target EV: $1.6B
Cashout: $50M
Key closing conditions: Total cash ≥ $200M
Committed financing: $150M PIPE
Transaction bankers: SPAC: Oppenheimer, EarlyBirdCapital; Velodyne: BofA
Legal counsel: SPAC: White & Case; Velodyne: Gunderson Dettmer Stough Villeneuve Franklin & Hachigian

Executive Summary:
Graf Industrial Corp. (NYSE: GRAF), a special purpose acquisition company, and Velodyne Lidar, Inc. ("Velodyne" or the "Company"), a world leader in lidar technology with 70% market share in autonomous vehicles (AVs), announced an agreement to merge on July 2, 2020. At the close of the transaction, the Company will retain the Velodyne Lidar name and trade on the New York Stock Exchange under the ticker symbol VLDR.

SPAC background:
GRAF is a Houston-based special purpose acquisition company founded by James Graf (CEO) and Michael Dee (President and CFO) in partnership with Owl Creek. GRAF was incorporated in Delaware on June 26, 2018 and raised $225M in an initial public offering.

The Target Company:
San Jose, CA-based Velodyne provides real-time vision based on a remote sensing technology called LiDAR (Light Detection and Ranging). LiDAR uses pulsed lasers to measure variable distances and generates precise 3D information that allows machines to view their surroundings. Velodyne's LiDAR-based smart vision solutions have been implemented in the automotive industry as well as in non-automotive applications such as last-mile delivery, autonomous mobile robots, unmanned aerial vehicles (UAVs), advanced security systems, and smart city initiatives. Velodyne has raised ~$250M from investors that include Nikon, Ford, Hyundai Mobis and Baidu.

The Transaction:
As per merger filings, post-closing Velodyne is expected to have a market capitalization of $1.8B and an initial enterprise value of $1.6B (2.3x estimated 2024 revenue of $680M). At this valuation, existing Velodyne stockholders will hold 83% of the pro forma equity and receive up to $50M in cash. The parties agreed to a $58.9M termination fee if Velodyne terminates the transaction. The SPAC sponsor agreed to forfeit 3.5M founder shares and all 14M private placement warrants. At closing, the SPAC sponsor will retain 2.5M founder shares (~42% of its initial promote, including earnout shares).

The Company will receive $150M of proceeds from a PIPE (of which ~$9M was invested by the SPAC sponsor) at a price of $10.00 per common share, along with $117M in cash held in trust, assuming no public shareholders of Graf exercise their redemption rights at closing. The transaction will result in a minimum of $192M of cash to the balance sheet to fund growth, enhance financial flexibility, and fund selective acquisition opportunities to further expand market leadership.

David Hall, Founder of Velodyne, will continue as Executive Chairman, along with CEO Dr. Anand Gopalan, CFO Drew Hamer and Chief Marketing Officer Marta Hall.
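The headline valuation (market capitalization of ~$1.8B, enterprise value of ~$1.6B, 2.3x estimated 2024 revenue) can be reproduced directly from the figures disclosed in this report. A minimal sketch in Python, using the share count and deal price that appear in the pro forma tables below (variable names are illustrative, not from the filings):

```python
# Quick consistency check of the pro forma valuation disclosed in this report.
shares_out_m = 172.33        # pro forma shares outstanding, millions
deal_price = 10.25           # per-share price used in the merger filings, USD
net_cash = 200               # minimum cash to the balance sheet, $M (no debt)
revenue_2024e = 680          # estimated 2024 revenue, $M

equity_value = shares_out_m * deal_price        # ~$1,766M, i.e. the ~$1.8B market cap
enterprise_value = equity_value - net_cash      # ~$1,566M, i.e. the ~$1.6B EV
ev_multiple = enterprise_value / revenue_2024e  # ~2.3x 2024E revenue

print(f"Equity value:       ${equity_value:,.0f}M")
print(f"Enterprise value:   ${enterprise_value:,.0f}M")
print(f"EV / 2024E revenue: {ev_multiple:.1f}x")
```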
Current Velodyne shareholders, including David Hall and strategic investors Ford, Baidu, Nikon and Hyundai Mobis, will retain an equity interest of >80% in the combined company.

[Chart: GRAF share price, May - June 2020]

Pro forma estimates, USD M:
Sources: Seller rollover $1,472; SPAC trust cash $117; PIPE equity $150; Founder shares $24; Velodyne cash $8; Total sources $1,771
Uses: Equity rollover $1,472; Cash to sellers $50; Cash to balance sheet $200; Founder shares $24; Estimated fees and expenses $25; Total uses $1,771

Balance Sheet Composition:
Shares (M): 172.33
Share price: $10.25
Equity value: $1,766M
Pro forma net cash: ($200M)
Debt: --
Enterprise value: $1,566M

Shareholding Composition (shares in M / % of total):
Seller rollover: 143.6 / 83.5%
SPAC: 11.5 / 6.7%
Founder: 2.3 / 1.3%
PIPE: 15.0 / 8.5%
Total: 172.33 / 100.0%

Earnout consideration (shares in M, estimated value in $M), payable if VWAP ≥ $15.00 for 20 out of 30 trading days within 6 months of the closing of the initial business combination:
Velodyne owners: 2.00 (~$30M)
SPAC founders: 0.275 (~$4M)

Business Overview:

1. Company History & Management:
• David Hall, the Founder and Executive Chairman of Velodyne, is the inventor of surround (3D) LiDAR technology and introduced the HDL-64 Solid-State Hybrid LiDAR sensor in 2005 to give autonomous vehicles real-time 360º vision.
• In 2007, Velodyne technology was used by 5 out of 6 contestants in a driverless car race sponsored by the Defense Advanced Research Projects Agency; the winning and second-place teams both used a Velodyne system.
• Soon thereafter, the Company began the world's first commercial production of real-time 3D LiDAR technology and established global sales and mass-scale manufacturing in 2007.
• In 2015, Velodyne became independent from Velodyne Acoustics. Its product range includes a broad range of sensing solutions, including the cost-effective Puck™, the versatile Ultra Puck™, the autonomy-advancing Alpha Prime™, the Advanced Driver Assistance Systems (ADAS)-optimized Velarray™, and the driver-assistance software Vella™.
• Dr. Anand Gopalan, the current CEO, previously served as Velodyne's CTO and as VP of Engineering at Rambus, where he oversaw chip and IP development activities for the Memory and Interfaces Division. He received a PhD in Electrical Engineering and Microsystems Engineering from the Rochester Institute of Technology.
• In Aug 2016, Velodyne completed a $150M investment from co-investors Ford Motor Company (NYSE: F) ("Ford") and Baidu, Inc. (Nasdaq: BIDU) ("Baidu").
• Velodyne was an Automotive News PACE Award winner in 2019, the only LiDAR provider to win the award, which distinguishes automotive suppliers for superior innovation, technological advancement, and business performance.
• Velodyne acquired Mapper.ai, with 25 engineers, in 2019 and will introduce Auto Pilot, Collision Avoidance, Pedestrian Automatic Emergency Braking, and other functionalities in 2020.

2. Commercial Analysis:
• The automotive LiDAR market is estimated to rise to $28 billion by 2032. The rising adoption of LiDAR systems in UAVs, increasing adoption of LiDAR in engineering and construction applications, and the use of LiDAR in geographic information systems (GIS) applications are among the factors driving the growth of the LiDAR market.
• Velodyne addresses a $7B+ cumulative revenue opportunity in the 2020-2025 period based on multi-year commercial demand from customers; this estimate includes signed and awarded contracts and the current additional pipeline.
The Company has 300+ customers and has booked $500M+ in revenue since inception. It projects revenue of about $100M in 2020 and expects that to increase to $680M in 2024, of which 50-70% is already contracted; this represents a revenue CAGR of roughly 60% (a quick arithmetic check appears at the end of this business overview). The Company expects to apply its technology to serve a range of industries such as delivery, robotaxis, ADAS, shuttles, mapping, robotics and smart cities.

3. Product Portfolio:
Velodyne's product line includes a broad range of sensing solutions that differ in size and application:
• HDL-32E: A compact, lightweight sensor available for applications in drones/unmanned AVs.
• VLP-16 LiDAR Puck: A 16-channel LiDAR sensor that is substantially smaller and less expensive than previous-generation sensors.
• Velodyne Puck 32MR™: A sensor that produces a point cloud with minimal noise and the ability to detect low-reflectivity objects at a range of 120 meters. Accurately detects crosswalks, curbs and obstacles in warehouse aisles for safe and efficient navigation in roadway, commercial and industrial settings.
• Ultra Puck™: A sensor that provides a full 360-degree environmental view to deliver accurate real-time 3D data, with applications in robotics, mapping, security, driver assistance, and autonomous navigation.
• Alpha Prime™: Uses surround-view technology to deliver functionalities such as perception, field of view and range for autonomous markets including transportation, trucking and robotics.
• Velarray™: A sensor with a small, embeddable form factor that allows automakers to create superior ADAS and address edge cases for current approaches, including curvy roads, potholes, intersections, on/off ramps, residential areas and roadways with unclear lane markings.
• VelaBit™: Velodyne's smallest mid-range lidar sensor; highly configurable for specialized use cases and can be embedded almost anywhere within vehicles, robots, UAVs and infrastructure.
• VelaDome™: High-resolution close-range sensing combined with a small form factor for flexible mounting and styling options. With a 180° x 180° FoV and the ability to detect objects as close as 0.1 m, the sensor's near-field detection and high-density image satisfy a range of automotive applications, including blind-spot monitoring.
• Vella™: Advanced driver assistance software that builds upon the directional-view Velarray sensor. Provides functionality such as lane-keeping assist, automatic emergency braking and adaptive cruise control.

4. Manufacturing Facilities:
The Company expects to meet demand for its LiDAR sensors to expand
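As a quick check of the growth rate cited in the commercial analysis above, the projected move from roughly $100M of revenue in 2020 to $680M in 2024 implies a compound annual growth rate of about 61%, consistent with the ~60% figure. A minimal sketch:

```python
# Implied revenue CAGR from the 2020 and 2024 projections cited above.
rev_2020 = 100.0   # projected 2020 revenue, $M
rev_2024 = 680.0   # projected 2024 revenue, $M
years = 4          # 2020 -> 2024

cagr = (rev_2024 / rev_2020) ** (1 / years) - 1
print(f"Implied revenue CAGR: {cagr:.1%}")   # ~61.5%, in line with the ~60% cited
```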
Recommended publications
  • Navigating Automotive LIDAR Technology
    Navigating Automotive LIDAR Technology. Mial Warren, VP of Technology, October 22, 2019. Outline: introduction to ADAS and LIDAR for automotive use; a brief history of LIDAR for autonomous driving; why LIDAR?; LIDAR requirements for (personal) automotive use; LIDAR technologies; VCSEL arrays for LIDAR applications; conclusions. What is the big deal? "The automotive industry is the largest industry in the world" (~$1 trillion). "The automotive industry is more than 100 years old; the supply chains are very mature." "The advent of autonomy has opened the automotive supply chain to new players" (electronics, optoelectronics, high-performance computing, artificial intelligence). (Quotations from 2015 by a LIDAR program manager at a major European Tier 1 supplier.) The automotive supply chain runs from OEMs (car companies) to Tier 1 suppliers (subsystems) to Tier 2 suppliers (components). ADAS (Advanced Driver Assistance Systems) levels per SAE and NHTSA: Level 0, no automation (manual control by the driver); Level 1, one automatic control (for example, acceleration and braking); Level 2, automated steering and acceleration capabilities (driver is still in control); Level 3, environment detection, capable of automatic operation (driver expected to intervene); Level 4, no human interaction required, still capable of manual override by the driver; Level 5, completely autonomous, no driver required. Level 3 and up need the full range of sensors, and the adoption of advanced sensors (including LIDAR) will not wait for Level 5 or full autonomy. The automotive LIDAR market is emerging toward US$6 billion by 2024 (source: Yole), roughly 70% of it automotive; note that the current market is more than $300M, for software test vehicles only. A sensor fusion approach to ADAS and autonomous vehicles combines LIDAR, vision and radar; much of the ADAS development is driven by NHTSA regulation, and because each technology has weaknesses, the combination of sensors provides high confidence.
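The SAE/NHTSA level taxonomy summarized in this excerpt maps naturally onto a small lookup table. A minimal sketch in Python (the names and structure are illustrative, not taken from the slides) encoding the six levels and the excerpt's claim that Level 3 and up need the full sensor suite:

```python
# SAE/NHTSA driving-automation levels, as summarized in the excerpt above.
ADAS_LEVELS = {
    0: "No automation - manual control by the driver",
    1: "One automatic control (for example, acceleration & braking)",
    2: "Automated steering and acceleration (driver is still in control)",
    3: "Environment detection - automatic operation (driver expected to intervene)",
    4: "No human interaction required - manual override still possible",
    5: "Completely autonomous - no driver required",
}

def needs_full_sensor_suite(level: int) -> bool:
    """Per the excerpt, Level 3 and up need the full range of sensors (incl. LIDAR)."""
    return level >= 3

if __name__ == "__main__":
    for level, description in ADAS_LEVELS.items():
        suite = "full sensor suite" if needs_full_sensor_suite(level) else "basic sensors"
        print(f"Level {level}: {description} -> {suite}")
```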
  • LIDAR – A New (Self-Driving) Vehicle for Introducing Optics to Broader Engineering and Non-Engineering Audiences
    LIDAR – A new (self-driving) vehicle for introducing optics to broader engineering and non-engineering audiences. Corneliu Rablau, Kettering University, Dept. of Physics, 1700 University Ave., Flint, MI USA 48504. ABSTRACT: Since Stanley, the self-driven Stanford car equipped with five SICK LIDAR sensors, won the 2005 DARPA Challenge, the race to developing and deploying fully autonomous, self-driving vehicles has come into full swing. By now, it has engulfed all major automotive companies and suppliers, major trucking and taxi companies, not to mention companies like Google (Waymo), Apple and Tesla. With the notable exception of the Tesla self-driving cars, a LIDAR (Light Detection and Ranging) unit is a key component of the suite of sensors that allow autonomous vehicles to see and navigate the world. The market space for lidar units is by now downright crowded, with a number of companies and their respective technologies jockeying for long-run leading positions in the field. Major lidar technologies for autonomous driving include mechanical scanning (spinning) lidar, MEMS micro-mirror lidar, optical-phased-array lidar, flash lidar, frequency-modulated continuous-wave (FMCW) lidar and others. A major technical specification of any lidar is the operating wavelength. Many existing systems use 905 nm diode lasers, a wavelength compatible with CMOS-technology detectors. But other wavelengths (like 850 nm, 940 nm and 1550 nm) are also investigated and, in the long run, the telecom near-infrared range (1550 nm) is expected to experience significant growth because it offers a larger detecting distance range (200-300 meters) within eye-safety laser power limits while also offering potentially better performance in bad weather conditions.
  • Autonomous Vehicle Technology: a Guide for Policymakers
    Autonomous Vehicle Technology: A Guide for Policymakers. James M. Anderson, Nidhi Kalra, Karlyn D. Stanley, Paul Sorensen, Constantine Samaras, Oluwatobi A. Oluwatola. RAND Corporation. For more information on this publication, visit www.rand.org/t/rr443-2. This revised edition incorporates minor editorial changes. Library of Congress Cataloging-in-Publication Data is available for this publication. ISBN: 978-0-8330-8398-2. Published by the RAND Corporation, Santa Monica, Calif. © Copyright 2016 RAND Corporation. R® is a registered trademark. Cover image: Advertisement from 1957 for "America's Independent Electric Light and Power Companies" (art by H. Miller). Text with original: "ELECTRICITY MAY BE THE DRIVER. One day your car may speed along an electric super-highway, its speed and steering automatically controlled by electronic devices embedded in the road. Highways will be made safe—by electricity! No traffic jams…no collisions…no driver fatigue." Limited Print and Electronic Distribution Rights: This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial use. For information on reprint and linking permissions, please visit www.rand.org/pubs/permissions.html. The RAND Corporation is a research organization that develops solutions to public policy challenges to help make communities throughout the world safer and more secure, healthier and more prosperous.
  • Google Announces Lidar for Autonomous Cars
    NEWS: Google Announces Lidar for Autonomous Cars. Waymo wants to do everything itself. The self-driving automobile division of Google's parent company Alphabet has announced that it is building all of its self-driving automobile hardware in-house. That means Waymo is manufacturing its own lidar. This shouldn't come as a surprise. Two years ago the project director for Google's self-driving car project started dropping hints that the tech company would be building its own lidar sensors. At the time, their self-driving prototype used a $70,000 Velodyne lidar sensor, and Google felt the price was too high to justify mass production. Rather than wait for the market to produce a cheaper sensor, Google started developing its own. In the intervening time automotive lidar has become a red-hot market segment. It seems like every month brings a new company announcing a low-cost sensor. But it turns out that Google was not deterred: at the Detroit Auto Show last Sunday, Waymo's CEO announced that the company had produced its own lidar, and brought the technology's price down "by more than 90%." A quick bit of napkin math puts the final number at about US$7,500. Attentive readers will note that the unit's cost is notably higher than solid-state offerings from Quanergy, Innoviz, and a number of other competitors. However, many of those sensors are not currently available on the market, despite lots of press attention and big PR pushes. Waymo's sensors, on the other hand, are available right now–at least to Waymo.
  • The Multiple 3D Lidar Dataset
    LIBRE: The Multiple 3D LiDAR Dataset. Alexander Carballo, Jacob Lambert, Abraham Monrroy Cano, David Robert Wong, Patiphon Narksri, Yuki Kitsukawa, Eijiro Takeuchi, Shinpei Kato, Kazuya Takeda. Abstract— In this work, we present LIBRE: LiDAR Benchmarking and Reference, a first-of-its-kind dataset featuring 10 different LiDAR sensors, covering a range of manufacturers, models, and laser configurations. Data captured independently from each sensor includes three different environments and configurations: static targets, where objects were placed at known distances and measured from a fixed position within a controlled environment; adverse weather, where static obstacles were measured from a moving vehicle, captured in a weather chamber where LiDARs were exposed to different conditions (fog, rain, strong light); and finally, dynamic traffic, where dynamic objects were captured from a vehicle driven on public urban roads, multiple times at different times of the day, and including supporting sensors such as cameras, infrared imaging, and odometry devices. LIBRE will contribute to the research community to (1) provide a means for a fair comparison of currently available LiDARs, and (2) facilitate the improvement of existing self-driving vehicles and robotics-related software, in terms of development and tuning of LiDAR-based perception algorithms. (Fig. 1: Multiple 3D LiDARs.) Index Terms— 3D LiDAR, dataset, adverse weather, range accuracy, pointcloud density, LIBRE. I. INTRODUCTION: LiDAR (Light Detection And Ranging, sometimes Light Imaging Detection And Ranging for the image-like resolution of modern 3D sensors) is one of the core perception technologies. Depending on the individual perception application and operating domain, there are several key LiDAR performance attributes: measurement range, measurement accuracy, point density, scan speed and configurability, wavelength, robustness to environmental changes, form factor, and cost.
  • Velodyne Lidar Announces Appointment of Deborah Hersman to Board of Directors
    Velodyne Lidar Announces Appointment of Deborah Hersman to Board of Directors March 17, 2021 Former NTSB Chair and National Safety Council President Brings Nearly 30 Years of Leadership Experience in Transportation, Safety and Policy Sectors SAN JOSE, Calif.--(BUSINESS WIRE)--Mar. 17, 2021-- Velodyne Lidar, Inc. (Nasdaq: VLDR, VLDRW) today announced that Deborah Hersman, former chair of the National Transportation Safety Board (NTSB), has been appointed to the Company’s Board of Directors, effective immediately. This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20210317005753/en/ Ms. Hersman has nearly 30 years of government, nonprofit and private sector executive leadership experience in transportation, safety and policy. Following her role as Chair of the National Transportation Safety Board, she served as the President and Chief Executive Officer of the National Safety Council. Most recently, Ms. Hersman served as the first Chief Safety Officer of Waymo, Google’s self-driving car project. She currently serves on the Board of Directors of NiSource Inc. “Debbie is a passionate, well-respected pioneer in the safety field with a proven track record of successfully promoting public safety, delivering on high profile objectives and managing complex, technical initiatives,” said Dr. Joseph B. Culkin, PhD, Chairman of Velodyne Lidar’s Board of Directors. “As the leading lidar provider, our commitment to advancing safety is the unifying principle that defines, coordinates and focuses our entire organization. We are thrilled that Debbie, who is a true leader in our field and shares in our vital mission, is joining us at this exciting time.
  • Building an L4 Autonomous Driving R&D Platform
    Enabling the Future of Transportation: Building an L4 Autonomous Driving R&D Platform ("Drive-PX2 on Wheels"). Wolfgang Juchmann, Ph.D., VP of Business Development, AutonomouStuff, a supplier of components and services that enable autonomy. Building an L4 autonomous driving R&D platform: Step 0, autonomous applications; Step 1, vehicle & compute platforms; Step 2, perception/positioning sensors; Step 3, data fusion / DriveWorks; Step 4, automation algorithms. Introduction, Wolfgang Juchmann (VP of Sales & Business Development): born in Germany; Ph.D. in physics; technical sales; in Silicon Valley since 2001, the last 4 years at Velodyne LiDAR and, since January 2016, with AutonomouStuff; lives in Santa Cruz with his wife and 5-year-old son. Introduction, AutonomouStuff: the goal is fast-tracking autonomous driving, enabling the future of transportation by significantly reducing the development time of autonomy and thereby rocketing customers forward. Founded in 2010; ~2,000 customers and worldwide sales; aggressive and continuous growth; headquartered in Peoria, Illinois, with a strong presence in Silicon Valley and Detroit; autonomy hardware and software experts. The pleasure, reality and danger of driving: we need autonomous vehicles, we want autonomous vehicles, and AutonomouStuff helps to get to autonomy faster.
  • Computing Systems for Autonomous Driving: State-of-the-Art and Challenges
    Computing Systems for Autonomous Driving: State-of-the-Art and Challenges. Liangkai Liu, Sidi Lu, Ren Zhong, Baofu Wu, Yongtao Yao, Qingyang Zhang, Weisong Shi. Department of Computer Science, Wayne State University, Detroit, MI, USA, 48202; School of Computer Science and Technology, Anhui University, Hefei, China, 230601. {liangkai, lu.sidi, ren.zhong, baofu.wu, yongtaoyao, weisong}@wayne.edu, [email protected]. Abstract— The recent proliferation of computing technologies (e.g., sensors, computer vision, machine learning, and hardware acceleration), and the broad deployment of communication mechanisms (e.g., DSRC, C-V2X, 5G) have pushed the horizon of autonomous driving, which automates the decision and control of vehicles by leveraging the perception results based on multiple sensors. The key to the success of these autonomous systems is making a reliable decision in real-time fashion. However, accidents and fatalities caused by early deployed autonomous vehicles arise from time to time. The real traffic environment is too complicated for current autonomous driving computing systems to understand and handle. In this paper, we present state-of-the-art computing systems for autonomous driving. An autonomous vehicle's computing systems are defined to cover everything (excluding the vehicle's mechanical parts), including sensors, computation, communication, storage, power management, and full-stack software. Plenty of algorithms and systems are designed to process sensor data and make a reliable decision in real-time. However, news of fatalities caused by early developed autonomous vehicles (AVs) arises from time to time. Until August 2020, five self-driving car fatalities happened for level-2 autonomous driving: four of them from Tesla and one from Uber [19].
  • The Automotive LiDAR Market: April 2018 Market and Technology Overview
    The Automotive LiDAR Market, April 2018: Market and Technology Overview. Market map: potential OEMs for automotive LiDAR (robotic-car and ADAS programs) span Russia, Europe, Japan, Korea, the USA and China. Automotive LiDAR manufacturers are active in Europe, China, Canada, Japan, Israel, the USA and Australia, with further potential manufacturers still in stealth mode in Europe, Korea, Japan and the USA. Illumination source players (EEL: edge-emitting laser; VCSEL: vertical-cavity surface-emitting laser) are based in Europe, Japan, the USA and Korea; photodetector players (photodiodes and avalanche photodiodes; single-photon avalanche photodiodes and silicon photomultipliers) in Europe, Japan and the USA. Technology overview, LiDAR principle and components: the basic working principle of the LiDAR is very simple. A light source illuminates a scene. The light scattered by the objects of the scene is detected by a photodetector. Measuring the time it takes for the light to travel to the object and back from it allows its distance to be known (distance = time x velocity of light). A LiDAR system comprises a light source emitting a laser beam through optics with a light scanner or light diffuser, receiving optics, a photodetector, a signal processor and a computing unit that produces the 3D point cloud. Automotive LiDAR ecosystem: LiDAR systems draw on laser sources (EEL, VCSEL), photodetectors (PD/APD, SPAD/SiPM), ICs (FPGA, ADC, amplifier), MEMS, and optical elements (optical filters, optical systems), with both active players and R&D players in each category. (ADC: Analog Digital Converter; IC: Integrated Circuit; SPAD: Single-Photon Avalanche Diode; APD: Avalanche Photodiode; MEMS: Micro-Electro-Mechanical System; VCSEL: Vertical Cavity Surface-Emitting Laser; EEL: Edge-Emitting Laser; PD: Photodiode; FPGA: Field-Programmable Gate Array; SiPM: Silicon Photomultiplier.) Taxonomy of LiDAR techniques: LiDAR divides into 1D and 2D/3D (scanning of the light source); scanning approaches are mechanical (macro-mechanical) or non-mechanical (MEMS and non-MEMS, including electro-optical and optical phased array), while non-scanning approaches include flash LiDAR, structured light and stereo vision / multi-camera setups, some of which are not under development for automotive applications.
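The excerpt's working principle reduces to simple time-of-flight arithmetic: the pulse travels to the object and back, so the range is half of the round-trip time multiplied by the speed of light. A minimal sketch in Python (values are illustrative, not from the report):

```python
# Time-of-flight ranging as described in the excerpt: a pulse travels to the
# object and back, so range = (round-trip time x speed of light) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range_m(round_trip_time_s: float) -> float:
    """Distance to the target for a measured round-trip time, in meters."""
    return round_trip_time_s * SPEED_OF_LIGHT / 2.0

# Example: a return detected 800 ns after the pulse leaves the sensor
# corresponds to a target roughly 120 m away.
print(f"{tof_range_m(800e-9):.1f} m")   # ~119.9 m
```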
  • Autonomous Driving and Related Technologies
    Paper ID #26395. Autonomous Driving and Related Technologies. Dr. Rendong Bai, Eastern Kentucky University. Dr. Rendong Bai received his PhD degree in Computer Science from the University of Kentucky in 2008. From 2007 to 2008, he worked at Eastern Kentucky University in the Department of Computer Science as a Visiting Assistant Professor. He was an Assistant/Associate Professor in the School of Technology at Eastern Illinois University from 2008 to 2018. In Fall 2018, he joined the Applied Engineering and Technology department at Eastern Kentucky University. His research interests include mobile computing, server technology, network security, multimedia and web technologies, computer-aided design and manufacturing, quality management, and renewable energy. Dr. Wutthigrai Boonsuk, Eastern Illinois University. Dr. Wutthigrai Boonsuk is an associate professor of Applied Engineering and Technology at Eastern Illinois University. He earned his master's and doctorate degrees in Industrial and Manufacturing System Engineering from Iowa State University. Dr. Boonsuk also received his second master's degree in Human Computer Interaction from the same university. His research interests include 3D stereoscopic applications, manufacturing systems, rapid prototyping, robotic and controller systems, virtual reality, and geographic information systems (GIS). Dr. Boonsuk may be reached at [email protected]. Peter P. Liu. © American Society for Engineering Education, 2019. Autonomous Driving and Related Technologies. Dr. Rendong Bai, Eastern Kentucky University, Richmond, KY 40475, 859-622-1181, [email protected]; Dr. Wutthigrai Boonsuk, Eastern Illinois University, Charleston, IL 61920, 217-581-5772, [email protected]; Dr. Peter Ping Liu, Eastern Illinois University, Charleston, IL 61920, 217-581-6267, [email protected]. Abstract: We're in the middle of a rapid evolution of the way vehicles are operated on the road.
  • Velodyne Announces Order from Ford Motor Company for Its Next-Gen Solid-State LiDAR Sensor Designed for ADAS Safety and Autonomous Driving
    Velodyne Announces Order From Ford Motor Company for its Next-Gen Solid-State LiDAR Sensor Designed for ADAS Safety and Autonomous Driving. First 3D LiDAR Sensor Capable of Supporting All ADAS Functionality Levels. MORGAN HILL, Calif. (PRWEB) January 05, 2016 -- Velodyne announced today that it has received a purchase order from Ford Motor Company for its next-generation advanced driver assistance system (ADAS) LiDAR sensor. Velodyne's new Solid-State Hybrid Ultra Puck™ Auto is designed to combine the functionality of its pioneering 3D LiDAR sensors in a miniaturized form factor while extending sensing range to 200 meters. Velodyne set target pricing of less than $500 per unit in automotive mass-production quantities. The Solid-State Hybrid Ultra Puck Auto will be the first affordable ADAS sensor capable of supporting ADAS levels 1-4/5, including fully autonomous driving. Ford Motor Co. has been involved in research and development of autonomous driving features for more than a decade, and has worked with Velodyne during much of that time (https://media.ford.com/content/fordmedia/fna/us/en/news/2015/01/06/ford-at-ces-announces-smart-mobility-plan.html). The company's Smart Mobility Plan includes its Fusion Hybrid Autonomous Research Vehicles, equipped with Velodyne's HDL-32E LiDAR sensors. At the same time, Ford has developed a wide range of semi-autonomous features already implemented in vehicles currently in production. In November, Ford became the first automaker to test an autonomous vehicle at the University of Michigan's Mcity, the world's first full-scale simulated urban environment. The autonomous vehicle in question was outfitted with Velodyne's real-time, 3D LiDAR sensors.
  • Class Action Complaint for Violations of the Federal Securities Laws – Velodyne Lidar, Inc. f/k/a Graf Industrial Corp.
    UNITED STATES DISTRICT COURT, NORTHERN DISTRICT OF CALIFORNIA. _______, Individually and on Behalf of All Others Similarly Situated, Plaintiff, vs. VELODYNE LIDAR, INC. f/k/a GRAF INDUSTRIAL CORP., ANAND GOPALAN, ANDREW HAMER, JAMES A. GRAF, MICHAEL DEE, OC OPPORTUNITIES FUND II, L.P., OWL CREEK ASSET MANAGEMENT, L.P. and GRAF ACQUISITION LLC, Defendants. Case No. ___. CLASS ACTION. COMPLAINT FOR VIOLATIONS OF THE FEDERAL SECURITIES LAWS. DEMAND FOR JURY TRIAL. Plaintiff ______ ("plaintiff"), individually and on behalf of all others similarly situated, alleges the following based upon information and belief as to the investigation conducted by plaintiff's counsel, which included, among other things, a review of U.S. Securities and Exchange Commission ("SEC") filings by Velodyne Lidar, Inc. f/k/a Graf Industrial Corp. ("Velodyne" or the "Company") and securities analyst reports, press releases, and other public statements issued by, or about, the Company. Plaintiff believes that substantial additional evidentiary support will exist for the allegations set forth herein after a reasonable opportunity for discovery. NATURE OF THE ACTION: 1. This is a federal securities class action brought on behalf of all purchasers of Velodyne securities (the "Class") between July 2, 2020 and March 17, 2021, inclusive (the "Class Period"), seeking to pursue remedies under the Securities Exchange Act of 1934 (the "Exchange Act"). JURISDICTION AND VENUE: 2.