The automotive optics program is a “workshop within a conference” emphasizing sensing technologies for driverless vehicles and other emerging autonomous vehicle applications. In addition to an invited program of speakers, a keynote will be presented by Jan-Erik Kallhammer, Veoneer, Inc., Sweden. Sessions and panel discussions focusing on optical technologies for autonomous vehicles complete the program.
Visionary Talk: From Night Vision to LiDAR: An Automotive Perspective
Visionary Speaker: Jan-Erik Kallhammer
Work is underway to complement cameras and radar with LiDAR in series automotive production. Beyond the technical challenges, many other considerations must be addressed before LiDAR can be launched as a series product. The talk will draw on the experience of bringing Night Vision to the automotive market.
Session: LIDAR Approaches and the Demands on Optical Components
LIDAR is an effective technology that has been applied for the past two decades in areas such as aerial mapping, targeting, wind energy and civil engineering. Adapting the technology for driverless cars poses significant challenges in cost, reliability, performance, sensor fusion and vehicle integration. The final architectures and technologies that will succeed for automotive LiDAR are not yet clear, since requirements are evolving and fragmented. A large number of companies are pursuing the effort with different approaches: operating wavelengths (9XX nm and 15XX nm), scanning methods (flash, solid state, MEMS, opto-mechanical), processing methods (time-of-flight, FMCW), laser types (diode edge emitters, VCSELs, fiber lasers) and detector types (linear-mode APDs, Geiger-mode APDs, PIN, CCD).
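The two processing methods named above can be illustrated with their basic range equations. The sketch below is not part of the program; it is a minimal illustration, assuming the standard time-of-flight relation (range is half the round-trip distance) and the standard FMCW relation between beat frequency and range for a linear chirp.

```python
# Minimal sketch of the two LIDAR processing methods named above.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_time_s: float) -> float:
    """Time-of-flight: range is half the round-trip distance."""
    return C * round_trip_time_s / 2.0

def fmcw_range(beat_freq_hz: float, chirp_bandwidth_hz: float,
               chirp_duration_s: float) -> float:
    """FMCW: range inferred from the beat frequency between the
    transmitted chirp and the delayed return:
    R = c * f_beat * T_chirp / (2 * B)."""
    return C * beat_freq_hz * chirp_duration_s / (2.0 * chirp_bandwidth_hz)

# A 667 ns round trip corresponds to roughly 100 m of range.
print(round(tof_range(667e-9), 1))  # → 100.0
```

The 2x factor in both expressions reflects the light traveling to the target and back; FMCW additionally trades chirp bandwidth for range resolution, which is one reason the session's speakers differ on processing method.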
This session will feature speakers from some of the companies working on LIDAR today, who will discuss their approaches to the automotive application and the resulting demands on optical components.
Barry Behnken, AEye, Inc., USA
Matthew Byrd, Analog Photonics, USA
Mark Itzler, Argo AI LLC, USA
Angus Pacala, Ouster, USA
Matthew Weed, Luminar Technologies, USA
Panel Discussion: A Realistic Assessment of Optics for Self-driving Vehicles
LIDAR and other ADAS (Advanced Driver Assistance Systems) imaging technologies promise to enable driverless cars, with potentially wide-ranging impacts on the vehicle manufacturing industry, vehicle-enabled services such as trucking and taxis, shared transportation and urban planning. But how close are we to ramping production for actual vehicle sales? What are the hurdles ahead for LIDAR and other optical systems and components? Where else does optics play a role in autonomous vehicles? What other factors affect the success of these optical technologies, from narrow technical issues to broader questions of regulation, standards, safety and customer acceptance? This panel of experts will hold a lively discussion of these timely issues.
Moderator: Sabbir Rangwala, Patience Consulting LLC, USA
Brandon Collings, Lumentum, USA
Kevin Dopart, US Department of Transportation, USA
Rob Murano, II-VI Inc., USA
Shawn Esser, Finisar, USA
Steve Gehring, Association of Global Automakers, USA
Panel Discussion: Optical Technologies for Autonomy in Realistic Weather
The general understanding that sensors "work" often neglects real-world conditions outside of laboratories or sunny streets. Most tests and demonstrations focus on controlled or favorable conditions, leaving out the harsher realities of real-world operation. Rain, snow, fog, pollen, dust, and numerous other common obscurants can negatively affect a variety of sensing modalities. Sensor failure can take a variety of forms: graceful degradation with built-in warnings, unknown blind spots, low signal-to-noise ratios (known or unknown), etc. In this session we address failure modes, how improved sensor design can improve estimates of autonomous capability, and the optical technologies needed to address common weather conditions.
Moderator: Jenna Chan, General Technical Services, LLC, USA
Chris Debrunner, Lockheed Martin, USA
Paolo Masini, Raytheon Vision Systems, USA
Joseph Minor, U.S. Army, USA
Colin Reese, U.S. Army Research Laboratory, USA
Panel Discussion: Optical Technologies for Autonomy in Unstructured Environments
Typical discussions of autonomy revolve around self-driving cars on the streets of major metropolitan areas. These highly structured environments offer standard visual cues, commonly accepted behavioral protocols, and obstacles occurring within a standard plane and region. Throughout large parts of the world, however, it is not unusual for the local road to be a dirt track that shows little variation from the surrounding landscape. Moreover, off-road operation is regularly required by the military, rescue personnel, and aid organizations. For an autonomous system to operate in complex, unstructured environments, its sensors must be able to observe the environment in a new way. This session will address the difficulties facing optical sensors in environments ranging from featureless snow fields to dense jungles.
Moderator: William Nothwang, U.S. Army Research Laboratory, USA
Marcus Chevitarese, Raytheon, USA
Priya Narayanan, U.S. Army Research Laboratory, USA
Deva Ramanan, Carnegie Mellon University, USA
Robert Sadowski, U.S. Army TARDEC, USA
Science & Industry Showcase
Demo Area—Vehicle Equipped with Ouster LIDAR
Ouster will host live demonstrations of an OS-1-64 LIDAR sensor mounted on a vehicle on the exhibit floor, highlighting its panoramic imaging capabilities, high spatial acuity, and small form factor. Ouster is a leading developer of LIDAR and perception technology for the autonomous vehicle and robotics sectors. The company's flagship product, the OS-1-64, provides industry-leading performance, scalability, reliability, and form factor. Ouster's corporate headquarters and manufacturing facility are located in San Francisco. For more information, visit www.ouster.io.