• Technical Conference:  16 – 20 September 2018
  • Science & Industry Showcase:  18 – 19 September 2018

Theme: Automotive


The automotive optics programming is a “workshop within a conference” and will emphasize sensing technologies for driverless vehicles and other emerging autonomous vehicle applications.

 

Monday, 17 September

 
08:30 – 09:15

Visionary Speaker: Automotive
From Night Vision to LIDAR: An Automotive Perspective
Jan-Erik Kallhammer, Autoliv Development AB, Sweden

09:15 – 10:00 Break
Experience another theme: Nanophotonics & Plasmonics
10:00 – 10:30 Coffee Break
10:30 – 12:30 Session
LIDAR Approaches and the Demands on Optical Components
12:30 – 13:30 Lunch Break (on your own)
13:30 – 15:30 Panel Discussion
A Realistic Assessment of Optics for Self-driving Vehicles
Tuesday, 18 September

09:30 – 15:00 Science & Industry Showcase
15:00 – 16:30 Panel Discussion
Optical Technologies for Autonomy in Realistic Weather
16:30 – 16:45 Break
16:45 – 18:15 Panel Discussion
Optical Technologies for Autonomy in Unstructured Environments
18:30 – 20:30 Conference Reception

Session: LIDAR Approaches and the Demands on Optical Components
LIDAR is an effective technology that has been applied over the past two decades in areas such as aerial mapping, targeting, wind energy and civil engineering.  Adapting the technology for driverless cars poses significant challenges of cost, reliability, performance, sensor fusion and vehicle integration.  The architectures and technologies that will ultimately succeed for automotive LIDAR are not yet clear, since requirements are evolving and fragmented.  A large number of companies are pursuing the effort, with differing approaches to operating wavelength (9XX nm and 15XX nm), scanning method (flash, solid state, MEMS, opto-mechanical), processing method (time of flight, FMCW), laser type (diode edge emitters, VCSELs, fiber lasers) and detector type (linear-mode APDs, Geiger-mode APDs, PIN, CCD).
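For readers unfamiliar with the two processing methods named above, the short sketch below (illustrative only, not drawn from any presentation in this program) shows the first-order range equations behind pulsed time-of-flight and linear-chirp FMCW ranging; the numerical values are hypothetical examples.

    # Illustrative sketch only: first-order range equations for the two
    # processing methods named above. Assumes an idealized pulsed
    # time-of-flight system and an idealized linear-chirp FMCW system.

    C = 3.0e8  # speed of light, m/s

    def tof_range_m(round_trip_time_s):
        # Pulsed time of flight: R = c * t / 2
        return C * round_trip_time_s / 2.0

    def fmcw_range_m(beat_freq_hz, chirp_bandwidth_hz, chirp_duration_s):
        # Linear-chirp FMCW: R = c * f_beat * T_chirp / (2 * B)
        return C * beat_freq_hz * chirp_duration_s / (2.0 * chirp_bandwidth_hz)

    print(tof_range_m(1e-6))              # 1 us round trip -> 150.0 m
    print(fmcw_range_m(1e6, 1e9, 10e-6))  # 1 MHz beat, 1 GHz chirp over 10 us -> 1.5 m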
 
This session will feature speakers from some of the companies working on LIDAR today, presenting their approaches to the automotive application and their needs for optical components.

Speakers:
Barry Behnken, AEye, Inc., USA
Mark Itzler, Argo AI LLC, USA
Mike Watts, Analog Photonics, USA
Matthew Weed, Luminar Technologies, USA


Panel Discussion: A Realistic Assessment of Optics for Self-driving Vehicles
LIDAR and other ADAS (Advanced Driver Assistance Systems) imaging technologies promise to enable driverless cars, with potentially wide-ranging impacts on the vehicle manufacturing industry, vehicle-enabled services such as trucking and taxis, shared transportation and urban planning.  But how close are we to ramping production for actual vehicle sales?  What are the hurdles ahead for LIDAR and other optical systems and components?  Where else does optics play a role in autonomous vehicles?  What other factors affect the success of these optical technologies, from narrow technical issues to broader questions of regulation, standards, safety and customer acceptance?  This panel of experts will take up these questions in a lively discussion.

Moderator: Sabbir Rangwala, Patience Consulting LLC, USA

Panelists:
Brandon Collings, Lumentum, USA
Kevin Dopart, US Department of Transportation, USA
Rob Murano, II-VI Inc., USA
Craig Thompson, Finisar, USA


Panel Discussion: Optical Technologies for Autonomy in Realistic Weather
The assumption that sensors simply "work" often neglects real-world conditions outside laboratories or sunny streets. Most tests and demonstrations focus on controlled or favorable conditions, leaving out the harsher realities of real-world operation. Rain, snow, fog, pollen, dust, and numerous other common obscurants can degrade a variety of sensing modalities. Sensor failure can take several forms: graceful degradation with built-in warnings, unknown blind spots, low signal-to-noise ratios (known or unknown), and more. This panel will address these failure modes, how improved sensor design can support better estimates of when autonomous operation is reliable, and the optical technologies needed to handle common weather conditions.

Panel Discussion: Optical Technologies for Autonomy in Unstructured Environments
Typical discussions of autonomy revolve around self-driving cars on the streets of major metropolitan areas. These highly structured environments offer standard visual cues, commonly accepted behavioral protocols, and obstacles that occur within a standard plane and region. In large parts of the world, however, the local road may be a dirt track with little to distinguish it from the surrounding landscape. Moreover, off-road operation is regularly required by the military, rescue personnel, and aid organizations. For an autonomous system to operate in complex, unstructured environments, its sensors must be able to observe the environment in new ways. This panel will address the difficulties facing optical sensors in environments ranging from featureless snow fields to dense jungles.
 

Sponsored by:
