NFM Workshop on AI Safety

(NFM-AI-SAFETY-20)


General Information

The workshop will be held as part of the NASA Formal Methods (NFM) Conference 2020.

Location: NASA Ames

Date: Monday, May 11, 2020

Introduction

Artificial Intelligence (AI) is increasingly finding use in safety-critical applications such as autonomous driving and aircraft collision avoidance. For these systems to be deployed in the real world, we need to be able to trust them, and this trust can be established through the application of formal methods. However, verifying these systems poses a number of challenges, and discussing those challenges is the focus of this workshop.

Topics of Interest

  • Neural network verification
    • Robust perception
    • Verified control
    • Fairness
    • Other problems
  • Safe neural network training
    • Correct-by-construction training
    • Approximately safe or risk-aware training
  • Robustness
    • Out-of-distribution detection and calibration
  • Application areas
    • Autonomous driving
    • Autonomous flight
    • Other applications
  • Explainable AI
  • Impact of AI on law

If there is a specific topic you think should be listed here but is not, please email us at nfm.ai.safety@gmail.com

Registration

All participants (including speakers) need to register for the main conference. Register for NFM 2020 (early registration is recommended).

Please also complete the workshop registration form.

Questions?

nfm.ai.safety@gmail.com