Meaningful virtual validation of edge cases is all about the Operational Design Domain

To thoroughly train and comprehensively test the perception systems used in ADAS and automated driving applications, those systems must be exposed to scenarios that push their boundaries. Doing that meaningfully requires understanding the Operational Design Domain (ODD).

One challenging test for perception systems equipped with camera and radar, for example, is the timely detection of stopped vehicles, especially against a brightly lit sky. But why?

If you’ve ever driven into the glare of a setting sun, you know just how hard it can be to see the car ahead until you’ve gotten uncomfortably close. Ever notice that white cars are even tougher to see? Cameras face the same challenge. 

As for radar, it is very good at determining how fast objects around it are moving (this is why police use it to catch speeders), but it is poor at determining what those objects actually are. To prevent false positives and avoid reacting to irrelevant stationary objects, the radar system filters out stopped objects that aren't confirmed by another sensor. Here's an example of one such test from our simulation platform.
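As a rough illustration only, the gating logic described above might look like the sketch below. Everything here is an assumption for clarity, not a real fusion stack: the `RadarTrack` type, the 0.5 m/s stationary threshold, and the idea that camera confirmation arrives as a set of track IDs are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    track_id: int
    range_m: float       # distance to the detected object
    rel_speed_mps: float # radial speed relative to ego vehicle (Doppler)

def relevant_tracks(radar_tracks, camera_confirmed_ids, ego_speed_mps,
                    stationary_threshold_mps=0.5):
    """Keep moving objects; keep stationary ones only if the camera confirms them."""
    kept = []
    for t in radar_tracks:
        # Simplified 1-D ground speed: ego speed plus relative radial speed.
        # A stopped car ahead shows rel_speed ~= -ego_speed, so this is ~0.
        ground_speed = abs(ego_speed_mps + t.rel_speed_mps)
        if ground_speed > stationary_threshold_mps:
            kept.append(t)   # moving: radar alone is trusted
        elif t.track_id in camera_confirmed_ids:
            kept.append(t)   # stopped, but camera-confirmed: keep it
        # otherwise filtered out as likely clutter (bridges, manholes, signs)
    return kept
```

The failure mode the test targets falls out of this structure: if glare keeps the camera from confirming a stopped vehicle, the radar track for it is discarded along with the clutter.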

Learn more: