For AEye, software-definable means their lidar settings—object revisit rate, instantaneous (angular) resolution, and classification range—can be tuned or reconfigured on the fly to optimize performance and power consumption, depending on the use case.
AEye’s lidar capabilities can be updated or reprogrammed in the field, over the air (OTA), allowing faster deployment of new features and a quicker path to SaaS (software-as-a-service) revenues. Compared with passive lidar systems that scan with fixed patterns at fixed distances, a software-definable lidar lets automakers reconfigure the sensor to their specific and evolving requirements without costly hardware changes.
CEO Matt Fisch says the ability to update the sensor in real time and over the air will make a difference for automakers designing new electric and software-defined vehicles, because it lets them “install AEye lidar across a vehicle fleet to enhance basic ADAS (L1–L2) functionality, like automatic emergency braking, while refining the algorithms needed to introduce L3–L5 features and functionality across all vehicle models over the air, in the future, [reducing] hardware validation costs for OEMs, dramatically reducing time-to-market to introduce new features to consumers”.
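To make the idea concrete, the sketch below models a software-definable scan configuration as data that can be swapped at runtime, for example after an OTA update. The class, the field names, and the sensor.configure() call are illustrative assumptions for this article, not AEye’s actual API.

```python
# Illustrative sketch only: parameter names and the configure() hook are
# hypothetical. It shows the core idea of a lidar whose scan behaviour is a
# runtime-swappable configuration rather than a fixed hardware property.
from dataclasses import dataclass

@dataclass
class LidarProfile:
    revisit_rate_hz: float          # how often priority objects are re-measured
    angular_resolution_deg: float   # instantaneous (angular) resolution
    classification_range_m: float   # range at which objects can be classified

# Profiles could be tuned per use case and replaced over the air,
# without touching the sensor hardware.
HIGHWAY = LidarProfile(revisit_rate_hz=100.0, angular_resolution_deg=0.05,
                       classification_range_m=300.0)
URBAN = LidarProfile(revisit_rate_hz=30.0, angular_resolution_deg=0.10,
                     classification_range_m=150.0)

def apply_profile(sensor, profile: LidarProfile) -> None:
    """Push a new scan configuration to a (hypothetical) sensor interface."""
    sensor.configure(
        revisit_rate_hz=profile.revisit_rate_hz,
        angular_resolution_deg=profile.angular_resolution_deg,
        classification_range_m=profile.classification_range_m,
    )
```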
High-Speed Object Detection
Increasing spatial resolution by 400 per cent is the lidar equivalent of capturing a higher-quality image, making it easier for vehicles to see smaller objects at high speeds. Similarly, AEye’s 20-per-cent range improvement means vehicles receive more information about the road ahead for path planning, including better prediction capabilities at longer distances.
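As a rough illustration of what finer angular resolution buys, the back-of-envelope calculation below converts angular resolution into the spacing between neighbouring lidar points at a given distance. The 0.1-degree baseline is an assumption for illustration, not a published AEye specification, and the 400-per-cent increase is read here as roughly five times the baseline point density.

```python
# Back-of-envelope only: the 0.1-degree baseline is assumed for illustration.
import math

def point_spacing_m(distance_m: float, angular_resolution_deg: float) -> float:
    """Approximate lateral spacing between neighbouring lidar points at a given range."""
    return distance_m * math.radians(angular_resolution_deg)

baseline_deg = 0.10                 # assumed baseline angular resolution
improved_deg = baseline_deg / 5.0   # a 400 per cent increase = 5x the original density

for res in (baseline_deg, improved_deg):
    spacing_cm = point_spacing_m(150.0, res) * 100
    print(f"{res:.2f} deg -> about {spacing_cm:.0f} cm between points at 150 m")
```

At the finer setting, a tire-sized object 150 meters away is crossed by several points rather than one or two, which is what makes small debris detectable at speed.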
Higher resolution and greater range, made possible by new algorithms and calibration techniques, make for safer driving because the vehicle can see objects like bricks and tires from farther away (up to 200 meters), giving it more time for decision-making and alleviating the need to “slam on the brakes,” Fisch explained. “This is a critical capability for high-speed, hands-free highway driving,” he said. “4Sight+ detects road surfaces (asphalt and cement) up to 100 meters at highway speeds, even in direct sunlight and low-light conditions. It also allows vehicles to track vulnerable road users – pedestrians, motorcycles, and other vehicles – up to 300 meters.”
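The value of those ranges is easiest to see as time. The short calculation below, using assumed speeds and deceleration values rather than AEye test data, converts detection range into the time available to react at highway speed and compares it with rough braking distances.

```python
# Illustrative values only (not AEye test data): detection range converted to
# reaction time at constant speed, plus rough braking distances v^2 / (2a).
def time_to_object_s(detection_range_m: float, speed_kph: float) -> float:
    return detection_range_m / (speed_kph / 3.6)

def braking_distance_m(speed_kph: float, decel_mps2: float) -> float:
    v = speed_kph / 3.6
    return v * v / (2.0 * decel_mps2)

speed_kph = 120.0  # assumed highway speed
for rng in (100.0, 200.0, 300.0):  # road-surface, debris, and VRU ranges cited above
    t = time_to_object_s(rng, speed_kph)
    print(f"{rng:.0f} m detected at {speed_kph:.0f} km/h -> {t:.1f} s to react")

print(f"comfortable stop (3 m/s^2): about {braking_distance_m(speed_kph, 3.0):.0f} m")
print(f"hard stop (8 m/s^2): about {braking_distance_m(speed_kph, 8.0):.0f} m")
```

At 120 km/h, spotting debris at 200 meters leaves roughly six seconds and enough room for a gradual deceleration, rather than an emergency stop at the limit of the braking distance.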
Phantom Braking
With 4Sight+, the company aims to help automakers eliminate dangerous nuisances like phantom braking, NHTSA’s term for when a vehicle activates its automatic emergency braking (AEB) system without an actual roadway obstacle. “The problem is that cameras alone and even cameras and radar are not enough to prevent this issue, as shadows, road curvatures, faint lane markings, parked vehicles, and metallic structures can create misleading sensing data with increased false positives and negatives, resulting in unintended braking,” Fisch said.
Phantom braking or false positives may be triggered by a variety of conditions, depending on the sensor systems employed by the vehicle, an NHTSA spokesperson explained over email. “As driver assistance systems continue to evolve to employ more effective software filtering, multi-sensor designs, and mitigation strategies, false positives have decreased,” they wrote.
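One of the mitigation patterns the spokesperson alludes to, multi-sensor designs, can be sketched as a simple confirmation gate: emergency braking is considered only when more than one independent sensing modality agrees an obstacle is present. The data structure, thresholds, and function below are illustrative, not any vendor’s actual AEB logic.

```python
# Illustrative fusion gate (not production AEB logic): brake only when an
# obstacle is corroborated by at least two independent modalities with
# sufficient confidence, suppressing single-sensor false positives such as
# shadows (camera) or metallic roadside structures (radar).
from dataclasses import dataclass

@dataclass
class Detection:
    modality: str       # "camera", "radar", or "lidar"
    confidence: float   # 0.0 to 1.0
    range_m: float

def confirm_obstacle(detections: list[Detection],
                     min_confidence: float = 0.7,
                     min_modalities: int = 2) -> bool:
    """Return True only if enough distinct sensor modalities agree on an obstacle."""
    confident = {d.modality for d in detections if d.confidence >= min_confidence}
    return len(confident) >= min_modalities

# A shadow seen only by the camera does not trigger braking...
print(confirm_obstacle([Detection("camera", 0.9, 60.0)]))        # False
# ...but agreement between camera and lidar does.
print(confirm_obstacle([Detection("camera", 0.9, 60.0),
                        Detection("lidar", 0.8, 61.0)]))          # True
```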
Addressing Hazardous Vehicle Cut-Ins
Dangerous vehicle cut-ins are among the most studied areas in ADAS and autonomous-vehicle development and assessment. A landmark 1994 Indiana University study on traffic safety found that human error was the cause of all lane-change or merge crashes. The Indiana researchers concluded that 89 per cent of these crashes were due to drivers failing to recognize the hazard. As technologists set out to mitigate cut-in risk with driver-assist applications, researchers in a 2019 SAE technical paper cast doubt on the ability of ADAS and automated-vehicle systems to behave as well as humans in cut-in scenarios. Those and other researchers have defined and prioritized normal and dangerous cut-in scenarios to assist in designing and assessing predictive safety applications.
More recently, a 2022 NCAP report found that vehicle cut-ins account for 12 per cent of the potentially dangerous situations an autonomous vehicle may encounter while lane keeping on a freeway. The report ranked as most dangerous the scenario in which an automated vehicle’s lane is clear but a vehicle suddenly cuts in from a congested lane, a situation that requires quick and precise reactions to prevent a collision.
It seems clear that any improvement in predictive safety applications focused on cut-ins can positively affect crash statistics.
DVN comment
Improving a lidar’s capabilities without changing the hardware is a key advantage, as it avoids the high cost and long lead times of new hardware development. Real-world driving scenarios demand different sensor performance at different moments, and a lidar can show its full advantage if it can automatically adapt its physical performance (range, resolution, accuracy) to the road context.
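In its simplest form, the kind of automatic adaptation described here could look like a policy that maps the current road context to sensor settings. The contexts, parameter names, and values below are purely illustrative assumptions, not a description of any shipping system.

```python
# Sketch of context-driven adaptation (illustrative names and values): pick
# range / resolution / revisit targets from the current road context instead
# of running one fixed scan pattern everywhere.
def select_lidar_settings(road_type: str, ego_speed_kph: float) -> dict:
    """Return a (hypothetical) sensor configuration suited to the driving context."""
    if road_type == "highway" and ego_speed_kph > 90:
        # long range and fine resolution ahead, for early detection of debris and cut-ins
        return {"range_m": 300, "angular_resolution_deg": 0.05, "revisit_rate_hz": 100}
    if road_type == "urban":
        # shorter range, frequent revisits for pedestrians and cyclists
        return {"range_m": 120, "angular_resolution_deg": 0.10, "revisit_rate_hz": 50}
    # conservative default for everything else
    return {"range_m": 200, "angular_resolution_deg": 0.08, "revisit_rate_hz": 60}

print(select_lidar_settings("highway", 120))
```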