AEye presented the key features of its 4Sight lidar sensing platform concept at AutoSens Detroit. According to the company, the 4Sight™ Intelligent Sensing Platform can be configured via software for different vehicle placements. A single platform that is configurable through software and works in multiple mounting locations gives automakers significant vehicle design and aesthetic flexibility. The concept also helps advance the pursuit of software-defined cars, including over-the-air updates that improve safety features over time without replacing the lidar sensor hardware.
AEye cofounder and automotive general manager Jordan Greene says, “AEye customers gain the distinct advantage of a single platform that can be modified for any vehicle model and application, increasing adoption and deployment across automaker platforms and reducing engineering costs. Moving AEye sensor hardware from one location on a vehicle to another does not require a mechanical adaptation, as the sensor’s performance parameters can be configured by a simple software operation. This provides our go-to-market partners, like Continental, the ultimate flexibility in design, without compromising top-end performance in the process”.
As automakers shift towards software-driven business models, they are looking to software-defined hardware to absorb new technological advancements and to deploy new, innovative services. AEye’s adaptive sensor platform can be configured via software for different vehicle placements, use cases, and markets to help automakers realise their vision of smart assets and software-defined vehicles.
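The idea of one sensor platform re-parameterised in software per mounting location can be sketched as a small configuration model. This is purely illustrative: the profile names, parameter names, and values below are assumptions for the sake of the sketch, not AEye’s actual API or specifications.

```python
from dataclasses import dataclass

# Hypothetical configuration model for a software-configurable lidar.
# Field names and values are illustrative assumptions, not AEye's API.
@dataclass(frozen=True)
class LidarConfig:
    mount: str          # mounting location, e.g. "roofline"
    max_range_m: float  # detection range in metres
    h_fov_deg: float    # horizontal field of view
    v_fov_deg: float    # vertical field of view

# The same hardware unit, reconfigured in software per placement
# (no mechanical adaptation needed):
PLACEMENT_PROFILES = {
    "roofline": LidarConfig("roofline", 300.0, 120.0, 25.0),
    "behind-grille": LidarConfig("behind-grille", 200.0, 90.0, 20.0),
}

def configure(placement: str) -> LidarConfig:
    """Select the software profile for a given mounting location."""
    return PLACEMENT_PROFILES[placement]
```

Moving the sensor to a new location would then amount to selecting a different profile, which is the kind of flexibility an over-the-air update could also exploit.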
In this context we put two key questions to AEye senior automotive VP Bernd Reichert:
DVN: Your company’s mission is to drive the future of safe autonomy. How do you see the role of lidar sensors in this context and especially the lidar technology of AEye?
BR: The first electromechanical lidar sensors, with big rotating mirrors, were developed and deployed in various automotive applications (for instance, the Audi A8 and the Mercedes S-Class).
We see a shift to solid-state MEMS-based products, which are much more robust and can meet the requirements of both passenger and commercial vehicle manufacturers while offering higher performance in range, resolution, and field of view. Knowing that the automotive industry’s transition will require new ways to develop such technology, AEye created a configurable, software-defined lidar solution. It is the only lidar on the market that supports the development of the software-defined vehicle.
So, our solutions and roadmaps are in line with the ambitions of the European automakers, which have defined clear strategies about how they want to master this transition, where the car becomes a data-centric mobile device, similar to a smartphone.
DVN: Will you share your view on the development of ADAS and AVs in this decade? Where in the world will the automotive community see the broadest progress? How will passenger cars compare to trucks, delivery service vehicles, robotaxis, etc?
BR: Let me start by pointing out the massive investment governments across the world are making in Smart City technology and infrastructure. This is happening across Asia, the Middle East, the EU, and North America. In parallel, the entire automotive industry is moving toward EV fleets in a matter of years. For EVs and smart cities, this means software enables new features, such as ADAS, and the updates and upgrades of those features over time. What is most exciting is that the addition of these features can all happen through software updates—no change in hardware needed. We also see an increasing shift towards software-defined EVs happening in commercial vehicles and, really, any machine that moves.
A couple of years ago, we saw strong enthusiasm for AVs and robotaxis on the streets. While we had anticipated this happening much earlier, we know that regulation has been a major bottleneck to the adoption of autonomous vehicles. The first L3 traffic-jam pilots, in which autonomous systems control driving and monitoring in some situations, have already received regulatory approval. The first L4 highway pilot applications, with a higher degree of automation and higher speeds than traffic-jam pilots, will be possible by 2024 for private cars.
These highway pilot applications require lidar sensors with longer range, improved scan patterns, and focus on the corresponding region of interest to make fast decisions and focus on the important data.
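The region-of-interest idea above can be sketched as a simple point-budget split: a narrow ROI (for instance, the lane far ahead) is sampled more densely than the periphery. This is a hypothetical sketch of the general technique, not AEye’s actual scan scheduler; the function and its parameters are assumptions for illustration.

```python
# Hypothetical region-of-interest scan weighting: divide a fixed
# per-frame point budget so the ROI is sampled roi_weight times more
# densely than the rest of the field of view. Illustrative only.
def allocate_points(budget: int, roi_frac: float, roi_weight: float):
    """Split a per-frame point budget between ROI and periphery.

    budget     -- total points available per frame
    roi_frac   -- fraction of the field of view covered by the ROI
    roi_weight -- target sampling-density ratio of ROI to periphery
    """
    weighted_roi = roi_frac * roi_weight
    total = weighted_roi + (1.0 - roi_frac)
    roi_points = round(budget * weighted_roi / total)
    return roi_points, budget - roi_points

# Example: an ROI covering 20% of the field of view, sampled 4x denser,
# ends up receiving half of the frame's point budget.
roi, periphery = allocate_points(1000, 0.2, 4.0)
```

With these example numbers the ROI density (points per unit of field of view) works out to exactly `roi_weight` times the periphery density, which is the intended behaviour.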
Early highway pilots will be on commercial vehicles for hub-to-hub automation, most likely limited to certain well-mapped highways and excluding operational design domains such as merging or crossing highways. Multiple truck automakers and autonomous system providers are conducting pilots in North America with fully autonomous trucks driving on highways, still with a safety driver on board as backup. Favourable regulations in the southern United States, combined with good weather conditions, help accelerate the progress in autonomy for the commercial vehicle segment.
For robotaxis, we see the deployment at scale starting in the middle of the decade, maybe 2026, with a focus on megacities that have a higher volume of bookings. The US and China will lead this market.
Thank you, Bernd, for this enlightening interview. We wish you great success with your enterprise and hope to meet you in person at one of our DVN lidar events.