Lumotive, a startup based in Seattle, Washington, is developing beam-steering semiconductor chips based on optical metasurfaces. At CES, the company announced it has raised $13m in a new round of venture finance.
The support, provided by Samsung Ventures plus new investors USAA (United Services Automobile Association) and Korea-based electronics distributor Uniquest, brings total investment in the firm to $56m, and is intended to accelerate the development and customer delivery of its devices for advanced lidar sensing.
Lumotive (Bill Gates was an early investor) won a Prism Award in 2022, and says it is engaged with “more than two dozen world-class companies” looking to use its patented light-control metasurface (LCM) chips.
The devices are aimed at next-generation lidar systems for the autonomy, automation, and augmented reality markets. Suitable for software-defined lidar, with no mechanical parts required to steer the beam and therefore zero mechanical inertia, Lumotive says its LCMs can steer light in any pattern across the entire field of view within microseconds.
Manufactured in a major chip foundry using a silicon CMOS process and compatible with vertical-cavity surface-emitting lasers (VCSELs), the approach aims to take advantage of mass-produced photonic and electronic devices to reduce the cost and size of lidar sensors.
Lumotive says the design and assembly process closely resemble smartphone production, with an architecture that can be scaled from tiny close-range sensing to the kind of high-performance, extended-range operation required by AVs. At last year’s Vision trade show in Stuttgart, Germany, the company demonstrated a reference design platform combining its LCMs with time-of-flight CMOS image sensors from development partner Gpixel, adding that an updated version suitable for volume production should become available by mid-2023.
Lumotive CEO Sam Heidari said the latest funding will “accelerate the deployment of the current generation of LCM chips and the development of the next generation of our product”.
DVN comment
Zero-inertia beam steering, capable of arbitrary scan patterns, allows a sensor to adapt to any situation. It reduces overall system cost by supporting unlimited virtual sensors in a single device, and it enables scan modes that react instantly to the scene, such as HDR illumination. This is an evident advantage over classical MEMS or macro-mechanical mirrors, which always follow the same scanning pattern even when parts of the field of view are of no interest.
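To make the "virtual sensors" idea concrete, here is a minimal illustrative sketch (not Lumotive's API; all region names, angles, and resolutions are hypothetical) of how a software-defined scan pattern might allocate beam-steering dwell points unevenly across a field of view, scanning densely where it matters and sparsely elsewhere:

```python
# Illustrative sketch of a software-defined lidar scan pattern.
# Not Lumotive's API: Region, build_scan_pattern, and all angles
# and resolutions below are hypothetical.

from dataclasses import dataclass

@dataclass
class Region:
    name: str       # label for this "virtual sensor"
    az_min: float   # horizontal extent, degrees
    az_max: float
    step: float     # angular resolution, degrees

def build_scan_pattern(regions):
    """Return an ordered list of (region, azimuth) steering commands.

    With zero mechanical inertia, the beam can jump between regions in
    any order, so each virtual sensor gets its own resolution without
    wasting dwell time on uninteresting parts of the field of view.
    """
    pattern = []
    for r in regions:
        az = r.az_min
        while az <= r.az_max:
            pattern.append((r.name, round(az, 3)))
            az += r.step
    return pattern

# Dense scan ahead of the vehicle, sparse scan at the periphery
regions = [
    Region("periphery_left", -60.0, -20.0, 4.0),
    Region("road_ahead",     -20.0,  20.0, 0.5),
    Region("periphery_right", 20.0,  60.0, 4.0),
]
pattern = build_scan_pattern(regions)
```

A mechanical mirror would sweep all three regions at the same fixed resolution; here the central region receives eight times the angular density of the periphery, and the region list can be redefined between frames in software.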