AEye Applies AI to Sensor Tech for a Safer Self-Driving Experience

AEye self-driving car.

While the widespread use of self-driving vehicles may still be some years away, the onboard systems and technologies that help keep passengers safe are constantly improving. One such technology is LiDAR (Light Detection and Ranging), a remote sensing method that emits rapid pulses of laser light and times their reflections to measure the distance to surrounding objects. Although LiDAR found its first uses in military and aerospace applications in the late 1960s, the recent application of artificial intelligence has expanded its potential, allowing the sensor technology to serve a wider range of markets, including the automotive industry.

Luis Dussan, founder and CTO of LiDAR startup AEye, saw this potential from his experiences as an engineer for Lockheed Martin and Northrop Grumman, where he designed reconnaissance, targeting, and defense systems for fighter jets. The systems’ ability to recognize and respond to objects in real time meant saving the lives of pilots, and when Dussan realized that self-driving vehicles faced a similar challenge, he decided to build a perception system that could deliver military-grade performance in autonomous cars. Out of this idea, Dussan founded AEye in 2013 – the first company to apply artificial intelligence to a commercially available LiDAR sensor.

Vehicles That See Better Than Humans

AEye’s AI-driven, LiDAR-based technology gives self-driving vehicles the ability to scan their environment more accurately and maintain real-time, 360-degree “vision.” Known as iDAR (Intelligent Detection and Ranging), the pioneering intelligent sensing system mimics how the human visual cortex focuses on and evaluates its surroundings, including potential driving hazards. To achieve human-like perception, the iDAR platform captures, processes, and optimizes both 2D and 3D data in real time. For the automotive, trucking, rail, and intelligent transportation systems (ITS) markets, such a perception system could help improve safety and efficiency, especially when designed by a team of scientists and electro-optics engineers from NASA, Lockheed, the Air Force, DARPA, and other agencies.
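Fusing 2D camera data with 3D LiDAR returns typically means projecting each 3D point onto the camera’s image plane so the two modalities can be reasoned about together. The sketch below illustrates that general idea with a standard pinhole camera model; the intrinsic matrix, point values, and function names are illustrative assumptions, not details of AEye’s actual system.

```python
import numpy as np

# Toy intrinsic matrix for a hypothetical camera: 800 px focal length,
# principal point at the center of a 1280x720 frame. Illustrative only.
K = np.array([
    [800.0,   0.0, 640.0],
    [  0.0, 800.0, 360.0],
    [  0.0,   0.0,   1.0],
])

def project_points(points_3d: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project Nx3 points (camera frame, z pointing forward) to pixel coords."""
    # Keep only points in front of the camera.
    pts = points_3d[points_3d[:, 2] > 0]
    # Pinhole projection: pixel = K @ (x, y, z), then divide by depth z.
    uvw = (K @ pts.T).T
    return uvw[:, :2] / uvw[:, 2:3]

# Three sample LiDAR returns, all 10 m ahead with small lateral offsets.
cloud = np.array([
    [0.0,  0.0, 10.0],   # straight ahead -> lands at image center
    [1.0,  0.0, 10.0],   # 1 m to the right
    [0.0, -0.5, 10.0],   # 0.5 m up
])

pixels = project_points(cloud, K)
print(pixels)  # [[640. 360.] [720. 360.] [640. 320.]]
```

Once each 3D point has a pixel coordinate, camera-derived labels (say, “pedestrian”) can be attached to LiDAR range measurements, which is one common way 2D and 3D data are combined in perception stacks.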

Point cloud image from AEye's system.

“By pushing intelligence and processing to the edge of the network, AEye’s integrated perception system, which combines software extensibility, artificial intelligence, and smart, active sensors, is able to scan its surroundings and quickly identify critical objects. This is key to improving safety and reliability for autonomous vehicles,” explains Jordan Greene, AEye co-founder and VP of Corporate Development.

The iDAR platform has won several startup awards and further cemented AEye’s stated mission to save lives and propel the future of transportation. Greene says the company aims to bring AI-powered sensors and perception capabilities to the mass market “in order to change the way robots, big and small, perceive the environment.”

So how exactly does the application of AI improve upon LiDAR’s sensor technology? The use of artificial intelligence boosts the system’s information-gathering abilities, creating a domino effect of benefits, including smarter data, quicker perception, faster performance, and lower computational costs. “Our embedded AI enables us to collect four to eight times the information of conventional, fixed pattern LiDAR using a fraction of the power. Current LiDAR-only solutions lack this intelligence and face severe limitations in their ability to respond to changing conditions,” explains Greene.
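The contrast Greene draws is between a fixed-pattern scanner, which spends its laser pulses uniformly regardless of scene content, and an intelligent one that concentrates pulses where perception flags interest. The toy sketch below shows that budgeting idea in the simplest possible form; the region names, saliency scores, and function are invented for illustration and are not AEye’s actual algorithm.

```python
# Toy sketch of "intelligent" scan scheduling, assuming the sensor can
# revisit regions of interest instead of sweeping a fixed pattern.

def allocate_scan_budget(saliency: dict, total_pulses: int) -> dict:
    """Split a fixed pulse budget across regions in proportion to saliency."""
    total = sum(saliency.values())
    return {region: round(total_pulses * score / total)
            for region, score in saliency.items()}

# A fixed-pattern scanner would spend 250 of 1000 pulses on each of these
# four regions regardless of content; an adaptive one reallocates them.
saliency = {
    "pedestrian_crossing": 0.5,  # high risk: interrogate densely
    "oncoming_lane":       0.3,
    "roadside":            0.1,
    "empty_sky":           0.1,  # low information: sparse coverage
}

budget = allocate_scan_budget(saliency, total_pulses=1000)
print(budget)
# {'pedestrian_crossing': 500, 'oncoming_lane': 300, 'roadside': 100, 'empty_sky': 100}
```

The same pulse budget (and hence the same power draw) yields far more samples on safety-critical regions, which is the rough intuition behind collecting “four to eight times the information” without a proportional increase in power.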

Headshot of Jordan Greene, AEye co-founder and VP of Corporate Development.

Innovating During the Pandemic

Like many businesses, AEye encountered the need to respond urgently to new circumstances brought on by the pandemic. In early 2020, the startup was preparing to launch its 4Sight Mobility (4SightM) product, a smart sensor built on the company’s iDAR platform. Due to worldwide lockdowns, however, traveling to give demos of the system to potential customers and partners in Detroit, Europe, Japan, and South Korea became impossible. Finding a way to offer these demos virtually was crucial for the launch of the artificial perception system, especially as it was a new product in a new market. Traditional video-conferencing solutions weren’t suitable because AEye needed a very low-latency platform to showcase the high-performance capabilities of its AI-driven LiDAR system. It was at this moment that the company’s CEO, whose three sons are teenage gamers, had the idea of using a gaming platform to demo iDAR.

“That’s exactly what gaming demands, with its high-density, low latency performance, so it was a perfect match,” says Greene. The company was able to use the API from gaming platform Discord to create an interactive demo platform. Within two months of the shutdown, AEye was giving live, real-time driving demos over Discord to engineers all over the world. “They loved it because they could ask us to test whatever they wanted in a live setting — for example, ‘Go on the freeway,’ or ‘Have someone walk in front of the sensor.’”

Rising Demand for LiDAR From Automakers

While LiDAR technology was previously reserved for military applications due to its expense, advances in technology and cost reductions have brought it to the point of commercialization.

With the potential that AI-driven, LiDAR-based perception has to improve upon human perception, Greene says that automakers are “clamoring to implement the technology in order to roll out increasingly advanced safety features, including advanced driver-assistance. Add to that, we have a new administration, new clean air mandates, and a president incentivized to prioritize traffic safety.”

Although LiDAR’s advances for the automotive industry are only now emerging from the R&D stage, Greene points out that the technology will be deployed at scale in the next few years. “We are partnered with some of the world’s leading Tier 1 automotive suppliers, including Continental AG, and bidding on large OEM contracts for the industry’s first commercial ADAS deployments using LiDAR at volume. These contracts will be awarded in the next six months, for rollouts taking place in 2024-25, and we expect to win our unfair share of them.” The startup has also been testing its sensors with a wide range of customers and integrators in trucking, transit, construction, rail, ITS, aerospace, and defense markets.

Greene believes that the future for AEye looks bright, and that the market has matured greatly since the company was established. Although finding a highly promising market is never easy, Greene says several factors have nurtured the company’s growth since day one: hiring world-class talent, prioritizing a company culture of transparency and accountability, focusing on core competencies, and securing funding from hands-on financial investors who have been key to the company’s success.

“Customers know what they’re looking for [and] have a much more solid grasp on what the technology is capable of and how it can be integrated and implemented for a variety of use cases. We’re excited about the year ahead, the growth we anticipate, and our role in enabling a safer future.”

Suchi Rudra

Suchi Rudra is a freelance writer who is passionate about covering emerging tech, entrepreneurship, and real estate. Her work has appeared in The New York Times, Fast Company, VICE, EdTech Magazine, and many other publications.
