Rice University’s EyeDAR Radar Sensors Could Reshape Autonomous Vehicle Safety Through Smart Infrastructure

Autonomous vehicles could soon get an extra set of eyes with a new compact roadside radar system. This technology aims to reduce blind spots and make driving safer when visibility is poor.

Researchers at Rice University have created EyeDAR, a low-power radar sensor about the size of an orange. It helps self-driving vehicles by capturing radar reflections that would otherwise go unnoticed, according to Rice University News.

Unlike conventional systems that rely solely on onboard cameras and lidar, EyeDAR is designed to be mounted on infrastructure such as streetlights and intersections. Interesting Engineering reported that the device acts as a second set of eyes that functions reliably where cameras and lidar often fail.

Moving the Intelligence to the Road

Rather than putting more complex computers in vehicles, the research team chose to add intelligence directly to the road.

“EyeDAR is an example of what I like to call ‘analog computing,’” said Kun Woo Cho, the Rice University postdoctoral researcher leading the project. Cho added, “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”

Cho, who works in the lab of Ashutosh Sabharwal, Rice’s Ernest Dell Butcher Professor of Engineering, presented the technology at HotMobile, the International Workshop on Mobile Computing Systems and Applications, held in Atlanta on Feb. 25–26.

Why Current Sensors Fall Short

Cho said that vehicle sensor systems face significant limitations in bad weather or poor lighting. Cameras and lidar, the core sensors in most self-driving cars, struggle in rain, fog, and low light. Radar, on the other hand, is more robust and can even detect objects through some obstructions.

Standard radar systems send out signals that bounce off nearby objects, but only a small amount of those signals make it back to the vehicle. Most radar waves scatter away, so self-driving systems only get part of the picture. This problem is even more serious when pedestrians step out from behind big vehicles or when cyclists come from tricky angles.
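The steep fall-off described above follows from the standard monostatic radar range equation, in which received power drops with the fourth power of range. The sketch below uses illustrative 77 GHz automotive-radar parameters (the power, gain, and cross-section values are assumptions for illustration, not figures from the article):

```python
import math

def radar_received_power(pt_w, gain, wavelength_m, rcs_m2, range_m):
    """Standard monostatic radar range equation:
    P_r = P_t * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)."""
    return (pt_w * gain**2 * wavelength_m**2 * rcs_m2) / (
        (4 * math.pi) ** 3 * range_m**4
    )

# Illustrative parameters (assumed): 1 W transmit, 20 dBi antenna,
# 77 GHz carrier, ~1 m^2 pedestrian cross-section, 50 m range.
pt = 1.0
gain = 100.0                 # 20 dBi, as a linear factor
wavelength = 3e8 / 77e9      # ~3.9 mm
rcs = 1.0
r = 50.0

pr = radar_received_power(pt, gain, wavelength, rcs, r)
print(f"Received power: {pr:.3e} W "
      f"({10 * math.log10(pr / pt):.1f} dB below transmit)")
```

Even under these generous assumptions the echo comes back more than 100 dB weaker than the transmitted signal, which is why any reflections scattered away from the vehicle represent information permanently lost to an onboard-only sensor.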

EyeDAR is designed to address that gap by capturing radar reflections that would otherwise be lost and relaying that directional information back to vehicles in real time. Cho described the system as effectively providing automotive radar systems with an additional set of eyes.

A Lens Inspired by the Human Eye

At the heart of EyeDAR’s design is a 3D-printed Luneburg lens inspired by the human eye.

The device has more than 8,000 tiny, uniquely shaped resin elements with different refractive properties. By arranging them carefully, the lens bends radar waves toward a focal point, so most of the direction-finding happens physically instead of digitally.

“Our lens consists of over 8,000 uniquely shaped, extremely small elements with a varying refractive index,” Cho explained.
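The article does not publish EyeDAR’s exact element geometry, but the classical Luneburg lens it is named for has a well-known refractive-index profile, n(r) = sqrt(2 − (r/R)²), which focuses every incoming parallel beam to a point on the opposite surface, so arrival direction maps directly to focal-spot position. A minimal sketch of that profile (the 5 cm radius is an assumption, chosen only to match the article’s orange-sized description):

```python
import math

def luneburg_index(r, radius):
    """Classical Luneburg lens profile: n(r) = sqrt(2 - (r/R)^2).
    The index is sqrt(2) at the center and 1 at the surface, so the
    lens is impedance-matched to free space at its boundary."""
    if not 0.0 <= r <= radius:
        raise ValueError("r must lie within the lens")
    return math.sqrt(2.0 - (r / radius) ** 2)

R = 0.05  # assumed 5 cm radius, roughly orange-sized
for frac in (0.0, 0.5, 1.0):
    print(f"n({frac:.1f} R) = {luneburg_index(frac * R, R):.3f}")
```

Grading the index smoothly from center to surface is what the thousands of uniquely shaped resin elements approximate in practice: each element’s shape sets an effective local refractive index, so the direction-finding computation is done by the physics of the lens rather than by a processor.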

Since the physical structure does most of the work, EyeDAR needs much less processing power than traditional radar systems that use big antenna arrays and complex algorithms.

In testing, the system resolved target directions more than 200 times faster than traditional digital radar designs, according to both Rice University News and Interesting Engineering.

A “Talking Sensor” That Communicates Without New Signals

EyeDAR is designed not just for sensing, but also for communication.

Instead of sending out new radar signals, the device switches between absorbing and reflecting incoming waves. This creates patterns of 0s and 1s that can be understood by other systems.

“Like blinking Morse code,” Cho said. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”
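The absorb/reflect scheme Cho describes resembles on-off-keyed backscatter: a reader sees a strong echo when the device reflects and a weak one when it absorbs, and the pattern of echoes carries the bits. A toy sketch under that assumption (the function names, amplitudes, and threshold are hypothetical, not Rice’s actual protocol):

```python
def encode_backscatter(message: str) -> list[int]:
    """Map each ASCII character to 8 absorb/reflect states
    (0 = absorb, 1 = reflect), most significant bit first."""
    bits = []
    for ch in message.encode("ascii"):
        bits.extend((ch >> i) & 1 for i in range(7, -1, -1))
    return bits

def modulate(bits, reflect_amp=1.0, absorb_amp=0.05):
    """The reader sees strong echoes for 'reflect', weak for 'absorb'."""
    return [reflect_amp if b else absorb_amp for b in bits]

def demodulate(amplitudes, threshold=0.5) -> str:
    """Threshold the echo amplitudes back into bits, then bytes."""
    bits = [1 if a > threshold else 0 for a in amplitudes]
    chars = bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )
    return chars.decode("ascii")

echo = modulate(encode_backscatter("OK"))
print(demodulate(echo))  # -> OK
```

Because the device only toggles how it treats incoming waves rather than generating a carrier of its own, this style of signaling needs very little power, which is consistent with the article’s claim that the dual design keeps EyeDAR low-power and cheap to deploy.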

This dual sensing-and-communication architecture allows EyeDAR to remain low-power and cost-effective, making large-scale deployment feasible.

Potential Urban Deployment

Thanks to its small size and low cost, researchers think EyeDAR units could be placed at many traffic lights, stop signs, and intersections.

If placed in the right spots, these sensors could detect hidden dangers, such as a pedestrian behind a truck or a car approaching from a blind spot, and relay that information to the self-driving vehicles below.

This system could be especially useful in crowded cities where it is often hard to see everything on the road.

Researchers also believe this technology could help drones, robots, and wearable devices. Networks of these sensors could share information, giving each device a bigger picture of its surroundings.

The research received partial funding from the National Science Foundation (2346550), according to Rice University.

As self-driving cars become more common, EyeDAR shows a new way of thinking. Safety might depend not just on smarter cars, but also on smarter roads.
