A robot that can see around corners using wireless signals and AI

Penn engineers have developed a system that allows robots to see around corners using radio waves processed by AI, a capability that improves the safety and performance of self-driving cars and of robots operating in cluttered indoor environments such as warehouses and factories.

The system, called HoloRadar, allows a robot to reconstruct three-dimensional scenes outside its direct line of sight, such as a pedestrian rounding a corner. Unlike previous approaches to non-line-of-sight (NLOS) sensing that rely on visible light, HoloRadar operates reliably in the dark and under varied lighting conditions.

“Robots and self-driving cars need to look beyond what’s right in front of them,” says Mingmin Zhao, assistant professor of computer and information science (CIS) and senior author of the paper describing HoloRadar, presented at the 39th Annual Conference on Neural Information Processing Systems (NeurIPS). “This capability is essential for enabling robots and self-driving cars to make safer decisions in real time.”

From left: Zitong Lan, Haowen Lai, Mingmin Zhao. (Credit: Sylvia Chan)

Turning walls into mirrors

At the heart of HoloRadar is a counterintuitive insight about radio waves. Compared to visible light, wireless signals have much longer wavelengths, which limits their resolution; this property has traditionally been considered a disadvantage for imaging. Zhao’s team found that, for seeing around corners, longer wavelengths are actually an advantage.

“Because radio wavelengths are much larger than the small irregularities on a wall’s surface, those surfaces effectively become mirrors that reflect radio signals in a predictable way,” says Haowen Lai, a CIS doctoral student and co-author of the new paper.

In practice, this means that flat surfaces such as walls, floors, and ceilings can reflect radio signals around corners, conveying information to the robot about hidden spaces. HoloRadar captures these reflections and reconstructs what lies beyond the robot’s direct line of sight.

“This is similar to how human drivers rely on mirrors placed at intersections with poor visibility,” Lai said. “Because HoloRadar uses radio waves, the environment itself is already full of mirrors, without any need to actually change the environment.”
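The mirror analogy can be made concrete with a little geometry. In specular reflection, a sensor effectively sees a *virtual image* of the hidden object, mirrored across the wall’s plane, and reflecting that image back across the same plane recovers the true position. The sketch below is illustrative only, with assumed coordinates; it is not HoloRadar’s actual implementation:

```python
# Reflect a 3D point across a plane -- the "mirror" a flat wall provides.
# Illustrative geometry only, not HoloRadar's actual code.

def reflect_across_plane(p, plane_point, unit_normal):
    """Mirror point p across the plane through plane_point with unit_normal."""
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, unit_normal))
    return tuple(pi - 2 * d * ni for pi, ni in zip(p, unit_normal))

# A wall lying in the plane x = 0 (unit normal along +x).
wall_point, wall_normal = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)

# A pedestrian hidden around the corner at x = -2.
hidden = (-2.0, 3.0, 1.0)

# The radar "sees" a virtual image on the far side of the wall...
virtual = reflect_across_plane(hidden, wall_point, wall_normal)

# ...and reflecting that image back recovers the true position.
recovered = reflect_across_plane(virtual, wall_point, wall_normal)
print(virtual, recovered)
```

Here the wall acts exactly like an optical mirror: the virtual image sits at (2, 3, 1), and undoing the reflection lands back on the hidden pedestrian.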

HoloRadar utilizes compact and agile scanning equipment, enabling real-world applications. (Credit: Sylvia Chan)

Designed for field operation

In recent years, other researchers have demonstrated systems with similar capabilities, but these typically use visible light. Because they analyze shadows and faint indirect reflections, they depend heavily on lighting conditions. Other attempts using wireless signals relied on slow, bulky scanning equipment, limiting real-world applications.

“HoloRadar is designed to work in environments where robots would actually operate,” Zhao said. “The system is mobile, runs in real time, and does not rely on controlled lighting.”

HoloRadar enhances the safety of autonomous robots by complementing, rather than replacing, existing sensors. Self-driving cars already use LiDAR, a sensing system that uses lasers to detect objects within a vehicle’s direct line of sight, but HoloRadar adds a layer of awareness by revealing what these sensors can’t see, giving machines more time to react to potential hazards.

AI-powered radio processing

A single radio pulse can reflect multiple times before returning to the sensor, creating a complex set of reflections that are difficult to disentangle using only traditional signal processing methods.

To solve this problem, the team developed a custom AI system that combines machine learning and physically-based modeling. In the first stage, the system improves the resolution of the raw radio signal and identifies multiple “returns” corresponding to different reflection paths. In the second stage, the system uses a physically guided model to track these reflections backwards, undoing the mirror-like effect of the environment and reconstructing the actual 3D scene.

“In some ways, this challenge is like walking into a room full of mirrors,” says Zitong Lan, a doctoral student in Electrical and Systems Engineering (ESE) and co-author of the paper. “You see many copies of the same object appearing in different locations, and the challenge is figuring out where the object actually is. Our system learns how to reverse that process in a physics-based way.”

By explicitly modeling how radio waves reflect off surfaces, AI can distinguish between direct and indirect reflections and determine the correct physical location of various objects, including people.
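The “room full of mirrors” idea implies a useful consistency check: each reflecting surface produces its own virtual copy of an object, and undoing the matching mirror should collapse every copy to a single real location. The toy sketch below uses assumed wall geometry purely for illustration; it is not the paper’s learned model:

```python
# Toy "room full of mirrors": one hidden object appears as several virtual
# copies, one per reflecting wall; reflecting each copy back across the wall
# that produced it collapses them all to a single real position.
# Assumed geometry for illustration -- not the paper's model.

def reflect_across_plane(p, plane_point, unit_normal):
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, unit_normal))
    return tuple(pi - 2 * d * ni for pi, ni in zip(p, unit_normal))

walls = {
    "side wall (x = 0)": ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)),
    "back wall (y = 0)": ((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
}

true_position = (-2.0, 4.0, 1.0)

# Each wall contributes one mirrored "copy" of the object...
virtual_copies = {
    name: reflect_across_plane(true_position, q, n)
    for name, (q, n) in walls.items()
}

# ...and undoing the matching mirror maps every copy to the same point.
recovered = {
    reflect_across_plane(v, *walls[name])
    for name, v in virtual_copies.items()
}
print(recovered)  # a single point: the object's true location
```

In a real system the hard part is attributing each radar return to the right surface; once that pairing is known, the mirror geometry pins down the object’s true position, which is the role the physics-guided model plays in the second stage.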

HoloRadar works by reconstructing 3D scenarios from radio wave reflections. (Credit: WAVES Lab)

From the laboratory to the real world

The researchers tested HoloRadar on a mobile robot in real indoor environments, such as hallways and building corners. In these settings, the system successfully reconstructed walls, hallways, and hidden figures outside the robot’s line of sight.

Future research will investigate outdoor scenarios such as intersections and urban roads where longer distances and more dynamic situations pose additional challenges.

“This is an important step in giving robots a more complete understanding of their surrounding environment,” Zhao says. “Our long-term goal is to enable machines to operate safely and intelligently in the dynamic and complex environments that humans move through every day.”

This study was conducted in the Wireless, Audio, Vision, and Sensing Electronics (WAVES) Lab at the University of Pennsylvania’s School of Engineering and Applied Science and was supported by the University of Pennsylvania.
