8/6/2020
Written by Allie Arp, CSL
While a picture may be worth a thousand words, it doesn’t always tell the whole story. When it comes to autonomous vehicles, cameras are often used to gauge the environment and potential obstacles around the car, but these cameras have limitations. That’s why CSL students Spencer Markowitz and Teck-Yian Lim are looking at how radar, generally associated with aircraft, could be used to improve autonomous vehicle safety.
“Most people think radar belongs in military defense systems, but recent innovations have allowed people to make radar technology really small, which means we can put it just about anywhere,” said Markowitz, an electrical and computer engineering student. “We’re trying to create detectors based on radar data to complement cameras on autonomous vehicles.”

Cameras have several limitations. They can’t see in the dark or through opaque objects, and they can’t reliably determine the velocity of moving objects. These limitations could lead to dangerous situations on the road. Like a camera, radar can detect objects such as bikes, other cars, or pedestrians around a vehicle, but radar has additional capabilities, such as determining the direction an object is moving and how quickly. Knowing how objects in the environment are moving, and will move, in relation to the vehicle is crucial for safe autonomous technology.
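Radar gets velocity essentially for free through the Doppler effect: a reflection off a moving object returns at a shifted frequency, and the radial speed falls directly out of that shift. A minimal back-of-the-envelope sketch, assuming a 77 GHz carrier (a common automotive radar band; the numbers here are illustrative, not details from the project):

```python
# Why radar can measure speed directly: the reflected signal is
# frequency-shifted by the Doppler effect, and the radial speed falls
# straight out of that shift. The 77 GHz carrier is an assumed value
# (a common automotive radar band), not a detail from the article.
C = 3.0e8                  # speed of light, m/s
CARRIER_HZ = 77e9          # assumed automotive radar carrier frequency
WAVELENGTH = C / CARRIER_HZ

def radial_speed(doppler_shift_hz: float) -> float:
    # Monostatic radar: the round trip doubles the Doppler shift,
    # so v = f_d * wavelength / 2.
    return doppler_shift_hz * WAVELENGTH / 2.0

# Example: a 5 kHz shift corresponds to roughly 9.7 m/s (about 35 km/h).
print(f"{radial_speed(5e3):.1f} m/s")
```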
To make radar useful for autonomous vehicles, Markowitz and Lim are creating machine learning algorithms that can identify the wide variety of objects commonly found on roads today, distinguishing, for example, between a sign and a person, a moving car and a parked car, or a bicycle and a motorcycle.
“We’re approaching this problem by using supervised learning, which means we give our models lots of labeled examples of radar data,” said Markowitz. “We tell the system that this blob is a human, this blob is a bicycle, and once we give our model enough of these examples, we hope we can train it to identify new examples on its own.”
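In code, that supervised-learning workflow looks roughly like the sketch below. Everything in it is an assumption made for illustration: the radar “blobs” are summarized as fixed-length feature vectors, the class list and stand-in data are hypothetical, and the off-the-shelf scikit-learn classifier is a generic choice, not the team’s actual model.

```python
# Minimal sketch of the supervised-learning setup described above.
# Assumptions (not from the article): radar detections are summarized as
# fixed-length feature vectors (e.g., range, velocity, cross-section
# statistics), and labels come from human annotation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

CLASSES = ["pedestrian", "bicycle", "moving_car", "parked_car", "sign"]

# Stand-in for real labeled radar data: 1000 examples, 8 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = rng.integers(len(CLASSES), size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# "Give the model lots of labeled examples": fit on the annotated blobs.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Then ask it to identify new, unseen examples on its own.
print(classification_report(y_test, clf.predict(X_test), target_names=CLASSES))
```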
To get enough of these examples to train their models, Markowitz and Lim are collecting data from a variety of locations, including a Meijer shopping center.
“We needed a mixture of people and cars together, and one great place for that is a parking lot,” said Markowitz.
To better approximate realistic traffic conditions, the group’s radar rig also needed to be moving, simulating the motion of a driving car. Since the duo couldn’t attach their radar mount to a car, Markowitz attached it to a shopping cart and pushed it through the parking lot. Conducting the experiments at Meijer, and at various locations around campus, allowed the group to gather hours of footage of cars and people interacting for their analysis.
In addition to collecting radar data, the group is also working to automate the process of labeling it by synchronizing the radar with a camera and a depth sensor. This will let them use existing, well-developed detection algorithms for those modalities to help label the radar data.
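The synchronization step can be sketched in a few lines: align each radar sweep with the nearest camera frame by timestamp, then carry over what a mature camera-based detector found in that frame as a provisional label for the radar data. This is a minimal sketch; the data structures, field names, and the 50 ms tolerance are all hypothetical, not the team’s actual pipeline.

```python
# Hedged sketch of the auto-labeling idea: match each radar frame to the
# nearest camera frame by timestamp and reuse the camera detector's output
# as a provisional radar label. All names here are illustrative.
import bisect
from dataclasses import dataclass

@dataclass
class CameraFrame:
    t: float          # capture time in seconds
    detections: list  # e.g., [("pedestrian", bbox), ...] from a detector

@dataclass
class RadarFrame:
    t: float
    returns: list     # raw radar detections for this sweep

def label_radar_frames(radar_frames, camera_frames, max_skew=0.05):
    """Attach camera detections to each radar frame whose timestamp is
    within max_skew seconds. camera_frames must be sorted by time."""
    cam_times = [c.t for c in camera_frames]
    labeled = []
    for rf in radar_frames:
        i = bisect.bisect_left(cam_times, rf.t)
        # Consider the nearest camera frame on either side of rf.t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(camera_frames)]
        best = min(candidates, key=lambda j: abs(cam_times[j] - rf.t),
                   default=None)
        if best is not None and abs(cam_times[best] - rf.t) <= max_skew:
            labeled.append((rf, camera_frames[best].detections))
    return labeled
```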
“We humans are mostly familiar only with visual perception, which uses a tiny visible-wavelength slice of the broad electromagnetic spectrum,” said Minh Do, professor of electrical and computer engineering. “Radar significantly broadens the spectrum used for perception, especially given the potential for cross-utilization of technology in the millimeter-wave range of incoming 5G communications.”
This work is supported in part by Texas Instruments and a DSO Postgraduate Scholarship.