By Ron Wilson, Editor-in-Chief, Altera Corporation
You are charging down the road late at night, barely on schedule, crowding your electronic log, your truck heavily loaded. The rain is intensifying. This is too fast, you think, but the car traffic around you isn’t slowing down.
Suddenly, you are snapped awake by an alarm. Instinctively you peer into the dark beyond the headlamp beams—nothing. But your rig has already begun an emergency stop. You glance at the dashboard display, and there, growing larger in the glowing red image, is an overturned van, blocking the lane just around the bend. Your truck stops itself, your headlights now illuminating the frantic efforts of rescuers to free the van’s driver.
Welcome to the world of radar-assisted driving.
Radar will play a crucial role in the evolution of advanced driver-assist systems (ADAS) in coming years. As its role expands, radar transceivers, signal processing, and the hazard-avoidance automation they enable will make vehicle ADAS systems look increasingly like the tactical systems on combat aircraft, influencing the foundations of automotive system design along the way.
Much of the discussion of ADAS has centered on passive vision systems using cameras at visual wavelengths. But Ralf Reuter, radar systems engineer at Freescale Semiconductor, argues persuasively for the role of 77-GHz radar. “One important point is that [of the ADAS sensing technologies] only radar is weather-independent,” Reuter observed in a recent interview. “And while cameras have an advantage in classifying objects, radar is distinctly better at detecting objects and measuring their speeds.” Reuter explained that for these reasons, many early systems that emphasize detection and risk assessment over classification will choose radar. “There is a requirement coming for emergency braking systems for trucks in Europe,” he pointed out. “It is radar based.”
Radar systems may start out with a simple, medium-range system that attempts to look straight down the road. But they will quickly evolve into multiple-sensor systems that attempt both long-range forward-looking and short-range 360-degree threat assessment, as seen in Figure 1.
Figure 1. Radar systems can both search ahead and watch the entire neighborhood of the vehicle.
Image courtesy of Freescale Semiconductor
Even as optical vision systems mature, the advantages of radar will keep it part of the mix. Reuter predicts that the complex, object-classifying, multiple-camera systems of the near future will fuse video with radar data to achieve their dynamic models of the outside world.
Capturing the Signal
Understanding the impact of radar on automotive system design begins with understanding the sensor. Most automotive radar designs favor the 24- or 77-GHz unregulated bands. Transmitters are usually frequency-modulated continuous-wave (FMCW) designs that emit “chirps”: fast ramps in the frequency domain, as illustrated in Figure 2.
Figure 2. Vehicular radar will use a chirping style of FMCW.
Image courtesy of Freescale Semiconductor
“The big advantage of FMCW is the ease of interpreting the reflected signal,” Reuter explained. “You can read the range of an object directly from the reflected frequency, and work out the velocity from the Doppler shift.” CW is also less complex to generate, and easier and more reliable to interpret than pulsed modulation schemes. These are vital considerations for vehicle manufacturers, who imagine every Euro spent on improved ADAS coming directly out of corporate earnings.
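The arithmetic Reuter alludes to can be sketched in a few lines. This is an illustrative calculation only; the chirp bandwidth, sweep time, and beat-frequency values are assumptions for the example, not figures from the article.

```python
# Illustrative FMCW arithmetic: range from the beat frequency, velocity
# from the Doppler shift. Chirp parameters are assumed for the example.

C = 3.0e8  # speed of light, m/s

def fmcw_range(f_beat_hz, chirp_bandwidth_hz, chirp_duration_s):
    """A target at range R delays the echo by 2R/c; with the frequency
    ramping at slope B/T, the TX-RX beat frequency is (B/T) * 2R/c."""
    slope = chirp_bandwidth_hz / chirp_duration_s
    return C * f_beat_hz / (2.0 * slope)

def doppler_velocity(f_doppler_hz, carrier_hz=77e9):
    """Radial velocity from the Doppler shift at a 77-GHz carrier."""
    return C * f_doppler_hz / (2.0 * carrier_hz)

# A 1-MHz beat on a 300-MHz chirp swept over 40 us puts the target at
# 20 m; a 10-kHz Doppler shift means roughly 19.5 m/s of closing speed.
r = fmcw_range(1e6, 300e6, 40e-6)  # 20.0 m
v = doppler_velocity(10e3)         # ~19.5 m/s
```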
Receiving the signal also requires thrift and ingenuity. Collecting the reflected signal in a way that captures azimuth information requires either a mechanical scanning antenna or a phased-array antenna coupled with digital beam-forming algorithms. Behind the antenna there will typically be a homodyne receiver with as many channels as are required for the antenna design—one for a simple rotating antenna, up to 16 for an array.
“The output of a receiver is a baseband signal in the DC-to-20-MHz band,” Reuter said. A system designed to achieve good azimuth resolution may have 8 to 16 channels, requiring 8 to 16 high-speed analog-to-digital converters (ADCs).
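The digital beam-forming applied across those receiver channels can be illustrated with a simple delay-and-sum sketch. A uniform linear array at half-wavelength spacing is assumed here; the article does not specify the antenna geometry.

```python
import numpy as np

# Illustrative delay-and-sum beam-forming. A uniform linear array at
# half-wavelength spacing is assumed, not taken from the article.

def steering_vector(n_elements, spacing_wavelengths, azimuth_deg):
    """Per-element phase weights that steer the array to an azimuth."""
    theta = np.deg2rad(azimuth_deg)
    n = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing_wavelengths * n * np.sin(theta))

def beamform(channels, azimuth_deg, spacing_wavelengths=0.5):
    """channels: (n_elements, n_samples) complex baseband data."""
    w = steering_vector(channels.shape[0], spacing_wavelengths, azimuth_deg)
    return w.conj() @ channels / channels.shape[0]

# A plane wave arriving from 20 degrees sums coherently when the beam
# is steered to 20 degrees and is strongly attenuated off-beam.
sig = steering_vector(8, 0.5, 20.0)[:, None]  # one snapshot, 8 channels
on_beam = float(np.abs(beamform(sig, 20.0))[0])    # close to 1.0
off_beam = float(np.abs(beamform(sig, -40.0))[0])  # well below 1.0
```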
Extracting Information from Chirps
The digital baseband signal from each channel flows into a fast Fourier transform (FFT) block, conducting transforms with lengths up to about 2K samples. “In the past, executing the FFTs has meant lots of FPGAs,” Reuter said. “Today, the trend is toward 32-bit microcontrollers with integrated floating-point DSP accelerators.” A beam-forming system may use two banks of FFTs to extract both range and velocity data from the signal, as seen in Figure 3.
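The two FFT banks of Figure 3 amount to a two-dimensional transform over the baseband samples: one FFT along each chirp resolves range, and a second across successive chirps resolves Doppler. A minimal NumPy sketch, with array sizes chosen purely for illustration:

```python
import numpy as np

def range_doppler_map(cube):
    """cube: complex baseband samples, shape (n_chirps, n_samples).

    The first FFT (fast time, within a chirp) resolves range; the
    second FFT (slow time, across chirps) resolves Doppler/velocity.
    """
    range_fft = np.fft.fft(cube, axis=1)
    doppler_fft = np.fft.fft(range_fft, axis=0)
    return np.abs(np.fft.fftshift(doppler_fft, axes=0))

# Synthetic single target: beat frequency in range bin 40, Doppler in
# bin 8. After fftshift, the Doppler peak lands at 8 + n_chirps // 2.
n_chirps, n_samples = 64, 256
t = np.arange(n_samples) / n_samples
k = np.arange(n_chirps) / n_chirps
cube = np.exp(2j * np.pi * (40 * t[None, :] + 8 * k[:, None]))
rd = range_doppler_map(cube)
peak = np.unravel_index(np.argmax(rd), rd.shape)  # (40, 40)
```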
Figure 3. An FFT configuration for both beam-forming and range-and-velocity estimation.
Image courtesy of Freescale Semiconductor
Essentially, this front-end processing reduces the incoming FMCW analog channels, possibly several of them, to a single digital stream of azimuth/range/velocity tuples. This data stream flows into one or more CPU cores where software, possibly supported by additional accelerators, attempts to infer the presence, location, and nature of objects surrounding the vehicle.
“You want to identify targets, separate them from the background, and choose the most critical ones,” Reuter explained. “You may be dealing with up to 200 objects, so the computation—especially for extracting angle information—can become very complex.”
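The triage Reuter describes, separating targets and choosing the most critical ones, might in its simplest form rank detections by time-to-collision. The following is a toy sketch, not production ADAS logic; the object list, tuple layout, and sign convention are assumptions for illustration.

```python
# Toy triage: rank detected objects, represented here as
# (azimuth_deg, range_m, radial_velocity_mps) tuples, by
# time-to-collision so the most critical of up to ~200 objects come
# first. Closing targets carry negative radial velocity in this convention.

def time_to_collision(range_m, radial_velocity_mps):
    """Seconds until impact for a closing target, infinity otherwise."""
    if radial_velocity_mps >= 0:  # receding or stationary
        return float("inf")
    return range_m / -radial_velocity_mps

def rank_threats(objects):
    return sorted(objects, key=lambda o: time_to_collision(o[1], o[2]))

objects = [
    (0.0, 120.0, -30.0),  # obstacle dead ahead, closing at 30 m/s: TTC 4 s
    (15.0, 40.0, 5.0),    # vehicle pulling away: no threat
    (-5.0, 25.0, -10.0),  # closer in the next lane: TTC 2.5 s
]
ranked = rank_threats(objects)  # the 25-m object ranks first
```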
The physical configuration of the system changes with growing complexity. Today, Reuter said, there is minimal processing in the sensor itself. Instead, there is a proprietary analog interface to the ADCs, a proprietary digital interface to the signal-processing hardware responsible for the FFTs, and yet another to the microcontroller where the objects are extracted and classified. Information about the objects then goes onto the vehicle’s controller area network (CAN) or FlexRay bus, headed to the central CPU cluster for interpretation and analysis.
This entire pipeline has challenging bandwidth and latency requirements. Reuter said that the CPU’s interpretation of the data will usually be presented to the driver as a graphic display, often a heads-up display that the driver must blend with what she is seeing through the windshield. That blending mandates at most a 50-ms update interval and—perhaps more challenging—a 50-ms maximum latency. Otherwise the graphic image will be distractingly jerky, or will visibly lag behind the view through the windshield, potentially causing the driver to err.
As the system evolves from one sensor to beam-forming with multiple sensors to fusion with camera data, the interconnect architecture changes as well. “There is a desire to use Ethernet to save money,” Reuter said. But the system still has to operate in real time, raising issues about how to ensure real-time deadlines over Ethernet.
A Future Full of Interference
The case for vehicular radar is very solid, as long as no one else is using it. But therein lies an obvious problem: As the number of vehicles using radar increases, the units will inevitably begin to interfere with each other. Reuter said that you can change the modulation speed to reduce interference or, ultimately, replace the simple frequency ramp with a coded frequency-hopping pattern, so that each vehicle can recognize its own chirp. Such changes would preserve the architecture and most of the hardware of earlier systems, but would complicate extraction and analysis by requiring each system to identify reflections of signals originating only from its own antennas.
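The coded-chirp idea can be sketched as a per-vehicle pseudo-random hopping sequence: a receiver accepts only reflections that follow its own code. The slot and channel counts below are illustrative assumptions, not parameters from any standard.

```python
import random

# Sketch of the coded-chirp countermeasure: a per-vehicle pseudo-random
# channel-hopping pattern, seeded by a vehicle identifier. Slot and
# channel counts are illustrative assumptions.

def hop_sequence(vehicle_id, n_slots=64, n_channels=16):
    """Deterministic pseudo-random hop pattern for one vehicle."""
    rng = random.Random(vehicle_id)
    return [rng.randrange(n_channels) for _ in range(n_slots)]

def matches_own_code(observed, own_sequence, tolerance=0):
    """Accept a reflection only if it follows our own hop pattern."""
    mismatches = sum(a != b for a, b in zip(observed, own_sequence))
    return mismatches <= tolerance

own = hop_sequence(vehicle_id=1001)
other = hop_sequence(vehicle_id=2002)
# our own echo matches; another vehicle's pattern almost surely does not
```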
In addition, there is a class of problems less easily solved: interference by or with fixed installations. “One immediate problem is that tunnels in Europe use high-powered radar to recognize traffic,” Reuter warned. “Their transmitters will blind the vehicle radars.”
A more scientifically sensitive issue is that important parts of the emission spectra of astronomical objects lie in the 77-GHz band. The incessant twitter of vehicular radars could end radio astronomy anywhere near populated areas.
“There is already legislation in Japan that requires turning off transmitters within 1,000 km of a radio telescope,” Reuter remarked. “That would cover pretty much the entire country.”
Chirping modulators, digital beam-forming, target identification, threat assessment, defensive frequency hopping, jamming—it all sounds like a scenario for a new combat aircraft, not for a lorry or a Lexus. And, in fact, ADAS will inherit a lot of technology from military systems. Not surprising: when the weather closes in and traffic gets dense, it’s a war out there.