Automated driving made easy


Camera, radar, lidar: a deep dive into the technology.

This article was originally published in the Daimler blog.

A lot of information is already available about automated driving — concerning greater safety in road traffic, leisure time in your car, and new opportunities that don’t require a driving license. But how many people know the technical details that make automated driving possible in the first place?

7 min reading time

by Laura Bürkle, Editor
published on August 20, 2018

My name is Laura Bürkle. I’m studying for my bachelor’s degree, and I write for the Daimler blog. I’m an absolute layperson when it comes to automobile technology, but I recently made a short “deep dive” into this technology. At the testing center in Immendingen, researchers are closer to the automobile of the future than anywhere else. This is where today’s unimaginable possibilities no longer seem quite so impossible. That’s because automated driving is making a leap even bigger than one from continent to continent: the leap from test operation to readiness for series production.

Cooperation between Bosch and Daimler

Automated driving is the greatest challenge the automotive industry has faced so far. The aim is to enable the car of the future to “think” in roughly the same way as the human brain — and that’s a fairly complex process. The car has to understand how road traffic works, and it has to operate correctly and independently at all times.

You could compare it with the training of a soccer player. First he learns the various techniques that he must later apply correctly during a game and, ideally, use to score a goal. Unfortunately, the German national team didn’t do that very often in the recent World Cup. But if the automobile of the future makes a mistake, it won’t simply have another chance in four years. It must always flawlessly implement what it has learned and guarantee absolute safety at all times.

Daimler and Bosch are currently tackling this challenge. For decades, they have had the automotive expertise that enables them to reliably develop innovations to the point of series production and market launch. Daimler provides the necessary prototypes and testing facilities, while Bosch is responsible for developing the components, such as sensors and control units.

In this collaboration, employees from Daimler and Bosch sit together at a desk and apply all of their knowledge to advance the future of mobility. Within a few years the combination of three types of sensor — cameras, radar, and lidar — should enable cars to (almost) dispense with the driver and make travelling by car a leisure activity.

Of red pedestrians and blue cars

The camera is one of the three important components that are essential to automated driving. It constitutes the “eyes” of the car. In order to really understand the function and the importance of the camera, I’ve been privileged to sit in a test vehicle, which is actually more technology than car.

Markus Braun, a specialist for pattern recognition/camera and driver of the test vehicle, explains to me that the camera system mainly focuses on the more vulnerable road users: pedestrians, bike riders, and motorcyclists. Of course the self-driving car must also keep an eye on all the other objects, such as cars, streetlights, and traffic lights.

In order to train the system as effectively as possible, it was tested in the road traffic of two different countries, Germany and Italy. The images captured there serve as the data from which the car’s neural network learns. This is a typical application of artificial intelligence.
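The idea of learning from labeled images can be illustrated with a toy sketch. This is not Daimler’s or Bosch’s actual system (which uses deep neural networks on camera images); here each “image” is reduced to a tiny invented feature vector, and the model simply learns one prototype per class from labeled examples:

```python
# Toy illustration of supervised learning from labeled data: the model
# learns one prototype (mean feature vector) per class, then assigns a
# new sample to the nearest prototype. Real systems use deep neural
# networks; the features and labels below are invented.

from statistics import mean

def train(samples):
    """samples: list of (feature_vector, label) pairs."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    # One prototype per class: the element-wise mean of its examples.
    return {label: [mean(col) for col in zip(*vecs)]
            for label, vecs in by_label.items()}

def classify(prototypes, features):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda lab: dist(prototypes[lab], features))

training_data = [
    ([0.9, 0.2], "car"), ([0.8, 0.3], "car"),
    ([0.1, 0.9], "pedestrian"), ([0.2, 0.8], "pedestrian"),
]
model = train(training_data)
print(classify(model, [0.85, 0.25]))  # → car
print(classify(model, [0.15, 0.85]))  # → pedestrian
```

The more labeled examples the system sees, from as many countries and conditions as possible, the better the learned prototypes generalize to new scenes.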

On the display, all of the objects moving in front of us have been assigned different colors. The cars are blue, and the pedestrians are red. Almost 20 images are projected per second, a few fewer than the 25 images per second perceived by the human eye.
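The display overlay described here amounts to a mapping from detected class to color, plus a per-frame time budget. A minimal sketch, assuming the article’s colors (blue cars, red pedestrians); the fallback color is invented:

```python
# Sketch of the display overlay: each detected object class is drawn
# in its own color. Cars blue and pedestrians red come from the
# article; "gray" for unknown classes is invented for illustration.

CLASS_COLORS = {
    "car": "blue",
    "pedestrian": "red",
}

FRAME_RATE_HZ = 20                       # almost 20 images per second
FRAME_BUDGET_MS = 1000 / FRAME_RATE_HZ   # time to process each frame

def color_for(detection_class):
    return CLASS_COLORS.get(detection_class, "gray")

print(color_for("pedestrian"))  # → red
print(FRAME_BUDGET_MS)          # → 50.0
```

At 20 frames per second, the whole pipeline of detection, classification, and drawing has only 50 milliseconds per image.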

The predictive camera

The vehicle has learned quite a lot in recent years. Its camera can now detect people behind parked cars, trash cans, or streetlights and distinguish individual pedestrians from one another. A few years ago, the camera could only perceive groups of people. This means that it’s worthwhile to train the system. The algorithm is constantly improving itself, in the same way that a soccer player improves if he gets the right training and thus develops better game tactics.

But it gets even better: Today the car, with the help of the camera, can even anticipate what a pedestrian intends to do next. By registering the pedestrian’s posture and head movements, the car can calculate in which direction he or she will move and whether he or she will soon cross the street. It’s quite a challenge to determine whether a pedestrian will stay put or take advantage of the last opportunity to cross the street.
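How such an intention estimate might work can be hinted at with a much-simplified sketch. The real system infers intent from posture and head movement in camera images; here both inputs, whether the head is turned toward the road and the walking speed toward the curb, as well as the threshold, are invented for illustration:

```python
# Crude pedestrian-intention heuristic (invented, for illustration
# only): a pedestrian who is looking at the road and moving toward
# the curb faster than 0.5 m/s is predicted to cross the street.

def crossing_likely(head_toward_road: bool, speed_toward_curb: float) -> bool:
    return head_toward_road and speed_toward_curb > 0.5

print(crossing_likely(True, 1.2))   # → True
print(crossing_likely(True, 0.1))   # → False
print(crossing_likely(False, 1.2))  # → False
```

A production system would of course weigh many more cues and output a probability rather than a yes/no answer, but the principle is the same: observable body language feeds a prediction of what the pedestrian will do next.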

Nonetheless, our systems can already precisely identify pedestrians and analyze their intentions at a distance of up to 50 meters. In some cases they can even do so at greater distances — and this performance does not decrease in conditions of poor visibility due to rain or darkness.

If the camera is still prevented from conducting its analysis — for example, by glare from an oncoming car — it is supported by a second type of sensor: radar.

Radar’s unobstructed view

For automated driving, radar is just as important as the camera. It’s located in the car’s bumpers, and it emits electromagnetic waves that supply information about the objects in front of it. Today the test vehicles are equipped with as many as eight radar sensors in order to cover the car’s entire surroundings. What human driver can claim to have eight eyes?

Radar can distinguish between static and dynamic objects — in other words, moving vehicles and parked vehicles. Today it can also precisely register the shape of objects. Because radar can “see through” objects such as cars by looking under them, the system can estimate the length and width of cars driving ahead and quickly detect people behind cars. Our human eyesight can’t compete with this kind of “vision.”
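The static-versus-dynamic distinction rests on a simple physical idea: radar measures each object’s radial speed relative to the car (via the Doppler shift), and once the car’s own motion is subtracted, a parked object’s speed over the ground comes out near zero. A sketch of that idea, with an invented tolerance:

```python
# Static vs. dynamic classification from radar radial speed.
# radial_speed_rel: measured speed along the line of sight, relative
# to the moving car (negative = approaching). Adding back the ego
# car's motion component along that line of sight gives the object's
# radial speed over the ground. The 0.3 m/s tolerance is invented.

import math

def ground_speed(radial_speed_rel, ego_speed, bearing_deg):
    return radial_speed_rel + ego_speed * math.cos(math.radians(bearing_deg))

def is_static(radial_speed_rel, ego_speed, bearing_deg, tol=0.3):
    return abs(ground_speed(radial_speed_rel, ego_speed, bearing_deg)) < tol

# Driving at 10 m/s: a parked car straight ahead closes at -10 m/s.
print(is_static(-10.0, 10.0, 0.0))  # → True
# A vehicle ahead pulling away at +5 m/s relative is dynamic.
print(is_static(5.0, 10.0, 0.0))    # → False
```

This is why radar, unlike a camera, can tell a parked car from a briefly stopped one purely from measured speeds, without needing to recognize what the object is.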

I can hardly imagine how such a tiny component can do something so important and thus make the driving of the future so much safer.

On the road to accident-free driving with lidar

To perceive the car’s surroundings even better and plan further ahead, the third type of sensor, lidar, measures all distances, as well as the reflectivity of objects, with centimeter precision. Instead of radio waves, lidar uses laser pulses, which have a shorter range than radar but produce more precise images.

Various scanners are used to measure several different distances. Because this process uses infrared light, it cannot be perceived by the human eye. There’s only one problem: Lidar cannot perceive colors. Nonetheless, the approximately 10,000 measurements it makes per second give the automated vehicle precise information. As a result, lidar plays an important role on the road to accident-free driving.
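The distance measurement itself is a time-of-flight calculation: a laser pulse travels to the object and back, so the range is the speed of light times the elapsed time, divided by two. A worked example (the 50-meter figure is chosen only for illustration):

```python
# Lidar time-of-flight ranging: range = (speed of light x round-trip
# time) / 2. A pulse returning after roughly 334 nanoseconds
# corresponds to an object about 50 m away.

C = 299_792_458  # speed of light in m/s

def range_m(round_trip_s):
    return C * round_trip_s / 2

print(round(range_m(333.6e-9), 1))  # ≈ 50.0 m
```

Because light travels about 30 centimeters per nanosecond, centimeter precision requires timing the pulse to well under a nanosecond, which gives a sense of how finely tuned these sensors are.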

Staying reliably in its lane

In another test vehicle, I’m allowed to find out whether a drive in an automated car on a bumpy road with sharp curves is always accident-free. In the future, detection by sensors will enable vehicles to safely remain in their lane and stay on the road in spite of wind or other hindrances.

But for now, the task of the vehicle in front of me is to stay in its lane on the test course it has already driven — without the driver’s help, of course. The test drivers Kuhn and Socher explain to me that, thanks to GPS control, the timestamps and positions along the test course could be precisely calculated so that the car can drive the course precisely and independently the second time around. Then, off we go!
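The replay idea the test drivers describe can be sketched as follows: the first, manually driven lap records timestamped positions, and on the automated lap the car looks up where it should be at each moment by interpolating between the recorded points. The coordinates and the simple linear interpolation are assumptions for illustration:

```python
# Replaying a recorded test lap: given a log of (timestamp, position)
# pairs from the first drive, compute the target position at any time
# by linear interpolation between neighboring waypoints. The lap data
# below is invented.

def target_position(log, t):
    """log: list of (timestamp, (x, y)) pairs sorted by timestamp."""
    for (t0, p0), (t1, p1) in zip(log, log[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (p0[0] + f * (p1[0] - p0[0]),
                    p0[1] + f * (p1[1] - p0[1]))
    return log[-1][1]  # past the end of the recording

lap = [(0.0, (0.0, 0.0)), (2.0, (10.0, 0.0)), (4.0, (10.0, 8.0))]
print(target_position(lap, 1.0))  # → (5.0, 0.0)
print(target_position(lap, 3.0))  # → (10.0, 4.0)
```

The real vehicle additionally fuses GPS with its other sensors to correct any drift, so that the second lap matches the first down to the centimeter.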

Via a laptop installed in its interior, the car receives the command to start up, and it then drives along the designated course with absolute precision. It drives fairly fast, and I manage to prevent my colleague’s single-lens reflex camera from falling just in time.

The test driver’s hands are near the steering wheel so that he can intervene in case of an emergency — but this isn’t necessary. In spite of the bumpy road surface, the car stays securely in its lane without even grazing the traffic cones that have been set up in a compact row.

Two are better than one

Apropos safety, the car of the future has not only a central control unit but also two steering and braking systems that are regulated by independent control units. This is a fundamental requirement for automated driving, because if one steering or braking system fails the other one can take over.
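The failover principle is simple to state in code. This is only a conceptual sketch, and the health-check interface and the emergency-stop fallback are invented; the point is that a single fault never leaves the car without steering or brakes:

```python
# Redundancy sketch: two independent steering/braking channels with
# their own control units. If the primary reports a fault, the
# secondary takes over; if both fail, the only safe option left is
# to bring the car to a controlled stop.

def active_channel(primary_ok: bool, secondary_ok: bool) -> str:
    if primary_ok:
        return "primary"
    if secondary_ok:
        return "secondary"
    return "emergency-stop"

print(active_channel(True, True))   # → primary
print(active_channel(False, True))  # → secondary
```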

The combination of the three types of sensors I’ve described enables the automated vehicle to optimally analyze its surroundings. The different positions of the camera, radar, and lidar give the car various perspectives, which are combined into a comprehensive picture.
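The combining step, often called sensor fusion, can be sketched naively: each sensor contributes what it measures best (object classes from the camera, speeds from radar, precise ranges from lidar), and detections that fall close together are merged into one object. All field names, positions, and the matching threshold below are invented for illustration:

```python
# Naive sensor-fusion sketch: merge camera, radar, and lidar
# detections that lie within max_gap meters of each other into one
# object that carries each sensor's best measurement.

def fuse(camera, radar, lidar, max_gap=1.0):
    def near(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= max_gap
    fused = []
    for cam in camera:
        obj = {"class": cam["class"], "pos": cam["pos"]}
        for r in radar:
            if near(cam["pos"], r["pos"]):
                obj["speed"] = r["speed"]    # radar: precise speed
        for l in lidar:
            if near(cam["pos"], l["pos"]):
                obj["range"] = l["range"]    # lidar: precise distance
        fused.append(obj)
    return fused

camera = [{"class": "pedestrian", "pos": (4.0, 1.0)}]
radar = [{"pos": (4.2, 1.1), "speed": 1.4}]
lidar = [{"pos": (4.05, 0.95), "range": 4.11}]
print(fuse(camera, radar, lidar))
```

Real fusion systems use probabilistic tracking rather than a fixed distance threshold, but the outcome is the same: one object list in which every entry is confirmed and measured by more than one sensor.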

I’m impressed, and also completely astonished by all the technology. At Daimler we’re lucky to have such outstanding developers, who have securely installed all of this information and huge volumes of data in the car. I climb into my borrowed Mercedes-Benz, and as I drive back to my workplace in Untertürkheim I wish that my car were already capable of automated driving…

Laura Bürkle

Laura Bürkle is studying for her bachelor’s degree and wrote for the Daimler Blog. She always enjoys covering new and exciting stories for the blog.
