Why Audi’s zFAS is a blueprint for next-gen domain architectures

March 16, 2016 // By Christoph Hammerschmidt
In Audi’s future car generations, a central computer takes over many, if not all, of the tasks associated with the car’s diverse driver assistance systems. The zFAS (central driver assistance controller) greatly reduces the number of electronic control units – a concept that will set a precedent for other carmakers and for future domain architectures.

The zFAS, scheduled to enter series production in the next generation of the A8 top-class sedan due at the end of 2017, unites multiple computing tasks on one powerful main board. Its most demanding role is sensor fusion: the signals from multiple sensors such as stereo cameras, radar, multi-axis acceleration sensors and, where fitted, lidar are merged into a 360-degree digital environmental model. This model is in turn used by all the driver assistance systems, including those responsible for autonomous driving, to compute their respective actions. “We need the zFAS for piloted driving in series,” an Audi spokesperson acknowledges.
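Audi does not disclose the fusion algorithms running on the zFAS. Purely to illustrate the principle, the following Python sketch merges per-sensor object hypotheses into one shared environment model; all names (Detection, EnvironmentModel, fuse) are hypothetical, and the gating logic is deliberately naive – real systems use multi-object tracking filters rather than a simple distance gate.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Detection:
    """One object hypothesis from a single sensor (position in the vehicle frame)."""
    sensor: str          # e.g. "stereo_camera", "radar", "lidar"
    x_m: float           # longitudinal distance, metres
    y_m: float           # lateral offset, metres
    confidence: float    # 0..1

@dataclass
class EnvironmentModel:
    """Fused 360-degree picture shared by all assistance functions."""
    objects: List[Detection] = field(default_factory=list)

def fuse(detections: List[Detection], gate_m: float = 1.0) -> EnvironmentModel:
    """Naive fusion: detections from different sensors that fall within
    gate_m of each other are treated as the same object; the most
    confident one is kept."""
    model = EnvironmentModel()
    for det in sorted(detections, key=lambda d: -d.confidence):
        duplicate = any(abs(det.x_m - o.x_m) < gate_m and abs(det.y_m - o.y_m) < gate_m
                        for o in model.objects)
        if not duplicate:
            model.objects.append(det)
    return model

# Example: camera and radar see the same vehicle ahead; lidar adds a second object.
readings = [
    Detection("stereo_camera", 42.0, 0.2, 0.9),
    Detection("radar",         41.6, 0.1, 0.8),   # same physical object as above
    Detection("lidar",         15.0, -3.5, 0.7),
]
print(fuse(readings))
```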


Inside the zFAS, therefore, multiple microprocessors and microcontrollers share the workload. In essence, an application processor handles the compute-intensive image processing and low-level data fusion tasks, while the host processor is responsible for the safety-critical aspects such as object fusion, decision making and vehicle communication, as shown in the block diagram.
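The article describes this division of labour only at block-diagram level. A minimal, hypothetical Python sketch of the split might look as follows; application_stage and host_stage merely stand in for the two processor domains and bear no relation to Audi’s actual software:

```python
from typing import List, Dict

# Application processor side (hypothetical): heavy image processing and low-level fusion.
def application_stage(camera_pixels: List[List[int]], radar_ranges: List[float]) -> List[Dict]:
    """Turn raw sensor inputs into a flat list of measurements for the host processor."""
    measurements = []
    for row, scanline in enumerate(camera_pixels):
        if max(scanline) > 200:                       # toy "object detected" rule
            measurements.append({"source": "camera", "row": row})
    measurements += [{"source": "radar", "range_m": r} for r in radar_ranges]
    return measurements

# Host processor side (hypothetical): safety-critical object fusion and decision making.
def host_stage(measurements: List[Dict]) -> str:
    """Fuse measurements into a decision that would go out on the vehicle bus."""
    nearest = min((m["range_m"] for m in measurements if m["source"] == "radar"),
                  default=float("inf"))
    return "BRAKE" if nearest < 30.0 else "KEEP_LANE"

print(host_stage(application_stage([[10, 250, 40]], [22.5, 80.0])))  # -> BRAKE
```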


Next-gen semi-automated driving

Fig. 1: More computing power than all ECUs in today’s cars combined: Generic view of a domain controller for Driver Assistance Systems that make use of sensor data fusion. Source: Infineon

It is known that Audi has a special relationship with Nvidia and therefore uses processors such as the Tegra K1 for most tasks related to graphics computing. But because the various sensors in the vehicle – front cameras, surround cameras, radar and so on – generate such a huge amount of data, the carmaker’s design engineers have chosen to split the sensor processing across two devices: a Tegra K1 and Mobileye’s EyeQ3 SoC. The Tegra is dedicated to processing the data from the four surround cameras, which are used to assist the driver during parking. In the version of the zFAS that will enter series production, the more time-critical data from the stereo front camera are fed to the EyeQ3, which also handles the data from the driver monitoring camera, another requirement for piloted driving.
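Expressed as a simple routing table – again purely illustrative, with made-up identifiers rather than anything from Audi’s code base – the camera-to-SoC assignment described above looks like this:

```python
# Hypothetical routing table reflecting the split described in the article.
SENSOR_TO_PROCESSOR = {
    "surround_camera_front":    "Tegra K1",   # four surround cameras: parking views
    "surround_camera_rear":     "Tegra K1",
    "surround_camera_left":     "Tegra K1",
    "surround_camera_right":    "Tegra K1",
    "stereo_front_camera":      "EyeQ3",      # time-critical forward view
    "driver_monitoring_camera": "EyeQ3",      # required for piloted driving
}

def dispatch(frame_source: str) -> str:
    """Return which SoC a frame from the given camera should be routed to."""
    try:
        return SENSOR_TO_PROCESSOR[frame_source]
    except KeyError:
        raise ValueError(f"no processor assigned to sensor '{frame_source}'")

print(dispatch("stereo_front_camera"))  # -> EyeQ3
```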