System analyses in-car activities for next-gen ADAS

August 03, 2016 // By Christoph Hammerschmidt
Today’s cars are able to analyze their environment in great detail. The interior, however, is largely left out. A novel system detects the number, size and position of the occupants and recognizes what they are doing and where they are directing their attention. It could thus enable a new class of driver assistance systems, in particular those related to automated driving.

Eyes on the road, always ready to (re)act – this is how car drivers sit at the wheel today. In the future this might change: in cars that steer and brake autonomously, the driver can lean back and text or turn around to the kids on the back seat. The car itself could also offer enhanced infotainment options, opening up new ways for the driver to spend his time. However, while numerous sensors are available to analyze the car’s surroundings, comparable systems for the vehicle’s interior are missing.

Within the project Intelligent Car Interieur (InCarIN), researchers from the Fraunhofer Institutes IOSB in Karlsruhe and IAO in Stuttgart, along with developers from Volkswagen, Bosch and Visteon, have developed a system that analyses the interior. “Our sensors scan the entire interior,” explains IOSB group manager Michael Voit. “Through depth cameras we can identify the number of occupants along with their size and their sitting posture. From this information we deduce their activities.”
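
The article does not describe the processing pipeline in detail, but the basic idea – reading seat occupancy and a rough occupant size from a depth image – can be sketched as follows. The seat regions, camera distances and thresholds below are hypothetical illustrations, not values from the InCarIN system.

```python
# A minimal sketch of seat-occupancy detection from a single depth frame.
# Seat regions, distances and thresholds are invented for illustration.
import numpy as np

# Hypothetical regions of interest (row/column slices) per seat in the depth image.
SEAT_ROIS = {
    "driver":     (slice(100, 300), slice(50, 200)),
    "front_pass": (slice(100, 300), slice(250, 400)),
    "rear_left":  (slice(320, 470), slice(50, 200)),
    "rear_right": (slice(320, 470), slice(250, 400)),
}

EMPTY_SEAT_DEPTH_M = 1.40   # assumed distance from camera to an empty seat back
OCCUPIED_MARGIN_M = 0.25    # a seat counts as occupied if enough pixels are this much closer

def analyse_occupancy(depth_m: np.ndarray) -> dict:
    """Return per-seat occupancy and a crude size estimate from a depth frame (metres)."""
    result = {}
    for seat, roi in SEAT_ROIS.items():
        patch = depth_m[roi]
        closer = patch < (EMPTY_SEAT_DEPTH_M - OCCUPIED_MARGIN_M)
        occupied = closer.mean() > 0.2          # at least 20% of the region is filled
        # Very rough size proxy: how much closer the nearest body surface is than the seat.
        size_proxy_m = float(EMPTY_SEAT_DEPTH_M - patch.min()) if occupied else 0.0
        result[seat] = {"occupied": bool(occupied), "size_proxy_m": round(size_proxy_m, 2)}
    return result

if __name__ == "__main__":
    # Synthetic frame: empty cabin at 1.4 m, plus a "person" 0.5 m closer on the driver seat.
    frame = np.full((480, 640), EMPTY_SEAT_DEPTH_M)
    frame[120:280, 80:180] = 0.9
    print(analyse_occupancy(frame))
```

A real system would of course work on calibrated 3D point clouds and track posture over time; the sketch only shows the kind of per-seat information the researchers describe.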

The project’s long-term goal is the development of new driver assistance systems, which could become particularly relevant for partially automated driving. If, for example, the driver turns his head towards the children on the back seat, the system could instantly display a camera image of the rear bench and what is happening there. The driver could thus immediately return his eyes to the road and still watch the children’s activities. “Through its sensors the system can estimate how long the driver will need to fully resume control of the car after a phase of automated driving,” explains Frederik Diederichs, project manager at Fraunhofer IAO.
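
How such a takeover-time estimate might be assembled from the observed driver state is not specified in the article; the sketch below shows one plausible way. The activity labels and time penalties are invented for illustration and are not the project’s model.

```python
# A minimal sketch of deriving a takeover-time estimate from observed driver state.
# Labels and penalty values are hypothetical.
from dataclasses import dataclass

# Hypothetical extra seconds needed to regain full control, per observed activity.
ACTIVITY_PENALTY_S = {"eyes_on_road": 0.0, "texting": 4.0, "turned_to_rear": 5.0, "reading": 6.0}

@dataclass
class DriverState:
    activity: str            # label produced by the interior-sensing pipeline
    hands_on_wheel: bool
    seat_rotated: bool       # e.g. turned towards the rear bench

def estimate_takeover_time_s(state: DriverState, base_s: float = 2.0) -> float:
    """Estimate seconds until the driver can fully resume control."""
    t = base_s + ACTIVITY_PENALTY_S.get(state.activity, 6.0)
    if not state.hands_on_wheel:
        t += 1.5
    if state.seat_rotated:
        t += 3.0
    return t

print(estimate_takeover_time_s(DriverState("turned_to_rear", hands_on_wheel=False, seat_rotated=True)))
# -> 11.5 seconds (hypothetical)
```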

Based on information about the exact position and size of the rear seat passengers, it would also be possible to adapt airbag deployment to the individual’s size. By analyzing a person’s body position, the system could also adapt to specific situations: if, for instance, the front seat passenger has put his legs on the dashboard, it could prevent the airbag from deploying at full force should a crash occur in that situation.
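
As an illustration of how posture information might feed into an airbag decision, the following sketch maps a few hypothetical interior-sensing outputs to deployment levels. The labels, thresholds and levels are assumptions for the sake of the example, not part of the InCarIN project.

```python
# A minimal sketch of posture-aware airbag adaptation; all logic is illustrative.
from enum import Enum

class Deployment(Enum):
    FULL = "full"
    REDUCED = "reduced"
    SUPPRESSED = "suppressed"

def select_airbag_mode(occupant_height_m: float, legs_on_dashboard: bool,
                       leaning_into_airbag_zone: bool) -> Deployment:
    """Pick a deployment level from interior-sensing estimates of size and posture."""
    if legs_on_dashboard or leaning_into_airbag_zone:
        # Out-of-position occupant: a full deployment could cause injury.
        return Deployment.REDUCED
    if occupant_height_m < 1.20:
        # Small occupant (e.g. a child): suppress deployment.
        return Deployment.SUPPRESSED
    return Deployment.FULL

print(select_airbag_mode(1.75, legs_on_dashboard=True, leaning_into_airbag_zone=False))
# -> Deployment.REDUCED (hypothetical)
```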