SCIENTIFIC PUBLICATION #4 | DriverGaze360: OmniDirectional Driver Attention with Object-Level Guidance

A new scientific contribution from DFKI, the paper “DriverGaze360: OmniDirectional Driver Attention with Object‑Level Guidance,” is currently available as a preprint and has been accepted for presentation at CVPR 2026.

This work introduces DriverGaze360, the first large‑scale 360° driver‑attention dataset, featuring around 1 million gaze‑labelled frames collected from 19 drivers in panoramic simulation scenarios. The paper also proposes DriverGaze360‑Net, a model capable of predicting both attention maps and attended objects across the full 360° field of view.

Explore the dataset and project materials here.

What Makes This Study Stand Out

Driver attention research has historically been limited to narrow forward‑facing datasets, leaving out critical behaviours such as attention shifts during lane changes, turns, and peripheral interactions. DriverGaze360 addresses this gap by enabling the study of gaze patterns across the entire driving scene, offering a far more realistic representation of how drivers allocate visual attention.

Key Achievements

  • A comprehensive 360° dataset, capturing gaze behaviour in a wide range of driving contexts
  • A panoramic model that jointly learns attention and object‑level information, enabled by an auxiliary segmentation head
  • State‑of‑the‑art performance on panoramic attention‑prediction tasks across multiple metrics

Why This Matters for BERTHA

DriverGaze360 is especially relevant for BERTHA because its dataset and modelling approach were used to build the project’s Perception Module in WP1, supporting the estimation of real‑world driver attention across complex, multi‑directional scenarios. Its scale and 360° scope enable the training of stronger AI perception models—an essential foundation for human‑centred automated‑driving behaviours.

Relevance for the EU and Future Systems

This research contributes valuable insights for:

  • Driver‑safety analysis and understanding of real attention patterns
  • Infrastructure planning in mixed human–AI road environments
  • The development of ADAS and automated‑driving systems that anticipate driver behaviour more effectively

Read the article here.

Acknowledgment: Research conducted under the BERTHA project (GA101076360), funded by the European Union. Views expressed are those of the authors and do not necessarily reflect those of the EU or CINEA.
