Introduction

Recent VR/AR applications require a way to evaluate the Quality of Experience (QoE), which can be described in terms of comfort, acceptability, realism, and ease of use. Assessing these different dimensions requires taking the user’s senses into account and, for audiovisual content, vision in particular. Understanding how users watch a 360° image, how they scan the content, and where and when they look is therefore necessary to develop appropriate rendering devices and to create good content for consumers. The goal of this database is to help answer these questions and to support the design of good visual attention models.

The dataset contains 60 omnidirectional (360°) images, along with eye-tracking data collected from 48 observers and provided as scan-paths and saliency maps. The data were produced by Technicolor and the University of Nantes (LS2N laboratory) in 2017 and have been described in several publications. A detailed description of the database and benchmark can be found on our Data Description page. The license conditions are given on the Download page.
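
As an illustration, here is a minimal Python sketch of how the two data modalities might be loaded and inspected. The file names, the raw saliency-map layout, and the scan-path column order are assumptions made for the example; the authoritative file formats are given on the Data Description page.

```python
import numpy as np

# Hypothetical file names and layouts -- consult the Data Description
# page for the actual formats shipped with the database.

# The saliency map is assumed here to be a raw float32 buffer stored in
# equirectangular projection (height x width), e.g. 1024 x 2048.
HEIGHT, WIDTH = 1024, 2048  # assumed resolution
saliency = np.fromfile("P1_saliency.bin", dtype=np.float32)
saliency = saliency.reshape(HEIGHT, WIDTH)
saliency /= saliency.sum()  # normalize to a probability distribution

# Scan-paths are assumed to be CSV-like fixation records:
# observer id, longitude (deg), latitude (deg), start time, duration.
scanpath = np.loadtxt("P1_scanpath.txt", delimiter=",", skiprows=1)
longitudes, latitudes = scanpath[:, 1], scanpath[:, 2]

print(f"Saliency map shape: {saliency.shape}, peak at "
      f"{np.unravel_index(saliency.argmax(), saliency.shape)}")
print(f"{len(scanpath)} fixations loaded")
```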

Acknowledgments

We would like to thank the observers who participated in this data collection.
The creation of this benchmark has also been supported, in part, by PROVISION, a Marie Curie ITN funded by the European Commission.

Citing the Database

Please cite the following papers in any publication making use of the Salient360! database:

Yashas Rai, Patrick Le Callet, and Philippe Guillotel. “Which saliency weighting for omni directional image quality assessment?”. In Proceedings of the IEEE Ninth International Conference on Quality of Multimedia Experience (QoMEX’17), Erfurt, Germany, pp. 1–6, June 2017.

Yashas Rai, Jesús Gutiérrez, and Patrick Le Callet. “A Dataset of Head and Eye Movements for 360 Degree Images”. In Proceedings of the 8th ACM Multimedia Systems Conference (MMSys’17), ACM, New York, NY, USA, pp. 205–210, June 2017.

Links

Detailed description of database contents

Dataset Download