HDR-Eye: dataset of high dynamic range images with eye tracking data

Introduction

Visual attention information is required in many image and video applications, such as gaze-adaptive compression, objective quality metrics, and image retrieval. However, the influence of HDR content on human visual attention is not yet well understood. We address this problem by providing a publicly available dataset of 46 HDR images and their corresponding LDR versions, covering a variety of regions of interest, scenes, and dynamic ranges.

Image contents

The HDR-Eye dataset was created by combining nine bracketed exposures acquired with several cameras, including the Sony DSC-RX100 II, Sony NEX-5N, and Sony Alpha 6000, using exposure settings of -2.7, -2, -1.3, -0.7, 0, 0.7, 1.3, 2, and 2.7 [EV]. We also used several images (captured with a Nikon D70 camera) from the PEViD-HDR dataset [1] and a few images from other existing datasets1,2.
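For illustration, the sketch below shows how such a set of bracketed exposures could be merged into an HDR image with OpenCV's Debevec method. The file names, the conversion from EV offsets to relative exposure times, and the choice of merging algorithm are assumptions for this example, not the exact pipeline used to produce HDR-Eye.

```python
# Hypothetical sketch: merging nine bracketed exposures into an HDR image with
# OpenCV's Debevec method. File names are illustrative; the actual HDR-Eye
# processing pipeline may differ.
import cv2
import numpy as np

# Exposure compensation values in EV, as listed for the bracketed shots
ev_values = [-2.7, -2.0, -1.3, -0.7, 0.0, 0.7, 1.3, 2.0, 2.7]
files = [f"bracket_{i}.jpg" for i in range(len(ev_values))]  # hypothetical file names

images = [cv2.imread(f) for f in files]
# Convert EV offsets to relative exposure times (+1 EV = twice the exposure)
times = np.array([2.0 ** ev for ev in ev_values], dtype=np.float32)

# Recover the camera response curve, then merge into a linear HDR radiance map
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, times)
merge = cv2.createMergeDebevec()
hdr = merge.process(images, times, response)

cv2.imwrite("merged.hdr", hdr)  # write as a Radiance .hdr file
```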

The resulting dataset contains 46 images covering a wide variety of content, including natural scenes (both indoor and outdoor), humans, stained glass, sculptures, and historical buildings.

Sample images from the HDR-Eye dataset

[Figure: sample image pairs, LDR version and HDR version*]

*The HDR images are tone-mapped for representation on standard monitors.
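As a rough illustration of such tone mapping, the snippet below produces an 8-bit preview from an HDR file using OpenCV's Reinhard operator. The operator and its parameters are assumptions; the previews on this page may have been generated with a different tone-mapping operator.

```python
# Hypothetical sketch: producing an 8-bit preview of an HDR image for a standard
# monitor with OpenCV's Reinhard tone-mapping operator (illustrative parameters).
import cv2
import numpy as np

hdr = cv2.imread("merged.hdr", cv2.IMREAD_UNCHANGED)  # float32 radiance map

tonemap = cv2.createTonemapReinhard(gamma=2.2)
ldr = tonemap.process(hdr)                            # values roughly in [0, 1]
ldr_8bit = np.clip(ldr * 255, 0, 255).astype(np.uint8)

cv2.imwrite("preview_tonemapped.png", ldr_8bit)
```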

[1] P. Korshunov, H. Nemoto, A. Skodras, and T. Ebrahimi, “Crowdsourcing-based Evaluation of Privacy in HDR Images,” in SPIE Photonics Europe 2014, Optics, Photonics and Digital Technologies for Multimedia Applications, Brussels, Belgium, Apr. 2014.

1 EMPA HDR images dataset, http://www.empamedia.ethz.ch/hdrdatabase/index.php

2 ‘Tears of Steel’ short film, https://media.xiph.org/mango/

Eye tracking experiments

The eye tracking experiments were conducted in the MMSPG test laboratory. The laboratory environment and the viewing conditions were set up to fulfill the recommendations of ITU-R BT.500-13 [2] and ITU-R BT.2022 [3], respectively. The following table summarizes the experimental conditions.

Category              Details                    Specification
Participants          Number                     20
                      Age range (average age)    18-56 (25.3)
                      Screening                  Snellen and Ishihara charts
Viewing conditions    Environment                Laboratory
                      Illumination               20 [lux]
                      Color temperature          6500 [K]
                      Viewing distance           1.89 [m]
                      Task                       Free-viewing
Display               Manufacturer               SIM2
                      Model                      HDR47E S 4K
                      Type                       LCD
                      Size                       47 [inch]
                      Resolution                 1920 × 1080 [pixels]
                      Angular resolution         60 [pixel/degree]
Eye tracker           Manufacturer               Smart Eye
                      Model                      Smart Eye Pro 5.8
                      Mounting position          0.7 [m] from the display
                      Sampling frequency         60 [Hz]
                      Accuracy                   < 0.5 [degree]
                      Calibration points         5 points on screen
Image presentation    Presentation order         Random
                      Presentation time          12 [s]
                      Grey-screen duration       2 [s]
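As a quick, informal sanity check (not part of the dataset itself), the 60 pixel/degree angular resolution can be reproduced from the display size, resolution, and viewing distance listed above:

```python
# Consistency check: derive pixels per degree from the 47-inch 16:9 display,
# its 1920-pixel horizontal resolution, and the 1.89 m viewing distance.
import math

diagonal_m = 47 * 0.0254                          # 47-inch diagonal in metres
width_m = diagonal_m * 16 / math.hypot(16, 9)     # 16:9 panel width
pixel_pitch_m = width_m / 1920                    # horizontal pixel pitch

viewing_distance_m = 1.89
one_degree_m = viewing_distance_m * math.tan(math.radians(1.0))

print(one_degree_m / pixel_pitch_m)               # ≈ 60 pixels per degree of visual angle
```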

[2] ITU-R BT.500-13, “Methodology for the subjective assessment of the quality of television pictures,” International Telecommunication Union, January 2012.

[3] ITU-R BT.2022, “General viewing conditions for subjective assessment of quality of SDTV and HDTV television pictures on flat panel displays,” International Telecommunication Union, August 2012.

Fixation density maps

The fixation density maps (FDMs) were computed by convolving the recorded gaze points with a Gaussian filter and then normalizing the result to values between 0 and 1. Only gaze points corresponding to fixations were used for computing an FDM. The standard deviation of the Gaussian filter was set to 1 degree of visual angle, which corresponds to 60 pixels given the angular resolution of the setup.
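A minimal sketch of this FDM computation is given below, assuming the fixations are available as (x, y) pixel coordinates on the 1920 × 1080 display; the exact format of the provided fixation lists should be checked in the downloaded data.

```python
# Minimal sketch of the FDM computation described above. The fixation input
# format (x, y pixel coordinates) is an assumption for this example.
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density_map(fixations, width=1920, height=1080, pixels_per_degree=60):
    """Convolve fixation points with a Gaussian (sigma = 1 degree) and normalize to [0, 1]."""
    fdm = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            fdm[int(y), int(x)] += 1.0
    fdm = gaussian_filter(fdm, sigma=pixels_per_degree)  # 1 degree of visual angle = 60 pixels
    if fdm.max() > 0:
        fdm /= fdm.max()
    return fdm

# Example usage with hypothetical fixation coordinates
fdm = fixation_density_map([(960, 540), (400, 300), (1500, 700)])
```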

Download

You can download all image files, lists of fixation points, and fixation density maps from the following FTP server (please use a dedicated FTP client, such as FileZilla or FireFTP):

FTP address: tremplin.epfl.ch
Username: [email protected]
Password: ohsh9jah4T
FTP port: 21

After you connect, choose the HDREye folder from the remote site, and download the relevant material. The total size of the provided data is ~1.5 GB.
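If a scripted download is preferred over a graphical client, a minimal sketch using Python's ftplib is shown below; the file name in the example is purely illustrative, so list the folder contents first and retrieve the files you need.

```python
# Hypothetical sketch of a scripted download with Python's ftplib, using the
# server details listed above. The file name below is illustrative only.
from ftplib import FTP

ftp = FTP("tremplin.epfl.ch")                               # FTP address, default port 21
ftp.login(user="[email protected]", passwd="ohsh9jah4T")  # credentials as listed above
ftp.cwd("HDREye")                                           # dataset folder on the remote site

print(ftp.nlst())                                           # inspect the available files/folders

# Retrieve a single file (replace with an actual name from the listing)
with open("example_file.zip", "wb") as fh:
    ftp.retrbinary("RETR example_file.zip", fh.write)

ftp.quit()
```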

If you use the HDR-Eye dataset in your research, we kindly ask you to cite the following paper and the URL of this website:

H. Nemoto, P. Korshunov, P. Hanhart and T. Ebrahimi. Visual attention in LDR and HDR images. 9th International Workshop on Video Processing and Quality Metrics for Consumer Electronics (VPQM), Chandler, Arizona, USA, 2015.

URL link: http://www.epfl.ch/labs/mmspg/hdr-eye

Please refer to the above paper for further details about the dataset and the eye tracking experiments.

In case of any problems or questions, please send an email to [email protected]

Permission is hereby granted, without written agreement and without license or royalty fees, to use, copy, modify, and distribute the data provided and its documentation for research purposes only. The data provided may not be commercially distributed. In no event shall the Ecole Polytechnique Fédérale de Lausanne (EPFL) be liable to any party for direct, indirect, special, incidental, or consequential damages arising out of the use of the data and its documentation. The Ecole Polytechnique Fédérale de Lausanne (EPFL) specifically disclaims any warranties. The data provided hereunder is on an “as is” basis and the Ecole Polytechnique Fédérale de Lausanne (EPFL) has no obligation to provide maintenance, support, updates, enhancements, or modifications.