Visual attention information is required in many image and video applications, such as gaze-adaptive compression, objective quality metrics, and image retrieval. However, the influence of HDR imaging on human visual attention is not yet well understood. We address this problem with a publicly available dataset of 46 HDR images and corresponding LDR images with varying regions of interest, scenes, and dynamic ranges.
The HDR-Eye dataset was created by combining nine bracketed images acquired with several cameras, including the Sony DSC-RX100 II, Sony NEX-5N, and Sony Alpha 6000, with different exposure settings (-2.7, -2, -1.3, -0.7, 0, 0.7, 1.3, 2, 2.7 [EV]). We also used several images (obtained with a Nikon D70 camera) from the PEViD-HDR dataset and a few images from other existing datasets.
The resulting dataset contains 46 images covering a wide variety of content, e.g., natural scenes (both indoor and outdoor), humans, stained glass, sculptures, and historical buildings.
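Bracketed exposures like these can be combined into a single radiance map with standard multi-exposure merging. The following is a minimal, illustrative sketch (not the tool actually used to build the dataset) of a Debevec-style weighted merge, assuming linear sensor response and hypothetical input arrays:

```python
import numpy as np

def merge_exposures(images, evs):
    """Merge float images (values in [0, 1], assumed linear response)
    taken at the given EV offsets into one HDR radiance map."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, ev in zip(images, evs):
        t = 2.0 ** ev                       # relative exposure time
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: trust mid-tones
        acc += w * img / t                  # divide out the exposure
        wsum += w
    return acc / np.maximum(wsum, 1e-8)     # weighted average of estimates

# Toy example: the same scene captured at three EV offsets.
scene = np.linspace(0.1, 0.4, 5)
evs = [-1.0, 0.0, 1.0]
stack = [np.clip(scene * 2.0 ** ev, 0.0, 1.0) for ev in evs]
hdr = merge_exposures(stack, evs)           # recovers the scene radiances
```

With an ideal linear sensor and no clipping, each exposure divided by its relative exposure time estimates the same radiance, so the weighted average reconstructs the scene; the hat weighting down-weights under- and over-exposed pixels in real captures.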
Sample images from HDR-Eye dataset
[Sample image pairs: LDR version (left) and HDR version* (right)]
*The HDR images are tone-mapped for representation on standard monitors.
 P. Korshunov, H. Nemoto, A. Skodras, and T. Ebrahimi, “Crowdsourcing-based Evaluation of Privacy in HDR Images,” in SPIE Photonics Europe 2014, Optics, Photonics and Digital Technologies for Multimedia Applications, Brussels, Belgium, Apr. 2014.
Eye tracking experiments
The eye tracking experiments were conducted at the MMSPG test laboratory. The laboratory environment and the viewing conditions were set to fulfill Recommendations ITU-R BT.500-13 and ITU-R BT.2022, respectively. The following table summarizes the experimental conditions.
| Category | Parameter | Value |
|---|---|---|
| Subjects | Age range (average age) | 18–56 (25.3) |
| | Screening | Snellen and Ishihara charts |
| Viewing conditions | Color temperature | 6500 [K] |
| | Viewing distance | 1.89 [m] |
| Monitor | Model | SHDR47E S K4 |
| | Resolution | 1920 × 1080 [pixels] |
| | Angular resolution | 60 [pixel/degree] |
| Eye tracker | Manufacturer | Smart Eye |
| | Model | Smart Eye Pro 5.8 |
| | Mounting position | 0.7 [m] from the display |
| | Sampling frequency | 60 [Hz] |
| | Accuracy | < 0.5 [degree] |
| | Calibration points | 5 points on screen |
| Image presentation | Presentation order | Random |
| | Presentation time | 12 [s] |
| | Grey-screen duration | 2 [s] |
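The reported angular resolution follows from the display geometry. A quick sanity check in Python, assuming a 47-inch 16:9 panel (a diagonal suggested by the monitor's model name, not stated in the table):

```python
import math

# Assumed display geometry: 47-inch 16:9 panel (assumption), 1920 pixels
# wide; viewing distance of 1.89 m taken from the table above.
diagonal_m = 47 * 0.0254
width_m = diagonal_m * 16 / math.hypot(16, 9)
pixel_pitch_m = width_m / 1920
viewing_distance_m = 1.89

# Length on screen subtended by 1 degree of visual angle, in pixels.
one_degree_m = 2 * viewing_distance_m * math.tan(math.radians(0.5))
pixels_per_degree = one_degree_m / pixel_pitch_m   # comes out near 60
```

Under these assumptions the result is about 61 pixels per degree, consistent with the rounded value of 60 in the table.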
Fixation density maps
The fixation density maps (FDMs) were computed by convolving the recorded gaze points with a Gaussian filter and then normalizing the result to values between 0 and 1. Only gaze points corresponding to fixations were used when computing an FDM. The standard deviation of the Gaussian filter was set to 1 degree of visual angle, which corresponds to 60 pixels given the angular resolution of the setup.
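A minimal sketch of this computation, assuming fixation points are given as (x, y) pixel coordinates and using SciPy's Gaussian filter (the exact pipeline used to produce the distributed FDMs may differ):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density_map(fixations, height, width, sigma=60.0):
    """Build a fixation density map from fixation points.

    fixations: iterable of (x, y) pixel coordinates of fixations.
    sigma: Gaussian standard deviation in pixels; 60 px corresponds to
           1 degree of visual angle at 60 pixel/degree.
    """
    fdm = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        fdm[int(round(y)), int(round(x))] += 1.0   # accumulate fixations
    fdm = gaussian_filter(fdm, sigma=sigma)         # spread by visual angle
    peak = fdm.max()
    return fdm / peak if peak > 0 else fdm          # normalize to [0, 1]

# Hypothetical usage on a 1920 x 1080 image with two fixations.
fdm = fixation_density_map([(960, 540), (400, 300)], 1080, 1920)
```

Normalizing by the peak maps the densest region to 1, so maps from images with different numbers of fixations remain comparable.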
You can download all image files, the lists of fixation points, and the fixation density maps from the following FTP server (please use a dedicated FTP client, such as FileZilla or FireFTP):
If you use the HDR-Eye dataset in your research, we kindly ask you to reference the following paper and the URL of this website:
H. Nemoto, P. Korshunov, P. Hanhart and T. Ebrahimi. Visual attention in LDR and HDR images. 9th International Workshop on Video Processing and Quality Metrics for Consumer Electronics (VPQM), Chandler, Arizona, USA, 2015.
URL link: http://mmspg.epfl.ch/hdr-eye
In case of any problems or questions, please send an email to hiromi.nemoto (at) epfl.ch
Permission is hereby granted, without written agreement and without license or royalty fees, to use, copy, modify, and distribute the data provided and its documentation for research purpose only. The data provided may not be commercially distributed. In no event shall the Ecole Polytechnique Fédérale de Lausanne (EPFL) be liable to any party for direct, indirect, special, incidental, or consequential damages arising out of the use of the data and its documentation. The Ecole Polytechnique Fédérale de Lausanne (EPFL) specifically disclaims any warranties. The data provided hereunder is on an “as is” basis and the Ecole Polytechnique Fédérale de Lausanne (EPFL) has no obligation to provide maintenance, support, updates, enhancements, or modifications.