ctrlnum 0.17632-cfm4d9y7bh.1
fullrecord <?xml version="1.0"?> <dc><creator>Kar, Anuradha</creator><title>NUIG_EyeGaze01(Labelled eye gaze dataset)</title><publisher>Mendeley</publisher><description>The NUIG_EyeGaze01 (Labelled eye gaze dataset) is a rich and diverse gaze dataset built from eye gaze experiments conducted under a wide range of operating conditions on three user platforms (desktop, laptop, tablet). Gaze data was collected under one condition at a time. The dataset includes gaze (fixation) data collected under 17 different head poses, 4 user distances, 6 platform poses and 3 display screen sizes and resolutions. Each gaze data file is labelled with the operating condition under which it was collected and has the name format USERNUMBER_CONDITION_PLATFORM.CSV. CONDITION: RP = roll plus (degrees), PP = pitch plus (degrees), YP = yaw plus (degrees), RM = roll minus (degrees), PM = pitch minus (degrees), YM = yaw minus (degrees); 50, 60, 70, 80 = user distances. PLATFORM: desk = Desktop, lap = Laptop, tab = Tablet. Desktop display: 22 inch, 1680 x 1050 pixels. Laptop display: 14 inch, 1366 x 768 pixels. Tablet display: 10.1 inch, 1920 x 800 pixels. Eye tracker accuracy: 0.5 degrees (for neutral head and tracker position). The dataset has 3 folders, &#x201C;Desktop&#x201D;, &#x201C;Laptop&#x201D; and &#x201C;Tablet&#x201D;, containing gaze data from the respective platforms. The Desktop folder has 2 sub-folders, user_distance and head_pose, with data for different user distances and head poses (neutral, roll, pitch, yaw) measured with the desktop setup. The Tablet folder has 2 sub-folders, user_distance and tablet_pose, with data for different user distances and tablet+tracker poses (neutral, roll, pitch, yaw) measured with the tablet setup. The Laptop folder has one sub-folder, user_distance, with data for different user distances measured with the laptop setup. All data files are in CSV format. 
Each file contains the following data header fields: ("TIM REL","GTX","GTY","XRAW","YRAW","GT Xmm","GT Ymm","Xmm","Ymm","YAW GT","YAW DATA","PITCH GT","PITCH DATA","GAZE GT","GAZE ANG","DIFF GZ","AOI_IND","AOI_X","AOI_Y","MEAN_ERR","STD ERR"). The meanings of the header fields are as follows: "TIM REL": relative time stamp for each gaze data point (measured during data collection); "GTX","GTY": ground truth x, y positions in pixels; "XRAW","YRAW": raw gaze data x, y coordinates in pixels; "GT Xmm","GT Ymm": ground truth x, y positions in mm; "Xmm","Ymm": gaze x, y positions in mm; "YAW GT","YAW DATA": ground truth and estimated yaw angles; "PITCH GT","PITCH DATA": ground truth and estimated pitch angles; "GAZE GT","GAZE ANG": ground truth and estimated gaze angles; "DIFF GZ": gaze angular accuracy; "AOI_IND","AOI_X","AOI_Y": index of the stimuli locations and their x, y coordinates; "MEAN_ERR","STD ERR": mean and standard deviation of the error at the stimuli locations. For more details on the purpose of this dataset and the data collection method, please consult the paper by the authors of this dataset: Anuradha Kar, Peter Corcoran: Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. 
Sensors 18(9): 3151 (2018) </description><subject>Computer Vision</subject><subject>Human-Computer System Performance Evaluation</subject><subject>Consumer Electronics</subject><subject>Machine Learning</subject><subject>Machine Learning Algorithm</subject><subject>Cognitive Science</subject><subject>Augmented Reality</subject><subject>Virtual Reality</subject><subject>Clustering</subject><subject>Eye</subject><subject>Cognitive Vision</subject><subject>Information Classification</subject><subject>Human Machine Interaction</subject><subject>Vision</subject><subject>Modelling</subject><subject>Performance Evaluation</subject><subject>Model Evaluation</subject><contributor>Corcoran, Peter</contributor><type>Other:Dataset</type><identifier>10.17632/cfm4d9y7bh.1</identifier><rights>Attribution-NonCommercial 3.0 Unported</rights><rights>https://creativecommons.org/licenses/by-nc/3.0</rights><relation>https://data.mendeley.com/datasets/cfm4d9y7bh</relation><date>2019-02-27T03:59:05Z</date><recordID>0.17632-cfm4d9y7bh.1</recordID></dc>
format Other:Dataset
Other
author Kar, Anuradha
author2 Corcoran, Peter
title NUIG_EyeGaze01(Labelled eye gaze dataset)
publisher Mendeley
publishDate 2019
topic Computer Vision
Human-Computer System Performance Evaluation
Consumer Electronics
Machine Learning
Machine Learning Algorithm
Cognitive Science
Augmented Reality
Virtual Reality
Clustering
Eye
Cognitive Vision
Information Classification
Human Machine Interaction
Vision
Modelling
Performance Evaluation
Model Evaluation
url https://data.mendeley.com/datasets/cfm4d9y7bh
contents The NUIG_EyeGaze01 (Labelled eye gaze dataset) is a rich and diverse gaze dataset built from eye gaze experiments conducted under a wide range of operating conditions on three user platforms (desktop, laptop, tablet). Gaze data was collected under one condition at a time. The dataset includes gaze (fixation) data collected under 17 different head poses, 4 user distances, 6 platform poses and 3 display screen sizes and resolutions. Each gaze data file is labelled with the operating condition under which it was collected and has the name format USERNUMBER_CONDITION_PLATFORM.CSV. CONDITION: RP = roll plus (degrees), PP = pitch plus (degrees), YP = yaw plus (degrees), RM = roll minus (degrees), PM = pitch minus (degrees), YM = yaw minus (degrees); 50, 60, 70, 80 = user distances. PLATFORM: desk = Desktop, lap = Laptop, tab = Tablet. Desktop display: 22 inch, 1680 x 1050 pixels. Laptop display: 14 inch, 1366 x 768 pixels. Tablet display: 10.1 inch, 1920 x 800 pixels. Eye tracker accuracy: 0.5 degrees (for neutral head and tracker position). The dataset has 3 folders, “Desktop”, “Laptop” and “Tablet”, containing gaze data from the respective platforms. The Desktop folder has 2 sub-folders, user_distance and head_pose, with data for different user distances and head poses (neutral, roll, pitch, yaw) measured with the desktop setup. The Tablet folder has 2 sub-folders, user_distance and tablet_pose, with data for different user distances and tablet+tracker poses (neutral, roll, pitch, yaw) measured with the tablet setup. The Laptop folder has one sub-folder, user_distance, with data for different user distances measured with the laptop setup. All data files are in CSV format. 
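The USERNUMBER_CONDITION_PLATFORM naming convention above can be decoded programmatically. The following is a minimal sketch (not part of the dataset or written by its authors); the example filenames such as "03_RP20_desk.csv" are hypothetical, and the assumption that user distances are given in cm is the editor's, not stated in the record:

```python
import re

# Condition codes as described in the dataset documentation.
# Pose codes carry a degree value (e.g. "RP20" = roll plus 20 degrees);
# a bare number (50/60/70/80) is a user distance (cm is an assumption).
POSE_CODES = {
    "RP": "roll plus", "RM": "roll minus",
    "PP": "pitch plus", "PM": "pitch minus",
    "YP": "yaw plus", "YM": "yaw minus",
}
PLATFORMS = {"desk": "Desktop", "lap": "Laptop", "tab": "Tablet"}

def parse_gaze_filename(name):
    """Split a USERNUMBER_CONDITION_PLATFORM.csv name into its parts."""
    stem = name.rsplit(".", 1)[0]
    user, condition, platform = stem.split("_")
    pose = re.fullmatch(r"([A-Z]{2})(\d+)", condition)
    if pose:
        desc = f"{POSE_CODES[pose.group(1)]} {pose.group(2)} degrees"
    else:
        desc = f"user distance {condition}"
    return {"user": int(user), "condition": desc,
            "platform": PLATFORMS[platform]}
```

For example, `parse_gaze_filename("03_RP20_desk.csv")` yields user 3, condition "roll plus 20 degrees", platform "Desktop".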
Each file contains the following data header fields: ("TIM REL","GTX","GTY","XRAW","YRAW","GT Xmm","GT Ymm","Xmm","Ymm","YAW GT","YAW DATA","PITCH GT","PITCH DATA","GAZE GT","GAZE ANG","DIFF GZ","AOI_IND","AOI_X","AOI_Y","MEAN_ERR","STD ERR"). The meanings of the header fields are as follows: "TIM REL": relative time stamp for each gaze data point (measured during data collection); "GTX","GTY": ground truth x, y positions in pixels; "XRAW","YRAW": raw gaze data x, y coordinates in pixels; "GT Xmm","GT Ymm": ground truth x, y positions in mm; "Xmm","Ymm": gaze x, y positions in mm; "YAW GT","YAW DATA": ground truth and estimated yaw angles; "PITCH GT","PITCH DATA": ground truth and estimated pitch angles; "GAZE GT","GAZE ANG": ground truth and estimated gaze angles; "DIFF GZ": gaze angular accuracy; "AOI_IND","AOI_X","AOI_Y": index of the stimuli locations and their x, y coordinates; "MEAN_ERR","STD ERR": mean and standard deviation of the error at the stimuli locations. For more details on the purpose of this dataset and the data collection method, please consult the paper by the authors of this dataset: Anuradha Kar, Peter Corcoran: Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. Sensors 18(9): 3151 (2018)
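Since each file is a plain CSV with the headers listed above, the per-sample gaze angular accuracy ("DIFF GZ") can be summarized with the standard library alone. This is an editor's sketch, not code from the dataset authors; the sample below uses made-up values and only a subset of the 21 real columns, purely for illustration:

```python
import csv
import io
import statistics

def summarize_gaze_error(csv_file):
    """Return (mean, population stdev) of the "DIFF GZ" column,
    i.e. the per-sample gaze angular accuracy, from one gaze file."""
    reader = csv.DictReader(csv_file)
    diffs = [float(row["DIFF GZ"]) for row in reader if row["DIFF GZ"]]
    return statistics.mean(diffs), statistics.pstdev(diffs)

# Tiny in-memory file standing in for a real USERNUMBER_CONDITION_PLATFORM.csv;
# the numbers are invented for illustration only.
sample = io.StringIO(
    "TIM REL,GTX,GTY,XRAW,YRAW,DIFF GZ\n"
    "0.00,840,525,845,530,0.42\n"
    "0.02,840,525,838,521,0.55\n"
)
mean_err, std_err = summarize_gaze_error(sample)
```

For a real file one would pass `open("03_RP20_desk.csv", newline="")` instead of the in-memory sample.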
id IOS7969.0.17632-cfm4d9y7bh.1
institution Universitas Islam Indragiri
affiliation onesearch.perpusnas.go.id
institution_id 804
institution_type library:university
library Teknologi Pangan UNISI
library_id 2816
collection Artikel mulono
repository_id 7969
city INDRAGIRI HILIR
province RIAU
shared_to_ipusnas_str 1
repoId IOS7969
first_indexed 2020-04-08T08:14:15Z
last_indexed 2020-04-08T08:14:15Z
recordtype dc
_version_ 1686587405082034176
score 17.538404