Division, College, and Department Research
Permanent URI for this community: https://hdl.handle.net/10877/1
Research, creative, and scholarly works created by the university community, organized by area.
Browsing Division, College, and Department Research by Type "Dataset"
Now showing 1 - 6 of 6
Item: A Method for the Detection of Poorly-Formed or Misclassified Saccades: A Case Study Using the GazeCom Dataset (2022-02)
Friedman, Lee; Djanian, Shagen; Komogortsev, Oleg
There are many automatic methods for the detection of eye movement types such as fixations and saccades. Evaluating the accuracy of these methods can be a difficult and time-consuming process. We present a method to detect misclassified or poorly formed saccades (throughout the manuscript, "misclassified" refers to both misclassified and poorly formed saccades), regardless of how they were classified. We developed and tested our method on saccades from the very large and publicly available GazeCom dataset. We began by creating a total of 9 metrics (velocity shape, velocity shape amplitude, position shape, position shape amplitude, flatness, entropy, kurtosis, skewness, and the Dip Test statistic of multimodality), which are explained below. We applied these metrics to horizontal saccades of 20, 40, and 60 ms duration. For each duration, we performed a data-reduction step with factor analysis to see how these 9 metrics naturally grouped. For every duration, there were 2 factors: one dominated by our velocity shape metric and one dominated by our entropy metric. We determined that the entropy metric was the single most valuable metric for detecting misclassified saccades. We illustrate the types of saccades that our entropy metric indicates are misclassified.
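The abstract does not give the exact definition of the entropy metric, so the following is only an illustrative sketch, not the authors' implementation: a histogram-based Shannon entropy of a normalized velocity profile, where a smooth single-pulse (well-formed) saccade profile concentrates mass in few bins and scores low, while a flat or noisy profile scores high.

```python
import math
import random

def velocity_entropy(velocity, n_bins=20):
    """Histogram-based Shannon entropy (bits) of a velocity trace.

    Illustrative sketch only: the paper's exact entropy
    definition is not stated in this abstract and may differ.
    """
    lo, hi = min(velocity), max(velocity)
    span = (hi - lo) or 1.0                      # guard a constant trace
    norm = [(v - lo) / span for v in velocity]   # rescale to [0, 1]
    counts = [0] * n_bins
    for v in norm:
        counts[min(int(v * n_bins), n_bins - 1)] += 1
    probs = [c / len(norm) for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# A well-formed saccade velocity profile is a single smooth pulse;
# a flat or noisy profile spreads mass across many histogram bins
# and therefore scores higher entropy.
t = [i / 59 for i in range(60)]
pulse = [math.exp(-((x - 0.5) ** 2) / 0.01) for x in t]
random.seed(0)
noisy = [random.random() for _ in range(60)]
```

Under this sketch, the clean pulse yields lower entropy than the noisy trace, consistent with using an entropy-like score to flag poorly formed profiles.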
Link to Python Code: https://github.com/sdjanian/sacanalysis

Item: A Novel Evaluation of Two Related, and Two Independent Algorithms for Eye Movement Classification during Reading (2018-01)
Friedman, Lee; Rigas, Ioannis; Abdulin, Evgeny; Komogortsev, Oleg
This repository contains classified eye-movement data from the submitted paper, "A Novel Evaluation of Two Related, and Two Independent Algorithms for Eye Movement Classification during Reading," by Lee Friedman, Ioannis Rigas, Evgeny Abdulin, and Oleg V. Komogortsev, Department of Computer Science, Texas State University, San Marcos, Texas. As of 2/19/2018, the third revision is under review at Behavior Research Methods.

There are 4 directories included, each with exactly 20 files. These are the 20 files that were evaluated with 4 scoring methods:
ONH – These data were scored by the method described in [1].
MNH – These data were scored by the method presented in the manuscript.
IRF – These data were scored by the method presented in [2].
EDF – These data were scored by the EyeLink Parser.

File naming convention: take, for example, the name "S_1051_S1_TEX_Class_EyeLink.csv". This is data from subject number 1051, recording session 1, the TEX (poetry reading) task, and it contains classification data scored by the EyeLink Parser. "S_1066_S2_TEX_Class_IRF.csv" is data from subject number 1066, recording session 2, the TEX (poetry reading) task, and it contains classification data scored by [2]. "S_1334_S2_TEX_Class_ONH.csv" is data from subject number 1334, recording session 2, the TEX (poetry reading) task, and it contains classification data scored by [1]. Files like "S_1282_S2_TEX_Class_MNH.csv" were scored by the method described in the manuscript.

The first column of every dataset is a msec timestamp. Only the first 26,000 msec of each file were processed for the manuscript. The second column of every dataset is the horizontal (X) eye position signal in degrees of visual angle. The third column of every dataset is the vertical (Y) eye position signal in degrees of visual angle. In the case of the ONH and MNH methods, these position signals were smoothed; see the manuscript for details. The fourth column of every dataset is the radial velocity of the eye movement signals; please see the manuscript for details of this calculation for every dataset. The fifth column of each dataset is a classification code, where 1 = fixation, 2 = saccade, 3 = post-saccadic oscillation, 4 = noise or artifact, and 5 = unclassified. Note that the IRF-coded data did not use an "unclassified" category.

References:
[1] M. Nystrom and K. Holmqvist, "An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data," Behav Res Methods, vol. 42, no. 1, pp. 188-204, Feb 2010.
[2] R. Zemblys, D. C. Niehorster, O. Komogortsev, and K. Holmqvist, "Using machine learning to detect events in eye-tracking data," Behav Res Methods, Feb 23 2017.

Item: County-Level Population Growth, Density and Auto Thefts, 2001-2007 (2024-08)
Clement, Matthew
No abstract prepared.

Item: Data reference (DR): A Multi-Facility Two-Stage Stochastic Aggregate Production Planning Model with Renewable and Prosumer Microgrids (2021-12)
Islam, Sayed Rezwanul; Novoa, Clara; Jin, Tongdan
No abstract prepared.

Item: Power-efficient and Shift-robust Eye-tracking Sensor for Portable VR Headsets (Association for Computing Machinery, 2019-06)
Katrychuk, Dmytro; Griffith, Henry; Komogortsev, Oleg
This repository contains both the complete datasets and supporting code base from the paper entitled "Power-efficient and shift-robust eye-tracking sensor for portable VR headsets," which has been accepted in the Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA).
Full details regarding the contents of each directory and instructions for preparing the processing environment are provided in the "README.md" file, which is contained within the code base directory. The up-to-date codebase is available on GitHub: https://github.com/pseudowolfvn/psog_nn/tree/etra2019

Item: Untitled (2022-11)
Friedman, Lee
This is the site where Friedman and Komogortsev introduce and document a new method for the classification of eye movements (fixations, saccades, and PSOs). This article is a complete description of the substantive functions of the Friedman-Komogortsev Method (FKM).
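The file-naming convention and five-column layout documented for the reading-classification datasets above lend themselves to a short loader. This sketch assumes headerless CSV files with timestamps in ascending order; the filename fields and column order follow the description above, while the parsing details (regex, header handling) are assumptions, not documented guarantees.

```python
import csv
import re

# Classification codes as documented for these datasets
# (the IRF-coded files do not use the "unclassified" category).
CLASS_LABELS = {1: "fixation", 2: "saccade",
                3: "post-saccadic oscillation",
                4: "noise or artifact", 5: "unclassified"}

# Filename pattern per the stated convention,
# e.g. "S_1051_S1_TEX_Class_EyeLink.csv"
NAME_RE = re.compile(r"S_(\d+)_S(\d+)_([A-Z]+)_Class_(\w+)\.csv")

def parse_name(filename):
    """Split a dataset filename into subject, session, task, method."""
    m = NAME_RE.fullmatch(filename)
    if m is None:
        raise ValueError(f"unexpected filename: {filename}")
    subject, session, task, method = m.groups()
    return {"subject": int(subject), "session": int(session),
            "task": task, "method": method}

def load_samples(path, max_msec=26000):
    """Yield (timestamp_ms, x_deg, y_deg, radial_vel, label) rows.

    Column order and the 26,000 ms cutoff follow the dataset
    description; headerless files and sorted timestamps are
    assumptions.
    """
    with open(path, newline="") as f:
        for row in csv.reader(f):
            t, x, y, vel, code = row[:5]
            if float(t) > max_msec:   # assumes ascending timestamps
                break
            yield (float(t), float(x), float(y), float(vel),
                   CLASS_LABELS[int(float(code))])
```

For example, `parse_name("S_1051_S1_TEX_Class_EyeLink.csv")` recovers subject 1051, session 1, the TEX task, and the EyeLink scoring method, matching the worked example in the description.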