Cognitive workload is an important concept in performance psychology, ergonomics, and human factors. Publicly available datasets are scarce, making it difficult to establish new approaches and conduct comparative studies. In this work, COLET, a COgnitive workLoad estimation dataset based on Eye-Tracking, is presented.
The eye movements of forty-seven (47) individuals were recorded as they solved puzzles involving visual search tasks of varying complexity and duration. Each participant's cognitive workload level was assessed with the subjective NASA-TLX questionnaire, and the resulting score is used as the annotation of the corresponding activity. Extensive data analysis was performed to derive eye and gaze features from the low-level recorded eye metrics, and a range of machine learning models were trained and evaluated for estimating the cognitive workload level.
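For concreteness, the standard weighted NASA-TLX score combines six subscale ratings using weights derived from 15 pairwise comparisons. The sketch below illustrates this computation in Python; the ratings and weights shown are made up, and whether the dataset's annotations use this weighted variant or the unweighted ("raw TLX") average is an assumption:

```python
# Sketch of the standard weighted NASA-TLX score (the exact scoring
# variant used for the dataset's annotations is an assumption).

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx(ratings: dict, weights: dict) -> float:
    """Weighted NASA-TLX workload score.

    ratings -- subscale ratings on a 0-100 scale
    weights -- tallies from the 15 pairwise comparisons (must sum to 15)
    """
    assert sum(weights.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Example with made-up responses for one participant and one task
ratings = {"mental": 70, "physical": 10, "temporal": 80,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 4, "physical": 0, "temporal": 5,
           "performance": 2, "effort": 3, "frustration": 1}
print(f"{nasa_tlx(ratings, weights):.1f}")  # -> 67.3
```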
The activities induced four distinct levels of cognitive workload. Multitasking and time pressure induced higher levels of cognitive workload than single tasking and the absence of time pressure. Multitasking had a significant effect on 17 eye features, while time pressure had a significant effect on 7. Both binary and multi-class classification experiments were performed with a variety of well-known classifiers, yielding encouraging results for cognitive workload level estimation, with up to 88\% correct predictions between low and high cognitive workload.
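As an illustration of the binary setting, the sketch below cross-validates a low-versus-high workload classifier on a tabular matrix of eye features with scikit-learn. The random data, the 17-feature width, and the choice of a random forest are placeholders; the paper's actual features, labels, and classifiers would be substituted in:

```python
# Minimal sketch of a binary low-vs-high workload classifier on eye features.
# The data here are random placeholders; in practice X would hold the derived
# eye/gaze features and y the labels discretized from NASA-TLX scores.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(188, 17))       # e.g. 47 participants x 4 activities, 17 features
y = rng.integers(0, 2, size=188)     # 0 = low workload, 1 = high workload

clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.2f}")
```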
The machine learning analysis demonstrated the potential of discriminating between cognitive workload levels using eye-tracking features alone. The proposed dataset includes a much larger sample size and a wider spectrum of eye and gaze metrics than other similar datasets, allowing for the examination of their relationships with various cognitive states.
The database consists of gaze-related, pupil-related, and blink-related metrics, as well as information about the participants, the task images, and their annotations. The database was collected as part of a study conducted within the framework of the SeeFar project (H2020 No 826429).
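To suggest how these components fit together, the sketch below loads and joins such a database with pandas. Every file name and column shown is hypothetical, since the actual COLET file layout and schema are not specified here:

```python
# Hypothetical loading sketch; the actual COLET file names and schema may differ.
import pandas as pd

features = pd.read_csv("colet/eye_features.csv")      # gaze-, pupil-, blink-related metrics
participants = pd.read_csv("colet/participants.csv")  # participant information
labels = pd.read_csv("colet/annotations.csv")         # NASA-TLX scores per participant/task

df = (features
      .merge(participants, on="participant_id")
      .merge(labels, on=["participant_id", "task_id"]))
print(df.head())
```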