Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub)
We present a comprehensive dataset comprising head- and eye-centred video recordings from human participants performing a search task in a variety of Virtual Reality (VR) environments. Using a VR motion platform, participants navigated these environments freely while their eye movements and positional data were captured and stored in CSV format. …
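Since the record states that the eye-movement and positional data are distributed as CSV files alongside the head- and eye-centred videos, a minimal loading sketch for machine-learning preprocessing is given below. The file name and the "timestamp" column are hypothetical placeholders, not taken from the dataset's documentation; consult the published data descriptor for the actual file layout.

```python
# Minimal loading sketch. "participant01_forest_tracking.csv" and the
# "timestamp" column are hypothetical placeholders; check the dataset's
# own documentation for the actual file names and column schema.
import pandas as pd

# Load one (hypothetical) per-recording CSV of eye-movement and positional samples.
tracking = pd.read_csv("participant01_forest_tracking.csv")

# Inspect the schema before assuming any particular column layout.
print(tracking.columns.tolist())
print(tracking.head())

# Example preprocessing: drop incomplete samples and sort by time,
# if a timestamp-like column is present.
if "timestamp" in tracking.columns:
    tracking = tracking.dropna().sort_values("timestamp").reset_index(drop=True)
```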
| Main Authors: | Alexander Kreß, Markus Lappe, Frank Bremmer |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2024-12-01 |
| Series: | Data in Brief |
| Subjects: | Eye tracking; Head tracking; Deep learning; Spatial navigation; Foraging behaviour; Behavioural data |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S2352340924011491 |
| _version_ | 1846122107120910336 |
|---|---|
| author | Alexander Kreß; Markus Lappe; Frank Bremmer |
| author_facet | Alexander Kreß; Markus Lappe; Frank Bremmer |
| author_sort | Alexander Kreß |
| collection | DOAJ |
| description | We present a comprehensive dataset comprising head- and eye-centred video recordings from human participants performing a search task in a variety of Virtual Reality (VR) environments. Using a VR motion platform, participants navigated these environments freely while their eye movements and positional data were captured and stored in CSV format. The dataset spans six distinct environments, including one specifically for calibrating the motion platform, and provides a cumulative playtime of over 10 h for both head- and eye-centred perspectives. The data collection was conducted in naturalistic VR settings, where participants collected virtual coins scattered across diverse landscapes such as grassy fields, dense forests, and an abandoned urban area, each characterized by unique ecological features. This structured and detailed dataset offers substantial reuse potential, particularly for machine learning applications. The richness of the dataset makes it an ideal resource for training models on various tasks, including the prediction and analysis of visual search behaviour, eye movement and navigation strategies within VR environments. Researchers can leverage this extensive dataset to develop and refine algorithms requiring comprehensive and annotated video and positional data. By providing a well-organized and detailed dataset, it serves as an invaluable resource for advancing machine learning research in VR and fostering the development of innovative VR technologies. |
| format | Article |
| id | doaj-art-70b6ef3876b44d41afbdb7d0fd5b8e1f |
| institution | Kabale University |
| issn | 2352-3409 |
| language | English |
| publishDate | 2024-12-01 |
| publisher | Elsevier |
| record_format | Article |
| series | Data in Brief |
| spelling | doaj-art-70b6ef3876b44d41afbdb7d0fd5b8e1f; 2024-12-15T06:15:53Z; eng; Elsevier; Data in Brief; 2352-3409; 2024-12-01; Vol. 57, Art. 111187; Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub); Alexander Kreß (corresponding author; Department of Neurophysics, Philipps University Marburg, Karl-von-Frisch Straße 8a, 35043 Marburg, Hesse, Germany); Markus Lappe (Institute of Psychology, University Münster, Fliednerstraße 21, 48149 Münster, North Rhine-Westphalia, Germany); Frank Bremmer (Department of Neurophysics, Philipps University Marburg, Karl-von-Frisch Straße 8a, 35043 Marburg, Hesse, Germany); [abstract identical to the description field above]; http://www.sciencedirect.com/science/article/pii/S2352340924011491; Eye tracking; Head tracking; Deep learning; Spatial navigation; Foraging behaviour; Behavioural data |
| spellingShingle | Alexander Kreß; Markus Lappe; Frank Bremmer; Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub); Data in Brief; Eye tracking; Head tracking; Deep learning; Spatial navigation; Foraging behaviour; Behavioural data |
| title | Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub) |
| title_full | Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub) |
| title_fullStr | Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub) |
| title_full_unstemmed | Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub) |
| title_short | Comprehensive VR dataset for machine learning: Head- and eye-centred video and positional data (The Adaptive Mind Datahub) |
| title_sort | comprehensive vr dataset for machine learning head and eye centred video and positional data the adaptive mind datahub |
| topic | Eye tracking; Head tracking; Deep learning; Spatial navigation; Foraging behaviour; Behavioural data |
| url | http://www.sciencedirect.com/science/article/pii/S2352340924011491 |
| work_keys_str_mv | AT alexanderkreß comprehensivevrdatasetformachinelearningheadandeyecentredvideoandpositionaldatatheadaptiveminddatahub AT markuslappe comprehensivevrdatasetformachinelearningheadandeyecentredvideoandpositionaldatatheadaptiveminddatahub AT frankbremmer comprehensivevrdatasetformachinelearningheadandeyecentredvideoandpositionaldatatheadaptiveminddatahub |