A dataset of paired head and eye movements during visual tasks in virtual environments
Abstract We describe a multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. Our dataset includes head and eye movement for n = 25 participants who interacted with four different virtual reality environments that required coordinated head and eye behaviors...
| Main Authors: | Colin Rubow, Chia-Hsuan Tsai, Eric Brewer, Connor Mattson, Daniel S. Brown, Haohan Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2024-12-01 |
| Series: | Scientific Data |
| Online Access: | https://doi.org/10.1038/s41597-024-04184-1 |
| _version_ | 1846137309096837120 |
|---|---|
| author | Colin Rubow, Chia-Hsuan Tsai, Eric Brewer, Connor Mattson, Daniel S. Brown, Haohan Zhang |
| author_sort | Colin Rubow |
| collection | DOAJ |
| description | Abstract We describe a multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. Our dataset includes head and eye movement for n = 25 participants who interacted with four different virtual reality environments that required coordinated head and eye behaviors. Our data collection involved two visual tracking tasks and two visual searching tasks. Each participant performed each task three times, resulting in approximately 1080 seconds of paired head and eye movement and 129,611 data samples of paired head and eye rotations per participant. This dataset enables research into predictive models of intended head movement conditioned on gaze for augmented and virtual reality experiences, as well as assistive devices like powered exoskeletons for individuals with head-neck mobility limitations. This dataset also allows biobehavioral and mechanism studies of the variability in head and eye movement across different participants and tasks. The virtual environment developed for this data collection is open sourced and thus available for others to perform their own data collection and modify the environment. |
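The per-participant figures in the description can be cross-checked with a quick back-of-the-envelope calculation (the implied ~120 Hz sampling rate is an inference from the reported totals, not a value stated in this record):

```python
# Sanity check of the per-participant figures reported in the abstract:
# four tasks, each performed three times, ~1080 s of paired movement,
# and 129,611 samples of paired head/eye rotations.

tasks = 4             # two visual tracking + two visual searching tasks
repetitions = 3       # each participant performed each task three times
total_seconds = 1080  # approximate recording time per participant
total_samples = 129_611

trials = tasks * repetitions
seconds_per_trial = total_seconds / trials
sample_rate_hz = total_samples / total_seconds

print(f"{trials} trials of ~{seconds_per_trial:.0f} s each")
print(f"implied sampling rate ~{sample_rate_hz:.0f} Hz")
# -> 12 trials of ~90 s each; implied sampling rate ~120 Hz
```

The inferred ~120 Hz is consistent with typical VR headset eye-tracker sampling rates, which supports the internal consistency of the reported totals.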
| format | Article |
| id | doaj-art-d2b2bf2f065e4cc5a0ca0adf0ed10474 |
| institution | Kabale University |
| issn | 2052-4463 |
| language | English |
| publishDate | 2024-12-01 |
| publisher | Nature Portfolio |
| record_format | Article |
| series | Scientific Data |
| spelling | doaj-art-d2b2bf2f065e4cc5a0ca0adf0ed10474 (indexed 2024-12-08T12:18:11Z). English. Nature Portfolio, Scientific Data, ISSN 2052-4463, 2024-12-01, vol. 11, no. 1, pp. 1-8, https://doi.org/10.1038/s41597-024-04184-1. A dataset of paired head and eye movements during visual tasks in virtual environments. Colin Rubow (Department of Mechanical Engineering, University of Utah); Chia-Hsuan Tsai (Kahlert School of Computing, University of Utah); Eric Brewer (Kahlert School of Computing, University of Utah); Connor Mattson (Kahlert School of Computing, University of Utah); Daniel S. Brown (Robotics Center, University of Utah); Haohan Zhang (Department of Mechanical Engineering, University of Utah). Abstract: We describe a multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. Our dataset includes head and eye movement for n = 25 participants who interacted with four different virtual reality environments that required coordinated head and eye behaviors. Our data collection involved two visual tracking tasks and two visual searching tasks. Each participant performed each task three times, resulting in approximately 1080 seconds of paired head and eye movement and 129,611 data samples of paired head and eye rotations per participant. This dataset enables research into predictive models of intended head movement conditioned on gaze for augmented and virtual reality experiences, as well as assistive devices like powered exoskeletons for individuals with head-neck mobility limitations. This dataset also allows biobehavioral and mechanism studies of the variability in head and eye movement across different participants and tasks. The virtual environment developed for this data collection is open sourced and thus available for others to perform their own data collection and modify the environment. https://doi.org/10.1038/s41597-024-04184-1 |
| title | A dataset of paired head and eye movements during visual tasks in virtual environments |
| title_sort | dataset of paired head and eye movements during visual tasks in virtual environments |
| url | https://doi.org/10.1038/s41597-024-04184-1 |