NCT07394231

Brief Summary

Eye-hand coordination (EHC) is a critical cognitive-motor function that enables individuals to interact effectively with their environment through visually guided hand movements. It plays an essential role in daily activities such as reaching, grasping, and object manipulation. Previous studies have shown that targeted physical activities and sports can enhance EHC performance. However, aging is commonly associated with declines in EHC, executive function, and postural control, which can negatively affect independence in daily living. These age-related changes are also closely linked to cognitive decline and may contribute to the development of mild cognitive impairment (MCI), dementia, and Alzheimer's disease, thereby increasing the burden on families and healthcare systems. To mitigate these effects, various cognitive-motor and technology-assisted training approaches have been proposed to improve EHC and cognitive function in older adults. While many existing EHC training systems are computerized and implemented using virtual reality (VR) or mixed reality (MR), accumulating evidence suggests that virtual environments may not fully replicate real-world eye-hand interactions. Limitations in depth perception, haptic feedback, and realism may alter visual fixation strategies, movement execution, and overall task performance, potentially reducing training effectiveness compared with real-world interactions. Given these limitations, it remains unclear whether real-world EHC training provides greater benefits to executive functions and motor performance than virtual training. Therefore, this study aims to compare the acute effects of EHC exercise performed in a real-world environment and a mixed reality passthrough environment among older adults. 
The proposed EHC training task involves catching a real three-dimensional (3D) object guided by a physical mini drone, inspired by natural human behaviors such as swatting at flying insects, and its virtual counterpart involving a virtual 3D object and drone. The primary objective is to examine differences in executive functions, task performance, and postural stability between real and virtual EHC conditions. By identifying which training modality better supports cognitive-motor performance, this study seeks to inform the design of effective and engaging interventions for healthy aging and early prevention of cognitive decline.

Trial Health

87
On Track

Trial Health Score

Automated assessment based on enrollment pace, timeline, and geographic reach

Enrollment
38

participants targeted

Target within the P25-P50 range for trials of this phase (not applicable)

Timeline
Completed

Started Oct 2024

Duration shorter than P25 for trials of this phase (not applicable)

Geographic Reach
1 country

1 active site

Status
completed

Health score is calculated from publicly available data and should be used for screening purposes only.

Study Timeline

Key milestones and dates

Study Start

First participant enrolled

October 19, 2024

Completed
2 months until next milestone

Primary Completion

Last participant's last visit for primary outcome

December 23, 2024

Completed
Same day until next milestone

Study Completion

Last participant's last visit for all outcomes

December 23, 2024

Completed
1.1 years until next milestone

First Submitted

Initial submission to the registry

January 22, 2026

Completed
15 days until next milestone

First Posted

Study publicly available on registry

February 6, 2026

Completed

Last Updated

February 6, 2026

Status Verified

February 1, 2026

Enrollment Period

2 months

First QC Date

January 22, 2026

Last Update Submit

February 1, 2026

Conditions

Keywords

Eye-Hand Coordination (EHC), Physical and Virtual Objects, Mixed Reality (MR), Executive Functions, Behavioral Responses, Postural Stability, Gerontechnology

Outcome Measures

Primary Outcomes (8)

  • Executive Functions via Flanker-ERP Measurement

    Each participant underwent the Flanker-ERP assessment at three stages: baseline (pre-intervention) and after each of the physical and virtual object-based EHC training sessions.

    2 hours

  • Success Rate (SR)

    SR was measured for each participant during object-catching trials across two EHC training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Reaction Time (RT)

    RT was measured for each participant during object-catching trials across two EHC training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Movement Time (MT)

    MT was measured for each participant during object-catching trials across two EHC training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Peak Hand Velocity (PHV)

    PHV was measured for each participant during object-catching trials across two EHC training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Time-to-Peak Hand Velocity (TPHV)

    TPHV was measured for each participant during object-catching trials across two EHC training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Center of Mass (CoM)

    CoM was measured for each participant during the EHC training tasks across two training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

  • Center of Pressure (CoP)

    CoP was measured for each participant during the EHC training tasks across two training modalities: the physical and the virtual 3D object-based drone-catching systems.

    1-1.5 hours

Secondary Outcomes (3)

  • Subjective participant feedback on perceived task difficulty

    10-15 minutes

  • Subjective participant feedback on system preference

    10-15 minutes

  • Virtual Reality Sickness Questionnaire (VRSQ)

    10-15 minutes

Study Arms (2)

Underwent the virtual system after the real system

EXPERIMENTAL
Other: Real Object-Based Catching System; Other: Virtual Object-Based Catching System

Underwent the real system after the virtual system

EXPERIMENTAL
Other: Real Object-Based Catching System; Other: Virtual Object-Based Catching System

Interventions

This condition involves a participant grasping a physical 3D object located beneath the drone in a real-world environment.

Underwent the real system after the virtual system; Underwent the virtual system after the real system

This condition involves a participant grasping a virtual counterpart of the physical 3D object within a mixed reality (MR) passthrough environment.

Underwent the real system after the virtual system; Underwent the virtual system after the real system

Eligibility Criteria

Age: 60 Years+
Sex: All
Healthy Volunteers: Yes
Age Groups: Adult (18-64), Older Adult (65+)

You may qualify if:

  • 60 years and older (65 years and older preferred).
  • Able to perform regular exercise.
  • Normal vision or normal vision after correction.

You may not qualify if:

  • Have a history of significant chronic disease, such as neurological (e.g., stroke, dementia, Parkinson's disease, poor vision, or hearing loss), cardiovascular, metabolic, pulmonary, or musculoskeletal disease.
  • Have a history of significant motion sickness, active nausea or vomiting, or epilepsy.
  • Have a fear of wearing a VR headset.

Contact the study team to confirm eligibility.

Sponsors & Collaborators

Study Sites (1)

Motion Analysis Laboratory, Dept. of Biomedical Engineering, National Cheng Kung University

Tainan, 701, Taiwan

Related Publications (17)

  • R. B. Davis, S. Õunpuu, D. Tyburski, and J. R. Gage, "A gait analysis data collection and reduction technique," Hum Mov Sci, vol. 10, no. 5, pp. 575-587, 1991, doi: https://doi.org/10.1016/0167-9457(91)90046-Z.

    BACKGROUND
  • Lopez-Calderon J, Luck SJ. ERPLAB: an open-source toolbox for the analysis of event-related potentials. Front Hum Neurosci. 2014 Apr 14;8:213. doi: 10.3389/fnhum.2014.00213. eCollection 2014.

    PMID: 24782741
    BACKGROUND
  • Delorme A, Makeig S. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods. 2004 Mar 15;134(1):9-21. doi: 10.1016/j.jneumeth.2003.10.009.

    PMID: 15102499
    BACKGROUND
  • Birckhead B, Khalil C, Liu X, Conovitz S, Rizzo A, Danovitch I, Bullock K, Spiegel B. Recommendations for Methodology of Virtual Reality Clinical Trials in Health Care by an International Working Group: Iterative Study. JMIR Ment Health. 2019 Jan 31;6(1):e11973. doi: 10.2196/11973.

    PMID: 30702436
    BACKGROUND
  • M. Aly and H. Kojima, "Acute moderate-intensity exercise generally enhances neural resources related to perceptual and cognitive processes: A randomized controlled ERP study," Ment Health Phys Act, vol. 19, p. 100363, 2020, doi: https://doi.org/10.1016/j.mhpa.2020.100363.

    BACKGROUND
  • Pei YC, Chou SW, Lin PS, Lin YC, Hsu TH, Wong AM. Eye-hand coordination of elderly people who practice Tai Chi Chuan. J Formos Med Assoc. 2008 Feb;107(2):103-10. doi: 10.1016/S0929-6646(08)60123-0.

    PMID: 18285242
    BACKGROUND
  • B. A. Eriksen and C. W. Eriksen, "Effects of noise letters upon the identification of a target letter in a nonsearch task," Percept Psychophys, vol. 16, no. 1, pp. 143-149, 1974, doi: 10.3758/BF03203267.

    BACKGROUND
  • Lavoie E, Hebert JS, Chapman CS. Comparing eye-hand coordination between controller-mediated virtual reality, and a real-world object interaction task. J Vis. 2024 Feb 1;24(2):9. doi: 10.1167/jov.24.2.9.

    PMID: 38393742
    BACKGROUND
  • A. Dalia Blaga, M. Frutos-Pascual, C. Creed, and I. Williams, "A Grasp on Reality: Understanding Grasping Patterns for Object Interaction in Real and Virtual Environments," in 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2021, pp. 391-396. doi: 10.1109/ISMAR-Adjunct54149.2021.00090.

    BACKGROUND
  • Chan PT, Chang WC, Chiu HL, Kao CC, Liu D, Chu H, Chou KR. Effect of interactive cognitive-motor training on eye-hand coordination and cognitive function in older adults. BMC Geriatr. 2019 Jan 28;19(1):27. doi: 10.1186/s12877-019-1029-y.

    PMID: 30691404
    BACKGROUND
  • "Dementia Prevention and Care Policy and Action Plan 2.0," Ministry of Health and Welfare, 2018.

    BACKGROUND
  • P. Lenik, K. Przednowek, M. Śliż, G. Bobula, and J. Lenik, "The impact of exercises with a reaction ball on the eye-hand coordination of basketball players," Apr. 2017.

    BACKGROUND
  • C. A. Manning and J. K. Ducharme, "Chapter 6 - Dementia Syndromes in the Older Adult," in Handbook of Assessment in Clinical Gerontology (Second Edition), Second Edition., P. A. Lichtenberg, Ed., San Diego: Academic Press, 2010, pp. 155-178. doi: https://doi.org/10.1016/B978-0-12-374961-1.10006-5.

    BACKGROUND
  • Heintz Walters B, Huddleston WE, O'Connor K, Wang J, Hoeger Bement M, Keenan KG. The role of eye movements, attention, and hand movements on age-related differences in pegboard tests. J Neurophysiol. 2021 Nov 1;126(5):1710-1722. doi: 10.1152/jn.00629.2020. Epub 2021 Oct 13.

    PMID: 34644180
    BACKGROUND
  • Van Halewyck F, Lavrysen A, Levin O, Boisgontier MP, Elliott D, Helsen WF. Both age and physical activity level impact on eye-hand coordination. Hum Mov Sci. 2014 Aug;36:80-96. doi: 10.1016/j.humov.2014.05.005. Epub 2014 Jun 22.

    PMID: 24964357
    BACKGROUND
  • Rand MK, Stelmach GE. Effects of hand termination and accuracy requirements on eye-hand coordination in older adults. Behav Brain Res. 2011 May 16;219(1):39-46. doi: 10.1016/j.bbr.2010.12.008. Epub 2010 Dec 14.

    PMID: 21163306
    BACKGROUND
  • Niechwiej-Szwedo E, Wu S, Nouredanesh M, Tung J, Christian LW. Development of eye-hand coordination in typically developing children and adolescents assessed using a reach-to-grasp sequencing task. Hum Mov Sci. 2021 Dec;80:102868. doi: 10.1016/j.humov.2021.102868. Epub 2021 Sep 9.

    PMID: 34509902
    BACKGROUND

Study Design

Study Type
interventional
Phase
not applicable
Allocation
RANDOMIZED
Masking
NONE
Purpose
OTHER
Intervention Model
SINGLE GROUP
Sponsor Type
OTHER
Responsible Party
PRINCIPAL INVESTIGATOR
PI Title
University Chair Professor

Study Record Dates

First Submitted

January 22, 2026

First Posted

February 6, 2026

Study Start

October 19, 2024

Primary Completion

December 23, 2024

Study Completion

December 23, 2024

Last Updated

February 6, 2026

Record last verified: 2026-02
