NCT05311878

Brief Summary

People's perceptual skills can significantly affect their ability to make optimal decisions, judgements, and actions in dynamic real-world environments. Perceptual learning refers to the use of training and experience to improve the ability to make sense of what people see, hear, feel, taste or smell from ambiguous sensory information. In this study, the investigators hypothesise that there exist neural signatures that robustly encode both the conscious visual perception of rotations of a cursor and the magnitudes of these rotations in a novel, rotation-based perceptual learning task. The investigators also hypothesise that online, instantaneous EEG-based feedback on subjects' visual perception of rotations, delivered via an EEG-based Brain-Computer Interface (BCI), can foster perceptual learning more effectively than behaviour-based perceptual training, especially at very small rotation magnitudes that represent extremely difficult perceptual tasks.

Trial Health

87
On Track

Trial Health Score

Automated assessment based on enrollment pace, timeline, and geographic reach

Enrollment
32

participants targeted

Target at P25-P50 for phase-not-applicable trials

Timeline
Completed

Started Jan 2021

Typical duration for phase-not-applicable trials

Geographic Reach
1 country

1 active site

Status
completed

Health score is calculated from publicly available data and should be used for screening purposes only.


Study Timeline

Key milestones and dates

Study Start

First participant enrolled

January 1, 2021

Completed
1.1 years until next milestone

First Submitted

Initial submission to the registry

February 22, 2022

Completed
1 month until next milestone

First Posted

Study publicly available on registry

April 5, 2022

Completed
1.6 years until next milestone

Primary Completion

Last participant's last visit for primary outcome

November 1, 2023

Completed
Same day as next milestone

Study Completion

Last participant's last visit for all outcomes

November 1, 2023

Completed
Last Updated

April 24, 2025

Status Verified

April 1, 2025

Enrollment Period

2.8 years

First QC Date

February 22, 2022

Last Update Submit

April 21, 2025

Conditions

Outcome Measures

Primary Outcomes (1)

  • Change in correct answer rate of different rotation magnitudes across 5 intervention sessions

The correct answer rate per rotation magnitude reflects the improvements in perceptual skills across the two conditions. It measures the percentage of rotations of each magnitude that are spotted correctly. Scores range from 0 to 100; higher values indicate a better outcome.

    Difference is measured every 24 hours, before versus after each intervention session

Secondary Outcomes (1)

  • Change in neural correlates of conscious perception across 5 intervention sessions

    Difference is measured every 24 hours, before versus after each intervention session
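The per-magnitude correct answer rate in the primary outcome is a simple percentage. A minimal sketch of how it might be computed (the `trials` structure and all names are hypothetical, not from the study protocol):

```python
from collections import defaultdict

def correct_answer_rates(trials):
    """Correct answer rate (0-100) per rotation magnitude.

    `trials` is a list of (magnitude, spotted) pairs, where `spotted`
    is True when the subject correctly detected the rotation.
    """
    hits = defaultdict(int)
    counts = defaultdict(int)
    for magnitude, spotted in trials:
        counts[magnitude] += 1
        if spotted:
            hits[magnitude] += 1
    return {m: 100.0 * hits[m] / counts[m] for m in counts}

# Example: small (2-unit) rotations are harder to spot than large (8-unit) ones.
rates = correct_answer_rates([(2, True), (2, False), (8, True), (8, True)])
# rates == {2: 50.0, 8: 100.0}
```

Comparing these per-magnitude rates before versus after each session, separately for the two arms, would yield the change scores the outcome describes.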

Study Arms (2)

EEG based perceptual training

EXPERIMENTAL

Subjects complete a perceptual learning task in which EEG-based visual feedback is provided

Device: EEG-based perceptual training

Behavior based perceptual training

ACTIVE COMPARATOR

Subjects complete a perceptual learning task in which ground truth visual feedback is provided

Device: Behavior based perceptual training

Interventions

EEG based perceptual training

Electroencephalography (EEG) signals will be recorded from subjects as they perform rotation-based perceptual tasks. The neural correlates of conscious perception of rotations will be processed and decoded in real time using machine learning algorithms to provide feedback. Subjects are instructed to assume a mental state/find a strategy that maximises the accuracy of the feedback. In total, each subject will complete 5 sessions of perceptual training with this intervention.

Behavior based perceptual training

Subjects complete the rotation-based perceptual tasks, and ground truth visual feedback is provided indicating whether subjects have spotted the rotations correctly. Subjects are instructed to spot as many rotations as possible to maximise the accuracy of the feedback. In total, each subject will complete 5 sessions of perceptual training with this intervention.
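The EEG intervention decodes neural correlates in real time to drive visual feedback. As a loose illustration only (the study's actual features, montage, and classifier are not given here; the band-power features, synthetic data, and all names below are assumptions), a calibrate-then-decode loop could be sketched as:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def band_power(epoch):
    """Mean log power per channel (a toy stand-in for real EEG features)."""
    return np.log(np.mean(epoch ** 2, axis=1) + 1e-12)

def simulate_epoch(perceived):
    """Fake 32-channel x 1 s epoch; 'perceived' trials get a power boost."""
    epoch = rng.normal(0.0, 1.0, (32, 256))
    if perceived:
        epoch[:8] *= 1.5
    return epoch

# Offline calibration on labelled epochs.
labels = [0] * 50 + [1] * 50
X = np.array([band_power(simulate_epoch(y)) for y in labels])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# "Online" loop: decode each incoming epoch into visual feedback.
for trial in range(3):
    p = clf.predict_proba(band_power(simulate_epoch(trial % 2))[None, :])[0, 1]
    print("rotation perceived" if p > 0.5 else "no rotation perceived")
```

In a real closed-loop BCI the epochs would stream from an amplifier and the decoder's output would be rendered as the on-screen feedback the subject uses to refine their perceptual strategy.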

Eligibility Criteria

Age: 18 Years - 80 Years
Sex: All
Healthy Volunteers: Yes
Age Groups: Adult (18-64), Older Adult (65+)

You may qualify if:

  • Able-bodied volunteers:
    • good general health
    • normal or corrected vision
    • no history of neurological/psychiatric disease
    • ability to read and understand English
    • ability to understand information and to give free and informed consent
  • Subjects with neuropsychiatric diseases (such as bipolar disorder and schizophrenia):
    • normal or corrected vision
    • ability to read and understand English
    • ability to understand information and to give free and informed consent

You may not qualify if:

  • short attention span or cognitive deficits that prevent subjects from remaining concentrated during the experimental sessions
  • concomitant serious illnesses (e.g., metabolic disorders, cardiac arrest)
  • factors hindering proper EEG acquisition (e.g., scalp wound, uncontrolled muscle activity)

Contact the study team to confirm eligibility.

Sponsors & Collaborators

Study Sites (1)

Engineering Education and Research Center

Austin, Texas, 78712, United States


Study Officials

  • Jose del R. Millan, PhD

    The University of Texas at Austin

    PRINCIPAL INVESTIGATOR

Study Design

Study Type
interventional
Phase
not applicable
Allocation
RANDOMIZED
Masking
NONE
Purpose
TREATMENT
Intervention Model
PARALLEL
Sponsor Type
OTHER
Responsible Party
PRINCIPAL INVESTIGATOR
PI Title
Professor

Study Record Dates

First Submitted

February 22, 2022

First Posted

April 5, 2022

Study Start

January 1, 2021

Primary Completion

November 1, 2023

Study Completion

November 1, 2023

Last Updated

April 24, 2025

Record last verified: 2025-04

Data Sharing

IPD Sharing
Will share

All data will be made available by the online publication date. These data will be placed on public servers for any interested researcher to access.
