AI-Powered Artificial Vision for Visual Prostheses
- Study type: interventional
- Participants targeted: 7
- Countries: 2
- Active sites: 2
- Other identifiers: 1
Brief Summary
Visual impairment is one of the ten most prevalent causes of disability and poses extraordinary challenges in a society that relies heavily on sight. Living with acquired blindness not only lowers these individuals' quality of life but also strains society's limited resources for assistance, care, and rehabilitation. To date, there is no effective treatment for many patients who are visually handicapped as a result of degeneration or damage to the inner layers of the retina, the optic nerve, or the visual pathways. There are therefore compelling reasons to pursue the development of a cortical visual prosthesis capable of restoring some useful sight to these profoundly blind patients. However, the quality of current prosthetic vision is still rudimentary. A major outstanding challenge is translating electrode stimulation into a code that the brain can understand: interactions between the device electronics and the retinal neurophysiology lead to distortions that can severely limit the quality of the generated visual experience. Rather than aiming to one day restore natural vision (which may remain elusive until the neural code of vision is fully understood), a more tractable goal is to create practical and useful artificial vision now. The goal of this work is to address fundamental questions that will enable the development of a Smart Bionic Eye: a device that relies on AI-powered scene understanding to augment the visual scene (similar to the Microsoft HoloLens), tailored to specific real-world tasks that are known to diminish the quality of life of people who are blind (e.g., face recognition, outdoor navigation, reading, self-care).
Study Timeline
Key milestones and dates
- Study Start (Completed): October 2, 2023. First participant enrolled.
- First Submitted (Completed): October 25, 2023. Initial submission to the registry.
- First Posted (Completed): November 7, 2023. Study publicly available on registry.
- Primary Completion (Expected): August 31, 2027. Last participant's last visit for primary outcome.
- Study Completion (Expected): August 31, 2027. Last participant's last visit for all outcomes.
Conditions
Outcome Measures
Primary Outcomes (3)
Phosphene shape
The effect of stimulation strategy on the shape of phosphenes elicited by electrical stimulation. Phosphene shape will be recorded via participant drawings on a touchscreen and quantified using image moments (e.g., area, orientation, eccentricity).
through study completion, an average of 1 year
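The phosphene-shape outcome quantifies participant drawings with image moments (area, orientation, eccentricity). As a minimal sketch of how such moments can be computed, assuming the drawing has been binarized to a 2-D 0/1 pixel array (the function name and input format are illustrative, not the study's actual analysis code):

```python
import numpy as np

def phosphene_moments(img):
    """Quantify a binary phosphene drawing via image moments.

    `img` is a 2-D array of 0/1 pixels (assumed input format).
    Returns (area, orientation in radians, eccentricity).
    """
    img = np.asarray(img, dtype=float)
    ys, xs = np.nonzero(img)
    area = float(len(xs))                 # zeroth moment: pixel count
    cx, cy = xs.mean(), ys.mean()         # centroid (first moments)
    # Central second moments
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    # Orientation of the major axis
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    # Eigenvalues of the covariance matrix give the axis lengths,
    # from which eccentricity follows
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2
    lam2 = (mu20 + mu02 - common) / 2
    ecc = np.sqrt(1 - lam2 / lam1) if lam1 > 0 else 0.0
    return area, theta, ecc
```

For a thin horizontal bar, this returns the bar's pixel count, an orientation near zero, and an eccentricity near one, matching the intuition of an elongated phosphene.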
Pattern discrimination accuracy
The effect of stimulation strategy on the ability to discriminate patterns elicited by electrical stimulation (accuracy, precision, recall), as assessed by verbal responses.
through study completion, an average of 1 year
Scene understanding performance
The effect of stimulation strategy on the ability to locate objects of interest (accuracy, precision, recall) and to relay accurate descriptions of the visual scene, as assessed by verbal responses.
through study completion, an average of 1 year
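Both the pattern-discrimination and scene-understanding outcomes are scored with accuracy, precision, and recall. A small illustrative sketch of these standard metrics for a single target class; the function and label encoding are assumptions for illustration, not the study's scoring pipeline:

```python
def discrimination_metrics(true_labels, predicted_labels, target):
    """Accuracy, precision, and recall for one target class.

    `true_labels` are the patterns shown, `predicted_labels` the
    participant's responses, and `target` the class being scored
    (all hypothetical names).
    """
    pairs = list(zip(true_labels, predicted_labels))
    tp = sum(t == target and p == target for t, p in pairs)  # hits
    fp = sum(t != target and p == target for t, p in pairs)  # false alarms
    fn = sum(t == target and p != target for t, p in pairs)  # misses
    accuracy = sum(t == p for t, p in pairs) / len(pairs)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall
```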
Study Arms (1)
Perception resulting from AI-powered artificial vision
EXPERIMENTAL
The investigators will produce visual percepts in visual prosthesis patients either by directly stimulating electrodes (using FDA-approved pulse trains) or by asking them to view a computer or projector screen while standard stimulation protocols (as are standard for their devices) convert the screen image into pulse trains on their electrodes. Informed by psychophysical data and computational models, the investigators will test the ability of different stimulus encoding methods to support simple perceptual and behavioral tasks (e.g., object recognition, navigation). These encoding methods may include computer vision and machine learning methods that highlight important objects or nearby obstacles in the scene, and may be tailored to each individual patient.
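As a rough illustration of the kind of stimulus encoding the study arm describes (converting a screen image into per-electrode stimulation levels), here is a toy downsampling encoder. All names, the grid size, and the amplitude scaling are assumptions for illustration; the study's actual encoders are informed by psychophysical data, computational models, and each device's approved stimulation protocols:

```python
import numpy as np

def image_to_amplitudes(img, grid_shape=(10, 10), max_amp=1.0):
    """Toy stimulus encoder: average image intensity over each
    electrode's patch of the image to set a per-electrode amplitude.
    Illustrative sketch only, not an actual device protocol.
    """
    img = np.asarray(img, dtype=float)
    gh, gw = grid_shape
    h, w = img.shape
    amps = np.zeros(grid_shape)
    for i in range(gh):
        for j in range(gw):
            # Patch of the image covered by electrode (i, j)
            patch = img[i * h // gh:(i + 1) * h // gh,
                        j * w // gw:(j + 1) * w // gw]
            amps[i, j] = patch.mean()
    # Normalize to the allowed amplitude range
    if amps.max() > 0:
        amps = amps / amps.max() * max_amp
    return amps
```

In this sketch, brighter regions of the scene map to larger stimulation amplitudes on the corresponding electrodes; a smarter encoder could instead emphasize task-relevant objects or obstacles, as the study arm describes.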
Interventions
In response to the stimulation or the image on the monitor, subjects will be asked to either make a perceptual judgment or perform a simple behavioral task. Examples include detecting a stimulus ('Did you see a light on that trial?'), reporting size by drawing on a touchscreen, or walking to a target location. Both the patient's response and reaction time will be recorded. In some cases, the investigators will also measure subjects' eye position. This is a noninvasive procedure carried out with standard eye-tracking equipment: an infrared camera that tracks the position of the subject's pupil. Only measurements such as eye position and eye blinks will be recorded, so these data contain no identifiable information.
Eligibility Criteria
You may not qualify if:
- Visual prosthesis users: Subject is unwilling or unable to travel to testing facility for at least 3 days of testing within a one-week timeframe;
- Sighted controls: Subject has a history of motion sickness or flicker vertigo
- All: Subject has language or hearing impairment
Contact the study team to confirm eligibility.
Sponsors & Collaborators
Study Sites (2)
University of California, Santa Barbara
Santa Barbara, California, 93106, United States
University Miguel Hernandez
Elche, Alicante, 03202, Spain
Study Officials
- PRINCIPAL INVESTIGATOR
Michael Beyeler, PhD
University of California, Santa Barbara
Study Design
- Study Type
- interventional
- Phase
- not applicable
- Allocation
- NA
- Masking
- NONE
- Purpose
- BASIC SCIENCE
- Intervention Model
- SINGLE GROUP
- Sponsor Type
- OTHER
- Responsible Party
- SPONSOR
Study Record Dates
First Submitted
October 25, 2023
First Posted
November 7, 2023
Study Start
October 2, 2023
Primary Completion (Estimated)
August 31, 2027
Study Completion (Estimated)
August 31, 2027
Last Updated
July 31, 2025
Record last verified: 2025-07
Data Sharing
- IPD Sharing
- Will not share