NCT06117332

Brief Summary

Visual impairment is one of the ten most prevalent causes of disability and poses extraordinary challenges to individuals in our society, which relies heavily on sight. Living with acquired blindness not only lowers the quality of life of these individuals, but also strains society's limited resources for assistance, care and rehabilitation. However, to date, there is no effective treatment for many patients who are visually handicapped as a result of degeneration or damage to the inner layers of the retina, the optic nerve or the visual pathways. There are therefore compelling reasons to pursue the development of a cortical visual prosthesis capable of restoring some useful sight to these profoundly blind patients. However, the quality of current prosthetic vision is still rudimentary. A major outstanding challenge is translating electrode stimulation into a code that the brain can understand. Interactions between the device electronics and the retinal neurophysiology lead to distortions that can severely limit the quality of the generated visual experience. Rather than aiming to one day restore natural vision (which may remain elusive until the neural code of vision is fully understood), one might be better off thinking about how to create practical and useful artificial vision now. The goal of this work is to address fundamental questions that will allow the development of a Smart Bionic Eye, a device that relies on AI-powered scene understanding to augment the visual scene (similar to the Microsoft HoloLens), tailored to specific real-world tasks that are known to diminish the quality of life of people who are blind (e.g., face recognition, outdoor navigation, reading, self-care).

Trial Health

Trial Health Score: 78 (On Track)

Automated assessment based on enrollment pace, timeline, and geographic reach

Enrollment

7 participants targeted

Target below the 25th percentile for comparable trials (phase: not applicable)

Timeline

16 months left (started Oct 2023)

Duration longer than the 75th percentile for comparable trials (phase: not applicable)

Geographic Reach
2 countries

2 active sites

Status
Enrolling by invitation

Health score is calculated from publicly available data and should be used for screening purposes only.


Study Timeline

Key milestones and dates

Study Progress: 66% (Oct 2023 – Aug 2027)

Study Start

First participant enrolled

October 2, 2023

Completed

First Submitted

Initial submission to the registry

October 25, 2023

Completed

First Posted

Study publicly available on registry

November 7, 2023

Completed

Primary Completion

Last participant's last visit for primary outcome

August 31, 2027

Expected

Study Completion

Last participant's last visit for all outcomes

August 31, 2027

Last Updated

July 31, 2025

Status Verified

July 1, 2025

Enrollment Period

3.9 years

First QC Date

October 25, 2023

Last Update Submit

July 28, 2025

Outcome Measures

Primary Outcomes (3)

  • Phosphene shape

    The effect of stimulation strategy on the shape of phosphenes elicited by electrical stimulation. Phosphene shape will be recorded via participant drawings on a touchscreen and quantified using image moments (e.g., area, orientation, eccentricity).

    through study completion, an average of 1 year

  • Pattern discrimination accuracy

    The effect of stimulation strategy on the ability to discriminate patterns elicited by electrical stimulation (accuracy, precision, recall), as assessed by verbal responses

    through study completion, an average of 1 year

  • Scene understanding performance

    The effect of stimulation strategy on the ability to locate objects of interest (accuracy, precision, recall) and to relay accurate descriptions of the visual scene, as assessed by verbal responses

    through study completion, an average of 1 year
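The image-moment descriptors named in the first outcome (area, orientation, eccentricity) can be computed from a binary phosphene drawing using second-order central moments. A minimal sketch, assuming a NumPy boolean mask as input; the function name and the example grid are illustrative, not the study's actual analysis code:

```python
import numpy as np

def shape_descriptors(mask: np.ndarray):
    """Area, orientation, and eccentricity of a binary drawing via image moments."""
    ys, xs = np.nonzero(mask)
    area = len(xs)                     # zeroth moment: pixel count
    cx, cy = xs.mean(), ys.mean()      # centroid
    # second-order central moments (normalized by pixel count)
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    # orientation of the major axis, in radians
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    # eigenvalues of the covariance matrix give the axis lengths
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2
    lam2 = (mu20 + mu02 - common) / 2
    ecc = np.sqrt(1 - lam2 / lam1) if lam1 > 0 else 0.0
    return area, theta, ecc

# Example: a horizontal bar has near-zero orientation and high eccentricity
bar = np.zeros((20, 20), dtype=bool)
bar[9:11, 2:18] = True
area, theta, ecc = shape_descriptors(bar)
```

The eccentricity here follows the standard covariance-ellipse definition (0 for a circle, approaching 1 for a thin line).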

Study Arms (1)

Perception resulting from AI-powered artificial vision

EXPERIMENTAL

The investigators will produce visual percepts in visual prosthesis patients either by directly stimulating electrodes (using FDA-approved pulse trains), or by asking them to view a computer or projector screen and using standard stimulation protocols (as standardly used with their devices) to convert the on-screen image into pulse trains on their electrodes. Informed by psychophysical data and computational models, the investigators will test the ability of different stimulus encoding methods to support simple perceptual and behavioral tasks (e.g., object recognition, navigation). These encoding methods may include computer vision and machine learning methods to highlight important objects in the scene or nearby obstacles, and may be tailored to each individual patient.

Device: Visual prosthesis
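As an illustration of what a simple stimulus encoding method might look like, the sketch below downsamples a grayscale camera frame to a small electrode grid and maps local brightness linearly to pulse amplitude. This "scoreboard"-style mapping is an assumption for illustration only; the grid size, amplitude range, and function name are hypothetical and not taken from the trial protocol:

```python
import numpy as np

def encode_frame(frame: np.ndarray, n_rows: int = 10, n_cols: int = 10,
                 max_amp_ua: float = 100.0) -> np.ndarray:
    """Hypothetical scoreboard encoder: average the brightness in each
    electrode's patch of the image and scale it to a pulse amplitude (uA)."""
    h, w = frame.shape
    amps = np.zeros((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            patch = frame[i * h // n_rows:(i + 1) * h // n_rows,
                          j * w // n_cols:(j + 1) * w // n_cols]
            amps[i, j] = patch.mean() / 255.0 * max_amp_ua
    return amps

# Example: a uniformly white frame drives every electrode at max amplitude
frame = np.full((100, 100), 255, dtype=np.uint8)
amps = encode_frame(frame)
```

Alternative encoders of the kind the study describes (e.g., highlighting salient objects or nearby obstacles) would replace the brightness averaging with the output of a computer vision model, while keeping the same image-to-amplitude mapping stage.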

Interventions

In response to the stimulation/image on the monitor, subjects will be asked to either make a perceptual judgment or perform a simple behavioral task. Examples include detecting a stimulus ('did you see a light on that trial?'), reporting size by drawing on a touch screen, or walking to a target location. Both patient response and reaction time will be recorded. In some cases, the investigators will also collect data measuring subjects' eye position. This is a noninvasive procedure carried out using standard eye-tracking equipment: an infrared camera that tracks the position of the subject's pupils. Only measurements such as eye position and eye blinks will be recorded, so these data do not contain identifiable information.

Also known as: Utah array (Cortivis)

Eligibility Criteria

Age: 18 Years+
Sex: All
Healthy Volunteers: Yes
Age Groups: Adult (18-64), Older Adult (65+)

You may not qualify if:

  • Visual prosthesis users: Subject is unwilling or unable to travel to testing facility for at least 3 days of testing within a one-week timeframe;
  • Sighted controls: Subject has a history of motion sickness or flicker vertigo
  • All: Subject has language or hearing impairment

Contact the study team to confirm eligibility.

Sponsors & Collaborators

Study Sites (2)

University of California, Santa Barbara

Santa Barbara, California, 93106, United States


University Miguel Hernandez

Elche, Alicante, 03202, Spain


MeSH Terms

Conditions

Blindness

Interventions

Visual Prosthesis

Condition Hierarchy (Ancestors)

  • Vision Disorders
  • Sensation Disorders
  • Neurologic Manifestations
  • Nervous System Diseases
  • Eye Diseases
  • Signs and Symptoms
  • Pathological Conditions, Signs and Symptoms

Intervention Hierarchy (Ancestors)

  • Prostheses and Implants
  • Equipment and Supplies

Study Officials

  • Michael Beyeler, PhD

    University of California, Santa Barbara

    PRINCIPAL INVESTIGATOR

Study Design

Study Type
Interventional
Phase
Not applicable
Allocation
N/A
Masking
None
Purpose
Basic science
Intervention Model
Single group
Sponsor Type
Other
Responsible Party
Sponsor

Study Record Dates

First Submitted

October 25, 2023

First Posted

November 7, 2023

Study Start

October 2, 2023

Primary Completion (Estimated)

August 31, 2027

Study Completion (Estimated)

August 31, 2027

Last Updated

July 31, 2025

Record last verified: 2025-07

Data Sharing

IPD Sharing
Will not share
