NCT06958653

Brief Summary

The study was conducted in two sequential phases to evaluate the reliability and user experience of a Gamified Digital Balance Assessment (GDBA) tailored for community-dwelling older adults.

Phase 1: Reliability of the digitalized Brief-BESTest assessment

In the first phase, participants performed a single balance assessment session during which the clinician-administered Brief-BESTest and the digitalized Brief-BESTest were scored concurrently, enabling direct comparison between clinical and automated assessments under identical task conditions. Testing took place in a controlled indoor setting featuring a 1 m × 1 m, 10 cm-thick EVA foam mat (35D density) and safety handrails on three sides. Before the assessment, participants completed a baseline questionnaire collecting demographic data (age, sex), anthropometric measurements (height, weight), and fall history (past 12 months). Written informed consent was obtained from all participants. During the assessment, a certified physical therapist delivered standardized verbal instructions and rated each task using the validated Brief-BESTest rubric (maximum score = 24). Simultaneously, the digitalized Brief-BESTest system recorded participants' movements with a monocular 4K camera and calculated scores via an algorithm that mirrors the original scoring criteria; torso and joint movements were analyzed in real time, and balance scores were computed automatically. To evaluate inter-rater reliability, a second trained clinician independently rated 20% of the sample. This concurrent scoring design ensured consistent task execution while enabling evaluation of the automated system's scoring against expert clinician judgment.

Phase 2: Impact of the GDBA on user experience

The second phase was a parallel-group randomized controlled trial assessing the impact of gamification on user experience.
Participants were randomly assigned (1:1) to either the control group (digitalized Brief-BESTest) or the experimental group (GDBA) via a simple coin-toss randomization performed by a blinded researcher. Testing was conducted in a 1 m × 3 m evaluation zone equipped with front, side, and rear safety railings and a centrally placed EVA foam pad (identical to Phase 1). The DBTS included a display screen, a Logitech Brio 4K webcam (30 fps) for motion tracking, and a built-in speaker for voice prompts. A detachable, ergonomically designed user console, compliant with Chinese anthropometric standards, was mounted on the front railing for interface navigation (see Figure 2).

In the control group, participants performed balance tasks following pre-recorded verbal instructions from a certified physical therapist. In the experimental group, tasks were presented via the GDBA interface, which included animated avatars, voice guidance, progress indicators, and real-time performance feedback. Each participant completed one practice trial per task to minimize learning effects, followed by the formal assessment, with a 2-minute seated rest period between tasks to reduce fatigue.

Immediately after the assessment, participants completed self-report measures on perceived exertion, intrinsic motivation, and intention for continued use, and then took part in a brief semi-structured interview exploring their perceptions of system usability and engagement. All interviews were audio-recorded and transcribed for thematic analysis. Participants received nominal compensation (USD $10 equivalent) upon study completion.
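The Phase 1 agreement analysis between clinician and automated item scores could be carried out, for example, with a quadratically weighted Cohen's kappa, which suits the ordinal 0-3 Brief-BESTest item scale. The registry record does not specify which reliability statistic was used, and the data below are hypothetical; this is only an illustrative sketch.

```python
import numpy as np

def quadratic_weighted_kappa(r1, r2, n_categories=4):
    """Cohen's kappa with quadratic weights for ordinal ratings.

    Brief-BESTest items are scored 0-3, so n_categories=4.
    r1, r2: equal-length sequences of integer ratings from two raters
    (here, clinician vs. automated system).
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    # Observed joint distribution of rating pairs.
    obs = np.zeros((n_categories, n_categories))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # Expected joint distribution if the two raters were independent.
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    # Quadratic disagreement weights: penalty grows with squared distance.
    idx = np.arange(n_categories)
    w = (idx[:, None] - idx[None, :]) ** 2 / (n_categories - 1) ** 2
    return 1 - (w * obs).sum() / (w * exp).sum()

# Hypothetical item scores for one participant (not study data).
clinician = [3, 2, 2, 1, 3, 0, 2, 3]
automated = [3, 2, 1, 1, 3, 0, 2, 2]
print(round(quadratic_weighted_kappa(clinician, automated), 3))  # 0.875
```

A weighted kappa is a common choice here because, unlike unweighted kappa, it gives partial credit when the automated score is off by only one category.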

Trial Health

87
On Track

Trial Health Score

Automated assessment based on enrollment pace, timeline, and geographic reach

Enrollment
40

participants targeted

Target at P25-P50 for trials of this phase (not applicable)

Timeline
Completed

Started Apr 2025

Shorter than P25 for trials of this phase (not applicable)

Geographic Reach
1 country

1 active site

Status
Completed

Health score is calculated from publicly available data and should be used for screening purposes only.


Study Timeline

Key milestones and dates

First Submitted

Initial submission to the registry

April 17, 2025

Completed
3 days until next milestone

Study Start

First participant enrolled

April 20, 2025

Completed
16 days until next milestone

First Posted

Study publicly available on registry

May 6, 2025

Completed
14 days until next milestone

Primary Completion

Last participant's last visit for primary outcome

May 20, 2025

Completed
4 months until next milestone

Study Completion

Last participant's last visit for all outcomes

September 20, 2025

Completed

Last Updated

January 22, 2026

Status Verified

January 1, 2024

Enrollment Period

1 month

First QC Date

April 17, 2025

Last Update Submit

January 20, 2026

Conditions

Outcome Measures

Primary Outcomes (3)

  • Perceived physical exertion

    Perceived physical exertion was measured using the Borg Rating of Perceived Exertion (RPE) Scale, ranging from 6 ("no exertion") to 20 ("maximal exertion"). Participants verbally reported their RPE immediately after completing all balance tasks to reflect overall physical demand and fatigue during the assessment. This measure provided insight into the tolerability and physical burden of the assessment procedures.

    Through intervention completion, an average of 10 mins

  • Motivational engagement

    Motivational engagement was evaluated using the Intrinsic Motivation Inventory (IMI), a validated tool employing a 7-point Likert scale (1 = "not at all true," 7 = "very true"). Three subscales were analyzed: Interest/Enjoyment (task engagement and inherent enjoyment of the activity); Perceived Competence (self-perceived ability and confidence in performing the tasks); and Pressure/Tension (task-related stress and anxiety). Subscale scores were calculated as the mean of item responses, with higher scores indicating greater enjoyment, competence, or pressure, respectively. The IMI has demonstrated good internal consistency and construct validity in older adult populations.

    Through intervention completion, an average of 10 mins

  • Intention to continue use

    Intention to continue use was assessed through both quantitative and qualitative methods. Quantitatively, participants rated their likelihood of using the system again on a single-item 7-point Likert scale (1 = "very unlikely," 7 = "very likely") immediately following the assessment. Qualitatively, semi-structured interviews explored factors influencing future use intentions. Interview questions included: "Would you consider using this system regularly for balance checking? Why or why not?" and "What features would encourage you to use this system more often?" Interviews lasted 3-5 minutes, were audio-recorded with permission, transcribed verbatim, and analyzed using Braun and Clarke's six-phase thematic analysis framework.

    Through intervention completion, an average of 10 mins
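The IMI scoring rule described above (subscale score = mean of item responses, with reverse-keyed items recoded on the 1-7 scale) can be sketched as follows. The item names and subscale mapping here are hypothetical, since the record does not list the exact items administered.

```python
# Hypothetical mapping of item keys to IMI subscales (illustrative only).
SUBSCALES = {
    "interest_enjoyment": ["enjoy", "fun", "interesting"],
    "perceived_competence": ["skilled", "satisfied_performance"],
    "pressure_tension": ["nervous", "relaxed_r"],
}

def score_imi(responses, reverse_items=("relaxed_r",)):
    """Return per-subscale means for one participant.

    responses: dict of item key -> 1-7 Likert response.
    Reverse-keyed items (e.g., "I was relaxed" on Pressure/Tension)
    are recoded as 8 - x before averaging.
    """
    scores = {}
    for subscale, items in SUBSCALES.items():
        vals = [8 - responses[i] if i in reverse_items else responses[i]
                for i in items]
        scores[subscale] = sum(vals) / len(vals)
    return scores

# One hypothetical participant's responses (not study data).
responses = {"enjoy": 6, "fun": 7, "interesting": 5,
             "skilled": 4, "satisfied_performance": 5,
             "nervous": 2, "relaxed_r": 6}
print(score_imi(responses))
```

With these illustrative responses the subscale means are 6.0 (Interest/Enjoyment), 4.5 (Perceived Competence), and 2.0 (Pressure/Tension), consistent with the interpretation that higher scores indicate greater enjoyment, competence, or pressure.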

Other Outcomes (2)

  • Balance confidence

    Prior to the intervention

  • Balance ability

    Through intervention completion, an average of 10 mins

Study Arms (3)

Control group

ACTIVE COMPARATOR

Participants in the control group used the Brief-BESTest to assess their balance ability.

Device: Brief-BESTest assessment

Experimental group

ACTIVE COMPARATOR

Participants in the experimental group used the Gamified Digital Balance Assessment (GDBA) to assess their balance ability.

Device: Digital balance assessment tool

Gamified group

EXPERIMENTAL

The GDBA builds upon the digitalized Brief-BESTest by incorporating evidence-based gamification elements designed to enhance motivation and engagement among older adults. The gamification design was guided by self-determination theory, which posits that autonomy, competence, and relatedness are key drivers of intrinsic motivation, and by recent systematic reviews on gamification for older adult health interventions.

Device: Gamified Digital Balance Assessment (GDBA)

Interventions

Uses the traditional Brief-BESTest to assess the balance ability of the participants.

Control group

Uses the digital balance assessment tool, which is the digitalized Brief-BESTest, to assess their balance ability.

Experimental group

The GDBA builds upon the digitalized Brief-BESTest by incorporating evidence-based gamification elements designed to enhance motivation and engagement among older adults. The gamification design was guided by self-determination theory, which posits that autonomy, competence, and relatedness are key drivers of intrinsic motivation, and by recent systematic reviews on gamification for older adult health interventions.

Gamified group

Eligibility Criteria

Age: 60 Years+
Sex: All
Healthy Volunteers: Yes
Age Groups: Adult (18-64), Older Adult (65+)

You may qualify if you are:

  • aged 60 or older,
  • living independently,
  • able to walk with or without an assistive device (without external help),
  • willing and able to provide informed consent.

You may not qualify if you have:

  • conditions that impede walking (e.g., hip fractures, lower limb amputations, hemiparesis),
  • medications causing dizziness or affecting balance (e.g., psychotropic drugs),
  • self-reported cardiovascular, pulmonary, neurological, musculoskeletal, or mental disorders,
  • severe fatigue or pain,
  • severe uncorrected vision or hearing impairments that may affect the ability to interact with the digital system.

Contact the study team to confirm eligibility.

Sponsors & Collaborators

Study Sites (1)

Hongqi Community

Shanghai, Shanghai Municipality, 200240, China


Study Design

Study Type
interventional
Phase
not applicable
Allocation
RANDOMIZED
Masking
SINGLE
Who Masked
OUTCOMES ASSESSOR
Purpose
PREVENTION
Intervention Model
PARALLEL
Sponsor Type
OTHER
Responsible Party
PRINCIPAL INVESTIGATOR
PI Title
Principal Investigator

Study Record Dates

First Submitted

April 17, 2025

First Posted

May 6, 2025

Study Start

April 20, 2025

Primary Completion

May 20, 2025

Study Completion

September 20, 2025

Last Updated

January 22, 2026

Record last verified: 2024-01
