Computer-Mediated Intervention to Enhance Emotional Competence in Children With Autism in Schools
Study type: interventional; target enrollment: 130 participants; 1 country.
Brief Summary
Autism spectrum condition (ASC) is a neurodevelopmental disorder characterized fundamentally by social deficits. Emotional competence, the ability to express, recognize, understand, and regulate emotions, is a key aspect of social communication. Evidence suggests that the developmental trajectories of autistic children differ from those of neurotypical children in their ability to process and recognize emotions from paralinguistic cues: facial expressions, body language, and tone of voice. Autistic children also have difficulty integrating these cues in context and show limited emotional language. Numerous approaches to teaching people with autism to recognize and understand emotions have been tried, with recent increased interest in computer-based interventions (CBI). However, most research has focused only on facial expressions, has been limited to autistic children without intellectual disability (ID), and has shown limited generalization to real social settings. EmotiPlay, a computer-based intervention program designed to enhance emotion recognition (ER) by addressing multiple modalities of emotional cues (facial expressions, tone of voice, body language), has shown good outcomes when used at home by autistic children without ID. However, the examination of generalization was partial and relied only on parental reports. The present study's main goals are to: (1) examine the adaptation and integration of EmotiPlay into special education classrooms in regular schools; and (2) assess EmotiPlay's effect on emotional competence among autistic children at different functioning levels.
Study Timeline
Key milestones and dates
Study Start
First participant enrolled
March 1, 2022
First Submitted (completed)
Initial submission to the registry
February 9, 2023
First Posted (completed)
Study publicly available on registry
June 15, 2023
Primary Completion (completed)
Last participant's last visit for primary outcome
November 1, 2023
Study Completion (completed)
Last participant's last visit for all outcomes
December 1, 2023
Conditions
- Autism Spectrum Condition
Outcome Measures
Primary Outcomes (15)
Emotion recognition
The emotion recognition test includes 4 tasks: 1. facial expression videos; 2. decontextualized vocal utterances; 3. body language videos; 4. integrative video clips presenting all 3 modalities in context, extracted from old television shows, with the soundtrack muffled to remove semantic information while preserving prosodic cues. The test includes 12 emotions; for every video or recording, 4 possible answers are presented, and the target emotion and the order of the possible answers are counterbalanced. In each modality the subject can achieve 0-12 points, one point for every emotion recognized correctly.
before intervention
Emotion recognition
The emotion recognition test includes 4 tasks: 1. facial expression videos; 2. decontextualized vocal utterances; 3. body language videos; 4. integrative video clips presenting all 3 modalities in context, extracted from old television shows, with the soundtrack muffled to remove semantic information while preserving prosodic cues. The test includes 12 emotions; for every video or recording, 4 possible answers are presented, and the target emotion and the order of the possible answers are counterbalanced. In each modality the subject can achieve 0-12 points, one point for every emotion recognized correctly.
immediately after the intervention
Emotion recognition
The emotion recognition test includes 4 tasks: 1. facial expression videos; 2. decontextualized vocal utterances; 3. body language videos; 4. integrative video clips presenting all 3 modalities in context, extracted from old television shows, with the soundtrack muffled to remove semantic information while preserving prosodic cues. The test includes 12 emotions; for every video or recording, 4 possible answers are presented, and the target emotion and the order of the possible answers are counterbalanced. In each modality the subject can achieve 0-12 points, one point for every emotion recognized correctly.
15 weeks after the end of the intervention
Emotion understanding
TEC - Test of Emotion Comprehension (Pons & Harris, 2000), designed to assess emotion understanding in 3- to 12-year-old children, based on the Pons et al. (2002) model of 9 developmental stages of emotion understanding in children. In the test, subjects are presented with 23 illustrated pictures, in boy and girl versions. In the first 5 scenarios the child is asked to recognize basic emotions from facial expressions. Next, the child is presented with short stories in which the illustrated picture is missing emotional cues on the character's face; the examiner reads the story and the child is asked to choose the correct emotion from 4 options. The maximum score is 21 points.
before intervention
Emotion understanding
TEC - Test of Emotion Comprehension (Pons & Harris, 2000), designed to assess emotion understanding in 3- to 12-year-old children, based on the Pons et al. (2002) model of 9 developmental stages of emotion understanding in children. In the test, subjects are presented with 23 illustrated pictures, in boy and girl versions. In the first 5 scenarios the child is asked to recognize basic emotions from facial expressions. Next, the child is presented with short stories in which the illustrated picture is missing emotional cues on the character's face; the examiner reads the story and the child is asked to choose the correct emotion from 4 options. The maximum score is 21 points.
immediately after the intervention
Emotion understanding
TEC - Test of Emotion Comprehension (Pons & Harris, 2000), designed to assess emotion understanding in 3- to 12-year-old children, based on the Pons et al. (2002) model of 9 developmental stages of emotion understanding in children. In the test, subjects are presented with 23 illustrated pictures, in boy and girl versions. In the first 5 scenarios the child is asked to recognize basic emotions from facial expressions. Next, the child is presented with short stories in which the illustrated picture is missing emotional cues on the character's face; the examiner reads the story and the child is asked to choose the correct emotion from 4 options. The maximum score is 21 points.
15 weeks after the end of the intervention
Emotional-mental vocabulary
Emotion definition task - assesses the subject's ability to define 12 emotions. Participants are asked to define each emotion (for example: "please explain what 'happy' is") and to give examples of personal experiences related to each emotion (e.g.: "can you describe a situation in which you felt happy?"). The definitions and examples are audiotaped and then transcribed. Points will be allocated to the definition of each emotion according to the Vocabulary subscale of the WISC-IV; the total across all emotions falls within the range of 0 to 24 points.
before intervention
Emotional-mental vocabulary
Emotion definition task - assesses the subject's ability to define 12 emotions. Participants are asked to define each emotion (for example: "please explain what 'happy' is") and to give examples of personal experiences related to each emotion (e.g.: "can you describe a situation in which you felt happy?"). The definitions and examples are audiotaped and then transcribed. Points will be allocated to the definition of each emotion according to the Vocabulary subscale of the WISC-IV; the total across all emotions falls within the range of 0 to 24 points.
immediately after the intervention
Emotional-mental vocabulary
Emotion definition task - assesses the subject's ability to define 12 emotions. Participants are asked to define each emotion (for example: "please explain what 'happy' is") and to give examples of personal experiences related to each emotion (e.g.: "can you describe a situation in which you felt happy?"). The definitions and examples are audiotaped and then transcribed. Points will be allocated to the definition of each emotion according to the Vocabulary subscale of the WISC-IV; the total across all emotions falls within the range of 0 to 24 points.
15 weeks after the end of the intervention
Social functioning
Socio-emotional functioning will be evaluated by playground observation, coded with the POPE - Playground Observation of Peer Engagement (Kasari et al., 2005). This instrument is a time-interval behavior coding system: independent observers from the research team watch the target child on the playground for 40 consecutive seconds and then code for 20 seconds, for ten minutes during school recess. In each interval, the observers note the child's engagement with peers on the playground (solitary, proximity, onlooking, parallel, parallel aware, involved in games with rules, and jointly engaged with peers). Coders will also note positive and negative initiations by the target child toward other children, and positive and negative responses to peers' social overtures.
before intervention
Social functioning
Socio-emotional functioning will be evaluated by playground observation, coded with the POPE - Playground Observation of Peer Engagement (Kasari et al., 2005). This instrument is a time-interval behavior coding system: independent observers from the research team watch the target child on the playground for 40 consecutive seconds and then code for 20 seconds, for ten minutes during school recess. In each interval, the observers note the child's engagement with peers on the playground (solitary, proximity, onlooking, parallel, parallel aware, involved in games with rules, and jointly engaged with peers). Coders will also note positive and negative initiations by the target child toward other children, and positive and negative responses to peers' social overtures.
immediately after the intervention
Social functioning
Socio-emotional functioning will be evaluated by playground observation, coded with the POPE - Playground Observation of Peer Engagement (Kasari et al., 2005). This instrument is a time-interval behavior coding system: independent observers from the research team watch the target child on the playground for 40 consecutive seconds and then code for 20 seconds, for ten minutes during school recess. In each interval, the observers note the child's engagement with peers on the playground (solitary, proximity, onlooking, parallel, parallel aware, involved in games with rules, and jointly engaged with peers). Coders will also note positive and negative initiations by the target child toward other children, and positive and negative responses to peers' social overtures.
15 weeks after the end of the intervention
Spontaneous emotional-mental language
Narrative re-telling task - narratives are elicited using two wordless picture books, "Frog on His Own" (Mayer, 1973) and "Frog, Where Are You?" (Mayer, 1969). The stories were shortened to 15 pages, depicting a frog's adventures after departing from his boy companion. Participants are asked to listen to a story told by the examiner from a predetermined script while the pictures are presented on an iPad (via the Book Creator app). One book is randomly assigned to each participant, and after listening to the story, the participants are asked to tell the story in their own words while flipping through the pictures. The stories will be audiotaped, transcribed, and coded according to Capps et al. (2000).
before intervention
Spontaneous emotional-mental language
Narrative re-telling task - narratives are elicited using two wordless picture books, "Frog on His Own" (Mayer, 1973) and "Frog, Where Are You?" (Mayer, 1969). The stories were shortened to 15 pages, depicting a frog's adventures after departing from his boy companion. Participants are asked to listen to a story told by the examiner from a predetermined script while the pictures are presented on an iPad (via the Book Creator app). One book is randomly assigned to each participant, and after listening to the story, the participants are asked to tell the story in their own words while flipping through the pictures. The stories will be audiotaped, transcribed, and coded according to Capps et al. (2000).
immediately after the intervention
Spontaneous emotional-mental language
Narrative re-telling task - narratives are elicited using two wordless picture books, "Frog on His Own" (Mayer, 1973) and "Frog, Where Are You?" (Mayer, 1969). The stories were shortened to 15 pages, depicting a frog's adventures after departing from his boy companion. Participants are asked to listen to a story told by the examiner from a predetermined script while the pictures are presented on an iPad (via the Book Creator app). One book is randomly assigned to each participant, and after listening to the story, the participants are asked to tell the story in their own words while flipping through the pictures. The stories will be audiotaped, transcribed, and coded according to Capps et al. (2000).
15 weeks after the end of the intervention
Secondary Outcomes (4)
Autistic traits
before the intervention
Autistic traits
immediately after the intervention
Adaptive skills
before the intervention
Adaptive skills
immediately after the intervention
Study Arms (3)
Experimental group - Autistic children
EXPERIMENTAL: 60 autistic children aged 7-10, from special education classes integrated in regular schools. This group will receive the EmotiPlay intervention as part of the curriculum.
Control group- Autistic children
NO INTERVENTION: 60 autistic children aged 7-10, from special education classes integrated in regular schools. This group will be wait-listed and receive treatment as usual.
Control group-Neurotypical
NO INTERVENTION: 30 neurotypical children aged 6-10, from regular education, matched in cognitive and linguistic abilities.
Interventions
EmotiPlay is a computer-based intervention program designed to enhance emotion recognition (ER) by addressing multiple modalities of emotional cues (facial expressions, tone of voice, body language).
Eligibility Criteria
You may qualify if:
- autism spectrum condition
You may not qualify if:
- Verbal Intelligence (according to Wechsler) 3 or lower.
Contact the study team to confirm eligibility.
Sponsors & Collaborators
Study Sites (1)
Bar Ilan University
Ramat Gan, 5290002, Israel
Related Publications (3)
Fridenson-Hayo S, Berggren S, Lassalle A, Tal S, Pigat D, Bolte S, Baron-Cohen S, Golan O. Basic and complex emotion recognition in children with autism: cross-cultural findings. Mol Autism. 2016 Dec 19;7:52. doi: 10.1186/s13229-016-0113-9. eCollection 2016. PMID: 28018573
Golan O, Baron-Cohen S, Golan Y. The 'Reading the Mind in Films' Task [child version]: complex emotion and mental state recognition in children with and without autism spectrum conditions. J Autism Dev Disord. 2008 Sep;38(8):1534-41. doi: 10.1007/s10803-007-0533-7. Epub 2008 Feb 29. PMID: 18311514
Macrostructure, microstructure, and mental state terms in the narratives of English-Hebrew bilingual preschool children with and without specific language impairment. Applied Psycholinguistics, 37(1), 165-193.
Study Design
- Study Type
- interventional
- Phase
- not applicable
- Allocation
- RANDOMIZED
- Masking
- TRIPLE
- Who Masked
- PARTICIPANT, CARE PROVIDER, OUTCOMES ASSESSOR
- Purpose
- SUPPORTIVE CARE
- Intervention Model
- PARALLEL
- Sponsor Type
- OTHER
- Responsible Party
- PRINCIPAL INVESTIGATOR
- PI Title
- Prof. Ofer Golan
Study Record Dates
First Submitted
February 9, 2023
First Posted
June 15, 2023
Study Start
March 1, 2022
Primary Completion
November 1, 2023
Study Completion
December 1, 2023
Last Updated
June 15, 2023
Record last verified: 2023-06
Data Sharing
- IPD Sharing
- Will not share