
Conversation During a Virtual Reality Task Reveals New Structural Language Profiles of Children with ASD, ADHD, and Comorbid Symptoms of Both

  • Original Paper
  • Published in: Journal of Autism and Developmental Disorders

Abstract

Many studies have used standardized measures and storybook narratives to characterize the language profiles of children with Autism Spectrum Disorder (ASD) and Attention-Deficit/Hyperactivity Disorder (ADHD). These studies report that the structural language of such children is on par with that of mental-age-matched typically developing (TD) peers, yet few have examined structural language in conversational contexts. The present study examines conversational speech produced during a virtual reality (VR) paradigm to investigate the strengths and weaknesses of these children's structural language abilities. The VR paradigm introduced varying social and cognitive demands across phases. Our results indicate that children in these diagnostic groups produced less complex structural language than TD children. Moreover, language complexity decreased across phases in all groups, suggesting a cross-etiology sensitivity to conversational context.



Acknowledgments

We thank all of the children and their families who participated in this research, as well as the dedicated students who transcribed the audio files. This research was supported by grants from the National Institute on Deafness and Other Communication Disorders (NIDCD R01DC016665) and the Institute of Education Sciences (IES R324A110174).

Author information

Authors and Affiliations

Authors

Contributions

PM and NM contributed to the study design and development. Data collection was performed by the UC Davis MIND Institute team (PM and NM). Data preparation was completed by NA and CB, and analyses were performed by CB, both in consultation with LN. Manuscript writing was led by CB and LN. All authors contributed to the manuscript writing, revisions, and approval of the final manuscript submitted.

Corresponding author

Correspondence to Cynthia Boo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Boo, C., Alpers-Leon, N., McIntyre, N. et al. Conversation During a Virtual Reality Task Reveals New Structural Language Profiles of Children with ASD, ADHD, and Comorbid Symptoms of Both. J Autism Dev Disord 52, 2970–2983 (2022). https://doi.org/10.1007/s10803-021-05175-6
