Humans have a remarkable ability to imagine. Throughout the history of cognitive science, scholars have wondered about the nature of imagery: what does it mean to perceive something that isn't actually there? How does that perception compare to the perception you would have if it were there? And, most importantly, how can we know? The majority of imagery studies have focused on the visual modality, asking whether people generate picture-like mental representations. My work, in contrast, explores auditory imagery through the lens of human language.
To study the psychological representations of auditory imagery, I investigate the sound representations generated during silent reading. Language presents an excellent case study for auditory imagery in that it can be transmitted via two modalities: vision (reading) and audition (listening to speech). In my lab, we compare the cognitive processing of language presented in the visual modality with the processing of the same messages presented in the auditory modality. Similarities between these processes suggest similarities between auditory imagery and auditory perception.
In one series of studies, I demonstrated that readers had difficulty when they read words that did not conform to an overall imagined sentence rhythm (e.g., they were expecting PREsent and they read preSENT). In another, I observed reading slowdowns when readers encountered CAPS on a word that would not be accented if spoken aloud. I argue from these results that readers experience auditory imagery during reading: silent reading activates sound representations of words and sentences. Moreover, my data demonstrate that this imagery affects language understanding in real time; the auditory images themselves have consequences for comprehension as it unfolds.
In current work, I am investigating how auditory imagery during reading resembles auditory perception. Planned studies will adapt methods developed to investigate visual imagery, which have demonstrated processing overlap between imagery and perception. In one series of studies, I am recording event-related potentials (ERPs) from the scalps of participants who are reading or listening to spoken language. I will compare these datasets to identify overlap in the neural signal between reading and listening; moreover, building on prior ERP studies of perception and memory, I will explore how these basic cognitive processes might contribute to imagery. In a second series of studies, I am investigating individual differences in auditory imagery during silent reading. This work builds on demonstrated correlations between participants' self-reported imagery vividness and the strength of imagery-contingent behavioral effects.
By exploring the similarities between perception and imagery, research in my lab addresses foundational questions in cognitive science about the nature of mental representations. In addition, the proposed research program will have applications for reading instruction. Results from these studies may shed light on the established yet unexplained relationship between children's fluency in reading aloud and their silent reading comprehension. It may be the case that beginning readers who read aloud with appropriate prosody (that is, with phrasing and accents that reflect the meaning of the sentence) also read silently with appropriate prosody, thereby enhancing their understanding of syntax and semantics and their overall sentence comprehension.