Research

“The most violent element in society is ignorance.” -Emma Goldman


Research in the SLAB Lab focuses on the cognitive and neural processes supporting speech perception and language comprehension, as well as how these processes are affected in stroke-induced and primary progressive aphasia. Current projects use electroencephalography (EEG), natural language processing, and machine learning to examine neural encoding of acoustic and linguistic features of naturalistic, continuous speech (e.g., audiobooks, podcasts) in individuals with and without aphasia.

In addition, the SLAB Lab has an ongoing collaboration with the Noninvasive Brain-Machine Interface Systems Lab (PI: Pepe Contreras-Vidal, Ph.D.) to develop a brain-computer interface (BCI) for decoding spoken language. The long-term goal of this project is a BCI capable of decoding imagined language, which can be used as assistive technology for individuals who cannot produce speech on their own.

Research projects in development aim to improve speech perception and language comprehension in individuals with aphasia using an app-based speech-language intervention and transcranial alternating current stimulation.