The finding that linguistic brain activity in Deaf users of American Sign Language (ASL) occurs in the same core language regions of the brain as spoken language (Petitto et al., 2000) has advanced our understanding of how brain structure supports language; however, much remains to be learned from the differences in brain activity between signing and speaking populations. Dan et al. (2013) found that the distinct phonological units of Japanese, compared with English and other "alphabet-based" languages, were associated with different patterns of brain activation during verbal fluency tasks. Their results provide comparable data for future research on other neurological differences and a baseline for diagnosing verbal impairments associated with psychological and neurological disorders—such as schizophrenia, Alzheimer's disease, autism, and attention deficit disorders—specific to Japanese speakers.
The lack of comparable data for users of non-alphabet-based languages is problematic for members of those language communities who need such diagnoses. Sign language, like Japanese, departs from alphabet-based phonological structure: because it is produced in the manual mode, its phonological components are manual and motion-based. The corresponding differences in brain activity for users of American Sign Language have yet to be identified, and that is what this study seeks to find.
Dan et al. (2013) did not examine how these neurological differences might vary for Japanese bilinguals, nor did they collect data from speakers of alphabet-based languages. In this study, which focuses on ASL populations, we will fit each participant with an fNIRS cap to record the locations of brain activation, following the method of Dan et al. (2013). The verbal fluency tasks used by Dan et al. (2013) will be adapted to the phonological parameters of sign language and administered to Deaf native ASL users and to hearing bilingual ASL users (cf. Kovelman et al., 2009). As a control, the tasks will also be administered in English to hearing English speakers. The results will not only begin a database for comparing neurological verbal fluency results in Deaf ASL-using populations, but will also provide a foundation for future research on differences in processing ASL versus English.
Dan, H., Dan, I., Sano, T., Kyutoku, Y., Oguro, K., Yokota, H., Tsuzuki, D., & Watanabe, E. (2013). Language-specific cortical activation patterns for verbal fluency tasks in Japanese as assessed by multichannel functional near-infrared spectroscopy. Brain and Language, 126(2), 208–216. https://doi.org/10.1016/j.bandl.2013.05.007
Kovelman, I., Shalinsky, M. H., White, K. S., Schmitt, S. N., Berens, M. S., Paymer, N., & Petitto, L.-A. (2009). Dual language use in sign-speech bimodal bilinguals: fNIRS brain-imaging evidence. Brain and Language, 109(2–3), 112–123. https://doi.org/10.1016/j.bandl.2008.09.008
Petitto, L. A., Zatorre, R. J., Gauna, K., Nikelski, E. J., Dostie, D., & Evans, A. C. (2000). Speech-like cerebral activity in profoundly deaf people processing signed languages: Implications for the neural basis of human language. Proceedings of the National Academy of Sciences, 97(25), 13961–13966. https://doi.org/10.1073/pnas.97.25.13961