The speaker, from San Diego State University, highlights pride in being part of its faculty.
Focus on research and promotion of sign languages.
Aim to discuss what the study of sign languages reveals about human languages and the brain.
Overview of Sign Languages
Examples of sign languages globally:
American Sign Language (ASL), featured in research on language and the brain.
Sign language of the Netherlands, with research on dialectal variation.
Libras (Brazilian sign language) research by Diane Lillo-Martin and Ronice Quadros on language acquisition.
Linguistic studies examining the typology of sign languages worldwide.
Sign languages offer insights into universal aspects of human language.
Importance of theories that encompass both spoken and signed languages.
Perceptual Systems and Language
Differences in auditory (spoken) and visual (sign) language processing:
Auditory systems excel in fast temporal changes; spoken languages have linear structure.
Visual systems process information simultaneously, leading to simultaneous structures in sign languages.
Example: facial expressions conveying information concurrently with signs.
Sign Language Production and Perception
Sign languages produced through visible articulators (hands), unlike speech.
Visual nature allows for iconic representation of actions.
Differences in expressing spatial relationships between sign and spoken languages.
Key Questions Raised by Sign Language Studies
Do all human languages represent meaning (semantics) independently from form (phonology)?
Relationship between language and pantomime: how the brain distinguishes between them.
Impact of biological language expression on neural substrates for spatial language.
Iconicity and Meaning Representation
Iconicity in sign languages where signs resemble their meaning:
Examples from ASL: signs for hairbrush, ball, Scotland, and brain.
Question of whether semantics and form are conflated in sign languages.
Study on 'tip-of-the-tongue' (TOT) experiences:
Diary study showed signers experience 'tip-of-the-finger' (TOF) states.
Elicitation tasks demonstrated partial retrieval of sign components.
Evidence suggests a separation between semantics and form in sign languages.
Brain Processing of Sign Language vs. Pantomime
Examination of Broca’s and Wernicke’s areas in sign language processing:
Both areas active in sign language production and comprehension.
Sign language engages similar brain regions as spoken language, indicating these are language regions, not speech-specific.
Differences in brain activity for language vs. pantomime:
Sign language activates Broca’s area, whereas pantomimes activate superior parietal lobule.
Spatial Language in Sign Languages
Sign languages use spatial location of signs to convey spatial relationships.
No sign language uses prepositions or locative affixes the way spoken languages do.
Brain studies show different activations for spatial expressions in sign language:
Bilateral superior parietal cortex involved in spatial location expression.
For object representation, inferior temporal cortex and Broca’s area engaged.
Conclusion
Iconicity does not alter the fundamental organization of language or brain circuits.
Spatial language processing in sign languages relies on different neural mechanisms than spoken languages due to the visual-spatial nature of signing.
Appreciation expressed for study participants, along with a brief mention of ongoing lab research, including bilingualism and reading in the deaf brain.
The lecture concludes with acknowledgments to collaborators, funding agencies, and an invitation to a later reception.