Effects of Chinese sign language modality on processing sentences
YAO Dengfeng1,2,3, JIANG Minghu1,2, ABUDOUKELIMU Abulizi1,2
1. Lab of Computational Linguistics, School of Humanities, Tsinghua University, Beijing 100084, China;
2. Center for Psychology and Cognitive Science, Tsinghua University, Beijing 100084, China;
3. Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing 100101, China
Abstract: The effects of Chinese sign language (CSL) modality on sentence processing were investigated using the event-related potential method, focusing on how semantic and phonological information influence lexical access during sentence processing and on the timing of that influence. In the experimental design, the semantic expectancy and phonetic form of sentence-final target signs were manipulated during stimulus selection. Electroencephalogram data from 18 native CSL signers were recorded and analyzed. An early sign-perception effect (100-200 ms) reflected the mapping of primary visual features onto the phonological features of the signs. This was followed by two negativities, one associated with matching the phonological features of the target sign against contextual information (200-400 ms) and one with the dynamic construction of the semantic restrictions of the discourse context (400-650 ms), which promoted integration of the lexicon with the sentence context. Phonologically and semantically related signs shared a similar timing pattern, and the direction of the effects confirmed both the similarity of semantic and phonological property access and the comparable online processing costs of CSL and ordinary Chinese sentence comprehension. An N300 effect was found that uniquely indexes the successful processing of phonetic features in the sign language modality, representing a direct link between phonological feature recognition and sign recognition. The results show that signers prioritize the match between context and target sign over the phonological features of the signs, indicating an interaction between phonological and semantic information at the moment of lexical selection.
YAO Dengfeng, JIANG Minghu, ABUDOUKELIMU Abulizi. Effects of Chinese sign language modality on processing sentences. Journal of Tsinghua University(Science and Technology), 2016, 56(9): 942-948.