Showing all 6 results
Peer reviewed
Wang, Wentao; Vong, Wai Keen; Kim, Najoung; Lake, Brenden M. – Cognitive Science, 2023
Neural network models have recently made striking progress in natural language processing, but they are typically trained on orders of magnitude more language input than children receive. What can these neural networks, which are primarily distributional learners, learn from a naturalistic subset of a single child's experience? We examine this…
Descriptors: Brain Hemisphere Functions, Linguistic Input, Longitudinal Studies, Self Concept
Peer reviewed
Sakine Çabuk-Balli; Jekaterina Mazara; Aylin C. Küntay; Birgit Hellwig; Barbara B. Pfeiler; Paul Widmer; Sabine Stoll – Cognitive Science, 2025
Negation is a cornerstone of human language and one of the few universals found in all languages. Without negation, neither categorization nor efficient communication would be possible. Languages, however, differ remarkably in how they express negation. It is still largely unknown how the way negation is marked influences the acquisition process of…
Descriptors: Morphemes, Native Language, Language Acquisition, Infants
Peer reviewed
Valentini, Alessandra; Serratrice, Ludovica – Cognitive Science, 2021
Strong correlations between vocabulary and grammar are well attested in language development in monolingual and bilingual children. What is less clear is whether there is any directionality in the relationship between the two constructs, whether it is predictive over time, and the extent to which it is affected by language input. In the present…
Descriptors: Bilingualism, Correlation, English (Second Language), Second Language Learning
Peer reviewed
Ouyang, Long; Boroditsky, Lera; Frank, Michael C. – Cognitive Science, 2017
Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of…
Descriptors: Semiotics, Computational Linguistics, Syntax, Semantics
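The "quantitatively similar patterns" referred to above are co-occurrence statistics: words that appear in similar linguistic contexts end up with similar context vectors. Below is a minimal illustrative sketch of that general idea, not code from the paper; the toy corpus, the stopword list, and the window size are invented here for demonstration.

```python
# Sketch of distributional similarity from co-occurrence counts.
# Toy corpus, stopword list, and window size are illustrative assumptions,
# not taken from Ouyang, Boroditsky & Frank (2017).
from collections import Counter
from math import sqrt

STOPWORDS = {"the", "a", "at"}

corpus = [
    "the postman delivered the mail today".split(),
    "the mailman delivered the mail yesterday".split(),
    "the postman rang the doorbell twice".split(),
    "the mailman rang the doorbell once".split(),
    "the dog barked at the cat loudly".split(),
    "the dog chased the cat yesterday".split(),
]

def context_vector(target, sentences, window=2):
    """Count content words occurring within `window` positions of `target`."""
    counts = Counter()
    for sent in sentences:
        for i, word in enumerate(sent):
            if word != target:
                continue
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i and sent[j] not in STOPWORDS:
                    counts[sent[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

postman = context_vector("postman", corpus)
mailman = context_vector("mailman", corpus)
dog = context_vector("dog", corpus)

print(round(cosine(postman, mailman), 2))  # high: near-identical contexts in this toy corpus
print(round(cosine(postman, dog), 2))      # low: no shared content-word contexts
```

In this toy setup "postman" and "mailman" come out as highly similar simply because they occur in overlapping contexts, with no semantic knowledge supplied to the learner.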
Peer reviewed
Janciauskas, Marius; Chang, Franklin – Cognitive Science, 2018
Language learning requires linguistic input, but several studies have found that knowledge of second language (L2) rules does not seem to improve with more language exposure (e.g., Johnson & Newport, 1989). One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we…
Descriptors: Linguistic Input, Second Language Learning, Age Differences, Syntax
Peer reviewed
Ambridge, Ben; Rowland, Caroline F.; Pine, Julian M. – Cognitive Science, 2008
According to Crain and Nakayama (1987), when forming complex yes/no questions, children do not make errors such as "Is the boy who smoking is crazy?" because they have innate knowledge of "structure dependence" and so will not move the auxiliary from the relative clause. However, simple recurrent networks are also able to avoid…
Descriptors: Children, Language Processing, Language Patterns, Linguistic Input