Peer reviewed
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Peer reviewed
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Peer reviewed
Friesen, Norm – Mind, Culture, and Activity, 2009
As an alternative to dominant cognitive-constructivist approaches to educational technology, this article makes the case for what has been termed a discursive, or postcognitive, psychological research paradigm. It does so by adapting discursive psychological analyses of conversational activity to the study of educational technology use. It applies…
Descriptors: Constructivism (Learning), Psychological Studies, Educational Technology, Psychology