Showing 1 to 15 of 130 results
Peer reviewed
Stefan E. Huber; Kristian Kiili; Steve Nebel; Richard M. Ryan; Michael Sailer; Manuel Ninaus – Educational Psychology Review, 2024
This perspective piece explores the transformative potential and associated challenges of large language models (LLMs) in education and how those challenges might be addressed through playful and game-based learning. While providing many opportunities, the stochastic elements incorporated in how present LLMs process text require domain…
Descriptors: Artificial Intelligence, Language Processing, Models, Play
Peer reviewed
Frank, Stefan L. – Language Learning, 2021
Although computational models can simulate aspects of human sentence processing, research on this topic has remained almost exclusively limited to the single language case. The current review presents an overview of the state of the art in computational cognitive models of sentence processing, and discusses how recent sentence-processing models…
Descriptors: Multilingualism, Language Processing, Computational Linguistics, Psycholinguistics
Peer reviewed
Michelle Pauley Murphy; Woei Hung – TechTrends: Linking Research and Practice to Improve Learning, 2024
Constructing a consensus problem space from extensive qualitative data for an ill-structured real-life problem and expressing the result to a broader audience is challenging. To effectively communicate a complex problem space, visualization of that problem space must elucidate inter-causal relationships among the problem variables. In this…
Descriptors: Information Retrieval, Data Analysis, Pattern Recognition, Artificial Intelligence
Peer reviewed
Thornton, Chris – Cognitive Science, 2021
Semantic composition in language must be closely related to semantic composition in thought. But the way the two processes are explained differs considerably. Focusing primarily on propositional content, language theorists generally take semantic composition to be a truth-conditional process. Focusing more on extensional content, cognitive…
Descriptors: Semantics, Cognitive Processes, Linguistic Theory, Language Usage
Peer reviewed
Mahowald, Kyle; Kachergis, George; Frank, Michael C. – First Language, 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also…
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition
Peer reviewed
Ehren Helmut Pflugfelder; Joshua Reeves – Journal of Technical Writing and Communication, 2024
The use of generative artificial intelligence (GAI) large language models has increased in both professional and classroom technical writing settings. One common response to student use of GAI is to increase surveillance, incorporating plagiarism detection services or banning certain composing activities from the classroom. This paper argues such…
Descriptors: Technical Writing, Artificial Intelligence, Supervision, Teaching Methods
Peer reviewed
Jamie Magrill; Barry Magrill – Teaching & Learning Inquiry, 2024
The rapid advancement of artificial intelligence technologies, exemplified by systems including OpenAI's ChatGPT, Microsoft's Bing AI, and Google's Bard (now Gemini 1.5 Pro), presents both challenges and opportunities for the academic world. Higher education institutions are at the forefront of preparing students for this evolving landscape. This…
Descriptors: Higher Education, Artificial Intelligence, Technological Advancement, Technology Integration
Peer reviewed
Schuler, Kathryn D.; Kodner, Jordan; Caplan, Spencer – First Language, 2020
In 'Against Stored Abstractions,' Ambridge uses neural and computational evidence to make his case against abstract representations. He argues that storing only exemplars is more parsimonious -- why bother with abstraction when exemplar models with on-the-fly calculation can do everything abstracting models can and more -- and implies that his…
Descriptors: Language Processing, Language Acquisition, Computational Linguistics, Linguistic Theory
Peer reviewed
Margaret A.L. Blackie – Teaching in Higher Education, 2024
Large language models such as ChatGPT can be seen as a major threat to reliable assessment in higher education. In this point of departure, I argue that these tools are a major game changer for society at large. Many of the jobs we now consider highly skilled are based on pattern recognition that can much more reliably be carried out by fine-tuned…
Descriptors: Artificial Intelligence, Synchronous Communication, Science and Society, Evaluation
Patience Stevens; David C. Plaut – Grantee Submission, 2022
The morphological structure of complex words impacts how they are processed during visual word recognition. This impact varies over the course of reading acquisition and for different languages and writing systems. Many theories of morphological processing rely on a decomposition mechanism, in which words are decomposed into explicit…
Descriptors: Written Language, Morphology (Languages), Word Recognition, Reading Processes
Peer reviewed
Stringer, David – Second Language Research, 2021
Westergaard (2021) presents an updated account of the Linguistic Proximity Model and the micro-cue approach to the parser as an acquisition device. The property-by-property view of transfer inherent in this approach contrasts with other influential models that assume that third language (L3) acquisition involves the creation of a full copy of only…
Descriptors: Transfer of Training, Linguistic Theory, Second Language Learning, Multilingualism
Peer reviewed
González-Bueno, Manuela – Applied Language Learning, 2021
A new technique for teaching language grammar is proposed. It consists of a blend of two previously existing techniques: VanPatten's (1996) Processing Instruction (PI) and Adair-Hauck and Donato's (2002) Presentation, Attention, Co-construct, and Extension (PACE) Model. The result is the S-PACE Model, which incorporates the whole-language…
Descriptors: Grammar, Second Language Learning, Second Language Instruction, Teaching Methods
Peer reviewed
Lieven, Elena; Ferry, Alissa; Theakston, Anna; Twomey, Katherine E. – First Language, 2020
During language acquisition children generalise at multiple layers of granularity. Ambridge argues that abstraction-based accounts suffer from lumping (over-general abstractions) or splitting (over-precise abstractions). Ambridge argues that the only way to overcome this conundrum is in a purely exemplar/analogy-based system in which…
Descriptors: Language Acquisition, Children, Generalization, Abstract Reasoning
Peer reviewed
Hoeben Mannaert, Lara; Dijkstra, Katinka – International Journal of Behavioral Development, 2021
Over the past decade or so, developments in language comprehension research in the domain of cognitive aging have converged on support for resilience in older adults with regard to situation-model updating when reading texts. Several studies have shown that even though age-related declines in language comprehension appear at the level of the…
Descriptors: Young Adults, Older Adults, Language Processing, Resilience (Psychology)
Peer reviewed
González Alonso, Jorge; Rothman, Jason – Second Language Research, 2021
In this commentary on Westergaard (2021), we focus on two main questions. The first, and most important, is what type of L3 data may be construed as supporting evidence--as opposed to a compatible outcome--for the Linguistic Proximity Model. In this regard, we highlight a number of areas in which it remains difficult to derive testable predictions…
Descriptors: Transfer of Training, Second Language Learning, Native Language, Linguistic Theory