Emma R. Dear; Bryce D. McLeod; Nicole M. Peterson; Kevin S. Sutherland; Michael D. Broda; Alex R. Dopp; Aaron R. Lyon – Grantee Submission, 2024
Introduction: Due to usability, feasibility, and acceptability concerns, observational treatment fidelity measures are often challenging to deploy in schools. Teacher self-report fidelity measures with specific design features might address some of these barriers. This case study outlines a community-engaged, iterative process to adapt the…
Descriptors: Measures (Individuals), Data Collection, Observation, Learning Analytics
Tong Li; Sarah D. Creer; Tracy Arner; Rod D. Roscoe; Laura K. Allen; Danielle S. McNamara – Grantee Submission, 2022
Automated writing evaluation (AWE) tools can facilitate teachers' analysis of and feedback on students' writing. However, increasing evidence indicates that writing instructors experience challenges in implementing AWE tools successfully. For this reason, our development of the Writing Analytics Tool (WAT) has employed a participatory approach…
Descriptors: Automation, Writing Evaluation, Learning Analytics, Participatory Research
Vincent Aleven; Jori Blankestijn; LuEttaMae Lawrence; Tomohiro Nagashima; Niels Taatgen – Grantee Submission, 2022
Past research has yielded ample knowledge regarding the design of analytics-based tools for teachers and has found beneficial effects of several tools on teaching and learning. Yet there is relatively little knowledge regarding the design of tools that support teachers when a class of students uses AI-based tutoring software for self-paced…
Descriptors: Educational Technology, Artificial Intelligence, Problem Solving, Intelligent Tutoring Systems
Kenneth Holstein; Bruce M. McLaren; Vincent Aleven – Grantee Submission, 2019
Involving stakeholders throughout the creation of new educational technologies can help ensure their usefulness and usability in real-world contexts. However, given the complexity of learning analytics (LA) systems, it can be challenging to meaningfully involve non-technical stakeholders throughout their design and development. This article…
Descriptors: Learning Analytics, Technology Uses in Education, Artificial Intelligence, Stakeholders
Kenneth Holstein; Bruce M. McLaren; Vincent Aleven – Grantee Submission, 2017
Intelligent tutoring systems (ITSs) are commonly designed to enhance student learning. However, they are not typically designed to meet the needs of teachers who use them in their classrooms. ITSs generate a wealth of analytics about student learning and behavior, opening a rich design space for real-time teacher support tools such as dashboards.…
Descriptors: Intelligent Tutoring Systems, Technology Integration, Educational Technology, Middle School Teachers