Showing 1 to 15 of 1,618 results
Peer reviewed
Tammy L. Stephens; Pedro Olvera; Edward K. Schultz – Contemporary School Psychology, 2024
This article positions the core-selective evaluation process (C-SEP), a pattern of strengths and weaknesses (PSW) model of identifying specific learning disabilities (SLD), within guidelines and best practices recommended for assessing English learners. C-SEP is a broad approach that utilizes multiple sources of data to establish underachievement,…
Descriptors: English Language Learners, Learning Disabilities, Evaluation Methods, Best Practices
Peer reviewed
A. M. Sadek; Fahad Al-Muhlaki – Measurement: Interdisciplinary Research and Perspectives, 2024
In this study, the accuracy of the artificial neural network (ANN) was assessed considering the uncertainties associated with the randomness of the data and the lack of learning. The Monte-Carlo algorithm was applied to simulate the randomness of the input variables and evaluate the output distribution. It has been shown that under certain…
Descriptors: Monte Carlo Methods, Accuracy, Artificial Intelligence, Guidelines
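The abstract above describes propagating input randomness through a trained network via Monte-Carlo sampling. A minimal sketch of that idea, using a stand-in function in place of a real ANN (the model, input distribution, and parameters here are all hypothetical, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Stand-in for a trained ANN's forward pass (hypothetical).
    return np.tanh(1.5 * x[..., 0] - 0.5 * x[..., 1])

# Monte-Carlo propagation: sample the input distribution many times
# and examine the resulting spread of the model's outputs.
n_samples = 10_000
x = rng.normal(loc=[0.0, 1.0], scale=[0.2, 0.3], size=(n_samples, 2))
y = model(x)

print(f"output mean   = {y.mean():.3f}")
print(f"output stddev = {y.std():.3f}")  # uncertainty induced by input randomness
```

The output standard deviation here reflects only the randomness of the inputs; the study additionally considers uncertainty from incomplete learning, which this sketch does not model.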
Peer reviewed
André Seixas de Novais; José Alexandre Matelli; Messias Borges Silva – International Journal of Artificial Intelligence in Education, 2024
This research aims to present a Fuzzy Expert System with psychologist expertise that seeks to assist professors, researchers and educational institutions in assessing the level of incorporation of students' Soft Skills while attending Active Learning sessions. The difficulties encountered by higher education institutions, researchers and…
Descriptors: Soft Skills, Active Learning, Teaching Methods, Evaluation Methods
Peer reviewed
Göran Lövestam; Susanne Bremer-Hoffmann; Koen Jonkers; Pieter van Nes – Research Ethics, 2025
The Joint Research Centre (JRC) is the European Commission's in-house science and knowledge service, employing a substantial staff of scientists devoted to conducting research to provide independent scientific advice for EU policy. Focussed on various research areas aligned with EU priorities, the JRC excels in delivering scientific evidence for…
Descriptors: Integrity, Ethics, Scientific Research, Scientists
Peer reviewed
Björn Hammarfelt; Claes-Fredrik Helgesson; Gustaf Nelhans; Erik Joelsson – Research Evaluation, 2024
Disciplines display field-specific ways of valuing research contributions, and these different 'styles of valuation' influence how academic careers are assessed and formed. Yet, differences in how research is evaluated are also prevalent between different levels of assessment: collegial and organizational. Consequently, we employ a multifaceted…
Descriptors: Evaluation Methods, Guidelines, College Faculty, Humanities
Peer reviewed
James Ohisei Uanhoro – Educational and Psychological Measurement, 2024
Accounting for model misspecification in Bayesian structural equation models is an active area of research. We present a uniquely Bayesian approach to misspecification that models the degree of misspecification as a parameter--a parameter akin to the correlation root mean squared residual. The misspecification parameter can be interpreted on its…
Descriptors: Bayesian Statistics, Structural Equation Models, Simulation, Statistical Inference
Peer reviewed
PDF on ERIC
Tan, Teck Kiang – Practical Assessment, Research & Evaluation, 2023
Researchers comparing group means often have hypotheses concerning the state of affairs in the population from which they sampled their data. The classical frequentist approach provides one way of carrying out hypothesis testing using ANOVA to state the null hypothesis that there is no difference in the means and proceed with multiple comparisons…
Descriptors: Comparative Analysis, Hypothesis Testing, Statistical Analysis, Guidelines
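The classical frequentist procedure the abstract contrasts against can be sketched in a few lines: a one-way ANOVA testing the null hypothesis that all group means are equal (the data here are synthetic, for illustration only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Three groups with different true means (synthetic, hypothetical data).
g1 = rng.normal(10.0, 2.0, size=30)
g2 = rng.normal(12.0, 2.0, size=30)
g3 = rng.normal(10.5, 2.0, size=30)

# Classical one-way ANOVA: null hypothesis is that all group means are equal.
f_stat, p_value = stats.f_oneway(g1, g2, g3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A small p-value would reject the omnibus null and typically be followed by multiple comparisons; the article's point is that informative hypotheses can instead be tested directly.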
Matthew John Davidson – ProQuest LLC, 2022
Digitally-based assessments create opportunities for collecting moment to moment information about how students are responding to assessment items. This information, called log or process data, has long been regarded as a vast and valuable source of data about student performance. Despite repeated assurances of its vastness and value, process data…
Descriptors: Data Use, Psychometrics, Item Response Theory, Test Items
Danita K. Ladson – ProQuest LLC, 2024
The United States Chairman of the Joint Chiefs of Staff declared critical thinking as a capability imperative for Force XXI. He directed Military Services to develop critical thinkers. There is limited evidence of success in terms of developing critical thinking abilities. Specifically, the U.S. Army lacks a single, generally accepted, codified…
Descriptors: Military Personnel, Thinking Skills, Critical Thinking, Armed Forces
Peer reviewed
Tora Storesund Tremoen; Pål Lagestad – Sport, Education and Society, 2025
In 2020, a new curriculum, Kunnskapsløftet 2020 (LK20), was introduced in Norwegian schools. This study investigated how physical education (PE) teachers in upper secondary schools have changed their assessment practice following the introduction of LK20. To achieve this, nine individual in-depth interviews were conducted with PE teachers from six…
Descriptors: Foreign Countries, Physical Education, Physical Education Teachers, Teacher Attitudes
Peer reviewed
Oscar Clivio; Avi Feller; Chris Holmes – Grantee Submission, 2024
Reweighting a distribution to minimize a distance to a target distribution is a powerful and flexible strategy for estimating a wide range of causal effects, but can be challenging in practice because optimal weights typically depend on knowledge of the underlying data generating process. In this paper, we focus on design-based weights, which do…
Descriptors: Evaluation Methods, Causal Models, Error of Measurement, Guidelines
Peer reviewed
Guo, Wenjing; Choi, Youn-Jeng – Educational and Psychological Measurement, 2023
Determining the number of dimensions is extremely important in applying item response theory (IRT) models to data. Traditional and revised parallel analyses have been proposed within the factor analysis framework, and both have shown some promise in assessing dimensionality. However, their performance in the IRT framework has not been…
Descriptors: Item Response Theory, Evaluation Methods, Factor Analysis, Guidelines
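Traditional parallel analysis, one of the methods the study evaluates, compares observed eigenvalues against those of random data of the same shape. A minimal sketch of Horn's procedure (the data-generating setup below is hypothetical, not from the study, and real IRT applications would operate on item-response matrices):

```python
import numpy as np

rng = np.random.default_rng(2)

def parallel_analysis(data, n_iter=200):
    """Horn's parallel analysis (minimal sketch): retain dimensions whose
    observed correlation-matrix eigenvalues exceed the mean eigenvalues
    obtained from random data of the same shape."""
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_iter):
        rand = rng.standard_normal((n, p))
        rand_eig += np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]
    rand_eig /= n_iter
    return int(np.sum(obs_eig > rand_eig))

# Synthetic two-factor data (hypothetical example).
n, p = 500, 6
factors = rng.standard_normal((n, 2))
loadings = rng.uniform(0.5, 0.9, size=(2, p))
data = factors @ loadings + 0.5 * rng.standard_normal((n, p))
print("retained dimensions:", parallel_analysis(data))
```

The study's question is how well such procedures, developed in the factor-analysis framework, carry over to dimensionality assessment under IRT models.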
Victoria L. Bernhardt – Eye on Education, 2025
With the 5th Edition of Data Analysis for Continuous School Improvement, best-selling author Victoria Bernhardt has written the go-to resource for data analysis in your school! By incorporating collaborative structures to implement, monitor, and evaluate the vision and continuous improvement plan, this book provides a framework to show learning…
Descriptors: Learning Analytics, Data Analysis, Educational Improvement, Evaluation Methods
Peer reviewed
Paulina Lehmkuhl; Benedikt Wagner; Stefanie Frisch; Dominik Rumlich; Judith Visser – TESOL Journal, 2025
The use of digital tools for second and foreign language lexical learning is increasingly popular and research in this area is constantly expanding. However, little has been written about specific criteria that could be used to identify tools with high-quality lexical input, as available checklists and frameworks for digital media tend to neglect…
Descriptors: Check Lists, Second Language Learning, Second Language Instruction, Pilot Projects
Peer reviewed
PDF on ERIC
Daisy Imbaquingo; Javier Díaz; José Jácome – Journal of Technology and Science Education, 2024
Higher Education Institutions (HEIs) need a specialized computer audit method to minimize quality and security risks and facilitate institutional evaluation and accreditation. This study aimed to develop a Computer Audit Method for HEIs (MAIIES) providing methodological support for the computer audit process. The MAIIES method includes planning,…
Descriptors: Computer Security, Audits (Verification), Information Security, Risk