Showing 1 to 15 of 32 results
Peer reviewed
Direct link
Hipólito-Nájera, A. Ricardo; Moya-Hernandez, M. Rosario; Gomez-Balderas, Rodolfo; Rojas-Hernandez, Alberto; Romero-Romo, Mario – Journal of Chemical Education, 2017
Validation of analytical methods is a fundamental subject for chemical analysts working in chemical industries. These methods are also relevant for pharmaceutical enterprises, biotechnology firms, analytical service laboratories, government departments, and regulatory agencies. Therefore, for undergraduate students enrolled in majors in the field…
Descriptors: Program Validation, Evaluation Methods, Chemistry, Laboratory Experiments
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) evaluates research studies that look at the effectiveness of education programs, products, practices, and policies, which the WWC calls "interventions." Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal…
Descriptors: Program Design, Program Development, Program Effectiveness, Program Evaluation
Peer reviewed
PDF on ERIC Download full text
Richer, Amanda; Charmaraman, Linda; Ceder, Ineke – Afterschool Matters, 2018
Like instruments used in afterschool programs to assess children's social and emotional growth or to evaluate staff members' performance, instruments used to evaluate program quality should be free from bias. Practitioners and researchers alike want to know that assessment instruments, whatever their type or intent, treat all people fairly and do…
Descriptors: Cultural Differences, Social Bias, Interrater Reliability, Program Evaluation
Peer reviewed
Direct link
Gargani, John; Strong, Michael – Journal of Teacher Education, 2015
In Gargani and Strong (2014), we describe The Rapid Assessment of Teacher Effectiveness (RATE), a new teacher evaluation instrument. Our account of the validation research associated with RATE inspired a review by Good and Lavigne (2015). Here, we reply to the main points of their review. We elaborate on the validity, reliability, theoretical…
Descriptors: Evidence, Teacher Effectiveness, Teacher Evaluation, Evaluation Methods
Peer reviewed
PDF on ERIC Download full text
Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr. – Information Systems Education Journal, 2013
This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…
Descriptors: Database Management Systems, Evaluation Methods, Graduate Students, Information Systems
Peer reviewed
Direct link
Gormally, Cara; Brickman, Peggy; Lutz, Mary – CBE - Life Sciences Education, 2012
Life sciences faculty agree that developing scientific literacy is an integral part of undergraduate education and report that they teach these skills. However, few measures of scientific literacy are available to assess students' proficiency in using scientific literacy skills to solve scenarios in and beyond the undergraduate biology classroom.…
Descriptors: Testing, Biology, Undergraduate Study, Educational Change
Peer reviewed
Direct link
Dunlap, Laurie A. – MathAMATYC Educator, 2012
This article describes how to design program assessment for mathematics departments, in two-year and four-year colleges across the Midwest, based on a set of components that was generated from a Delphi survey. An example is provided to illustrate how this was done at a small four-year college. There is an alignment between these components and a…
Descriptors: Mathematics Instruction, Program Evaluation, Program Design, Research Design
Peer reviewed
Direct link
Coryn, Chris L. S. – Journal of MultiDisciplinary Evaluation, 2007
The author discusses validation hierarchies grounded in the tradition of quantitative research that generally consists of the criteria of validity, reliability and objectivity and compares this with similar criteria developed by the qualitative tradition, described as trustworthiness, dependability and confirmability. Although these quantitative…
Descriptors: Research Methodology, Statistical Analysis, Value Judgment, Qualitative Research
Peer reviewed
Direct link
Febey, Karen; Coyne, Molly – American Journal of Evaluation, 2007
The field of program evaluation lacks interactive teaching tools. To address this pedagogical issue, the authors developed a collaborative learning technique called Program Evaluation: The Board Game. The authors present the game and its development in this practitioner-oriented article. The evaluation board game is an adaptable teaching tool…
Descriptors: Teaching Methods, Program Evaluation, Evaluation Methods, Cooperative Learning
Peer reviewed
Direct link
Li, Jianghong; D'Angiulli, Amedeo; Kendall, Garth E. – Early Years: An International Journal of Research and Development, 2007
The Early Development Index (EDI) is a teacher-completed checklist, intended to be a population-level tool to measure children's readiness for school and to alert communities to potential developmental problems in children. In response to the increasing popularity of the EDI, this paper provides a critical and timely evaluation and identifies the…
Descriptors: Check Lists, Developmental Disabilities, Psychometrics, School Readiness
Southwest Educational Development Lab., Austin, TX. – 1979
This report documents the proceedings of a conference on the validation of exemplary educational programs, products, and practices presented by the Regional Exchange of the Southwest Educational Development Laboratory (SEDL/RX). It is different from other conference syntheses, as it includes the process for creating the conference as well as the…
Descriptors: Conferences, Evaluation Methods, Guides, Program Validation
Peer reviewed
Direct link
Pass, Kenneth; Green, Nancy S.; Lorey, Fred; Sherwin, John; Comeau, Anne Marie – Mental Retardation and Developmental Disabilities Research Reviews, 2006
The term "pilot study" has been used over the years to describe the evaluation of the many elements involved in deciding whether a proposed condition should be added to a newborn screening (NBS) panel, and until recently, was unilaterally used to describe the evaluation of the assay to be used before the condition was officially adopted by a state…
Descriptors: Pilot Projects, Neonates, Screening Tests, Infant Care
Peer reviewed
Direct link
Timmins, Paul; Bham, Mohammed; McFadyen, Jane; Ward, Joanna – Educational Psychology in Practice, 2006
In this article we describe how the RADIO process enabled EPiTs (educational psychologists in training) to negotiate research with an EPS (educational psychology service) around its desire to evaluate and develop its consultation work with schools. Findings of the evaluation and their implications for the Service are described and the potential of RADIO as a tool for providing external research support from HEIs (higher education institutions) for…
Descriptors: Research and Development, Evaluation Methods, Educational Psychology, Action Research
Peer reviewed
Miller, Allen H. – Studies in Higher Education, 1984
Possible reasons for conducting evaluations of college and university courses are outlined, and the evaluation system developed at the Australian National University is described. Sources of information for teachers to use in making course improvements are suggested. (MSE)
Descriptors: Course Evaluation, Curriculum Development, Data Collection, Evaluation Methods
Peer reviewed
Direct link
Stapleton, Paul; Helms-Park, Rena – English for Specific Purposes, 2006
This paper introduces the Website Acceptability Tiered Checklist (WATCH), a preliminary version of a multi-trait scale that could be used by instructors and students to assess the quality of websites chosen as source materials in students' research papers in a Humanities program. The scale includes bands for assessing: (i) the authority and…
Descriptors: Information Sources, Web Sites, English for Academic Purposes, Check Lists