Showing 1 to 15 of 16 results
Peer reviewed
Hall, Jori N.; Ryan, Katherine E. – Qualitative Inquiry, 2011
This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…
Descriptors: Methods Research, Evaluation Methods, Accountability, Case Studies
Friedman, Daniel B. – National Resource Center for the First-Year Experience and Students in Transition, 2012
"The First-Year Seminar: Designing, Implementing, and Assessing Courses to Support Student Learning and Success," a five-volume series, is designed to assist educators who are interested in launching a first-year seminar or revamping an existing program. Each volume examines a different aspect of first-year seminar design or…
Descriptors: First Year Seminars, Instructional Design, Instructional Development, Curriculum Implementation
OECD Publishing (NJ1), 2012
The "PISA 2009 Technical Report" describes the methodology underlying the PISA 2009 survey. It examines additional features related to the implementation of the project at a level of detail that allows researchers to understand and replicate its analyses. The reader will find a wealth of information on the test and sample design,…
Descriptors: Quality Control, Research Reports, Research Methodology, Evaluation Criteria
Peer reviewed
Caracelli, Valerie J.; Greene, Jennifer C. – Educational Evaluation and Policy Analysis, 1993
The following four integrative data analysis strategies for mixed-method evaluation designs are derived from and illustrated by empirical practice: (1) data transformation; (2) typology development; (3) extreme case analysis; and (4) data consolidation and merging. Use of these methods to realize the full potential of mixed-method approaches is…
Descriptors: Classification, Data Analysis, Evaluation Methods, Program Design
Peer reviewed
Hanes, Michael L. – Theory Into Practice, 1977
This paper (1) examines the problems of program design and implementation in relation to evaluation design, the confusion between research and evaluation design, and the overemphasis on quantitative data; (2) clarifies misconceptions about these problems and the evaluation process; and (3) suggests ways of avoiding these problems in the future.…
Descriptors: Data Analysis, Data Collection, Educational Problems, Educational Programs
Peer reviewed
Mangieri, John N.; Kemper, Richard E. – Action in Teacher Education, 1983
A model is proposed for gathering information about staff development efforts conducted by schools of education. The model can be used to evaluate staff development undertakings, report significant data, and help other institutions design comparable programs. (PP)
Descriptors: Data Analysis, Evaluation Methods, Higher Education, Inservice Teacher Education
Peer reviewed
Crosby, Jeanie – Journal of Staff Development, 1982
Six areas of staff development program evaluation in which participants may be meaningfully involved are: (1) clarification of the program's goals; (2) development of a design for the evaluation study; (3) development of methods for measurement; (4) analysis of information; (5) response to evaluation instruments; and (6) reporting on an evaluation…
Descriptors: Data Analysis, Educational Objectives, Evaluation Methods, Measurement Techniques
Peer reviewed
Carvalho, Soniya; White, Howard – American Journal of Evaluation, 2004
The theory-based evaluation approach documents the assumptions implicit in program design and points to the data required to test these assumptions. Collecting and analyzing such data through quantitative and qualitative techniques enhances understanding of the validity of the assumptions and the relevance of key program processes. This article…
Descriptors: Program Effectiveness, Program Design, Evaluation Methods, Social Theories
St. John, Mark – 1985
The choice of methods is part of the overall evaluation design process. The process consists of the following steps: (1) analyzing the problem context; (2) asking a few general questions; (3) selecting the methods (strategies) to use; and (4) selecting the specific techniques (tactics) to use. To operate successfully the evaluator needs to know…
Descriptors: Data Analysis, Data Collection, Evaluation Criteria, Evaluation Methods
Contract Research Corp., Belmont, MA. – 1975
A summary of the final report analyzing the Federal Bonding Program includes an overview, historical summary and analysis, summary of findings, conclusions, and recommendations. The bonding program provides fidelity bonding for individuals who are normally excluded from insurance policy bonding, including ex-offenders, enabling them to work at…
Descriptors: Data Analysis, Disadvantaged, Employment Services, Evaluation Methods
Smith, Allen G.; And Others – 1976
This interim report describes the development of program implementation and cost studies for Year II of the process evaluation of Project Developmental Continuity (PDC), a Head Start demonstration program aimed at providing educational and developmental continuity between children's Head Start and primary school experiences. Specific areas focused…
Descriptors: Charts, Cost Effectiveness, Data Analysis, Data Collection
Peer reviewed
Newell, Susan; Schoenike, Sumner L.; Lisko, Elaine A. – Journal of School Nursing, 2003
School nurses need to become more influential administrators, managers, and entrepreneurs. They must learn to lead and collaborate effectively in designing, implementing, and evaluating coordinated school health programs. Quality assurance is an essential ingredient in this process that requires accurate, timely, and confidential incident…
Descriptors: School Nurses, Quality Control, School Health Services, School Safety
Jorgensen, C. C.; Hoffer, P. L. – 1978
Developed for application in Air Force weapons systems training programs, a research project investigated the methods, procedures, and data available for conducting cost and training effectiveness analyses (CTEA). The primary objective of the project was to develop a user-oriented procedure for early formulation of training programs that can be…
Descriptors: Cost Effectiveness, Cost Estimates, Data Analysis, Equipment Utilization
Morris, Lynn Lyons; Fitz-Gibbon, Carol Taylor – 1978
Measuring attainment of the program's objectives and describing the program's implementation are listed as two of the evaluator's major responsibilities. The description should include an explanation of the context in which the program was initiated, as well as the component materials and activities. This booklet has three purposes: (1) suggesting…
Descriptors: Data Analysis, Data Collection, Educational Assessment, Evaluation Methods
Young, Malcolm B.; Schuh, Russell G. – 1975
This guide is intended to assist in the evaluation of career education programs. It has been developed around the concept that the evaluation should be viewed as a management tool for the improvement of program performance. The guide recognizes the key roles of the evaluators and program managers in the evaluation process. It is addressed…
Descriptors: Administration, Administrator Guides, Career Education, Data Analysis