Showing 1 to 15 of 174 results
Peer reviewed
Direct link
Bower, Kyle L. – American Journal of Evaluation, 2022
The purpose of this paper is to introduce the Five-Level Qualitative Data Analysis (5LQDA) method for ATLAS.ti as a way to intentionally design methodological approaches applicable to the field of evaluation. To demonstrate my analytical process using ATLAS.ti, I use examples from an existing evaluation of a STEM Peer Learning Assistant program.…
Descriptors: Qualitative Research, Data Analysis, Program Evaluation, Evaluation Methods
Peer reviewed
Direct link
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance are available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at relatively low cost by implementing comparative interrupted time series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
Peer reviewed
PDF on ERIC
Sahin, Harun; Caner, H. Nuran; Akmaz-Genç, Imren – International Journal of Curriculum and Instruction, 2020
The present study aims to examine the graduate studies related to in-service training programs for teachers conducted between the years 2000 and 2018 in Turkey. In this context, a content analysis of 88 graduate studies, comprising 71 master's theses and 17 dissertations, was conducted for the in-service teacher training accessible to the Council…
Descriptors: Foreign Countries, Inservice Teacher Education, Graduate Study, Masters Theses
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew; Eno, Jared – Educational Researcher, 2018
Short comparative interrupted time series (CITS) designs are increasingly being used in education research to assess the effectiveness of school-level interventions. These designs can be implemented relatively inexpensively, often drawing on publicly available data on aggregate school performance. However, the validity of this approach hinges on…
Descriptors: Educational Research, Research Methodology, Comparative Analysis, Time
Seftor, Neil – Regional Educational Laboratory, 2016
This short brief for education decision makers discusses three main factors that may contribute to a finding of no effects: failure of theory, failure of implementation, and failure of research design. It provides readers with questions to ask themselves to better understand "no effects" findings, and describes other contextual factors…
Descriptors: Research Design, Research Methodology, Program Evaluation, Program Effectiveness
Peterson, Jean S. – Gifted Child Quarterly, 2019
Intended to guide scholars who are new to qualitative research, this methods brief focuses mostly on what reviewers look for in manuscripts submitted for publication. The author acknowledges that reviewers' preferences likely reflect their theoretical perspectives, research-oriented coursework, mentors, and research and writing experiences. The…
Descriptors: Academically Gifted, Qualitative Research, Educational Research, Vocabulary
Peer reviewed
Direct link
Phillips, Alana S.; Sheffield, Anneliese; Moore, Michelle; Robinson, Heather A. – Quarterly Review of Distance Education, 2016
There is a need for a holistic usability evaluation framework that accommodates social constructivist online courses. Social knowledge construction may not be adequately evaluated using current frameworks. This qualitative research study examined the usability needs of a social constructivist online course. Data from an online course were analyzed…
Descriptors: Usability, Online Courses, Constructivism (Learning), Qualitative Research
Peer reviewed
Direct link
Ro, Hyun Kyoung; Menard, Tiffany; Kniess, Dena; Nickelsen, Ashley – New Directions for Institutional Research, 2017
This chapter provides examples of innovative methods and tools to collect, analyze, and report both quantitative and qualitative data in student affairs assessment.
Descriptors: Student Personnel Services, Academic Support Services, Program Evaluation, Evaluation Methods
Peer reviewed
PDF on ERIC
Sanzo, Karen – International Journal of Education Policy and Leadership, 2016
This article presents a content analysis of the 2013 School Leadership Program (SLP) grants. SLP projects provide a unique opportunity for participants in the field to explore innovative leadership preparation and development and their impact on program participants, schools, school districts, and students. The article begins with an overview of…
Descriptors: Program Evaluation, Program Proposals, Leadership Training, Administrator Education
Peer reviewed
Direct link
Khosravi, Arash; Ahmad, Mohammad Nazir – Education and Information Technologies, 2016
An effective supervision mechanism between a student and supervisor is crucial. The essential knowledge shared and transferred between these two parties must be closely observed and well understood to ensure that students graduate at a level of quality suited to becoming professional knowledge workers. The aim of this study was to…
Descriptors: Knowledge Management, Sharing Behavior, Information Dissemination, Supervision
Schiazza, Daniela Marie – ProQuest LLC, 2013
The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…
Descriptors: Case Studies, Mixed Methods Research, Data Analysis, Inquiry
Peer reviewed
Direct link
Nelson, Amy Grack; Cohn, Sarah – Journal of Museum Education, 2015
Museums often evaluate various aspects of their audiences' experiences, be it what they learn from a program or how they react to an exhibition. Each museum program or exhibition has its own set of goals, which can drive what an evaluator studies and how an evaluation evolves. When designing an evaluation, data collection methods are purposefully…
Descriptors: Data Collection, Research Methodology, Program Evaluation, Museums
Peer reviewed
PDF on ERIC
Akers, Lauren; Resch, Alexandra; Berk, Jillian – National Center for Education Evaluation and Regional Assistance, 2014
This guide for district and school leaders shows how to recognize opportunities to embed randomized controlled trials (RCTs) into planned policies or programs. Opportunistic RCTs can generate strong evidence for informing education decisions--with minimal added cost and disruption. The guide also outlines the key steps to conduct RCTs and responds…
Descriptors: School Districts, Educational Research, Guides, Program Evaluation
Warner, Laura A.; Stubbs, Eric; Murphrey, Theresa Pesl; Huynh, Phuong – Journal of Agricultural Education, 2016
The purpose of this study was to identify the specific competencies needed to apply social marketing, a promising approach to behavior change, to Extension programming. A modified Delphi study was used to achieve group consensus among a panel of experts on the skills, characteristics, and knowledge needed to successfully apply this behavior change…
Descriptors: Competence, Marketing, Behavior Change, Extension Education
Peer reviewed
Direct link
Lamm, Alexa J.; Israel, Glenn D.; Diehl, David – Journal of Extension, 2013
In order to enhance Extension evaluation efforts, it is important to understand current practices. The study reported here examined the evaluation behaviors of county-based Extension professionals. Extension professionals from eight states (n = 1,173) responded to a survey regarding their evaluation data collection, analysis, and reporting…
Descriptors: Extension Education, Extension Agents, Experimenter Characteristics, Program Evaluation