Showing all 14 results
Peer reviewed
Raykov, Tenko – Measurement: Interdisciplinary Research and Perspectives, 2023
This software review discusses the capabilities of Stata to conduct item response theory modeling. The commands needed for fitting the popular one-, two-, and three-parameter logistic models are initially discussed. The procedure for testing the discrimination parameter equality in the one-parameter model is then outlined. The commands for fitting…
Descriptors: Item Response Theory, Models, Comparative Analysis, Item Analysis
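For reference, the one-, two-, and three-parameter logistic models named in this review share a standard form; the three-parameter (3PL) model is commonly written as

P(X_{ij} = 1 \mid \theta_i) = c_j + (1 - c_j)\,\frac{\exp[a_j(\theta_i - b_j)]}{1 + \exp[a_j(\theta_i - b_j)]},

where \theta_i is examinee ability, a_j item discrimination, b_j item difficulty, and c_j the lower asymptote (guessing) parameter. Fixing c_j = 0 gives the 2PL, and additionally constraining all a_j to be equal gives the 1PL, which is the constraint that the review's test of discrimination-parameter equality concerns.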
Peer reviewed
Murray, Keith B.; Zdravkovic, Srdan – Journal of Education for Business, 2016
Considerable debate continues regarding the efficacy of the website RateMyProfessors.com (RMP). To date, however, virtually no direct experimental research has been reported that bears on questions of sampling adequacy or item adequacy underlying the favorable correlations that have been reported. The authors compare the data…
Descriptors: Computer Assisted Testing, Computer Software Evaluation, Student Evaluation of Teacher Performance, Item Analysis
Peer reviewed
Handal, Boris; Campbell, Chris; Cavanagh, Michael; Petocz, Peter – Mathematics Education Research Journal, 2016
This study validated the semantic items of three related scales aimed at characterising the perceived worth of mathematics-education-related mobile applications (apps). The technological pedagogical content knowledge (TPACK) model was used as the conceptual framework for the analysis. Three hundred and seventy-three preservice students studying…
Descriptors: Preservice Teacher Education, Mathematics Education, Courseware, Relevance (Education)
Peer reviewed
Lee, Jeong-Sook; Kim, Sung-Wan – Journal of Educational Computing Research, 2015
The purpose of this study is to develop and validate an evaluation tool of educational apps for smart education. Based on literature reviews, a potential model for evaluating educational apps was suggested. An evaluation tool consisting of 57 survey items was delivered to 156 students in middle and high schools. An exploratory factor analysis was…
Descriptors: Educational Technology, Courseware, Computer Software Evaluation, Test Construction
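Purely as an illustration of the exploratory-factor-analysis step described above (not the authors' actual procedure or data), a minimal sketch in Python might look as follows; the file name and the five-factor solution are hypothetical.

# Minimal EFA sketch on survey-item responses (hypothetical data and factor count).
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("responses.csv")   # hypothetical file: rows = respondents, columns = survey items
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
fa.fit(items.values)

# Loadings: rows = items, columns = extracted factors.
loadings = pd.DataFrame(fa.components_.T, index=items.columns)
print(loadings.round(2))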
Peer reviewed
Unlu, Ali; Sargin, Anatol – Applied Psychological Measurement, 2009
Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…
Descriptors: Statistical Data, Computer Graphics, Computer Software, Item Analysis
Boggan, Matthew K.; Harper, Sallie – Journal of Case Studies in Education, 2011
According to a study conducted by the American Association of Colleges of Teacher Education (2002), approximately 90% of universities and colleges of education use portfolios for student assessment. Forty percent of universities use electronic portfolios in teacher certification programs for licensing. Because of the popular use of…
Descriptors: Cohort Analysis, Leadership Training, Portfolios (Background Materials), Electronic Publishing
Peer reviewed
PDF on ERIC
Hafezi, Soheila; Farahi, Ahmad; Mehri, Soheil Najafi; Mahmoodi, Hosein – Turkish Online Journal of Distance Education, 2010
The web plays a central role in distance education. The word "usability" is usually treated as synonymous with the functionality of a system for its users. The usability of a website is also defined as the extent to which it can be used by a specific group of people to carry out specific objectives effectively, efficiently, and with satisfaction…
Descriptors: Usability, Distance Education, Web Sites, Program Validation
Lee, Seon Ah – ProQuest LLC, 2010
The purpose of this study was to develop a tool to evaluate the quality of a clinical information system (CIS) as conceived by nurses and to conduct a pilot test with the developed tool as an initial assessment. CIS quality is required for successful implementation in information technology (IT) environments. The study started with the realization that…
Descriptors: Research Methodology, Nurses, Content Validity, Nursing
Peer reviewed
Basturk, Ramazan – Assessment & Evaluation in Higher Education, 2008
This study investigated the usefulness of the many-facet Rasch model (MFRM) in evaluating the quality of performance related to PowerPoint presentations in higher education. The Rasch model draws on item response theory, which states that the probability of a correct response to a test item/task depends largely on a single parameter, the ability of the…
Descriptors: Higher Education, Teaching (Occupation), Rating Scales, Program Effectiveness
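For reference, the dichotomous Rasch model that this description paraphrases is commonly written as

P(X_{ni} = 1) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)},

where \theta_n is the ability of person n and b_i the difficulty of item or task i; the many-facet extension adds further additive terms on the logit scale, such as a rater-severity facet, which is what makes it suitable for judged performances like the presentations studied here.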
Peer reviewed
Wentzel, Carolyn – Journal of Science Education and Technology, 2006
INTEGRITY, an online application for item analysis and statistical detection of collusion (answer copying), was reviewed. Features of the software and examples of program output are described in detail. INTEGRITY was found to be easy to use, with an abundance of well-organized documentation and built-in features designed to guide the user through the…
Descriptors: Item Analysis, Computer Software, Multiple Choice Tests, Costs
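As a generic illustration of the kind of item statistics such a package reports (this is not INTEGRITY's actual output or algorithm), classical item difficulty and item-total (point-biserial) discrimination can be computed on a 0/1 score matrix as follows; the data here are made up.

# Classical item analysis on a hypothetical 0/1 score matrix
# (rows = examinees, columns = items); not INTEGRITY's own methods.
import numpy as np

scores = np.array([[1, 0, 1, 1],
                   [1, 1, 0, 1],
                   [0, 0, 1, 0],
                   [1, 1, 1, 1],
                   [0, 1, 0, 1]])

difficulty = scores.mean(axis=0)        # proportion of examinees answering each item correctly
totals = scores.sum(axis=1)             # each examinee's total score

# Item-total (point-biserial) correlation as a simple discrimination index.
discrimination = np.array([np.corrcoef(scores[:, j], totals)[0, 1]
                           for j in range(scores.shape[1])])

print("difficulty:    ", difficulty.round(2))
print("discrimination:", discrimination.round(2))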
Peer reviewed
Luik, P. – Journal of Computer Assisted Learning, 2007
It is important for the teacher to choose effective software for students. It is also important for designers to know what features of educational software make it effective. But the results of studies dealing with the effectiveness of educational software are contradictory. One reason for such results might be the fact that meta-analysis covers…
Descriptors: Program Effectiveness, Drills (Practice), Courseware, Computer Software Evaluation
Peer reviewed
Finger, Michael S. – International Journal of Testing, 2004
MicroFACT 2.0, an item factor analysis computer program, was reviewed. Program features were described in detail, as well as the program output. The performance of MicroFACT was evaluated on a Windows 2000, Pentium III platform. MicroFACT was found to be fairly easy to use, and one problem encountered was reported. Program requirements and…
Descriptors: Computer Software, Factor Analysis, Guidelines, Purchasing
Peer reviewed
Topping, K. J.; Samuels, J.; Paul, T. – School Effectiveness and School Improvement, 2007
This study elaborates the "what works?" question by exploring the effects of variability in program implementation quality on achievement. Particularly, the effects on achievement of computerized assessment of reading were investigated, analyzing data on 51,000 students in Grades 1-12 who read over 3 million books. When minimum implementation…
Descriptors: Program Implementation, Achievement Gains, Reading Achievement, Independent Reading
Peer reviewed
Liu, Chao-Lin – Educational Technology & Society, 2005
The author analyzes properties of mutual information between dichotomous concepts and test items. The properties generalize some common intuitions about item comparison, and provide principled foundations for designing item-selection heuristics for student assessment in computer-assisted educational systems. The proposed item-selection strategies…
Descriptors: Test Items, Heuristics, Classification, Item Analysis
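For reference, the mutual information between a dichotomous concept-mastery variable C and a dichotomous item response X, the quantity whose properties are analyzed here, has the standard form

I(C; X) = \sum_{c \in \{0,1\}} \sum_{x \in \{0,1\}} P(c, x) \log \frac{P(c, x)}{P(c)\,P(x)},

and item-selection heuristics of this kind favor administering the item whose response is expected to carry the most information about the examinee's mastery of the concept.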