Showing 1 to 15 of 18 results
Peer reviewed
Gross, Leon J. – Evaluation and the Health Professions, 1988
A new test critique procedure developed by the National Board of Examiners in Optometry (NBEO) is described. It encourages candidates to identify potentially flawed items during, rather than after, the test. Candidates thus retain the opportunity to challenge the test, while the NBEO maintains item security. (TJH)
Descriptors: Feedback, Graduate Medical Education, Higher Education, Licensing Examinations (Professions)
Peer reviewed
Wolf, Frederic M.; And Others – Evaluation and the Health Professions, 1986
The purpose of this study was to examine the reasons and underlying dimensions of the motivations of primary care physicians for participating in continuing medical education. Physicians rated the importance of eighteen reasons for participating on a Motivation for Continuing Medical Education Inventory. (Author/LMO)
Descriptors: Correlation, Graduate Medical Education, Motivation, Physicians
Peer reviewed
Giannini, Gemma; Engel, John D. – Evaluation and the Health Professions, 1986
A paired-comparisons study of Patient Management Problem (PMP) test scores was undertaken to investigate the relationship between performance and corresponding scores (proficiency and pathway). Findings suggest that assigning points at the "option level" culminates in scores that cannot readily be linked to behavior they are supposed to…
Descriptors: Clinical Diagnosis, Graduate Medical Education, Measurement Techniques, Medical Evaluation
Peer reviewed
Diamond, James J.; McCormick, Janet – Evaluation and the Health Professions, 1986
Using item responses from an in-training examination in diagnostic radiology, the application of a strength of association statistic to the general problem of item analysis is illustrated. Criteria for item selection, general issues of reliability, and error of measurement are discussed. (Author/LMO)
Descriptors: Achievement Tests, Difficulty Level, Error of Measurement, Graduate Medical Education
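The abstract above does not name the particular strength-of-association statistic Diamond and McCormick applied, so the sketch below uses the point-biserial item-total correlation, a common strength-of-association index in classical item analysis, on simulated dichotomous responses; the data, sample sizes, and variable names are illustrative assumptions, not the article's.

```python
# Illustrative item analysis on simulated right/wrong (0/1) responses.
# The point-biserial item-total correlation stands in here for whatever
# strength-of-association statistic the article actually used.
import numpy as np

rng = np.random.default_rng(0)
n_examinees, n_items = 500, 40
ability = rng.normal(size=n_examinees)
difficulty = rng.normal(size=n_items)
# Simulate responses from a simple logistic (Rasch-like) model.
prob = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
responses = (rng.random((n_examinees, n_items)) < prob).astype(int)

total = responses.sum(axis=1)
for j in range(n_items):
    rest = total - responses[:, j]                    # rest-of-test score
    p = responses[:, j].mean()                        # difficulty (proportion correct)
    r = np.corrcoef(responses[:, j], rest)[0, 1]      # point-biserial discrimination
    print(f"item {j:2d}  difficulty={p:.2f}  discrimination={r:.2f}")
```

Items with low or negative discrimination would be candidates for review, which is the kind of item-selection criterion the abstract refers to.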
Peer reviewed
Bland, Carole J.; And Others – Evaluation and the Health Professions, 1984
Drawing on current models for increasing the use of information from external evaluations, the authors offer a user-centered approach for increasing the use of results from internal evaluations of graduate medical education. The overriding emphasis of the user-centered approach is the utility of the resultant data. (Author/BW)
Descriptors: Communication (Thought Transfer), Data Collection, Decision Making, Evaluation Methods
Peer reviewed
Cavanaugh, Sally Hixon; Loadman, William E. – Evaluation and the Health Professions, 1988
The dimensionality of 20 problem score measures was assessed; the measures were derived from a nationally standardized written clinical simulation examination--the National Board for Respiratory Care examination--for which prior evidence of validity exists. The stability and generalizability of the resulting factor structure across examinee groups and test forms (N=5,679…
Descriptors: Clinical Diagnosis, Factor Structure, Graduate Medical Education, Higher Education
Peer reviewed
Young, Larry D.; And Others – Evaluation and the Health Professions, 1986
A method for curriculum evaluation that utilizes content and student performance analyses of both an external certifying exam and curriculum components is reported. Findings demonstrate that a multidisciplinary grouping of courses can satisfactorily cover content deemed important by external judges and by internal course directors and instructors.…
Descriptors: Behavioral Sciences, Course Content, Curriculum Development, Curriculum Evaluation
Peer reviewed
Vaytovich, Anthony E.; And Others – Evaluation and the Health Professions, 1986
This article describes an interactive computer program that allows medical students to enter a completely free-form problem list at the keyboard and receive immediate analysis of the accuracy and thoroughness of their diagnostic impressions. (Author/LMO)
Descriptors: Clinical Diagnosis, Computer Oriented Programs, Computer Software, Correlation
Peer reviewed
Givner, Nathaniel; Hynes, Kevin – Evaluation and the Health Professions, 1978
Sixty-six medical school applicants responded to both old and new forms of the Medical College Admission Test (MCAT). The results supported the claim that the new MCAT assesses skills not measured by the old MCAT. Available from: Sage Publications, Inc., 275 South Beverly Drive, Beverly Hills, California 90212. (Author/JAC)
Descriptors: Academic Aptitude, Admission Criteria, College Entrance Examinations, Comparative Analysis
Peer reviewed
Baggaley, Andrew R.; Hull, Alan L. – Evaluation and the Health Professions, 1983
Various combinations of nonlinear transformations were applied to responses to a clinical performance evaluation instrument that uses a four-point behaviorally anchored scale. The factorial structure of the 15 items constituting the evaluation form was minimally affected by the transformations, suggesting that parametric statistics can be applied…
Descriptors: Clinical Teaching (Health Professions), Evaluation Methods, Factor Structure, Graduate Medical Education
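As a rough illustration of the kind of check Baggaley and Hull describe (not their procedure or data), the sketch below applies one monotone nonlinear transformation to simulated four-point ratings and compares the leading principal-component loadings of the raw and transformed responses. The square-root transformation, the two-dimensional simulation, and all sizes are assumptions made for the example.

```python
# Sketch: does a monotone nonlinear transformation of four-point ratings
# change the leading component structure? (Simulated data, illustration only.)
import numpy as np

rng = np.random.default_rng(1)
n_raters, n_items = 300, 15
latent = rng.normal(size=(n_raters, 2)) @ rng.normal(size=(2, n_items))
noise = rng.normal(scale=0.5, size=latent.shape)
ratings = np.clip(np.round(2.5 + latent + noise), 1, 4)   # four-point scale

def leading_loadings(x, k=2):
    """First k principal-component loadings of the item correlation matrix."""
    r = np.corrcoef(x, rowvar=False)
    vals, vecs = np.linalg.eigh(r)
    order = np.argsort(vals)[::-1][:k]
    return vecs[:, order] * np.sqrt(vals[order])

raw = leading_loadings(ratings)
transformed = leading_loadings(np.sqrt(ratings))          # example nonlinear transform
print(np.round(np.abs(raw) - np.abs(transformed), 3))     # small differences expected
```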
Peer reviewed
Pohlman, Mary; And Others – Evaluation and the Health Professions, 1979
A simulated Medical College Admission Test (MCAT) was administered to 39 premedical students two weeks prior to the new MCAT. High correlations between simulated and actual test scores were obtained in the biology, chemistry, physics, science problems, reading, and quantitative areas. (MH)
Descriptors: College Entrance Examinations, Diagnostic Tests, Graduate Medical Education, Higher Education
Peer reviewed
Risucci, Donald A.; And Others – Evaluation and the Health Professions, 1992
The reliability and accuracy of evaluations of 126 surgical faculty made by 47 general surgery residents over 2 years were examined. The general accuracy and reliability over both years indicate that anonymous ratings of surgical faculty by groups of residents can be a valuable evaluation method. (SLD)
Descriptors: Correlation, Evaluation Methods, Graduate Medical Education, Graduate Medical Students
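The abstract does not say which reliability coefficient Risucci and colleagues computed. For ratings of many targets by multiple judges, one standard choice is the intraclass correlation; the sketch below computes ICC(2,1) (Shrout and Fleiss, two-way random effects, single rating) from a simulated faculty-by-resident rating matrix, purely as an illustration of that family of indices.

```python
# Illustration only: ICC(2,1) for a simulated matrix in which rows are
# faculty members and columns are resident raters.
import numpy as np

rng = np.random.default_rng(3)
n_faculty, n_raters = 30, 8
faculty_effect = rng.normal(scale=1.0, size=(n_faculty, 1))
rater_effect = rng.normal(scale=0.3, size=(1, n_raters))
ratings = 3.0 + faculty_effect + rater_effect + rng.normal(scale=0.5, size=(n_faculty, n_raters))

n, k = ratings.shape
grand = ratings.mean()
row_means = ratings.mean(axis=1, keepdims=True)
col_means = ratings.mean(axis=0, keepdims=True)

ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)        # between faculty
ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)        # between raters
ms_error = np.sum((ratings - row_means - col_means + grand) ** 2) / ((n - 1) * (k - 1))

icc_2_1 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)
print(round(icc_2_1, 2))
```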
Peer reviewed
Shea, Judy A.; And Others – Evaluation and the Health Professions, 1992
Video and print formats of cardiovascular motion studies were compared for use as assessment measures of interpretive skills for 392 doctors taking a cardiovascular disease certification test. Although video studies were easier to interpret, the equivalence of both motion studies supports use of the print format in national examinations. (SLD)
Descriptors: Cardiovascular System, Comparative Testing, Graduate Medical Education, Interpretive Skills
Peer reviewed
Weinholtz, Donn; And Others – Evaluation and the Health Professions, 1986
Two separate reliability studies were conducted on an observational instrument derived from previous qualitative research and designed for collecting data on teaching behaviors during attending rounds. The reliability estimates from both studies were quite high, indicating that the instrument shows promise for use in both research and evaluation…
Descriptors: Clinical Teaching (Health Professions), Graduate Medical Education, Higher Education, Interrater Reliability
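The reliability estimates reported by Weinholtz and colleagues are not specified in the abstract; for an observational instrument with categorical codes, one standard interrater index is Cohen's kappa. The sketch below computes kappa for two hypothetical observers coding the same teaching behaviors; the category labels are invented for the example.

```python
# Illustration only: Cohen's kappa for two observers coding the same events.
import numpy as np

def cohens_kappa(codes_a, codes_b):
    categories = sorted(set(codes_a) | set(codes_b))
    index = {c: i for i, c in enumerate(categories)}
    table = np.zeros((len(categories), len(categories)))
    for a, b in zip(codes_a, codes_b):
        table[index[a], index[b]] += 1                          # confusion matrix of codes
    n = table.sum()
    observed = np.trace(table) / n                              # observed agreement
    expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

obs_a = ["question", "lecture", "question", "feedback", "lecture", "feedback"]
obs_b = ["question", "lecture", "lecture",  "feedback", "lecture", "feedback"]
print(round(cohens_kappa(obs_a, obs_b), 2))                     # 0.75 for this toy example
```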
Peer reviewed
Shea, Judy A.; And Others – Evaluation and the Health Professions, 1988
The suitability of item response theory (IRT) for medical examination data was examined, using data on 2,000 candidates who took the 1980 and 1982 American Board of Internal Medicine Certifying Examinations. The focus was on determining whether the tests met IRT assumptions and on applying one-parameter and three-parameter IRT models to the data. (TJH)
Descriptors: Content Validity, Goodness of Fit, Graduate Medical Education, Guessing (Tests)
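As background on the model family the Shea et al. abstract mentions (not the Board's actual analysis), the sketch below simulates dichotomous responses from a one-parameter (Rasch) model and recovers the item difficulties by joint maximum likelihood; the sample sizes, optimizer, and identification constraint are choices made only for this illustration.

```python
# Minimal sketch of fitting a one-parameter (Rasch) IRT model to simulated
# dichotomous responses by joint maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_persons, n_items = 300, 20
theta_true = rng.normal(size=n_persons)            # abilities
b_true = rng.normal(size=n_items)                  # item difficulties
p = 1.0 / (1.0 + np.exp(-(theta_true[:, None] - b_true[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(float)

def neg_log_likelihood(params):
    theta, b = params[:n_persons], params[n_persons:]
    logits = theta[:, None] - b[None, :]
    # Bernoulli log-likelihood in a numerically stable form.
    return np.sum(np.logaddexp(0.0, logits) - x * logits)

def gradient(params):
    theta, b = params[:n_persons], params[n_persons:]
    logits = theta[:, None] - b[None, :]
    resid = 1.0 / (1.0 + np.exp(-logits)) - x      # predicted minus observed
    return np.concatenate([resid.sum(axis=1), -resid.sum(axis=0)])

fit = minimize(neg_log_likelihood, np.zeros(n_persons + n_items),
               jac=gradient, method="L-BFGS-B")
b_hat = fit.x[n_persons:]
b_hat -= b_hat.mean()                              # fix the scale's origin
print(np.round(np.corrcoef(b_hat, b_true - b_true.mean())[0, 1], 2))
```

A three-parameter model would add a discrimination and a guessing (lower-asymptote) parameter for each item, which is why the abstract's question of whether the data meet IRT assumptions matters when choosing between the two models.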