Showing 1 to 15 of 21 results
Peer reviewed
PDF on ERIC Download full text
Stupak, Oksana – Advanced Education, 2020
The introduction of educational technologies, which are developing at a rapid pace, requires professionalism and readiness from managers to implement the latest information technologies. Therefore, the use of electronic resources in the educational process in higher education can contribute both to developing professional skills and to gaining the…
Descriptors: Educational Technology, Technology Uses in Education, Higher Education, Educational Resources
Peer reviewed
Direct link
Debuse, Justin C. W.; Lawley, Meredith – British Journal of Educational Technology, 2016
Providing students with high-quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives on such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives on a computer-based assessment and…
Descriptors: Teacher Student Relationship, Educational Benefits, Teacher Attitudes, Computer Assisted Testing
Peer reviewed
Direct link
Azevedo, Ana, Ed.; Azevedo, José, Ed. – IGI Global, 2019
E-assessments of students profoundly influence their motivation and play a key role in the educational process. Adapting assessment techniques to current technological advancements allows for effective pedagogical practices, learning processes, and student engagement. The "Handbook of Research on E-Assessment in Higher Education"…
Descriptors: Higher Education, Computer Assisted Testing, Multiple Choice Tests, Guides
Peer reviewed
Direct link
Thompson, Meredith Myra; Braude, Eric John – Journal of Educational Computing Research, 2016
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
Descriptors: Computer Assisted Testing, Teaching Methods, Online Courses, Critical Thinking
Peer reviewed
PDF on ERIC Download full text
NORDSCI, 2018
This volume includes two sections of the 2018 NORDSCI international conference proceedings: (1) Education and Educational Research; and (2) Language and Linguistics. Education and Educational Research includes 22 papers relating to scientific topics across the full spectrum of education, including the history, sociology, and economics of education,…
Descriptors: Employment Qualifications, Doctoral Programs, Graduate Students, Career Readiness
Peer reviewed
PDF on ERIC Download full text
Sampson, Demetrios G., Ed.; Ifenthaler, Dirk, Ed.; Isaías, Pedro, Ed. – International Association for Development of the Information Society, 2021
These proceedings contain the papers of the 18th International Conference on Cognition and Exploratory Learning in the Digital Age (CELDA 2021), held virtually, due to an exceptional situation caused by the COVID-19 pandemic, from October 13-15, 2021, and organized by the International Association for Development of the Information Society…
Descriptors: Computer Simulation, Open Educational Resources, Telecommunications, Handheld Devices
Peer reviewed
PDF on ERIC Download full text
Bicard, Sara; Bicard, David F.; Casey, Laura Baylot; Smith, Clinton; Plank, Esther; Casey, Cort – Journal of Educational Technology, 2008
This study was an empirical investigation of active student responding (ASR) utilizing a student response system (SRS) vs. single student questioning (SSQ) and no student responding in a graduate level special education class of 23 participants. During the SRS condition, every participant responded to questions using remotes/clickers. During the…
Descriptors: Audience Response Systems, Graduate Students, Higher Education, Questioning Techniques
Peer reviewed
White, Michael J. – Counselor Education and Supervision, 1988
Presents rationale and procedure for a computer-administered examination in professional ethics. Discusses advantages and implications of computer-administered testing in professional ethics, noting benefits for instructors and students of professional ethics in counseling and counseling psychology. (Author/NB)
Descriptors: Computer Assisted Testing, Counselor Educators, Counselor Training, Ethics
Peer reviewed
Pinsoneault, Terry B. – Computers in Human Behavior, 1996
Computer-assisted and paper-and-pencil-administered formats for the Minnesota Multiphasic Personality Inventories were investigated. Subjects were 32 master's and doctoral-level counseling students. Findings indicated that the two formats were comparable and that students preferred the computer-assisted format. (AEF)
Descriptors: Comparative Analysis, Computer Assisted Testing, Graduate Students, Higher Education
Peer reviewed
Rekhart, Deborah; Dunkel, Patricia – Applied Language Learning, 1992
The speech of a sample of Chinese and Korean nonnative speakers of English (n=30) was evaluated by raters using the Speaking Proficiency English Assessment Kit (SPEAK) for degree of fluency and by a computerized silent-pause-detection program. (35 references) (JL)
Descriptors: Computer Assisted Testing, English (Second Language), Foreign Students, Graduate Students
Peer reviewed
Bennett, Randy Elliot; Rock, Donald A. – Journal of Educational Measurement, 1995
Examined the generalizability, validity, and examinee perceptions of a computer-delivered version of 8 formulating-hypotheses tasks administered to 192 graduate students. Results support previous research suggesting that formulating-hypotheses items can broaden the abilities measured by graduate admissions measures. (SLD)
Descriptors: Admission (School), College Entrance Examinations, Computer Assisted Testing, Generalizability Theory
PDF pending restoration
Plake, Barbara S.; And Others – 1994
In self-adapted testing (SAT), examinees select the difficulty level of items administered. This study investigated three variations of prior information provided when taking an SAT: (1) no information (examinees selected item difficulty levels without prior information); (2) view (examinees inspected a typical item from each difficulty level…
Descriptors: Adaptive Testing, College Students, Computer Assisted Testing, Difficulty Level
Peer reviewed
Bennett, Randy Elliot; Morley, Mary; Quardt, Dennis; Rock, Donald A.; Singley, Mark K.; Katz, Irvin R.; Nhouyvanisvong, Adisack – Journal of Educational Measurement, 1999
Evaluated a computer-delivered response type for measuring quantitative skill, "Generating Examples" (GE), which presents under-determined problems that can have many right answers. Results from 257 graduate students and applicants indicate that GE scores are reasonably reliable, but only moderately related to Graduate…
Descriptors: College Applicants, Computer Assisted Testing, Graduate Students, Graduate Study
Bennett, Randy Elliot; Rock, Donald A. – 1993
Formulating-Hypotheses (F-H) items present a situation and ask the examinee to generate as many explanations for it as possible. This study examined the generalizability, validity, and examinee perceptions of a computer-delivered version of the task. Eight F-H questions were administered to 192 graduate students. Half of the items restricted…
Descriptors: Computer Assisted Testing, Difficulty Level, Generalizability Theory, Graduate Students
Powell, Z. Emily – 1992
Little research exists on the psychological impacts of computerized adaptive testing (CAT) and how it may affect test performance. Three CAT procedures were examined, in which items were selected to match students' achievement levels, at random from the item pool, or according to student choice of item difficulty levels. Twenty-four graduate…
Descriptors: Academic Achievement, Adaptive Testing, Comparative Testing, Computer Assisted Testing