Showing all 9 results
Peer reviewed
PDF on ERIC
Ayfer Sayin; Sabiha Bozdag; Mark J. Gierl – International Journal of Assessment Tools in Education, 2023
The purpose of this study is to generate non-verbal items for a visual reasoning test using template-based automatic item generation (AIG). The research method followed the three stages of template-based AIG. An item from the 2016 4th-grade entrance exam of the Science and Art Center (known as BILSEM) was chosen as the…
Descriptors: Test Items, Test Format, Nonverbal Tests, Visual Measures
Peer reviewed
PDF on ERIC
Juškaite, Loreta – International Baltic Symposium on Science and Technology Education, 2019
New research results on the online testing method used in the Latvian education system for learning process assessment are presented. Data mining is an important field in education because it helps to analyse the data gathered in various studies and to implement changes in the education system according to the learning methods of…
Descriptors: Foreign Countries, Information Retrieval, Data Analysis, Data Use
Peer reviewed
PDF on ERIC
Blumenthal, Stefan; Blumenthal, Yvonne – International Journal of Educational Methodology, 2020
Progress monitoring of academic achievement is an essential element in preventing learning disorders. A prominent approach is curriculum-based measurement (CBM). Various studies have documented positive effects of CBM on students' achievement. Nevertheless, the use of CBM is associated with additional work for teachers. The use of tablets may be of…
Descriptors: Instructional Effectiveness, Curriculum Based Assessment, Computer Assisted Testing, Handheld Devices
Chen, Guanhua – ProQuest LLC, 2018
This study is part of a larger design study that iteratively improves a robotics programming curriculum as well as a computational thinking (CT) instrument. Its focus was primarily on CT assessment, and particularly on an online CT instrument with logging functionality that can store a student's problem-solving process by recording interactions…
Descriptors: Elementary School Students, Test Construction, Cognitive Tests, Computer Assisted Testing
Peer reviewed
Direct link
Foxworth, Lauren L.; Hashey, Andrew; Sukhram, Diana P. – Reading & Writing Quarterly, 2019
In an age when students are increasingly expected to demonstrate technology-based writing proficiency, fluency challenges with word processing programs can pose a barrier to successful writing when students are asked to compose using these tools. The current study was designed to determine whether differences existed in typing fluency and digital…
Descriptors: Writing Skills, Students with Disabilities, Learning Disabilities, Word Processing
Peer reviewed
Direct link
Molnar, Gyongyver; Hodi, Agnes; Magyar, Andrea – AERA Online Paper Repository, 2016
Vocabulary knowledge assessment methods and instruments have gone through a significant evolution. Computer-based tests offer more opportunities than their paper-and-pencil counterparts; however, most digital vocabulary assessments are linear, and adaptive solutions in this domain are scarce. The aims of this study were to compare the effectiveness…
Descriptors: Adaptive Testing, Vocabulary Skills, Computer Assisted Testing, Student Evaluation
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that measure student progress toward college and career readiness more accurately than previous assessments. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Steedle, Jeffrey; McBride, Malena; Johnson, Marc; Keng, Leslie – Partnership for Assessment of Readiness for College and Careers, 2016
The first operational administration of the Partnership for Assessment of Readiness for College and Careers (PARCC) took place during the 2014-2015 school year. In addition to the traditional paper-and-pencil format, the assessments were available for administration on a variety of electronic devices, including desktop computers, laptop computers,…
Descriptors: Computer Assisted Testing, Difficulty Level, Test Items, Scores
Peer reviewed
Direct link
Wüstenberg, Sascha; Greiff, Samuel; Vainikainen, Mari-Pauliina; Murphy, Kevin – Journal of Educational Psychology, 2016
Changes in the demands posed by increasingly complex workplaces in the 21st century have raised the importance of nonroutine skills such as complex problem solving (CPS). However, little is known about the antecedents and outcomes of CPS, especially with regard to malleable external factors such as classroom climate. To investigate the relations…
Descriptors: Individual Differences, Problem Solving, Difficulty Level, Foreign Countries