Educational Psychometrics
Overview
Educational Psychometrics is a field that focuses on the measurement of psychological attributes related to education, such as cognitive abilities, skills, and learning outcomes. It combines principles from psychology, education, and statistics to design, evaluate, and interpret the assessments and tests used in educational settings. The goal of educational psychometrics is to ensure that educational assessments are valid, reliable, and fair, providing accurate measures of students' knowledge and abilities. In this project, we strive to develop innovative methods, tools, and technologies that harness data analytics and predictive modeling to inform evidence-based decision-making in education. By combining cutting-edge techniques from these fields, our research seeks to foster improved teaching and learning practices, enhance educational assessment, and ultimately contribute to the advancement of education on a global scale. His research topics related to Educational Psychometrics include:
Representative work
"A General Dynamic Learning Model Framework for Cognitive Diagnosis, with Zichu Liu, Shiyu Wang, Shumei Zhang, and Tao Qiu, 2025.
British Journal of Mathematical and Statistical Psychology. Understanding students' learning trajectories is crucial for educators to effectively monitor and enhance progress. With the rise of computer-based testing, researchers now have access to rich datasets that provide deeper insights into student performance. This study introduces a general dynamic learning model framework that integrates response accuracy and response times to capture different test-taking behaviors and estimate learning trajectories related to polytomous attributes over time. A Bayesian estimation method is proposed to estimate model parameters. Rigorous validation through simulation studies confirms the effectiveness of the MCMC algorithm in parameter recovery and highlights the model's utility in understanding learning trajectories and detecting different test-taking behaviors in a learning environment. Applied to real data, the model demonstrates practical value in educational settings. Overall, this comprehensive and validated model offers educators and researchers nuanced insights into student learning progress and behavioral dynamics. (An illustrative code sketch follows the keyword list below.)
Digital Product Performance
Dynamic Learning
Cognitive Diagnosis
Educational Psychology
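As a rough illustration of the kind of model described above, the sketch below simulates one learner under a toy dynamic cognitive diagnosis model that couples response accuracy with response times. The DINA-style accuracy rule, lognormal response times, Markov attribute transitions, and all parameter values are assumptions made for this example only; they are not the specification, nor the Bayesian MCMC estimation procedure, developed in the paper.

```python
# Toy generative sketch of a dynamic learning model that couples response
# accuracy with response times.  All distributional choices below are
# illustrative assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

T, J, K = 5, 10, 3              # time points, items per occasion, attributes
levels = 3                       # polytomous attribute levels: 0, 1, 2
Q = rng.integers(0, 2, (J, K))   # Q-matrix: which attributes each item requires
guess, slip = 0.15, 0.10         # DINA-style guessing / slipping parameters
speed = rng.normal(0.0, 0.3)     # latent speed of one simulated learner
alpha = np.zeros(K, dtype=int)   # attribute levels start at 0

records = []
for t in range(T):
    # Learning transition: each attribute may move up one level per occasion.
    grow = rng.random(K) < 0.35
    alpha = np.minimum(alpha + grow, levels - 1)

    # Item mastery requires the top level on every attribute the item measures.
    eta = np.all((alpha >= levels - 1) | (Q == 0), axis=1)
    p_correct = np.where(eta, 1 - slip, guess)
    y = rng.random(J) < p_correct                      # response accuracy
    log_rt = rng.normal(1.0 - speed, 0.4, size=J)      # lognormal response times
    records.append((alpha.copy(), y, np.exp(log_rt)))

for t, (a, y, rt) in enumerate(records):
    print(f"t={t}  attributes={a}  accuracy={y.mean():.2f}  mean RT={rt.mean():.2f}s")
```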
"A Two-Step Item Bank Calibration Strategy based on 1-bit Matrix Completion for Small-Scale Computerized Adaptive Testing, with Yawei Shen and Shiyu Wang, 2024.
British Journal of Mathematical and Statistical Psychology. Computerized adaptive testing (CAT) is a widely embraced approach for delivering personalized educational assessments, tailoring each test to the real-time performance of individual examinees. Despite its potential advantages, CAT's application in small-scale assessments has been limited due to the complexities associated with calibrating the item bank using sparse response data and small sample sizes. This study addresses these challenges by developing a two-step item bank calibration strategy that leverages the 1-bit matrix completion method in conjunction with two distinct incomplete pretesting designs. We introduce two novel 1-bit matrix completion-based imputation methods specifically designed to tackle the issues associated with item calibration in the presence of sparse response data and limited sample sizes. To demonstrate the effectiveness of these approaches, we conduct a comparative assessment against several established item parameter estimation methods capable of handling missing data. This evaluation is carried out through two sets of simulation studies, each featuring different pretesting designs, item bank structures, and sample sizes. Furthermore, we illustrate the practical application of the methods investigated using empirical data collected from small-scale assessments. (An illustrative code sketch follows the keyword list below.)
Digital Product Performance
Educational Psychology
Machine Learning
Matrix Completion
Small Big Analysis
Computerized Adaptive Testing
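The sketch below illustrates the general idea behind 1-bit matrix completion on a sparse binary response matrix: fit a low-rank logistic factorization to the observed cells, then impute the missing cells with the fitted probabilities to obtain a dense matrix for downstream item calibration. The rank, optimizer, regularization, and imputation rule are illustrative assumptions, not the paper's two-step strategy or its pretesting designs.

```python
# Illustrative 1-bit matrix completion via low-rank logistic factorization.
import numpy as np

rng = np.random.default_rng(1)
N, J, rank = 80, 30, 2

# Simulate a small, sparsely observed 0/1 response matrix.
theta = rng.normal(size=(N, 1))
b = rng.normal(size=(1, J))
P_true = 1 / (1 + np.exp(-(theta - b)))
Y = (rng.random((N, J)) < P_true).astype(float)
mask = rng.random((N, J)) < 0.3          # only ~30% of cells observed

# Fit a low-rank logistic factorization to the observed cells by gradient descent.
U = 0.1 * rng.normal(size=(N, rank))
V = 0.1 * rng.normal(size=(J, rank))
lr = 0.05
for _ in range(500):
    P = 1 / (1 + np.exp(-U @ V.T))
    G = mask * (P - Y)                   # gradient of the Bernoulli log-loss
    U -= lr * (G @ V + 0.01 * U)         # small ridge penalty for stability
    V -= lr * (G.T @ U + 0.01 * V)

# Impute missing cells with fitted probabilities (one possible completion),
# producing a dense matrix that could feed an item calibration step.
Y_completed = np.where(mask, Y, 1 / (1 + np.exp(-U @ V.T)))
print("observed fraction:", mask.mean().round(2))
print("completed matrix shape:", Y_completed.shape)
```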
"Adaptive Weight Estimation of Latent Ability: Application to Computerized Adaptive Testing With Response Revision, with Shiyu Wang and Allan Cohen, 2021.
Journal of Educational and Behavioral Statistics. An adaptive weight estimation approach is proposed to provide robust latent ability estimation in computerized adaptive testing (CAT) with response revision. This approach assigns different weights to each distinct response to the same item when response revision is allowed in CAT. Two types of weight estimation procedures, nonfunctional and functional weight, are proposed to determine the weight adaptively based on the compatibility of each revised response with the assumed statistical model in relation to the remaining observations. The application of this estimation approach to a data set collected from a large-scale multistage adaptive test demonstrates the capability of this method to reveal more information regarding the test taker's latent ability by using the valid response path rather than only the very last response. Simulation studies were conducted to evaluate the proposed ability estimation method and to compare it with several other estimation procedures in the literature. Results indicate that the proposed ability estimation approach is able to provide robust estimation results in two test-taking scenarios. (An illustrative code sketch follows the keyword list below.)
Digital Product Performance
Educational Psychology
Machine Learning
Adaptive Weight Estimation
Robust Estimation
Computerized Adaptive Testing
Response Revision
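To give a concrete flavor of weighting a revision path, the sketch below estimates latent ability under a 2PL model by maximizing a weighted log-likelihood in which a superseded initial response receives a smaller weight than the final, revised response. The 2PL form, the grid-search estimator, and the fixed weight values are assumptions for illustration only; the paper's nonfunctional and functional weights are determined adaptively from the data rather than fixed in advance.

```python
# Weighted likelihood estimation of latent ability under a 2PL model,
# where every response in a revision path enters with its own weight.
import numpy as np

def p2pl(theta, a, b):
    """2PL probability of a correct response."""
    return 1 / (1 + np.exp(-a * (theta - b)))

def weighted_theta_mle(responses, a, b, w, grid=np.linspace(-4, 4, 801)):
    """Maximize the weighted Bernoulli log-likelihood over a theta grid."""
    P = p2pl(grid[:, None], a[None, :], b[None, :])
    ll = (w * (responses * np.log(P) + (1 - responses) * np.log(1 - P))).sum(axis=1)
    return grid[np.argmax(ll)]

# Item parameters; the third item was answered twice (initial, then revised),
# so it appears twice in the response vector.
a = np.array([1.2, 0.9, 1.0, 1.0, 1.4])
b = np.array([-0.5, 0.3, 0.8, 0.8, 1.1])
responses = np.array([1, 1, 0, 1, 0])    # answered 0, then revised to 1

# Equal weights vs. down-weighting the superseded initial response.
w_equal = np.ones(5)
w_adaptive = np.array([1.0, 1.0, 0.3, 1.0, 1.0])

print("theta with equal weights:          ",
      round(weighted_theta_mle(responses, a, b, w_equal), 2))
print("theta with down-weighted revision: ",
      round(weighted_theta_mle(responses, a, b, w_adaptive), 2))
```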