Publications by Year: 2017

Rushkin, I., Rosen, Y., Ang, A., Fredericks, C., Tingley, D., Blink, M. J., & Lopez, G. (2017). Adaptive Assessment Experiment in a HarvardX MOOC. In Proceedings of the 10th International Conference on Educational Data Mining (pp. 466-471). International Conference on Educational Data Mining. Abstract:
We report an experimental implementation of adaptive learning functionality in a self-paced HarvardX MOOC (massive open online course). In MOOCs there is a need for evidence-based instructional designs that create optimal conditions for learners, who come to the course with widely differing prior knowledge, skills, and motivations. But users in such a course are free to explore the course materials in any order they see fit and may drop out at any time, which makes it hard to predict the practical challenges of implementing adaptivity, as well as its effects, without experimentation. This study explored the technological feasibility of adaptive functionality in the edX platform and its implications for course (re)design. Additionally, it aimed to establish a foundation for future study of the effects of adaptive functionality in MOOCs on learning outcomes, engagement, and drop-out rates. Our preliminary findings suggest that adaptivity of the kind we used leads to more efficient learning: without an adverse effect on learning outcomes, learners move through the course faster and attempt fewer problems, since problems are served to them in a targeted way. Further research is needed to confirm these findings and explore additional possible effects.
Milano, B. (2017). Adaptive learning featured in HarvardX course. Harvard Gazette.
Milano, B. (2017). Harvard boosts on-campus reuse of online course content. Harvard Gazette.
Ciregna, E. M., & Perez, E. (2017). Emerging challenges in digital higher ed. Harvard Gazette.
Lopez, G., Seaton, D. T., Ang, A., Tingley, D., & Chuang, I. (2017). Google BigQuery for Education: Framework for Parsing and Analyzing edX MOOC Data. In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale. ACM.
Whitehill, J., Mohan, K., Seaton, D., Rosen, Y., & Tingley, D. (2017). MOOC Dropout Prediction: How to Measure Accuracy? In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale. ACM.
Turkay, S., & Wong, T. (2017). What engages MOOC learners: An interview study with ChinaX learners. Poster presented at the annual meeting of the American Educational Research Association.
Turkay, S., Eidelman, H., Rosen, Y., Seaton, D., Lopez, G., & Whitehill, J. (2017). Getting to know English language learners in MOOCs: Their motivations, behaviors and outcomes. In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale (pp. 209-212). ACM.
Rosen, Y. (2017). Enabling adaptive assessments [and learning] in HarvardX. Presented at the Microsoft Assessment Deep Dive Workshop, Redmond, WA.
Rosen, Y., Rushkin, I., & Ang, A. (2017). Enabling adaptive and principled assessment design in MOOCs. Presented at the National Council on Measurement in Education, San Antonio, TX.
Rosen, Y., Rushkin, I., Ang, A., Fredericks, C., Tingley, D., & Blink, M.-J. (2017). Designing adaptive assessments in MOOCs. In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale. ACM.
Williams, J. J., Rafferty, A. N., Maldonado, S., Ang, A., Tingley, D., & Kim, J. (2017). MOOClets: A Framework for Dynamic Experimentation and Personalization. In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale (pp. 287-290). ACM.
Williams, J. J., Rafferty, A. N., Ang, A., Tingley, D., Lasecki, W. S., & Kim, J. (2017). Connecting Instructors and Learning Scientists via Collaborative Dynamic Experimentation. In CHI '17: CHI Conference on Human Factors in Computing Systems (pp. 3012-3018). ACM.
Rosen, Y. (2017). Assessing Students in Human-to-Agent Settings to Inform Collaborative Problem-Solving Learning. Journal of Educational Measurement, 54(1), 36-53. Abstract:

In order to understand potential applications of collaborative problem-solving (CPS) assessment tasks, it is necessary to examine empirically the multifaceted student performance that may be distributed across collaboration methods and purposes of the assessment. Ideally, each student should be matched with various types of group members and should apply the skills in varied contexts and tasks. One solution to these assessment demands is to use computer-based (virtual) agents as the collaborators in interactions with students. This article proposes a human-to-agent (H-A) approach to formative CPS assessment and describes an international pilot study aimed at providing preliminary empirical findings on the use of H-A CPS assessment to inform collaborative learning. Overall, the findings show promise for using an H-A CPS assessment task as a formative tool for structuring effective groups in the context of CPS online learning.