Publications by Year: 2016

Simon, C. (2016). MOOCs ahead. Harvard Gazette.
Turkay, S., & Mouton, S. (2016). The educational impact of whiteboard animations: An experiment using popular social science lessons. In Proceedings of the MIT Learning International Networks Consortium.
Whiteboard animations are increasingly used in education despite little evidence of their efficacy. In this study, we measured the impact of whiteboard animations and other common instructional formats on learning outcomes, experience, and motivation. We recruited participants from Amazon’s Mechanical Turk (N=568; 326 females). Participants were randomly assigned to view online lessons about popular topics in social science from well-established scholars in one of five common instructional formats: whiteboard animation, electronic slideshow, stage lecture, audio, and text. Results showed a benefit of whiteboard animations in terms of learning and subjective experiences of enjoyment and engagement.
Reich, J., Stewart, B., Mavon, K., & Tingley, D. (2016). The Civic Mission of MOOCs: Measuring Engagement across Political Differences in Forums. Proceedings of the Third Annual ACM Conference on Learning at Scale.
Williams, J. J., Kim, J., Glassman, E., Rafferty, A., & Lasecki, W. (2016). Making Static Lessons Adaptive through Crowdsourcing & Machine Learning. In Design Recommendations for Intelligent Tutoring Systems: Domain Modeling (Vol. 4).

Text components of digital lessons and problems are often static: they are written once and too often not improved over time. This is true both for large text components like webpages and documents and for the small components that form the building blocks of courses: explanations, hints, examples, discussion questions/answers, emails, study tips, and motivational messages. This represents a missed opportunity, since it should be technologically straightforward to enhance learning by improving text as instructors get new ideas and data is collected about what helps learning. We describe how instructors can use recent work (Williams, Kim, Rafferty, Maldonado, Gajos, Lasecki, & Heffernan, 2016a) to make text components into adaptive resources that semi-automatically improve over time, by combining crowdsourcing methods from human-computer interaction (HCI) with algorithms from statistical machine learning that use data for optimization.
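One concrete reading of "algorithms from statistical machine learning that use data for optimization" is a multi-armed bandit over candidate text components: each crowd- or learner-contributed explanation is an arm, and learner ratings determine which explanation is shown next. Below is a minimal Python sketch of such a loop using Thompson sampling over binary helpful/unhelpful ratings; the class and method names (ExplanationBandit, choose, update) are illustrative assumptions, not an API from the paper.

    import random

    class ExplanationBandit:
        """Thompson-sampling bandit over candidate explanations.

        Each arm keeps a Beta(helpful + 1, unhelpful + 1) posterior over
        the probability that a learner rates that explanation as helpful.
        """

        def __init__(self, explanations):
            self.explanations = list(explanations)
            self.helpful = [0] * len(self.explanations)
            self.unhelpful = [0] * len(self.explanations)

        def choose(self):
            # Draw one plausible helpfulness rate per arm from its posterior
            # and show the explanation whose draw is highest.
            draws = [random.betavariate(h + 1, u + 1)
                     for h, u in zip(self.helpful, self.unhelpful)]
            return max(range(len(draws)), key=draws.__getitem__)

        def update(self, arm, rated_helpful):
            # Fold a learner's binary rating back into that arm's posterior,
            # so better-rated explanations are shown more often over time.
            if rated_helpful:
                self.helpful[arm] += 1
            else:
                self.unhelpful[arm] += 1

    # Usage: show the chosen explanation to a learner, then record the rating.
    bandit = ExplanationBandit(["worked example A", "worked example B"])
    arm = bandit.choose()
    bandit.update(arm, rated_helpful=True)

This semi-automatic loop matches the abstract's framing: the crowd supplies and rates new explanations, while the algorithm reallocates traffic toward the ones the data favors.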

Williams, J. J., Kim, J., Rafferty, A., Maldonado, S., Gajos, K., Lasecki, W. S., & Heffernan, N. (2016). AXIS: Generating Explanations at Scale with Learnersourcing and Machine Learning. Proceedings of the Third Annual ACM Conference on Learning at Scale.
Williams, J. J., Lombrozo, T., Hsu, A., Huber, B., & Kim, J. (2016). Revising Learner Misconceptions Without Feedback: Prompting for Reflection on Anomalous Facts. Proceedings of CHI 2016, 34th Annual ACM Conference on Human Factors in Computing Systems.
Whitehill, J. (2016). Exploiting an Oracle that Reports AUC Scores in Machine Learning Contests. Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI 2016).