Rosen, Y. (2017). Assessing Students in Human-to-Agent Settings to Inform Collaborative Problem-Solving Learning. Journal of Educational Measurement, 54(1), 36-53.

To understand potential applications of collaborative problem-solving (CPS) assessment tasks, it is necessary to examine empirically how student performance varies across collaboration methods and assessment purposes. Ideally, each student should be matched with various types of group members and should apply the skills in varied contexts and tasks. One solution to these assessment demands is to use computer-based (virtual) agents as the collaborators in interactions with students. This article proposes a human-to-agent (H-A) approach for formative CPS assessment and describes an international pilot study aimed at providing preliminary empirical findings on the use of H-A CPS assessment to inform collaborative learning. Overall, the findings showed promise for using an H-A CPS assessment task as a formative tool for structuring effective groups in the context of CPS online learning.

Reich, J., Stewart, B., Mavon, K., & Tingley, D. (2016). The Civic Mission of MOOCs: Measuring Engagement across Political Differences in Forums. Proceedings of the Third Annual ACM Conference on Learning at Scale.
Williams, J. J., Kim, J., Glassman, E., Rafferty, A., & Lasecki, W. (2016). Making Static Lessons Adaptive through Crowdsourcing & Machine Learning. In Design Recommendations for Intelligent Tutoring Systems: Domain Modeling (Vol. 4).

Text components of digital lessons and problems are often static: they are written once and too often not improved over time. This is true both for large text components like webpages and documents and for the small components that form the building blocks of courses: explanations, hints, examples, discussion questions/answers, emails, study tips, and motivational messages. This represents a missed opportunity, since it should be technologically straightforward to enhance learning by improving text as instructors get new ideas and data is collected about what helps learning. We describe how instructors can use recent work (Williams, Kim, Rafferty, Maldonado, Gajos, Lasecki, & Heffernan, 2016a) to make text components into adaptive resources that semi-automatically improve over time, by combining crowdsourcing methods from human-computer interaction (HCI) with algorithms from statistical machine learning that use data for optimization.

Williams, J. J., Kim, J., Rafferty, A., Maldonado, S., Gajos, K., Lasecki, W. S., & Heffernan, N. (2016). AXIS: Generating Explanations at Scale with Learnersourcing and Machine Learning. Proceedings of the Third Annual ACM Conference on Learning at Scale.
Williams, J. J., Lombrozo, T., Hsu, A., Huber, B., & Kim, J. (2016). Revising Learner Misconceptions Without Feedback: Prompting for Reflection on Anomalous Facts. Proceedings of CHI 2016, 34th Annual ACM Conference on Human Factors in Computing Systems.
Whitehill, J. (2016). Exploiting an Oracle that Reports AUC Scores in Machine Learning Contests. Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI 2016).
Rosen, Y., Ferrara, S., & Mosharraf, M. (2015). Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes) (pp. 824). Information Science Reference, IGI Global.

Changes in the world economy, specifically toward information industries, have changed the skillset demand of many jobs (Organisation for Economic Co-operation and Development [OECD], 2012a). Information is created, acquired, transmitted, and used—rather than simply learned—by individuals, enterprises, organizations, and communities to promote economic and social development. Major employers and policy makers are increasingly asking teachers and educators to help students develop so-called real-world skills (Gallup, 2013). While basic numeracy and literacy skills are still crucial to success in the job market, developing real-world skills is also essential to that success and to worldwide economic development.

Real-world skills, or “21st century skills,” include critical thinking, collaborative problem solving, creativity, and global competency. These skills that facilitate mastery and application of science, mathematics, language arts, and other school subjects will grow in importance over the coming decade (National Research Council, 2012; OECD, 2012a, 2012b). A wide range of initiatives and programs in education promote learning and assessment of real-world skills. These include, for example, the Common Core State Standards (National Governors Association Center for Best Practices and Council of Chief State School Officers, 2010a, 2010b), Next Generation Science Standards (National Research Council, 2013), Common European Framework of Reference (Council of Europe, 2011), Partnership for 21st Century Skills (Partnership for 21st Century Skills, 2009), Education for Life and Work (National Research Council, 2012), and assessment frameworks in the Programme for International Student Assessment (PISA) (OECD, 2013).

Because of the importance of promoting these skills, we have embarked on a journey to create a Handbook of Research on Technology Tools for Real-World Skill Development. Because conceptions and educational applications of real-world skills are evolving rapidly, we have welcomed a wide range of skills in the Handbook. The following four strands of skills are represented in the chapters: Thinking skills refer to higher-order cognition and dispositions such as critical thinking, complex problem solving, metacognition, and learning to learn. Social skills refer to attitudes and behaviors that enable successful communication and collaboration. Global skills refer to attitudes and behaviors that emphasize the individual’s role in, and awareness of, the local as well as the global and multicultural environment. Digital skills emphasize information and digital literacies needed in the technology-rich world in which we live. Similarly, the chapters in this Handbook describe a range of technology tools to support teaching, learning, assessment for learning (e.g., Stiggins, 2005; Wiliam, 2011), feedback for learning (e.g., Hattie & Timperley, 2007; Shute, 2008), and scoring of student responses.

As technology-rich environments for teaching, learning, assessment, and feedback are being integrated into educational processes, there is much to be learned about how to leverage advances in technology, learning sciences, and assessment to develop real-world skills for the 21st century. Research findings on what works best are just emerging, possibly because of the strongly multi-disciplinary approaches required to extract the greatest value. This Handbook is intended to serve as a first body of research in the expanding area of technology tools for teaching, learning, assessment, and feedback on real-world skills, and as a reference that educators can turn to in the coming years. Our aim is to bring together top researchers to summarize concepts and findings. The Handbook contains contributions from leading researchers in learning science, educational psychology, psychometrics, and educational technology. Assuming that many readers will have little grounding in those topics, each chapter outlines theory and basic concepts and connects them to technology tools for real-world skill development. We see this as one of the most crucial contributions of the Handbook: seeking to establish strong theoretical principles that can inform current educational research and practice as well as future research and development.

Emanuel, J. P., & Lamb, A. (2015). Open, Online, and Blended: Transactional Interactions with MOOC Content by Learners in Three Different Course Formats. SSRN.
Williams, J. J., Krause, M., Paritosh, P., Whitehill, J., Reich, J., Kim, J., Mitros, P., et al. (2015). Connecting Collaborative & Crowd Work with Online Education. Proceedings of the 18th ACM Conference Companion on Computer Supported Cooperative Work & Social Computing, 313-318.
Krause, M., Mogale, M., Pohl, H., & Williams, J. J. (2015). A Playful Game Changer: Fostering Student Retention in Online Education with Social Gamification. Proceedings of the Second (2015) ACM Conference on Learning @ Scale, 95-102.
Reich, J. (2015). Rebooting MOOC research. Science, 347(6217), 30-31.
Whitehill, J., Williams, J. J., Lopez, G., Coleman, C., & Reich, J. (2015). Beyond Prediction: First Steps Toward Automatic Intervention in MOOC Student Stopout. Proceedings of the 8th International Conference on Educational Data Mining, 171-178.
Williams, J. J., Kim, J., & Keegan, B. (2015). Supporting Instructors in Collaborating with Researchers using MOOClets. Proceedings of the Second (2015) ACM Conference on Learning @ Scale, 413-416.
Ho, A. D., Chuang, I., Reich, J., Coleman, C. A., Whitehill, J., Northcutt, C. G., Williams, J. J., et al. (2015). HarvardX and MITx: Two Years of Open Online Courses, Fall 2012-Summer 2014. HarvardX Working Paper.
Friedman, M., & Moulton, S. (2015). Science of Living Systems 20, Psychological Science: A Case Study in Educational Research and Assessment. Manuscript.
Mullaney, T., & Reich, J. (2015). Staggered Versus All-at-Once Content Release in Massive Open Online Courses: Evaluating a Natural Experiment. Proceedings of the Second (2015) ACM Conference on Learning @ Scale, 185-194.
Reich, J., Tingley, D., Leder-Luis, J., Roberts, M., & Stewart, B. (2015). Computer-Assisted Reading and Discovery for Student Generated Text in Massive Open Online Courses. Journal of Learning Analytics, 2(1), 156-184.
Champaign, J., Colvin, K. F., Liu, A., Fredericks, C., Seaton, D., & Pritchard, D. E. (2014). Correlating skill and improvement in 2 MOOCs with a student’s time on tasks. Proceedings of the First (2014) ACM Conference on Learning @ Scale (L@S ’14), 11-20.
Nesterko, S. O., Seaton, D., Reich, J., McIntyre, J., Han, Q., Chuang, I., & Ho, A. (2014). Due dates in MOOCs. Proceedings of the First (2014) ACM Conference on Learning @ Scale, 193-194.
Colvin, K. F., Champaign, J., Liu, A., Zhou, Q., Fredericks, C., & Pritchard, D. E. (2014). Learning in an introductory physics MOOC: All cohorts learn equally, including an on-campus class. The International Review of Research in Open and Distributed Learning, 15(4), 263-283.