WORKSHOP ON LEARNING ANALYTIC SERVICES TO SUPPORT PERSONALIZED LEARNING & ASSESSMENT AT SCALE
The 9th International Learning Analytics & Knowledge Conference
This workshop will focus on the conceptual frameworks, algorithms, and examples of performing operational and post hoc analytics for supporting learning and assessment.
Modern computer-supported education systems have matured to the point where there is a nascent need to start a conversation on forming a conceptual schema that covers the wide array of learning-analytic procedures embedded in existing and future products, at every stage of the educational support process, from operational recommendation and adaptation to offline investigation.
These approaches are often grounded in compartmentalized cognitive and psychometric theories and would greatly benefit from a joint discussion of the issues education faces today. Recent developments indicate that society is interested in rethinking learning and assessment systems, which have largely developed separately and have seldom been thought of as complementary parts of a unified Learning Assessment System (LAS). Such compartmentalization has resulted in predominantly incremental improvement of the systems we have. Educators often require formative assessments that operationally reflect classroom teaching, students' learning, and various in-class activities delivered as part of a unified learning experience.
The focus of this workshop is on multidisciplinary research in the areas of personalization and adaptation of digital education and assessment tools. The organizing committee includes experts from the fields of educational measurement, computational psychometrics, and adaptive testing (Alina von Davier), intelligent tutoring systems (Ken Koedinger & Steve Ritter), and machine learning and adaptation methods in education (Michael Yudelson & Peter Brusilovsky). These scientists bring many years of experience and deep knowledge from their respective fields and are passionate about making connections across disciplines. They are actively involved in several research initiatives that explore new forms of teaching and assessment tools combining research from both the machine learning and educational measurement fields.
There is also growing interest in individualized, adaptive learning and performance assessment, and efforts are underway to make these technologies ubiquitous. Traditional approaches are largely unable to explain why students perform as they do and are ill-suited to measuring increasingly important constructs such as behavior, affect, or collaboration. The desire of the field at large is to make progress toward LASs that are educationally effective, reflect realistic educational goals, accommodate student collaboration, and provide reliable instructional support for teachers.
Early attempts to create such systems demonstrate great potential. However, these LASs come with many challenges in terms of measurement, operations, and unification and standardization. Recent advances in applied machine learning (ML) offer opportunities to address these challenges by aggregating and analyzing the Big Data produced when students interact with LASs. Relevant approaches include those that structure data into various forms of learner record stores; those that align instructional and assessment content across theoretically or empirically defined knowledge-component schemas and standards, such as the Common Core State Standards or the ACT Holistic Framework; and analytical platforms and service providers that offer operational and investigational support for adaptive products and for the process of content and assessment delivery.
The workshop will be a full-day event with an estimated 8-11 invited oral presentations. The organizers will moderate the course of the workshop: they will introduce it, lead the discussion after each oral presentation, and summarize it at the close. Workshop attendees will be actively involved in post-presentation Q&A and other discussions throughout the day. The workshop will follow an open attendance model: any conference delegate may register to attend. Each speaker will present their ongoing research and give a brief overview of state-of-the-art methods and applications in their respective field. Below is the list of invited speakers, all of whom have confirmed their participation.
| Speakers | Affiliation | System |
|---|---|---|
| Alina von Davier, Steve Polyak | ACTNext by ACT, Inc. | RAD API |
| Neil Heffernan, Anthony Botelho | Worcester Polytechnic Institute | ASSISTments |
| Kenneth Koedinger | Carnegie Mellon University | LearnSphere |
| Yigal Rosen, Ilia Rushkin | ACTNext by ACT, Inc., Harvard University | HarvardX |
| John Stamper, Mary-Jean Blink | Carnegie Mellon University, TutorGen | Scale |
| Norm Bier | Carnegie Mellon University | Open Learning Initiative (OLI) |
| Michelle Barrett, Bingnan Jiang | ACT | Echo-Adapt |
| Burr Settles, Klinton Bicknell | Duolingo | Duolingo |
| Michael Yudelson, Peter Brusilovsky | ACTNext by ACT, Inc., University of Pittsburgh | PERSEUS |
| Zachary Pardos | University of California, Berkeley | AskOski |
| Steve Ritter | Carnegie Learning, Inc. | Cognitive Tutor |
The schedule (below) consists of 4 sessions, 3 breaks (morning coffee, lunch, and afternoon coffee), and a closing discussion:

8:30 - Registration
8:50 - 9:00 - Introductory remarks
9:00 - 10:30 - Session 1 (3 presentations)
10:30 - 11:00 - Morning tea/coffee
11:00 - 12:00 - Session 2 (2 presentations)
12:00 - 1:00 - Lunch
1:00 - 3:30 - Session 3 (5 presentations)
3:30 - 4:00 - Afternoon tea/coffee
4:00 - 5:00 - Session 4 (2 presentations)
5:00 - 5:10 - Closing remarks
We believe multidisciplinary research and collaboration are key to developing next-generation learning and assessment systems that amass a critical set of adaptive support methods to cater to student needs with the sophistication and pedagogical nuance of a good teacher. This workshop will provide a forum for sharing knowledge and ideas across disciplines, including computational psychometrics, adaptive learning and testing, learning analytics, machine learning, educational measurement, and natural language processing. The research is relevant and timely for advances in learning and performance assessment simulation systems and collaborative LASs. We expect that by bringing together some of the best minds in these fields, we will be able to advance the state of the art and generate growing interest and excitement in this area.