The 9th International Learning Analytics & Knowledge Conference

This workshop will focus on conceptual frameworks, algorithms, and examples of operational and post hoc analytics for supporting learning and assessment.
Modern computer-supported education systems have matured to the point where there is a nascent need to begin a conversation on a conceptual schema covering the wide array of learning analytics procedures embedded in existing and future products at various stages of the educational support process, from operational recommendation and adaptation to offline investigation.

These approaches are often grounded in compartmentalized cognitive and psychometric theories and would greatly benefit from a joint discussion to address the issues education faces today. Recent developments indicate that society is interested in rethinking learning and assessment systems, which have largely developed separately and have seldom been thought of as complementary parts of a unified Learning Assessment System (LAS). Such compartmentalization has resulted in predominantly incremental improvement of the systems we have. Educators often require formative assessments that operationally reflect classroom teaching, students’ learning, and the various in-class activities delivered as part of a unified learning experience.

Tempe, AZ


The focus of this workshop is multidisciplinary research on the personalization and adaptation of digital education and assessment tools. The organizing committee includes experts from the fields of educational measurement, computational psychometrics, and adaptive testing (Alina von Davier); intelligent tutoring systems (Ken Koedinger & Steve Ritter); and machine learning and adaptation methods in education (Michael Yudelson & Peter Brusilovsky). These scientists bring many years of experience and deep knowledge from their respective fields and are passionate about making connections across disciplines. They are actively involved in several research initiatives exploring new forms of teaching and assessment tools that combine research from the machine learning and educational measurement fields.

There is also growing interest in individualized, adaptive performance assessment and learning, and efforts are being made to make these technologies ubiquitous. Traditional approaches are largely unable to explain why students perform as they do and are ill-suited to measuring increasingly important constructs such as behavior, affect, or collaboration. The field at large aims to make progress toward LASs that are educationally effective, reflect realistic educational goals, accommodate student collaboration, and provide reliable instructional support for teachers.

Early attempts to create such systems demonstrate great potential. However, these LASs come with many challenges in measurement, operations, and unification and standardization. Recent advances in applied machine learning (ML) offer opportunities to address these challenges by aggregating and analyzing the Big Data produced when students interact with LASs. Other approaches include structuring data into various forms of learner record stores; aligning instructional and assessment content across theoretically or empirically defined knowledge component schemas and standards, such as the Common Core State Standards or the ACT Holistic Framework; and analytical platforms and service providers that offer operational and investigational support for adaptive products and for content and assessment delivery.


Peter Brusilovsky University of Pittsburgh


Kenneth Koedinger Carnegie Mellon University


Steve Ritter Carnegie Learning, Inc.


Alina von Davier ACTNext


Michael Yudelson ACTNext


The workshop will be a full-day event with an estimated 8-11 invited oral presentations. The organizers will moderate the course of the workshop: they will introduce the workshop, lead the discussion after each oral presentation, and summarize the workshop at its close. Workshop attendees will be actively involved in post-presentation Q&A and other discussions throughout the workshop. The workshop will follow an open attendance model: any conference delegate may register to attend. Each speaker will present their ongoing research and give a brief overview of state-of-the-art methods and applications in their respective field. Below is the list of invited speakers, all of whom have confirmed their participation.

Name(s) | Affiliation(s) | Product
Alina von Davier, Steve Polyak | ACTNext by ACT, Inc. | RAD API
Neil Heffernan, Anthony Botelho | Worcester Polytechnic Institute | ASSISTments
Kenneth Koedinger | Carnegie Mellon University | LearnSphere
Yigal Rosen, Ilia Rushkin | ACTNext by ACT, Inc.; Harvard University | HarvardX
John Stamper, Mary-Jean Blink, Norm Bier | Carnegie Mellon University; TutorGen | Scale
Michelle Barrett | ACT | Echo-Adapt
Burr Settles, Researcher (TBD) | Duolingo | Duolingo
Michael Yudelson, Peter Brusilovsky | ACTNext by ACT, Inc.; University of Pittsburgh | PERSEUS
Marsha Lovett | Acrobatiq |
Zachary Pardos | University of California, Berkeley |
Lou Pugliese, Elle Wang | Arizona State University | EdPlus
Steve Ritter | Carnegie Learning, Inc. | Cognitive Tutor

The tentative schedule (below) consists of four sessions, three breaks (two coffee breaks and lunch), and a closing discussion:

Activity | Time Slot
Introductory remarks | 9:00-9:15
Session 1 (3 presentations) | 9:15-10:45
Coffee break | 10:45-11:00
Session 2 (3 presentations) | 11:00-12:30
Lunch | 12:30-1:30
Session 3 (3 presentations) | 1:30-3:00
Coffee break | 3:00-3:15
Session 4 (2 presentations) | 3:15-4:15
Discussion & Closing | 4:15-4:45


Full details regarding submission guidelines will be forthcoming. Please contact Michael Yudelson with any preliminary questions you may have.


Sign up to stay up to date with the latest ACTNext news & use the hashtag #LAK19 on Twitter!