MARCH 4, 2019 | 9 AM - 5 PM | TEMPE, ARIZONA


The 9th International Learning Analytics & Knowledge Conference

This workshop will focus on conceptual frameworks, algorithms, and worked examples of operational and post hoc analytics for supporting learning and assessment.
Modern computer-supported education systems have matured to the point where a conversation is needed on forming a conceptual schema that covers the wide array of learning-analytics procedures embedded in existing and future products, spanning the educational support process from operational recommendation and adaptation to offline investigation.

These approaches are often grounded in compartmentalized cognitive and psychometric theories and would greatly benefit from a joint discussion to address the issues education is facing today. Recent developments indicate that society is interested in rethinking learning and assessment systems, which have largely been developed separately and are seldom thought of as complementary parts of a unified Learning Assessment System (LAS). This compartmentalization has resulted in predominantly incremental improvement of the systems we have. Educators often require formative assessments that operationally reflect classroom teaching, students' learning, and various in-class activities delivered as part of a unified learning experience.

Learning Analytics & Knowledge Conference
Arizona State University
Tempe, AZ


The focus of this workshop is on multidisciplinary research in the personalization and adaptation of digital education and assessment tools. The organizing committee includes experts from the fields of educational measurement, computational psychometrics, and adaptive testing (Alina von Davier); intelligent tutoring systems (Ken Koedinger & Steve Ritter); and machine learning and adaptation methods in education (Michael Yudelson & Peter Brusilovsky). These scientists bring many years of experience and deep knowledge from their respective fields and are passionate about making connections across disciplines. They are actively involved in several research initiatives that explore new forms of teaching and assessment tools combining research from the machine learning and educational measurement fields.

There is also growing interest in performance assessment and learning that are individualized and adaptive, and efforts are underway to make these technologies ubiquitous. Traditional approaches are largely unable to explain why students perform as they do and are ill-suited to measuring increasingly important constructs such as behavior, affect, or collaboration. The field at large seeks progress toward LASs that are educationally effective, reflect realistic educational goals, accommodate student collaboration, and provide reliable instructional support for teachers.

Early attempts to create such systems demonstrate great potential. However, these LASs come with many challenges in terms of measurement, operations, and unification and standardization. Recent advances in applied machine learning (ML) offer opportunities to address these challenges by aggregating and analyzing the Big Data produced when students interact with LASs. Other approaches structure data into various forms of learner record stores; align instructional and assessment content across theoretically or empirically defined knowledge-component schemas and standards, such as the Common Core State Standards or the ACT Holistic Framework; and provide analytical platforms and services that offer operational and investigational support for adaptive products and for content and assessment delivery.


Peter Brusilovsky University of Pittsburgh


Kenneth Koedinger Carnegie Mellon University


Steve Ritter Carnegie Learning, Inc.


Alina von Davier ACTNext


Michael Yudelson ACTNext


The workshop will run a full day, with 8-11 invited oral presentations. The organizers will moderate the course of the workshop: they will open the workshop, lead the discussion after each oral presentation, and summarize at the close. Workshop attendees will be actively involved in post-presentation Q&A and other discussions throughout the workshop. The workshop will follow an open attendance model: any conference delegate may register to attend. Each speaker will present their ongoing research and give a brief overview of the state-of-the-art methods and applications in their respective field. Below is the list of invited speakers, all of whom have confirmed their participation.

Name(s) | Affiliation(s) | Product
Alina von Davier, Steve Polyak | ACTNext by ACT, Inc. | RAD API
Neil Heffernan, Anthony Botelho | Worcester Polytechnic Institute | ASSISTments
Kenneth Koedinger | Carnegie Mellon University | LearnSphere
Yigal Rosen, Ilia Rushkin | ACTNext by ACT, Inc.; Harvard University | HarvardX
John Stamper, Mary-Jean Blink | Carnegie Mellon University; TutorGen | SCALE
Norm Bier | Carnegie Mellon University | Open Learning Initiative (OLI)
Michelle Barrett, Bingnan Jiang | ACT | Echo-Adapt
Burr Settles, Klinton Bicknell | Duolingo | Duolingo
Michael Yudelson, Peter Brusilovsky | ACTNext by ACT, Inc.; University of Pittsburgh | PERSEUS
Zachary Pardos | University of California, Berkeley | AskOski
Josine Verhagen | Kidaptive | Kidaptive
Steve Ritter | Carnegie Learning, Inc. | Cognitive Tutor

The schedule (below) consists of four sessions, three breaks (morning coffee, lunch, and afternoon coffee), and closing remarks:

8:30 - Registration
8:50 - 9:00 - Introductory remarks
9:00 - 10:30 - Session 1 (3 presentations):
  • 9:00 - 9:30 - T. Carmichael, M.J. Blink, & J. Stamper
  • TutorGen SCALE® - Student Centered Adaptive Learning Engine
  • 9:30 - 10:00 - M. Barrett & B. Jiang
  • Expanding Adaptive Algorithms in New Ways: Echo-Adapt Software-As-A-Service
  • 10:00 - 10:30 - N. Bier, S. Moore, M. Van Velsen
  • Instrumenting Courseware and Leveraging Data with the Open Learning Initiative (OLI)
10:30 - 11:00 - Morning tea/coffee
11:00 - 12:00 - Session 2 (2 presentations):
  • 11:00 - 11:30 - S. Polyak, M. Yudelson, K. Peterschmidt, B. Deonovic, & A. von Davier
  • Enhancing Test Preparation via Continuous Tracking of Practice Assessment Analytics & Personalized Resource Recommendations
  • 11:30 - 12:00 - Klinton Bicknell & Burr Settles
  • Predicting student knowledge at scale at Duolingo
12:00 - 1:00 - Lunch
1:00 - 3:30 - Session 3 (5 presentations):
  • 1:00 - 1:30 - Michael Yudelson & Peter Brusilovsky
  • PERSEUS -- A Personalization Services Engine for Online Learning
  • 1:30 - 2:00 - K. Koedinger, J. Stamper, & P. Carvalho
  • LearnSphere: Learning Analytics Development and Sharing Made Simple
  • 2:00 - 2:30 - Zachary Pardos
  • AskOski: Using University Enrollment Data to Surface Novel Semantic Structure and Personalized Guidance
  • 2:30 - 3:00 - S. Ritter, S. Fancsali, M. Sandbothe, & R. Hausmann
  • Conceptual Change as Evidence of Learning
  • 3:00 - 3:30 - J. Verhagen, D. Hatfield, D. Arena
  • Kidaptive's Journey Towards a Scalable Learning Analytics Solution
3:30 - 4:00 - Afternoon tea/coffee
4:00 - 5:00 - Session 4 (2 presentations):
  • 4:00 - 4:30 - A. Botelho, A. Sales, T. Patikorn, & N. Heffernan
  • The ASSISTments TestBed: Opportunities and Challenges of Experimentation in Online Learning Platforms
  • 4:30 - 5:00 - Y. Rosen, G. Lopez, I. Rushkin, A. Ang, D. Tingley
  • The Effects of Adaptive Learning in a Massive Open Online Course on Learners' Skill Development
5:00 - 5:10 - Closing Remarks


We believe multidisciplinary research and collaboration are key to developing next-generation learning and assessment systems that combine a critical mass of adaptive support methods to meet student needs with the sophistication and pedagogical nuance of a good teacher. This workshop provides a forum for sharing knowledge and ideas across disciplines, including computational psychometrics, adaptive learning and testing, learning analytics, machine learning, educational measurement, and natural language processing. The research is relevant and timely for advances in learning and performance assessment simulation systems and collaborative LASs. We expect that by bringing together some of the best minds in these fields, we will be able to further the state of the art and generate increasing interest and excitement in this area.


Sign up to stay up to date with the latest ACTNext news & use the hashtag #LAK19 on Twitter!