
EAKT: Embedding Cognitive Framework with Attention for Interpretable Knowledge Tracing

Requirements: Python 3.7, PyTorch 1.3, cuDNN 7.6.1



Figure 1. The architecture of EAKT.

About

This is an implementation of the EAKT model, described in the paper EAKT: Embedding Cognitive Framework with Attention for Interpretable Knowledge Tracing. The datasets generated and/or analysed during the study are available in this repository: https://github.com/ranydb/EAKT

Contributors

State Key Laboratory of Software Development Environment Admire Group, School of Computer Science and Engineering, Beihang University

Dataset

The csv folder contains four datasets: assist2009, assist2015, assist2017, and simulation. Each dataset includes three files: training data, test data, and Q-matrix data.
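A Q-matrix is a binary question-by-KC incidence matrix mapping each question to the knowledge components it involves. The toy array below illustrates the idea only; it is made up and is not taken from the repository's actual Q-matrix files:

```python
import numpy as np

# Toy Q-matrix: rows are questions, columns are knowledge components (KCs).
# q_matrix[i, j] == 1 means question i involves KC j. Values here are
# illustrative, not the repository's data.
q_matrix = np.array([
    [1, 0, 0],   # question 0 involves KC 0
    [1, 1, 0],   # question 1 involves KCs 0 and 1
    [0, 0, 1],   # question 2 involves KC 2
])

# KCs required by question 1:
required_kcs = np.flatnonzero(q_matrix[1])
print(required_kcs)  # [0 1]
```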

Statistical information for these datasets is given in the following table.

| Dataset    | Students | Questions | Interactions | Average length | Maximum length |
|------------|----------|-----------|--------------|----------------|----------------|
| ASSIST2009 | 4,151    | 110       | 325,637      | 78             | 1,261          |
| ASSIST2015 | 19,917   | 100       | 708,631      | 35             | 632            |
| ASSIST2017 | 1,709    | 102       | 942,816      | 551            | 3,057          |
| Simu       | 20,000   | 30        | 1,000,000    | 50             | 50             |

Usage

Usage:
    run.py (eakt) --data=<h> --questions=<h> [options]

Options:
    --length=<int>                      max length of question sequence [default: 50]
    --questions=<int>                   number of questions [default: 124]
    --lr=<float>                        learning rate [default: 0.001]
    --bs=<int>                          batch size [default: 64]
    --seed=<int>                        random seed [default: 59]
    --epochs=<int>                      number of epochs [default: 30]
    --cuda=<int>                        use GPU id [default: 0]
    --hidden=<int>                      dimension of hidden state [default: 128]
    --kc=<int>                          knowledge components dimension [default: 10]
    --layers=<int>                      layers of rnn or transformer [default: 1]
    --heads=<int>                       head number of transformer [default: 8]
    --dropout=<float>                   dropout rate [default: 0.1]
    --beta=<float>                      reduce rate of MyModel [default: 0.95]
    --data=<string>                     dataset [default: assist2009]
    --kernels=<int>                     the kernel size of CNN [default: 7]
    --memory_size=<int>                 memory size of DKVMN model [default: 20]
    --weight_decay=<float>              weight_decay of optimizer [default: 0]
    --sigmoida=<float>                  coefficient of the custom sigmoid function [default: 5]
    --sigmoidb=<float>                  constant of the custom sigmoid function [default: 6.9]
    --save_model=<bool>                 whether to save the KT model [default: true]
    --save_epoch=<int>                  the epoch to save the KT model [default: 0]
    --gpu=<int>                         select the gpu card [default: 0]

Examples

# Run EAKT model with assist2009 dataset.
python -m evaluation.run eakt --data=assist2009updated --questions=110 --kc=30 --heads=8 --hidden=128 --bs=5 --epochs=20 --save_epoch=19 --lr=0.001 --weight_decay=0.000001 --sigmoida=1 --sigmoidb=0

# Run EAKT model with assist2015 dataset.
python -m evaluation.run eakt --data=assist2015 --questions=100 --kc=10 --heads=8 --hidden=128 --bs=64 --epochs=20 --save_epoch=19 --lr=0.001 --weight_decay=0 --sigmoida=1 --sigmoidb=0

# Run EAKT model with assist2017 dataset.
python -m evaluation.run eakt --data=assist2017 --questions=102 --kc=10 --heads=8 --hidden=128 --bs=5 --epochs=20 --save_epoch=19 --lr=0.001 --weight_decay=0.000001 --sigmoida=1 --sigmoidb=0

# Run EAKT model with simu dataset.
python -m evaluation.run eakt --data=simu --questions=30 --kc=10 --heads=8 --hidden=256 --bs=32 --epochs=10 --save_epoch=9 --lr=0.00001 --weight_decay=0 --sigmoida=10 --sigmoidb=6.9
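The --sigmoida/--sigmoidb flags suggest a parameterized sigmoid of the form sigma(a*x - b), with a the coefficient and b the constant. The exact formula used in the code is not documented here, so the function below is a guessed sketch for illustration only:

```python
import math

def custom_sigmoid(x, a=5.0, b=6.9):
    # Assumed form of the custom sigmoid controlled by --sigmoida (a) and
    # --sigmoidb (b); the repository's actual formula may differ.
    return 1.0 / (1.0 + math.exp(-(a * x - b)))

# With a=1, b=0 this reduces to the standard logistic function.
print(custom_sigmoid(0.0, a=1.0, b=0.0))  # 0.5
```

Note that with a=1 and b=0 (as in the assist2009/2015/2017 examples above), this form degenerates to the plain logistic function.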
