SML

This is an implementation for our SIGIR 2020 paper: How to Retrain Recommender System? A Sequential Meta-Learning Method.

Contributors: Yang Zhang, Chenxu Wang, Fuli Feng, Xiangnan He

Requirements

pytorch >= 1.2

numpy

Parameters

  • --MF_lr: learning rate for $\hat{w}_t$
  • --TR_lr: learning rate for the Transfer component
  • --l2: $\lambda_1$ in the paper
  • --TR_l2: $\lambda_2$ in the paper
  • --MF_epochs: epochs for learning MF $\hat{w}_t$ (line 6 of Alg. 1 in the paper)
  • --TR_epochs: epochs for learning the Transfer component $\theta$ (line 9 of Alg. 1 in the paper)
  • --multi_num: stop condition (line 4 of Alg. 1 in the paper)
  • others: see the help text, e.g. python main_yelp.py --help
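For reference, a hypothetical sketch of how the flags above could be parsed with argparse; the flag names follow the list above, but the default values here are assumptions, not the repo's actual defaults.

```python
# Sketch of an argparse setup for the SML command-line flags.
# Flag names match the README; defaults are assumptions.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="SML training options (sketch)")
    parser.add_argument("--MF_lr", type=float, default=0.01)   # lr for hat{w}_t
    parser.add_argument("--TR_lr", type=float, default=0.001)  # lr for Transfer
    parser.add_argument("--l2", type=float, default=1e-4)      # lambda_1
    parser.add_argument("--TR_l2", type=float, default=1e-4)   # lambda_2
    parser.add_argument("--MF_epochs", type=int, default=1)    # line 6, Alg. 1
    parser.add_argument("--TR_epochs", type=int, default=1)    # line 9, Alg. 1
    parser.add_argument("--multi_num", type=int, default=10)   # line 4, Alg. 1
    return parser

# Equivalent of: python main_yelp.py --MF_epochs=1 --TR_epochs=1 --multi_num=10
args = build_parser().parse_args(["--MF_epochs=1", "--TR_epochs=1", "--multi_num=10"])
```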

Dataset

The datasets are saved as arrays.
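As a minimal sketch of this "save as array" convention, one period's interactions could be stored as a NumPy array of (user, item) rows; the file name and column layout here are assumptions, not the repo's exact format.

```python
# Hypothetical example of saving/loading a dataset period as a NumPy array.
import numpy as np

interactions = np.array([[0, 10], [0, 25], [1, 10]])  # (user_id, item_id) rows
np.save("period_0.npy", interactions)                 # assumed file name

loaded = np.load("period_0.npy")                      # round-trips unchanged
```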

Examples

Yelp

nohup python main_yelp.py --MF_epochs=1 --TR_epochs=1 --multi_num=10 > yelp_log.out &

or

python main_yelp.py --MF_epochs=1 --TR_epochs=1 --multi_num=10

Adressa

nohup python main_news.py --MF_epochs=2 --TR_epochs=2 --multi_num=7 > news_log.out &

For Adressa, if MF_epochs and TR_epochs are set to the same values as in the paper (both 1), similar performance can also be obtained by adjusting multi_num.
