HyperKDMA: Distilling Recommender Systems via Hypernetwork-based Teacher Assistants

Overview

This is the implementation of our paper HyperKDMA: Distilling Recommender Systems via Hypernetwork-based Teacher Assistants. In this work, we propose HyperKDMA, a distillation scheme that uses multiple hypernetwork-based teacher assistants to bridge the teacher-student gap in knowledge distillation for top-K recommendation. We verify the effectiveness of our method through experiments on three base models (BPR, NeuMF, and LightGCN) and two public datasets (CiteULike and Foursquare).
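
For intuition, below is a minimal PyTorch sketch of the core idea: a hypernetwork that generates a personalized teacher-assistant mapping from a low-dimensional student embedding toward the high-dimensional teacher embedding. The dimensions, layer shapes, and the MSE distillation loss are illustrative assumptions, not the exact architecture or objective from the paper.

    # Minimal, illustrative sketch; NOT the exact HyperKDMA architecture.
    import torch
    import torch.nn as nn

    class HyperTA(nn.Module):
        """Hypernetwork that generates, per user/item, the weights of a small
        linear teacher assistant mapping student embeddings (d_s) to an
        intermediate dimension (d_ta) between the student and teacher sizes."""
        def __init__(self, d_in, d_s, d_ta):
            super().__init__()
            # Condition on an input vector (e.g. a user/item ID embedding)
            # and output a personalized weight matrix and bias.
            self.weight_gen = nn.Linear(d_in, d_ta * d_s)
            self.bias_gen = nn.Linear(d_in, d_ta)
            self.d_s, self.d_ta = d_s, d_ta

        def forward(self, cond, student_emb):
            # cond: (batch, d_in), student_emb: (batch, d_s)
            W = self.weight_gen(cond).view(-1, self.d_ta, self.d_s)
            b = self.bias_gen(cond)
            # Apply the generated (personalized) linear map to the student embedding.
            return torch.bmm(W, student_emb.unsqueeze(-1)).squeeze(-1) + b

    # Toy example: distill a 200-dim teacher into a 20-dim student via a 64-dim TA.
    teacher_emb = torch.randn(32, 200)   # teacher user embeddings
    student_emb = torch.randn(32, 20)    # student user embeddings
    cond = torch.randn(32, 20)           # conditioning input (assumed)
    ta = HyperTA(d_in=20, d_s=20, d_ta=64)
    proj = nn.Linear(64, 200)            # map TA output to teacher space (assumed)
    loss = nn.functional.mse_loss(proj(ta(cond, student_emb)), teacher_emb)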

Usage

  1. Clone the repo
    git clone https://github.com/hieunm44/hyperkdma.git
    cd hyperkdma
  2. Generate datasets
    python3 gen_dataset_seed.py
    The dataset files will be generated in the datasets folder.
  3. Train a teacher model
    python3 main_no_KD.py --model BPR --dim 200 --dataset CiteULike
  4. Now you can train a student model in several ways, for example (a convenience driver combining these steps is sketched after this list):
  • Train a student model without KD
    python3 main_no_KD.py --model BPR --dim 20 --dataset CiteULike
  • Train a student model with KD using DE
    python3 main_DE.py --model BPR --teacher_dim 200 --student_dim 20 --dataset CiteULike
  • Train a student model with KD using HyperKDMA-DE
    python3 main_DETA.py --model BPR --teacher_dim 200 --student_dim 20 --num_TAs 8 --dataset CiteULike
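
For convenience, the steps above can be chained into a single driver script. The sketch below simply shells out to the training scripts; the script names and flags are taken from this README (with an assumed .py extension), so adjust them if they differ in your checkout.

    # Convenience driver; script names and flags assumed to match the steps above.
    import subprocess

    def run(cmd):
        """Print and run one command, stopping on the first failure."""
        print(">>", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Generate the datasets.
    run(["python3", "gen_dataset_seed.py"])
    # 2. Train the 200-dim BPR teacher on CiteULike.
    run(["python3", "main_no_KD.py", "--model", "BPR", "--dim", "200", "--dataset", "CiteULike"])
    # 3. Distill into a 20-dim student with HyperKDMA-DE and 8 teacher assistants.
    run(["python3", "main_DETA.py", "--model", "BPR", "--teacher_dim", "200",
         "--student_dim", "20", "--num_TAs", "8", "--dataset", "CiteULike"])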

Results

We compare our model with the following competitors: Distillation Experts (DE), Personalized Hint Regression (PHR), Knowledge Distillation via Teacher Assistant (TAKD), and Densely Guided Knowledge Distillation (DGKD). HyperKDMA significantly outperforms the other KD methods thanks to its personalized learning mechanism.
