Publications

DDG-DA: Data Distribution Generation for Predictable Concept Drift Adaptation

Published in AAAI, 2022

Wendi Li, Xiao Yang, Weiqing Liu, Yingce Xia, Jiang Bian

Paper link: [arXiv] [CODE] [POSTER] [REPORT (Chinese)]

  • Proposed DDG-DA, a meta-learning method for concept drift adaptation. The meta-model learns to guide the retraining of the base model through sample reweighting, aiming to improve the base model's performance on unseen test data (see the sketch after this list).
  • Compared to the baseline, the meta-model improved the signal-based metrics by $11.3\%$ and the portfolio-based metrics by $46.7\%$.
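The reweighting idea can be sketched roughly as below. This is a minimal illustration under assumptions, not the released DDG-DA code: a fixed time-decay heuristic stands in for the learned meta-model, and the function names (`weight_samples`, `retrain_base_model`) are made up for the example.

```python
# Minimal sketch of reweighted retraining under concept drift (illustrative only).
import numpy as np
from sklearn.linear_model import Ridge

def weight_samples(timestamps, half_life=60.0):
    """Assumed heuristic: emphasize recent samples so the retrained base model
    better matches the upcoming (drifted) data distribution.
    DDG-DA instead *learns* the weights with a meta-model."""
    age = timestamps.max() - timestamps
    return 0.5 ** (age / half_life)

def retrain_base_model(X, y, weights):
    # Retrain the base model on the reweighted samples.
    model = Ridge(alpha=1.0)
    model.fit(X, y, sample_weight=weights)
    return model

# Toy rolling-retraining step: reweight the history, then refit.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(500, 8)), rng.normal(size=500)
t = np.arange(500, dtype=float)
model = retrain_base_model(X, y, weight_samples(t))
```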

Population-based Hyperparameter Tuning with Multi-task Collaboration

Published in IEEE Transactions on Neural Networks and Learning Systems, 2021

Wendi Li, Ting Wang, Wing W. Y. Ng

Paper link: [WEB] [PDF]

  • Proposed a population-based hyperparameter tuning framework with multi-task collaboration. For tasks in the task pool, it finds hyperparameter configurations that work for the current task and can be reused on other tasks after slight re-tuning (a rough sketch of such a loop follows this list).
  • Results show that neural networks trained with this method generalize significantly better, and that the multi-task meta-learning design yields further performance gains.
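A rough sketch of how a population-based, cross-task tuning loop might look. The exploit/explore rule, the toy tasks, and the function names below are assumptions for illustration, not the procedure from the paper.

```python
# Illustrative population-based tuning with cross-task sharing (not the paper's algorithm).
import random

def evaluate(task, config):
    # Placeholder: train briefly on `task` with `config` and return a score.
    return -((config["lr"] - task["best_lr"]) ** 2)

def perturb(config):
    # Explore: slightly re-tune a copied configuration for the new context.
    return {"lr": config["lr"] * random.choice([0.8, 1.2])}

tasks = [{"best_lr": 0.01}, {"best_lr": 0.1}]                      # task pool
population = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(8)]

for step in range(20):
    # Score each config across tasks, then let weak members copy a strong
    # configuration (possibly found on another task) and perturb it.
    scores = [max(evaluate(t, c) for t in tasks) for c in population]
    order = sorted(range(len(population)), key=lambda i: scores[i])
    for weak, strong in zip(order[:2], order[-2:]):
        population[weak] = perturb(population[strong])
```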

HELP: An LSTM-based Approach to Hyperparameter Exploration in Neural Network Learning

Published in Neurocomputing, 2021

Wendi Li, Wing W. Y. Ng, Ting Wang, Marcello Pelillo, Sam Kwong

Paper link: [WEB]

  • Addressed a learning-to-learn problem: automatically tuning the hyperparameters of deep neural networks with an LSTM-based predictor (a minimal sketch follows this list).
  • The proposed LSTM-based predictor not only retains successful historical examples but also learns from failed ones; ablation studies were used to determine its architecture.
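A minimal sketch of an LSTM-based predictor that scores a candidate hyperparameter configuration from the history of past (configuration, performance) trials, including failed ones. The dimensions, class name, and training objective are illustrative assumptions, not the HELP architecture.

```python
# Illustrative LSTM predictor over hyperparameter-trial histories (assumed design).
import torch
import torch.nn as nn

class HyperparamLSTM(nn.Module):
    def __init__(self, n_hparams=4, hidden=32):
        super().__init__()
        # Each history step is a past configuration plus its observed score
        # (successes and failures alike), so the model can learn from both.
        self.lstm = nn.LSTM(n_hparams + 1, hidden, batch_first=True)
        self.head = nn.Linear(hidden + n_hparams, 1)

    def forward(self, history, candidate):
        # history: (batch, steps, n_hparams + 1); candidate: (batch, n_hparams)
        _, (h, _) = self.lstm(history)
        return self.head(torch.cat([h[-1], candidate], dim=-1)).squeeze(-1)

model = HyperparamLSTM()
hist = torch.randn(2, 5, 5)      # 5 past trials: 4 hyperparameters + 1 score each
cand = torch.randn(2, 4)         # candidate configurations to score
print(model(hist, cand).shape)   # torch.Size([2])
```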