Hello there! I’m Guodong Du, currently a research assistant at the Knowledge and Language Computing Lab @Harbin Institute of Technology (Shenzhen). Prior to joining HITSZ, I was a research intern and master’s student in the Learning and Vision Lab @National University of Singapore, advised by Professors Xinchao Wang and Jiashi Feng. I also worked as an intern on low-level computer vision at Huawei Noah’s Ark Lab in Shenzhen, China, mentored by Xueyi Zou.

My research interests include knowledge transfer, fusion, and compression; multimodal learning; heuristic algorithms; spiking neural networks; and low-level computer vision.

🔥 News

Good news and bad news:

  • 2025.05:   Two first-author papers are submitted to NeurIPS 2025.
  • 2025.05:  🎉 Two first-author papers are accepted by ACL 2025 main. Thanks to all my collaborators!!
  • 2025.05:   One first-author paper is rejected by ICML 2025.
  • 2025.01:   One first-author paper is rejected by ICLR 2025.
  • 2024.12:   One first-author paper is rejected by AAAI 2025.
  • 2024.09:  🎉 One first-author paper is accepted by NeurIPS 2024. Thanks to all my collaborators!!
  • 2024.05:  🎉 One first-author paper is accepted by ACL 2024 Findings. Thanks to all my collaborators!!
  • 2023.12:   Two first-author papers are rejected by ICLR 2024.

📝 Recent Projects

  • Knowledge Fusion: A Comprehensive Survey. GitHub Repo.
  • Multi-objective LLM Alignment with Hierarchical Experts. [PDF], [Poster], submitted to NeurIPS 2025.

📝 Publications

arXiv 2024

Knowledge Grafting of Large Language Models

Guodong Du, Xuanning Zhou, Junlin Lee, Zhuo Li, Wanyu Lin, Jing Li

paper | code | poster

  • We propose a knowledge grafting approach that efficiently transfers capabilities from heterogeneous LLMs to a target LLM through modular SkillPacks.
ACL 2025 main

Neural Parameter Search for Slimmer Fine-Tuned Models and Better Transfer

Guodong Du, Zitao Fang, Junlin Lee, Runhua Jiang, Jing Li

paper | code | poster

  • We propose Neural Parameter Search to enhance the efficiency of pruning fine-tuned models for better knowledge transfer, fusion, and compression of LLMs.
ACL 2025 main

Multi-Modality Expansion and Retention for LLMs through Parameter Merging and Decoupling

Junlin Lee, Guodong Du*, Wenya Wang, Jing Li

paper | code | poster

  • We propose MMER (Multi-modality Expansion and Retention), a training-free approach that integrates existing MLLMs for effective multimodal expansion while retaining their original performance.
NeurIPS 2024

Parameter Competition Balancing for Model Merging

Guodong Du, Junlin Lee, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang, Min Zhang

paper | code | poster

  • We re-examine existing model merging methods, emphasizing the critical importance of parameter competition awareness, and introduce PCB-Merging, which effectively adjusts parameter coefficients.
ACL 2024 Findings

Knowledge Fusion By Evolving Weights of Language Models

Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang

paper | code | poster

  • We introduce Model Evolution, a novel knowledge fusion method that evolves the weights of (large) language models; to our knowledge, it is the first approach to evolve neural parameters with differential evolution algorithms.

    (*: Co-first Author)

🎖 Honors and Awards

  • 2021.10 3rd Place Award in the NTIRE 2021 Challenge on Video Super-Resolution, Track I (New Trends in Image Restoration and Enhancement Workshop, CVPR 2021)
  • 2017.09 National Encouragement Scholarship
  • 2016.05 Honorable Mention in MCM (The Mathematical Contest in Modeling)
  • 2016.05 National Encouragement Scholarship
  • 2015.05 National Scholarship

📖 Education

  • 2019.09 - 2021.05, Part-time PhD student, National University of Singapore (NUS)
  • 2018.08 - 2019.05, M.S., National University of Singapore (NUS)
  • 2016.07 - 2016.12, Visiting Student, City University of Hong Kong (CityU) (Non-degree Undergraduate Exchange)
  • 2014.08 - 2018.06, B.S., University of Electronic Science and Technology of China (UESTC)

💻 Internships