Hello there! I’m Guodong Du, currently pursuing my PhD at The Hong Kong Polytechnic University (PolyU). Before that, I was a research assistant at the Knowledge and Language Computing Lab @ Harbin Institute of Technology (Shenzhen). Prior to joining HITSZ, I was a research intern and master’s student in the Learning and Vision Lab @ National University of Singapore, advised by Professors Xinchao Wang and Jiashi Feng. I also worked as an intern on low-level computer vision, mentored by Xueyi Zou, at Huawei Noah’s Ark Lab, Shenzhen, China.

My research interests include vision-language-action (VLA) models, reinforcement learning for embodied intelligence, knowledge transfer, fusion, and compression, multimodal learning, heuristic algorithms, spiking neural networks, and low-level computer vision.

🔥 News


  • 2026.01:  🎉 One first-author and one co-first-author paper are accepted by ICLR 2026.
  • 2025.08:  🎉 One co-corresponding-author paper is accepted by EMNLP 2025 main.
  • 2025.05:  🎉 One first-author and one co-first-author paper are accepted by ACL 2025 main.
  • 2024.09:  🎉 One first-author paper is accepted by NeurIPS 2024.
  • 2024.05:  🎉 One first-author paper is accepted by ACL 2024 Findings.

📝 Recent Projects

  • FuseVLA: VLA + RL
  • Knowledge Fusion: A Comprehensive Survey. GitHub Repo

📝 Publications

ICLR 2026

Knowledge Grafting of Large Language Models

Guodong Du, Xuanning Zhou, Junlin Lee, Zhuo Li, Wanyu Lin, Jing Li

paper | code | poster

  • We propose a knowledge grafting approach that efficiently transfers capabilities from heterogeneous LLMs to a target LLM through modular SkillPacks.
ICLR 2026

Multi-objective Large Language Model Alignment with Hierarchical Experts

Zhuo Li, Guodong Du*, Wenya Wang, Min Zhang, Jing Li

paper | code | poster

  • HoE consists of three hierarchical components — LoRA Experts, Router Experts, and Preference Routing — reaching optimal Pareto frontiers and achieving a trade-off among parameter size, training cost, and performance.
EMNLP 2025 main

To See a World in a Spark of Neuron: Disentangling Multi-task Interference for Training-free Model Merging

Zitao Fang, Guodong Du*, Jing Li, Ho-Kin Tang, Sim Kuan Goh

paper | code | poster

  • We introduce NeuroMerging, a novel merging framework that mitigates task interference within neuronal subspaces, enabling training-free model fusion across diverse tasks.
ACL 2025 main

Neural Parameter Search for Slimmer Fine-Tuned Models and Better Transfer

Guodong Du, Zitao Fang, Junlin Lee, Runhua Jiang, Jing Li

paper | code | poster

  • We propose Neural Parameter Search to enhance the efficiency of pruning fine-tuned models for better knowledge transfer, fusion, and compression of LLMs.
ACL 2025 main

Multi-Modality Expansion and Retention for LLMs through Parameter Merging and Decoupling

Junlin Lee, Guodong Du*, Wenya Wang, Jing Li

paper | code | poster

  • We propose MMER (Multi-modality Expansion and Retention), a training-free approach that integrates existing MLLMs for effective multimodal expansion while retaining their original performance.
NeurIPS 2024

Parameter Competition Balancing for Model Merging

Guodong Du, Junlin Lee, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang, Min Zhang

paper | code | poster

  • We re-examine existing model merging methods, emphasizing the critical importance of parameter competition awareness, and introduce PCB-Merging, which effectively adjusts parameter coefficients.
ACL 2024

Knowledge Fusion By Evolving Weights of Language Models

Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang

paper | code | poster

  • Model Evolution is the first approach to evolve neural parameters using differential evolutionary algorithms. We introduce a novel knowledge fusion method that evolves the weights of (large) language models.

    ( *: Co-first Author )

🎖 Honors and Awards

  • 2021.10 3rd Place Award in the NTIRE 2021 Challenge on Video Super-Resolution, Track I (New Trends in Image Restoration and Enhancement Workshop, CVPR 2021)
  • 2017.09 National Scholarship
  • 2016.05 Honorable Award in MCM (The Mathematical Contest in Modeling)
  • 2016.05 National Encouragement Scholarship
  • 2015.05 National Encouragement Scholarship

📖 Education

  • 2019.09 - 2021.05, Part-time PhD student, National University of Singapore (NUS)
  • 2018.08 - 2019.05, M.S., National University of Singapore (NUS)
  • 2016.07 - 2016.12, Visiting Student, City University of Hong Kong (CityU) (non-degree undergraduate exchange)
  • 2014.08 - 2018.06, B.S., University of Electronic Science and Technology of China (UESTC)

💻 Internships