Xu Guo

I am a Wallenberg-NTU Presidential Postdoctoral Fellow at Nanyang Technological University (NTU), Singapore, advised by Prof. Miao Chun Yan. Previously, I was a research fellow at SCSE, NTU, under the supervision of Prof. Boyang Albert Li. I received my Ph.D. from NTU in 2023, where I was supervised by Prof. Han Yu.

Research

Research Interests: 1. Natural Language Processing (Pretrained Language Models, Language Understanding, Language Generation, etc.); 2. Machine Learning (Transfer Learning, Adversarial Learning, Federated Learning, etc.).

My PhD research was largely driven by developing algorithms to adapt (🔥) Pretrained Language Models (PLMs) to low-resource domains under potential domain shift. Check out the survey on the domain adaptation and generalization of pretrained language models. My thesis, data-efficient domain adaptation for pretrained language models, offers several promising solutions, such as latent optimization[1], parameter-efficient adaptation[2] (❄️), and personalization[3], to boost PLMs in data-scarce domains under different resource constraints and settings.

Moving to larger PLMs, or LLMs for short, my postdoctoral research focuses on delivering societal benefits through, e.g., Generative AI[4] and Green AI in the real world. In general, I work on efficient synthetic data generation methods[5],[6], enhancing the robustness of LLMs 💪[7],[8], accelerating LLM inference 🚀[9] for high throughput, and merging LLMs for seamless plug-and-play integration. This work is mainly done in a parameter-efficient manner and aims to contribute to a sustainable planet.

Publications 📖

[EMNLP’24] Yige Xu, Xu Guo, Zhiwei Zeng, Chunyan Miao. “RevMUX: Data Multiplexing with Reversible Adapters for Efficient LLM Batch Inference” (Long Paper)

[EMNLP Findings’24] Yongjie Wang, Xiaoqi Qiu, Yue Yu, Xu Guo, Zhiwei Zeng, Yuhong Feng, Zhiqi Shen. “A Survey on Natural Language Counterfactual Generation” (Long Paper)

[COLM’24] Xu Guo, Zilin Du, Boyang Li, Chunyan Miao. “Generating Synthetic Datasets for Few-shot Prompt Tuning” (Long Paper)

[ACL’24] Xiaoqi Qiu*, Yongjie Wang*, Xu Guo, Zhiwei Zeng, Yue Yu, Yuhong Feng, and Chunyan Miao. “PairCFR: Enhancing Model Training on Paired Counterfactually Augmented Data through Contrastive Learning” (Long Paper)

[EMNLP Findings’23] Meizhen Liu, Xu Guo, Jiakai He, Jianye Chen, Fengyu Zhou, and Siu Hui. “InteMATs: Integrating Granularity-Specific Multilingual Adapters for Cross-Lingual Transfer”. (Long Paper)

[CIKM’23] Meizhen Liu, Jiakai He, Xu Guo, Jianye Chen, Siu Cheung Hui, and Fengyu Zhou. “GranCATs: Cross-Lingual Enhancement through Granularity-Specific Contrastive Adapters”. (Long Paper)

[ACM MM’23] Zilin Du, Yunxin Li, Xu Guo, Yidan Sun, and Boyang Li. “Training Multimedia Event Extraction With Generated Images and Captions”. (Long Paper)

[IJCNN’22] Fei Luo, Hangwei Qian, Di Wang, Xu Guo, Yan Sun, Eng Sing Lee, Hui Hwang Teong, Ray Tian Rui Lai, Chunyan Miao. “Missing value imputation for diabetes prediction”. (Long Paper)

[EMNLP Findings’22] Xu Guo, Boyang Li, Han Yu. “Improving the Sample Efficiency of Prompt Tuning with Domain Adaptation”. (Long Paper)

[ACM TIST’22] Xu Guo, Han Yu, Boyang Li, Hao Wang, Pengwei Xing, Siwei Feng, Zaiqing Nie, and Chunyan Miao. “Federated learning for personalized humor recognition” (Long Paper)

[NAACL’21] Xu Guo, Boyang Li, Han Yu, Chunyan Miao. “Latent-Optimized Adversarial Neural Transfer for Sarcasm Detection” (Long Paper)

[IJCAI’19] Xu Guo, Han Yu, Chunyan Miao, and Yiqiang Chen. “Agent-based Decision Support for Pain Management in Primary Care Settings” (Demo Paper)

Fellowship and Grant 🏆

  1. Wallenberg-NTU Presidential Postdoctoral Fellowship. 2023 Dec - 2024 Dec. NTU, Singapore. Principal Investigator - “Advancing Low-resource Regimes with Generative AI”, S$100,000.
  2. WiEST Development Grant, 2023. Awarded by Women in Engineering, Science, and Technology, NTU. S$3,000.
  3. Student Travel Grant, 2019. Awarded by IJCAI. US$350.