Welcome to my website.

Hi! My name is Li Lin (林丽). I am currently an Assistant Professor at the School of Computer Science and Engineering, Southeast University.

Education

  • Ph.D., School of Software, Tsinghua University, 2022
  • B.S., Department of Computer Science and Technology, Xi’an Jiaotong University, 2016

Research Outline

My research interests lie in sequential modeling, multimodal learning, and reasoning with LLMs. In particular, I am interested in multimodal LLMs, with a focus on:

  • Reasoning and Efficient Prompting for LLMs
  • Multimodal Prediction and Decision-making
  • Sequential Forecasting and Generation

Publications

You can also find my publications on my Google Scholar profile.

2025   Kaiwen Xia, Li Lin, Xinrui Zhang, Haotian Wang, Shuai Wang, Tian He. A Transferable Spatio-temporal Learning Framework for Cross-city Logistics Demand Prediction. (SIGKDD) CCF-A

2025   Kaiwen Xia, Li Lin, Shuai Wang, Anqi Zheng, Zhao-Dong Xu, Desheng Zhang, Tian He. H2DGL: Adaptive Metapath-Based Dynamic Graph Learning for Supply Forecasting in Logistics System. (TITS) CCF-B, JCR Q1

2025   Li Lin, Kaiwen Xia, Haotian Shen, Shuai Wang. CoANBR: A Collaborative Aggregation Model for Next Basket Recommendation with Time-independent Sequence Modeling. (DASFAA) CCF-B

2025   Xinwei Li, Li Lin*, Shuai Wang, Hanqian Wu. Seeing Beyond Hallucinations: LLM-based Compositional Information Extraction for Multimodal Reasoning. (SIGIR) CCF-A

2025   Kaiwen Xia, Li Lin*, Shuai Wang, Qi Zhang, Shuai Wang, Tian He. ProST: Prompt Future Snapshot on Dynamic Graphs for Spatio-Temporal Prediction. (KDD) CCF-A

2025   Shuai Wang, Hai Wang, Li Lin*, Xiaohui Zhao, Tian He, Dian Shen. HPST-GT: Full-Link Delivery Time Estimation Via Heterogeneous Periodic Spatial-Temporal Graph Transformer. (TKDE) CCF-A, JCR Q1

2024   Xinwei Li, Li Lin*, Shuai Wang, Chen Qian. Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models. (SIGIR) CCF-A

2024   Zhiyuan Zhou, Li Lin*, Hai Wang, Xiaolei Zhou, Gong Wei, Shuai Wang. A Cross-Domain Method for Customer Lifetime Value Prediction in Supply Chain Platform. (WWW) CCF-A

2024   Li Lin, Xin Xu, Hai Wang, Tian He, Desheng Zhang, Shuai Wang. DIFN: A Dual Intention-aware Network for Repurchase Recommendation with Hierarchical Spatio-temporal Fusion. (CIKM) CCF-B

2024   Li Lin, Xinyao Chen, Kaiwen Xia, Shuai Wang, Desheng Zhang, Tian He. Hierarchical Information Propagation and Aggregation in Disentangled Graph Networks for Audience Expansion. (CIKM) CCF-B

2024   Li Lin, Zhiqiang Lu, Shuai Wang, Yunhuai Liu, Zhiqing Hong, Haotian Wang, Shuai Wang. MulSTE: A Multi-view Spatio-temporal Learning Framework with Heterogeneous Event Fusion for Demand-supply Prediction. (SIGKDD) CCF-A

2024   Li Lin, Kaiwen Xia, Anqi Zheng, Shijie Hu, Shuai Wang. Hierarchical Spatio-Temporal Graph Learning Based on Metapath Aggregation for Emergency Supply Forecasting. (CIKM) CCF-B

2024   Shuai Wang, Tongtong Kong, Baoshen Guo, Li Lin*, Hai Wang. Hierarchical Spatio-Temporal Graph Learning Based on Metapath Aggregation for Emergency Supply Forecasting. (CIKM) CCF-B

2023   Kaiwen Xia, Li Lin, Shuai Wang, Hai Wang, Desheng Zhang, Tian He. A Predict-then-Optimize Couriers Allocation Framework for Emergency Last-mile Logistics. (SIGKDD) CCF-A

2023   Zan Zong, Li Lin, Leilei Lin, Lijie Wen, Yu Sun. STR: Hybrid Tensor Re-Generation to Break Memory Wall for DNN Training. (TPDS) CCF-A

Before   Li Lin, Yixin Cao, Lifu Huang, Shu’Ang Li, Xuming Hu, Lijie Wen, Jianmin Wang. What Makes the Story Forward? Inferring Commonsense Explanations as Prompts for Future Event Generation. (SIGIR) CCF-A

Before   Li Lin, Zan Zong, Lijie Wen, Chen Qian, Shuang Li, Jianmin Wang. MM-CPred: A Multi-task Predictive Model for Continuous-Time Event Sequences with Mixture Learning Losses. (DASFAA) CCF-B

Before   Li Lin, Lijie Wen, Jianmin Wang. MM-Pred: A Deep Predictive Model for Multi-attribute Event Sequence. (SDM) CCF-B

Prospective Students

We sincerely welcome motivated students interested in our research. If you would like to join our group, please feel free to contact me.