Huiqiang Jiang (姜慧强)

Research SDE at Microsoft Research Asia (Shanghai),
a fake MLSys/NLPer (Google Scholar),
with a research focus on efficient methods (in LLMs)

An unpopular blogger: Blog & Zhihu
A programming enthusiast: @iofu728

Phone: +86 178 xxxx xxxx
Email: hjiang[aT]microsoft[DoT]com


Huiqiang Jiang obtained his Master's degree in Software Engineering from Peking University, advised by Assoc. Prof. Xiang Jing. He was also a research intern in the KC Group at Microsoft Research Asia (2019/06-2021/03), working with Börje Karlsson and Guoxin Wang, and in the search group at Ant Group (2020/06-2020/08).
Huiqiang's research primarily focuses on efficient methods for accelerating LLM inference and training, including dynamic sparse attention (MInference, RetrievalAttention), prompt compression (LLMLingua), KV-cache compression, speculative decoding, model compression, sparse inference (PIT), neural architecture search (NAS), and efficient tuning. He is also interested in addressing typical challenges in natural language processing.
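
To give a concrete flavor of the prompt-compression line of work, the sketch below shows how a prompt might be compressed with the open-source llmlingua package before being sent to an LLM. This is only an illustrative sketch based on the package's public README as I recall it; the default scorer model, argument names, and returned fields are assumptions and may differ across versions.

    # Illustrative sketch only, based on the public llmlingua package
    # (https://github.com/microsoft/LLMLingua); check the repository README
    # for the exact, up-to-date API before relying on it.
    from llmlingua import PromptCompressor

    # A small language model scores token-level importance and drops
    # low-information tokens until the prompt fits the target budget.
    compressor = PromptCompressor()  # assumed default: a small open LLM as the scorer

    demonstrations = [
        "Q: What is 2 + 3? A: 5",
        "Q: What is 12 * 4? A: 48",
    ]

    result = compressor.compress_prompt(
        demonstrations,                      # context segments to compress
        instruction="Answer the math question.",
        question="Q: What is 7 * 6? A:",
        target_token=64,                     # rough budget for the compressed prompt
    )

    # Assumed return fields: the compressed prompt plus bookkeeping statistics.
    print(result["compressed_prompt"])
    print(result.get("origin_tokens"), "->", result.get("compressed_tokens"))

The compressed prompt can then be passed to any downstream LLM in place of the original, trading a small scoring cost for a much shorter context.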

I'm seeking research interns to work on efficient LLM methods. If you're interested in my research topics, please contact me at hjiang[aT]microsoft[DoT]com.

Selected Publications

† equal contribution, ‡ student I advised.

NLP & MLSys

  1. RetrievalAttention: Accelerating Long-Context LLM Inference via Vector Retrieval
    Di Liu, Meng Chen, Baotong Lu, Huiqiang Jiang, Zhenhua Han, Qianxi Zhang, Qi Chen, Chengruidong Zhang, Bailu Ding, Kai Zhang, Chen Chen, Fan Yang, Yuqing Yang, Lili Qiu.
    arXiv

  2. MInference 1.0: Accelerating Pre-filling for Long-Context LLMs via Dynamic Sparse Attention
    Huiqiang Jiang, Yucheng Li, Chengruidong Zhang, Qianhui Wu, Xufang Luo, Surin Ahn, Zhenhua Han, Amir H. Abdi, Dongsheng Li, Chin-Yew Lin, Yuqing Yang, Lili Qiu.
    In Proc. of NeurIPS'24 (Spotlight)
    Also appeared in ICML Workshop Efficient Systems for Foundation Models (Es-FoMo), 2024.
    [Code] [Project Page] [Demo]

  3. Mitigate Position Bias in Large Language Models via Scaling a Single Dimension
    Yijiong Yu, Huiqiang Jiang, Xufang Luo, Qianhui Wu, Chin-Yew Lin, Dongsheng Li, Yuqing Yang, Yongfeng Huang, Lili Qiu.
    In ICML Workshop Long Context Foundation Models (LCFM) (Oral), 2024.
    [Code]

  4. LLMLingua-2: Data Distillation for Efficient and Faithful Task-Agnostic Prompt Compression
    Zhuoshi Pan, Qianhui Wu, Huiqiang Jiang, Menglin Xia, Xufang Luo, Jue Zhang, Qingwei Lin, Victor Rühle, Yuqing Yang, Chin-Yew Lin, H. Vicky Zhao, Lili Qiu, Dongmei Zhang.
    In Proc. of ACL'24 Findings
    [Code] [Project Page] [Demo]

  5. LongLLMLingua: Accelerating and Enhancing LLMs in Long Context Scenarios via Prompt Compression
    Huiqiang Jiang, Qianhui Wu, Xufang Luo, Dongsheng Li, Chin-Yew Lin, Yuqing Yang, Lili Qiu.
    In Proc. of ACL'24
    Also appeared in ICLR Workshop Mathematical and Empirical Understanding of Foundation Models (ME-FoMo), 2024.
    [Code] [Project Page] [Demo]

  6. LLMLingua: Compressing Prompts for Accelerated Inference of Large Language Models
    Huiqiang Jiang, Qianhui Wu, Chin-Yew Lin, Yuqing Yang, Lili Qiu.
    In Proc. of EMNLP'23 (Oral)
    [Code] [Project Page] [Demo]

  7. PIT: Optimization of Dynamic Sparse Deep Learning Models via Permutation Invariant Transformation
    Ningxin Zheng, Huiqiang Jiang, Quanlu Zhang, Zhenhua Han, Lingxiao Ma, Yuqing Yang, Fan Yang, Chengruidong Zhang, Lili Qiu, Mao Yang, Lidong Zhou.
    In Proc. of SOSP'23
    [Code]

  8. Accurate and Structured Pruning for Efficient Automatic Speech Recognition
    Huiqiang Jiang, Li Lyna Zhang, Yuang Li, Yu Wu, Shijie Cao, Ting Cao, Yuqing Yang, Jinyu Li, Mao Yang, Lili Qiu.
    In Proc. of Interspeech'23

  9. Position Engineering: Boosting Large Language Models through Positional Information Manipulation
    Zhiyuan He, Huiqiang Jiang, Zilong Wang, Yuqing Yang, Luna Qiu, Lili Qiu.
    In Proc. of EMNLP'24

  10. CoLaDa: A Collaborative Label Denoising Framework for Cross-lingual Named Entity Recognition
    Tingting Ma, Qianhui Wu, Huiqiang Jiang, Börje Karlsson, Tiejun Zhao, Chin-Yew Lin.
    In Proc. of ACL'23
    [Code]

  11. Multi-Level Knowledge Distillation for Out-of-Distribution Detection in Text
    Qianhui Wu, Huiqiang Jiang, Haonan Yin, Börje F. Karlsson, Chin-Yew Lin.
    In Proc. of ACL'23
    [Code]

  12. Decomposed Meta-Learning for Few-Shot Sequence Labeling
    Tingting Ma, Qianhui Wu, Huiqiang Jiang, Jieru Lin, Börje F. Karlsson, Tiejun Zhao, Chin-Yew Lin.
    IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP), 2024.
    [Code]

  13. Decomposed Meta-Learning for Few-Shot Named Entity Recognition
    Tingting Ma, Huiqiang Jiang, Qianhui Wu, Tiejun Zhao, Chin-Yew Lin.
    In Proc. of ACL'22 Findings
    [Code]

  14. AdvPicker: Effectively Leveraging Unlabeled Data via Adversarial Discriminator for Cross-Lingual NER
    Weile Chen, Huiqiang Jiang, Qianhui Wu, Börje F. Karlsson, Yi Guan.
    In Proc. of ACL'21
    [Code]

  15. BoningKnife: Joint Entity Mention Detection and Typing for Nested NER via prior Boundary Knowledge
    Huiqiang Jiang, Guoxin Wang, Weile Chen, Chengxi Zhang, Börje F. Karlsson.
    arXiv preprint (work done during 2019-2020)

CV

  1. ElasticViT: Conflict-aware Supernet Training for Deploying Fast Vision Transformer on Diverse Mobile Devices
    Chen Tang, Li Lyna Zhang, Huiqiang Jiang, Jiahang Xu, Ting Cao, Quanlu Zhang, Yuqing Yang, Zhi Wang, Mao Yang.
    In Proc. of ICCV'23
    [Code]

  2. Attentive Mask CLIP
    Yifan Yang, Weiquan Huang, Yixuan Wei, Houwen Peng, Xinyang Jiang, Huiqiang Jiang, Fangyun Wei, Yin Wang, Han Hu, Lili Qiu, Yuqing Yang.
    In Proc. of ICCV'23
    [Code]

Selected Honors & Awards

  • Top Reviewer Award, NeurIPS, 2024.
  • Microsoft Global Hackathon Executive Challenge Winner, 2023 & 2024.
  • Microsoft Machine Learning, AI & Data Science Conference Distinguished Contribution Award, 2024.
  • Zhejiang Province Excellent Graduate Award, 2018.
  • Zhejiang Province Scholarship (top 5%), 2017.
  • Zhejiang University First-Class Scholarship for Outstanding Students, 2015-2017.
  • Zhejiang University Excellent Student Award, 2017.
  • First Prize, College Students' Mathematical Contest in Modeling, Zhejiang Province, 2016.

Academic Service

  • ICLR 24/25, NeurIPS 24, ARR 23/24, EMNLP 23, COLING 24/25, TIST

Last Updated: Nov. 2024