My name is Hangbo Bao (鲍航波). I am a final-year Ph.D. student at Harbin Institute of Technology, advised by Prof. Songhao Piao. Meanwhile, I am a long-term research intern at Microsoft Research Asia, mentored by Dr. Furu Wei and Dr. Li Dong.
Harbin Institute of Technology, Sept. 2013 – Jul. 2017
B.S. in the School of Computer Science and Technology
Harbin Institute of Technology, Sept. 2017 – 2023
Ph.D. student in the School of Computer Science and Technology
Joint Ph.D. program with Microsoft Research Asia
x@y where x=addf400, y=foxmail.com
Microsoft Research Asia, Jul. 2016 – Sep. 2017
Research Intern in Natural Language Computing, Mentor: Dr. Furu Wei
Microsoft Research Asia, Mar. 2018 – Present
Research Intern in Natural Language Computing, Mentor: Dr. Furu Wei & Dr. Li Dong
pre-trained models, natural language processing, representation learning
Image as a Foreign Language: BEiT Pretraining for All Vision and Vision-Language Tasks
Wenhui Wang*, Hangbo Bao*, Li Dong*, Johan Bjorck, Zhiliang Peng, Qiang Liu, Kriti Aggarwal, Owais Khan Mohammed, Saksham Singhal, Subhojit Som and Furu Wei. * = equal contribution.
Preprint.
A Unified View of Masked Image Modeling
Zhiliang Peng, Li Dong, Hangbo Bao, Qixiang Ye and Furu Wei.
Preprint.
BEiT v2: Masked Image Modeling with Vector-Quantized Visual Tokenizers
Zhiliang Peng, Li Dong, Hangbo Bao, Qixiang Ye and Furu Wei.
Preprint. Code
VL-BEiT: Generative Vision-Language Pretraining
Hangbo Bao*, Wenhui Wang*, Li Dong and Furu Wei. * = equal contribution.
Preprint.
THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption
Tianyu Chen, Hangbo Bao, Shaohan Huang, Li Dong, Binxing Jiao, Daxin Jiang, Haoyi Zhou, Jianxin Li and Furu Wei.
Findings of the Association for Computational Linguistics (Findings of ACL), 2022.
Corrupted Image Modeling for Self-Supervised Visual Pre-Training
Yuxin Fang, Li Dong, Hangbo Bao, Xinggang Wang and Furu Wei.
International Conference on Learning Representations (ICLR), 2023.
VLMo: Unified Vision-Language Pre-Training with Mixture-of-Modality-Experts
Hangbo Bao*, Wenhui Wang*, Li Dong, Qiang Liu, Owais Khan Mohammed, Kriti Aggarwal, Subhojit Som and Furu Wei. * = equal contribution.
Advances in Neural Information Processing Systems (NeurIPS), 2022.
BEiT: BERT Pre-Training of Image Transformers
Hangbo Bao, Li Dong, Songhao Piao and Furu Wei.
International Conference on Learning Representations (ICLR), 2022. Oral presentation.
A Path to the BERT Moment of CV
Attention Temperature Matters in Abstractive Summarization Distillation
Shengqiang Zhang, Xingxing Zhang, Hangbo Bao and Furu Wei.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL), 2022.
s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning
Hangbo Bao, Li Dong, Wenhui Wang, Nan Yang and Furu Wei.
Preprint.
Learning to Sample Replacements for ELECTRA Pre-Training
Yaru Hao, Li Dong, Hangbo Bao, Ke Xu and Furu Wei.
Findings of the Association for Computational Linguistics (Findings of ACL), 2021.
MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers
Wenhui Wang, Hangbo Bao, Shaohan Huang, Li Dong and Furu Wei.
Findings of the Association for Computational Linguistics (Findings of ACL), 2021.
MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers
Wenhui Wang, Furu Wei, Li Dong, Hangbo Bao, Nan Yang and Ming Zhou.
Advances in Neural Information Processing Systems (NeurIPS), 2020.
UniLMv2: Pseudo-Masked Language Models for Unified Language Model Pre-Training
Hangbo Bao, Li Dong, Furu Wei, Wenhui Wang, Nan Yang, Xiaodong Liu, Yu Wang, Jianfeng Gao, Songhao Piao, Ming Zhou and Hsiao-Wuen Hon.
International Conference on Machine Learning (ICML), 2020.
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension
Hangbo Bao, Li Dong, Furu Wei, Wenhui Wang, Nan Yang, Lei Cui, Songhao Piao and Ming Zhou.
Proceedings of the 2nd Workshop on Machine Reading for Question Answering (MRQA), 2019.
Neural Melody Composition from Lyrics
Hangbo Bao, Shaohan Huang, Furu Wei, Lei Cui, Yu Wu, Chuanqi Tan, Songhao Piao and Ming Zhou.
CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC), 2019.
Neural Question Generation from Text: A Preliminary Study
Qingyu Zhou, Nan Yang, Furu Wei, Chuanqi Tan, Hangbo Bao and Ming Zhou.
CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC), 2017.