Wenhui Wang
Microsoft Research
Verified email at microsoft.com
Title
Cited by
Year
Gated self-matching networks for reading comprehension and question answering
W Wang, N Yang, F Wei, B Chang, M Zhou
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
491 · 2017
Unified language model pre-training for natural language understanding and generation
L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon
Advances in Neural Information Processing Systems, 13063-13075, 2019
311 · 2019
Graph-based dependency parsing with bidirectional LSTM
W Wang, B Chang
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
126 · 2016
Multiway Attention Networks for Modeling Sentence Pairs
C Tan, F Wei, W Wang, W Lv, M Zhou
IJCAI, 4411-4417, 2018
63 · 2018
UniLMv2: Pseudo-masked language models for unified language model pre-training
H Bao, L Dong, F Wei, W Wang, N Yang, X Liu, Y Wang, J Gao, S Piao, ...
International Conference on Machine Learning, 642-652, 2020
38 · 2020
MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers
W Wang, F Wei, L Dong, H Bao, N Yang, M Zhou
arXiv preprint arXiv:2002.10957, 2020
24 · 2020
Learning to Ask Unanswerable Questions for Machine Reading Comprehension
H Zhu, L Dong, F Wei, W Wang, B Qin, T Liu
arXiv preprint arXiv:1906.06045, 2019
20 · 2019
Cross-Lingual Natural Language Generation via Pre-Training
Z Chi, L Dong, F Wei, W Wang, XL Mao, H Huang
AAAI, 7570-7577, 2020
18 · 2020
Improved Dependency Parsing using Implicit Word Connections Learned from Unlabeled Data
W Wang, B Chang, M Mansur
Proceedings of the 2018 Conference on Empirical Methods in Natural Language …, 2018
14 · 2018
InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training
Z Chi, L Dong, F Wei, N Yang, S Singhal, W Wang, X Song, XL Mao, ...
arXiv preprint arXiv:2007.07834, 2020
12 · 2020
Harvesting and Refining Question-Answer Pairs for Unsupervised QA
Z Li, W Wang, L Dong, F Wei, K Xu
arXiv preprint arXiv:2005.02925, 2020
8 · 2020
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension
H Bao, L Dong, F Wei, W Wang, N Yang, L Cui, S Piao, M Zhou
Proceedings of the 2nd Workshop on Machine Reading for Question Answering, 14-18, 2019
2 · 2019
MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers
W Wang, H Bao, S Huang, L Dong, F Wei
arXiv preprint arXiv:2012.15828, 2020
2020
Articles 1–13