Ivan Montero
Title · Cited by · Year
Plug and play autoencoders for conditional text generation
F Mai, N Pappas, I Montero, NA Smith, J Henderson
arXiv preprint arXiv:2010.02983, 2020
Cited by 30 · 2020
Sentence Bottleneck Autoencoders from Transformer Language Models
I Montero, N Pappas, NA Smith
arXiv preprint arXiv:2109.00055, 2021
Cited by 21 · 2021
How much does attention actually attend? Questioning the importance of attention in pretrained transformers
M Hassid, H Peng, D Rotem, J Kasai, I Montero, NA Smith, R Schwartz
arXiv preprint arXiv:2211.03495, 2022
Cited by 18 · 2022
Pivot through English: Reliably answering multilingual questions without document retrieval
I Montero, S Longpre, N Lao, AJ Frank, C DuBois
arXiv preprint arXiv:2012.14094, 2020
Cited by 2 · 2020
Articles 1–4