Gemini: a family of highly capable multimodal models. G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, et al. arXiv preprint arXiv:2312.11805, 2023. Cited by 1490.
Gemma: Open models based on Gemini research and technology. G Team, T Mesnard, C Hardin, R Dadashi, S Bhupatiraju, S Pathak, et al. arXiv preprint arXiv:2403.08295, 2024. Cited by 425.
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context. M Reid, N Savinov, D Teplyashin, D Lepikhin, T Lillicrap, J Alayrac, et al. arXiv preprint arXiv:2403.05530, 2024. Cited by 358.
Compositionality and generalization in emergent languages. R Chaabouni, E Kharitonov, D Bouchacourt, E Dupoux, M Baroni. arXiv preprint arXiv:2004.09124, 2020. Cited by 132.
Anti-efficient encoding in emergent communication. R Chaabouni, E Kharitonov, E Dupoux, M Baroni. Advances in Neural Information Processing Systems 32, 2019. Cited by 110.
EGG: a toolkit for research on Emergence of lanGuage in Games. E Kharitonov, R Chaabouni, D Bouchacourt, M Baroni. arXiv preprint arXiv:1907.00852, 2019. Cited by 75.
Emergent communication at scale. R Chaabouni, F Strub, F Altché, E Tarassov, C Tallec, E Davoodi, et al. International Conference on Learning Representations, 2022. Cited by 67.
Communicating artificial neural networks develop efficient color-naming systems. R Chaabouni, E Kharitonov, E Dupoux, M Baroni. Proceedings of the National Academy of Sciences 118 (12), e2016569118, 2021. Cited by 59.
Entropy minimization in emergent languages. E Kharitonov, R Chaabouni, D Bouchacourt, M Baroni. International Conference on Machine Learning, 5220-5230, 2020. Cited by 37.
" LazImpa": Lazy and Impatient neural agents learn to communicate efficiently M Rita, R Chaabouni, E Dupoux arXiv preprint arXiv:2010.01878, 2020 | 35 | 2020 |
What they do when in doubt: a study of inductive biases in seq2seq learners. E Kharitonov, R Chaabouni. arXiv preprint arXiv:2006.14953, 2020. Cited by 34.
Word-order biases in deep-agent emergent communication. R Chaabouni, E Kharitonov, A Lazaric, E Dupoux, M Baroni. arXiv preprint arXiv:1905.12330, 2019. Cited by 34.
Can transformers jump around right in natural language? Assessing performance transfer from SCAN. R Chaabouni, R Dessì, E Kharitonov. arXiv preprint arXiv:2107.01366, 2021. Cited by 19.
Learning weakly supervised multimodal phoneme embeddings. R Chaabouni, E Dunbar, N Zeghidour, E Dupoux. arXiv preprint arXiv:1704.06913, 2017. Cited by 13.
Memory consolidation enables long-context video understanding. I Balažević, Y Shi, P Papalampidi, R Chaabouni, S Koppula, OJ Hénaff. arXiv preprint arXiv:2402.05861, 2024. Cited by 7.
Memory consolidation enables long-context video understanding. I Balažević, Y Shi, P Papalampidi, R Chaabouni, S Koppula, OJ Hénaff. Forty-first International Conference on Machine Learning, 2024. Cited by 4.
Countering reward over-optimization in LLM with demonstration-guided reinforcement learning. M Rita, F Strub, R Chaabouni, P Michel, E Dupoux, O Pietquin. arXiv preprint arXiv:2404.19409, 2024. Cited by 3.
Language Evolution with Deep Learning. M Rita, P Michel, R Chaabouni, O Pietquin, E Dupoux, F Strub. arXiv preprint arXiv:2403.11958, 2024. Cited by 1.
Emerging linguistic universals in communicating neural network agents. R Chaabouni. Université Paris sciences et lettres, 2021.
Simulation of emergent communication with large scale machine learning. R Chaabouni, F Strub, F Altché, C Tallec, E Tarassov, E Davoodi, et al.