Aizhan Imankulova
PhD, CogSmart Co., Ltd., Tokyo Metropolitan University
Verified email at ed.tmu.ac.jp
Title
Cited by
Year
Improving low-resource neural machine translation with filtered pseudo-parallel corpus
A Imankulova, T Sato, M Komachi
Proceedings of the 4th Workshop on Asian Translation (WAT2017), 70-78, 2017
Cited by 53 · 2017
Exploiting out-of-domain parallel data through multilingual transfer learning for low-resource neural machine translation
A Imankulova, R Dabre, A Fujita, K Imamura
arXiv preprint arXiv:1907.03060, 2019
Cited by 43 · 2019
Gender bias in masked language models for multiple languages
M Kaneko, A Imankulova, D Bollegala, N Okazaki
arXiv preprint arXiv:2205.00551, 2022
Cited by 38 · 2022
From masked language modeling to translation: Non-English auxiliary tasks improve zero-shot spoken language understanding
R Van Der Goot, I Sharaf, A Imankulova, A Üstün, M Stepanovic, ...
Proceedings of the 2021 Conference of the North American Chapter of the …, 2021
Cited by 31* · 2021
Filtered pseudo-parallel corpus improves low-resource neural machine translation
A Imankulova, T Sato, M Komachi
ACM Transactions on Asian and Low-Resource Language Information Processing …, 2019
Cited by 27 · 2019
Cross-lingual transfer learning for grammatical error correction
I Yamashita, S Katsumata, M Kaneko, A Imankulova, M Komachi
Proceedings of the 28th International Conference on Computational …, 2020
Cited by 10 · 2020
Towards multimodal simultaneous neural machine translation
A Imankulova, M Kaneko, T Hirasawa, M Komachi
arXiv preprint arXiv:2004.03180, 2020
Cited by 10 · 2020
Towards a standardized dataset on Indonesian named entity recognition
SO Khairunnisa, A Imankulova, M Komachi
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the …, 2020
Cited by 9 · 2020
Simultaneous multi-pivot neural machine translation
R Dabre, A Imankulova, M Kaneko, A Chakrabarty
arXiv preprint arXiv:2104.07410, 2021
Cited by 6 · 2021
Pre-trained word embedding and language model improve multimodal machine translation: A case study in Multi30K
T Hirasawa, M Kaneko, A Imankulova, M Komachi
IEEE Access 10, 67653-67668, 2022
Cited by 4 · 2022
Neural combinatory constituency parsing
Z Chen, L Zhang, A Imankulova, M Komachi
arXiv preprint arXiv:2106.06689, 2021
Cited by 2 · 2021
English-to-Japanese diverse translation by combining forward and backward outputs
M Kaneko, A Imankulova, T Hirasawa, M Komachi
Proceedings of the Fourth Workshop on Neural Generation and Translation, 134-138, 2020
Cited by 2 · 2020
Studying the impact of document-level context on simultaneous neural machine translation
R Dabre, A Imankulova, M Kaneko
Proceedings of Machine Translation Summit XVIII: Research Track, 202-214, 2021
Cited by 1 · 2021
Cross-lingual Multi-task Transfer for Zero-shot Task-oriented Dialog
R van der Goot, M Stepanovic, A Ramponi, I Sharaf, A Üstün, ...
RESOURCEFUL-2020: RESOURCEs and representations For Under-resourced …, 2021
2021
A Study on Exploiting Additional Resources for Low-resource Neural Machine Translation
A Imankulova
Tokyo Metropolitan University, 2021
2021
Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019
A Imankulova, M Kaneko, M Komachi
WAT 2019, 165, 2019
2019
Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019
A Imankulova, M Kaneko, M Komachi
Proceedings of the 6th Workshop on Asian Translation, 165-170, 2019
2019
Preliminary Experiments toward NMT on E-commerce Product Titles
A Imankulova, K Murakami
2018
Articles 1–18