Chia-Chien Hung
Verified email at neclab.eu
Title
Cited by
Year
Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog
CC Hung, A Lauscher, I Vulić, SP Ponzetto, G Glavaš
NAACL 2022, 3687–3703, 2022
30 · 2022
DS-TOD: Efficient domain specialization for task-oriented dialog
CC Hung, A Lauscher, SP Ponzetto, G Glavaš
ACL 2022, 891–904, 2022
25 · 2022
Can demographic factors improve text classification? revisiting demographic adaptation in the age of transformers
CC Hung, A Lauscher, D Hovy, SP Ponzetto, G Glavaš
EACL 2023, 1565–1580, 2023
11 · 2023
Stacked model based argument extraction and stance detection using embedded LSTM model
P Rajula, CC Hung, SP Ponzetto
Conference and Labs of the Evaluation Forum (CLEF) 2022, 3064–3073, 2022
6 · 2022
LeviRANK: Limited query expansion with voting integration for document retrieval and ranking
A Rana, P Golchha, R Juntunen, A Coajă, A Elzamarany, CC Hung, ...
Conference and Labs of the Evaluation Forum (CLEF) 2022, 3074–3089, 2022
4* · 2022
TADA: Efficient Task-Agnostic Domain Adaptation for Transformers
CC Hung, L Lange, J Strötgen
ACL 2023, 487–503, 2023
3 · 2023
Walking a Tightrope - Evaluating Large Language Models in High-Risk Domains
CC Hung, WB Rim, L Frost, L Bruckner, C Lawrence
Proceedings of the 1st GenBench Workshop on (Benchmarking) Generalisation in …, 2023
1 · 2023
Linking Surface Facts to Large-Scale Knowledge Graphs
G Radevski, K Gashteovski, CC Hung, C Lawrence, G Glavaš
EMNLP 2023, 7189–7207, 2023
1 · 2023
ZusammenQA: Data Augmentation with Specialized Models for Cross-lingual Open-retrieval Question Answering System
CC Hung, T Green, R Litschko, T Tsereteli, S Takeshita, M Bombieri, ...
Proceedings of the Workshop on Multilingual Information Access (MIA), NAACL …, 2022
1 · 2022
ANHALTEN: Cross-Lingual Transfer for German Token-Level Reference-Free Hallucination Detection
J Herrlein, CC Hung, G Glavaš
ACL 2024 SRW, 186–194, 2024
2024
Articles 1–10