Jang Kangwook
Title · Cited by · Year
FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning
Y Lee*, K Jang*, J Goo, Y Jung, H Kim
Interspeech 2022, 3588-3592, 2022
Cited by: 43 · Year: 2022
Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation
K Jang*, S Kim*, SY Yun, H Kim
Interspeech 2023, 316-320, 2023
Cited by: 5 · Year: 2023
STaR: Distilling Speech Temporal Relation for Lightweight Speech Self-Supervised Learning Models
K Jang, S Kim, H Kim
ICASSP 2024, 2024
Cited by: 1 · Year: 2024
Articles 1–3