TRANSFER LEARNING AND CUSTOM LOSS APPLIED TO TRANSFORMER-BASED TEXT TRANSLATION FOR SIGN LANGUAGE ANIMATED SUBTITLES

Online learning’s rise presents unique challenges for the deaf community, particularly in understanding educational videos. This research addresses the problem by proposing a solution that generates animated subtitles in the Indonesian Sign Language System, SIBI (Sistem Isyarat Bahasa Indonesia). The existing method produces word-by-word animations directly from recognized spoken text, which are too fast and difficult to follow. We developed a text translation model for the Moodle application that shortens the original spoken text of educational videos into SIBI text without losing crucial information. We propose transfer learning to train our transformer-based models and a custom loss function to ensure alignment with the SIBI dictionary.
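The article does not spell out the form of the custom loss, but one plausible reading of "alignment with the SIBI dictionary" is a standard cross-entropy term plus a penalty on probability mass the model assigns to tokens outside the SIBI vocabulary. The sketch below illustrates that idea in plain Python; the function name, the penalty weight, and the toy inputs are all hypothetical, not the authors' implementation.

```python
import math

def dictionary_alignment_loss(token_probs, target_ids, sibi_vocab_ids,
                              penalty_weight=0.5):
    """Hypothetical sketch of a dictionary-aligned loss.

    token_probs:    list of per-step probability distributions over the vocab
    target_ids:     list of gold token ids (one per step)
    sibi_vocab_ids: set of token ids that appear in the SIBI dictionary
    penalty_weight: weight of the out-of-dictionary penalty (assumed value)
    """
    ce = 0.0        # token-level cross-entropy
    oov_mass = 0.0  # probability mass on tokens outside the SIBI dictionary
    for probs, tgt in zip(token_probs, target_ids):
        ce += -math.log(probs[tgt])
        oov_mass += sum(p for i, p in enumerate(probs)
                        if i not in sibi_vocab_ids)
    n = len(target_ids)
    # Average both terms over the sequence and combine them.
    return ce / n + penalty_weight * (oov_mass / n)

# Toy example: vocab of 4 tokens, of which ids {0, 1} are in the dictionary.
loss = dictionary_alignment_loss(
    token_probs=[[0.7, 0.2, 0.05, 0.05],
                 [0.1, 0.8, 0.05, 0.05]],
    target_ids=[0, 1],
    sibi_vocab_ids={0, 1},
)
```

In a real fine-tuning loop this scalar would replace the default sequence-to-sequence loss, steering the decoder toward glosses the SIBI animation system can actually render.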

The pre-trained mBART50 and NLLB200 models were fine-tuned on the SIBIVID-MP12 dataset, which was created in collaboration with Special Education teachers. Experimental results show that the proposed method significantly improves translation metrics, with the best performance from the NLLB200 model fine-tuned with our proposed custom loss, achieving sacreBLEU, chrF++, METEOR, and ROUGE-L improvements of 71%, 9.79%, 22.92%, and 14.55%, respectively.

This research demonstrates the potential for enhanced inclusivity in online learning for the deaf community.

Report this page