Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation

Chanjun Park, Woo Young Go, Sugyeong Eo, Hyeonseok Moon, Seolhwa Lee, Heuiseok Lim

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Existing methods for training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret these methods from the perspective of cognitive science, specifically cross-language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the way infants acquire two languages, we train the DS-NMT model by configuring the domain corpus (DC) and the general corpus (GC) concurrently within batches. Quantitative and qualitative analyses of our experimental results show that CCM achieves superior performance compared to conventional methods. Additionally, we conducted an experiment on the DS-NMT service to meet industrial demands.
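The abstract's central idea is to draw domain-corpus (DC) and general-corpus (GC) examples into the same training batches, rather than pretraining on GC and then finetuning on DC. The sketch below illustrates one way such mixed batches could be built; it is a minimal, hypothetical Python sketch, not the authors' implementation, and all names (make_mixed_batches, dc_ratio, train_step) are assumptions.

```python
import random

# Hypothetical sketch of concurrent mixed-batch training in the spirit of CCM:
# each batch interleaves domain-corpus (DC) and general-corpus (GC) sentence
# pairs, instead of finetuning on DC after pretraining on GC.
# All names here are illustrative, not the paper's implementation.

def make_mixed_batches(dc_pairs, gc_pairs, batch_size=32, dc_ratio=0.5, seed=0):
    """Yield batches drawing dc_ratio of examples from DC and the rest from GC."""
    rng = random.Random(seed)
    n_dc = int(batch_size * dc_ratio)
    n_gc = batch_size - n_dc
    steps = min(len(dc_pairs) // n_dc, len(gc_pairs) // n_gc)
    dc, gc = dc_pairs[:], gc_pairs[:]
    rng.shuffle(dc)
    rng.shuffle(gc)
    for i in range(steps):
        batch = dc[i * n_dc:(i + 1) * n_dc] + gc[i * n_gc:(i + 1) * n_gc]
        rng.shuffle(batch)  # interleave DC and GC examples within the batch
        yield batch

# Usage: feed each mixed batch to an ordinary NMT training step, e.g.
# for batch in make_mixed_batches(domain_corpus, general_corpus):
#     loss = train_step(model, batch)   # train_step is assumed, not shown
```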

Original language: English
Pages (from-to): 38684-38693
Number of pages: 10
Journal: IEEE Access
Volume: 10
DOIs
Publication status: Published - 2022

Keywords

  • Domain-specialized neural machine translation
  • cross communication method
  • deep learning
  • neural machine translation

ASJC Scopus subject areas

  • Engineering (all)
  • Materials Science (all)
  • Electrical and Electronic Engineering
  • Computer Science (all)

