Online Adaptation of Neural Machine Translation Models
Lilt's translation models adapt to translators as they work, updating parameters automatically with each sentence translated. This tight loop allows adaptation to specific document-level and project-level vocabulary, structural patterns, and idiosyncrasies. Our research team focuses on fast and effective adaptation of state-of-the-art neural machine translation models using methods that are efficient enough to support large-scale personalized neural machine translation.
Simianer, Wuebker, and DeNero (NAACL 2019)
Measuring Immediate Adaptation Performance for Neural Machine Translation
Wuebker, Simianer, and DeNero (EMNLP 2018)
Compact Personalized Models for Neural Machine Translation
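The per-sentence update loop described above can be sketched in miniature. The class below is a hypothetical stand-in, not Lilt's actual model: it uses a toy linear scorer updated with one SGD step each time a translator confirms a sentence, whereas a production system would fine-tune the parameters of a full neural translation model.

```python
import numpy as np

# Toy stand-in for an NMT model: a linear scorer over bag-of-words
# features. Real systems update full neural model parameters; this
# sketch only illustrates the immediate per-sentence adaptation loop.
class OnlineAdaptedModel:
    def __init__(self, vocab_size, lr=0.1):
        self.w = np.zeros(vocab_size)
        self.lr = lr

    def score(self, features):
        return float(self.w @ features)

    def adapt(self, features, target_score):
        # One SGD step on the squared error for this single example,
        # run immediately after the translator confirms a sentence.
        error = self.score(features) - target_score
        self.w -= self.lr * error * features

model = OnlineAdaptedModel(vocab_size=4)
x = np.array([1.0, 0.0, 1.0, 0.0])
for _ in range(50):  # repeated confirmations of similar sentences
    model.adapt(x, target_score=1.0)
```

Because each update uses only the sentence just translated, the cost per confirmation stays constant, which is what makes large-scale personalized adaptation feasible.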
Interactive Neural Machine Translation
An interactive neural machine translation system that supports localization must do more than translate full sentences in isolation: it must make suggestions about what translators will type next in context, how they will transfer formatting from the source document to the target, and what edits will be performed by reviewers. Interactive systems must take termbases, translation memories, and contextual constraints into account for all of these suggestions. Our research team focuses on the full range of automatically generated suggestions that can improve the speed and quality of human localization work across translation, reviewing, and quality assurance.
Zenkel, Wuebker, and DeNero (ACL 2020)
End-to-End Neural Word Alignment Outperforms GIZA++
Zenkel, Wuebker, and DeNero (arXiv 2019)
Adding Interpretable Attention to Neural Translation Models Improves Word Alignment
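One ingredient of such a system, suggesting what a translator will type next given the words they have already typed, can be sketched as prefix-constrained decoding. The bigram table below is an illustrative stand-in for a neural decoder's next-token distribution; the function names and vocabulary are assumptions for the sketch, not part of Lilt's implementation.

```python
# Hypothetical next-token distributions standing in for a neural
# decoder. Keys are the previous token; values map candidate next
# tokens to probabilities.
BIGRAMS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.7, "dog": 0.3},
    "a": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.9, "</s>": 0.1},
    "dog": {"sat": 0.2, "</s>": 0.8},
    "sat": {"</s>": 1.0},
}

def suggest(prefix, max_len=10):
    """Greedy completion forced to respect the translator's typed prefix."""
    tokens = ["<s>"] + prefix  # decoding continues from the prefix, never rewrites it
    while len(tokens) < max_len:
        nxt = max(BIGRAMS[tokens[-1]].items(), key=lambda kv: kv[1])[0]
        if nxt == "</s>":
            break
        tokens.append(nxt)
    return tokens[1:]

print(suggest(["the"]))  # ['the', 'cat', 'sat']
```

Termbase and translation-memory constraints fit the same shape: they further restrict which candidate tokens the decoder may choose at each step.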
Human-Computer Interaction and Data Science for Localization
Lilt's human-in-the-loop approach to localization places both professional translators and artificial intelligence technology together at the core of our operations. A broad range of human-computer interaction problems arise in this setting, from text-editing interfaces to assigning translators to project workflows. Our research team focuses on interaction design across the Lilt platform and data science across Lilt's business and translator community.