Open Access System for Information Sharing

Article
Cited 3 times in Web of Science; cited 3 times in Scopus
Full metadata record
Files in This Item:
There are no files associated with this item.
DC Field | Value | Language
dc.contributor.author | Shin, Jaehun | -
dc.contributor.author | Lee, Wonkee | -
dc.contributor.author | Go, Byung-Hyun | -
dc.contributor.author | Jung, Baikjin | -
dc.contributor.author | Kim, Youngkil | -
dc.contributor.author | Lee, Jong-Hyeok | -
dc.date.accessioned | 2022-06-21T09:50:12Z | -
dc.date.available | 2022-06-21T09:50:12Z | -
dc.date.created | 2021-12-23 | -
dc.date.issued | 2021-11 | -
dc.identifier.issn | 2375-4699 | -
dc.identifier.uri | https://oasis.postech.ac.kr/handle/2014.oak/112915 | -
dc.description.abstract | Automatic post-editing (APE) is the study of correcting translation errors in the output of an unknown machine translation (MT) system; it has been considered a way to improve translation quality without modifying conventional MT systems. Recently, several Transformer variants that take both the MT output and its corresponding source sentence as inputs have been proposed for APE, and models that introduce an additional attention layer into the encoder to jointly encode the MT output with its source sentence ranked highly in the WMT19 APE shared task. We examine the effectiveness of this joint-encoding strategy in a controlled environment and compare four types of decoder multi-source attention strategies that have been introduced in previous APE models. The experimental results indicate that the joint-encoding strategy is effective and that taking the final encoded representation of the source sentence is more appropriate than taking that representation from within the same encoder stack. Furthermore, among the multi-source attention strategies combined with joint-encoding, the strategy that applies attention to the concatenated input representations and the strategy that sums the individual attentions to each input both improve the quality of APE results over joint-encoding alone. | -
dc.language | English | -
dc.publisher | ASSOC COMPUTING MACHINERY | -
dc.relation.isPartOf | ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING | -
dc.title | Exploration of Effective Attention Strategies for Neural Automatic Post-editing with Transformer | -
dc.type | Article | -
dc.identifier.doi | 10.1145/3465383 | -
dc.type.rims | ART | -
dc.identifier.bibliographicCitation | ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, v.20, no.6 | -
dc.identifier.wosid | 000721586800020 | -
dc.citation.number | 6 | -
dc.citation.title | ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING | -
dc.citation.volume | 20 | -
dc.contributor.affiliatedAuthor | Shin, Jaehun | -
dc.contributor.affiliatedAuthor | Lee, Wonkee | -
dc.contributor.affiliatedAuthor | Go, Byung-Hyun | -
dc.contributor.affiliatedAuthor | Jung, Baikjin | -
dc.contributor.affiliatedAuthor | Lee, Jong-Hyeok | -
dc.identifier.scopusid | 2-s2.0-85127392702 | -
dc.description.journalClass | 1 | -
dc.description.isOpenAccess | N | -
dc.type.docType | Article | -
dc.subject.keywordAuthor | Automatic post-editing | -
dc.subject.keywordAuthor | machine translation | -
dc.subject.keywordAuthor | neural networks | -
dc.subject.keywordAuthor | attention mechanism | -
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
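
The abstract identifies two decoder multi-source attention strategies that outperform joint-encoding alone: attention over the concatenation of both encoded inputs, and summing separate attentions to each input. The following is a minimal PyTorch sketch of those two strategies only; the class name MultiSourceAttention, the strategy flags, and the tensor names (dec_state, src_enc, mt_enc) are illustrative assumptions, not the authors' implementation, and the toy dimensions are arbitrary.

import torch
import torch.nn as nn

class MultiSourceAttention(nn.Module):
    """Two multi-source attention strategies described in the abstract:
    'concat' -- attend over the concatenation of both encoder outputs;
    'sum'    -- attend to each encoder output separately, then add the results.
    A hedged sketch; names and hyperparameters are assumptions."""

    def __init__(self, d_model: int, n_heads: int = 8, strategy: str = "concat"):
        super().__init__()
        assert strategy in ("concat", "sum")
        self.strategy = strategy
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        if strategy == "sum":
            # Separate projections for the MT encoding when attending individually.
            self.attn_mt = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, dec_state, src_enc, mt_enc):
        # dec_state: (batch, tgt_len, d_model) decoder-side queries
        # src_enc:   (batch, src_len, d_model) encoded source sentence
        # mt_enc:    (batch, mt_len,  d_model) encoded MT output
        if self.strategy == "concat":
            # Attention over the concatenated input representations.
            memory = torch.cat([src_enc, mt_enc], dim=1)
            out, _ = self.attn(dec_state, memory, memory)
        else:
            # Individual attention to each input, summed.
            out_src, _ = self.attn(dec_state, src_enc, src_enc)
            out_mt, _ = self.attn_mt(dec_state, mt_enc, mt_enc)
            out = out_src + out_mt
        return out

# Toy usage with random tensors.
batch, d_model = 2, 512
dec = torch.randn(batch, 7, d_model)
src = torch.randn(batch, 10, d_model)
mt = torch.randn(batch, 9, d_model)
for strategy in ("concat", "sum"):
    layer = MultiSourceAttention(d_model, strategy=strategy)
    print(strategy, layer(dec, src, mt).shape)  # torch.Size([2, 7, 512])

Note the design difference: the 'concat' variant shares one set of attention projections across both memories, while the 'sum' variant learns separate projections per input, matching the abstract's "adds up the individual attention to each input" description.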

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

이종혁 (LEE, JONG HYEOK)
Grad. School of AI
