“Killing me” is not a spoiler: Spoiler detection model using graph neural networks with dependency relation-aware attention mechanism

Buru Chang, Inggeol Lee, Hyunjae Kim, Jaewoo Kang

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Several machine learning-based spoiler detection models have been proposed recently to protect users from spoilers on review websites. Although dependency relations between context words are important for detecting spoilers, current attention-based spoiler detection models do not sufficiently utilize them. To address this problem, we propose a new spoiler detection model called SDGNN that is based on syntax-aware graph neural networks. In experiments on two real-world benchmark datasets, we show that SDGNN outperforms existing spoiler detection models.

    Original language: English
    Title of host publication: EACL 2021 - 16th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 3613-3617
    Number of pages: 5
    ISBN (Electronic): 9781954085022
    Publication status: Published - 2021
    Event: 16th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2021 - Virtual, Online
    Duration: 2021 Apr 19 - 2021 Apr 23

    Publication series

    Name: EACL 2021 - 16th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference

    Conference

    Conference: 16th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2021
    City: Virtual, Online
    Period: 21/4/19 - 21/4/23

    ASJC Scopus subject areas

    • Software
    • Computational Theory and Mathematics
    • Linguistics and Language
