Semantic Representation Using Sub-Symbolic Knowledge in Commonsense Reasoning †

Dongsuk Oh, Jungwoo Lim, Kinam Park, Heuiseok Lim

Research output: Contribution to journal › Article › peer-review

Abstract

A commonsense question and answering (CSQA) system predicts the correct answer based on a comprehensive understanding of the question. Previous research has developed models that take QA pairs, the corresponding evidence, or a knowledge graph as input, each executing QA tasks with representations from pre-trained language models. However, whether pre-trained language models achieve complete comprehension remains debatable. In this study, adversarial attack experiments were conducted on question understanding. We examined the restrictions on the question-reasoning process of the pre-trained language model and then demonstrated the need for models that use the logical structure of abstract meaning representations (AMRs). The experimental results further showed that the method performed best when the AMR graph was extended with ConceptNet; with this extension, the proposed method outperformed the baseline on diverse commonsense-reasoning QA tasks.

Original language: English
Article number: 9202
Journal: Applied Sciences (Switzerland)
Volume: 12
Issue number: 18
DOIs
Publication status: Published - Sep 2022

Keywords

  • abstract meaning representation
  • commonsense question and answering
  • commonsense reasoning
  • ConceptNet
  • pre-trained language model
  • semantic representation
  • sub-symbolic

ASJC Scopus subject areas

  • Materials Science(all)
  • Instrumentation
  • Engineering(all)
  • Process Chemistry and Technology
  • Computer Science Applications
  • Fluid Flow and Transfer Processes
