Probabilistic shift-reduce parsing model using rich contextual information

Yong Jae Kwak, So Young Park, Joon Ho Lim, Hae-Chang Rim

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In this paper, we present a probabilistic shift-reduce parsing model which can overcome the low context-sensitivity of previous LR parsing models. Since previous models are restricted by the LR parsing framework, they can utilize only a lookahead and an LR state (stack). The proposed model is not restricted by the LR parsing framework and is able to add rich contextual information as needed. To show an example of contextual information designed for applying the proposed model to Korean, we devise a new context scheme named "surface-context-types" which uses syntactic structures, sentential forms, and selective lexicals. Experimental results show that the rich contextual information used by our model can improve parsing accuracy, and that our model outperforms the previous models even when using a lookahead alone.
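The core idea the abstract describes can be illustrated with a minimal sketch: a shift-reduce parser that, at each step, scores the shift and reduce actions against contextual features (here just the stack-top label and the lookahead tag). The grammar, the feature scheme, and the action probabilities below are all made up for illustration; they are not the paper's model or its "surface-context-types" scheme, in which the context would be far richer (syntactic structures, sentential forms, selected lexicals).

```python
import math
from collections import namedtuple

Tree = namedtuple("Tree", "label children")

# Toy grammar (hypothetical): NP -> DET N ; S -> NP V
RULES = {("DET", "N"): "NP", ("NP", "V"): "S"}

# Toy action probabilities conditioned on (stack-top label, lookahead tag).
# In a trained model these would be estimated from a treebank.
ACTION_PROB = {
    ("DET", "N"): {"shift": 0.9, "reduce": 0.1},
    ("N", "V"):   {"shift": 0.2, "reduce": 0.8},
    ("NP", "V"):  {"shift": 0.9, "reduce": 0.1},
}

def parse(tags):
    """Greedy probabilistic shift-reduce parse over a tag sequence.

    Returns the final stack of trees and the log-probability of the
    chosen action sequence."""
    stack, i, logp = [], 0, 0.0
    while True:
        top = tags and stack[-1].label if stack else None
        look = tags[i] if i < len(tags) else None
        probs = ACTION_PROB.get((top, look), {"shift": 1.0, "reduce": 1.0})
        can_reduce = len(stack) >= 2 and (stack[-2].label, stack[-1].label) in RULES
        can_shift = i < len(tags)
        if can_reduce and (not can_shift or probs["reduce"] >= probs["shift"]):
            # REDUCE: replace the top two subtrees by their parent.
            right, left = stack.pop(), stack.pop()
            stack.append(Tree(RULES[(left.label, right.label)], (left, right)))
            logp += math.log(probs["reduce"])
        elif can_shift:
            # SHIFT: move the lookahead onto the stack as a leaf.
            stack.append(Tree(look, ()))
            i += 1
            logp += math.log(probs["shift"])
        else:
            break
    return stack, logp

trees, logp = parse(["DET", "N", "V"])
print([t.label for t in trees])  # -> ['S']
```

A real model in this family would replace the lookup table with probabilities conditioned on much richer context, and would search over action sequences (e.g. with a beam) rather than committing greedily.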

Original language: English
Pages (from-to): 93-96
Number of pages: 4
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2945
Publication status: Published - 2004 Dec 1

ASJC Scopus subject areas

  • Computer Science (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Theoretical Computer Science

Cite this

@article{584649c28b2144648a0024d69c80182e,
title = "Probabilistic shift-reduce parsing model using rich contextual information",
abstract = "In this paper, we present a probabilistic shift-reduce parsing model which can overcome the low context-sensitivity of previous LR parsing models. Since previous models are restricted by the LR parsing framework, they can utilize only a lookahead and an LR state (stack). The proposed model is not restricted by the LR parsing framework and is able to add rich contextual information as needed. To show an example of contextual information designed for applying the proposed model to Korean, we devise a new context scheme named {"}surface-context-types{"} which uses syntactic structures, sentential forms, and selective lexicals. Experimental results show that the rich contextual information used by our model can improve parsing accuracy, and that our model outperforms the previous models even when using a lookahead alone.",
author = "Kwak, {Yong Jae} and Park, {So Young} and Lim, {Joon Ho} and Hae-Chang Rim",
year = "2004",
month = "12",
day = "1",
language = "English",
volume = "2945",
pages = "93--96",
journal = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
issn = "0302-9743",
publisher = "Springer Verlag",

}

TY - JOUR

T1 - Probabilistic shift-reduce parsing model using rich contextual information

AU - Kwak, Yong Jae

AU - Park, So Young

AU - Lim, Joon Ho

AU - Rim, Hae-Chang

PY - 2004/12/1

Y1 - 2004/12/1

N2 - In this paper, we present a probabilistic shift-reduce parsing model which can overcome the low context-sensitivity of previous LR parsing models. Since previous models are restricted by the LR parsing framework, they can utilize only a lookahead and an LR state (stack). The proposed model is not restricted by the LR parsing framework and is able to add rich contextual information as needed. To show an example of contextual information designed for applying the proposed model to Korean, we devise a new context scheme named "surface-context-types" which uses syntactic structures, sentential forms, and selective lexicals. Experimental results show that the rich contextual information used by our model can improve parsing accuracy, and that our model outperforms the previous models even when using a lookahead alone.

AB - In this paper, we present a probabilistic shift-reduce parsing model which can overcome the low context-sensitivity of previous LR parsing models. Since previous models are restricted by the LR parsing framework, they can utilize only a lookahead and an LR state (stack). The proposed model is not restricted by the LR parsing framework and is able to add rich contextual information as needed. To show an example of contextual information designed for applying the proposed model to Korean, we devise a new context scheme named "surface-context-types" which uses syntactic structures, sentential forms, and selective lexicals. Experimental results show that the rich contextual information used by our model can improve parsing accuracy, and that our model outperforms the previous models even when using a lookahead alone.

UR - http://www.scopus.com/inward/record.url?scp=35048869693&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=35048869693&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:35048869693

VL - 2945

SP - 93

EP - 96

JO - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

JF - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SN - 0302-9743

ER -