In this paper, we present a probabilistic shift-reduce parsing model that overcomes the limited context-sensitivity of previous LR parsing models. Because previous models are constrained by the LR parsing framework, they can exploit only a lookahead and an LR state (stack). The proposed model is not bound by the LR parsing framework and can incorporate rich contextual information as needed. As an example of contextual information designed for applying the proposed model to Korean, we devise a new context scheme named "surface-context-types", which uses syntactic structures, sentential forms, and selected lexical items. Experimental results show that the rich contextual information used by our model improves parsing accuracy, and that our model outperforms the previous models even when using a lookahead alone.
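To illustrate the core idea of conditioning shift-reduce actions on richer context than a single lookahead, the sketch below shows a toy probabilistic shift-reduce step. The lexicon, grammar, feature tuple, and probability table are all hypothetical assumptions for illustration only, not the paper's actual model or its surface-context-types scheme.

```python
import math

# Hypothetical toy lexicon mapping words to POS tags (illustration only).
LEXICON = {"the": "Det", "dog": "N"}

def extract_context(stack, buffer):
    """Build a context tuple richer than the classic (state, lookahead) pair:
    besides the lookahead, it inspects the top two stack symbols, standing in
    for the structural features the paper's model can draw on."""
    top = stack[-1] if stack else "<bot>"
    second = stack[-2] if len(stack) > 1 else "<bot>"
    lookahead = buffer[0] if buffer else "<eos>"
    return (second, top, lookahead)

# Toy conditional probabilities P(action | context); the numbers are invented.
ACTION_PROBS = {
    ("<bot>", "<bot>", "the"): {"shift": 1.0},
    ("<bot>", "Det", "dog"): {"shift": 0.9, "reduce": 0.1},
    ("Det", "N", "<eos>"): {"reduce": 0.8, "shift": 0.2},
    ("<bot>", "NP", "<eos>"): {"accept": 1.0},
}

def parse(words):
    """Greedy probabilistic shift-reduce loop: at each step, pick the most
    probable action given the extracted context and accumulate log-probability."""
    stack, buffer, logp = [], list(words), 0.0
    while True:
        ctx = extract_context(stack, buffer)
        action, p = max(ACTION_PROBS[ctx].items(), key=lambda kv: kv[1])
        logp += math.log(p)
        if action == "shift":
            stack.append(LEXICON[buffer.pop(0)])
        elif action == "reduce":
            stack[-2:] = ["NP"]  # apply the single toy rule NP -> Det N
        else:  # accept
            return stack, logp
```

For example, `parse(["the", "dog"])` shifts twice, reduces to `NP`, and accepts with probability 1.0 × 0.9 × 0.8 = 0.72; a real model would replace the lookup table with a statistical estimator over the chosen context features.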