TY - GEN
T1 - Adaptive convolution for text classification
AU - Choi, Byung Ju
AU - Park, Jun Hyung
AU - Lee, Sang Keun
N1 - Funding Information:
This research was supported by the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (MSIT) (No. 2018R1A2A1A05078380). This research was also supported in part by the Information Technology Research Center (ITRC) support program supervised by the Institute for Information & communications Technology Promotion (IITP) (IITP-2019-2016-0-00464).
Publisher Copyright:
© 2019 Association for Computational Linguistics
PY - 2019
Y1 - 2019
AB - In this paper, we present an adaptive convolution for text classification to give stronger flexibility to convolutional neural networks (CNNs). Unlike traditional convolutions, which use the same set of filters regardless of the input, adaptive convolution employs adaptively generated convolutional filters that are conditioned on the input. We achieve this by attaching filter-generating networks, which are carefully designed to generate input-specific filters, to convolution blocks in existing CNNs. We show the efficacy of our approach on existing CNNs through performance evaluation. Our evaluation indicates that adaptive convolution improves all baselines, without exception, by up to 2.6 percentage points on seven benchmark text classification datasets.
UR - http://www.scopus.com/inward/record.url?scp=85085544480&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85085544480
T3 - NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference
SP - 2475
EP - 2485
BT - Long and Short Papers
PB - Association for Computational Linguistics (ACL)
T2 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL HLT 2019
Y2 - 2 June 2019 through 7 June 2019
ER -