Which Sentence Representation is More Informative: An Analysis on Text Classification


Date

2023

Journal Title

Journal ISSN

Volume Title

Publisher

Association for Computational Linguistics

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

Text classification is a popular and well-studied problem in Natural Language Processing. Most previous work on text classification has focused on deep neural networks such as LSTMs and CNNs; studies that exploit syntactic and semantic information remain scarce in the literature. In this study, we propose a model based on the Graph Attention Network (GAT) that takes semantic and syntactic information as input for the text classification task. The UCCA and AMR representations serve as the semantic information, and the dependency tree serves as the syntactic information. Extensive experimental results and in-depth analysis show that the semantics-aware UCCA-GAT model outperforms both AMR-GAT (semantics-aware) and DEP-GAT (syntax-aware). We also provide a comprehensive analysis of the proposed model to understand the limitations of these representations for the problem. © 2023 Association for Computational Linguistics.
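The abstract does not include code, but the attention mechanism that all three variants (UCCA-GAT, AMR-GAT, DEP-GAT) build on is the standard GAT layer of Veličković et al. (2018). Below is a minimal NumPy sketch of one such layer, not the authors' implementation; the graph, feature dimensions, and tanh output activation are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a):
    """One GAT layer: H (n, f_in) node features, A (n, n) adjacency
    with self-loops, W (f_in, f_out) shared projection, a (2*f_out,)
    attention vector."""
    Z = H @ W                        # project node features
    f = Z.shape[1]
    # e[i, j] = LeakyReLU(a . [Z_i || Z_j]); split a into the halves
    # applied to the source and target node
    e = leaky_relu((Z @ a[:f])[:, None] + (Z @ a[f:])[None, :])
    # mask non-edges, then softmax over each node's neighbourhood
    e = np.where(A > 0, e, -np.inf)
    e = e - e.max(axis=1, keepdims=True)
    alpha = np.exp(e)
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    # aggregate neighbour features weighted by attention
    return np.tanh(alpha @ Z), alpha

# Toy graph: 4 nodes (e.g. tokens of a sentence), edges from a parse
# (dependency tree, UCCA, or AMR graph), plus self-loops as in GAT.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
H = rng.normal(size=(4, 5))   # hypothetical 5-dim input features
W = rng.normal(size=(5, 3))   # project down to 3 dims
a = rng.normal(size=(6,))     # attention vector over [Z_i || Z_j]

H_out, alpha = gat_layer(H, A, W, a)
```

Swapping in a different input graph (dependency edges versus UCCA or AMR edges) changes only `A`; the layer itself is shared across the three model variants.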

Description

Georgetown College of Arts and Sciences; Georgetown Department of Linguistics; Georgetown Faculty of Languages and Linguistics
7th International Conference on Dependency Linguistics (Depling, GURT/SyntaxFest 2023), 9–12 March 2023, virtual/online.

Keywords

Classification (of information), Computational linguistics, Deep neural networks, Natural language processing systems, Syntactics, Text processing, Classification tasks, Dependency trees, In-depth analysis, Language processing, Natural languages, Network models, Semantic representation, Semantic information, Syntactic information, Text classification, Semantics

Source

Depling 2023 - 7th International Conference on Dependency Linguistics (Depling, GURT/SyntaxFest 2023), Proceedings

WoS Q Value

Scopus Q Value

Volume

Issue

Citation