Title: Which Sentence Representation is More Informative: An Analysis on Text Classification
Authors: Bölücü, Necva; Can, Burcu
Type: Conference Object
Date issued: 2023
Date accessioned/available: 2025-01-06
ISBN: 978-195942932-6
Scopus ID: 2-s2.0-85175403536
Handle: https://hdl.handle.net/20.500.14669/1388
Affiliations: Georgetown College of Arts and Sciences; Georgetown Department of Linguistics; Georgetown Faculty of Languages and Linguistics
Conference: 7th International Conference on Dependency Linguistics (Depling, GURT/SyntaxFest 2023), Depling 2023, 9 March 2023 through 12 March 2023, Virtual, Online (conference code: 192806)
Start page: 219
Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Classification (of information); Computational linguistics; Deep neural networks; Natural language processing systems; Syntactics; Text processing; Classification tasks; Dependency trees; In-depth analysis; Language processing; Natural languages; Network models; Semantic representation; Semantics information; Syntactic information; Text classification; Semantics

Abstract: Text classification is a popular and well-studied problem in Natural Language Processing. Most previous work on text classification has focused on deep neural networks such as LSTMs and CNNs; however, studies that use syntactic and semantic information for text classification remain very limited in the literature. In this study, we propose a model based on a Graph Attention Network (GAT) that incorporates semantic and syntactic information as input for the text classification task. The semantic representations UCCA and AMR serve as the semantic information, and the dependency tree serves as the syntactic information. Extensive experimental results and in-depth analysis show that the UCCA-GAT model, which is semantic-aware, outperforms the AMR-GAT and DEP-GAT models, which are semantic-aware and syntax-aware respectively. We also provide a comprehensive analysis of the proposed model to understand the limitations of these representations for the problem. © 2023 Association for Computational Linguistics.
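The abstract describes running a Graph Attention Network over sentence graphs (UCCA, AMR, or dependency trees). As a rough illustration of the mechanism involved, and not the paper's actual model, the following is a minimal single-head GAT layer in numpy: each token attends only to its graph neighbours, with attention weights computed from a shared linear transform and an attention vector. All names and sizes here are illustrative assumptions.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # LeakyReLU as used in the original GAT attention scoring
    return np.where(x > 0, x, slope * x)

def gat_layer(H, adj, W, a):
    """One single-head Graph Attention (GAT) layer over a sentence graph.

    H   : (N, F)   node features, e.g. token embeddings (illustrative)
    adj : (N, N)   binary adjacency of a UCCA / AMR / dependency graph,
                   with self-loops so every row has at least one neighbour
    W   : (F, F')  shared linear transform
    a   : (2*F',)  attention parameter vector
    """
    Wh = H @ W                                  # (N, F')
    Fp = Wh.shape[1]
    # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) splits into two dot products
    src = Wh @ a[:Fp]                           # contribution of node i
    dst = Wh @ a[Fp:]                           # contribution of node j
    e = leaky_relu(src[:, None] + dst[None, :]) # (N, N) raw scores
    e = np.where(adj > 0, e, -np.inf)           # attend only along graph edges
    e -= e.max(axis=1, keepdims=True)           # numerically stable softmax
    alpha = np.exp(e)
    alpha /= alpha.sum(axis=1, keepdims=True)   # each row sums to 1
    return np.tanh(alpha @ Wh)                  # aggregated node features

# Toy 3-token "sentence" whose (assumed) dependency arcs give this adjacency.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
a = rng.normal(size=(8,))
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]])
out = gat_layer(H, adj, W, a)   # (3, 4) updated token representations
```

Masking non-edges with `-inf` before the softmax is the standard way to restrict attention to graph neighbours; the final `tanh` is just one simple choice of output nonlinearity.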