A novel embedding approach to learn word vectors by weighting semantic relations: SemSpace
Date
2021
Authors
Journal Title
Journal ISSN
Volume Title
Publisher
Pergamon-Elsevier Science Ltd
Access Rights
info:eu-repo/semantics/closedAccess
Abstract
In this study, we propose a novel embedding approach, called SemSpace, to determine word vectors of synsets and to find the best weights for semantic relations. SemSpace first finds the optimum weights for the semantic relations in WordNet by aligning them to similarity values produced by human judgment, and then determines word vectors of synsets by adjusting the Euclidean distances among them. The proposed approach requires two inputs: a lexical-semantic network such as WordNet, and a word-level similarity dataset generated by human annotators. In the experiments, we used WordNet 3.0 as the lexical-semantic network and three benchmark test sets (RG65, WS353, and MEN3K) to align the semantic weights. Using the aligned semantic weights and the determined word vectors, the results obtained on the benchmark test sets are compared with studies from the literature. According to the obtained results, SemSpace is successful not only at finding word-level semantic similarity values and semantic weights, but also at discovering new semantic relations together with their semantic levels.
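To illustrate the core idea, the following is a minimal sketch in Python (not the authors' implementation) of a weighted-relation similarity over WordNet using NLTK. The per-relation costs, the distance cutoff, and the 1/(1+d) scoring are illustrative assumptions; SemSpace instead learns such relation weights by aligning them to human similarity judgments (e.g., RG65) and derives synset vectors from the resulting distances.

# Minimal sketch: weighted-relation similarity over WordNet, where each
# semantic relation type carries its own traversal cost. All numeric values
# below are illustrative assumptions, not the learned SemSpace weights.
import heapq
from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

# Hypothetical per-relation costs (lower cost = stronger relation).
RELATION_COSTS = {
    "hypernyms": 1.0,
    "hyponyms": 1.0,
    "part_meronyms": 1.5,
    "part_holonyms": 1.5,
    "member_meronyms": 2.0,
    "member_holonyms": 2.0,
}

def neighbors(synset):
    """Yield (neighbor_synset, cost) pairs over the weighted relations."""
    for rel, cost in RELATION_COSTS.items():
        for nb in getattr(synset, rel)():
            yield nb, cost

def weighted_distance(s1, s2, max_cost=10.0):
    """Dijkstra over WordNet relations, pruned once max_cost is exceeded."""
    frontier = [(0.0, s1.name())]
    best_cost = {s1.name(): 0.0}
    synsets = {s1.name(): s1}
    while frontier:
        d, name = heapq.heappop(frontier)
        if name == s2.name():
            return d
        if d > max_cost or d > best_cost.get(name, float("inf")):
            continue  # pruned or stale heap entry
        for nb, cost in neighbors(synsets[name]):
            nd = d + cost
            if nd < best_cost.get(nb.name(), float("inf")):
                best_cost[nb.name()] = nd
                synsets[nb.name()] = nb
                heapq.heappush(frontier, (nd, nb.name()))
    return None  # not reachable within max_cost

def word_similarity(w1, w2):
    """Smallest weighted distance over all synset pairs, mapped to a similarity."""
    best = None
    for s1 in wn.synsets(w1):
        for s2 in wn.synsets(w2):
            d = weighted_distance(s1, s2)
            if d is not None and (best is None or d < best):
                best = d
    return None if best is None else 1.0 / (1.0 + best)

if __name__ == "__main__":
    print(word_similarity("car", "automobile"))  # shared synset -> similarity 1.0
    print(word_similarity("car", "bicycle"))     # linked via hypernym/hyponym path

In SemSpace terms, the fixed RELATION_COSTS dictionary would be replaced by weights optimized so that the resulting word-level similarities correlate as closely as possible with the human-annotated benchmark scores, and the synset vectors would then be placed so that their Euclidean distances reproduce these weighted distances.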
Description
Keywords
SemSpace, Embedding, Word vectors, Aligning semantic relations to weights, WordNet
Source
Expert Systems With Applications
WoS Q Value
Q1
Scopus Q Value
Q1
Volume
180