Yazar "Sarac Essiz, Esra" seçeneğine göre listele
Listeleniyor 1 - 3 / 3
Item
Binary Anarchic Society Optimization for Feature Selection
(Editura Acad Romane, 2023) Kılıç, Umit; Sarac Essiz, Esra; Kaya Keles, Mumine
Datasets comprise a collection of features; however, not all of these features may be necessary. Feature selection is the process of identifying the most relevant features while eliminating redundant or irrelevant ones. To be effective, feature selection should improve classification performance while reducing the number of features. Existing algorithms can be adapted and modified into feature selectors. In this study, we introduce the implementation of the Anarchic Society Optimization algorithm, a human-inspired algorithm, as a feature selector. This is the first study that utilizes the binary version of the algorithm for feature selection. The proposed Binary Anarchic Society Algorithm is evaluated on nine datasets and compared to three known algorithms: Binary Genetic Algorithm, Binary Particle Swarm Optimization, and Binary Gray Wolf Optimization. Additionally, four traditional feature selection techniques (Info Gain, Gain Ratio, Chi-square, and ReliefF) are incorporated for performance comparison. Our experiments highlight the competitive nature of the proposed method, suggesting its potential as a valuable addition to existing feature selection techniques.

Item
Performance Analysis of Artificial Neural Network Based Classifiers for Cyberbullying Detection
(Institute of Electrical and Electronics Engineers Inc., 2018) Curuk, Eren; Aci, Cigdem; Sarac Essiz, Esra
In this study, analyses were performed to detect cyberbullying with Artificial Neural Network (ANN) based classifiers. In contrast to the classifiers commonly used for cyberbullying detection in the literature, ANN-based classifiers such as Support Vector Machines (SVM), Stochastic Gradient Descent (SGD), Radial Basis Function (RBF) and Logistic Regression (LR) were tested. The performance of these classifiers was evaluated on comments from Formspring.me and Myspace. The n-gram model was used for feature extraction, with N = 1 chosen in order to measure the overall performance of the classifiers, and stop-words were removed from the features. In these experiments, F-measure values above 0.90 were obtained. Considering both the accuracy and the time performance of the classifiers, the SGD classifier was found to be the most suitable for cyberbullying detection. © 2018 IEEE.
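As a rough illustration of the wrapper-style search described in the Binary Anarchic Society Optimization study above, the sketch below shows the kind of fitness function that binary feature-selection metaheuristics commonly optimize: each candidate is a 0/1 mask over the features, scored by cross-validated accuracy with a small penalty on the number of selected features. The ASO-specific position-update rules are not reproduced here; the weight alpha, the KNN evaluator and the benchmark dataset are illustrative assumptions, not the authors' exact setup.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    # Score a candidate binary feature mask: reward cross-validated
    # accuracy, lightly penalize the fraction of features kept.
    # (alpha is an assumed trade-off weight, not taken from the paper.)
    if mask.sum() == 0:          # an empty subset is not a valid solution
        return 0.0
    X_sub = X[:, mask.astype(bool)]
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X_sub, y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

# Example: score one random candidate subset on a small benchmark dataset.
from sklearn.datasets import load_breast_cancer
X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
candidate = rng.integers(0, 2, size=X.shape[1])
print("fitness of random mask:", round(fitness(candidate, X, y), 4))

A metaheuristic such as the binary ASO variant would then generate and update many such masks, keeping the one with the highest fitness.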
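The evaluation setup of the cyberbullying-detection study above (unigram features with stop-words removed, several classifiers compared by F-measure) can be sketched along the following lines. The tiny toy corpus, the scikit-learn estimators and the cross-validation settings are assumptions for illustration, and the RBF-network classifier used in the paper has no direct scikit-learn counterpart, so it is omitted.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Tiny labelled toy corpus (1 = bullying, 0 = not); purely illustrative.
comments = [
    "you are so stupid and ugly",
    "nobody likes you just leave",
    "you are worthless and everyone hates you",
    "go away loser nobody wants you here",
    "great game last night well done",
    "thanks for sharing this helpful video",
    "congratulations on your exam results",
    "see you at practice tomorrow friends",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

# Unigram (N = 1) bag-of-words features with English stop-words removed.
vectorizer = CountVectorizer(ngram_range=(1, 1), stop_words="english")
X = vectorizer.fit_transform(comments)

classifiers = {
    "SVM": LinearSVC(),
    "SGD": SGDClassifier(max_iter=1000, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
}
for name, clf in classifiers.items():
    # Compare classifiers by mean F-measure over 2-fold cross-validation.
    f1 = cross_val_score(clf, X, labels, cv=2, scoring="f1").mean()
    print(f"{name}: F-measure = {f1:.3f}")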
Item
The Effects of Attribute Selection in Artificial Neural Network Based Classifiers on Cyberbullying Detection
(Institute of Electrical and Electronics Engineers Inc., 2018) Curuk, Eren; Aci, Cigdem; Sarac Essiz, Esra
Recently, as a result of the rapid spread of information and communication technologies, the use of smartphones, tablets and laptop computers has become widespread. Especially among young people, social networks have become a part of everyday life, and the cyberbullying problem has arisen because people can hide their identities in cyberspace and reach victims at every level. In this study, experiments were carried out for the detection of cyberbullying. A total of 3469 comments from YouTube were tagged as positive or negative, depending on whether they contained bullying. In the analyses, the number of features in the dataset was reduced to 10, 50, 100, 250 and 500 using the minimum redundancy maximum relevance (MRMR), ReliefF and recursive feature elimination (RFE) algorithms for feature selection, while support vector machines (SVM), stochastic gradient descent (SGD), radial basis function (RBF) and logistic regression (LR) were used as classification algorithms. The experimental studies showed that the SGD classifier combined with the RFE attribute selection algorithm achieved an F-measure value of 0.943. The other feature selection algorithms did not produce values as high as RFE, yielding F-measure values of roughly 0.76 to 0.84. © 2018 IEEE.
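A minimal sketch of the attribute-selection experiment described above, assuming scikit-learn: recursive feature elimination (RFE) shrinks the feature set to each target size k in {10, 50, 100, 250, 500} before an SGD classifier is trained and scored by F-measure. A synthetic dataset stands in for the 3469 vectorized YouTube comments, and MRMR and ReliefF are omitted because they are not part of scikit-learn.

from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the vectorized comment data (1000 features).
X, y = make_classification(n_samples=500, n_features=1000,
                           n_informative=40, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for k in (10, 50, 100, 250, 500):
    # RFE repeatedly drops the lowest-weighted features until k remain.
    selector = RFE(estimator=SGDClassifier(max_iter=1000, random_state=0),
                   n_features_to_select=k, step=0.1)
    X_train_k = selector.fit_transform(X_train, y_train)
    X_test_k = selector.transform(X_test)

    clf = SGDClassifier(max_iter=1000, random_state=0)
    clf.fit(X_train_k, y_train)
    print(f"k={k:>3}: F-measure = {f1_score(y_test, clf.predict(X_test_k)):.3f}")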