Human activity recognition from multiple sensors data using deep CNNs
dc.authorid | Topuz, Elif Kevser/0000-0002-0207-8069 | |
dc.authorid | KAYA, Yasin/0000-0002-9074-0189 | |
dc.contributor.author | Kaya, Yasin | |
dc.contributor.author | Topuz, Elif Kevser | |
dc.date.accessioned | 2025-01-06T17:43:16Z | |
dc.date.available | 2025-01-06T17:43:16Z | |
dc.date.issued | 2024 | |
dc.description.abstract | Smart devices with sensors now enable continuous measurement of activities of daily living. Accordingly, various human activity recognition (HAR) experiments have been carried out, aiming to convert the measurements taken from smart devices into physical activity types. HAR can be applied in many research areas, such as health assessment, environmentally supported living systems, sports, exercise, and security systems. The HAR process can also detect activity-based anomalies in daily life for elderly people. Thus, this study focused on sensor-based activity recognition, and we developed a new 1D-CNN-based deep learning approach to detect human activities. We evaluated our model using raw accelerometer and gyroscope sensor data on three public datasets: UCI-HAPT, WISDM, and PAMAP2. Parameter optimization was employed to define the model's architecture and fine-tune the final design's hyper-parameters. We applied 6-, 7-, and 12-class activity recognition to the UCI-HAPT dataset and obtained accuracy rates of 98%, 96.9%, and 94.8%, respectively. We also achieved accuracy rates of 97.8% and 90.27% on the WISDM and PAMAP2 datasets, respectively. Moreover, we investigated the impact of using each sensor's data individually, and the results show that our model achieved better results using both sensors' data concurrently. | |
dc.identifier.doi | 10.1007/s11042-023-15830-y | |
dc.identifier.endpage | 10838 | |
dc.identifier.issn | 1380-7501 | |
dc.identifier.issn | 1573-7721 | |
dc.identifier.issue | 4 | |
dc.identifier.scopus | 2-s2.0-85163166035 | |
dc.identifier.scopusquality | Q1 | |
dc.identifier.startpage | 10815 | |
dc.identifier.uri | https://doi.org/10.1007/s11042-023-15830-y | |
dc.identifier.uri | https://hdl.handle.net/20.500.14669/2574 | |
dc.identifier.volume | 83 | |
dc.identifier.wos | WOS:001019903000012 | |
dc.identifier.wosquality | Q2 | |
dc.indekslendigikaynak | Web of Science | |
dc.indekslendigikaynak | Scopus | |
dc.language.iso | en | |
dc.publisher | Springer | |
dc.relation.ispartof | Multimedia Tools and Applications | |
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı | |
dc.rights | info:eu-repo/semantics/closedAccess | |
dc.snmz | KA_20241211 | |
dc.subject | Human activity recognition | |
dc.subject | 1D-CNN | |
dc.subject | Deep learning | |
dc.subject | Signal processing | |
dc.title | Human activity recognition from multiple sensors data using deep CNNs | |
dc.type | Article |