Yazar "Mowla, Md. Najmul" seçeneğine göre listele
Listing items 1 - 2 of 2
Item
Internet of Things and Wireless Sensor Networks for Smart Agriculture Applications: A Survey (IEEE-Inst Electrical Electronics Engineers Inc, 2023)
Mowla, Md. Najmul; Mowla, Neazmul; Shah, A. F. M. Shahen; Rabie, Khaled M.; Shongwe, Thokozani
Increasing food scarcity necessitates sustainable agriculture, achieved through automation, to meet growing demand. Integrating the Internet of Things (IoT) and Wireless Sensor Networks (WSNs) is crucial to enhancing food production across agricultural domains, including irrigation, soil moisture monitoring, fertilizer optimization and control, early-stage pest and crop disease management, and energy conservation. Wireless protocols such as ZigBee, WiFi, SigFox, and LoRaWAN are commonly employed to collect real-time data for monitoring purposes. Embracing advanced technology is imperative to ensure efficient annual production. This study therefore takes a comprehensive, future-oriented approach, examining IoT-WSNs, wireless network protocols, and their agricultural applications since 2019. It reviews IoT and WSN architectures, summarizes the relevant network protocols, addresses recent issues and challenges related to IoT-WSNs, and proposes mitigation strategies. It closes with clear recommendations for integrating advanced technology into future smart agriculture systems.

Item
UAVs-FFDB: A high-resolution dataset for advancing forest fire detection and monitoring using unmanned aerial vehicles (UAVs) (Elsevier Inc., 2024)
Mowla, Md. Najmul; Asadi, Davood; Tekeoglu, Kadriye Nur; Masum, Shamsul; Rabie, Khaled
Forest ecosystems face increasing wildfire threats, demanding prompt and precise detection methods for efficient fire control; however, the accessibility and timeliness of real-time forest fire data require improvement. This study addresses that challenge by introducing the Unmanned Aerial Vehicles (UAVs) based forest fire database (UAVs-FFDB), which has a dual composition. First, it comprises 1653 high-resolution RGB raw images captured using a standard S500 quadcopter frame with a RaspiCamV2 camera. Second, it incorporates augmented data, bringing the total to 15560 images and enhancing the diversity and comprehensiveness of the dataset. The images were captured in a forested area adjacent to Adana Alparslan Türkeş Science and Technology University in Adana, Turkey. Raw image dimensions range from 353 × 314 to 640 × 480 pixels, and augmented images range from 398 × 358 to 640 × 480 pixels; the raw subset totals 692 MB, while the considerably larger augmented subset totals 6.76 GB. The raw images were obtained during a UAV surveillance mission, with the camera angled at -180 degrees, horizontal to the ground. Images were taken at altitudes alternating between 5 and 15 meters to diversify the field of vision and build a more inclusive database; during the surveillance operation, the UAV flew at an average speed of 2 m/s. The dataset was then annotated with the Makesense.ai annotation platform, enabling accurate demarcation of fire boundaries.
This resource equips researchers with the data infrastructure needed to develop innovative methodologies for early fire detection and continuous monitoring, enhancing efforts to protect ecosystems and human lives while promoting sustainable forest management practices. The UAVs-FFDB dataset also serves as a foundation for advancing and refining state-of-the-art AI-based methods that automate fire classification, recognition, detection, and segmentation. © 2024 The Author(s)
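As a minimal sketch of how a local copy of the dataset might be inspected, the Python snippet below walks a directory of images, reads each one with Pillow, and flags any whose dimensions fall outside the 353 × 314 to 640 × 480 range reported for the raw subset. The directory name "UAVs-FFDB/raw" and the file extensions are assumptions for illustration, not part of the published dataset description; adjust them to the actual layout of the downloaded archive.

# Minimal sketch: sanity-check image dimensions in a local copy of UAVs-FFDB.
# The directory layout ("UAVs-FFDB/raw") and extensions are assumptions.
from pathlib import Path
from PIL import Image

DATASET_DIR = Path("UAVs-FFDB/raw")  # hypothetical location of the raw subset
MIN_W, MIN_H = 353, 314              # smallest raw image size reported
MAX_W, MAX_H = 640, 480              # largest raw image size reported

def check_images(root: Path) -> None:
    """Print images whose size falls outside the documented range."""
    count = 0
    for path in sorted(root.rglob("*")):
        if path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue  # skip annotation files and other non-image content
        with Image.open(path) as img:
            w, h = img.size
        count += 1
        if not (MIN_W <= w <= MAX_W and MIN_H <= h <= MAX_H):
            print(f"out of range: {path.name} ({w}x{h})")
    print(f"checked {count} images")

if __name__ == "__main__":
    check_images(DATASET_DIR)

The same check can be pointed at the augmented subset by swapping in its directory and the 398 × 358 lower bound given above.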