Real-time fault detection in multirotor UAVs using lightweight deep learning and high-fidelity simulation data with single and double fault magnitudes

dc.authoridSohel, Ferdous/0000-0003-1557-4907
dc.authoridasadi, davood/0000-0002-2066-6016
dc.authoridMowla, Najmul/0000-0003-0613-9858
dc.contributor.authorMowla, Md. Najmul
dc.contributor.authorAsadi, Davood
dc.contributor.authorSohel, Ferdous
dc.date.accessioned2026-02-27T07:33:31Z
dc.date.available2026-02-27T07:33:31Z
dc.date.issued2025
dc.description.abstractRobust fault detection and diagnosis (FDD) in multirotor unmanned aerial vehicles (UAVs) remains challenging due to limited actuator redundancy, nonlinear dynamics, and environmental disturbances. This work introduces two lightweight deep learning architectures: the Convolutional-LSTM Fault Detection Network (CLFDNet), which combines multi-scale one-dimensional convolutional neural networks (1D-CNN), long short-term memory (LSTM) units, and an adaptive attention mechanism for spatio-temporal fault feature extraction; and the Autoencoder LSTM Multi-loss Fusion Network (AELMFNet), a soft attention-enhanced LSTM autoencoder optimized via multi-loss fusion for fine-grained fault severity estimation. Both models are trained and evaluated on UAV-Fault Magnitude V1, a high-fidelity simulation dataset containing 114,230 labeled samples with motor degradation levels ranging from 5% to 40% in the take-off, hover, navigation, and descent phases, representing the most probable and recoverable fault scenarios in quadrotor UAVs. Including coupled faults enables models to learn correlated degradation patterns and actuator interactions while maintaining controllability under standard flight laws. CLFDNet achieves 96.81% precision in fault severity classification and 100% accuracy in motor fault localization with only 19.6K parameters, demonstrating suitability for real-time onboard applications. AELMFNet achieves the lowest reconstruction loss of 0.001 with Huber loss and an inference latency of 6 ms/step, underscoring its efficiency for embedded deployment. Comparative experiments against 15 baselines, including five classical machine learning models, five state-of-the-art fault detection methods, and five attention-based deep learning variants, validate the effectiveness of the proposed architectures. These findings confirm that lightweight deep models enable accurate and efficient diagnosis of UAV faults with minimal sensing.
dc.description.sponsorshipTürkiye Bilimsel ve Teknolojik Araştırma Kurumu [223M312]
dc.description.sponsorshipThis research is supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under the TUBITAK 1001 program, with project number 223M312.
dc.identifier.doi10.1007/s40747-025-02195-y
dc.identifier.issn2199-4536
dc.identifier.issn2198-6053
dc.identifier.issue2
dc.identifier.urihttp://dx.doi.org/10.1007/s40747-025-02195-y
dc.identifier.urihttps://hdl.handle.net/20.500.14669/4623
dc.identifier.volume12
dc.identifier.wosWOS:001655537000001
dc.indekslendigikaynakWeb of Science
dc.language.isoen
dc.publisherSpringer Heidelberg
dc.relation.ispartofComplex & Intelligent Systems
dc.relation.publicationcategoryMakale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı
dc.rightsinfo:eu-repo/semantics/openAccess
dc.snmzKA_20260302
dc.subjectMultirotor UAV
dc.subjectFault detection
dc.subjectHigh-fidelity simulation
dc.subjectMotor fault analysis
dc.subjectDeep learning
dc.subjectLoss function optimization
dc.titleReal-time fault detection in multirotor UAVs using lightweight deep learning and high-fidelity simulation data with single and double fault magnitudes
dc.typeArticle