Image-based UAV position and velocity estimation using a monocular camera

dc.authoridNabavi Chashmi, Seyed Yaser/0000-0003-1836-2600
dc.authoridAhmadi, Karim/0000-0002-2633-3351
dc.authoridasadi, davood/0000-0002-2066-6016
dc.contributor.authorNabavi-Chashmi, Seyed-Yaser
dc.contributor.authorAsadi, Davood
dc.contributor.authorAhmadi, Karim
dc.date.accessioned2025-01-06T17:43:45Z
dc.date.available2025-01-06T17:43:45Z
dc.date.issued2023
dc.description.abstractAutonomous landing of aerial vehicles is challenging, especially in emergency flight scenarios in which precise information about the vehicle and the environment is required for near-to-ground maneuvers. In this paper, the optic-flow concept based on feature detection is applied to estimate the vertical distance and the velocity vector of a multirotor UAV (MUAV) during landing. The UAV kinematics, the optical-flow equations, and the detected feature states, provided by a low-cost monocular camera, are combined to develop a novel model for estimation. The proposed algorithm uses the variation of the detected features, the angular velocities, and the Euler angles, measured by the Inertial Measurement Unit (IMU), to estimate the vertical distance of the UAV to the ground and the MUAV velocity vector, and to predict future feature positions. An extended Kalman filter (EKF) is applied as the estimation method to the coupled optic-flow and kinematic equations. The accuracy of state estimation is enhanced by tracking multiple features. The 6-DOF simulations, laboratory experiments, and comparison of results demonstrate that the height and velocity of a MUAV can be estimated in the landing phase of flight using only low-cost camera information. Monte Carlo simulations have been performed to study the effect of IMU acceleration and angular-velocity measurement noises, as well as the number of detected features, on the success probability of the estimation process. The results reveal that increasing the number of detected features, i.e., tracking multiple features, increases the estimation accuracy; more importantly, it improves the success probability, which is the more critical factor in practical scenarios.
dc.description.sponsorshipScientific and Technological Research Council of Turkey (TUBITAK) under 3501 program [120M793]
dc.description.sponsorshipThis research was supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under the 3501 program, project number 120M793.
dc.identifier.doi10.1016/j.conengprac.2023.105460
dc.identifier.issn0967-0661
dc.identifier.issn1873-6939
dc.identifier.scopus2-s2.0-85149170078
dc.identifier.scopusqualityQ1
dc.identifier.urihttps://doi.org/10.1016/j.conengprac.2023.105460
dc.identifier.urihttps://hdl.handle.net/20.500.14669/2770
dc.identifier.volume134
dc.identifier.wosWOS:000949676000001
dc.identifier.wosqualityQ1
dc.indekslendigikaynakWeb of Science
dc.indekslendigikaynakScopus
dc.language.isoen
dc.publisherPergamon-Elsevier Science Ltd
dc.relation.ispartofControl Engineering Practice
dc.relation.publicationcategoryArticle - International Refereed Journal - Institutional Faculty Member
dc.rightsinfo:eu-repo/semantics/closedAccess
dc.snmzKA_20241211
dc.subjectMultirotor UAV
dc.subjectVision-based
dc.subjectMonocular camera
dc.subjectEstimation
dc.subjectLanding phase
dc.titleImage-based UAV position and velocity estimation using a monocular camera
dc.typeArticle