TALS is a UAV automated landing system (Automated Landing System) based on optical recognition. Shown here is a model of the data-processing pipeline that simulates video from the drone camera. The captured video is analyzed to detect feature points – markers on the runway.
A detected marker is labeled with a green rectangle and becomes one of the targets in the video frame. The group of targets defines the runway and its spatial position. The tracking system follows the group of targets during landing and produces control signals for the autopilot.
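The marker-detection step described above can be sketched as follows. This is a minimal illustration, not the actual TALS implementation: it assumes markers appear as bright, well-separated regions in a grayscale frame, thresholds the frame, and returns the axis-aligned bounding box of each connected bright region (the box that would be drawn as the green rectangle). The function name and threshold value are hypothetical.

```python
import numpy as np

def detect_markers(frame, thresh=200):
    """Return bounding boxes (x, y, w, h) of bright runway markers.

    Sketch: threshold the grayscale frame, then gather each 4-connected
    bright region with a simple flood fill and report its bounding box.
    """
    mask = frame >= thresh
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    for sy, sx in zip(*np.nonzero(mask)):
        if visited[sy, sx]:
            continue
        # flood-fill one connected bright region
        stack = [(sy, sx)]
        visited[sy, sx] = True
        ys, xs = [], []
        while stack:
            y, x = stack.pop()
            ys.append(y)
            xs.append(x)
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not visited[ny, nx]):
                    visited[ny, nx] = True
                    stack.append((ny, nx))
        boxes.append((min(xs), min(ys),
                      max(xs) - min(xs) + 1, max(ys) - min(ys) + 1))
    return boxes

# synthetic 40x40 frame with two bright square markers
frame = np.zeros((40, 40), dtype=np.uint8)
frame[5:10, 5:10] = 255
frame[20:28, 25:31] = 255
print(detect_markers(frame))
```

In a real system this stage would typically use a library routine such as OpenCV's connected-components analysis instead of a hand-written flood fill.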
The model provides different runway motions, including rotation about the X, Y, and Z axes and changes in distance to the camera. The rate of change in the video corresponds to the UAV's dynamic capabilities and allows a reliable tracking process.
Each target string shows the label number (green rectangle), label size, rotation angle, and aspect ratio. These data are used in the autopilot tracker to group the labels and to track each label over time.
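The per-target fields listed above (label number, size, rotation angle, aspect ratio) could be computed from a marker's pixel coordinates roughly as follows. This is an illustrative sketch under assumed definitions, since the exact TALS formulas are not given: size is taken as the pixel count, the angle as the orientation of the region's principal axis, and the aspect ratio from the axis-aligned bounding box.

```python
import numpy as np

def target_descriptor(label_id, ys, xs):
    """Build a per-target record: label number, size (pixel count),
    principal-axis rotation angle in degrees, and bounding-box aspect ratio.
    ys, xs: pixel coordinates of the labeled marker region."""
    size = len(xs)
    w = xs.max() - xs.min() + 1
    h = ys.max() - ys.min() + 1
    aspect = w / h
    # principal-axis angle from the eigenvector of the coordinate covariance
    pts = np.stack([xs - xs.mean(), ys - ys.mean()])
    cov = pts @ pts.T / size
    evals, evecs = np.linalg.eigh(cov)
    vx, vy = evecs[:, np.argmax(evals)]
    angle = np.degrees(np.arctan2(vy, vx))
    return {"label": label_id, "size": size, "angle": angle, "aspect": aspect}

# example: a horizontal 10x1 strip of marker pixels
ys = np.zeros(10, dtype=int)
xs = np.arange(10)
print(target_descriptor(1, ys, xs))
```

Note that the principal-axis angle is only defined modulo 180 degrees, so the tracker would need to account for that ambiguity when comparing angles between frames.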
For best viewing, use 720p (HD) or 1080p resolution and full-screen mode.
Below you can see the landing-video preprocessing performed before tracking. The aircraft-captured video used here is from www.justplanes.com.
TALS detects landscape feature points and runway markers. At the approach stage, TALS searches for feature (reference) points – FP, labeled in the video as FP01 … FP04 (not all FPs are marked). The FPs are fed to the TALS tracking subsystem.
At the landing stage, the system detects the runway and locks onto the runway image for coordinate processing.
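Tracking the group of targets (and the FPs) from frame to frame requires associating each detection in the current frame with an existing track. A minimal, hypothetical sketch of such an association step is a nearest-neighbour match on target centroids with a maximum displacement gate; the real TALS tracker is not public, so the function and parameter names below are assumptions.

```python
def track_targets(prev, curr, max_dist=20.0):
    """Greedy nearest-neighbour association of target centroids.

    prev, curr: lists of (x, y) centroids from the previous and current frame.
    Returns a list of (prev_index, curr_index) matches; a previous target with
    no current centroid within max_dist pixels is left unmatched (lost track).
    """
    matches = []
    used = set()
    for i, (px, py) in enumerate(prev):
        best, best_d = None, max_dist
        for j, (cx, cy) in enumerate(curr):
            if j in used:
                continue
            d = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
            used.add(best)
    return matches

# two targets move slightly between frames; detection order changes
prev = [(10, 10), (50, 50)]
curr = [(52, 48), (12, 11)]
print(track_targets(prev, curr))
```

The gate (`max_dist`) would be chosen from the UAV's dynamics and frame rate, so that a target cannot plausibly move farther between consecutive frames.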