Platform for visual non-GNSS navigation

Abstract

Affordable commercial off-the-shelf drone platforms have become powerful enablers for advanced applications across industries. These solutions require accurate and error-tolerant location information and cannot rely solely on the availability of a GNSS position. The limitations of relying only on GNSS include disturbances in the GNSS signal, the limited precision of the GNSS position available to a drone, and the possibility of the signal being manipulated by an adversary. Huld is developing a computer vision solution that enables real-time georeferencing of observable terrain in unmanned aerial vehicles (UAVs) using a single camera sensor. The developed solution combines modern deep learning and edge computing with classical remote sensing to enable autonomous, spatially aware flying platforms. In addition to enabling accurate autonomous navigation based on visual sensory input, this technology allows precise localisation of arbitrary objects in the UAV's field of view. The video feed or images taken by a UAV can be georeferenced against a reference map originating from satellite imagery, airborne image acquisition, or even a pre-recorded flight video.
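
To illustrate the general idea of georeferencing a camera frame against a reference map, the sketch below shows a minimal, purely classical pipeline: local features are matched between a UAV frame and a georeferenced orthophoto, a homography is estimated, and any frame pixel is mapped to geographic coordinates through the map's geotransform. This is not the deep-learning approach described above; the file names, the geotransform values, and the `georeference` helper are hypothetical placeholders for illustration only.

```python
# Minimal classical sketch of image-to-map georeferencing (ORB + homography).
# The actual solution described in the abstract uses deep learning; this is
# only an illustration of the underlying geometric idea. All inputs here are
# hypothetical placeholders.

import cv2
import numpy as np

# Hypothetical inputs: one UAV camera frame and a georeferenced reference map
# (e.g. a satellite orthophoto) with a known affine geotransform
# (pixel -> longitude/latitude), in GDAL-style ordering.
frame = cv2.imread("uav_frame.png", cv2.IMREAD_GRAYSCALE)
ref_map = cv2.imread("reference_map.png", cv2.IMREAD_GRAYSCALE)
geotransform = (24.90, 1e-5, 0.0, 60.20, 0.0, -1e-5)  # placeholder values

# Detect and match local features between the frame and the reference map.
orb = cv2.ORB_create(nfeatures=4000)
kp_f, des_f = orb.detectAndCompute(frame, None)
kp_r, des_r = orb.detectAndCompute(ref_map, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)[:500]

# Estimate a homography that maps frame pixels onto reference-map pixels.
src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

def georeference(pixel_xy):
    """Map a pixel in the UAV frame to geographic coordinates."""
    px = cv2.perspectiveTransform(np.float32([[pixel_xy]]), H)[0, 0]
    ox, dx, rx, oy, ry, dy = geotransform
    lon = ox + px[0] * dx + px[1] * rx  # map pixel -> longitude
    lat = oy + px[0] * ry + px[1] * dy  # map pixel -> latitude
    return lon, lat

# Example: georeference the frame centre, or any detected object
# in the UAV's field of view.
h, w = frame.shape
print("frame centre ->", georeference((w / 2, h / 2)))
```

The same mapping applies to any pixel of interest, which is how arbitrary objects detected in the UAV's field of view can be located once the frame-to-map transform is known.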
