
Complementing Sensors for Navigation in Urban Canyons



Unmanned Aircraft System Navigation in the Urban Environment: A Systems Analysis
Journal of Aerospace Information Systems

            This article from the Journal of Aerospace Information Systems analyzes alternative methods for Unmanned Aircraft System (UAS) navigation within urban environments.  Global Positioning System (GPS) accuracy is severely degraded in urban canyons, areas flanked by tall buildings.  Although Global Navigation Satellite System (GNSS) positioning is unreliable in the vicinity of dense urban structures, it can be combined with other complementary sensors to provide position and velocity measurements.

            Urban UAS missions related to law enforcement, traffic surveillance, riot control, and anti-terrorism are all challenged by physical obstacles and by communication and navigation issues.  There is a need for a more accurate mode of navigation in GPS-denied areas.  Navigation of a UAS is accomplished through multiple exteroceptive sensors that gather various types of information about the environment.  Navigation can also be considered a proprioceptive concept because the unmanned system becomes aware of its own relative position in space.

GPS and IMU

            An Inertial Measurement Unit (IMU) consisting of gyroscopes, accelerometers, and a magnetometer provides measurements of angular velocity, gravity, and magnetic north vectors (Rufa, 2016).  An advantage of the IMU for navigation is that its measurements are based on inertial accelerations and are not affected by urban structures.  The disadvantage of the IMU is that its position estimate drifts over time.  However, when GPS and IMU are combined, UAS position, velocity, and attitude can be estimated more accurately.  To get an idea of how inaccurate GPS alone is in an urban environment, a study was done in Hong Kong; when GPS was available, the accuracy was worse than 20 meters for 40% of the points and worse than 100 meters for 9% of the points.
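            As a rough illustration of how the two sensors complement each other, the sketch below fuses a drifting, IMU-integrated position with infrequent, noisy GPS fixes using a simple complementary-filter blend.  The update rates, noise levels, and blending gain are assumptions for the example, not values from the article.

```python
# Minimal 1-D sketch of GPS/IMU fusion with a complementary blend.
# Rates, noise levels, and the blending gain are illustrative assumptions.
import random

dt = 0.01        # IMU sample period, 100 Hz (assumed)
alpha = 0.05     # blend gain toward the GPS fix (assumed)

true_pos, true_vel = 0.0, 2.0   # constant-velocity truth for the demo
est_pos, est_vel = 0.0, 2.0

for k in range(1, 1001):        # 10 seconds of IMU samples
    # IMU path: integrating noisy acceleration lets the position estimate drift.
    accel_meas = random.gauss(0.0, 0.2)
    est_vel += accel_meas * dt
    est_pos += est_vel * dt
    true_pos += true_vel * dt

    # GPS path: a 1 Hz absolute fix (with several meters of urban error) bounds the drift.
    if k % 100 == 0:
        gps_meas = true_pos + random.gauss(0.0, 5.0)
        est_pos = (1.0 - alpha) * est_pos + alpha * gps_meas

print(f"truth {true_pos:.1f} m, fused estimate {est_pos:.1f} m")
```

            The IMU alone accumulates error without bound, while each GPS fix pulls the estimate back toward an absolute position; this is the basic reason the combination outperforms either sensor on its own.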

Computer Vision

            Computer vision for navigation is a method that is still being actively researched.  In a well-lit environment, computer vision provides information to a filter that can generate position, airspeed, and attitude measurements.  Unfortunately, these sensors are limited to high-contrast, well-lit environments.  How does this work?  How can computer vision provide location information for a UAS?
            “Optical flow is defined as the distribution of apparent velocities of brightness pattern in an image” (Rufa, 2016).  Optical flow is calculated by “comparing pixels in sequential images to determine the local velocity,” which in turn gives the velocity of the UAS camera that is capturing the images.
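            A rough sketch of this idea, assuming OpenCV and two consecutive grayscale frames from a downward-looking camera, is shown below.  Converting pixel motion into a metric velocity additionally requires the camera focal length, the height above the scene, and the frame rate; those values, like the frame file names, are placeholder assumptions here.

```python
# Sketch: estimate apparent image motion between two frames with dense optical flow,
# then convert the mean pixel displacement to an approximate ground-relative velocity.
# Frame files, focal length, altitude, and frame rate are placeholder assumptions.
import cv2
import numpy as np

prev = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
curr = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

# Dense Farneback optical flow: per-pixel (dx, dy) displacement in pixels.
flow = cv2.calcOpticalFlowFarneback(prev, curr,
                                    None, 0.5, 3, 15, 3, 5, 1.2, 0)

mean_dx = float(np.mean(flow[..., 0]))   # average horizontal shift, pixels/frame
mean_dy = float(np.mean(flow[..., 1]))   # average vertical shift, pixels/frame

# Assumed camera/flight parameters for a nadir-pointing camera.
focal_px = 800.0     # focal length in pixels (assumed)
altitude_m = 30.0    # height above ground in meters (assumed)
fps = 30.0           # camera frame rate (assumed)

# Pinhole approximation: ground displacement ~ pixel displacement * altitude / focal length.
vx = mean_dx * altitude_m / focal_px * fps
vy = mean_dy * altitude_m / focal_px * fps
print(f"approximate ground velocity: {vx:.2f} m/s, {vy:.2f} m/s")
```

            The dependence on brightness patterns is also why the method degrades in low-light or low-contrast scenes: with little texture to track between frames, the flow field becomes unreliable.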

Air Data

            Air data from a static- and dynamic-pressure port system generates airspeed and altitude measurements, much as on a passenger aircraft.  An advantage of an air data system on a UAS in an urban environment is that it provides navigation data independent of the other sensors, but it will not always be reliable amid the shifting winds and gusts that occur in the vicinity of urban structures.
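            The conversion from pressure readings to airspeed and pressure altitude is standard.  The sketch below applies the incompressible-flow airspeed relation and the standard-atmosphere altitude formula to example port readings; the pressure values are assumptions for illustration, not data from the article.

```python
# Sketch: indicated airspeed from dynamic pressure and pressure altitude from static pressure.
# The example pressure readings are assumptions for illustration.
import math

RHO0 = 1.225       # sea-level air density, kg/m^3
P0 = 101325.0      # sea-level standard pressure, Pa
T0 = 288.15        # sea-level standard temperature, K
L = 0.0065         # temperature lapse rate, K/m
G = 9.80665        # gravity, m/s^2
R = 287.053        # specific gas constant for air, J/(kg*K)

def indicated_airspeed(q_dyn_pa):
    """Incompressible relation: q = 0.5 * rho0 * V^2  ->  V = sqrt(2q / rho0)."""
    return math.sqrt(2.0 * q_dyn_pa / RHO0)

def pressure_altitude(p_static_pa):
    """Invert the standard-atmosphere barometric formula for the troposphere."""
    return (T0 / L) * (1.0 - (p_static_pa / P0) ** (R * L / G))

# Example port readings (assumed): 150 Pa dynamic, 99,000 Pa static.
print(f"airspeed ~ {indicated_airspeed(150.0):.1f} m/s")
print(f"pressure altitude ~ {pressure_altitude(99000.0):.0f} m")
```

            Gusts and building-induced turbulence corrupt the dynamic-pressure term directly, which is why the airspeed output is the measurement most affected in an urban canyon.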
 
Long-Term Evolution (LTE)

            Long-Term Evolution (LTE) can provide location data from a cellular network that can increase navigation accuracy over urban terrain.  The FCC’s Enhanced 911 requirements call for cellular carriers to locate a phone to within 300 meters, and in practice locations used for geotagging can be as accurate as 3 to 31 meters.  LTE positioning comprises three techniques: enhanced cell identification (E-CID), observed time difference of arrival (OTDOA), and assisted global navigation satellite systems (A-GNSS).  The network-based methods locate a receiver independently of GPS and are an active area of research for urban navigation.
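            To illustrate the OTDOA idea, the sketch below solves for a 2-D position from time-difference-of-arrival measurements against several base stations using a simple least-squares iteration.  The station coordinates, the true position, and the timing noise are invented for the example and are not from the article.

```python
# Sketch: 2-D OTDOA-style positioning by least squares over range differences.
# Base-station coordinates, the true position, and the timing noise are assumptions.
import numpy as np

C = 299_792_458.0                      # speed of light, m/s

stations = np.array([[0.0, 0.0],       # reference cell
                     [1200.0, 0.0],
                     [0.0, 1500.0],
                     [1300.0, 1400.0]])
true_pos = np.array([400.0, 700.0])

# Simulated TDOA of each neighbor cell relative to the reference cell,
# with ~30 ns of timing noise (assumed).
rng = np.random.default_rng(0)
ranges = np.linalg.norm(stations - true_pos, axis=1)
tdoa = (ranges[1:] - ranges[0]) / C + rng.normal(0.0, 30e-9, size=3)

# Gauss-Newton iteration on the range-difference residuals.
est = np.array([600.0, 600.0])         # initial guess
for _ in range(20):
    d = np.linalg.norm(stations - est, axis=1)
    residuals = (d[1:] - d[0]) - C * tdoa
    # Jacobian of each range difference with respect to the position estimate.
    unit = (est - stations) / d[:, None]
    J = unit[1:] - unit[0]
    step, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
    est = est + step

print(f"true position {true_pos}, OTDOA estimate {np.round(est, 1)}")
```

            Because timing error maps directly into range error (30 ns corresponds to roughly 9 meters), the achievable accuracy depends heavily on network synchronization and on multipath in the urban canyon.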

            In this study, the different navigation sensor techniques were characterized and evaluated in a simulated operation.  Vision provided an airspeed measurement, but it needed to be augmented with an IMU or another inertial position measurement.  LTE alone was not sufficient for horizontal position and experienced a 4-second delay; if that delay were substantially decreased, accuracy might improve.  The researchers will continue to explore LTE as a means of navigation.  This article was informative on the types of sensors used by a UAS navigating in an urban environment and analyzed how sensors complement one another to yield an accurate measurement of location and movement.

References:

Rufa, J. R., & Atkins, E. M. (2016). Unmanned aircraft system navigation in the urban environment: A systems analysis. Journal of Aerospace Information Systems, 13(4), 143-160. doi:10.2514/1.I010280
