Tuesday 1 January 2019

UAS and Manned Aircraft Autonomy

Assignment 6.5 - UAS and Manned Aircraft Autonomy

Generally speaking, there are three main levels of autonomy: low, medium, and high.  At the low level the system is rarely involved, because the human in the loop retains overall situational awareness and control.  This would be akin to a UAS flown in FPV mode by a hobbyist racer.  The medium level of autonomy sees the computer providing roughly half of the input and flight control over the UAS.  This would be akin to a small UAS with features such as an autopilot that lowers workload and makes it easy for the operator to maneuver the aircraft the way they want (e.g. if altitude is locked, the operator only controls yaw and turns rather than pitch).  The last level is high autonomy, in which human involvement is minimal while the computer provides almost all of the control and decision making.  A pre-programmed flight plan for a large UAV with a dedicated GCS is an example of this: all flight destinations are pre-programmed and executed automatically as waypoint navigation, and the human only becomes involved by explicitly interrupting the system to take back the flight controls.
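
To make the split between the levels concrete, here is a minimal, purely illustrative Python sketch of how control authority might shift across the three levels.  The AutonomyLevel names, the command dictionary layout, and the 0.05 gain are my own assumptions for illustration, not from any real flight stack.

    from enum import Enum

    class AutonomyLevel(Enum):
        LOW = 1     # FPV racing: pilot commands pass straight through
        MEDIUM = 2  # altitude hold: autopilot owns pitch, pilot keeps roll/yaw
        HIGH = 3    # waypoint navigation: autopilot owns everything unless interrupted

    def pitch_for_altitude_hold(target_alt_m, current_alt_m, gain=0.05):
        """Proportional term that nudges pitch toward the locked altitude."""
        return gain * (target_alt_m - current_alt_m)

    def mix_controls(level, pilot_cmd, autopilot_cmd, pilot_interrupt=False):
        """Return the command set actually sent to the flight controller."""
        if level is AutonomyLevel.LOW or pilot_interrupt:
            return pilot_cmd                          # human keeps full authority
        if level is AutonomyLevel.MEDIUM:
            return {"pitch": autopilot_cmd["pitch"],  # computer holds altitude
                    "throttle": autopilot_cmd["throttle"],
                    "roll": pilot_cmd["roll"],        # human still steers
                    "yaw": pilot_cmd["yaw"]}
        return autopilot_cmd                          # HIGH: the pre-programmed plan flies itself

The point of the sketch is simply that the same control loop exists at every level; what changes is how much of each command comes from the human versus the computer, and that a pilot interrupt always restores full manual authority.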

In my opinion, the main difference to consider on the subject of automation for manned versus unmanned operations is the ability to build and maintain situational awareness of the mission.  In manned aircraft operations you have a crew with you providing constant feedback, input, and queries so that everyone arrives at the same information and decisions.  In an unmanned operation, assuming you are the only person in the loop operating the aircraft, you are left to your own senses for the mission at hand.  You do not have the luxury of calling on your first officer to confirm suspicions about a navigation issue or to troubleshoot an in-flight emergency.  This is where I believe automation can complement the unmanned operator by providing intelligence through updates and adaptive feedback.  Even something as simple as a pre-flight walkaround may be substantially different: in manned operations you have at least two or three sets of eyes to ensure the aircraft is airworthy, whereas in unmanned operations you may have only one set of eyes, with the rest relying on automated indications to confirm all systems are sound for operation.  In a remote setting, the launch and recovery team would perform this checkout, while the GCS operator at a remote site will not even have the opportunity to conduct such a check.  Automation is what can fill this void and give the remote unmanned operator some level of situational awareness.
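
As a rough illustration of the kind of automated indication I have in mind, the following Python sketch is a hypothetical go/no-go self-check a GCS might run on behalf of a single remote operator.  The telemetry field names and thresholds are assumptions for illustration only, not any real UAS interface.

    PREFLIGHT_CHECKS = [
        ("battery_pct",      lambda v: v >= 95,   "battery below 95%"),
        ("gps_satellites",   lambda v: v >= 8,    "fewer than 8 GPS satellites"),
        ("link_quality_pct", lambda v: v >= 90,   "weak command/control link"),
        ("imu_calibrated",   lambda v: v is True, "IMU not calibrated"),
    ]

    def run_preflight(telemetry):
        """Return (go, failures) so the operator gets one go/no-go indication."""
        failures = []
        for key, ok, message in PREFLIGHT_CHECKS:
            value = telemetry.get(key)
            if value is None or not ok(value):
                failures.append(message)
        return (len(failures) == 0, failures)

    go, problems = run_preflight({"battery_pct": 97, "gps_satellites": 11,
                                  "link_quality_pct": 88, "imu_calibrated": True})
    # go is False; problems == ["weak command/control link"]

A check like this does not replace the walkaround, but it gives the remote operator the equivalent of a second set of eyes over the systems they cannot physically inspect.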

I believe that the current industry underutilizes automation.  Looking at the NASA and FAA NextGen initiatives, we are only seeing the tip of what is to come (Obringer, 2017).  With innovations such as ADS-B and digital datalinks providing real-time position and flight data, manned and unmanned aircraft operations will be easier to integrate as overall demand for aviation increases over the next few years.  With more precise navigational information, more aircraft will be able to fly in a denser grid while maintaining safety buffer distances and proper sense-and-avoid protocols.  Communication among air traffic controllers, pilots, and unmanned operators will be handled via datalinks that move beyond archaic voice communications.  Air transport will be transformed over the next few years as these technologies and levels of automation migrate into service and push our achievements beyond the status quo!
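
To show how ADS-B position reports could feed an automated separation check, here is a hypothetical Python sketch.  The 500 m horizontal and 100 m vertical buffers are made-up illustrative values, not regulatory separation minima, and the message fields are assumed rather than taken from a real decoder.

    from math import radians, sin, cos, asin, sqrt

    def horizontal_distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points (haversine formula)."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))  # Earth radius ~6,371 km

    def separation_alerts(ownship, traffic, h_buffer_m=500, v_buffer_m=100):
        """Flag every ADS-B target that intrudes on both the horizontal and vertical buffer."""
        conflicts = []
        for target in traffic:
            h = horizontal_distance_m(ownship["lat"], ownship["lon"], target["lat"], target["lon"])
            v = abs(ownship["alt_m"] - target["alt_m"])
            if h < h_buffer_m and v < v_buffer_m:
                conflicts.append(target["icao"])
        return conflicts

    # Example: one intruder roughly 300 m away at nearly the same altitude triggers an alert.
    alerts = separation_alerts({"lat": 43.65, "lon": -79.38, "alt_m": 120},
                               [{"icao": "C01ABC", "lat": 43.6527, "lon": -79.38, "alt_m": 110}])

Even a simple geometric check like this, running continuously on broadcast traffic data, is the kind of automation that lets denser operations keep their safety buffers without adding voice traffic.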

Reference


Obringer, L. (2017, August 6). Autonomous systems. NASA. Retrieved from https://www.nasa.gov/feature/autonomous-systems#adsb
