Monday 18 July 2016

Sense and Avoid Sensor Selection

Hi Everyone!

For this week's research assignment, I chose the DJI Phantom 4 and will discuss its sensor suite for collision avoidance.

I chose this small UAV because it is considered the leader in commercial UAVs for aerial photography and video.  Its flagship model, the Phantom 4, was released for the mass market in spring 2016; it offers modest improvements in speed and battery life over its predecessor, but its main selling point is a novel sense and avoid system that separates it from rival small UAVs.

DJI Phantom 4
The DJI Phantom 4 costs approximately $1,399 USD.  Its take-off weight is 1,380 g, it travels at a maximum speed of 20 m/s, operates at a maximum ceiling of 6,000 metres, and can fly for up to 28 minutes.  The Phantom 4 is powered by a single 4S LiPo battery with 5,350 mAh capacity that takes approximately one hour for a full charge at a maximum charging power of 100 W.
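
As a quick sanity check on that one-hour figure, here is a small back-of-the-envelope calculation.  The nominal 4S pack voltage (15.2 V) and the 90% charger efficiency are my own assumptions for illustration, not published DJI figures.

```python
# Rough charge-time estimate for the Phantom 4 battery (assumed values noted below).
capacity_ah = 5.350        # 5350 mAh pack capacity (from the spec above)
nominal_voltage = 15.2     # assumed nominal voltage of a 4S LiPo pack (3.8 V/cell)
charger_power_w = 100.0    # maximum charger output quoted above
charger_efficiency = 0.90  # assumed efficiency; real chargers lose some power as heat

pack_energy_wh = capacity_ah * nominal_voltage          # ~81 Wh stored in the pack
charge_time_h = pack_energy_wh / (charger_power_w * charger_efficiency)

print(f"Pack energy: {pack_energy_wh:.1f} Wh")
print(f"Estimated full charge time: {charge_time_h:.2f} h")  # ~0.9 h, consistent with ~1 hour
```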

Dual Under-Carriage Sensors
The sense and avoid system has a range of up to 50 metres and a 60-degree viewing angle. It has two forward-facing optical cameras, plus two sonar sensors and optical cameras underneath for ground detection.  The minimum separation distance from obstacles is programmed at around 2 metres, so any intentional or accidental collision course would be prevented.  The main limitation of this small UAV's sensor system is the lack of overhead and rear-facing sensors.

Dual Forward Facing Sensors

The technical process that governs the entire sense and avoid system is centred on its flight controller, which acts as the brain, computing and analyzing all of the inputs gathered from its front and under-carriage sensors.  The algorithm filters out and invalidates erroneous data to determine obstacles, then cross-references them with positional data gathered by its proprioceptive sensors, such as its dual IMUs for inertial state and its barometer for altitude.  The GPS also determines its position relative to its environment, and the flight controller outputs the requisite commands to coordinate adjustments to its four motors, resulting in a mid-air side-step around any potential obstacle.
Phantom 4 Sense and Avoid Path
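
To illustrate the kind of decision the flight controller has to make, here is a minimal sketch of a forward obstacle check.  DJI has not published its algorithm, so the function name, braking logic, and deceleration value below are purely my own illustrative assumptions; only the 50 m sensing range and ~2 m separation come from the specs above.

```python
# Illustrative sketch only: a simplified forward sense-and-avoid check.
SENSOR_RANGE_M = 50.0      # forward stereo cameras' stated detection range
MIN_SEPARATION_M = 2.0     # programmed minimum separation distance

def avoid_command(obstacle_distance_m, closing_speed_ms, max_decel_ms2=4.0):
    """Return a simple command based on distance to the nearest forward obstacle."""
    if obstacle_distance_m is None or obstacle_distance_m > SENSOR_RANGE_M:
        return "continue"                      # nothing detected in range
    # Distance needed to stop before the 2 m separation bubble (assumed decel value).
    stopping_distance = (closing_speed_ms ** 2) / (2.0 * max_decel_ms2)
    if obstacle_distance_m - MIN_SEPARATION_M <= stopping_distance:
        return "brake_and_hover"               # cannot guarantee separation: stop
    return "plan_lateral_sidestep"             # enough margin to route around it

print(avoid_command(obstacle_distance_m=12.0, closing_speed_ms=10.0))  # brake_and_hover
print(avoid_command(obstacle_distance_m=45.0, closing_speed_ms=5.0))   # plan_lateral_sidestep
```
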
Although expensive, this compact sUAS is the leader among aerial photography drones currently available on the market.  It also leads in sense and avoid technology and boasts a follow-me function without the operator having to carry a transponder device.  It is a great asset whose collision avoidance technology could be leveraged for other unmanned systems.

Monday 11 July 2016

6.5 - Control Station Analysis

Hi Everyone,

For this week's assignment, I chose the unmanned underwater vehicle called the Crabster CR200.  It is a remotely operated vessel inspired by the natural design of lobsters and crabs, with the primary objective of being able to steady itself for deep seafloor explorations.  Past vessels that sought to explore the ocean floor were ill-equipped to brace against the harsh undercurrents and turbulent deep-water tides; out of this necessity, the Crabster CR200 was born.
Crabster CR200

The remote control unit that operates the Crabster is housed at surface level within a 20 ft sea container.  The container holds equipment for a team of four operators: a pilot, co-pilot, navigator, and sonar/sensor operator - it almost looks like an aircraft cockpit!
Multi-Program Driven

The sea container houses seven computers, nine LCD monitors, four joysticks, and a power supply.  The hardware runs on an Intel quad-core i7 CPU with 8 GB of main memory and a Gigabit Ethernet interface.  The agent program is Linux-based, but there is also commercial off-the-shelf (COTS) software in use, such as the Navigation Program and Video Program, which are all integrated via the Agent Program.
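
Kim et al. (2013) describe the Agent Program as the glue between the separate operator programs.  The sketch below is only my own guess at what such a message-routing layer could look like; the class and message names are invented for illustration and are not taken from the CR200 software.

```python
# Hypothetical sketch of an "agent" layer that routes messages between
# separate control-station programs (navigation display, 3D viewer, etc.).
from collections import defaultdict

class AgentBus:
    """Minimal publish/subscribe hub between control-station programs."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers[topic]:
            handler(message)

bus = AgentBus()
# The navigation program listens for vehicle state; so does the 3D viewer.
bus.subscribe("vehicle_state", lambda msg: print("NAV display update:", msg))
bus.subscribe("vehicle_state", lambda msg: print("3D viewer update:", msg))

# A sensor interface publishes the latest state once per cycle.
bus.publish("vehicle_state", {"heading_deg": 172.0, "depth_m": 184.5, "pitch_deg": -1.2})
```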

The data presentation of the Crabster consists mostly of traditional 2D views and live video from the pilot's perspective.  However, the pilot video does have an interface similar to the heads-up display found on most modern aircraft (showing tide/current speed, pitch/roll/heading, and speed), which, from a human factors perspective, helps the operator assimilate information much more easily.  There is also a 3D view of the Crabster itself, made possible by the on-board sensors relaying positional and other environmental state information back to the remote control station, which is then rendered through the 3D viewer program to create a dashboard view of all the vessel's critical information.
3D Dashboard View

Pilot HUD view
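
For a sense of what "relaying state information back to the station" might look like in practice, here is a tiny sketch of the kind of state record that could drive the HUD overlay described above.  The field names, units, and formatting are my own assumptions, not the CR200's actual telemetry format.

```python
# Illustrative only: a vessel-state record and the HUD fields the post describes.
from dataclasses import dataclass

@dataclass
class CrabsterState:
    heading_deg: float
    pitch_deg: float
    roll_deg: float
    speed_ms: float
    current_speed_ms: float   # local tide/current speed
    depth_m: float

def hud_line(s: CrabsterState) -> str:
    """Format the fields the pilot HUD is described as showing."""
    return (f"HDG {s.heading_deg:05.1f}  PIT {s.pitch_deg:+05.1f}  ROL {s.roll_deg:+05.1f}  "
            f"SPD {s.speed_ms:.1f} m/s  CUR {s.current_speed_ms:.1f} m/s  DEP {s.depth_m:.1f} m")

print(hud_line(CrabsterState(312.4, -2.1, 0.8, 0.4, 1.3, 187.6)))
```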

For future improvements, the portability of the control station needs to be investigated to ensure its footprint can be made more compact.  It is currently cumbersome to transport and requires lifts and transportation capable of moving a sea container.  That said, I believe the four-person team required to operate the Crabster CR200 is a good design choice, as it allows each specialist to focus on their task at hand to optimize results and share the overall workload of such a complex unmanned underwater vehicle.
Remote Control Station

Hope you enjoyed my research for this week - cheers!




References
IEEE Spectrum. (2014, July 30). Huge Six-Legged Robot Crabster Goes Swimming. Retrieved from IEEE Spectrum: http://spectrum.ieee.org/automaton/robotics/industrial-robots/six-legged-underwater-robot-crabster
Kim, B., Shim, H., Yoo, S.-Y., Jun, B.-H., Park, S.-W., & Lee, P.-M. (2013). Operating Software for a Multi-legged Subsea Robot CR200. Retrieved from IEEE Xplore: http://ieeexplore.ieee.org.ezproxy.libproxy.db.erau.edu/stamp/stamp.jsp?tp=&arnumber=6608151
Sea Technology Magazine. (2013, October 13). Hexapod Robot Crabster CR200 for High Tide, Turbid Water Exploration. Retrieved from Sea Technology Magazine: http://www.sea-technology.com/features/2013/1013/6.php
 

Tuesday 28 June 2016

Unmanned System Data Protocol - ScanEagle

Hi Everyone,

For this week's assignment, I had to hand it in late due to the airshow that required me to work all weekend.  A video of the weekend's events can be found here for your leisure.

My candidate of choice was the Boeing/Insitu designed ScanEagle.

The ScanEagle is a small, long-endurance UAV with a sensor payload for ISR missions.

It has a max speed of 126 km/h and a max altitude of 16,000 ft.  The nose turret of the ScanEagle houses its sensors, which consist of an electro-optical and an infrared sensor providing detailed imagery and video of up to 640 x 480 pixels.  The sensor payload can also be customized with other devices such as a CBRN sensor or a magnetometer.  A recent study of maritime fauna even installed a DSLR camera as the payload, demonstrating the modular and interoperable design of the ScanEagle.  From this study, I also inferred that commercially available SD cards are compatible data storage devices.



The onboard video is recorded in NTSC format, which is also convertible to the MPEG-2 file format.  The Local Data Sets are encoded in ISO7 format, and the metadata is transmitted via a UHF 900 MHz datalink and a 2.4 GHz S-band downlink to the ground control station at line-of-sight ranges.  For BLOS ranges, the Iridium satellite system is a viable alternative for transferring data up- and downstream to the ground control station.
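
To make the idea of "metadata travelling alongside the video" concrete, here is a conceptual sketch of pairing a telemetry record with the video stream and choosing a link by range.  The field names, the JSON encoding, and the line-of-sight limit are my own assumptions, not Insitu's actual format or thresholds.

```python
# Conceptual sketch only: metadata record + simple downlink selection.
import json, time

def build_metadata(lat, lon, alt_ft, heading_deg):
    return {
        "timestamp_utc": time.time(),
        "platform": "ScanEagle",
        "lat": lat, "lon": lon,
        "alt_ft": alt_ft,
        "heading_deg": heading_deg,
    }

def select_downlink(range_km, los_limit_km=100.0):
    """Pick a link: S-band/UHF within line of sight, satellite beyond it (assumed limit)."""
    return "s_band_2_4ghz" if range_km <= los_limit_km else "iridium_satcom"

frame_metadata = build_metadata(lat=34.61, lon=-117.38, alt_ft=12000, heading_deg=270)
packet = json.dumps(frame_metadata).encode("utf-8")   # metadata bytes sent alongside the video stream
print(select_downlink(range_km=65.0), len(packet), "bytes of metadata")
```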

The architecture used for the datalink is a common centralized concept, in which only one ScanEagle at a time transmits and receives information with the ground control station.

A future recommendation, following Lim's 2007 research at the Naval Postgraduate School, would be to leverage wireless cards to integrate a mesh architecture.  I also believe that a 4G-LTE datalink capability would advance the ScanEagle into the next generation.  A combination of these two recommendations would give the ScanEagle system as a whole (qty 4 UAVs, GCS, launcher, and recovery system) expanded ISR coverage and range, since the backbone/clusterhead UAV would act as a rearguard, relaying information between the swarm/mesh and the ground control station.
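
Here is a minimal sketch of the clusterhead idea: one UAV relays reports from the others to the ground control station.  The topology, class, and node names are assumptions I made for illustration; they are not the architecture Lim (2007) actually implemented.

```python
# Illustrative two-tier mesh: UAVs relay to the GCS via a clusterhead when needed.
class Node:
    def __init__(self, name, neighbors=None):
        self.name = name
        self.neighbors = neighbors or []

def route_to_gcs(source, clusterhead, gcs):
    """Return the hop sequence a report would take in a simple two-tier mesh."""
    if gcs.name in [n.name for n in source.neighbors]:
        return [source.name, gcs.name]                      # direct link available
    return [source.name, clusterhead.name, gcs.name]        # otherwise relay via clusterhead

gcs = Node("GCS")
clusterhead = Node("UAV-1", neighbors=[gcs])
uav2 = Node("UAV-2", neighbors=[clusterhead])
uav3 = Node("UAV-3", neighbors=[clusterhead])

for uav in (uav2, uav3, clusterhead):
    print(" -> ".join(route_to_gcs(uav, clusterhead, gcs)))
```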

 
ScanEagle launched from the Mark-4 System

Monday 20 June 2016

UAS Sensor Placement for Aerial Drone and FPV Racer Drone

Hi Everyone,

For this week's assignment, I have selected the DJI Inspire 1 as my candidate aerial quadcopter and the Hubsan X4 quadcopter as my FPV racer.

I will first talk about the DJI Inspire 1.

I selected the DJI Inspire 1 mainly because it was an industry leader in introducing 4K video and photography capture.  The DJI Inspire 1 is also very unique in that it allows dual operators to control the flight operations independently of the video or photography capture.  This allows each operator to focus on the quality of their respective scope of responsibilities to ensure a quality product is produced.  Researching this system also led me to think about how it could be a good candidate for my development research project, which deals with aircraft structures and maintenance inspections and how UAVs can complement and speed up the process.

The DJI Inspire 1 also provides a very interesting navigation concept which is independent of GPS signals. This would be useful in an indoor environment where a GPS signal may not be available.  The DJI Inspire 1 uses a vision positioning system that relies on two sonar sensors and one monocular camera to detect the ground surface below it; this serves as an exteroceptive sensor that helps stabilize its vertical distance in relation to the ground.  This would be very useful for low-level flight indoors.
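
DJI has not published how it fuses those sensors, so the sketch below is just one plausible approach: a complementary-filter style blend of the two downward sonar readings with a camera-derived height change.  The function name, gain, and example numbers are all my own assumptions.

```python
# Minimal illustrative sketch: blending sonar readings with a camera-based
# estimate to hold height above the ground (not DJI's actual algorithm).
def fuse_height(sonar_1_m, sonar_2_m, camera_delta_m, previous_height_m, alpha=0.8):
    """Complementary-filter style blend of sonar height and camera-derived change."""
    sonar_height = (sonar_1_m + sonar_2_m) / 2.0          # average the two sonar returns
    predicted = previous_height_m + camera_delta_m        # propagate with the camera's estimate
    return alpha * sonar_height + (1.0 - alpha) * predicted

height = 1.50
for sonar_a, sonar_b, cam_delta in [(1.52, 1.48, 0.00), (1.60, 1.58, 0.07), (1.55, 1.57, -0.03)]:
    height = fuse_height(sonar_a, sonar_b, cam_delta, height)
    print(f"estimated height above ground: {height:.2f} m")
```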

The DJI Inspire 1 is designed with high-quality carbon fibre to reduce its overall weight and has impressive specifications such as a maximum speed of almost 50 mph. It is also robust enough to sustain operations in moderate to high winds.  The main downfall of its system design is the battery life, which permits operations of only up to 18 minutes.  At a total cost of about $3,000, it also targets only the enthusiast and professional market segments.

All in all, the DJI Inspire 1 is a high-quality, high-end product that will cater to those who need the 4K video and photography capability, which would be ideal for someone who may be considering incorporating this system into, let's say, aircraft maintenance inspections via UAVs.

DJI Inspire 1 Quadcopter

The second quadcopter that I would like to discuss is the Hubsan X4 H107D FPV Racer. I chose this for its low cost of entry and because, given my inexperience in the FPV racing environment, the drone would undoubtedly come to a catastrophic end.  This FPV racer does not boast any remarkable sensors like the DJI Inspire 1; rather, it simply gets the job done with a small frame and 640 x 480 resolution video to help the operator navigate through their FPV goggles.  The small body and no-frills design gets the job done via its frontal sensor, and as long as you are not expecting to use this as a video or photography capture device, your expectations will be met by this entry-level candidate into the world of FPV drone racing.

The battery life is quite impressive, rated at approximately 30 minutes of flight time.  The video range is limited to 100 ft, which is sufficient for most on-site FPV racing meetups, and given its inability to withstand much in the way of weather, indoor environments are recommended.  The total upfront cost is under $200, which would entice a very broad range of enthusiasts willing to modify their assemblies, as well as beginners who want a quadcopter that is "ready to fly" straight out of the box. After researching this hobby, it reminded me of the Star Wars chase scene through the woods and has definitely piqued my interest to get my feet wet in this sport!

Hubsan X4 H107D FPV Racer


Monday 13 June 2016

Unmanned System for Maritime Search and Rescue - SARbot by SeaBotix

Assignment 2.5 - Unmanned Systems Maritime Search and Rescue


Hi Everyone,

For this assignment, I came across a remotely operated vehicle that was tried, tested, and true during the search and rescue / salvage missions conducted in Japan after the 2011 tsunami disaster (Murphy, et al., 2012).

The unmanned ROV that I selected is called the SARbot, made by SeaBotix.  It boasts proprioceptive and exteroceptive sensors to accomplish its mission mandate as a complementary tool for operators to conduct timely rescue and recovery operations.

SARbot by SeaBotix

For proprioceptive sensors (Carey, 2011), it is equipped with a standard GPS whose output can be overlaid onto existing mapping software such as Google Maps.  It also has more unique features such as an Ultra-Short Baseline (USBL) system, which acts as an underwater GPS and allows the surface platform/ground control station to track the SARbot's precise location underwater via responder/transponder modules.  There is also an emergency locator beacon to help detect the location of the SARbot should its connection or physical tether be severed from the ground station.
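
For readers unfamiliar with USBL, the basic idea is that the surface array measures the acoustic two-way travel time (giving slant range) and the arrival angle of the reply (giving bearing and depression), from which a relative position falls out.  The sketch below illustrates that geometry; the sound speed and example angles are assumed values, not SARbot specifications.

```python
# Conceptual sketch of a USBL fix: range from travel time, position from angles.
import math

def usbl_relative_position(two_way_time_s, bearing_deg, depression_deg, sound_speed_ms=1500.0):
    """Return (east, north, down) offsets of the transponder from the surface array."""
    slant_range = sound_speed_ms * two_way_time_s / 2.0     # one-way distance
    horizontal = slant_range * math.cos(math.radians(depression_deg))
    east = horizontal * math.sin(math.radians(bearing_deg))
    north = horizontal * math.cos(math.radians(bearing_deg))
    down = slant_range * math.sin(math.radians(depression_deg))
    return east, north, down

# Example: reply heard 0.16 s later, 40 degrees below the horizon, bearing 120 degrees.
e, n, d = usbl_relative_position(0.16, bearing_deg=120.0, depression_deg=40.0)
print(f"SARbot offset: {e:.1f} m east, {n:.1f} m north, {d:.1f} m below the array")
```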

For exteroceptive sensors, the SARbot is equipped with a high-resolution imaging sonar that is effective up to 197 ft.  It is a very fast and reliable way to narrow down the search areas of interest.  The SARbot also boasts a high-resolution video camera that can be very effective in low-light conditions but can only capture video and imagery from an effective range of approximately 30 ft (Teledyne, 2016).  It does, however, have a very good filtering lens that emphasizes blue and red hues so that it can properly distinguish human victims from dirt or debris.  The video imagery is very important as it helps the operator see where to operate the mechanical grapple so that it can actually carry out the rescue or recovery of a human victim or object.
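
Purely as an illustration of the red/blue emphasis idea described above, here is a crude NumPy mask that keeps pixels where red or blue clearly dominates green; the thresholds and example values are mine, not SeaBotix's filtering method.

```python
# Illustrative red/blue emphasis mask (not the SARbot's actual lens filtering).
import numpy as np

def red_blue_mask(rgb_frame, margin=30):
    """Keep pixels where red or blue clearly dominates the green channel."""
    r = rgb_frame[..., 0].astype(int)
    g = rgb_frame[..., 1].astype(int)
    b = rgb_frame[..., 2].astype(int)
    return (r - g > margin) | (b - g > margin)

# Tiny 2x2 example frame: a reddish pixel, a greenish pixel, a bluish pixel, grey silt.
frame = np.array([[[200, 80, 60], [90, 160, 80]],
                  [[50, 70, 190], [120, 120, 115]]], dtype=np.uint8)
print(red_blue_mask(frame))   # [[ True False] [ True False]]
```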

Real-time imagery and Sonar View

A major upgrade that I would consider would be to increase the effective range of the video camera and also give it automatic focus.  The operator is already inundated with multiple inputs and considerations, so one less thing to worry about, such as manually focusing on objects of interest, would lower the overall workload for the operator.  Another thought would be to incorporate an infrared/thermal sensor so that human victims can be identified as soon as possible, especially if the incident is within the 'golden hour' and the victim can be recovered or saved within the first 60 minutes.

Sonar Image of Human Victim

As described in the Japan tsunami recovery, the integration of a UAV and USV with the ROV provides a high degree of confidence and expediency when used in tandem to cover the same search region.  This concept of operation allows the rescue team to move from sector to sector without any doubt or hesitation that they may have missed an important clue or detail.  Interoperability among all three platforms is key and should not be underestimated in ensuring a thorough and effective search operation.

There are undoubtedly many benefits to employing an unmanned system for maritime search and rescue.  The primary reason that comes to mind is keeping rescue operators safe when conducting operations in hostile and inclement environments and weather conditions. The last thing you want is for your search party to become a liability, further exacerbating the situation.

References

Carey, B. (2011, February 8). SARBOT Robo-Sub Helps Rescue Drowning Victims in Minutes. Retrieved June 11, 2016, from Popular Science: http://www.popsci.com/technology/article/2011-01/sarbot-robo-sub-helps-rescue-drowning-victims-minutes

Murphy, R., Dreger, K., Newsome, S., Rodocker, J., Slaughter, B., Smith, R., et al. (2012). Marine Heterogeneous Multirobot Systems at the Great Eastern Japan Tsunami Recovery. Journal of Field Robotics - Wiley Periodicals, 13.

Teledyne Maritime Systems Company. (2016). SeaBotix. Retrieved June 12, 2016, from http://www.seabotix.com/products/pdf_files/sarbot.pdf

Sunday 5 June 2016

UNSY 605 - Activity 1.5 - Blog Entry



Hello Classmates,

The article that I chose this week is sourced from AviationWeek (with an accompanying Reuters feature on YouTube). I chose it because it aligns closely with my intended course development project and addresses challenges that I have observed in my work setting as a maintenance engineering officer.
 
I am a Captain in the Canadian Air Force, and a big part of my raison d'être is to optimize business processes so that technicians can apply more wrench-turning time to the aircraft.  This ultimately supports our mission statement, which is to deliver tactical air power anywhere and anytime. Unfortunately, I have witnessed too much wasted effort spent awaiting material to enact aircraft repairs, or awaiting support equipment and tooling. So when I read about the course project to develop a sensor to fit a perceived need as it relates to a management practice, operational policy, or need for safety, efficiency, and effectiveness, I thought that a UAV equipped with the pertinent sensors would be an ideal pursuit!

The article I found on Google states that a budget airline in the U.K. named EasyJet is spearheading this very same initiative: pursuing the use of an unmanned aircraft to inspect its fleet of aircraft. The listed benefits are that it will reduce overall wait time for passengers, as UAVs can carry out inspections more efficiently than traditional/manned methods. Effort is also saved because there would be no need to tow the aircraft into hangars, which is a time-intensive ordeal.  EasyJet has contracted two U.K. companies, Blue Bear Systems Research and Createc, to modify off-the-shelf UAVs and equip them with intelligent sensors to perform stand-off distance inspections, utilizing a high-definition camera with a laser system for indoor navigation and collision avoidance. There was mention of GPS capability, but geared for outdoor use only. The YouTube video from Reuters mentions the use of a LIDAR sensor, which would help with navigation and range-finding.

The initial design is for a semi-autonomous system, but the ultimate intent is a fully autonomous vehicle that will work straight away after being unpacked from the box.  The design concept is that the vehicle will identify the aircraft it is inspecting via a database and, based on pre-determined known points on that aircraft, will calculate its inspection path and carry out a thorough inspection.
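
To make that concept concrete, here is a small sketch of turning a database of known points for an aircraft type into an ordered inspection path at a fixed stand-off distance.  The aircraft type, point names, coordinates, and stand-off value are all invented for illustration; the article does not describe Blue Bear's or Createc's actual path-planning method.

```python
# Illustrative sketch only: database of known points -> ordered inspection waypoints.
AIRCRAFT_DB = {
    "A320": [  # (name, x, y, z) reference points in metres, aircraft frame (assumed values)
        ("nose", 0.0, 0.0, 3.0),
        ("left_wing_tip", 14.0, -17.0, 3.5),
        ("tail", 36.0, 0.0, 11.0),
        ("right_wing_tip", 14.0, 17.0, 3.5),
    ],
}

def inspection_path(aircraft_type, standoff_m=2.0):
    """Offset each known point upward by the stand-off distance and visit them in order."""
    waypoints = []
    for name, x, y, z in AIRCRAFT_DB[aircraft_type]:
        waypoints.append((name, x, y, z + standoff_m))   # hover stand-off above each feature
    return waypoints

for wp in inspection_path("A320"):
    print(wp)
```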

The vehicle currently weighs 8.8 lb and measures 10.8 sq ft, but airlines want it to become even lighter to minimize any potential contact damage. The goal is to reduce it to half its current weight, which will also make it much more portable.

I like the fact that this UAV concept is being designed as a tool to complement existing maintenance technicians and speed up their processes, NOT to replace them.  I see this becoming an industry norm and would love to see it applied to military maintenance organizations – I wonder if anyone has seen anything like this implemented in the USAF, USMC or USN?

I look forward to your thoughts and discussions!

EasyJet Prototype  - UAV to inspect fleet!
 

First Blog

First Blog...

Hello World!

Realizing that it is 2016 and the world has seen many technological advancements, it is quite surreal to acknowledge that I have not yet ventured into the wonderful world of blogging.  I have Embry-Riddle Aeronautical University (ERAU) to thank for helping me take the first plunge into this world of blogging...here we go:

Education

I completed my undergraduate degree in Mechanical Engineering at the Royal Military College of Canada.  This is Canada's military university, the equivalent of Sandhurst in the U.K. or West Point in the U.S. I completed this degree in 2007.

Shortly after that, I started a part-time MBA program at the University of Ottawa during my first military posting in Ottawa.  I completed this very rewarding program knowing a little bit more about the business/for-profit world and also met a lot of lifelong friends.  This program lasted about 2.5 years from 2009 until late 2011.


As of March 2016, I decided to pursue further graduate education and enrolled at Embry-Riddle Aeronautical University (ERAU) in the Master of Science program for Unmanned Systems.  I see this as an emerging sector and want to be well positioned and prepared as it becomes an integral feature of our everyday life.

Employment

I started working in 2009 as the C130E/H Avionics Officer in charge of various Repair and Overhaul Contracts. In 2010, I moved into the Deputy Business Planning role for the Level 2 Air Force Directorate of Aerospace Engineering Program. Since 2012, I have worked as the C130J Aircraft Maintenance Officer in charge of all aspects of maintenance for the Canadian Fleet. I have cherished the first-line opportunity and experiences gained and as of this summer, I will be relocating to DRDC Toronto to work on research and development. I am currently a Captain.
 

Family

I married the love of my life in 2012 after 5 years of dating and since then have continued to have fun! The honeymoon period does not appear to have ended just yet!
 

Hobbies/Interests

I enjoy traveling a lot and hope to visit many more places before my wife and I grow our family! 

I also enjoy soccer (or, more accurately, football). It is not that fun to watch, but I am an avid player and hope to expand my interest into refereeing in the near future.

Taken in the most northern inhabited place in the world (CFS Alert) in summer 2015.

Cheers!