Dad of Drones

It all started back in 2010, when my husband came home and asked me what I thought about him starting a PhD on unmanned aerial vehicles (UAVs, or drones). Well, if I am honest, it started way before 2010 with remote control planes, but the drone interest is more recent. It’s pretty awesome for our kids that their dad works with these really cool toys. They even helped our kids’ school get a visit from the Emergency Helicopter. But they are not really toys. And let’s get one thing straight – the correct terminology is not ‘drone’ but ‘Unmanned Aerial Vehicle’ (UAV). But calling him ‘Dad of Unmanned Aerial Vehicles’ is too much of a mouthful and isn’t quirky. So, introducing: the ‘Dad of Drones’.


“Dad of Drones”, aka Darren, flying an oktokopter over the moss beds at Casey Station, Antarctica.

Darren started a part-time PhD when our third child was one year old. Just a few months ago the dissertation was returned, fully passed (and already published as four papers in peer-reviewed journals). His PhD, ‘Multi-sensor, multi-temporal, and ultra-high resolution environmental remote sensing from UAVs’, looked at image processing algorithms that enable UAVs to be used to gather useful and measurable scientific data. He demonstrated the versatility of UAVs across a range of applications, including coastal vegetation, precision agriculture, viticulture and the Antarctic moss beds (yes, with my bryological background you can imagine my pain that he got to visit the moss beds!).

I never would have thought my husband’s work and mine would collide. No, unfortunately I’m not part of the moss bed research, but I am working on a project which uses satellite imagery and historical aerial photographs to map Athrotaxis cupressoides (Pencil Pine) at both landscape and plot scales. Satellite imagery, while useful, can be expensive, infrequent, and of insufficient resolution to capture fine-scale ecological processes. Slowly but surely, UAVs are becoming more widely used by the commercial sector and research institutes as a new remote sensing tool. Advances in the technology over the last 5–10 years mean that UAVs are now more reliable, simpler to use, and able to carry larger payloads (see Anderson and Gaston 2013 for an overview and Kennedy et al. 2014 for further reading). The diversity of UAVs is growing – they come in many shapes and sizes with widely varying capabilities and purposes. Multi-rotor UAVs (the typical “drone” we see on the TV news these days) are very easy to fly and operationally flexible, but they are typically small and so can only map a small area. If larger areas are to be mapped it is common to use fixed-wing UAVs (planes), but these need more space to take off and land and are often harder to fly.

In parallel with the explosion in the use of UAVs for research has been the development of image processing algorithms that allow accurate mosaics to be created from the sometimes hundreds of images collected during a single UAV flight. These algorithms are based on computer vision techniques originally developed in robotics. This has led to the term “Structure from Motion” (SfM), whereby it is now possible to reconstruct in 3D the surface or object being photographed from a series of highly overlapping photographs. In practice, this means that not only can the photos from a UAV be joined together into a mosaic (also known as an orthophoto), but we can also generate a 3D Digital Surface Model (DSM) from the data.
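To give a feel for what sits underneath SfM, here is a minimal Python sketch (using OpenCV, with placeholder file names) of the feature detection and matching step that finds the same points in two overlapping photos. A full SfM pipeline then uses those tie points to recover the camera positions, the orthophoto and the DSM; this is only the very first step, not Darren’s actual processing chain.

```python
# Minimal sketch: find candidate tie points between two overlapping UAV photos.
# File names are placeholders; a real SfM package adds camera calibration,
# bundle adjustment and dense 3D reconstruction on top of this step.
import cv2

img1 = cv2.imread("uav_photo_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_photo_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute descriptors in each image
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors between the two images; the matched points are the
# "tie points" an SfM solver uses to recover camera positions and 3D structure
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} candidate tie points between the two photos")
```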


An example of what a 3D orthophoto looks like (showing central Tasmania from an incomplete data set)

Darren and I have discussed at length how a UAV can be deployed in ecological research – for example, in a mapping program. A few things will dictate the UAV platform chosen, including the area to be surveyed and the resolution of imagery required (which in turn dictates the flying height of the UAV). For the Pencil Pine research project, it is impracticable to use a UAV to collect imagery over the entire area to be mapped. The same may apply to a ground cover project. There are some possible alternatives. One approach, which would apply to both the Pencil Pine and ground cover examples, would be to use UAV data as ground truth for satellite data. A methodology for measuring ground cover or Pencil Pine cover would be applied to the satellite data, the results validated against the finer scale UAV data, and the methodology adjusted if necessary to improve classification accuracy.
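As a toy illustration of that validation step (with entirely made-up numbers), the sketch below compares a hypothetical per-plot satellite classification against UAV-derived “truth” for the same plots, and flags where the classification rules might need adjusting.

```python
# Toy validation of a satellite classification against UAV "ground truth".
# All values are hypothetical and for illustration only.
import numpy as np

# Per-plot class: 1 = Pencil Pine present, 0 = absent
satellite = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])  # satellite methodology
uav_truth = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])  # finer-scale UAV data

agreement = satellite == uav_truth
print(f"Overall classification accuracy: {agreement.mean():.0%}")

# Plots where the satellite methodology disagrees with the UAV data point to
# where the methodology could be adjusted to improve accuracy
print("Plots to revisit:", np.where(~agreement)[0].tolist())
```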

Have you ever tried to explain to someone from a non-ecological background the benefits of using quadrats in vegetation mapping? Or the benefits of using projective foliage cover (PFC) and a Braun-Blanquet scale to monitor/survey/map an area of vegetation? What about why it is important to repeat measurements over time? When Darren started his PhD I found myself explaining all of these things. I then expanded the concept to extrapolating these measurements to the entire area to be “mapped”. Of course he quickly realised that it may be difficult to sample everything on the ground, and the reality is that, for some projects, less than 1% of the area to be mapped is actually sampled. Assessments of ground cover in rangeland Australia are a good example of where it is laborious, time consuming, costly and often impractical to sample everything on the ground. Darren coined the term ‘mega-quadrats’: where vegetation types cover large areas, UAVs can be used to map ‘mega-quadrats’ of ~10 ha. The advantages are that the area sampled by the UAV is a higher percentage of the total area to be mapped than with conventional methods, and that the main cost, time in the field, is minimised (see the back-of-the-envelope sketch below). Image analysis can be automated with image processing routines (e.g. Object-Based Image Analysis, OBIA; Laliberte and Rango 2009) or done manually. There may even be potential to crowdsource the image analysis.
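Here is that back-of-the-envelope comparison, with purely hypothetical plot counts and areas, of the fraction of a large mapping area covered by conventional quadrats versus UAV mega-quadrats of ~10 ha.

```python
# Back-of-the-envelope comparison (hypothetical numbers) of the area sampled
# by conventional quadrats versus UAV "mega-quadrats" of ~10 ha each.
total_area_ha = 100_000          # assumed size of a rangeland mapping area
conventional_quadrat_ha = 0.01   # a 10 m x 10 m field quadrat
n_conventional = 500             # assumed number of field quadrats

mega_quadrat_ha = 10             # ~10 ha per UAV mega-quadrat, as above
n_mega = 100                     # assumed number of UAV mega-quadrats

conventional_pct = 100 * n_conventional * conventional_quadrat_ha / total_area_ha
mega_pct = 100 * n_mega * mega_quadrat_ha / total_area_ha
print(f"Conventional quadrats sample {conventional_pct:.3f}% of the area")
print(f"UAV mega-quadrats sample {mega_pct:.1f}% of the area")
```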

The advances in both UAV and image processing technology, and their potential applications in remote sensing, have caught the attention of spatial ecologists (Anderson and Gaston 2013), and the application of UAVs to ecological research warrants further investigation. Ultimately the choice of method will depend on the amount of resources (funds, labour, time) available, and the underlying reasons/aims for monitoring. This all sounds fine and dandy, but there is a catch (there always is!).


The landscape around Hobart International Airport. The circle with a radius of 5.5 km shows the area within which you cannot operate a UAV.

UAVs are governed by strict regulations, controlled here in Australia by the Civil Aviation Safety Authority (CASA). As far as CASA is concerned, a UAV operated commercially or for research purposes is an aircraft in Australian airspace and, for reasons of safety, must comply with regulations similar to those for any other aircraft. It is important to remember that the goal of the regulations is to keep our skies safe – a thought from which we can all take comfort next time we are on a domestic flight, knowing that no one should be operating a UAV within three nautical miles (about 5.5 km) of an airport, aerodrome or helicopter landing area. Put simply, to fly a UAV you need a licence. This catches out a lot of research institutes that buy a UAV to carry out research and then find they cannot legally fly it. However, all is not lost – there are now many organisations that offer training towards a basic licence to fly a UAV, and many will also help a research group get the paperwork sorted to legalise its operations. This does, of course, cost money, so it is important that the expense is planned for. Personally, having seen what a UAV can do, I think the expense is worth it.
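As a rough flight-planning sanity check (not a substitute for the actual CASA rules or aeronautical charts), the sketch below estimates the distance from a hypothetical launch site to an aerodrome and compares it with the 5.5 km (three nautical mile) radius shown above. The coordinates are approximate and for illustration only.

```python
# Rough check: is a proposed launch site more than ~5.5 km (3 NM) from an
# aerodrome? Coordinates are approximate and purely illustrative.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

airport = (-42.836, 147.510)    # Hobart International Airport (approximate)
launch_site = (-42.90, 147.33)  # hypothetical launch site near Hobart

d = haversine_km(*airport, *launch_site)
print(f"Distance to aerodrome: {d:.1f} km ->",
      "outside the 5.5 km radius" if d > 5.5 else "too close - do not fly")
```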

(Thanks to Darren Turner, who co-wrote this piece with me and to school friend Anne-Marie who coined the name “Dad of Drones”).

References

Karen Anderson and Kevin J Gaston (2013) Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Frontiers in Ecology and the Environment 11: 138–146. http://dx.doi.org/10.1890/120150

Robert E Kennedy, Serge Andréfouët, Warren B Cohen, Cristina Gómez, Patrick Griffiths, Martin Hais, Sean P Healey, Eileen H Helmer, Patrick Hostert, Mitchell B Lyons, Garrett W Meigs, Dirk Pflugmacher, Stuart R Phinn, Scott L Powell, Peter Scarth, Susmita Sen, Todd A Schroeder, Annemarie Schneider, Ruth Sonnenschein, James E Vogelmann, Michael A Wulder, and Zhe Zhu (2014) Bringing an ecological view of change to Landsat-based remote sensing. Frontiers in Ecology and the Environment 12: 339–346. http://dx.doi.org/10.1890/130066

Andrea S. Laliberte and Albert Rango (2009) Texture and scale in Object-Based Analysis of subdecimeter resolution Unmanned Aerial Vehicle (UAV) imagery. IEEE Transactions on Geoscience and Remote Sensing 47 (3): 761–770. http://dx.doi.org/10.1109/TGRS.2008.2009355

Turner D., Lucieer A. & de Jong S. M. (2015) Time Series Analysis of Landslide Dynamics Using an Unmanned Aerial Vehicle (UAV). Remote Sensing 7, 1736-57.

Turner D., Lucieer A., Malenovsky Z., King D. H. & Robinson S. A. (2014a) Spatial Co-Registration of Ultra-High Resolution Visible, Multispectral and Thermal Images Acquired with a Micro-UAV over Antarctic Moss Beds. Remote Sensing 6, 4003-24.

Turner D., Lucieer A. & Wallace L. (2014b) Direct Georeferencing of Ultrahigh-Resolution UAV Imagery. IEEE Transactions on Geoscience and Remote Sensing 52, 2738-45.

Turner D., Lucieer A. & Watson C. (2012) An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sensing 4, 1392-410.
