Tag Archives: autopilot

Using existing mapping data to control UAV mapping flights – Part 1 – Preliminary Ideas and Experimentation

An intrinsic problem with photogrammetry is its requirement to keep the camera facing the subject matter: a much higher quality and more accurate 3D model is produced this way than by taking photographs at an oblique angle. This is especially true of buildings with flat facades (this has already been discussed in another blog post).

Work has been done on using computer vision to automate the control of the camera position so that it follows targets selected by the pilot. Although this has potential for some recording methods, such as site tours (as discussed in another blog post), it does not aid in the recording of complex topography or architecture. There is, however, potential for the recording of architectural elements using computer vision technologies (this will be discussed in a later blog post).

Other work is being done on using a low-detail 3D model of a building to aid in controlling a UAV flying around it, but these efforts are aimed more at collision avoidance than at quality recording.

In the future I plan to look at the potential of pre-scanning a building with an aerial LiDAR scanner mounted on a drone before recording it with a UAV.

Potential solution

The camera gimbal of a UAV can be controlled both remotely and from the autopilot of the UAV, which could be used to keep the camera facing the subject matter at all times, but without pertinent information this would have to be done manually. With wireless camera technology it is possible to remotely view what the camera is recording and so control the movement of the gimbal when required, but this would require a second person to control the camera while the UAV is being flown, and it would be difficult to implement effectively and costly in a commercial environment.

It would seem possible, however, to use existing 3D data of an area to control the flight of a UAV, controlling both the altitude and the angle that the camera gimbal is pointing. I have already discussed the use of DroneKit Python to create a UAV mapping flight; it can also be used to control the angle of the camera gimbal.
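
As a minimal sketch of that gimbal control (the connection string and pitch value below are placeholders, and the exact pitch sign convention depends on the mount configuration, so treat this as an assumption to verify rather than a tested recipe):

# Minimal sketch: setting the camera gimbal angle from DroneKit Python.
# The connection string and pitch value are placeholders only.
from dronekit import connect

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)

# 45 here follows the 0 (forward) to 90 (down) convention described later in
# this post; some mounts use negative values for downward pitch, so verify
# the convention on your own setup before flying.
vehicle.gimbal.rotate(pitch=45, roll=0, yaw=0)

vehicle.close()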

Existing Data

There are a number of existing sources of data that can be used to aid in creating a mapping flight.

Within the UK, LiDAR data is freely available at different spatial resolutions: much of the country is available down to 1 m, while other areas are available down to 0.25 m.

Through processing in GIS (Geographic Information System) software, this resource provides all of the information required to create a flight path over the area under study and to control the angle of the camera gimbal so that the area is recorded to a higher quality than before.

A digital elevation model (DEM) created using photogrammetry from existing overlapping aerial photographs can also be employed once it is georeferenced to its correct location. This resource may provide a higher spatial resolution than the LiDAR data and so be a better basis for the creation of the flight path, but the landscape and structures may have changed since the photographs were taken, which can cause problems (this can of course be a problem with the LiDAR data as well).

Co-ordinate system problems

One complication with using LiDAR data to control the UAV is that it is in a different co-ordinate system from the GPS of the UAV (OSGB rather than WGS84). This can be solved by translating one set of data into the co-ordinate system of the other. As the number of points in the mission path will be far fewer than in the LiDAR data, it makes sense to convert the GPS data to OSGB, but this also requires that it be converted back after the flight path has been created, adding a certain amount of inaccuracy to the data, as a conversion is never 100% accurate.
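
A sketch of that conversion step, assuming the pyproj library (EPSG:4326 is WGS84 and EPSG:27700 is the OSGB National Grid; the waypoint is a made-up example):

# Sketch: converting a WGS84 waypoint to OSGB National Grid and back with pyproj.
from pyproj import Transformer

to_osgb = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)
to_wgs84 = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

lon, lat = -1.3960, 50.9350                    # example waypoint only
easting, northing = to_osgb.transform(lon, lat)
lon2, lat2 = to_wgs84.transform(easting, northing)

# The round trip is not exact, which is the small loss of accuracy noted above.
print(easting, northing, lon2, lat2)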

Required Data

Three different pieces of data need to be derived from the LiDAR data which are required for the UAV mapping flight:

  • Altitude.
  • Slope.
  • Aspect.

The Altitude is contained within each point of the LiDAR data and is used when displaying the data in GIS software.

The Slope of the topography/buildings is measured in increments up to 90 degrees, with 0 degrees being flat and 90 degrees being a vertical face.

The Aspect is the direction in which any slope faces, measured in increments from 1 to 360 degrees.

 


Slope angles

Although it would be possible to create software that extracts the data from the LiDAR file while creating a flight path, this is not currently an option. The flight path is currently created in a piece of software such as the open-source Mission Planner system, in which an area is chosen together with other variables and an optimal flight path is created. This flight path file can then be saved; it contains the X and Y co-ordinates of each point of the mission.

UAV Control

At its simplest, the flight path can be created with the altitude and slope derived from the LiDAR data used to control the UAV altitude and the camera gimbal angle respectively. This would work well for sloping topography but would be more complicated for areas with sharp breaks in slope (such as buildings).

Altitude Control

The altitude will need to be carefully controlled to make sure that the quality of the imagery is consistent across the whole area under study. At its simplest this is easy to do using the altitude data within the LiDAR data, together with obstacle avoidance sensors to aid with safety.

The problem arises when needing to record something that is near-vertical or completely vertical. Rather than flying at a set altitude, the UAV needs to maintain a set horizontal distance. This may be possible by creating a buffer in the data around steeply sloping areas.
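
One way to sketch such a buffer (purely illustrative, assuming slope and aspect values have already been extracted for each waypoint, and using an arbitrary stand-off distance and slope threshold):

import math

def offset_waypoint(easting, northing, slope_deg, aspect_deg,
                    standoff=10.0, steep_threshold=60.0):
    """Push a waypoint horizontally away from a steep face.

    Aspect is the compass direction the slope faces (degrees clockwise from
    north), so moving along it takes the UAV away from the surface. The
    stand-off distance and threshold here are illustrative only.
    """
    if slope_deg < steep_threshold:
        return easting, northing                 # gentle slope: no buffer needed
    aspect_rad = math.radians(aspect_deg)
    return (easting + standoff * math.sin(aspect_rad),
            northing + standoff * math.cos(aspect_rad))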


Problem with vertical offset

Camera Gimbal Control

Most low-cost UAV systems come with a 2-axis gimbal. This means that the camera is stabilised so that it always stays in a horizontal plane, but also that its rotation downwards can be controlled.

Gimbal angle

The angle of the gimbal runs from 0 degrees for a forward-pointing position to 90 degrees for a downward-facing position. This is how it is controlled within DroneKit.

As seen earlier, the slope is likewise calculated between 0 and 90 degrees.

There are two intrinsic problems with this method:

  1. The slope only goes between 0 and 90 degrees, so there is no aspect data within it. If the drone camera is to be controlled to record the building as it flies over, it needs to know which way the building is facing, as a 45-degree slope on the left is not the same as a 45-degree slope on the right. This could be solved by combining the information from the slope and the aspect to give more detailed resulting data (see the sketch after this list).
  2. Most standard gimbals are designed to point only forwards and downwards. This means that the UAV has to turn around to record the back side of the building, or it needs to fly the path in reverse. The other solution is to use a UAV with a camera that can point through 360 degrees.
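
A rough sketch of how the two values could be combined, assuming per-waypoint slope and aspect and a 2-axis gimbal that only pitches between 0 degrees (forward) and 90 degrees (straight down); the 90-degree test below is an assumption, not a tested rule:

def camera_setup(slope_deg, aspect_deg, flight_heading_deg):
    """Return a (gimbal_pitch, uav_yaw) pair for one waypoint."""
    # Signed difference between the direction the slope faces and the heading.
    diff = (aspect_deg - flight_heading_deg + 180) % 360 - 180
    if abs(diff) > 90:
        # The face points back towards the oncoming UAV: the forward/down-only
        # gimbal can see it without turning.
        uav_yaw = flight_heading_deg
    else:
        # The face points away from the flight direction: yaw 180 degrees.
        uav_yaw = (flight_heading_deg + 180) % 360

    # Keep the camera roughly perpendicular to the surface: flat ground
    # (slope 0) -> camera straight down (90); vertical face -> camera level (0).
    gimbal_pitch = 90 - slope_deg
    return gimbal_pitch, uav_yaw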

GIS Processing

A certain amount of processing is required within GIS software to get the required data from the LiDAR data and combine it with the required mapping flight path. For this, ArcGIS has been used, both because of its availability at university and because of my own familiarity with it.

LiDAR

Considering that the LiDAR data covers a specific square, it makes sense to use raster data rather than the points and lines of vector data, as this retains the accuracy of the data. The LiDAR data can simply be loaded into the GIS software as a raster.

Within GIS software the Slope and Aspect can be calculated and rasters created showing the results. This can be done using the Spatial Analyst or 3D Analyst toolboxes.
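
Scripted with arcpy and the Spatial Analyst extension, this step looks roughly like the following (file names are placeholders):

# Sketch: deriving Slope and Aspect rasters from the LiDAR DEM with arcpy.
import arcpy
from arcpy.sa import Slope, Aspect

arcpy.CheckOutExtension("Spatial")

dem = "lidar_dem_1m.tif"                 # placeholder path
slope_raster = Slope(dem, "DEGREE")      # 0 = flat, 90 = vertical face
aspect_raster = Aspect(dem)              # 0-360 degrees clockwise from north

slope_raster.save("lidar_slope.tif")
aspect_raster.save("lidar_aspect.tif")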

The data for these can be incorporated into attribute tables, which can be exported as text files. It is possible to combine all of the data into one attribute table containing the Altitude, Slope and Aspect.

Although it is possible to export this whole raster file, including all of the data, it is not currently possible to automatically derive the data in software using the flight path, so the flight path has to be loaded into the GIS software.

Flight Path

The flight path file created in the Mission Planner software needs to be loaded into a feature class in the GIS software. This can be done by loading the point data for the flight path into the software; these points are the beginning and end points of each of the back-and-forth paths across the area to be recorded.

We next need to recreate the flight path using 'Points to Line'.

Even though we have recreated the lines, deriving enough data from them is not possible, as the flight path is designed to fly back and forth at a set altitude. For this reason we need to create a number of extra points. This can be done using 'Construct Points', which creates points at set intervals along a line. The interval can be linked to the resolution of the data being used, so for this LiDAR data the points can be set at intervals of 1 m.

Once this has been done, 'Extract Multi Values to Points' can be run on the three sets of data to create a table containing all of the required data for each point on the flight path we have created.
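
In arcpy this step might be sketched as follows (paths and field names are placeholders):

# Sketch: attach elevation, slope and aspect values to the densified flight-path points.
import arcpy
from arcpy.sa import ExtractMultiValuesToPoints

arcpy.CheckOutExtension("Spatial")

points = "flight_path_points.shp"        # placeholder path
ExtractMultiValuesToPoints(points,
                           [["lidar_dem_1m.tif", "ELEV"],
                            ["lidar_slope.tif", "SLOPE"],
                            ["lidar_aspect.tif", "ASPECT"]])

# The joined attribute table can then be exported to a text/CSV file
# (e.g. with the Table To Table tool) for use in the DroneKit script.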

UAV Mission Creation

Now that we have all of the required input data for the UAV mapping flight, we need to create the mission within DroneKit Python.

For the first level of experimentation we can simply load the point data file into Python and then create a number of points for the UAV to fly to, giving the X and Y co-ordinates and the required altitude. At the same time we can also program in the angle for the camera gimbal. It may be best to have the UAV hover at each position for a second or two so that we know how the recording is going.

As already mentioned, if we are only using a 2-axis gimbal we will have to have the UAV turn through 180 degrees to record the back sides of buildings and slopes facing away from the camera. We should be able to do this by altering the UAV yaw. We will need to have the Python script read the aspect angle and change how it creates the flight path depending on the aspect of the slope/building.
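
A first sketch of this mission-building step is below. The CSV layout, connection string, hold time and pitch sign convention are all assumptions rather than a tested workflow, so treat it as a starting point only:

# Sketch: build a mapping mission from the exported flight-path table.
# Assumed CSV columns: lat, lon, altitude, gimbal_pitch, yaw.
import csv
from dronekit import connect, Command
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)   # placeholder connection
cmds = vehicle.commands
cmds.clear()

with open('flight_path_points.csv') as f:
    for row in csv.DictReader(f):
        lat, lon, alt = float(row['lat']), float(row['lon']), float(row['altitude'])

        # Turn the airframe so a forward/down-only gimbal can see the slope.
        cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                         mavutil.mavlink.MAV_CMD_CONDITION_YAW, 0, 0,
                         float(row['yaw']), 0, 1, 0, 0, 0, 0))

        # Point the camera gimbal at the angle derived from the slope raster.
        # The pitch sign convention depends on the mount setup, so verify it first.
        cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                         mavutil.mavlink.MAV_CMD_DO_MOUNT_CONTROL, 0, 0,
                         float(row['gimbal_pitch']), 0, 0, 0, 0, 0,
                         mavutil.mavlink.MAV_MOUNT_MODE_MAVLINK_TARGETING))

        # Fly to the waypoint, hovering briefly (param1 is the hold time in seconds).
        cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                         mavutil.mavlink.MAV_CMD_NAV_WAYPOINT, 0, 0,
                         2, 0, 0, 0, lat, lon, alt))

cmds.upload()
vehicle.close()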

Future Directions

ArcGIS allows the use of Python to run the tools in its toolboxes, so it seems possible to create a Python script which would automatically create a file with all of the information required from input files of LiDAR data and a flight path.

As QGIS also allows the use of Python, it would seem possible to create the required file within this open-source solution as well.

 


Autonomous Systems Launch Event – University of Southampton

On the 20th of March I was present at the Launch Event for the Autonomous Systems USRG (University Strategic Research Group) at the University of Southampton.

It included a number of 3-minute presentations, some of which were very pertinent to the autonomous recording of Archaeology and Cultural Heritage.

Control and Implementation of Autonomous Quadcopter in GPS-denied Environments – Chang Liu

Chang Liu is a PhD student in the Department of Engineering.

He has been working on using optical flow technology and ultrasonic sensors to control the velocity of UAVs with his own autopilot system in environments where GPS cannot be used.

He is currently perfecting a system using a monocular uEye camera and a quad-core Linux ODROID computer running ROS (Robot Operating System), using SLAM (Simultaneous Localization And Mapping) algorithms to enable the single camera to identify natural features and act as a more advanced position-hold technology.


Chang Liu’s Autonomous Quadcopter


Chang Liu’s Analysis Software

 

Autonomous UAVs for Search and Rescue and Disaster Response

Dr. Luke Teacy of the Department of Electronics and Computer Science (ECS) discussed the use of autonomous UAVs in search and rescue and disaster response through co-ordination between multiple platforms. Their low cost and the fact that they are easy to deploy make them ideal for the purpose. A camera-equipped UAV can search for someone in the wilderness using computer vision to see the person. He is using observation modelling to see how the view of a person is affected by distance and how to maximise the information required to find a person, and he discussed the issues of how to control UAVs and allocate them to a task using the Monte Carlo Tree Search algorithm, as well as path planning.

Human Agent Collaboration for Multi-UAV Coordination

Dr. Sarvapali Ramchurn of the Department of Electronics and Computer Science (ECS) discussed the MOSAIC (Multiagent Collectives for Sensing Autonomy Intelligence and Control) Project. His work involved the allocation of tasks to UAVs, which may have different capabilities, by a human agent. Once a task is set, the nearest drone will move to complete it. Teaming the UAVs up to accomplish tasks maximises efficiency: if a UAV fails a new one takes its place, while when new tasks are allocated a UAV is reallocated to the task. He discussed using the Max-Sum algorithm to coordinate tasks between the UAVs autonomously.

An intelligent, heuristic path planner for multiple agent unmanned air systems

Chris Crispin is a PhD student in Complex Systems Simulation and is part of the ASTRA environmental monitoring project, which involves a group of unmanned vehicles coordinating with each other in mapping. Feasible optimal flight paths are designated and searched along. Once the UAVs begin flying, areas are assigned a level of uncertainty by a central computer, and this determines whether a UAV is sent to an area: the higher the uncertainty, the more likely a UAV will be dispatched to map it.

The UAVs use an ODROID-C1 quad-core Linux computer with a PX4 Autopilot, while the control computer is an ODROID-XU3. The system uses the JSBSim open-source flight dynamics model (FDM).

https://sotonastra.wordpress.com/

Archaeology and autonomous systems

Dr. Fraser Sturt of the Department of Archaeology discussed the various potential applications of autonomous systems in archaeology. These included survey and site identification by high-resolution photographic and topographical mapping. He also discussed the benefits of multi-spectral imaging in seeing adobe (mud brick) structures, and how the results have shown that, rather than collapsing, the Nasca civilisation had moved to the coast. Next he discussed the potential of GPR (Ground Penetrating Radar) being carried on UAVs, and finally the fact that there are approximately 3 million shipwrecks worldwide which need to be studied or made stable.

3DRobotics Dronekit

3DRobotics have announced the release of DroneKit, which offers an open-source Software Development Kit (SDK) and web Application Programming Interface (API) for developing drone apps. It works on systems powered by the APM flight code, such as the ArduPilot, APM and Pixhawk autopilot systems, all supplied by 3DRobotics.

It allows the creation of custom, purpose-built UAV (Unmanned Aerial Vehicle) control apps without having to redesign the control system software.

The apps can be developed on three different platforms:

  1. Mobile apps with DroneKit Android.
  2. Web-based apps with DroneKit Cloud.
  3. Computer apps with DroneKit Python.

It enables the user to:

  • Control the flight path with waypoints.
  • Control a spline flight path with fine control over the vehicle velocity and position.
  • Set the UAV to follow a GPS target (Follow Me).
  • Control the camera and gimbal by setting Region of Interest (ROI) points which the camera locks on to (see the sketch after this list).
  • Access full telemetry from the UAV using 3DR Radio, Bluetooth, Wi-Fi, or over the internet.
  • Playback and analyse the log of any mission.
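
For example, the Region of Interest feature can be set with a single MAVLink command from DroneKit Python. A sketch with placeholder co-ordinates, not an official example:

# Sketch: lock the camera onto a Region of Interest (placeholder coordinates).
from dronekit import connect
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)

msg = vehicle.message_factory.command_long_encode(
    0, 0,                                   # target system, target component
    mavutil.mavlink.MAV_CMD_DO_SET_ROI,     # command
    0,                                      # confirmation
    0, 0, 0, 0,                             # params 1-4 (unused)
    50.9350, -1.3960, 20)                   # latitude, longitude, altitude of the ROI
vehicle.send_mavlink(msg)
vehicle.close()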

The advantages of DroneKit are:

  • It is truly open, unlike the similar DJI SDK, with no tiered levels of access.
  • Once an app has been created the interface is always the same across different computing platforms.
  • It can be used with planes, copters and rovers.
  • It works on laptop computers as well as mobile devices, and vehicle data can even be accessed via the web.

DroneKit already powers a number of flight control programs:

  • The Tower (formerly Droidplanner) flight planning mobile app for Android was built on DroneKit for Android.

Tower (DroidPlanner 3)

  • Droneshare is a global social network for drone pilots that allows them to view and share missions, it is built on DroneKit web services.
  • Google's Project Tango Indoor Navigation is built on the Pixhawk and APM autopilot systems and the Tower flight planning app.
  • The IMSI/Design TurboSite aerial reporting app for construction allows the setting up of flight waypoints to GPS locations and the capturing of photographs, videos, dictations, text notes and “punch list” action items. Photographs can be annotated while the UAV is still in flight using markup and measurement tools.

UAV (Unmanned Aerial Vehicle) Archaeological and Cultural Heritage Recording

Introduction

Recent developments in a number of technical areas have allowed the development of battery-powered UAV (Unmanned Aerial Vehicle) systems, which have allowed recording technologies to become airborne easily, with extensive control over what is being recorded.

Many of these low-cost UAV solutions come ready to fly out of the box, with some even including a camera. These systems have allowed archaeology and cultural heritage to be recorded in whole new and innovative ways.

There are basically two types of UAV system that are employed, each with its benefits and drawbacks:

  • Fixed wing systems use less power and can spend longer flying, but don’t have the ability to hover in one place or change direction quickly. They are also designed for mapping so carry cameras that only point downwards.
    The 3DRobotics Aero-M fixed-wing drone

  • Multi-rotor systems use more power as their multiple rotors are turning all of the time they are in the air, so they can spend less time flying and recording. They have the ability to hover and change direction and altitude quickly.

    The 3DRobotics X8-M hexacopter

Recent developments have seen UAVs which combine the two systems, such as the SkyProwler Kickstarter project. This is a system that can be flown as a fixed wing, a fixed wing with rotors, or rotors only; the rotors are retractable and can be deployed when required while in the fixed-wing configuration.

Krossblade SkyProwler

UAV Technological developments

A number of technological developments have led to the recent proliferation of UAV systems being used in different industries and hobbies.

Batteries

The development of the LiPo (Lithium Polymer) battery brought a number of improvements over the previous NiMH (Nickel-metal hydride) battery technology used in Radio Control (RC) Vehicles. They:

  • Have a larger capacity and last longer.
  • Are more powerful.
  • Are smaller and lighter.
  • Are cheaper.

This means that a UAV system can fly for longer, further and faster with a battery that weighs less.

Brushless Motors

The brushless motor has taken over from the brushed motor in the RC (Radio Control) industry thanks to its:

  • Superior power.
  • Higher efficiency.
  • Greater reliability.
  • Higher accuracy.
  • Reduced noise.
  • Lower susceptibility to mechanical wear.
  • Longer lifetime.
  • Smaller size.

A brushed motor works by controlling the polarity of an electromagnet (coil of wires) between two magnets of different poles. The brushes in the brushed motor carry the electric current to the armature (electromagnet) of the motor by being in constant contact with it as it rotates. This causes wear to the brushes and, at higher speeds, friction, reducing torque and creating heat.

The brushless motor works in the opposite way, by having the coils of wire on the outside, with the magnet in the middle. It removes the need for brushes but complicates the process, as the position of the rotor needs to be sensed and the coils controlled in phase by an electronic speed controller (ESC). Although they are mechanically more complicated and cost more than brushed motors, their other benefits outweigh those of the brushed motor.

Brushed and brushless motors (http://www.eskyhelicopters.com/)

Brushless Gimbal

Linked intrinsically with the brushless motor is the brushless gimbal. This is a system which holds a camera steady and level in a single position while the UAV moves around it, thereby removing camera shake, and which, in more expensive systems, can also be panned up and down and side to side.

Camera gimbal pitch, yaw and roll. (http://science.howstuffworks.com/)

UAV brushless camera gimbals come in two types:

  1. 2-axis. This has two brushless motors which control the camera pitch and roll. This system relies upon filming in the direction that the UAV is pointing and is generally available on the cheaper UAV systems. These have legs which would be in the way if the camera was able to yaw.

    DJI Zenmuse H3-2D Gimbal

  2. 3-axis. This has three brushless motors which control the camera pitch, yaw and roll. This allows cinematography to be conducted from a UAV platform with the movement of the camera almost completely decoupled from the movement of the UAV, and in many cases a separate person can control the camera. These more expensive systems generally have retractable legs, allowing the camera to pan left and right.

    DJI Zenmuse Z15-5D III (HD) Gimbal

Gimbals can also be constructed using technology and how-to guides readily available on the internet. A number of brushless gimbal controller boards are available (including the V3 Martinez board), as well as gimbal kits and brushless motors.

V3 Martinez Brushless Gimbal Controller Board

GPS (Global Positioning System)

GPS has a number of applications within the UAV industry:

  1. It is used to make flight easier by using the GPS signal to hold the UAV in one place, by calculating whether it is moving.
  2. It can be used together with an autopilot to automatically control the flight path of the system.
  3. The GPS co-ordinates at which each photograph is taken can be used within photogrammetry software to help with the locating and aligning of the photographs.
3DR uBlox GPS with Compass

Mast

The GoPro

The lightweight GoPro series of cameras was another enabling factor in the development of the UAV market. Many UAVs are in fact developed with the GoPro camera in mind. Not only are the GoPro cameras lightweight and powerful, but they also have wireless communication, allowing the camera to be controlled and the display to be viewed remotely.

Although there are now a number of different sports cameras, the majority of low-cost UAV systems, particularly those on the Kickstarter and Indiegogo crowdfunding platforms, are designed with carrying the GoPro in mind.

Autopilot Systems

Autopilot systems can be very beneficial in the recording of different aspects of Archaeology and Cultural Heritage.

  • They can be used to plan a flight path for the UAV to take over the subject matter.
  • They can be used to create a grid pattern flight path as part of a mapping operation.

Setting a flight path within Autopilot software

3DRobotics

3DRobotics is a personal drone manufacturing company which also manufactures the most popular open-source autopilot systems in the world. They are used on many of the Kickstarter systems available.

The company produces two autopilot systems:

  • The APM 2.6 system is based on the Arduino microcontroller. It includes a 3-axis gyro, 6 DoF (Degrees of Freedom) accelerometer, high-performance barometric pressure sensor and automatic datalogging.

    APM 2.6 Autopilot

  • The Pixhawk is an advanced autopilot system designed by the PX4 open-hardware project and manufactured by 3D Robotics. It includes a 3-axis 16-bit gyroscope, a 3-axis 14-bit accelerometer/magnetometer, a 3-axis accelerometer/gyroscope and a barometer. A digital airspeed sensor and a GPS and compass can be purchased with the system and plugged into the Pixhawk board.

    Pixhawk Autopilot

NAVIO

The NAVIO is an Indiegogo project to build a Raspberry Pi based autopilot which attaches to the top of the credit-card-sized computer, allowing the construction and control of a UAV system with the powerful small computer.

NAVIO Autopilot

Follow Me Technology

A recently developed technology which is becoming increasingly popular in the drone industry is 'Follow Me' technology, where the drone will follow some sort of controller, whether a mobile phone, tablet or laptop, or a specially designed piece of technology such as the AirLeash which comes with the AirDog drone. These systems use autopilot systems combined with GPS technology in the controller to control the direction and speed of the UAV.

The ‘Follow Me’ technology is designed for the extreme sports industry where the drone can follow the user, filming as it flies, whether they are on a motorbike/mountain bike, surf board or other sports equipment. As well as controlling the flight path of the drone following the user it also controls the gimbal that holds the camera keeping the user in shot at all times. In some systems a number of pre-determined video filming techniques are made available with the control app which add to the abilities of the system.

Within Archaeology and Cultural Heritage the system has the potential for filming site tours, where the person giving the tour is tracked by the 'Follow Me' system. They would also have a digital audio recording system attached to them to record the dialogue, which could be matched with the video in post-production. It would allow one person to do all of the production of the video.

Object tracking and following

Object tracking and following is an expansion of the ‘Follow Me’ technology where rather than following a GPS enabled device an object is selected within an interface and the system visually tracks the object. The UAV follows the tracked object with the camera being locked onto it.

Shift

The Shift has been developed by Perceptiv Labs. It attaches to existing UAV systems, such as those designed by DJI and 3DRobotics, as well as custom systems created with a number of different flight control and autopilot systems.

Shift Object Tracking System

Through the use of a Shift Eye video camera (attached to the camera of the UAV system) and a Shift Processor computer which plugs into the autopilot it provides standard UAV systems with the ability to track up to four subjects and follow and record them in flight. The subjects are selected using an app available on Android devices.


This could add to the site-tour potential of the 'Follow Me' technology by selecting the tour guide within the app when they are walking around, then also selecting the area under study when appropriate. This would be done best by having a second person controlling the app by watching the video, selecting the areas of interest when required and moving back to the tour guide when needed.

Autonomous

Developments in the autonomous flight of UAVs have a number of benefits for recording: a UAV could record the progress of an excavation at intervals, or record a standing building, without any need for control by the pilot.

A number of simple autonomous technologies have already been integrated within commercial UAV technology.

Ultrasonic distance sensors

Ultrasonic distance sensors calculate distance by sending an ultrasonic wave and calculating the time it takes to receive the wave back.

MB1240 XL-MaxSonar®-EZ4™ High Performance Ultrasonic Range Finder

They can be used in obstacle avoidance systems for UAVs, with the ultrasonic beam bouncing off objects in the UAV's path. The ultrasonic sensor is attached to the autopilot, which can alter the path of the UAV when an obstacle is detected. With the APM autopilot the sensor is enabled and calibrated within the Mission Planner software.


Optical Flow Technology

Optical flow technology is a technique where multiple images from a sensor are compared to determine movement; recent developments in technology mean that this can now be done in real time. It can be used in combination with autopilot systems to stabilise the flight position of a UAV by detecting movement between photographs and altering the flight accordingly.
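
A minimal illustration of the image-comparison step, using OpenCV's dense optical flow (the frame file names are placeholders, and this is only a toy version of what a flight controller does on board):

# Sketch: estimate average image motion between two downward-facing frames.
import cv2

prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# Farneback dense optical flow: one (dx, dy) vector per pixel.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# The mean vector approximates how far the ground has shifted between frames,
# which an autopilot can convert into a velocity estimate and correct against.
dx, dy = flow[..., 0].mean(), flow[..., 1].mean()
print(dx, dy)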

The Parrot Bebop Drone comes with optical flow technology integrated into its design, with an image of the ground being taken every 16 milliseconds and then compared with the previous one.

Plexidrone

The PlexiDrone Indiegogo crowdfunding project also comes with the technology, demonstrating the growing availability and cheapness of the technology in crowdfunding projects.

As well as being integrated into some UAV systems, the sensors themselves can be purchased separately; in UAV technology they come in two distinct types:

    • Mouse Based. The mouse based sensor is based upon the technology of optical computer mice.
3DRobotics Optical Flow Sensor Board with ADNS3080 mouse sensor

  • CMOS Based. The CMOS based sensor uses a CMOS camera chip to capture the images.
    3DRobotics PX4FLOW KIT with MT9V034 machine vision CMOS sensor with global shutter

    Vision Positioning System

    The Vision Positioning System present in the DJI Inspire 1 uses a combination of Ultrasonic sensors and Optical Flow Technology to control the position of the UAV in environments where GPS signals cannot reach. It can hold its position and stop when the RC controls are released.

    DJI Inspire 1 Vision Positioning System – [1] Two sonar sensors [2] One binocular camera.

    Intel RealSense

    The Intel RealSense is a new depth-capturing camera technology designed to be incorporated into the latest laptop and tablet technology. It has the ability, thanks to its specialised lens array, to alter the focus of photographs after they have been taken, like the Lytro Illum. It can also track hand gestures to control the computer systems and 3D scan real world objects.

    The new AscTec Trinity autopilot system incorporates six Intel RealSense cameras, enabling 360° motion capture and obstacle avoidance. The AscTec Trinity autopilot will be incorporated into the AscTec Firefly later this year.

    AscTec Trinity Autopilot – http://www.theverge.com/

    Swarm Technology

    A lot of technical development has gone into the idea of drone swarms, where multiple drones fly together in cooperation. Amongst the applications considered for their use are search and rescue, crop pollination, surveillance, monitoring traffic and acting as a distributed computing and communications network in disaster areas. Work at the GRASP (General Robotics, Automation, Sensing & Perception) Lab at the University of Pennsylvania has included navigating obstacles, Simultaneous Localization and Mapping (SLAM) using a Microsoft Kinect and laser rangefinder, flying in formation by monitoring each other's position, and co-operation in building structures.

    Within Archaeological and Cultural Heritage recording this has the potential for a swarm of UAVs to record areas in co-operation, reducing the time taken, as well as the potential of using different technologies to record at the same time.

    Recording Technologies

    Introduction

    A UAV can act as a platform for a number of different recording technologies that can be employed in the recording of Archaeology and Cultural Heritage.

    Photogrammetry

    Photogrammetry is a technique for taking measurements from photographs and can be used to create a number of different results.

    The type of camera system used depends on the type of UAV system employed, the more powerful the system the heavier and more powerful the camera that it can carry.

    Weight is an important consideration; cheaper UAV systems are designed to carry only the GoPro or another extreme sports camera, while the more expensive/powerful systems can carry higher-powered digital SLR cameras, which record in much higher levels of detail and with far less lens distortion. The better the quality of the camera, the more detail is recorded.

    A number of 360° camera systems have been released which can be attached to the bottom of UAV systems. These have the potential to record many more photos than a single camera, which could potentially speed up photogrammetric recording as well as providing immersive experiences using VR (Virtual Reality) technology such as the Oculus Rift.

    360Heros 360° GoPro mount

    Archaeological Mapping

    UAVs are used for mapping within a number of industries, and have already begun to be used in the mapping of archaeological sites. They provide an ideal platform for the creation of DTM (Digital Terrain Model) and DSM (Digital Surface Model) models which can be used in GIS (Geographical Information Systems) applications.

    This is the ideal application for a fixed-wing UAV, which can be deployed to fly over the area under study with a downward-facing camera. The major benefits of such systems are their stability, the amount of time that they can fly and hence the amount of recording that they can do in one flight.

    Using autopilot systems and software for programming the autopilot, such as Mission Planner, the flight plan for the UAV can be programmed.

    Setting out a UAV flight pattern in the Mission Planner software

    Such software has the ability to create a grid flight pattern using the study area selected in the map interface, the altitude flown, the image overlap and the characteristics of the camera being used. Any alteration in the altitude, image overlap or camera specification (such as lens used) will alter the grid pattern to accommodate the alterations.
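
    The underlying arithmetic is straightforward. A rough sketch (the camera figures below are illustrative, not those of any particular model):

# Sketch: image footprint and flight-line spacing from altitude, camera and overlap.
def grid_spacing(altitude_m, focal_length_mm, sensor_width_mm, sensor_height_mm,
                 forward_overlap=0.8, side_overlap=0.6):
    # Ground footprint of a single nadir image (simple pinhole model).
    footprint_w = altitude_m * sensor_width_mm / focal_length_mm
    footprint_h = altitude_m * sensor_height_mm / focal_length_mm
    # Spacing between photographs along a line, and between adjacent lines.
    photo_spacing = footprint_h * (1 - forward_overlap)
    line_spacing = footprint_w * (1 - side_overlap)
    return photo_spacing, line_spacing

# Example: 40 m altitude with an (illustrative) 4.5 mm lens on a 6.2 x 4.6 mm sensor.
print(grid_spacing(40, 4.5, 6.2, 4.6))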


    Setting out a grid UAV flight pattern in the Mission Planner software

    A grid of circular paper targets can be set up on the ground, with each target being surveyed in using a GPS (Global Positioning System), which both increases the accuracy of the photogrammetry model and georeferences the results so they can be incorporated with other data within a GIS system.

    Standing Building Recording

    Photogrammetry has a long history in standing building recording, which has been enhanced by the ability of the Total Station to survey points accurately. Limitations of ground-based photogrammetry include the inability to record information high above the ground or masked from view. Traditionally this has been solved by using scaffolding, but this is an expensive and time-consuming approach which can also be dangerous.

    Another option is to use standard building photogrammetric recording techniques to record structures in high detail, using a UAV to fly the camera at set heights parallel to the structure. This would mean that high-quality imagery could be created using standard methods. The UAV can act as a mobile camera platform/tripod which has the ability to take the camera to heights not easily accessible by other means. Certain points on the building surface would need surveying in using a total station to georeference the 3D model created by the photogrammetry process and to make it more accurate. Orthophotos (geometrically corrected images) can be created from the images taken, which are an important element in building recording.

    Increasing development in autonomous flight can be used to automate the flight patterns, having the UAV record buildings automatically.

    HDR (High Dynamic Range)

    High Dynamic Range photography is a technique where multiple images are taken with different exposures (bracketing); these are then merged together using computer software to form an image with all of the detail from the images. Many modern digital cameras have an auto-exposure bracketing (AEB) setting which allows this to be set up on the camera and done automatically. There are also dedicated HDR cameras. It provides images which are close to what the human eye can see and which contain more information than standard photographs.
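
    As an illustration of the merging step, assuming OpenCV and three bracketed exposures (the file names and exposure times are placeholders):

# Sketch: merge three bracketed exposures into an HDR image with OpenCV.
import cv2
import numpy as np

files = ["bracket_under.jpg", "bracket_normal.jpg", "bracket_over.jpg"]
times = np.array([1/400, 1/100, 1/25], dtype=np.float32)   # exposure times in seconds
images = [cv2.imread(f) for f in files]

hdr = cv2.createMergeDebevec().process(images, times)
ldr = cv2.createTonemap(2.2).process(hdr)                  # tone map for display
cv2.imwrite("merged_hdr.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))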

    The problem with using UAVs for this technology is that the images need to be taken while the camera is perfectly still, and even with a camera gimbal a UAV is likely to move slightly between the photographs being taken.

    HDR photographs can also be used in photogrammetry.

    Video

    The video capabilities of most cameras that UAVs carry mean that they can record videos. As we have already seen, the 'Follow Me' technology has the potential within archaeology or cultural heritage to record a site tour, filming the tour guide as they walk around the site, with a separate digital recording system capturing the audio, which can later be combined with the video footage in post-production. The Hexo+ UAV Director's Toolkit allows different filming scenarios such as crane, pan, tilt, crab, dolly, 360° around you, and far-to-close/close-to-far.

    The UAV has the potential to create immersive fly-through videos of sites thanks to the recent introduction of multi-camera systems or systems with multi-lens cameras; this can aid in public interaction and interest.

    Archaeological Prospection

    Lidar (Light Detection And Ranging)

    LiDAR is a technology which has already proved useful in Archaeology and Cultural Heritage. It works by firing a pulsed laser beam at the ground and recording the returned beams; the time it takes for each beam to return is recorded and used to determine the distance. It is tied to the flight instruments of the light aircraft carrying the LiDAR and accurately records the 3D position and height of the results, creating a dense point-cloud of the topography being recorded. The resulting LiDAR point data can be loaded into GIS systems.

    It has the potential to discover archaeological remains under woodland by removing points from the LiDAR point-cloud, leaving only the points that hit the ground beneath the forest cover.

    Although not a cheap technology, a number of LiDAR systems have recently been developed which can be carried as a payload on UAV systems. These include the Phoenix Aerial Systems AL3 S1000 Copter, which combines a DJI S1000 octocopter with their AL3 technology, including the Velodyne HDL-32 high-definition LiDAR sensor.

    Phoenix Aerial Systems – AL3 S1000 Copter

    If the UAV system also recorded high-quality photographs, perhaps in a separate flight using the same flight path, these could be used to overlay the LiDAR data.

    LiDAR data can be analysed with a number of computer tools, enabling more information to be visualised.

    LiDAR Data with Multiple Hillshades and with Principal Component Analysis (PCA).

    Multi-Spectral and Hyper-Spectral Imaging

    Multi-spectral and hyper-spectral imaging involve the recording of the electromagnetic spectrum outside the visible spectrum. This includes the infrared, which can detect differences in ground moisture, helping to determine what is below the ground level.

    Traditionally this has been done using satellites but spectral imagers are also available for UAV platforms.

    Comparative multispectral imagery of prehistoric field systems near Stonehenge © Historic England.NMR; Source Environment Agency

    Ground Penetrating Radar (GPR)

    Ground Penetrating Radar is a technology that is used within field archaeology to discover buried features. It works by recording reflected radio waves that have been transmitted into the ground. GPR can be used on areas such as concrete, stone and tarmac where other geophysical techniques won't work.

    MSc students from the University of Southampton carrying out a GPR survey in the vicinity of the Episcopio, between Portus and the Isola Sacra, Italy (https://kdstrutt.wordpress.com)

    The potential of having UAVs carry Ground Penetrating Radar recording equipment has already been tested in a number of fields, including the detection of IEDs (Improvised Explosive Devices) and mines and the characterisation of soil properties, and studies, including one at the University of Leicester, are looking into the potential of GPR-carrying UAVs in archaeological recording.

Kickstarter UAVs

With the change in the way that projects are funded, a number of drone/UAV (Unmanned Aerial Vehicle) systems have been funded on Kickstarter. These systems are generally designed to fill a perceived gap in the market, serving purposes ranging from wildlife conservation to tornado research.

A number of these have innovative technologies which can aid in the recording of archaeology/cultural heritage.

This blog entry is an evaluation/comparison of these various systems, charting their specifications and abilities to help with archaeological and cultural heritage recording.

Kickstarter Drone/UAV Flight Specifications
Name | No. of rotors | Control System | Flight time | Autopilot | Cost
AirDog | 4 | App and AirLeash | 10-20 mins | Pixhawk | $1,295 – $1,495
Anura Pocket Drone | 4 | App | 10 mins | Unknown | $275
Easy Done | 4 | RC Controller | 12-16 mins | Based upon APM 2.6 | $985 – $1,545
HERO+ | 6 | App | 15 mins | Pixhawk | $949.00 – $1,149.00
Incredible HLQ (Heavy Lift Quadcopter) | 4 | ? | ? | APM2.5+ | ?
MicroFly | 4 | Computer Wi-Fi | ? | ? | ?
Phenox | 4 | Voice and gesture | 5 mins | Voice and gesture controlled and programmable | ?
RObot | 4 (also a 4-wheel rover) | RC Controller | ? | None | ?
The Pocket Drone | 3 | RC Controller / App | 20 mins | Pixhawk | $549 – $599
ZANO | 4 | App | 10 – 15 mins | Unknown | £169.95

Kickstarter Drone/UAV Payload/Sensors Specifications
Name | Camera | Gimbal | Other Mounts | Sensors/Other abilities
AirDog | GoPro | 2 or 3 axis | None |
Anura Pocket Drone | 720×1280 HD camera | None | None | Fits in pocket
Easy Done | GoPro – can lift heavier cameras | None | None | Modular
HERO+ | GoPro | 2 or 3 axis | 360 Cam |
Incredible HLQ (Heavy Lift Quadcopter) | None, but can carry 50 pounds | None | None |
MicroFly | None | None | None | None
Phenox | 2 cameras to recognise the controller | None | None | Range sensor
RObot | None | None | None | None
The Pocket Drone | GoPro | None | None | Collapsible compact design
ZANO | OV5640 5MP camera | None | None | Swarm capability

Conclusions
The obvious conclusion can be drawn that the GoPro camera is the single most popular camera for use with UAVs, due to its size and capabilities. Of equal popularity is the series of open-source autopilot systems designed by 3D Robotics (APM and Pixhawk), which demonstrates the quality and abilities of these products.

It could be suggested that the bubble has at least partially burst on the funding of UAV (Unmanned Aerial Vehicle) projects on Kickstarter, judging from the number of new projects that have failed to reach their funding goal recently. Although some Kickstarter projects are probably not funded because they bring nothing new to the area of UAVs, some others appear to have innovative ideas and still receive little funding.

The potential of autopilot follow-me technology for the recording of site tours has been discussed in the reviews of the AirDog and HEXO+ systems. The swarm capability of the ZANO system brings forward the possibility of multiple UAV systems recording the subject matter from different angles at the same time, allowing the recording of significantly more visual data than with a single UAV.

New developments, including ultrasonic distance measurement, have been used to control the position of the drone and to provide the ability to avoid obstacles.

The micro and nano drone systems provide a means of recording a subject with systems that, in one case, can fit in a pocket and are otherwise easily portable and can be taken to the subject that requires recording. They could even record tight shaft trenches, with ultrasonic distance detection keeping the system away from the edges of the metal sheet piles.

Zano

The Zano is a nano drone funded as a Kickstarter project.

Amongst its features are:

  • 5 megapixel HD video camera
  • IR obstacle avoidance
  • Echo sounding sonar and high resolution air pressure sensor for altitude control
  • Follow me
  • Bidirectional motor control (Zano can drive motors in either direction)
  • iOS and Android compatible app
  • 10 – 15 minute flight time
  • 15 – 30 meter optimal operating range
  • 25 mph top speed
  • Can fly in “Free Flight” mode using on-screen joysticks; on-screen slide bars control rotation and altitude
  • ZANO will hold its position unless instructed otherwise
  • Tilt control of the Zano using a smart device
  • Automatic return to smart device

Other project developments include:

  • Tracking of a set target through image processing
  • Facial recognition capability
  • 360 and 180 degree Panoramics
  • Swarming capability

The app will also allow the purchase of future developments of the system.

The ZANO will cost £169.95.

Potential
Its different technologies allow a number of different applications, including site tours and recording videos of sites. The swarming capability opens up the possibility of recording sites with a number of different drones at the same time, while its size makes it very portable and easy to use in confined spaces, allowing the recording of areas that other drones may not be able to reach.

It comes with a two-part SDK (Software Development Kit) which allows the development of apps using the core functionality of the drone, and even control of the system using a VR headset.

Limitations
Its camera is quite low quality, so the photographic/video results may not be of high quality.

DJI SDK (Software Development Kit)

DJI Innovations has released an SDK (Software Development Kit) which allows developers and pilots access to the inner workings of some of their UAVs (Unmanned Aerial Vehicles).

The SDK is designed to integrate with the DJI Phantom Vision and Vision+ via iOS and Android apps at a greater level than with the controller. It provides access to the camera, long range video downlink, gimbal, flight status system, battery, Wi-Fi range extender, secure Wi-Fi transmission, GPS information, telemetry information and flight control data.

The SDK comes in levels 1 and 2. Level 2 adds abilities linked to the ground station and flight control, allowing long-range piloting complete with route planning and full flight telemetry, giving complete control over the flight of the UAV, what the camera does and what happens to the images once they have been taken.

It does not work with the standard Phantom 2 quadcopters as the gimbal and camera do not come with the system so are not controlled in the same way.

Business Partners

The Software Development Kit has already allowed collaboration between a number of business partners.

Pix4D
Pix4D is a software company that is one of the leading providers of UAV photogrammetry processing software; they have developed an app that allows the user to configure an automatic mapping flight pattern for the DJI Phantom Vision+. The area to map and the height are set, and the software calculates when to capture images with the camera for the optimal capture of the area under study. The app then guides the user in how best to process the images using the Pix4Dmapper software.

The images taken are automatically tagged and geo-tagged.

The BETA version of the software is currently free, but only for Android devices on the app store, although an iOS version will also be available soon. It supports the DJI Phantom Vision+, with support for other UAV systems being added later.

Drone Deploy
DroneDeploy have used the SDK to allow the integration of the UAV into the already existing DroneDeploy mapping system. This system allows the control of a whole fleet of UAVs remotely via a web interface, using a map interface to determine the area to be recorded together with the camera type and other information; it even takes wind speed into consideration.

Photographs are automatically uploaded to the cloud and processed into 3D models, Digital Elevation Models and Orthophotos. If an out of focus photograph is detected the UAV can automatically refly this part. The system can also integrate with other recording sensor systems.

DroneDeploy is compatible with many different UAV systems, and other systems can be upgraded to use it by adding a DroneDeploy CoPilot.


DroneDeploy CoPilot

PixiePath

PixiePath's Drone Fleet Management Platform is a cloud-based system which can build complex drone applications, managing the coordinated movement and activities of fleets of UAVs in real time.

PixiePath System Diagram

 

Field of View
Field of View offers a multispectral mapping payload for the DJI Phantom 2 and is increasing its integration with the Phantom via a customized app.