UAVs for site tour recording – Part 1 – Theory

Thanks to UAVs there is growing potential for providing high quality aerial visualizations of sites for public consumption; whether as part of the public-benefit requirements many archaeology companies have as charities, as part of planning policies requiring engagement with the public, or through the growing importance of crowdfunded archaeological excavations (such as DigVentures), which require interaction with their backers. UAVs can supply this sort of imagery as part of an overall recording strategy. This includes the recording of site tours, which provide details of a site that can easily be disseminated to the public.

At its simplest, the UAV can add an aerial element to the site tour video by flying past or through elements of the site, or by flying past or hovering in front of the site tour guide.

The DJI Inspire 1 is one such aerial video platform; it can be purchased with two remote controllers, one for controlling the UAV and the other for controlling the camera gimbal. This allows a pilot to fly the UAV on a set path while someone experienced in film making has complete control of the camera.

DJI Inspire 1

Although the UAV can provide an excellent platform for aerial video recording as part of site tours, recently developed technologies can make this much more automated and allow one person to act as both:

  1. The site tour guide.
  2. The UAV pilot recording the site tour.

There are two ways in which this can be done.

1. GPS ‘Follow Me’ technology

‘Follow Me’ technology (DroneDog using Pixhawk)

This functionality is available on many UAVs, including some of the DJI series and those using the open source PX4 and Pixhawk autopilot technologies.

With the PX4/Pixhawk systems the mode can be controlled from a number of base station software solutions including Tower, which can run on Android mobile devices such as smartphones.

The system uses the GPS of the mobile device as a target for the UAV.

A number of cinematic controls for the UAV are available in the app:

  • Leash – UAV follows actor.
  • Lead – UAV leads actor pointing back at them.
  • Left/Right – UAV keeps pace with actor to the side.
  • Circle – UAV circles actor at specified radius.
‘Follow Me’ controls (3DR Tower)

The following parameters can also be set:

  • Altitude.
  • Radius.
3DR Tower – Altitude and Radius

The system also controls the camera gimbal, pointing the camera towards the GPS-enabled device.
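As a rough illustration of how this works under the hood, the sketch below uses DroneKit-Python (which talks to PX4/Pixhawk autopilots over MAVLink) to steer the UAV towards the operator's phone and keep the camera pointed at it. The connection string and the get_operator_fix() helper are assumptions for illustration, standing in for however the app delivers the phone's GPS fixes.

```python
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative
from pymavlink import mavutil

# Connection string is an assumption; it depends on the telemetry link used
vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)
vehicle.mode = VehicleMode("GUIDED")

FOLLOW_ALT = 15  # metres; corresponds to the app's 'Altitude' setting

def point_camera_at(lat, lon):
    # MAV_CMD_DO_SET_ROI asks the autopilot to aim the gimbal at a location
    msg = vehicle.message_factory.command_long_encode(
        0, 0,                                   # target system, component
        mavutil.mavlink.MAV_CMD_DO_SET_ROI, 0,  # command, confirmation
        0, 0, 0, 0,                             # unused parameters
        lat, lon, 0)                            # region of interest
    vehicle.send_mavlink(msg)

while True:
    lat, lon = get_operator_fix()  # hypothetical: GPS fix from the phone
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, FOLLOW_ALT))
    point_camera_at(lat, lon)
    time.sleep(1)  # resend the target once a second
```

The ‘Leash’, ‘Lead’ and ‘Left/Right’ behaviours are then essentially fixed offsets applied to the target position before simple_goto is called.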

Together these controls can provide various aerial video elements useful for integration into a site tour video, all controlled directly from the mobile device in the hand of the site tour guide.

2. Computer vision technologies

Computer Vision technologies are an important developing area in robotics and are beginning to be fitted to UAVs.

Some of these technologies use image recognition algorithms to match the subject between consecutive video frames, allowing the UAV to follow a person or object even as it rotates and changes its appearance.
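The commercial implementations are proprietary, but the core idea can be sketched with OpenCV's built-in object trackers. Here the CSRT tracker re-locates a manually selected subject in each new frame; depending on the OpenCV build, the constructor may live under cv2.legacy instead.

```python
import cv2

cap = cv2.VideoCapture("site_tour.mp4")  # placeholder: the UAV's video feed
ok, frame = cap.read()

# Select the subject once in the first frame
bbox = cv2.selectROI("Select subject", frame, showCrosshair=False)

# CSRT copes reasonably well with rotation and appearance changes
tracker = cv2.TrackerCSRT_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = (int(v) for v in bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Offset of the subject from the frame centre; a flight controller
        # would convert this error into yaw/gimbal/position corrections
        err_x = (x + w / 2) - frame.shape[1] / 2
    cv2.imshow("Tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
```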

They come in three forms:

A. Software

Currently in beta testing, the Vertical Studio app (available on iOS and Android) uses the existing camera hardware on the DJI Phantom 3 or Inspire to provide the imagery for the image recognition algorithms running in the app. A target is chosen in the app, which then controls the flight of the UAV.

Vertical Studio App

You can also draw walls in the app that designate no-fly areas for the UAV.

Walls in the Vertical Studio App
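Vertical Studio's implementation is not published, but a no-fly ‘wall’ ultimately comes down to a geofence test before each movement. A minimal sketch using the standard ray-casting point-in-polygon check:

```python
def inside_no_fly_zone(x, y, polygon):
    """Ray-casting point-in-polygon test.

    polygon is a list of (x, y) vertices. If a proposed position falls
    inside, the flight controller should refuse to move there.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example: a square no-fly area drawn around a fragile structure
zone = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(inside_no_fly_zone(5, 5, zone))   # True - blocked
print(inside_no_fly_zone(15, 5, zone))  # False - clear
```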

B. Add-on technology

The second is an add-on technology fitted to an existing UAV, which connects to the autopilot and controls the flight of the UAV. In the case of the Percepto (funded on the Indiegogo crowdfunding website), the processing is done on a companion computer and the video is taken from an add-on camera; controls are then sent to the autopilot and gimbal to move them in relation to the subject.

Percepto Tracking

Percepto Kit
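Percepto's own protocol is not public, but on PX4/Pixhawk systems a companion computer typically streams MAVLink setpoints to the autopilot. A sketch of that pattern with pymavlink, where the tracker's output has already been converted into a desired velocity (the serial port is an assumption):

```python
from pymavlink import mavutil

# Serial link from the companion computer to the autopilot
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=921600)
master.wait_heartbeat()

def send_velocity(vx, vy, vz):
    """Stream a velocity setpoint (m/s, NED frame) to the autopilot.

    The type_mask enables only the velocity fields; position and
    acceleration entries are ignored.
    """
    master.mav.set_position_target_local_ned_send(
        0,                                    # timestamp (0 = now)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111000111,                   # velocity-only type mask
        0, 0, 0,                              # position (ignored)
        vx, vy, vz,                           # velocity
        0, 0, 0,                              # acceleration (ignored)
        0, 0)                                 # yaw, yaw rate (ignored)

# e.g. the tracker reports the subject drifting east: keep pace with it
send_velocity(0, 1.0, 0)
```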

C. Integrated technology

The third is an integral part of a newly built UAV, but is in effect a very similar technology to B.

This is the case with the soon-to-be-released DJI Phantom 4, which is the first commercially available UAV with the technology integrated into it.


The app connects to a companion computer on the UAV which uses the imagery from the camera as a source for the computer vision algorithms. Once again the subject matter is selected in the app and the UAV will follow it.

Phantom 4 App

Sources

https://3dr.com/kb/follow-instructions/

http://www.dji.com/product/phantom-4

http://www.dji.com/product/intelligent-flight-modes

http://vertical.ai/features/

http://www.percepto.co/

Autonomous Systems Launch Event – University of Southampton

On the 20th of March I was present at the Launch Event for the Autonomous Systems USRG (University Strategic Research Group) at the University of Southampton.

It included a number of 3-minute presentations, some of which were very pertinent to the autonomous recording of Archaeology and Cultural Heritage.

Control and Implementation of Autonomous Quadcopter in GPS-denied Environments – Chang Liu

Chang Liu is a PhD student in the Department of Engineering.

He has been working on using optical flow technology and ultrasonic sensors to control the velocity of UAVs with his own autopilot system in environments where GPS cannot be used.

He is currently perfecting a system using a monocular uEye camera and a quad-core Linux ODROID computer running ROS (Robot Operating System), with SLAM (Simultaneous Localization And Mapping) algorithms enabling the single camera to identify natural features and act as a more advanced position-hold technology.
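Chang Liu's own code is not published, but the natural-feature step at the front of a monocular SLAM pipeline can be illustrated with OpenCV's ORB detector: features are found in consecutive frames and matched, and their displacement is what lets the system estimate how the camera has moved.

```python
import cv2

# Two consecutive frames from the monocular camera (filenames are placeholders)
frame1 = cv2.imread('frame_0001.png', cv2.IMREAD_GRAYSCALE)
frame2 = cv2.imread('frame_0002.png', cv2.IMREAD_GRAYSCALE)

# Detect ORB keypoints (natural features) and compute binary descriptors
orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)

# Match features between the frames; their displacement feeds the SLAM
# back end that estimates camera motion and maps the environment
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} feature matches between frames")
```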

Chang Liu’s Autonomous Quadcopter

Chang Liu’s Analysis Software

Autonomous UAVs for Search and Rescue and Disaster Response

Dr. Luke Teacy of the Department of Electronics and Computer Science (ECS) discussed the use of autonomous UAVs in search and rescue and disaster response through coordination between multiple platforms. Their low cost and ease of deployment make them ideal for the purpose. A camera-equipped UAV can search for someone in the wilderness using computer vision to spot the person. He is using observation modelling to examine how the view of a person is affected by distance and how to maximise the information required to find a person, and he discussed how to control UAVs and allocate them to tasks using the Monte Carlo Tree Search algorithm, as well as path planning.
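Monte Carlo Tree Search builds a tree of candidate action sequences by repeated simulation; its heart is the UCB1 rule, which trades off actions with good average reward against rarely-tried ones. A minimal sketch of that selection step (the node structure is hypothetical):

```python
import math

def uct_select(node, c=1.4):
    """Pick the child with the best UCB1 score.

    Each child carries a visit count and a cumulative reward; the
    second term grows for rarely-visited children, forcing exploration.
    """
    return max(
        node.children,
        key=lambda ch: ch.total_reward / ch.visits
        + c * math.sqrt(math.log(node.visits) / ch.visits),
    )
```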

Human Agent Collaboration for Multi-UAV Coordination

Dr. Sarvapali Ramchurn of the Department of Electronics and Computer Science (ECS) discussed the MOSAIC (Multiagent Collectives for Sensing, Autonomy, Intelligence and Control) Project. His work involves a human agent allocating tasks to UAVs, which may have different capabilities. Once a task is set, the nearest drone moves to complete it, and teaming the UAVs up to accomplish tasks maximises efficiency. If a UAV fails, a new one takes its place, and when new tasks are allocated a UAV is reassigned accordingly. He discussed using the Max-Sum algorithm to coordinate tasks between the UAVs autonomously.
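The ‘nearest drone’ rule is simple to sketch in its greedy form, shown below; in the MOSAIC work the Max-Sum algorithm replaces such centralised greedy choices with decentralised message passing between the UAVs. The data layout here is purely illustrative.

```python
import math

def assign_task(task, uavs):
    """Greedy allocation: the nearest available UAV with the right capabilities."""
    candidates = [u for u in uavs
                  if not u['busy'] and task['needs'] <= u['capabilities']]
    return min(candidates,
               key=lambda u: math.dist(u['pos'], task['pos']),
               default=None)  # None if no UAV can take the task

uavs = [
    {'pos': (0, 0), 'capabilities': {'camera'}, 'busy': False},
    {'pos': (5, 5), 'capabilities': {'camera', 'lidar'}, 'busy': False},
]
task = {'pos': (6, 6), 'needs': {'lidar'}}
print(assign_task(task, uavs))  # the second UAV: nearest one with a lidar
```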

An intelligent, heuristic path planner for multiple agent unmanned air systems

Chris Crispin is a PhD student in Complex Systems Simulation and is part of the ASTRA environmental monitoring project, which involves a group of unmanned vehicles coordinating with each other in mapping. Feasible optimal flight paths are designated and searched along. Once the UAVs begin flying, areas are assigned a level of uncertainty by a central computer, which determines whether a UAV is sent to a given area: the higher the uncertainty, the more likely a UAV will be dispatched to map it.
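The dispatch rule can be sketched as an uncertainty-weighted choice (a toy stand-in for ASTRA's actual heuristic planner; the area names are invented):

```python
import random

def pick_survey_area(areas):
    """Choose the next area to map, weighted by uncertainty.

    areas maps area id -> uncertainty score; higher uncertainty makes an
    area proportionally more likely to have a UAV dispatched to it.
    """
    ids, weights = zip(*areas.items())
    return random.choices(ids, weights=weights, k=1)[0]

areas = {'trench_north': 0.9, 'trench_south': 0.2, 'riverbank': 0.6}
print(pick_survey_area(areas))  # most often 'trench_north'
```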

The UAVs use an ODROID-C1 quad-core Linux computer with a PX4 autopilot, while the control computer is an ODROID-XU3. The system uses the JSBSim open source flight dynamics model (FDM).

https://sotonastra.wordpress.com/

Archaeology and autonomous systems

Dr. Fraser Sturt of the Department of Archaeology discussed various potential applications of autonomous systems in archaeology. These included survey and site identification through high resolution photographic and topographical mapping. He also discussed the benefits of multi-spectral imaging in detecting adobe (mud brick) structures, and how the results have shown that, rather than collapsing, the Nasca civilisation had moved to the coast. Next he discussed the potential of GPR (Ground Penetrating Radar) carried on UAVs, and finally the fact that there are approximately 3 million shipwrecks worldwide which need to be studied or stabilised.