News – Flyt

The use of computer vision technologies to control UAV flight has been mentioned in some previous posts and will be discussed in more detail in a future one. Computer vision can provide autonomous flight capabilities that are difficult to replicate manually, enabling the use of UAVs in agriculture, inspections, surveys, delivery and emergency response.

Implementing computer vision on UAVs has generally meant either proprietary technology or a complicated open source route, requiring hardware setup and the installation of several pieces of software before experimentation can even begin.

The Indian company Navstik Labs has developed a number of products to solve these problems.

FlytOS

FlytOS is an application development framework built on Linux and ROS (Robot Operating System), giving it access to ROS modules, libraries and sensor drivers. It also supports the APM and PX4 (Pixhawk) open source autopilot systems.

The system allows the development of obstacle avoidance, autonomous landing on AR tags, and object recognition, tracking and following. Its object tracking can use simple OpenCV-based algorithms to detect objects by color and shape, with a Kalman filter for tracking. It can also incorporate OpenTLD (originally published in MATLAB by Zdenek Kalal) for selecting an object on screen and then following it.
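The detect-by-color-then-track-with-a-Kalman-filter pipeline described above can be sketched roughly as follows. This is a minimal NumPy illustration, not FlytOS code: in a real pipeline the binary mask would come from something like `cv2.inRange` on an HSV frame, and the filter would be tuned to the camera's frame rate.

```python
import numpy as np

def detect_centroid(mask):
    """Centroid of the 'on' pixels in a binary color mask (None if empty)."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return np.array([xs.mean(), ys.mean()])

class KalmanTracker:
    """Constant-velocity Kalman filter over state [x, y, vx, vy]."""
    def __init__(self, q=1e-2, r=1.0):
        self.x = None                      # state, initialised on first detection
        self.P = np.eye(4) * 10.0          # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0  # position += velocity each frame
        self.H = np.eye(2, 4)              # we observe position only
        self.Q = np.eye(4) * q             # process noise
        self.R = np.eye(2) * r             # measurement noise

    def update(self, z):
        """Fuse one centroid measurement z = [x, y]; return smoothed position."""
        if self.x is None:
            self.x = np.array([z[0], z[1], 0.0, 0.0])
            return self.x[:2]
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correct
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

The filter lets the tracker coast through frames where the color detection briefly fails, which is the main reason it is paired with simple color/shape detectors.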

FlytConsole and FlytVision are built-in, on-board web apps that aid in the creation of applications.

It also comes with a web-based ground control station, FlytConsole, and a 3D simulator, FlytSim.

The software can be downloaded for free and installed on an ODROID XU4 companion computer.

ODROID XU4

FlytPOD

The FlytPOD advanced flight computer is a companion computer system running FlytOS, currently being funded through an Indiegogo campaign.

As well as a suspended IMU for vibration damping and an external magnetometer, it also supports RTK (Real Time Kinematic) GPS.

Its USB 3.0, USB 2.0, HDMI and user-configurable I/O connectors support a number of systems out of the box, including gimbals, the PX4Flow optical flow sensor, LiDAR distance sensors and USB cameras. The hardware interfaces also support a number of specialized sensors, including multi-spectral cameras, stereo cameras and LiDAR.

It is designed to process photographs on the companion computer and stream them to the ground.

The system comes in two models:

  1. FlytPOD Kit – comes with microSD storage. It costs $499 ($399 on Indiegogo).
  2. FlytPOD PRO Kit – has the same features as the basic kit but adds sensor redundancy, with triple 3-axis accelerometers and 3-axis gyroscopes as well as dual external magnetometers, barometers and external GPS receivers. It also comes with faster eMMC storage. It costs $799 ($699 on Indiegogo).

The Indiegogo campaign ends on 3rd October.

UAVs for site tour recording – Part 1 – Theory

Thanks to UAVs there is growing potential for providing high-quality aerial visualizations of sites for public consumption; whether to meet the public-engagement obligations many archaeology companies have as charities, as part of planning policies requiring interaction with the public, or for the growing number of crowdfunded archaeological excavations (such as DigVentures) which need to engage their backers. UAVs can provide this sort of imagery as part of an overall recording strategy. This includes the recording of site tours, which can provide details of a site in a form easily disseminated to the public.

At its simplest, the UAV can add an aerial element to the video of a site tour by flying past or through elements of the site, or by flying past or hovering in front of the tour guide.

The DJI Inspire 1 is one such aerial video platform, and can be purchased with two remote controllers: one for controlling the UAV, while the other controls the camera gimbal. This allows a pilot to fly the UAV on a set path while someone experienced in filmmaking has complete control of the camera.

DJI Inspire 1

Although the UAV can provide an excellent platform for aerial video recording as part of site tours, recently developed technologies can automate much of this, allowing one person to act as both:

  1. The site tour guide.
  2. The UAV pilot recording the site tour.

There are two ways in which this can be done.

1. GPS ‘Follow Me’ technology

'Follow Me' technology (DroneDog using Pixhawk)

‘Follow Me’ technology (DroneDog using Pixhawk)

This functionality is available on many UAVs, including some of the DJI series and those using the open source PX4 and Pixhawk autopilot technologies.

With the PX4/Pixhawk systems the mode can be controlled from a number of ground station applications, including Tower, which runs on Android mobile devices such as smartphones.

The system uses the GPS of the mobile device as the target for the UAV.

A number of cinematic controls for the UAV are available in the app:

  • Leash – UAV follows actor.
  • Lead – UAV leads actor pointing back at them.
  • Left/Right – UAV keeps pace with actor to the side.
  • Circle – UAV circles actor at specified radius.
'Follow Me' controls (3DR Tower)
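The four cinematic modes all reduce to placing the UAV at some bearing and distance from the actor's GPS fix. The sketch below illustrates that geometry in plain Python; the mode names mirror the list above, but the function itself is an assumption for illustration, not Tower's actual implementation.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def offset(lat, lon, bearing_deg, dist_m):
    """Destination lat/lon from a start point, bearing and distance
    (flat-earth approximation, fine for the short ranges involved)."""
    dlat = dist_m * math.cos(math.radians(bearing_deg)) / EARTH_R
    dlon = (dist_m * math.sin(math.radians(bearing_deg))
            / (EARTH_R * math.cos(math.radians(lat))))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

def follow_target(actor_lat, actor_lon, actor_heading, mode,
                  radius_m, angle_deg=0.0):
    """UAV target position for the four 'Follow Me' styles."""
    if mode == "leash":    # trail behind the actor
        return offset(actor_lat, actor_lon, actor_heading + 180.0, radius_m)
    if mode == "lead":     # fly ahead, camera pointing back
        return offset(actor_lat, actor_lon, actor_heading, radius_m)
    if mode == "left":     # keep pace to the actor's left
        return offset(actor_lat, actor_lon, actor_heading - 90.0, radius_m)
    if mode == "right":    # keep pace to the actor's right
        return offset(actor_lat, actor_lon, actor_heading + 90.0, radius_m)
    if mode == "circle":   # orbit position at angle_deg around the actor
        return offset(actor_lat, actor_lon, angle_deg, radius_m)
    raise ValueError(f"unknown mode: {mode}")
```

In "circle" mode the ground station would simply advance `angle_deg` each update to sweep the UAV around the actor at the chosen radius.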

The following parameters can also be set:

  • Altitude.
  • Radius.
3DR Tower - Altitude and Radius

3DR Tower – Altitude and Radius

The system also controls the camera gimbal, pointing the camera towards the GPS-enabled device.
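Pointing the camera at the device reduces to computing a yaw bearing and a gimbal pitch from the two GPS positions and the altitude difference. A minimal sketch of that calculation (my own illustration of the geometry, not the autopilot's code):

```python
import math

def gimbal_angles(uav_lat, uav_lon, uav_alt, tgt_lat, tgt_lon, tgt_alt=0.0):
    """Yaw bearing (degrees from north) and gimbal pitch (degrees,
    negative = down) needed to aim the camera at a GPS target.
    Flat-earth approximation, adequate over site-tour distances."""
    R = 6_371_000.0  # mean Earth radius, metres
    north = math.radians(tgt_lat - uav_lat) * R
    east = math.radians(tgt_lon - uav_lon) * R * math.cos(math.radians(uav_lat))
    yaw = math.degrees(math.atan2(east, north)) % 360.0
    ground = math.hypot(north, east)           # horizontal distance
    pitch = math.degrees(math.atan2(tgt_alt - uav_alt, ground))
    return yaw, pitch
```

Re-running this on every GPS update is what keeps the tour guide framed as both the UAV and the guide move.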

Together these controls can provide various aerial video elements useful for a site tour video, all controlled directly from the mobile device in the hand of the tour guide.

2. Computer vision technologies

Computer Vision technologies are an important developing area in robotics and are beginning to be fitted to UAVs.

Some of these technologies use image recognition algorithms to match the subject between consecutive video frames, allowing the UAV to follow a person or object even as it rotates and changes appearance.
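The frame-to-frame matching idea can be illustrated with a plain normalized cross-correlation template search, written here in NumPy. This is a deliberately simple stand-in: real trackers such as OpenTLD add online learning and re-detection on top of matching like this.

```python
import numpy as np

def track_template(prev_frame, box, next_frame, search=5):
    """Re-locate the patch box = (x, y, w, h) from prev_frame in next_frame
    by exhaustive normalized cross-correlation over a +/- search-pixel
    window. Returns the best-matching box in next_frame."""
    x, y, w, h = box
    tpl = prev_frame[y:y+h, x:x+w].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-9)   # zero-mean, unit-variance
    best_score, best_box = -np.inf, box
    H, W = next_frame.shape
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if nx < 0 or ny < 0 or nx + w > W or ny + h > H:
                continue                             # window off the frame
            cand = next_frame[ny:ny+h, nx:nx+w].astype(float)
            cand = (cand - cand.mean()) / (cand.std() + 1e-9)
            score = float((tpl * cand).mean())       # correlation coefficient
            if score > best_score:
                best_score, best_box = score, (nx, ny, w, h)
    return best_box
```

Because the correlation is normalized, the match is robust to overall brightness changes between frames; handling rotation and scale is what the more sophisticated trackers add.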

They come in three forms:

A. Software

Currently in beta testing, the Vertical Studio app (available on iOS and Android) uses the existing camera hardware on the DJI Phantom 3 or Inspire to provide imagery for the image recognition algorithms running in the app. A target is chosen in the app, which then controls the flight of the UAV.

Vertical Studio App

You can also draw walls in the app to designate no-fly areas for the UAV.

Walls in the Vertical Studio App
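Conceptually, a set of no-fly walls is a geofence, and the core check is a point-in-polygon test. A minimal ray-casting sketch (an assumed representation of the zones, not the Vertical Studio implementation):

```python
def inside_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # does a horizontal ray cast rightward from (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def allowed(x, y, no_fly_zones):
    """True if position (x, y) lies outside every no-fly polygon."""
    return not any(inside_polygon(x, y, zone) for zone in no_fly_zones)
```

In practice the app would run a check like this against the planned flight path, refusing or re-routing commands that would carry the UAV across a wall.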

B. Add-on technology

The second is an add-on technology fitted to an existing UAV, which connects to the autopilot and controls the flight of the UAV. In the case of the Percepto (funded on the Indiegogo crowdfunding website), the processing is done on a companion computer using video from an add-on camera; commands are then sent to the autopilot and gimbal to control their movement relative to the subject.
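The control loop in such a system is essentially visual servoing: the tracker reports where the subject sits in the frame, and the companion computer converts the pixel error into yaw and gimbal commands that recentre it. A toy proportional-control sketch (the gains and function name are illustrative assumptions, not Percepto's API):

```python
def servo_commands(cx, cy, frame_w, frame_h, k_yaw=0.8, k_pitch=0.5):
    """Map the tracked target's pixel position (cx, cy) to a yaw-rate and
    gimbal pitch-rate command that recentre it (proportional control)."""
    # normalised error in [-1, 1]; zero when the target is centred
    ex = (cx - frame_w / 2) / (frame_w / 2)
    ey = (cy - frame_h / 2) / (frame_h / 2)
    # yaw right when the target is right of centre;
    # tilt up (positive pitch rate) when the target is above centre
    return k_yaw * ex, -k_pitch * ey
```

Each video frame, the add-on runs its tracker, feeds the resulting centroid through a loop like this, and sends the commands to the autopilot and gimbal.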

Percepto Tracking

Percepto Kit

C. Integrated technology

The third is an integral part of a newly built UAV, but is in effect very similar technology to B.

This is the case with the soon-to-be-released DJI Phantom 4, the first commercially available UAV with the technology integrated into it.


The app connects to a companion computer on the UAV, which uses imagery from the camera as the source for its computer vision algorithms. Once again the subject is selected in the app and the UAV will follow it.

Phantom 4 App

Sources
https://3dr.com/kb/follow-instructions/

http://www.dji.com/product/phantom-4

http://www.dji.com/product/intelligent-flight-modes

http://vertical.ai/features/

http://www.percepto.co/