News – Flyt

The use of computer vision technologies to control UAV flight has been mentioned in some blogs already and will be discussed in more detail in a future blog. Computer vision can provide autonomous modes of flight that are difficult to replicate manually, allowing the use of a UAV in agriculture, inspections, surveys, delivery or emergency response.

Implementing computer vision technologies on UAVs has generally meant either proprietary technology or a complicated open source route, with hardware to set up and a number of pieces of software to install before experimentation can even begin.

But the Indian company Navstik Labs has developed a number of products to solve these problems.

FlytOS

The FlytOS operating system is an application development framework built upon Linux and ROS (Robot Operating System), meaning it integrates with ROS modules/libraries and sensors. It also supports the APM and PX4 (Pixhawk) open source autopilot systems.

The system allows the development of obstacle avoidance, autonomous landing with AR tags, and object recognition, tracking and following. Its object tracking can use simple OpenCV-based algorithms to detect objects by color and shape, with a Kalman filter for tracking. It can also incorporate OpenTLD (originally published in MATLAB by Zdenek Kalal) for selecting objects in a display and then following them.
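
As a rough illustration, a minimal OpenCV sketch of color-based detection combined with a Kalman filter for tracking might look like the following (the HSV color range, camera index and noise parameters are my own illustrative assumptions, not FlytOS code):

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter: state (x, y, dx, dy), measurement (x, y)
kalman = cv2.KalmanFilter(4, 2)
kalman.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
kalman.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
kalman.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3

cap = cv2.VideoCapture(0)  # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Detect by color: threshold an assumed HSV range for a red object
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 API
    prediction = kalman.predict()  # predicted position even when detection fails
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        centre = np.array([[x + w / 2], [y + h / 2]], np.float32)
        kalman.correct(centre)  # fuse the new measurement into the track
    cv2.circle(frame, (int(prediction[0, 0]), int(prediction[1, 0])), 5,
               (0, 255, 0), -1)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```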

FlytConsole, a web-based ground control station, and FlytVision are inbuilt on-board web apps that aid in the creation of applications; the framework also includes a 3D simulator called FlytSim.

The software can be downloaded for free and installed on an ODROID XU4 companion computer.

ODROID XU4

FlytPOD

The FlytPOD – Advanced Flight Computer is a companion computer system running FlytOS which is currently being funded through an Indiegogo campaign.

As well as coming with a suspended IMU for vibration damping and an external magnetometer, it also supports RTK (Real Time Kinematic) GPS.

Its USB 3.0, USB 2.0, HDMI and user-configurable I/O connectors support a number of systems out of the box, including a gimbal, PX4Flow (optical flow sensor), LiDAR (distance sensor) and USB cameras. The hardware interfaces also support a number of specialised sensors, including multi-spectral cameras, stereo cameras and LiDAR.

It is designed to process photographs on the companion computer and stream them to the ground station.

The system comes in two models:

  1. FlytPOD Kit – the standard model, with uSD storage. It costs $499 ($399 on Indiegogo).
  2. FlytPOD PRO Kit – This kit has the same features as the basic one but adds sensor redundancy, with triple 3-axis accelerometers and 3-axis gyroscopes as well as dual external magnetometers, barometers and external GPS. It also comes with faster eMMC storage. It costs $799 ($699 on Indiegogo).

The Indiegogo funding ends on the 3rd of October.

Autonomous Systems Launch Event – University of Southampton

On the 20th of March I attended the Launch Event for the Autonomous Systems USRG (University Strategic Research Group) at the University of Southampton.

It included a number of 3-minute presentations, some of which were very pertinent to the autonomous recording of archaeology and cultural heritage.

Control and Implementation of Autonomous Quadcopter in GPS-denied Environments – Chang Liu

Chang Liu is a PhD student in the Department of Engineering.

He has been working on using optical flow technology and ultrasonic sensors to control the velocity of UAVs with his own autopilot system in environments where GPS cannot be used.

He is currently perfecting a system using a monocular uEye camera and a quad-core Linux ODROID computer running ROS (Robot Operating System), with SLAM (Simultaneous Localization And Mapping) algorithms enabling the single camera to identify natural features and act as a more advanced position-hold technology.
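
As a rough sketch of how such a SLAM estimate might be fed back to an autopilot for position hold, the minimal rospy node below relays poses from a hypothetical /slam/pose topic into the MAVROS vision-pose input used with PX4 (the SLAM topic name is a placeholder, and this is not Chang Liu’s actual code):

```python
import rospy
from geometry_msgs.msg import PoseStamped

def relay(msg, pub):
    # Restamp and forward the SLAM pose so the autopilot's estimator can fuse it
    msg.header.stamp = rospy.Time.now()
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("slam_pose_relay")
    pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=10)
    # /slam/pose is a hypothetical topic assumed to be published by the SLAM node
    rospy.Subscriber("/slam/pose", PoseStamped, relay, callback_args=pub)
    rospy.spin()
```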

Chang Liu’s Autonomous Quadcopter

Chang Liu’s Analysis Software

Autonomous UAVs for Search and Rescue and Disaster Response

Dr. Luke Teacy of the Department of Electronics and Computer Science (ECS) discussed the use of autonomous UAVs in search and rescue and disaster response through coordination between multiple platforms. Their low cost and ease of deployment make them ideal for the purpose: a camera-equipped UAV can search for someone in the wilderness using computer vision to spot the person. He is using observation modelling to understand how the view of a person is affected by distance and how to maximise the information gained when searching for them. He also discussed how to control UAVs and allocate them to tasks using the Monte Carlo Tree Search algorithm, as well as path planning.
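
As a toy example of such an observation model, the sketch below assumes detection probability decays exponentially with distance and applies a Bayesian update to the belief that a person occupies a searched cell after each unsuccessful look (the decay constant and prior are my own assumptions, not values from the talk):

```python
import math

def detection_prob(distance_m, scale=50.0):
    # Assumed model: detection probability decays exponentially with distance
    return math.exp(-distance_m / scale)

def update_belief(prior, distance_m, seen):
    # Bayesian update of the belief that the person is in the searched cell
    p_det = detection_prob(distance_m)
    if seen:
        return 1.0  # person spotted: belief becomes certainty
    # Not seen: either absent, or present but missed with probability 1 - p_det
    missed = prior * (1 - p_det)
    return missed / (missed + (1 - prior))

belief = 0.5  # assumed prior that the person is in this cell
for d in (20, 40, 80):
    belief = update_belief(belief, d, seen=False)
    print(f"after a null observation at {d} m: belief = {belief:.3f}")
```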

Human Agent Collaboration for Multi-UAV Coordination

Dr. Sarvapali Ramchurn of the Department of Electronics and Computer Science (ECS) discussed the MOSAIC (Multiagent Collectives for Sensing Autonomy Intelligence and Control) project. His work involves a human agent allocating tasks to UAVs, which may have different capabilities. Once a task is set, the nearest drone moves to complete it, and teaming the UAVs up to accomplish tasks maximises efficiency. If a UAV fails, a new one takes its place, and when new tasks are allocated a UAV is reassigned to them. He discussed using the Max-Sum algorithm to coordinate tasks between the UAVs autonomously.
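
The real system coordinates the UAVs decentrally with Max-Sum, but the simple centralised sketch below illustrates the basic behaviour described above, with each new task going to the nearest available UAV (the positions and task names are invented for illustration):

```python
import math

# Invented example positions (x, y) for three UAVs and two tasks
uavs = {"uav1": (0.0, 0.0), "uav2": (10.0, 5.0), "uav3": (3.0, 8.0)}
tasks = [("survey_A", (9.0, 4.0)), ("survey_B", (1.0, 1.0))]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

available = dict(uavs)
for name, pos in tasks:
    # The nearest available drone moves to complete the new task
    nearest = min(available, key=lambda u: dist(available[u], pos))
    print(f"{name} -> {nearest}")
    del available[nearest]  # busy; on failure, reallocation would pick another UAV
```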

An intelligent, heuristic path planner for multiple agent unmanned air systems

Chris Crispin is a PhD student in Complex Systems Simulation and is part of the ASTRA environmental monitoring project, in which a group of unmanned vehicles coordinate with each other while mapping. Feasible, optimal flight paths are designated and searched along. Once the UAVs begin flying, a central computer assigns each area a level of uncertainty, which determines whether a UAV is sent there: the higher the uncertainty, the more likely a UAV will be dispatched to map it.
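
A toy sketch of that uncertainty-driven dispatch might look like the following, where the chance of sending a UAV to an area grows with the area’s uncertainty score (the areas and scores are invented for illustration):

```python
import random

# Invented uncertainty scores assigned to map areas by the central computer
uncertainty = {"area_A": 0.9, "area_B": 0.2, "area_C": 0.6}

for area, score in sorted(uncertainty.items(), key=lambda kv: -kv[1]):
    # Higher uncertainty -> higher probability that a UAV is dispatched
    if random.random() < score:
        print(f"dispatch UAV to map {area} (uncertainty {score})")
```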

The UAVs use an ODROID-C1 quad-core Linux computer with a PX4 autopilot, while the control computer is an ODROID-XU3. The system uses the JSBSim open source flight dynamics model (FDM).

https://sotonastra.wordpress.com/

Archaeology and autonomous systems

Dr. Fraser Sturt of the Department of Archaeology discussed the various potential applications of autonomous systems in archaeology. These included survey and site identification through high-resolution photographic and topographical mapping. He also discussed the benefits of multi-spectral imaging for seeing adobe (mud brick) structures, and how the results have shown that, rather than collapsing, the Nasca civilisation had moved to the coast. Next he discussed the potential of GPR (Ground Penetrating Radar) carried on UAVs, and finally noted that there are approximately 3 million shipwrecks worldwide which need to be studied or made stable.