The open source Dronecode Project has been announced under the auspices of the Linux Foundation. It will bring together existing projects, including the APM/ArduPilot and PX4 open source autopilot systems, as well as advancing new technologies, providing a common platform for drone and robotics open source projects and aiming to unite the open source industry.
The maker community has already dramatically increased the development of drones, and the Dronecode Project hopes to advance the required technologies, both improving them and making them more affordable.
The Linux Foundation provides an existing organisation and collaborative framework, allowing the Dronecode Project to concentrate on the innovation of new technology.
Google Cardboard is a pair of VR (Virtual Reality) goggles made out of cardboard, using an Android smartphone as the central processing unit and display via the Google Cardboard app. The cardboard shell can either be purchased for less than £10, complete with lenses and magnet button control, or the cardboard design can be downloaded from the website and the other parts purchased separately.
The mobile phone provides orientation tracking using its built-in gyroscope and accelerometer, which means that the app can track the movement of the user’s head, updating the imagery on the phone depending on the direction the user is facing.
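At its simplest, this kind of head tracking blends the two sensors: the gyroscope reacts quickly but drifts, while the accelerometer's gravity reading is slow but drift-free. A minimal complementary-filter sketch in Python (my own illustration of the general technique, not the Cardboard app's actual code; the function names and the 0.98 blend factor are assumptions):

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (radians) recovered from the gravity vector the
    accelerometer measures while the head is roughly still."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def fuse(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Complementary filter: integrate the fast-but-drifting gyro,
    then pull the result gently towards the drift-free accelerometer
    angle so the estimate does not wander over time."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Each frame, the app would call `fuse` with the latest gyro rate and the accelerometer-derived angle, then rotate the rendered scene accordingly.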
A magnetic trigger on the outside of the cardboard allows interaction with the VR environment by affecting the magnetometer in the phone. With calibrated magnetometers this can only act as a single button, but with the uncalibrated magnetometers incorporated in newer phones there is a greater variety of possibilities, including incorporating a joypad into the outside of the case.
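The single-button case amounts to watching the magnetometer for a sudden jump in field strength when the magnet is pulled. A rough sketch of that idea (the threshold, units and baseline-tracking scheme here are illustrative assumptions, not taken from the Cardboard app):

```python
def detect_presses(magnitudes_ut, threshold_ut=80.0):
    """Return sample indices where the magnetic field magnitude (µT)
    jumps more than threshold_ut above a slowly-adapting baseline."""
    baseline = magnitudes_ut[0]
    presses = []
    for i in range(1, len(magnitudes_ut)):
        sample = magnitudes_ut[i]
        if sample - baseline > threshold_ut:
            presses.append(i)  # sharp spike: magnet pulled
        else:
            # track slow environmental drift so it is not mistaken for a press
            baseline = 0.9 * baseline + 0.1 * sample
    return presses
```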
The Google Camera app (on Android 4.4 and later) can record 360° Photo Spheres, which can easily be viewed in the Google Cardboard app on the smartphone; other Photo Spheres can be viewed by editing their file names. The app can also view videos on YouTube, including those designed for the Oculus Rift or other VR gear with two separate views in the video. Integration with Google Earth and Google Maps Street View is also possible.
Two SDKs (Software Development Kits) can be downloaded from the website:
- The first is the Cardboard SDK for Android which allows VR applications to be quickly created in OpenGL.
- The second is the Cardboard SDK for Unity, which allows an application created in the Unity 3D game engine to be viewed in the Google Cardboard goggles, or a new one to be designed from scratch.
Although designed for phones with the Android operating system, phones using iOS can also be used in the Google Cardboard via apps for the Durovis Dive (a plastic VR goggle frame which also uses smartphones).
Google Cardboard was designed both to give almost anyone a cheap and easy way to view VR and to help push forward development of such systems.
It can be used both to view virtual reconstructions of sites and to view still 360° photographs and immersive videos of sites; these can easily be downloaded and viewed by anyone, anywhere in the world, using the technology. The fact that the phone can also be used to create 360° Photo Spheres enables both the recording and the viewing of cultural heritage sites and excavations with technology that may already be owned.
Because it uses a smartphone, there are limitations to its abilities that would not exist with more powerful computer systems. The quality of the imagery is also completely dependent on the quality of the smartphone's screen.
Although control is limited to the one button on the outside of the Google Cardboard, some Wi-Fi/Bluetooth game controllers can be used with the Android operating system, allowing much more interaction. There have, however, been problems with the button working, particularly on certain models; it is, after all, a technical workaround that uses a device for a function it was not designed for.
The Pixhawk is a high-performance autopilot system designed by the PX4 open-hardware project and manufactured by 3D Robotics that can be used with fixed-wing aircraft, multi-rotor aircraft, cars, boats and any other autonomous vehicle. It is designed for everything from research and amateur use to industry.
It is a combination of the PX4FMU Autopilot / Flight Management Unit and the PX4IO Airplane/Rover Servo and I/O Module previously created by PX4.
The Pixhawk includes the following sensors:
- ST Micro L3GD20H 16 bit gyroscope.
- ST Micro LSM303D 14 bit accelerometer / magnetometer.
- MEAS MS5611 barometer.
It also allows the connection of a number of other useful external devices.
It is controlled by an app available on a number of different hardware platforms: Mission Planner (Windows), APM Planner (Windows, OS X, and Linux), or DroidPlanner 2 (Android).
Aerial flight paths can easily be set in the software by clicking waypoints on a map of the area.
By selecting a polygon around the area, the software can create a grid flight pattern to fly; the altitude, camera type and overlap of images can also be set, which alters the number of times the aerial vehicle flies across the area under study. This allows either an image mosaic or a photogrammetry model to be created.
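The core of that grid generation is choosing a line spacing from the camera's ground footprint and the desired overlap. A simplified sketch of the geometry (my own illustration for a rectangular area; the real planners also handle polygon clipping, turns and front-overlap along each line):

```python
import math

def image_footprint_width(altitude_m, fov_deg):
    """Ground width covered by one image, assuming a simple pinhole
    camera with the given horizontal field of view (illustrative)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def line_offsets(area_depth_m, footprint_w_m, side_overlap=0.7):
    """Offsets of parallel flight lines across the area so adjacent
    image strips overlap by side_overlap (fraction of the footprint)."""
    spacing = footprint_w_m * (1.0 - side_overlap)
    offsets, y = [], 0.0
    while y <= area_depth_m + 1e-9:
        offsets.append(round(y, 6))
        y += spacing
    return offsets
```

Higher overlap or lower altitude shrinks the spacing, which is exactly why those settings change how many passes the vehicle flies.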
The 3DR Radio Set allows wireless communication between the Pixhawk and an Android device using the DroidPlanner or Andropilot ground station app, while the inclusion of a Bluetooth data link also allows an Android device with a ground station app and Bluetooth to connect to the Pixhawk. Both options also enable the software's Follow Me mode, in which the autopilot follows the device running the app.
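Follow Me mode essentially turns the phone's GPS fix into a moving waypoint. A minimal sketch of the underlying arithmetic (my own illustration, not the ground station apps' actual code; the helper name and the flat-Earth approximation are assumptions that only hold for short offsets):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; adequate for small offsets

def offset_position(lat, lon, north_m, east_m):
    """Shift a latitude/longitude by metres north and east using a
    small-offset flat-Earth approximation."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# e.g. keep the aircraft 10 m north of the operator's reported position
target = offset_position(51.5000, -0.1200, 10.0, 0.0)
```

Each time the phone reports a new fix, the target waypoint is recomputed and sent to the autopilot over the radio or Bluetooth link.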
As it is an open hardware project the schematics can be downloaded.
The Pixhawk costs $199.99 in its basic form, or $474.97 with all of the standard available options.
The Pixhawk has great potential for the control of any type of autonomous vehicle, whether flying mapping missions, flying a pre-determined course to record features, or recording site tours in “follow-me” mode. The fact that it is part of an open-source community means that it is continually in development, with input from the people who are using the technology.
It has become so popular in the industry that it is the technology used in a number of Kickstarter UAV projects, including the AirDog and HEXO+, as well as 3D Robotics’ IRIS+ quadcopter.
The community provides extensive instructions for the system and its uses.
The OpenROV Kickstarter project is an open-source underwater robot for exploration and education.
It can reach a depth of 100 meters/328 feet of seawater, which is more than double the depth that recreational SCUBA divers can reach, and the battery can last up to 2 hours.
The system is propelled by two horizontal thrusters at the rear of the ROV (remotely operated underwater vehicle), which allow it to move forward and aft as well as rotate, and a vertical thruster which allows the vehicle to change depth.
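This two-plus-one thruster layout translates pilot commands into motor outputs with a simple differential mix. A hedged Python sketch of the idea (names and the [-1, 1] command ranges are my own, not the OpenROV firmware's):

```python
def clamp(x, lo=-1.0, hi=1.0):
    """Keep a command within the motors' usable range."""
    return max(lo, min(hi, x))

def mix_thrusters(forward, yaw, vertical):
    """Map forward/yaw/vertical commands (each in [-1, 1]) onto the two
    horizontal thrusters and the single vertical thruster.  Equal port
    and starboard output drives straight; opposite output rotates."""
    port = clamp(forward + yaw)
    starboard = clamp(forward - yaw)
    return port, starboard, clamp(vertical)
```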
The system is controlled via a web interface: the tether cable connects to a “Topside Adapter” box, which in turn connects to a computer via an ethernet cable.
The OpenROV comes with:
- Live HD video with a wide-angle lens and a tilt function.
- LED lighting which means that it can work in low-light environments.
- BeagleBone Black and Arduino MEGA microprocessors, which have dozens of input/output channels and are powerful enough to allow a number of features and experiments to be run.
- A 100-meter lightweight 2-wire tether cable.
- A payload area where additional hardware or equipment can be incorporated.
The OpenROV is available as a kit for $849 or fully assembled for $1,450.
The OpenROV allows the quick and cheap exploration of sites of archaeological interest at depths beyond what recreational divers can reach, including the exploration of shipwrecks.
Work has been undertaken on adding laser scaling abilities for underwater archaeology, among other functions.
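A common approach to laser scaling (and, I assume, broadly what is intended here, though the specifics below are my own illustration) is to mount two parallel lasers a known distance apart: the pixel separation of their dots in the video frame then gives the image scale at the surface they hit.

```python
def pixels_per_metre(dot_separation_px, laser_spacing_m=0.1):
    """Image scale at the target surface, from the measured pixel gap
    between the two laser dots and their known physical spacing
    (0.1 m here is an assumed example, not a fixed OpenROV value)."""
    return dot_separation_px / laser_spacing_m

def real_length_m(length_px, scale_px_per_m):
    """Convert an on-screen measurement into real-world metres."""
    return length_px / scale_px_per_m
```

With that scale, features such as timbers on a wreck can be measured directly from a video still.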
It has its own community, which provides assistance with the OpenROV, shares adventures, and develops the system further.
The length of the tethering cable which attaches to the OpenROV limits its depth, and the cable also poses a hazard: it might be cut and the system lost.
It is not autonomous so must be controlled by a driver on the surface.