This Google Glass control system was migrated from the prototype 3D-printed frame to a large heavy-lift quadcopter gimbal. Below is a demonstration of real-time wireless control of the pitch and roll axes of a heavy-payload gimbal, controlled hands-free by Google Glass. This control system was applied to a project I worked on at Simera Technology Group; see: Anax Heavy Lift Quadcopter.
Unfortunately, the yaw motor of the large gimbal is faulty, but a full three-axis demonstration video will be posted as soon as the motor is replaced.
To apply this project to your own gimbal system, you can see my instructable:
The figure below shows the control schematic and illustrates the principle of wireless gimbal control, which will be discussed in this post. (Click on it for a clearer picture.)
Wi-Fi was chosen for this application since it offers the longest range of the wireless technologies available on both the Glass and the Arduino. Within each wireless technology there is a range of protocols one can use to send data. The two predominant options over Wi-Fi are WebSockets and User Datagram Protocol (UDP) packets. WebSockets are generally used when guaranteed delivery is important: a connection is established with a handshake and acknowledgement packets are sent back and forth. That protocol is unnecessarily complicated to implement here, and its functionality would not be fully utilised. UDP packets are much simpler: a packet is sent to a specific port on a specific IP address. Whether or not the packet arrives at its destination is unknown to the sender, but packets can be sent at a high frequency to ensure delivery of data, even though delivery of any individual packet is not guaranteed. This is good enough for gimbal control, since both the frequency at which packets are sent and the frequency at which the Arduino microcontroller scans for packets on a specific port can be defined.
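To make the fire-and-forget nature of UDP concrete, here is a minimal Java sketch. The `P:...;R:...` command format and port 8888 are my own illustrative choices, not the actual protocol used in the project; the point is simply that a send requires no handshake and gives no delivery confirmation:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.Locale;

public class UdpGimbalCommand {
    // Encode a pitch/roll command as a simple text payload.
    // This "P:...;R:..." format is a made-up convention for illustration only.
    public static String formatCommand(double pitchDeg, double rollDeg) {
        return String.format(Locale.US, "P:%.1f;R:%.1f", pitchDeg, rollDeg);
    }

    // Fire-and-forget UDP send: no handshake, no acknowledgement, no guarantee.
    public static void send(String payload, String host, int port) throws Exception {
        byte[] data = payload.getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName(host), port));
        }
    }

    public static void main(String[] args) throws Exception {
        // Loopback demo: listen on a port, send one packet, read it back.
        try (DatagramSocket receiver = new DatagramSocket(8888)) {
            send(formatCommand(12.5, -3.0), "127.0.0.1", 8888);
            byte[] buf = new byte[64];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);
            System.out.println(new String(packet.getData(), 0,
                    packet.getLength(), StandardCharsets.UTF_8)); // prints "P:12.5;R:-3.0"
        }
    }
}
```

In the real system the sender runs on Glass and the receiver is the Arduino's Wi-Fi shield; sending this small payload many times per second is what compensates for the lack of delivery guarantees.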
In order to achieve wireless gimbal control, it was first necessary to test wireless control of the gimbal independently of the Google Glass. This was done using a program called SocketTest v3.0.0 to send commands to the Arduino Mega, which then relayed those commands to the gimbal controller through its UART port. SocketTest can send or receive WebSocket or UDP strings, which can be used to command the gimbal through the gimbal controller's UART port. BasecamElectronics' serial API library for Arduino was used for that purpose, which can be found here:
The video below shows how wireless gimbal control was tested using the SocketTest program and an Arduino with a Wi-Fi shield. Commands are sent to the Alexmos gimbal controller through its UART serial port.
Wireless Control with Google Glass:
Google Glass applications are fundamentally different from other Android applications because Glass has different hardware and capabilities. Glass also has a completely different design philosophy, and therefore the style in which applications are written differs between Glass and a mobile phone or tablet. Glass has two types of applications: immersion activities and live cards. Immersion activities are the more familiar type, which a user enters and exits, whereas live cards run in the background, for example a stopwatch. The gimbal control application written for this project is an immersion activity, since the user will only be using the Glass to control the gimbal and won't be relying on multi-functionality or parallel task execution.
The primary IDE used in Android application development is Android Studio, which uses Java code for the functionality and structured XML for the user interface (GUI) of the application. XML code in different files sets up the menu flow of the program and defines the visual features and labels on each screen and menu item. The menu flow was based on the functional requirements of the Glass application, and is simple and intuitive to use. Each menu option in the XML code links to a functional Java code file that executes the relevant commands when selected, implementing the required functionality on Glass.
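The linkage between menu entries and their actions can be pictured with a small plain-Java sketch. The menu labels and actions below are hypothetical stand-ins (the real labels live in the app's XML resources, and on Glass the dispatch happens in an activity's menu callback); the sketch only shows the pattern of mapping a selected menu item to the code it triggers:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MenuFlow {
    // Maps a menu label (defined in XML in the real app) to the action it triggers.
    private final Map<String, Runnable> actions = new LinkedHashMap<>();
    private String lastAction = "none";

    public MenuFlow() {
        // Hypothetical menu entries for illustration only.
        actions.put("Start Control", () -> lastAction = "streaming head angles over UDP");
        actions.put("Stop Control",  () -> lastAction = "stopped streaming");
        actions.put("Recalibrate",   () -> lastAction = "re-zeroed orientation reference");
    }

    // Called when the user selects a menu item; runs the linked action.
    public String select(String label) {
        Runnable action = actions.get(label);
        if (action == null) return "unknown menu item: " + label;
        action.run();
        return lastAction;
    }

    public static void main(String[] args) {
        MenuFlow menu = new MenuFlow();
        System.out.println(menu.select("Start Control")); // prints "streaming head angles over UDP"
    }
}
```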
The video below shows how Google Glass controls the gimbal on its pitch and roll axes. Unfortunately there is too much interference in the lab to use normal head movements (such as looking left and right) to control the gimbal on its yaw axis, since a sensitive magnetometer provides those sensor readings.
Below is a video demonstration of my wireless gimbal control project on both the yaw and pitch axes.
Code samples from Beginning Android Wearables by Andres Calvo were extracted and used as guidance for the development of the Glass’ user interface and functionality (Calvo, 2015).
To implement wireless control via Google Glass, code samples from Beginning Google Glass Development by Jeff Tang were extracted and used as guidance for the development of the Glass’ wireless functionality.
The gimbal controller used in this project is the BaseCam Electronics 32-bit three-axis brushless control board with a frame IMU and a main IMU. The frame IMU is on board the controller itself, which is mounted on the gimbal, below-yaw (as shown in the figure below). The main IMU is mounted on the pitch axis, where it is subject to movement on all three axes (also shown in the figure below). These IMUs send real-time feedback to the gimbal controller about the orientation of the surface they are mounted on, relative to their calibration orientations. This closed-loop feedback is used to implement Proportional-Integral-Derivative (PID) control algorithms, which correct for any error between the target angle and the current angle of the payload. The gimbal controller is powered by a 12 V three-cell lithium-polymer battery and in turn powers all the other electronics in the system. The PID algorithm is developed by BaseCam Electronics, but the PID values are user-defined, depending on gimbal properties such as weight, size and motor size. The PID calibration process requires some knowledge of PID controllers and the effect that each term has on the control of the gimbal. Each axis' gimbal motor is tuned independently so as to avoid interference from noise of the un-tuned motors.
The derivative term acts like a holding brake on the gimbal motors, minimising overshoot. Too high a D-value results in high-frequency oscillations of the gimbal motor as it tries to correct for overshoot. The proportional term adds gain to the motor power, which provides torque to correct the error between the motor's target angle and its current angle. Too high a P-value results in low-frequency oscillations of the motor. The integral term defines the rate at which the motor corrects for a steady error; too high an I-value can result in low-frequency oscillations and overshoot of the target angle. The PID values were iteratively tuned in the order derivative, proportional and then integral until effective stabilisation was achieved.
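As an illustration of the three terms described above, here is a minimal PID loop in Java. The gains and the toy integrator "gimbal" in `main` are illustrative choices of mine, not the tuned values from the project or BaseCam's actual implementation:

```java
public class GimbalPid {
    private final double kp, ki, kd;   // proportional, integral, derivative gains
    private double integral = 0.0;
    private double prevError = 0.0;

    public GimbalPid(double kp, double ki, double kd) {
        this.kp = kp;
        this.ki = ki;
        this.kd = kd;
    }

    // One control step: returns the motor correction for the current angle error.
    public double update(double targetDeg, double currentDeg, double dt) {
        double error = targetDeg - currentDeg;
        integral += error * dt;                       // I term accumulates steady error
        double derivative = (error - prevError) / dt; // D term brakes fast changes
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }

    public static void main(String[] args) {
        // Toy simulation: the "gimbal" is a pure integrator driven by the PID output.
        GimbalPid pid = new GimbalPid(2.0, 1.0, 0.1);
        double angle = 0.0, target = 30.0, dt = 0.01;
        for (int i = 0; i < 1000; i++) {
            angle += pid.update(target, angle, dt) * dt;
        }
        // The simulated angle settles close to the 30-degree target.
        System.out.println("final angle: " + Math.round(angle));
    }
}
```

Pushing any of the three gains too high in this simulation reproduces the oscillation behaviour described above, which is why the real tuning was done one term at a time.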
The video below shows the stabilisation characteristics of the gimbal and how effective it is at keeping the payload level.
The main scope of my thesis is to control a gimbal via Wi-Fi. Google Glass is a wearable technology with an optical head-mounted display developed by Google X, a semi-secret facility run by Google that works on major technological advancements. Since its release, Google Glass has been discontinued following the accumulation of user feedback, both good and bad. Google decided to close the program in order to focus on future versions of Glass and to improve on the current platform. This has positive implications for this project, since the technology being used is only available to a limited audience and is still considered cutting-edge. Unfortunately it also means that helpful resources are minimal and there are only a handful of scholarly articles focussed on Glass, none of which involve gimbal control.
Glass displays information and exhibits hands-free control properties. It contains a multitude of hardware and software sensors. The operating system is based on Android 4.4.2 (API 19) and uses the Android Studio IDE to develop functional applications with real-world interactive elements. Some of the useful hardware sensors on Glass are:
3-axis magnetometer (compass)
Ambient light sensor
Glass also has software sensors which give useful real-time data calculated from the hardware sensors. Some of the useful software sensors used in this project are:
The rotation vector software sensor is the most useful for this application. It uses the gyroscope, accelerometer and magnetometer data to set up a matrix describing the rotation of the Glass. This information is then used to get quaternion values for the orientation of the Glass, which is described in terms of yaw, pitch and roll angles in degrees. Yaw, also called azimuth, is the angle of Glass around the Y-axis (in the figure below) relative to magnetic north and perpendicular to the horizon. This value tends to be prone to electromagnetic interference (EMI) in noisy environments; since this project is focussed on SAR, there generally shouldn't be much EMI during operation. Pitch is the angle of Glass around the X-axis relative to the horizon, and roll is the angle of Glass around the Z-axis relative to the horizon. Pitch and roll values are reliable and not prone to interference; they can therefore be used to replace the functionality of the yaw angle when there is interference. These values are sent to the Arduino via Wi-Fi to control the angle of the gimbal.
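To illustrate the quaternion-to-angles step, here is a standalone Java sketch. On Glass itself this conversion is done through Android's `SensorManager` helpers; the maths below is an equivalent plain-Java illustration using the common aerospace (Z-Y-X) convention, which is not necessarily Glass's exact axis assignment:

```java
public class Orientation {
    // Convert a unit quaternion (w, x, y, z) to yaw/pitch/roll in degrees.
    // Aerospace Z-Y-X convention: yaw about Z, then pitch about Y, then roll about X.
    public static double[] toYawPitchRoll(double w, double x, double y, double z) {
        double yaw   = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));
        // Clamp the asin argument to guard against floating-point drift at the poles.
        double pitch = Math.asin(Math.max(-1.0, Math.min(1.0, 2 * (w * y - x * z))));
        double roll  = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));
        return new double[] {
            Math.toDegrees(yaw), Math.toDegrees(pitch), Math.toDegrees(roll)
        };
    }

    public static void main(String[] args) {
        // A 90-degree rotation about the vertical axis: pure yaw, no pitch or roll.
        double h = Math.sqrt(0.5); // cos(45 deg) = sin(45 deg)
        double[] ypr = toYawPitchRoll(h, 0, 0, h);
        System.out.printf(java.util.Locale.US,
                "yaw=%.1f pitch=%.1f roll=%.1f%n",
                ypr[0], ypr[1], ypr[2]); // prints "yaw=90.0 pitch=0.0 roll=0.0"
    }
}
```

In the project, it is these three angles (pitch and roll in practice, given the magnetometer interference) that get packed into the UDP payload sent to the Arduino.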
The initial design criteria of the gimbal were based around the selection of the gimbal payload, which is specifically tailored for search and rescue purposes. One of the most advanced commercially available infra-red cameras on the market is the FLIR Vue Pro. This thermal imaging camera is lightweight, small and perfect for SAR applications. That said, the gimbal has to be capable of taking modular payloads of various sizes, so it was designed to accommodate cameras ranging from miniature sizes up to small SLR cameras weighing up to 700 grams. This set the engineering requirement for a specific brushless gimbal motor size. The motor specifications are set by the manufacturer according to the maximum payload weight the motors can handle, a physical constraint owing to the rotational inertia of the camera as it sits on the gimbal.
The most important design consideration in a three-axis gimbal is the adjustable axis mounts, which give the gimbal the ability to keep its payload on the centre of gravity. This design feature is clearly seen in the CAD drawing in the figure below. Each axis mount has adjustable fastening positions on two axes, allowing six degrees of freedom of payload adjustment, depicted as a hand-held camera in the assembly. Since the main focus of this project is not the design of the gimbal itself but its control system, the preferred manufacturing technique was rapid prototyping. There are several methods of rapid prototyping, but the most accessible and commercially available is 3D printing by Fused Deposition Modelling (FDM). Most commercial or hobbyist-built 3D printers operate by FDM.
The 3D printed parts printed well and can be seen in the figure below. The blue parts are PLA, a printing filament that is much easier to print with, but more brittle and not as strong. The white parts are ABS, a filament that can be a real pain because it tends to warp as it cools down. Both materials were used to see how they hold up under stress and to take that knowledge into the final design.
The electronics that were chosen to be integrated into the gimbal were:
Camera-stabilising gimbals have boomed in popularity. The first two-axis gimbals were servo-controlled and mounted on multi-rotors to stabilise the video or FPV (First Person View) camera. Since then the technology has advanced drastically, and brushless gimbal motors are a common sight on the market. The whole point of a gimbal is to keep a payload stable and in the same orientation regardless of what its mounting platform is doing.
The video below demonstrates the principle:
That was a handheld brushless camera stabilisation rig; I am designing a three-axis brushless gimbal that will be mounted on a multi-rotor, most probably a quadcopter.
The idea behind the whole thesis is that it can be used for SAR (Search and Rescue) purposes. As a member of a SAR party you want an unobstructed view of your surroundings, but an additional pair of eyes in the sky could be very helpful as well. Google Glass is one of the few available HUD (Head-Up Display) platforms that lets you view a HUD screen while still having an unobstructed view of your surroundings. The screen doesn't have to be very big if the camera payload is an infrared camera looking for someone's heat signature; something like that would be easily identifiable on the Google Glass screen.
The idea is to have a quadcopter flying above you, and to control the gimbal simply by moving your head around. Additionally, a position-lock function would be very useful. This is no simple task, but it is something I would like to add to the whole system. The control algorithm will be based on a trigonometric mathematical model of the orientation of the gimbal, the heading of the quadcopter and the direction in which the gimbal is facing. All of this information will be provided by a GPS, a pressure sensor and a compass, and the base controller will be an Arduino Mega 2560.
When position lock is activated, the gimbal should point the camera lens so that the general area below the quadcopter stays in view, despite the movements of the quadcopter. Naturally, the head movements of the operator have no effect on the system in this state. The operator then has a continuous image of a certain area without having to maintain head control over the gimbal.
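The core of the trigonometric model described above can be sketched in a few lines of Java. This is my own simplified illustration, assuming a flat-earth local frame with the target's offset already expressed in metres north and east of the quadcopter; the real system would also have to subtract the quadcopter's compass heading from the computed yaw:

```java
public class PositionLock {
    // Given the target's horizontal offset north/east of the quadcopter (m) and
    // the quadcopter's altitude above the target (m), compute the gimbal yaw
    // (degrees clockwise from north) and pitch (degrees, negative = camera down).
    public static double[] aimAt(double northOffset, double eastOffset, double altitude) {
        double yawDeg = Math.toDegrees(Math.atan2(eastOffset, northOffset));
        double horizontal = Math.hypot(northOffset, eastOffset);
        double pitchDeg = -Math.toDegrees(Math.atan2(altitude, horizontal));
        return new double[] {yawDeg, pitchDeg};
    }

    public static void main(String[] args) {
        // Target 100 m due north while hovering 100 m above it:
        // the camera should face north and tilt 45 degrees down.
        double[] angles = aimAt(100.0, 0.0, 100.0);
        System.out.printf(java.util.Locale.US,
                "yaw=%.1f pitch=%.1f%n", angles[0], angles[1]); // prints "yaw=0.0 pitch=-45.0"
    }
}
```

The GPS and pressure sensor would supply the offsets and altitude, and re-running this calculation every update cycle is what keeps the locked area in view as the quadcopter drifts.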
If this project is successful, it could prove to be a basis on which UAV (Unmanned Aerial Vehicle) assisted SAR operations are conducted and can be improved.
When the time came to decide on a topic for my undergraduate thesis I was clueless. Fortunately I had the amazing opportunity to get involved with an awesome company called Simera Technology Group, where I had my first experience of a real-world engineering business environment.
My boss had recently purchased a pair of Google Glasses for the company to play around with and develop on, and he suggested I try to integrate some form of head-movement control between Google Glass and a gimbal. I was ecstatic about the idea; being presented with the opportunity to play with Google Glass and to use it in my thesis seemed almost unreal. Nevertheless I was still petrified. I didn't know much about gimbals, despite being an avid multi-rotor enthusiast, and I knew even less about Android development. But I knew it was an opportunity to show the world, and most importantly myself, what I am capable of.
And so began my journey with Google Glass, Android development, Arduino boards, sensors and all kinds of other thingies that any geek, nerd or techie dreams about.