Along with three other robotics students at Georgia Tech, I have further developed the Arduino Quadcopter GPS functions that I first introduced in this post. As shown in the introductory post, GPS data from the DJI Naza GPS receiver can be sent to an Arduino via serial communication. The next step is to program the Arduino to produce a location estimate from those GPS data. Although raw (latitude, longitude) data could serve as a location estimate, problems arise when the quadrotor enters areas where the GPS signal strength is lower and (latitude, longitude) updates arrive at a much lower frequency. In the outdoor tests we performed in Atlanta, the rate of GPS data retrieval varied from as high as 2 Hz to as low as less than 0.1 Hz. This means that if the quadrotor depends only on raw GPS data, its location estimate could be 10 seconds old or more. In addition to the low retrieval frequency, raw GPS data are also noisy, further increasing the unreliability of such an unfiltered location estimation scheme.
The problems discussed above motivate the implementation of a Kalman Filter on board the Arduino control hardware. The Kalman Filter is a popular mathematical technique in robotics because it produces state estimates from noisy sensor data. This great tutorial explains the Kalman Filter. You can find our online and offline Arduino implementations of the Kalman Filter on my github page. The most useful implementation is Arduino_Kalman_Online_With_Interpolation.ino because it continues to update the quadrotor’s state estimate even when no GPS data arrive from the receiver hardware.
Below is a poster we put together to succinctly describe the work. The figures show the benefit of having state estimates even when GPS data are not available: black marks show raw GPS data, while pink marks show Kalman-filtered state estimates made by the Arduino in real time. The smoothing effect and the 1.75 m location estimation accuracy are also evident.
In the above video, a modified Parrot AR.Drone is reprogrammed to perform autonomous behaviors. The drone pilots itself into a simulated hallway and stops when it detects the end of the hallway. The goal of the project is to achieve autonomous hallway navigation using low-cost IR sensors. To achieve autonomous flight, the AR.Drone was retrofitted with five IR rangefinders. The readings from the rangefinders were analog-to-digital converted by an Arduino Micro and sent via serial communication to the BusyBox Linux OS running on the AR.Drone’s ARM A8 processor. I integrated the two hardware systems and programmed the drone to navigate a hallway based on the sensor readings. To do this, I implemented a PD controller whose inputs were the sensor readings and whose outputs were the desired [roll, pitch, yaw] settings for the drone. Unlike my other projects, this is academic work, so I can’t share source code as I usually do, but I hope you enjoy the results nonetheless.
I’m really happy to finally show this new prototype. In the last few months, I’ve completely rebuilt my autonomous, Arduino-based quadcopter and made significant software and hardware improvements over the previous version. This new version merges the programmatic ease of Arduino with the stability and robustness of the DJI Naza flight controller. People who follow my build instructions or otherwise use my design will be able to start writing their own flight programs inside my simple Arduino script that provides access to the same controls a human pilot would manipulate: aileron, elevator, rudder, and throttle. The beauty of this design is that, although DJI Naza is a closed-source product, I’ve made an open-source Arduino interface that makes it easy to achieve autonomy. I really want to stress the word easy. To make the craft move forward, all you have to do is type the following in your program:
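The original snippet is not reproduced here, but in the spirit of the Servo-library approach, the forward command reduces to a single call along these lines. The object name and pulse width are my illustrations, not necessarily the names in my script; on standard RC channels, roughly 1500 µs is the neutral stick position.

```cpp
// Illustrative only: 'elevator' is an Arduino Servo object attached to the
// Naza's elevator (pitch) input channel. ~1500 us is neutral, so a slightly
// larger pulse width pitches the craft gently forward.
elevator.writeMicroseconds(1600);
```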
That’s it. My implementation just uses functions in the well documented Arduino servo library to control motion. Here’s a demonstration of the possibilities:
Here are Receiver8 and Transmitter8, the work-in-progress versions of the code files used to control the quadcopter. The essence of the system is that an Arduino Uno in the handheld controller accepts joystick inputs and sends control signals via RF communication to the Arduino Mega on board the quadcopter. The Arduino Mega then passes those controls to the DJI Naza using Arduino Servo functions. An arming/disarming system is implemented in the handheld controller. Stay tuned for future posts with updated code, circuit diagrams, and parts lists. Below is a video I sent to my collaborators during the reconstruction of the quadcopter. It demonstrates the ability to control the DJI Naza using a combination of an Arduino Uno handheld controller and an on-board Arduino Mega:
Quadcopters are helicopters with four motors and four propellers. Quadcopters can be very agile, but they can also be very unstable. Unlike airplanes that can glide because of the aerodynamic lift force acting on their wings, quadcopters have no passive lift mechanism; a single motor failure or motor speed mismatch will lead to a violent crash. As a result, it’s extremely difficult for a human to fly a quadcopter by directly controlling each motor because this “classic” piloting method requires extreme reaction speed and attention. Modern quadcopter pilots rely on on-board sensors and microcontrollers that detect the machine’s dynamics and adjust motor speeds many times per second to self-stabilize. With dynamic stabilization handled by the machine itself, pilots are free to adjust a quadcopter’s throttle, pitch, roll, and yaw to semi-manually guide the quadcopter in a manner similar to a traditional aircraft. Our goal is to eventually build an autonomous quadcopter, one that not only balances itself but also guides itself. The self-balancing system described above makes flying quadcopters easier for pilots, but our goal is to replace pilots with artificial intelligence software.
I started working on an autonomous quadcopter because I was inspired by Amazon’s Prime Air delivery system and by this incredible TED talk by Dr. Raffaello D’Andrea of ETH Zurich. My goal for the project is to build a quadcopter that can deliver a light package from one point to another. My plan is to construct a quadcopter from readily available hobby parts, configure those parts to interface with an Arduino board, and program the quadcopter to guide itself autonomously. To challenge myself, I want the start and end points to be hundreds of yards apart in order to reasonably simulate a local delivery. I live in Atlanta, so the start and end points will also be separated by difficult obstacles such as tall buildings, trees, and vehicles.
I’m doing this project because I want to learn more about autonomous robotics and because I’m collaborating with Vergilis, a brilliant team at Atlanta Tech Village that has greater ambitions for my project. Vergilis is an Atlanta technology startup that plans to use autonomous drone technology to enable efficient local deliveries of fast food and other small items. Needless to say, this project has a long way to go before it achieves that ambition. One note: unlike my previous rover project, this one is very much a work in progress. The code and assemblies I post will be in very early stages of development and will likely contain many bugs; please keep that in mind, especially given the dangers of working with quadcopters. With that said, I hope these posts become an interesting and useful record of our progress.