Burlion Advanced Flight Control Lab

In this research, the objective was aerial manipulation: specifically, manipulating white cubes using a drone equipped with a delta robot arm.

Drone Design

A tricopter design was chosen for the drone to mimic the delta arm's three-fold symmetry. The arm can be rotated 60 degrees to reduce interference with the thrust path. The drone has built-in 5-volt and 9-volt regulators, one for the Raspberry Pi and one for the servo motors. The Raspberry Pi runs ROS to support the localization and model predictive control processes of the Pixhawk 6 Mini, and runs a separate thread that controls the delta arm's servo motors and performs the computer vision processing.

Delta Arm

For the robot arm to operate in a large configuration space, long lightweight biceps and forearms were required for this design. Therefore, carbon fiber rods were used.

Bicep: 200 mm

Forearm: 285 mm

After accounting for some limited joint motion, the operable workspace was roughly a cylinder 320 mm in diameter, with z ranging from 250 mm to 430 mm.

End Effector Design

The end effector included two key components: a ground-facing ArduCam and servo-driven grippers. It also acted as the landing surface of the drone. The IR diodes on the camera were replaced with white LEDs so the white cube reflects as much light as possible and stands out from the surrounding background.

Localization

Localization is broken down into two parts: AprilTag tracking and cube detection.

Here is the algorithm to extract relevant information from a current frame:
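The exact extraction code is not reproduced here. As a minimal sketch, assuming a grayscale frame in which the LED-lit cube is the brightest region, the core step of isolating the cube blob and reporting its pixel area and centroid might look like:

```python
import numpy as np

def extract_cube(frame, threshold=200):
    """Threshold the bright (white, LED-lit) pixels, then return the
    blob's pixel area and centroid. `frame` is a 2-D grayscale array."""
    mask = frame >= threshold                    # bright pixels only
    area = int(mask.sum())                       # blob area in pixels
    if area == 0:
        return None                              # cube not in view
    ys, xs = np.nonzero(mask)
    cx, cy = float(xs.mean()), float(ys.mean())  # blob centroid
    return area, (cx, cy)

# A synthetic 100x100 frame with a bright 20x20 "cube" patch:
frame = np.zeros((100, 100), dtype=np.uint8)
frame[40:60, 30:50] = 255
print(extract_cube(frame))  # → (400, (39.5, 49.5))
```

The area feeds the height estimate described below, and the centroid gives the cube's offset from the image center.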

For some reason, screenshots taken through ssh/RealVNC had a blue haze applied over the image, likely due to network constraints.

Due to the low altitude of the flight, getting a consistent altitude reading from the flight controller is infeasible. We also wanted to examine the possibility of obtaining the altitude from the camera feed and known landmark sizes. Therefore, we recorded the cube's contour area at several heights and fit an approximate height-estimation formula in Excel. As expected, the estimated height decreases as the pixel area increases, roughly following an inverse power law.
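The fitted formula itself isn't reproduced here, but under a pinhole-camera model the cube's apparent side length scales as 1/h, so area scales as 1/h² and h ≈ k / √area. A sketch of that calibration, with hypothetical sample pairs standing in for the recorded data:

```python
import math

# Hypothetical calibration samples (cube contour area in pixels vs.
# measured height in mm), standing in for the Excel-recorded data.
samples = [(6400, 250), (2840, 375), (1600, 500)]

# Pinhole model: area ~ k^2 / h^2, so h ~ k / sqrt(area).
# Fit k as the mean of h * sqrt(area) over the samples.
k = sum(h * math.sqrt(a) for a, h in samples) / len(samples)

def height_from_area(area):
    """Estimate camera height above the cube from its pixel area."""
    return k / math.sqrt(area)
```

A larger blob then maps to a smaller height, matching the recorded trend.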

CubeExtract runs whenever a new frame is available from the camera. The AprilTag and cube localization processes run in the same periodic callback to avoid reading the image twice. The Raspberry Pi already struggles to run ROS on one of its cores, so any effort to reduce the computer vision runtime was welcome.

In this video, I updated the blob detection to aid the localization process, obtaining approximate x and y distances without AprilTags. Once the cube is within the field of view and inside the red circle, the drone holds its position until a manual input is given to operate the drone.

Controls

Forward Kinematics

Forward kinematics is the process of determining the position of the end effector given the actuators' angles: given a tuple of servo angles, it returns the theoretical position of the end effector.
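As a sketch of the standard delta-robot forward kinematics (place the three elbows from the servo angles, then intersect the three forearm spheres), using the rod lengths above; the base and end-effector joint radii WB and UP are assumed values for illustration, not measurements from the drone:

```python
import math

RF, RE = 200.0, 285.0   # bicep, forearm lengths from the build (mm)
WB, UP = 80.0, 40.0     # base / end-effector radii (assumed values)

def delta_fk(t1, t2, t3):
    """Forward kinematics: servo angles (rad) -> end-effector (x, y, z)."""
    t = WB - UP  # fold the effector offset into the elbow positions
    y1 = -(t + RF * math.cos(t1)); z1 = -RF * math.sin(t1)
    y2 = (t + RF * math.cos(t2)) * 0.5
    x2 = y2 * math.sqrt(3.0);      z2 = -RF * math.sin(t2)
    y3 = (t + RF * math.cos(t3)) * 0.5
    x3 = -y3 * math.sqrt(3.0);     z3 = -RF * math.sin(t3)

    w1 = y1*y1 + z1*z1
    w2 = x2*x2 + y2*y2 + z2*z2
    w3 = x3*x3 + y3*y3 + z3*z3
    dnm = (y2 - y1)*x3 - (y3 - y1)*x2

    a1 = (z2 - z1)*(y3 - y1) - (z3 - z1)*(y2 - y1)
    b1 = -((w2 - w1)*(y3 - y1) - (w3 - w1)*(y2 - y1)) / 2.0
    a2 = -(z2 - z1)*x3 + (z3 - z1)*x2
    b2 = ((w2 - w1)*x3 - (w3 - w1)*x2) / 2.0

    # Quadratic in z from the three-sphere intersection constraint.
    a = a1*a1 + a2*a2 + dnm*dnm
    b = 2.0 * (a1*b1 + a2*(b2 - y1*dnm) - z1*dnm*dnm)
    c = (b2 - y1*dnm)**2 + b1*b1 + dnm*dnm*(z1*z1 - RE*RE)
    d = b*b - 4.0*a*c
    if d < 0:
        return None                       # pose not reachable
    z = -0.5 * (b + math.sqrt(d)) / a     # take the lower solution
    return (a1*z + b1) / dnm, (a2*z + b2) / dnm, z
```

With all three angles at zero, symmetry puts the effector directly below the base center (roughly z = -154 mm with these assumed radii).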

Inverse Kinematics

Inverse kinematics determines the joint angles required for the robotic arm to reach a desired position: given a specific position, this algorithm returns the angles the servos must rotate to. A rotation matrix is applied for each servo motor so that a single helper function handles all three arms: if the end effector pose is rotated 120 degrees or -120 degrees around the z-axis, the arm appears to lie in the YZ plane, which makes the calculations easier.
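The rotation trick above can be sketched as follows. The rod lengths come from the build; the base and end-effector radii WB and UP are assumed values, and the per-arm solve is the standard planar delta-IK derivation:

```python
import math

RF, RE = 200.0, 285.0   # bicep, forearm lengths from the build (mm)
WB, UP = 80.0, 40.0     # base / end-effector radii (assumed values)

def _arm_angle(x0, y0, z0):
    """Solve one arm in the YZ plane (arm pointing along -y)."""
    y1 = -WB                 # shoulder joint position
    y0 = y0 - UP             # shift target by the effector offset
    a = (x0*x0 + y0*y0 + z0*z0 + RF*RF - RE*RE - y1*y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b*y1)**2 + RF*RF*(b*b + 1.0)
    if d < 0:
        return None          # point not reachable by this arm
    yj = (y1 - a*b - math.sqrt(d)) / (b*b + 1.0)   # elbow y
    zj = a + b*yj                                  # elbow z
    return math.atan2(-zj, y1 - yj)

def delta_ik(x0, y0, z0):
    """Rotate the target by 0 / +120 / -120 degrees about z so every
    arm can be solved by the same YZ-plane helper."""
    c, s = math.cos(2.0*math.pi/3.0), math.sin(2.0*math.pi/3.0)
    t1 = _arm_angle(x0, y0, z0)
    t2 = _arm_angle(x0*c + y0*s, y0*c - x0*s, z0)   # +120 degrees
    t3 = _arm_angle(x0*c - y0*s, y0*c + x0*s, z0)   # -120 degrees
    if None in (t1, t2, t3):
        return None
    return t1, t2, t3
```

By symmetry, a target on the central axis yields three equal servo angles.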

Applying the angles to the servo motors can be done in many ways. Setting the value directly from the initial angle to the final angle is the fastest option, but the motion is not smooth. Spreading the change over time softens the initial jerk; the change can be applied linearly or exponentially.
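The two apply-over-time schemes can be sketched as step generators (the actual servo write calls are omitted; function names here are illustrative):

```python
def linear_steps(start, end, n):
    """Spread the angle change evenly over n steps (constant rate)."""
    return [start + (end - start) * (i + 1) / n for i in range(n)]

def exponential_steps(start, end, n, alpha=0.5):
    """Move a fixed fraction of the remaining error each step:
    large moves early, settling asymptotically near the target."""
    out, cur = [], start
    for _ in range(n):
        cur += alpha * (end - cur)
        out.append(cur)
    return out
```

Linear stepping reaches the target exactly in n steps; exponential stepping never quite reaches it but has the gentlest motion near the end.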

Model Predictive Control

During arm operation, the drone experiences a force below its center of mass, creating a reaction torque opposite to the direction of the arm's motion. In addition, because most of the arm's mass is near the end effector, reaching down to grab an object causes a dramatic shift in the drone's center of mass. Standard flight controllers do not model such an action, leading to divergent oscillations from overcorrection. Therefore, I set out to design a control algorithm that stabilizes the drone's pose using Model Predictive Control.

The delta robot arm's moment of inertia was approximated as a simple cylinder: wide and short when retracted, thin and long when extended. Although the exact moment of inertia could be calculated, the cylinder approximation was more appropriate given the computational constraints of the Raspberry Pi.
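For a solid cylinder tilting about a transverse axis through its center, the standard formula is I = m(3r² + h²)/12. A sketch with hypothetical arm mass and dimensions (the real values would come from the build):

```python
def cylinder_inertia(m, r, h):
    """Moment of inertia of a solid cylinder of mass m, radius r,
    height h about a transverse axis through its center:
    I = m * (3 r^2 + h^2) / 12."""
    return m * (3.0 * r * r + h * h) / 12.0

# Hypothetical arm mass (kg) and dimensions (m), for illustration only:
retracted = cylinder_inertia(0.5, 0.16, 0.25)  # wide and short
extended  = cylinder_inertia(0.5, 0.05, 0.43)  # thin and long
```

With these example numbers the extended pose has the larger transverse inertia, which is why the arm's configuration matters to the controller.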

Given the start and end positions of the end effector and the approximate motion between them, we can model the approximate torque applied to the drone at a given time. The cylinder's height and approximate radius are computed using forward kinematics (assuming constant mass and uniform density). Angular acceleration can also be approximated using forward kinematics by taking the trajectory of the cylinder's center over time and differentiating it twice. With these two quantities, the torque can be calculated and used to predict the flight behavior of the drone. Since we want the drone to remain stationary, the drone must apply the negated predicted torque of the robot arm to hold its pose.
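The differentiate-twice step can be sketched with finite differences. Here the angle samples would come from forward kinematics of the planned servo motion and the inertia from the cylinder model; τ = I·α, and the drone commands -τ:

```python
def angular_accel(angles, dt):
    """Central second finite difference of an angle trajectory (rad)
    sampled at uniform interval dt, giving angular acceleration."""
    return [(angles[i + 1] - 2.0 * angles[i] + angles[i - 1]) / (dt * dt)
            for i in range(1, len(angles) - 1)]

def reaction_torques(angles, dt, inertia):
    """tau(t) = I * alpha(t); the drone must apply -tau to hold pose."""
    return [inertia * a for a in angular_accel(angles, dt)]
```

For a trajectory with constant angular acceleration, e.g. theta(t) = t², the finite difference recovers a constant alpha, so the predicted torque is constant as well.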

We simulated the MPC with the simplified moment-of-inertia model in ROS: the drone maintained its position within a 10 cm cube without large changes in rotation or oscillation.

Here are some attempts at implementing MPC on the physical drone.

This was the last attempt conducted on the drone before the school year ended. One of the grippers on the end effector broke and remains broken. Although some minor adjustments were made and simulated in ROS afterward, no additional physical tests were performed. The hope of conducting aerial manipulation remains.