• Reinforcement Learning for Obstacle Avoidance

    Students: Shakeeb Ahmad

    The aim of this project is to combine Deep Q-Learning with the trajectory generation algorithm developed at the MARHES Lab for vision-aided quadrotor navigation. Motion primitives in three directions are computed prior to flight and executed online. A simulated 2-D laser scan provides the raw features, which are then further processed.

    An epsilon-greedy policy maintains a balance between exploration and exploitation. The Q-values are recursively updated using the Bellman equation to compute the network error, which is then back-propagated to train the network. The Keras library in Python is used for training the network and predicting the desired actions. The same Python node subscribes to and processes the laser scan features, while a front-end C++ node detects collisions and executes trajectories only if they are collision-free. The Python node, exploiting the Keras library, is thus combined with the pre-designed C++ collision detection node through a service-client architecture (thanks to ROS!). This approach ensures that the robot learns while undergoing collision-free exploration. Preliminary results are shown.
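    The epsilon-greedy action selection and Bellman target described above can be sketched as follows. This is an illustrative NumPy sketch, not the lab's actual network: the feature and action dimensions, discount factor, and the random Q-values standing in for network outputs are all assumed.

    ```python
    import numpy as np

    # Hypothetical sizes: the actual feature/action dimensions are not
    # specified in the project description.
    N_FEATURES = 24   # e.g. downsampled 2-D laser scan bins
    N_ACTIONS = 3     # one motion primitive per direction
    GAMMA = 0.95      # discount factor (assumed)

    rng = np.random.default_rng(0)

    def epsilon_greedy(q_values, epsilon):
        """Pick a random action with probability epsilon, else the greedy one."""
        if rng.random() < epsilon:
            return int(rng.integers(len(q_values)))
        return int(np.argmax(q_values))

    def bellman_target(reward, next_q, done):
        """Target Q-value from the Bellman equation; the TD error between this
        target and the network's current prediction is what gets back-propagated."""
        return reward if done else reward + GAMMA * float(np.max(next_q))

    # Toy usage with random Q-values standing in for network outputs.
    q = rng.standard_normal(N_ACTIONS)
    action = epsilon_greedy(q, epsilon=0.1)
    target = bellman_target(reward=-1.0, next_q=rng.standard_normal(N_ACTIONS), done=False)
    ```

    In the actual system the targets would be fed to a Keras model's training step, and the chosen action index would be passed to the C++ node over a ROS service call.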

  • Stereo Vision-Based Obstacle Avoidance

    Students: Shakeeb Ahmad

    This research is inspired by the popular computer game “Race the Sun”, in which a UAV must travel as far as possible on solar energy while the sun slowly sets. While maneuvering forward, it therefore has to save energy and plan trajectories around obstacles at the same time. A similar idea is used here, combining real-time trajectory generation techniques with stereo vision. Multi-threading features in the Robot Operating System (ROS) and C++ are used to generate and execute trajectories in parallel. OpenCV is used extensively for image processing. Hardware was also developed to run the proposed algorithm: a Jetson TX2 performs all computation onboard, and a forward-facing ZED-mini stereo camera provides visual odometry and the depth image stream used by the planning algorithm. The system is completely self-contained and does not require GPS or a motion capture system to navigate.
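    A minimal sketch of one building block of depth-space planning: checking whether the region directly ahead of the camera is free of obstacles. The safety distance, window size, and the synthetic depth frame are assumptions for illustration; the real planner operates on the ZED-mini depth stream and does considerably more.

    ```python
    import numpy as np

    # Hypothetical parameters: the real system tunes these for the ZED-mini.
    SAFE_DISTANCE_M = 2.0   # minimum clearance ahead (assumed)
    WINDOW = 20             # half-size of the central region of interest, in pixels

    def path_is_clear(depth_image, safe_distance=SAFE_DISTANCE_M, window=WINDOW):
        """Check a central window of the depth image for obstacles closer than
        the safety threshold; NaN/zero pixels (no stereo match) are ignored."""
        h, w = depth_image.shape
        roi = depth_image[h // 2 - window : h // 2 + window,
                          w // 2 - window : w // 2 + window]
        valid = roi[np.isfinite(roi) & (roi > 0)]
        return bool(valid.size == 0 or valid.min() > safe_distance)

    # Synthetic 480x640 depth frame: everything 5 m away, then one close obstacle.
    depth = np.full((480, 640), 5.0)
    assert path_is_clear(depth)
    depth[230:250, 310:330] = 0.8   # obstacle 0.8 m ahead, inside the ROI
    assert not path_is_clear(depth)
    ```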

    Papers:
    [1] S. Ahmad, “High-Performance Testbed for Vision-Aided Autonomous Navigation for Quadrotor UAVs in Cluttered Environments”, The University of New Mexico (Digital Repository), 2018

    [2] S. Ahmad, R. Fierro, “Real-time Quadrotor Navigation Through Planning in Depth Space in Unstructured Environments.”

  • Autonomous Maneuver through Square Targets

    Students: Shakeeb Ahmad, Greg Brunson

    The main idea behind the project is to develop a fully autonomous system, free of external sensing such as motion capture and GPS, that is still capable of sensing its environment for various tasks. An NVIDIA Jetson TK1 is used as the main onboard processor, while a forward-facing ZED stereo camera provides visual odometry and detects objects in the environment. The test prototype is then used to implement autonomous navigation through a set of square targets: the stereo camera detects the targets and their center points, and a path is planned through those centers. The algorithm is implemented in C++ using the Robot Operating System (ROS) framework.
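    The geometric core of the target-detection step can be sketched as below: given four corner points of a candidate shape (in practice produced by contour approximation in OpenCV), validate that it is roughly square and compute its center point for the planner. The tolerance value and the example corners are assumptions; the lab's implementation is in C++.

    ```python
    import numpy as np

    def is_square(corners, tol=0.1):
        """Rough squareness test on 4 corner points ordered around the shape:
        all sides nearly equal in length and both diagonals nearly equal."""
        pts = np.asarray(corners, dtype=float)
        sides = np.linalg.norm(pts - np.roll(pts, 1, axis=0), axis=1)
        diags = [np.linalg.norm(pts[0] - pts[2]), np.linalg.norm(pts[1] - pts[3])]
        return bool((sides.max() - sides.min()) / sides.mean() < tol
                    and abs(diags[0] - diags[1]) / np.mean(diags) < tol)

    def target_center(corners):
        """Center point of the target: the mean of its corner pixels."""
        return np.asarray(corners, dtype=float).mean(axis=0)

    square = [(100, 100), (200, 100), (200, 200), (100, 200)]
    assert is_square(square)
    assert tuple(target_center(square)) == (150.0, 150.0)
    ```

    The detected centers would then be stacked into a waypoint list for the trajectory planner.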

    Papers:
    [1] S. Ahmad, “High-Performance Testbed for Vision-Aided Autonomous Navigation for Quadrotor UAVs in Cluttered Environments”, The University of New Mexico (Digital Repository), 2018

  • Cooperative 3-D Mapping

    Students: Jonathan West, Shakeeb Ahmad, Joseph Kloeppel

    Funding: MAST-CTA

    The capstone for the project under the Army Research Laboratory (ARL) Micro-Autonomous Systems and Technology (MAST) program included the task of exploring an unknown environment using a heterogeneous robotics testbed. The problem is motivated by the fact that typical search-and-rescue operations call for cheap robots and sensors that can be deployed without much concern about their loss or damage. For that purpose, a set of bio-inspired robots, the miniROaCHes, was assembled at the MARHES Lab. They are built from Kamigami robot chassis by Dash Robotics and made capable of running Linux, and hence the Robot Operating System (ROS), by mounting Raspberry Pi Zeros on board. Each is also fitted with a forward-facing camera to capture images. The idea is to explore an unknown environment using the camera on each deployed miniROaCH, taking pictures from different random poses. To overcome the limited memory on the ground robots, the quadrotor hovers over each of them in turn to copy the pictures and relay them to the base station via an optical communication link. The base station is connected to a cloud server where the 3-D map is generated. ROS is the main software framework for all robots.

  • Flying Inverted Pendulum

    Students: Christoph Hintz, Shakeeb Ahmad, Joseph Kloeppel

    A universal inverted pendulum has been implemented using a quadrotor with an underactuated load attached. The system can function as both a flying linear and a flying rotary inverted pendulum without any physical modification. Such an agile system has applications ranging from off-center suspended-load transportation to package delivery, flying cars, and controls education. The proposed robust hybrid control scheme is able to overcome the challenges of using a quadrotor as the actuator for the pendulum. A complete simulation of the system has been implemented and analyzed in a ROS/Gazebo environment. The simulations gave the team considerable insight into the implementation, especially since both the simulated and the real systems use the ROS structure; the control nodes developed for the simulation can therefore be used in a similar fashion on the real quadcopter. The real system is currently under development and shows promising results. The AscTec Hummingbird in use is equipped with an Odroid XU-4 onboard computer running ROS on Ubuntu 14.04. An 8 mm carbon fiber tube serves as the pendulum, with a ceramic ball bearing as its pivot; the pendulum is attached to the quadrotor using a carbon fiber tube combined with specially designed 3-D printed parts.
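    To give a feel for the balancing problem, here is a highly simplified planar sketch: an inverted pendulum on a base that accelerates horizontally (the quadrotor's role), stabilized with plain state feedback. The dynamics model, length, and gains are all assumptions for illustration; the paper's robust hybrid controller for swing-up and balancing is far more involved.

    ```python
    import math

    # Illustrative planar model, not the paper's full hybrid controller:
    # an inverted pendulum of length L on a horizontally accelerating base,
    # with theta measured from the upright position.
    G, L = 9.81, 0.8          # gravity (m/s^2) and pendulum length (m) -- assumed
    KP, KD = 30.0, 8.0        # hand-tuned state-feedback gains -- assumed
    DT, STEPS = 0.001, 5000   # 5 s of explicit Euler integration

    theta, theta_dot = 0.2, 0.0   # start 0.2 rad away from upright
    for _ in range(STEPS):
        a = KP * theta + KD * theta_dot                        # commanded base acceleration
        theta_ddot = (G * math.sin(theta) - a * math.cos(theta)) / L
        theta_dot += theta_ddot * DT
        theta += theta_dot * DT

    # The pendulum settles close to upright under this simple controller.
    ```

    Near the upright position the closed loop is approximately a damped second-order system, stable whenever KP exceeds G; the quadrotor testbed must additionally cope with its own attitude dynamics, which is what motivates the hybrid scheme.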

    Papers:
    [1] Robust Hybrid Control for Swinging-up and Balancing an Inverted Pendulum Attached to a UAV, CCTA, 2017

  • Mobile Motion Capture Testbed

    Students: Shakeeb Ahmad, Joseph Kloeppel, Jasmin Regalado 

    Funding: MAST-CTA

    Introduction

    The miniROaCH is a small-scale, affordable, and portable multi-vehicle robotic platform capable of running Linux, and hence the Robot Operating System (ROS), on board. The purpose of this research is to simulate and implement a small-scale swarm testbed based on these miniROaCHes. A key difficulty in developing a swarm testbed is robot localization. The proposed testbed tackles this issue with an aerial vehicle that monitors and publishes the locations and orientations of the swarm's ground agents. The testbed consists of one quadrotor and multiple crawler robots. The team plans to build a mobile motion capture system so that experiments can be conducted both indoors and outdoors. The system precisely estimates the poses of multiple ground robots with respect to an absolute world frame: a quadrotor UAV with a downward-facing camera detects the relative poses of the ground robots in its field of view, and each pose is then transformed into the world frame. The pose of the quadrotor itself is measured with respect to the world frame to enable this transformation.

    Approach

    The research is divided mainly into two parts:

    1. Simulation: Gazebo is used to create an environment with a hovering quadrotor, miniROaCH ground robot models, and a camera mounted beneath the quadrotor. AprilTags on the ground robots are used to detect their unique poses relative to the quadrotor frame, which are ultimately transformed into the world frame.
    2. Practical implementation of the system: A tripod mount with a camera facing downwards is used to measure the relative poses of the miniROaCHes on the tabletop. An AprilTag package for ROS is used to decode the AprilTags posted on the ground robots. Once the transform tree in ROS is available, a simple swarm control algorithm is demonstrated with the system.
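    The frame transformation underlying both parts can be sketched with homogeneous transforms. The example below is planar (SE(2)) for brevity and uses made-up poses; the real ROS tf tree carries full 3-D rotations and translations.

    ```python
    import numpy as np

    def se2(x, y, yaw):
        """Homogeneous 2-D transform (SE(2)) from a pose (x, y, yaw)."""
        c, s = np.cos(yaw), np.sin(yaw)
        return np.array([[c, -s, x],
                         [s,  c, y],
                         [0.0, 0.0, 1.0]])

    # Assumed example poses: quadrotor in the world frame, tag in the
    # quadrotor's camera frame (as reported by the AprilTag detector).
    T_world_quad = se2(2.0, 1.0, np.pi / 2)   # e.g. from the quadrotor's localization
    T_quad_tag   = se2(0.5, 0.0, 0.0)         # from the AprilTag detection

    # Chain the transforms to get the ground robot's pose in the world frame.
    T_world_tag = T_world_quad @ T_quad_tag
    x, y = T_world_tag[0, 2], T_world_tag[1, 2]   # world-frame position of the tag
    ```

    Publishing `T_world_quad` and `T_quad_tag` on the tf tree lets ROS perform this chaining automatically for any pair of frames.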

    Tools

    1. ROS (packages: rotors_simulator, apriltags_ros, etc.)
    2. Rviz and Gazebo
    3. Raspberry Pi Zero
    4. 5 Kamigami Dash Robots (MARHES custom build)
    5. Tripod and camera mount
    6. USB Camera