Uses a downward-facing camera to see a landing target and guides the drone to land there. This node consumes and publishes topics shared with the Xeni drone nodes.

slaghuis/lander

Vision-based-lander

Subscribing to sensor_msgs/msg/Image from a downward-facing camera, this package publishes cmd_vel messages to move a drone over an ArUco target and effects landing through an action client. A fancy way of describing vision-based precision landing.

Through active control over the whole landing process, this code should be able to land a drone on its assigned marker even in windy conditions.

The package comprises two nodes:

Tracker Node

The tracker node subscribes to a (presumably downward-facing) camera and detects an ArUco marker. Using OpenCV, the marker is translated into a coordinate pose. These coordinates are published in the landing_target frame. Further, a base_camera->landing_target transform is published to maintain the coordinates of the marker relative to the drone. To make this work, a base_link->base_camera static transform must be published:

ros2 run tf2_ros static_transform_publisher 0.05 0.05 -0.03 0 0 -2.36 base_link base_camera
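The tracker's pixel-to-pose conversion rests on the standard pinhole camera model (in the actual node, OpenCV's ArUco routines do this work). A minimal sketch of the underlying geometry, with purely hypothetical camera intrinsics:

```python
# Sketch: back-project a detected marker centre (pixel coordinates) into
# the camera frame using the pinhole model, assuming the marker's depth z
# is known (e.g. from pose estimation or the current altitude).
# fx, fy, cx, cy below are hypothetical intrinsics, not values from this package.

def pixel_to_camera_frame(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth z into camera-frame (x, y, z)."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Example: marker centre 40 px right and 30 px below the principal point,
# seen from 2 m altitude with a 400 px focal length.
pose = pixel_to_camera_frame(u=360.0, v=270.0, z=2.0,
                             fx=400.0, fy=400.0, cx=320.0, cy=240.0)
print(pose)  # (0.2, 0.15, 2.0)
```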

Lander Node

Reads the last transform published from base_camera to landing_target. If the transform is recent enough, it is assumed that the target is in sight. Sends cmd_vel commands to the Drone Node to bring about movement. This is a closed-loop control system.
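The "recent enough" test can be sketched as a simple timestamp comparison; the timeout value here is a hypothetical parameter, not one taken from this package:

```python
# Sketch: the target counts as "in sight" only if the last
# base_camera -> landing_target transform is younger than a timeout.

TARGET_TIMEOUT_S = 0.5  # hypothetical parameter

def target_in_sight(last_transform_stamp, now, timeout=TARGET_TIMEOUT_S):
    """Return True if the last transform is fresh enough to trust."""
    return (now - last_transform_stamp) <= timeout

print(target_in_sight(10.0, 10.3))  # True: transform is ~0.3 s old
print(target_in_sight(10.0, 11.0))  # False: transform is ~1.0 s old
```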

The code is structured as a finite state machine. On accepting an Action Server request to land at a given coordinate, the state machine moves from PENDING to SEEKING.
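The state machine can be sketched with the states this README names; the event names used as transition triggers are simplified paraphrases of the state descriptions, not identifiers from the code:

```python
# Sketch of the lander's finite state machine. States are from this README;
# event names are hypothetical.

from enum import Enum, auto

class LanderState(Enum):
    PENDING = auto()
    SEEKING = auto()
    APPROACHING = auto()
    DESCENDING = auto()
    LANDING = auto()

# Allowed transitions: (state, event) -> next state.
TRANSITIONS = {
    (LanderState.PENDING, "goal_accepted"): LanderState.SEEKING,
    (LanderState.SEEKING, "target_sighted"): LanderState.APPROACHING,
    (LanderState.APPROACHING, "stable_over_target"): LanderState.DESCENDING,
    (LanderState.DESCENDING, "minimum_altitude"): LanderState.LANDING,
    (LanderState.LANDING, "landed_and_disarmed"): LanderState.PENDING,
}

def step(state, event):
    """Advance the machine; events invalid for the current state are ignored."""
    return TRANSITIONS.get((state, event), state)

# A full landing cycle returns the machine to PENDING.
s = LanderState.PENDING
for e in ["goal_accepted", "target_sighted", "stable_over_target",
          "minimum_altitude", "landed_and_disarmed"]:
    s = step(s, e)
print(s)  # LanderState.PENDING
```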

Seeking State

Flies toward the given coordinates (in the map frame) at a parameterised approach altitude, hoping to sight the landing target on arrival. If the target is sighted, the state changes to APPROACHING. Beware: this type of flight will fly straight into a wall, as no collision avoidance is done.
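Seeking amounts to commanding a velocity toward the goal while holding the approach altitude. A minimal sketch, assuming a constant cruise speed (the speed value is hypothetical):

```python
# Sketch: horizontal velocity command toward a map-frame goal at a fixed
# cruise speed. Altitude hold is handled separately and omitted here.

import math

def seek_velocity(pos, goal, speed=1.0):
    """Unit direction from pos to goal, scaled to the cruise speed."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return (0.0, 0.0)  # already at the goal
    return (speed * dx / dist, speed * dy / dist)

print(seek_velocity((0.0, 0.0), (3.0, 4.0)))  # (0.6, 0.8)
```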

Approaching State

Reading the last transform between base_link and landing_target, the lander aims to minimise the positional error over the target whilst maintaining the approach altitude. If stability is achieved, the state changes to DESCENDING. Flight is all in the base_link frame using a FLU (forward-left-up) orientation.
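One common way to realise this (a sketch, not necessarily this package's controller) is a proportional controller on the transform error, with stability declared once the error stays small for several consecutive cycles. All gains and thresholds below are hypothetical:

```python
# Sketch: proportional control on the base_link -> landing_target error,
# plus a "stable over target" check. Gains/thresholds are hypothetical.

KP = 0.8               # proportional gain
STABLE_RADIUS = 0.10   # m: horizontal error counted as "over the target"
STABLE_CYCLES = 10     # consecutive cycles required before DESCENDING

def approach_cmd(err_x, err_y, err_z):
    """cmd_vel components in the base_link (FLU) frame from the target error."""
    return (KP * err_x, KP * err_y, KP * err_z)

def update_stability(err_x, err_y, stable_count):
    """Count consecutive cycles inside the stability radius; reset outside it."""
    if (err_x ** 2 + err_y ** 2) ** 0.5 < STABLE_RADIUS:
        return stable_count + 1
    return 0

count = 0
for _ in range(12):                       # 12 cycles with a small, steady error
    count = update_stability(0.05, 0.02, count)
print(count >= STABLE_CYCLES)  # True: stable, so transition to DESCENDING
```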

Descending State

Whilst maintaining position over the target, the altitude is reduced to within a minimum height above the landing target. On achieving this minimum flight altitude, the state changes to LANDING. Flight is all in the base_link frame using a FLU orientation.
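Descending combines the same position hold with a fixed downward velocity, gated on the minimum altitude. A sketch under assumed values (descent rate, minimum height, and gain are all hypothetical):

```python
# Sketch: hold position over the target while commanding a fixed descent
# rate, until the height over the target reaches a minimum. Values hypothetical.

DESCENT_RATE = 0.3   # m/s downward
MIN_ALTITUDE = 0.25  # m above the target: switch to LANDING here

def descend_cmd(err_x, err_y, height, kp=0.8):
    """(vx, vy, vz) in the base_link FLU frame; vz < 0 descends."""
    vz = -DESCENT_RATE if height > MIN_ALTITUDE else 0.0
    return (kp * err_x, kp * err_y, vz)

print(descend_cmd(0.0, 0.0, 1.5))  # still descending: vz = -0.3
print(descend_cmd(0.0, 0.0, 0.2))  # at minimum: vz = 0.0, hand over to LANDING
```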

Landing State

Landing is tricky. Ground effect causes random perturbations in vehicle velocity, and we can't correct for that very well because we're so close to the ground we don't have much room to maneuver or margin for error. We just call the land service advertised by the Drone Node. This will land and disarm the drone. The state is returned to PENDING.

Depends

This node needs the Lander Interfaces package. Flight control can be done by either Drone MAVSDK or Drone RTPS, but any controller that consumes velocity messages could be used (possibly with some topic re-mapping). A camera node is needed to publish the sensor_msgs/msg/Image messages on which the ArUco markers are detected. A simple option is Camera Lite, whose documentation also describes the OpenCV installation needed to detect ArUco markers.

Warning

This code is being tested. It has not flown. (December 2021)
