BallLocalisation
Description
This module takes in a list of vision balls, selects the ball measurement closest to the current estimate, and applies an Unscented Kalman Filter to estimate the ball's position and velocity in world space.
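The measurement-selection step can be sketched as follows. This is a Python illustration of the C++ module's logic; the function name `select_closest` and the tuple representation are assumptions, not the module's actual API.

```python
import math

def select_closest(estimate, measurements):
    """Pick the vision ball measurement closest to the current estimate.

    estimate: (x, y) current filtered ball position in world space.
    measurements: list of (x, y) candidate ball detections from vision.
    Returns the closest measurement, or None if there are no detections.
    """
    if not measurements:
        return None
    return min(
        measurements,
        key=lambda m: math.hypot(m[0] - estimate[0], m[1] - estimate[1]),
    )
```

The chosen measurement would then be fed to the UKF's measurement update, while the remaining detections are ignored for that frame.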
Usage
Include this module to allow the robot to estimate the ball's position and velocity.
Consumes
- `message::vision::Balls` uses the ball position estimate from vision
- `message::input::Sensors` uses sensors to compute the transform from camera space {c} to torso space {t}
- `message::support::FieldDescription` uses the field description to obtain the height of the ball off the ground
Emits
- `message::localisation::Ball` contains the filtered ball position measurement
FieldLocalisationNLopt
Description
A localisation method for estimating where the field is in world space, using field line points and field line intersections in a non-linear optimisation.
Optimisation Setup
The optimisation framework integrates several cost components and constraints to compute the optimal field localisation state:
Cost Components
Field Line Alignment Cost ($J_{\text{fl}}$):
- Measures the alignment of predicted field lines with observed ones.
- Calculated as the squared Euclidean distance between each field line point and the nearest point on any observed line, scaled by a predefined weight and divided by the number of field line points:

  $$J_{\text{fl}} = \frac{w_{\text{fl}}}{N} \sum_{i=1}^{N} \min_{j} \lVert \mathbf{p}_i - \mathbf{l}_j \rVert^2$$

- Points off the field are given a constant weight set in configuration.
Field Line Intersection Cost ($J_{\text{int}}$):
- Assesses the accuracy of predicted field line intersections against observed intersections.
- Computed similarly, through the squared distances between predicted and observed intersections:

  $$J_{\text{int}} = \frac{w_{\text{int}}}{M} \sum_{i=1}^{M} \min_{j} \lVert \mathbf{q}_i - \mathbf{o}_j \rVert^2$$
State Change Cost ($J_{\Delta}$):
- Penalises large deviations from the initial state estimate to ensure temporal consistency.
- Expressed as:

  $$J_{\Delta} = w_{\Delta} \lVert \mathbf{x} - \mathbf{x}_0 \rVert^2$$

  where $\mathbf{x}_0$ is the state estimate at the start of the optimisation step.
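The three cost components can be sketched as follows. This is a Python illustration of the logic described above, not the module's C++ implementation; the function names, the brute-force nearest-point search, and the 2D tuple representation are assumptions.

```python
def alignment_cost(observed, line_points, weight):
    """Weighted mean squared distance from each field line point to the
    nearest point on any observed line."""
    if not observed:
        return 0.0
    total = sum(
        min((ox - lx) ** 2 + (oy - ly) ** 2 for lx, ly in line_points)
        for ox, oy in observed
    )
    return weight * total / len(observed)

def intersection_cost(predicted, observed, weight):
    """Weighted sum of squared distances between each predicted field line
    intersection and its nearest observed intersection."""
    if not predicted or not observed:
        return 0.0
    return weight * sum(
        min((px - ox) ** 2 + (py - oy) ** 2 for ox, oy in observed)
        for px, py in predicted
    )

def state_change_cost(state, initial_state, weight):
    """Penalise deviation of a candidate state (x, y, theta) from the
    estimate at the start of this optimisation step."""
    return weight * sum((s - s0) ** 2 for s, s0 in zip(state, initial_state))
```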
Constraints
The optimisation is subject to the following constraints:
State Bounds:
- Limits the allowable state change between optimisation steps so that the solution does not jump an unrealistic amount between updates:

  $$\lvert \mathbf{x} - \mathbf{x}_0 \rvert \le \Delta_{\max}$$

Here, $\Delta_{\max}$ represents the maximum allowable change in each state dimension ($x$, $y$, and $\theta$).
Minimum Field Line Points:
- The optimisation only runs if a minimum number of field line points is available, ensuring sufficient data for an accurate estimate.

Robot Stability:
- Optimisation will not proceed if the robot is in an unstable state (e.g. falling).
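The constraints can be gathered into a single gate, sketched below in Python. This is an illustration only: in the real module the state-change bound is enforced through the optimiser's bounds rather than a post-hoc check, and all names here are assumptions.

```python
def constraints_satisfied(num_field_line_points, min_field_line_points,
                          robot_stable, delta_state, max_delta):
    """Check the constraints described above: enough field line points,
    a stable robot, and a bounded per-dimension state change."""
    if num_field_line_points < min_field_line_points:
        return False
    if not robot_stable:
        return False
    return all(abs(d) <= dm for d, dm in zip(delta_state, max_delta))
```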
Optimisation Algorithm
The overall cost function optimised is:

$$J(\mathbf{x}) = J_{\text{fl}}(\mathbf{x}) + J_{\text{int}}(\mathbf{x}) + J_{\Delta}(\mathbf{x})$$

Where:
- $\mathbf{x}$ represents the state vector $(x, y, \theta)$.
- $w_{\text{fl}}$, $w_{\text{int}}$, and $w_{\Delta}$ are weights controlling the relative importance of the field line alignment, field line intersection, and state change cost components.
Optimisation is carried out using NLopt's COBYLA (Constrained Optimisation BY Linear Approximations) algorithm, respecting the constraints and bounds set on the changes allowed in the state to ensure plausible and robust field localisation.
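To illustrate how the composite cost is minimised without pulling in the NLopt dependency, the sketch below sums the cost components and uses a naive greedy coordinate descent as a stand-in for COBYLA. This is not the module's algorithm; all names and the search strategy are assumptions for illustration.

```python
def total_cost(state, cost_terms):
    """Sum of the weighted cost components, each given as a callable of
    the state (x, y, theta)."""
    return sum(term(state) for term in cost_terms)

def local_search(initial, cost, step=0.05, iters=50):
    """Naive stand-in for NLopt's COBYLA: greedy coordinate descent
    around the initial state."""
    best = list(initial)
    best_cost = cost(tuple(best))
    for _ in range(iters):
        improved = False
        for i in range(len(best)):
            for d in (-step, step):
                cand = list(best)
                cand[i] += d
                c = cost(tuple(cand))
                if c < best_cost:
                    best, best_cost, improved = cand, c, True
        if not improved:
            break  # converged: no neighbouring state improves the cost
    return tuple(best), best_cost
```

COBYLA is a sensible choice here because the cost involves nearest-point lookups, which make analytic gradients awkward; COBYLA is derivative-free and handles the state-change constraints directly.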
Cost Threshold and Update Acceptance
The module uses a cost threshold (`cost_threshold`) to determine whether to accept optimisation results:
- Accepting Updates: Optimisation results are only applied if their cost is below `cost_threshold`. This prevents poor localisations from corrupting the state estimate.
- Rejecting Updates: When the cost exceeds the threshold, the optimisation result is rejected and the previous filtered state is maintained. A warning is logged and a counter is incremented.
- Triggering Resets: If the cost exceeds the threshold for `max_over_cost` consecutive updates (and `reset_delay` has elapsed), an uncertainty reset is triggered.
This mechanism provides robustness against temporary vision anomalies or ambiguous field features while maintaining accurate localisation when observations are reliable.
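The acceptance logic can be sketched as a small gate. This Python illustration omits the `reset_delay` timing check for brevity, and the class and method names are assumptions rather than the module's API.

```python
class UpdateGate:
    """Accept or reject optimisation results based on a cost threshold,
    triggering an uncertainty reset after too many consecutive rejections."""

    def __init__(self, cost_threshold, max_over_cost):
        self.cost_threshold = cost_threshold
        self.max_over_cost = max_over_cost
        self.over_cost_count = 0
        self.state = None  # last accepted state

    def update(self, candidate_state, cost):
        if cost < self.cost_threshold:
            self.state = candidate_state   # accept: adopt the new state
            self.over_cost_count = 0
            return "accepted"
        self.over_cost_count += 1          # reject: keep the previous state
        if self.over_cost_count >= self.max_over_cost:
            self.over_cost_count = 0
            return "reset"                 # trigger an uncertainty reset
        return "rejected"
```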
Reset
A soft reset, ResetFieldLocalisation, is used when the robot starts on the side of the field (or in a custom position specified in the configuration).
A more extreme reset, UncertaintyResetFieldLocalisation, is used when the cost of the current field position is high during play and either a local search or a field-wide search must be conducted to recover the position. The local search is a grid search using configurable parameters, and takes the lowest-cost position if it is under the configurable cost threshold. If this fails to find an appropriate position, a global grid search is conducted over the half of the field that the robot was last in, so as to ignore mirror states.
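The grid search used by the reset can be sketched as follows. This is a Python illustration under assumed names and a simple 3D grid; the module's actual search parameters come from configuration.

```python
import itertools

def grid_search_reset(cost, x_range, y_range, theta_range, step, cost_threshold):
    """Evaluate the localisation cost over a grid of candidate (x, y, theta)
    poses. Return the lowest-cost pose if it is under the threshold,
    otherwise None (signalling that a wider search is needed)."""
    def frange(lo, hi, s):
        v = lo
        while v <= hi + 1e-9:
            yield v
            v += s

    best_pose, best_cost = None, float("inf")
    for x, y, t in itertools.product(
        frange(*x_range, step), frange(*y_range, step), frange(*theta_range, step)
    ):
        c = cost((x, y, t))
        if c < best_cost:
            best_pose, best_cost = (x, y, t), c
    return best_pose if best_cost < cost_threshold else None
```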
Usage
Include this module to allow the robot to estimate where the field is in world space.
Consumes
- `message::vision::FieldLines` field line points are used in the field localisation cost function
- `message::vision::FieldLineIntersections` field line intersections are used in the field localisation cost function
- `message::vision::Goals` goal positions are used in the field localisation cost function
- `message::support::FieldDescription` to determine the field map and bounds of the field
Emits
- `message::localisation::Field` contains the estimated (x, y, theta) state
- `message::localisation::ResetFieldLocalisation` signalling a side-of-field localisation reset
- `message::localisation::UncertaintyResetFieldLocalisation` signalling a local or field-wide reset, which is a computationally intensive action
- `message::localisation::FinishReset` signalling that a local or field-wide reset has completed
Dependencies
- `Eigen`
- `utility::math::stats::MultivariateNormal` utility for sampling from a multivariate normal distribution
Mocap
Description
The Mocap module processes motion capture data to provide ground truth robot pose information. It receives motion capture data containing rigid body positions and orientations, filters for the specific robot rigid body, and converts this data into a robot pose ground truth message.
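The filtering step can be sketched as follows. This is a Python illustration of the described behaviour; the dictionary representation of a rigid body is an assumption, not the actual `message::input::MotionCapture` layout.

```python
def robot_pose_from_mocap(rigid_bodies, robot_rigid_body_id):
    """Select the robot's rigid body from a motion capture frame.

    rigid_bodies: list of dicts with 'id', 'position', 'orientation'.
    Returns the matching body's (position, orientation), or None if the
    robot was not tracked in this frame."""
    for body in rigid_bodies:
        if body["id"] == robot_rigid_body_id:
            return body["position"], body["orientation"]
    return None
```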
Usage
The module requires configuration through Mocap.yaml with the following parameters:
- `robot_rigid_body_id`: the ID of the rigid body that represents the robot in the motion capture system

Enable ground truth localisation in the SensorFilter and FieldLocalisationNLopt modules to use this data directly for pose estimation (odometry/localisation).
Consumes
- `message::input::MotionCapture`: motion capture data containing positions and orientations of tracked rigid bodies
- `extension::Configuration`: configuration data from Mocap.yaml
Emits
- `message::localisation::RobotPoseGroundTruth`: the ground truth pose of the robot, containing:
  - Transformation matrix (Hft)
RobotLocalisation
Description
Estimates the position and velocity of other robots on the field.
The module works by tracking multiple robots, running a UKF for each. Vision measurements are associated with tracked robots using global nearest neighbour within an acceptance radius. Messages received over the network from teammates are used to determine whether a tracked robot is a teammate or an opponent, and to update the filter.
Tracked robots are discarded if they go unseen for a consecutive number of updates in which they should have been visible.
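The association step can be sketched as follows. This Python illustration uses a greedy nearest-pair approximation of global nearest neighbour; the module may solve the assignment differently, and all names are assumptions.

```python
import math

def associate(tracks, detections, acceptance_radius):
    """Greedy global-nearest-neighbour association between tracked robot
    positions and vision detections, within an acceptance radius.
    Returns {track_index: detection_index}; unmatched tracks are absent
    (their miss counters would be incremented, eventually discarding them)."""
    pairs = sorted(
        (math.dist(t, d), ti, di)
        for ti, t in enumerate(tracks)
        for di, d in enumerate(detections)
    )
    matched_t, matched_d, assoc = set(), set(), {}
    for dist, ti, di in pairs:
        if dist > acceptance_radius:
            break  # remaining pairs are all further away
        if ti not in matched_t and di not in matched_d:
            assoc[ti] = di
            matched_t.add(ti)
            matched_d.add(di)
    return assoc
```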
Usage
Include this module to track other robots on the field.
Consumes
- `message::vision::Robots` uses the robot position estimates from vision
- `message::input::RoboCup` uses teammate positions from their WiFi messages
- `message::vision::GreenHorizon` uses the GreenHorizon to manage tracked robots
- `message::localisation::Field` uses the field transformation matrix Hfw to get the location of the tracked robots in field space
- `message::support::FieldDescription` uses the field dimensions to determine whether a robot is outside the field (plus a given distance outside the field)
- `message::input::GameState` to get the robot's team colour for visualisation in NUsight
Emits
- `message::localisation::Robots` contains filtered robot position and velocity estimates