Input

How inputs are handled in the NUbots codebase.

Input to the system includes cameras, Game Controller and NatNet.

Cameras

The cameras have the following parameters that are used by the object detection algorithm.

| Parameter | Type | Description |
| --- | --- | --- |
| `serial_number` | string | The serial number of the camera, used to identify it and distinguish it from the other cameras on the robot. |
| `lens.projection` | string | The lens projection type. Can be rectilinear, equidistant or equisolid. |
| `lens.focal_length` | float | The normalised focal length: the focal length in pixels divided by the image width. The focal length determines the angle of view and magnification. |
| `lens.centre` | 2D vector | The normalised image centre offset: the offset in pixels from the centre of the image to the optical axis, divided by the image width. |
| `k` | 2D vector | The polynomial distortion coefficients for the lens. |
| `fov` | float (radians) | The field of view: the angular diameter that the lens covers (the area on the sensor that light hits). |
| `Hpc` | 4x4 matrix | Homogeneous transform from the rigid platform the camera is attached to (the pitch servo) to the camera's virtual focal point. |

These parameters are set for each camera as configuration values in the Camera module, in each robot's respective folder. The values for the left camera are stored in Left.yaml, and the values for the right camera in Right.yaml.
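For concreteness, a Left.yaml might look roughly like the sketch below. The values and the exact nesting of the fields are assumptions for illustration; the real layout is defined by the configuration files in the Camera module.

```yaml
# Illustrative Left.yaml sketch; values and exact field nesting are assumptions.
serial_number: "0123456789"    # identifies this physical camera
lens:
  projection: EQUISOLID        # RECTILINEAR, EQUIDISTANT or EQUISOLID
  focal_length: 0.42           # focal length in pixels / image width
  centre: [0.0, 0.0]           # optical axis offset in pixels / image width
  k: [0.0, 0.0]                # polynomial distortion coefficients
  fov: 1.6                     # field of view in radians
  Hpc:                         # pitch servo to camera focal point transform
    - [1.0, 0.0, 0.0, 0.08]
    - [0.0, 1.0, 0.0, 0.03]
    - [0.0, 0.0, 1.0, 0.01]
    - [0.0, 0.0, 0.0, 1.0]
```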

The Camera module uses these parameters to find and set up the cameras, and emits Image messages.
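As a rough sketch of how a downstream module can consume these messages through the NUClear reaction DSL (the reactor name and include path here are assumptions, not a specific NUbots module):

```cpp
// Sketch of a NUClear reactor consuming Image messages.
// The reactor name and the message include path are illustrative assumptions.
#include <nuclear>

#include "message/input/Image.hpp"

class ExampleVision : public NUClear::Reactor {
public:
    explicit ExampleVision(std::unique_ptr<NUClear::Environment> environment)
        : Reactor(std::move(environment)) {
        // Runs every time the Camera module emits an Image
        on<Trigger<message::input::Image>>().then([](const message::input::Image& image) {
            // The lens parameters travel with the image, so a detector can
            // relate pixel coordinates to rays in camera space here.
        });
    }
};
```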

The projection tool, based on panotools' fisheye projection calculations, maps a portion of the surface of a sphere onto a flat image. The type of projection used is specified by the lens.projection parameter above.
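The three projection types differ in how they map the angle theta between an incoming ray and the optical axis to a radial distance from the image centre. A minimal sketch of the standard formulas, using the normalised focal length described above (this is the textbook model, not the NUbots utility itself; the polynomial distortion k is ignored):

```cpp
#include <cmath>

// Radial distance (in image widths) from the optical centre for a ray at
// angle theta (radians) from the optical axis, given normalised focal length f.
// Standard projection models only; lens distortion (k) is not applied.
double rectilinear_radius(double theta, double f) {
    return f * std::tan(theta);              // pinhole/perspective projection
}
double equidistant_radius(double theta, double f) {
    return f * theta;                        // radius proportional to angle
}
double equisolid_radius(double theta, double f) {
    return 2.0 * f * std::sin(theta / 2.0);  // preserves solid angle
}
```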
