NUpbr

A physically-based rendering pipeline for generating realistic semi-synthetic data.
Matthew Amos
Updated 3 Sept 2022
NUpbr currently only supports Blender 2.79. Ensure this version is installed.

Motivation

As the power of state-of-the-art deep learning methods increases, so does the requirement for large, fully annotated datasets. Particularly for object recognition and classification tasks in computer vision, annotating this data manually requires considerable time and effort.

The NUpbr project aims to reduce this annotation time by synthetically generating and automatically annotating image data via the Blender 3D rendering engine. To ensure the data is comparable to real-world data, Physically-Based Rendering (PBR) materials are used. Additionally, Image-Based Lighting (IBL) with High-Dynamic-Range (HDR) environment maps provides semi-synthetic image support.

The tool outputs raw RGB images taken from random points of view within the scene, along with the corresponding segmentation image, depth map, and various metadata for the current scene.

How it Works

NUpbr constructs a soccer field scene with user-specified dimensions. By default, this includes two goals, a soccer ball and multiple NUgus robots.

Environment

The environment map defines how the IBL will light the scene, as well as the background for the RGB image. NUpbr recursively performs regular expression searches to find three files per environment map (a sketch of such a search follows the table):

| Item | Regex Search | Example | Description |
| --- | --- | --- | --- |
| Raw HDR map | `.*raw.*\.hdr` | `001_raw.hdr` | HDR RGB pixel image filename |
| Segmentation mask | `.*mask.*\.png` | `001_mask.png` | Semantic segmentation mask of the HDR image |
| Metadata file | `.*json` | `001.json` | Metadata file for the HDR image |
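
As a rough illustration, a recursive search like the one below could collect these files. `find_resources` is a hypothetical helper, not NUpbr's actual implementation:

```python
import os
import re

def find_resources(root, patterns):
    # Hypothetical helper: walk `root` recursively and keep the first file
    # whose name matches each regex in `patterns`.
    compiled = {name: re.compile(rx) for name, rx in patterns.items()}
    found = {name: None for name in patterns}
    for dirpath, _, filenames in os.walk(root):
        for fname in filenames:
            for name, rx in compiled.items():
                if found[name] is None and rx.match(fname):
                    found[name] = os.path.join(dirpath, fname)
    return found

# The three files NUpbr expects per environment map
env_files = find_resources(
    "./environments",
    {
        "raw": r".*raw.*\.hdr",
        "mask": r".*mask.*\.png",
        "meta": r".*json",
    },
)
```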

The raw HDR map contains the RGB image as typically perceived. The HDR format is ideal for this image, as it allows the exposure to be changed within Blender.

The segmentation mask will define the desired class of each pixel within the HDR map. Colour is used to distinguish between the classes, as shown in the examples within this page.

The metadata file contains the camera's positional and rotational parameters when the image was taken, as well as configuration for which objects to render, including whether balls, goals, or the synthetic field are drawn during rendering. Additionally, manual limits on where the ball can be placed in the scene can be specified in metres.
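
As a hypothetical illustration of the kind of fields described above (the key names and values are assumptions, not NUpbr's actual schema), the parsed metadata might look like:

```python
# Hypothetical metadata contents; key names and values are assumptions,
# not NUpbr's actual schema.
example_meta = {
    "position": {"x": 0.0, "y": -4.5, "z": 1.2},          # camera position (m)
    "rotation": {"roll": 0.0, "pitch": 0.1, "yaw": 1.6},  # camera rotation (rad)
    "to_draw": {"ball": True, "goal": True, "field": False},
    "ball_limits": {"x": [-4.5, 4.5], "y": [-3.0, 3.0]},  # metres
}
```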

The configurable properties for the environment can be found in scene_config.py within resources["environment"].

Balls

The ball is constructed either from a premade model (.fbx) with corresponding colour and/or normal maps, or from a UV colour map for the ball. Again, a recursive regex search is applied to populate the ball resources:

UV Map

| Item | Regex Search | Example | Description |
| --- | --- | --- | --- |
| UV Image | `.*colou?rs?.*\.jpg` | `001_colour.jpg` | RGB image of the UV map for the ball |

FBX File

| Item | Regex Search | Example | Description |
| --- | --- | --- | --- |
| Colour map | `.*colou?rs?.*\.jpg` | `001_colour.jpg` | RGB pixel image filename for the colour map applied to the FBX model |
| Normal map | `.*norm(?:al)?s?.*\.png` | `001_norm.png` | Normal map applied to the FBX model |

The normal and colour maps are then applied to the ball using a material controlled by a PBR node, as sketched below.
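
A minimal sketch of such a node setup in the Blender 2.79 Python API follows; the object and file names are assumptions, and NUpbr's actual node graph may differ:

```python
import bpy

# Sketch: build a material with a Principled BSDF and wire in the colour
# and normal maps (Blender 2.79, Cycles). Names and paths are illustrative.
mat = bpy.data.materials.new("ball_pbr")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

out = nodes.new("ShaderNodeOutputMaterial")
pbr = nodes.new("ShaderNodeBsdfPrincipled")
colour_tex = nodes.new("ShaderNodeTexImage")
normal_tex = nodes.new("ShaderNodeTexImage")
normal_map = nodes.new("ShaderNodeNormalMap")

colour_tex.image = bpy.data.images.load("./ball/001_colour.jpg")
normal_tex.image = bpy.data.images.load("./ball/001_norm.png")
normal_tex.color_space = 'NONE'  # normal maps are non-colour data

links.new(colour_tex.outputs["Color"], pbr.inputs["Base Color"])
links.new(normal_tex.outputs["Color"], normal_map.inputs["Color"])
links.new(normal_map.outputs["Normal"], pbr.inputs["Normal"])
links.new(pbr.outputs["BSDF"], out.inputs["Surface"])

# Attach the material to the ball object
bpy.data.objects["ball"].data.materials.append(mat)
```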

The configurable properties for the ball can be found in scene_config.py within resources["ball"].

Field

When the user specifies that the field is to be synthetically generated, a hair particle system is used to create grass on the field.
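
A minimal sketch of attaching a hair particle system to a field plane with the Blender 2.79 Python API (the object name and particle counts are illustrative, not NUpbr's actual values):

```python
import bpy

# Sketch: grow grass on a field plane with a hair particle system
# (Blender 2.79). The object name and counts are illustrative.
field = bpy.data.objects["field"]
field.modifiers.new(name="grass", type='PARTICLE_SYSTEM')

settings = field.particle_systems[0].settings
settings.type = 'HAIR'
settings.count = 50000                # number of parent grass strands
settings.hair_length = 0.08           # blade length in metres
settings.use_advanced_hair = True
settings.child_type = 'INTERPOLATED'  # child strands for extra density
settings.rendered_child_count = 20    # children per parent at render time
```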

Colouring the Field

A UV map of the field lines is generated from the field dimensions specified in scene_config.py within cfg["field"]; this map is then mixed with the grass colour to create field lines within the synthetic grass. A PBR node is used for the grass, with the mixed grass and field-line colouring driving the base colour of the field.

Segmenting the Field

The segmentation of the field is performed using scene composites: the UV map projection on the field (in the camera viewpoint) is recoloured with the desired segmentation mask colour of the field lines, and the result is applied over the field's segmentation mask to ensure that the field lines appear on top of the field.
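
A rough sketch of this kind of compositing setup in the Blender 2.79 Python API (the render layer arrangement is an assumption, not NUpbr's actual node graph):

```python
import bpy

# Sketch: composite a recoloured field-line layer over the field's
# segmentation mask (Blender 2.79).
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

field_pass = tree.nodes.new("CompositorNodeRLayers")  # field mask pass
lines_pass = tree.nodes.new("CompositorNodeRLayers")  # field-line pass
over = tree.nodes.new("CompositorNodeAlphaOver")      # lines over field
out = tree.nodes.new("CompositorNodeComposite")

# Each render layers node would select a different scene render layer here.
tree.links.new(field_pass.outputs["Image"], over.inputs[1])
tree.links.new(lines_pass.outputs["Image"], over.inputs[2])
tree.links.new(over.outputs["Image"], out.inputs["Image"])
```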

The configurable properties for the field can be found in scene_config.py within resources["field"].

Goals

The goals are constructed based on the desired goal dimensions in scene_config.py within cfg["goal"]. These goals can either be generated as rectangular or rounded, as defined in scene_config.py, in cfg["goal"]["shape"].

Using a PBR node, the goals are given a colour, metallicity and roughness as specified in blend_config.py within goal.

The segmentation colouring for the goals is only applied to the front quad (goal posts and crossbar).

The configurable properties for the goals can be found in scene_config.py within resources["goal"].

Robots

Robots within NUpbr are still a work in progress. Currently, only a specific implementation of the NUgus robot is supported, and this is likely to change in the future.

Configuration

Configuration for the NUgus robot can be found in scene_config.py within resources["robot"]. This configuration is composed of four parameters:

| Parameter | Example | Description |
| --- | --- | --- |
| `mesh_path` | `"./robot/NUgus.fbx"` | Path to the NUgus FBX file |
| `texture_path` | `"./robot/textures/"` | Path to the directory holding the NUgus robot's materials, including the colour and normal maps for each robot part |
| `kinematics_path` | `"./robot/NUgus.json"` | Path to the file holding the kinematic limits and part names used to assemble the robot |
| `kinematics_variance` | `0.5` | Floating point value for the maximum variation from the neutral pose of each robot part |

Additionally, the number of robots generated within the scene can be configured in scene_config.py with num_robots.
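
A hypothetical sketch of this configuration in scene_config.py (the keys follow the table above; the values are illustrative):

```python
# Hypothetical scene_config.py excerpt; keys follow the table above,
# values are illustrative.
resources = {
    "robot": {
        "mesh_path": "./robot/NUgus.fbx",
        "texture_path": "./robot/textures/",
        "kinematics_path": "./robot/NUgus.json",
        "kinematics_variance": 0.5,
    },
}

num_robots = 2  # number of robots placed in the scene
```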

Kinematics

The kinematics of the robot are read from the kinematics_path, and the individual FBX files for each part are assembled as per the kinematics configuration. Robot parts are placed randomly according to kinematics_variance, where kinematics_variance = 0.0 places robots in their neutral pose, and kinematics_variance = 1.0 allows for random placement up to the kinematic limits of the robot.
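
A minimal sketch of how kinematics_variance could scale a random pose under this description (the blending scheme is an assumption, not NUpbr's exact method):

```python
import random

def sample_joint_angle(neutral, lower, upper, variance):
    # Blend a uniformly random angle within the joint limits toward the
    # neutral pose: variance = 0.0 keeps the neutral angle, while
    # variance = 1.0 allows the full kinematic range.
    target = random.uniform(lower, upper)
    return neutral + variance * (target - neutral)

# Example: a knee joint with neutral pose 0.2 rad and limits [-0.1, 2.0] rad
angle = sample_joint_angle(0.2, -0.1, 2.0, variance=0.5)
```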

The configurable properties for robots can be found in scene_config.py within resources["robot"].

Segmentation

The segmentation colouring for each object is specified through scene_config.py within resources[object]["mask"]. By default, the scene segmentation classes are configured as:

| Class | Description |
| --- | --- |
| unclassified | Environment or unclassified areas |
| balls | Balls |
| field | Grass and/or field area |
| field lines | Field lines |
| robot | General robots |

However, these colours can be configured as per user requirements in scene_config.py, under resources[<class>]["mask"]:
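
```python
# Hypothetical scene_config.py override; the key layout and RGBA value
# are assumptions, not NUpbr defaults.
resources["ball"]["mask"]["colour"] = (1.0, 0.0, 0.0, 1.0)  # pure red
```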

Examples

RGB, segmentation and depth images for a generated ball on a synthetic field

RGB, segmentation and depth images with a generated robot

Usage

The NUpbr repository can be found within the NUbots organisation. To run the data generation script, ensure Blender 2.79 is installed, and that the configuration properties in pbr/config/output_config.py are set as desired; a hypothetical example follows the table. These include:

| Config Parameter | Type | Example Value | Description |
| --- | --- | --- | --- |
| `num_images` | int | `10` | Number of generated image sets (a raw image, segmentation mask and depth image are generated for each camera) |
| `output_stereo` | bool | `True` | Whether stereo images are generated, separated by `0.5 * camera["stereo_camera_distance"]` (configured in `pbr/scene_config.py`) |
| `output_depth` | bool | `True` | Whether depth images are generated |
| `output_base` | str | `"./run_{}"` | Output directory format string, allowing a new output directory to be created on each run |
| `filename_len` | int | `10` | Length of the output filename, e.g. for `filename_len = 10`, the first raw image generated is saved as `0000000000.jpg` |
| `image_dirname` | str | `"raw"` | Directory name in which raw images are saved, e.g. `./run_0/raw` |
| `mask_dirname` | str | `"seg"` | Directory name in which segmentation masks are saved, e.g. `./run_0/seg` |
| `depth_dirname` | str | `"depth"` | Directory name in which depth images are saved, e.g. `./run_0/depth` |
| `meta_dirname` | str | `"meta"` | Directory name in which the metadata for each image set is saved, e.g. `./run_0/meta` |
| `max_depth` | float | `20` | Maximum depth for the normalised depth map (in metres) |
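
A hypothetical pbr/config/output_config.py matching the table above (parameter names come from the table; values are examples only):

```python
# Hypothetical pbr/config/output_config.py; parameter names follow the
# table above, values are examples only.
num_images = 10
output_stereo = True
output_depth = True
output_base = "./run_{}"
filename_len = 10
image_dirname = "raw"
mask_dirname = "seg"
depth_dirname = "depth"
meta_dirname = "meta"
max_depth = 20.0  # metres
```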

Once the parameters are set, NUpbr can be run with:

```bash
# Show the Blender GUI while generating
blender --python pbr.py

# Run headless (-b precedes --python, since Blender processes its
# arguments in order)
blender -b --python pbr.py
```

Status

Ongoing Work

  • Rigging of robot models to generate realistic poses
  • Replacement of hair particle system with grass cards to reduce performance overhead

Future Work

  • Porting to Blender 2.8
  • Porting field UV map generation to OpenCV