
orbit.sensors.patterns#

Sub-module for ray-casting patterns used by the ray-caster.

Classes

PatternBaseCfg

Base configuration for a pattern.

GridPatternCfg

Configuration for the grid pattern for ray-casting.

PinholeCameraPatternCfg

Configuration for a pinhole camera depth image pattern for ray-casting.

BpearlPatternCfg

Configuration for the Bpearl pattern for ray-casting.

Pattern Base#

class omni.isaac.orbit.sensors.patterns.PatternBaseCfg#

Base configuration for a pattern.

Attributes:

func

Function to generate the pattern.

func: Callable[[PatternBaseCfg, str], tuple[torch.Tensor, torch.Tensor]]#

Function to generate the pattern.

The function should take in the configuration and the device name as arguments. It should return the pattern’s starting positions and directions as a tuple of torch.Tensor.
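For illustration, a minimal sketch of a custom pattern function that satisfies this interface might look as follows. The single-ray pattern and its name are hypothetical and not part of the package:

import torch

def single_ray_pattern(cfg: "PatternBaseCfg", device: str) -> tuple[torch.Tensor, torch.Tensor]:
    """Hypothetical pattern: a single ray from the sensor origin along the local -z axis."""
    ray_starts = torch.zeros(1, 3, device=device)
    ray_directions = torch.tensor([[0.0, 0.0, -1.0]], device=device)
    return ray_starts, ray_directions

A custom configuration could then set its func field to this callable so that it is invoked with the configuration and device name.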

Grid Pattern#

omni.isaac.orbit.sensors.patterns.grid_pattern(cfg: patterns_cfg.GridPatternCfg, device: str) → tuple[torch.Tensor, torch.Tensor]#

A regular grid pattern for ray casting.

The grid pattern is made from rays that are parallel to each other. They span a 2D grid in the sensor’s local coordinates from (-length/2, -width/2) to (length/2, width/2), which is defined by the size = (length, width) and resolution parameters in the config.

Parameters:
  • cfg – The configuration instance for the pattern.

  • device – The device to create the pattern on.

Returns:

The starting positions and directions of the rays.

Raises:
  • ValueError – If the ordering is not “xy” or “yx”.

  • ValueError – If the resolution is less than or equal to 0.
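A usage sketch, assuming GridPatternCfg can be constructed with its documented fields as keyword arguments (as is typical for Orbit configuration classes); the grid dimensions below are purely illustrative:

from omni.isaac.orbit.sensors.patterns import GridPatternCfg, grid_pattern

# A 1.0 m x 0.6 m grid with 0.1 m spacing; rays point along the default (0.0, 0.0, -1.0) direction.
cfg = GridPatternCfg(resolution=0.1, size=(1.0, 0.6))
ray_starts, ray_directions = grid_pattern(cfg, device="cpu")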

class omni.isaac.orbit.sensors.patterns.GridPatternCfg#

Configuration for the grid pattern for ray-casting.

Defines a 2D grid of rays in the coordinates of the sensor.

Attention

The points are ordered based on the ordering attribute.

Attributes:

resolution

Grid resolution (in meters).

size

Grid size (length, width) (in meters).

direction

Ray direction.

ordering

Specifies the ordering of points in the generated grid.

resolution: float#

Grid resolution (in meters).

size: tuple[float, float]#

Grid size (length, width) (in meters).

direction: tuple[float, float, float]#

Ray direction. Defaults to (0.0, 0.0, -1.0).

ordering: Literal['xy', 'yx']#

Specifies the ordering of points in the generated grid. Defaults to "xy".

Consider a grid pattern with points at \((x, y)\) where \(x\) and \(y\) are the grid indices. The ordering of the points can be specified as “xy” or “yx”. This determines the outer and inner loop order when iterating over the grid points.

  • If “xy” is selected, the points are ordered with outer loop over “x” and inner loop over “y”.

  • If “yx” is selected, the points are ordered with outer loop over “y” and inner loop over “x”.

For example, the grid pattern points with \(X = (0, 1, 2)\) and \(Y = (3, 4)\):

  • “xy” ordering: \([(0, 3), (0, 4), (1, 3), (1, 4), (2, 3), (2, 4)]\)

  • “yx” ordering: \([(0, 3), (1, 3), (2, 3), (0, 4), (1, 4), (2, 4)]\)
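The two orderings simply swap the outer and inner iteration variables. The example above can be reproduced with plain Python (this snippet is illustrative and not the library's implementation):

from itertools import product

X, Y = (0, 1, 2), (3, 4)
xy_points = [(x, y) for x, y in product(X, Y)]  # outer loop over x, inner loop over y
yx_points = [(x, y) for y, x in product(Y, X)]  # outer loop over y, inner loop over x
# xy_points == [(0, 3), (0, 4), (1, 3), (1, 4), (2, 3), (2, 4)]
# yx_points == [(0, 3), (1, 3), (2, 3), (0, 4), (1, 4), (2, 4)]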

Pinhole Camera Pattern#

omni.isaac.orbit.sensors.patterns.pinhole_camera_pattern(cfg: patterns_cfg.PinholeCameraPatternCfg, intrinsic_matrices: torch.Tensor, device: str) → tuple[torch.Tensor, torch.Tensor]#

The image pattern for ray casting.

Caution

This function does not follow the standard pattern interface. It requires the intrinsic matrices of the cameras to be passed in. This is because we want to be able to randomize the intrinsic matrices of the cameras, which is not possible with the standard pattern interface.

Parameters:
  • cfg – The configuration instance for the pattern.

  • intrinsic_matrices – The intrinsic matrices of the cameras. Shape is (N, 3, 3).

  • device – The device to create the pattern on.

Returns:

The starting positions and directions of the rays. Both tensors have shape (N, H * W, 3).
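A usage sketch, assuming PinholeCameraPatternCfg accepts its documented fields as keyword arguments; the intrinsic matrix values below are purely illustrative:

import torch
from omni.isaac.orbit.sensors.patterns import PinholeCameraPatternCfg, pinhole_camera_pattern

cfg = PinholeCameraPatternCfg(width=320, height=240)
# One camera (N = 1): focal lengths fx, fy and principal point (cx, cy) in pixels (illustrative values).
intrinsic_matrices = torch.tensor([[[277.0, 0.0, 160.0],
                                    [0.0, 277.0, 120.0],
                                    [0.0, 0.0, 1.0]]])
ray_starts, ray_directions = pinhole_camera_pattern(cfg, intrinsic_matrices, device="cpu")
# Both tensors have shape (1, 240 * 320, 3).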

class omni.isaac.orbit.sensors.patterns.PinholeCameraPatternCfg#

Configuration for a pinhole camera depth image pattern for ray-casting.

Attributes:

focal_length

Perspective focal length (in cm).

horizontal_aperture

Horizontal aperture (in mm).

horizontal_aperture_offset

Offsets Resolution/Film gate horizontally.

vertical_aperture_offset

Offsets Resolution/Film gate vertically.

width

Width of the image (in pixels).

height

Height of the image (in pixels).

focal_length: float#

Perspective focal length (in cm). Defaults to 24.0 cm.

Longer lens lengths give a narrower field of view; shorter lens lengths give a wider field of view.

horizontal_aperture: float#

Horizontal aperture (in mm). Defaults to 20.955mm.

Emulates sensor/film width on a camera.

Note

The default value is the horizontal aperture of a 35 mm spherical projector.

horizontal_aperture_offset: float#

Offsets Resolution/Film gate horizontally. Defaults to 0.0.

vertical_aperture_offset: float#

Offsets Resolution/Film gate vertically. Defaults to 0.0.

width: int#

Width of the image (in pixels).

height: int#

Height of the image (in pixels).
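Under the usual pinhole-camera model, and assuming the focal length and aperture are interpreted in the same linear units (an assumption, since the attributes above are documented in cm and mm respectively), the horizontal field of view implied by these parameters can be estimated as:

import math

focal_length = 24.0           # default perspective focal length
horizontal_aperture = 20.955  # default horizontal aperture

# FOV = 2 * atan(aperture / (2 * focal_length)); about 47 degrees for the defaults.
horizontal_fov_deg = 2.0 * math.degrees(math.atan(horizontal_aperture / (2.0 * focal_length)))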

RS-Bpearl Pattern#

omni.isaac.orbit.sensors.patterns.bpearl_pattern(cfg: patterns_cfg.BpearlPatternCfg, device: str) → tuple[torch.Tensor, torch.Tensor]#

The RS-Bpearl pattern for ray casting.

The Robosense RS-Bpearl is a short-range LiDAR with a super-wide 360-degree x 90-degree field of view. It is designed for near-field blind-spot detection.

Parameters:
  • cfg – The configuration instance for the pattern.

  • device – The device to create the pattern on.

Returns:

The starting positions and directions of the rays.

class omni.isaac.orbit.sensors.patterns.BpearlPatternCfg#

Configuration for the Bpearl pattern for ray-casting.

Attributes:

horizontal_fov

Horizontal field of view (in degrees).

horizontal_res

Horizontal resolution (in degrees).

vertical_ray_angles

Vertical ray angles (in degrees).

horizontal_fov: float#

Horizontal field of view (in degrees). Defaults to 360.0.

horizontal_res: float#

Horizontal resolution (in degrees). Defaults to 10.0.

vertical_ray_angles: Sequence[float]#

Vertical ray angles (in degrees). Defaults to a list of 32 angles.

Note

We manually set the vertical ray angles to match the Bpearl sensor. The ray angles are not evenly spaced.
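A usage sketch, assuming BpearlPatternCfg can be constructed with its documented fields as keyword arguments; the reduced set of vertical angles below is hypothetical and only keeps the example short:

from omni.isaac.orbit.sensors.patterns import BpearlPatternCfg, bpearl_pattern

cfg = BpearlPatternCfg(
    horizontal_fov=360.0,
    horizontal_res=10.0,
    vertical_ray_angles=[2.0, 30.0, 60.0, 89.5],  # hypothetical subset, not the real Bpearl angles
)
ray_starts, ray_directions = bpearl_pattern(cfg, device="cpu")
# One ray per (horizontal step, vertical angle) combination.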