
Chapter 7: Unity for Robotics

While Gazebo excels at physics-accurate simulation, Unity brings photorealistic rendering, advanced ML integration, and cross-platform deployment to robotics. In this chapter, we explore Unity Robotics Hub and how it complements traditional simulators.


Learning Objectives

By the end of this chapter, you will be able to:

  • Explain when to use Unity vs Gazebo for robotics simulation
  • Set up Unity with ROS 2 integration
  • Create photorealistic environments for robot testing
  • Generate synthetic training data for perception models
  • Implement domain randomization for robust ML models
  • Build human-robot interaction scenarios

Why Unity for Robotics?

The Perception Gap

Traditional robotics simulators like Gazebo focus on physics accuracy but often produce visually simplistic environments:

+----------------------------------------------------------------+
|                   The Simulation Reality Gap                   |
+----------------------------------------------------------------+
|                                                                |
|   Gazebo Simulation                  Real World                |
|   +------------------+          +-------------------+          |
|   | Simple colors    |          | Complex textures  |          |
|   | Basic shapes     |  ----->  | Varied lighting   |          |
|   | Flat lighting    |   GAP    | Occlusions        |          |
|   | No reflections   |          | Reflections       |          |
|   +------------------+          +-------------------+          |
|                                                                |
|   ML models trained on simple visuals fail in the real world!  |
|                                                                |
+----------------------------------------------------------------+

Unity's Strengths for Robotics

Capability                     Benefit for Robotics
---------------------------    -------------------------------------------------------
Photorealistic Rendering       Train perception models that transfer to the real world
Render Pipelines (URP/HDRP)    PBR materials, post-processing; ray tracing via HDRP
Domain Randomization           Automatic variation of textures, lighting, objects
Synthetic Data Generation      Labeled datasets (bounding boxes, segmentation)
Cross-Platform                 Windows, Linux, embedded devices
Asset Store                    Thousands of 3D models, environments
C# Scripting                   Rapid prototyping, custom behaviors

When to Use Each Simulator

Use Case                                Recommended Simulator
------------------------------------    ---------------------
Physics-critical (contact, dynamics)    Gazebo
Perception/ML training                  Unity
Photorealistic visualization            Unity
ROS 2 ecosystem tools                   Gazebo
Human-robot interaction                 Unity
Large-scale fleet simulation            Both (hybrid)
Real-time control testing               Gazebo
Synthetic dataset generation            Unity

Unity Robotics Hub Overview

What is Unity Robotics Hub?

Unity Robotics Hub is a collection of tools and packages that enable robotics development in Unity:

  • ROS-TCP-Connector: Bidirectional ROS 2 communication
  • URDF Importer: Import robot models from URDF
  • Perception Package: Synthetic data generation with labels
  • ML-Agents: Reinforcement learning integration
  • Articulation Bodies: Physics for articulated robots

Architecture

+----------------------------------------------------------------+
|                   Unity Robotics Architecture                  |
+----------------------------------------------------------------+
|  +----------------------------------------------------------+  |
|  |                       Unity Editor                       |  |
|  |  +-----------+   +-----------+   +-----------------+     |  |
|  |  |  Scene    |   |  Physics  |   |   Rendering     |     |  |
|  |  |  Editor   |   |  Engine   |   |   Pipeline      |     |  |
|  |  +-----+-----+   +-----+-----+   +--------+--------+     |  |
|  |        |               |                  |              |  |
|  |  +-----v---------------v------------------v-----+        |  |
|  |  |                Unity Runtime                 |        |  |
|  |  +----------------------+-----------------------+        |  |
|  +-------------------------|--------------------------------+  |
|                            |                                   |
|                     +------v------+                            |
|                     |  ROS-TCP    |                            |
|                     |  Connector  |                            |
|                     +------+------+                            |
|                            |                                   |
|  +-------------------------v--------------------------------+  |
|  |                          ROS 2                           |  |
|  |  +---------+     +---------+     +-----------+           |  |
|  |  |  Nodes  |     | Topics  |     | Services  |           |  |
|  |  +---------+     +---------+     +-----------+           |  |
|  +----------------------------------------------------------+  |
+----------------------------------------------------------------+

Setting Up Unity for Robotics

Prerequisites

  1. Unity Hub: Download from unity.com
  2. Unity Editor: Version 2021.3 LTS or later (2022.3 recommended)
  3. ROS 2: Humble or later installed on your system

Step 1: Create a New Unity Project

# Open Unity Hub and create a new project
# Template: 3D (URP) - Universal Render Pipeline
# Project Name: RoboticsSimulation

Step 2: Install Robotics Packages

In Unity, open Window → Package Manager, then:

  1. Click + → Add package from git URL
  2. Add these packages one by one:
https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector
https://github.com/Unity-Technologies/URDF-Importer.git?path=/com.unity.robotics.urdf-importer
https://github.com/Unity-Technologies/com.unity.perception.git

Step 3: Configure ROS-TCP Connection

Create a connection settings asset:

  1. Assets → Create → Robotics → ROS Connection Prefab
  2. Configure:
    • ROS IP Address: 127.0.0.1 (or your ROS machine IP)
    • ROS Port: 10000
    • Protocol: ROS2

Step 4: Start the ROS-TCP Endpoint

On your ROS 2 machine:

# Install the ROS-TCP-Endpoint package
cd ~/ros2_ws/src
git clone https://github.com/Unity-Technologies/ROS-TCP-Endpoint.git -b main-ros2

# Build
cd ~/ros2_ws
colcon build --packages-select ros_tcp_endpoint

# Source and run
source install/setup.bash
ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=0.0.0.0

Importing Robots with URDF

URDF Importer Workflow

Unity's URDF Importer converts your ROS robot descriptions into Unity GameObjects:

+----------------------------------------------------------------+
|                      URDF Import Pipeline                      |
+----------------------------------------------------------------+
|   URDF/Xacro File                                              |
|          |                                                     |
|          v                                                     |
|   +---------------+                                            |
|   |  Parse Links  | --> Unity GameObjects with Transforms      |
|   +-------+-------+                                            |
|           |                                                    |
|           v                                                    |
|   +---------------+                                            |
|   | Parse Joints  | --> Articulation Bodies (physics joints)   |
|   +-------+-------+                                            |
|           |                                                    |
|           v                                                    |
|   +---------------+                                            |
|   | Parse Meshes  | --> MeshFilter + MeshRenderer components   |
|   +-------+-------+                                            |
|           |                                                    |
|           v                                                    |
|   +---------------+                                            |
|   |   Colliders   | --> Unity Colliders (Box, Mesh, etc.)      |
|   +---------------+                                            |
+----------------------------------------------------------------+

Importing Your Robot

  1. Copy your URDF file and meshes to Assets/Robots/
  2. In Unity: Assets → Import Robot from URDF
  3. Select your .urdf file
  4. Configure import settings:

// Import settings (in the import dialog)
public class URDFImportSettings
{
    public bool UseUrdfInertiaData = true;
    public bool UseGravity = true;
    public float GlobalScale = 1.0f;
    public ImportPipelineType Pipeline = ImportPipelineType.ArticulationBody;
}

Articulation Bodies for Robot Physics

Unity's Articulation Bodies provide stable physics for articulated robots:

using UnityEngine;

public class RobotController : MonoBehaviour
{
    private ArticulationBody[] joints;

    void Start()
    {
        // Get all articulation bodies in the robot
        joints = GetComponentsInChildren<ArticulationBody>();
    }

    public void SetJointTarget(int jointIndex, float targetPosition)
    {
        if (jointIndex < joints.Length)
        {
            // Drive targets are in degrees; ROS commands arrive in radians
            var drive = joints[jointIndex].xDrive;
            drive.target = targetPosition * Mathf.Rad2Deg;
            joints[jointIndex].xDrive = drive;
        }
    }

    public float GetJointPosition(int jointIndex)
    {
        if (jointIndex < joints.Length)
        {
            return joints[jointIndex].jointPosition[0];
        }
        return 0f;
    }
}
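The Mathf.Rad2Deg factor above matters because ArticulationDrive targets are expressed in degrees while ROS joint commands use radians. The conversion can be sanity-checked in plain Python (illustrative helper names of ours, not part of any Unity or ROS API):

```python
import math

def ros_to_unity_joint_target(radians: float) -> float:
    """Convert a ROS joint command (radians) to an ArticulationDrive target (degrees)."""
    return math.degrees(radians)

def unity_to_ros_joint_position(degrees: float) -> float:
    """Convert a Unity drive target (degrees) back to radians for /joint_states."""
    return math.radians(degrees)

# A 90-degree elbow bend is commanded over ROS as pi/2 radians:
target = ros_to_unity_joint_target(math.pi / 2)
```

A round trip through both helpers returns the original value, which is a cheap invariant to assert in integration tests.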

ROS 2 Communication in Unity

Publishing Topics

Send data from Unity to ROS 2:

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Geometry;

public class VelocityPublisher : MonoBehaviour
{
    private ROSConnection ros;
    private string topicName = "/cmd_vel";
    public float publishFrequency = 10f;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<TwistMsg>(topicName);
        InvokeRepeating("PublishVelocity", 1f, 1f / publishFrequency);
    }

    void PublishVelocity()
    {
        TwistMsg msg = new TwistMsg
        {
            linear = new Vector3Msg { x = 0.5, y = 0, z = 0 },
            angular = new Vector3Msg { x = 0, y = 0, z = 0.1 }
        };
        ros.Publish(topicName, msg);
    }
}

Subscribing to Topics

Receive data from ROS 2 in Unity:

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;

public class LaserScanSubscriber : MonoBehaviour
{
    private ROSConnection ros;
    private string topicName = "/scan";

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.Subscribe<LaserScanMsg>(topicName, OnLaserScanReceived);
    }

    void OnLaserScanReceived(LaserScanMsg msg)
    {
        // Process laser scan data
        float[] ranges = msg.ranges;
        float angleMin = msg.angle_min;
        float angleIncrement = msg.angle_increment;

        // Visualize or use the data
        Debug.Log($"Received {ranges.Length} laser points");
    }
}
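The angle_min and angle_increment fields above define each beam's direction: beam i points at angle_min + i * angle_increment. Converting ranges to Cartesian points is the usual next step; a minimal Python sketch of that math, independent of Unity (function name is ours):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max=float("inf")):
    """Convert LaserScan ranges to (x, y) points in the sensor frame.

    Invalid returns (inf/NaN or beyond range_max) are skipped, as real
    lidars routinely report them.
    """
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r) or r > range_max:
            continue
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three beams at -90, 0, and +90 degrees, each seeing an obstacle 2 m away:
pts = scan_to_points([2.0, 2.0, 2.0], -math.pi / 2, math.pi / 2)
```

The same loop translates directly into C# for drawing the scan as points in the Unity scene.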

Calling ROS Services

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Std;

public class ServiceCaller : MonoBehaviour
{
    private ROSConnection ros;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterRosService<SetBoolRequest, SetBoolResponse>("/enable_motor");
    }

    public void EnableMotor(bool enable)
    {
        SetBoolRequest request = new SetBoolRequest { data = enable };
        ros.SendServiceMessage<SetBoolResponse>(
            "/enable_motor",
            request,
            OnServiceResponse
        );
    }

    void OnServiceResponse(SetBoolResponse response)
    {
        Debug.Log($"Motor enabled: {response.success}, Message: {response.message}");
    }
}
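std_srvs/SetBool pairs a boolean request with a success/message response, which is exactly the round trip the class above performs. To see the data flow without a simulator, the service contract can be mimicked in a few lines of Python (the MotorDriver class is a hypothetical stand-in for the ROS-side server, not a ROS API):

```python
from dataclasses import dataclass

@dataclass
class SetBoolRequest:
    data: bool

@dataclass
class SetBoolResponse:
    success: bool
    message: str

class MotorDriver:
    """Hypothetical server mirroring the /enable_motor service."""
    def __init__(self):
        self.enabled = False

    def handle(self, request: SetBoolRequest) -> SetBoolResponse:
        self.enabled = request.data
        state = "enabled" if self.enabled else "disabled"
        return SetBoolResponse(success=True, message=f"Motor {state}")

driver = MotorDriver()
response = driver.handle(SetBoolRequest(data=True))
print(response.message)  # Motor enabled
```

Keeping the request/response shapes identical on both sides is what lets Unity's generated C# message classes and the ROS server interoperate.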

Synthetic Data Generation

Why Synthetic Data?

Training perception models requires massive labeled datasets. Real-world data collection is:

  • Expensive: Hours of human labeling
  • Limited: Hard to capture edge cases
  • Biased: May not cover all scenarios

Synthetic data solves these problems with:

  • Automatic labeling: Perfect ground truth
  • Unlimited scale: Generate millions of images
  • Controlled variation: Test specific scenarios

Unity Perception Package

The Perception package provides tools for generating labeled training data:

+----------------------------------------------------------------+
|               Synthetic Data Generation Pipeline               |
+----------------------------------------------------------------+
|                                                                |
|  +-----------+     +-----------+     +------------------+      |
|  |  Scene    |     |  Camera   |     |    Labelers      |      |
|  |  Setup    |---->|  Capture  |---->|  (Annotation)    |      |
|  +-----------+     +-----------+     +--------+---------+      |
|                                               |                |
|                                               v                |
|                                   +---------------------+      |
|                                   |   Output Dataset    |      |
|                                   |   - RGB images      |      |
|                                   |   - Bounding boxes  |      |
|                                   |   - Segmentation    |      |
|                                   |   - Depth maps      |      |
|                                   +---------------------+      |
|                                                                |
+----------------------------------------------------------------+

Setting Up Perception

  1. Add Perception Camera:

// Attach to your camera
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

public class PerceptionSetup : MonoBehaviour
{
    void Start()
    {
        var perceptionCamera = gameObject.AddComponent<PerceptionCamera>();

        // Add labelers (in practice each also needs a label config assigned)
        perceptionCamera.AddLabeler(new BoundingBox2DLabeler());
        perceptionCamera.AddLabeler(new SemanticSegmentationLabeler());
        perceptionCamera.AddLabeler(new InstanceSegmentationLabeler());
    }
}

  2. Label Objects:

using UnityEngine;
using UnityEngine.Perception.GroundTruth;

// Add to objects you want to detect
public class ObjectLabeler : MonoBehaviour
{
    void Start()
    {
        var labeling = gameObject.AddComponent<Labeling>();
        labeling.labels.Add("robot");
        labeling.labels.Add("humanoid");
    }
}

  3. Configure Output:

// Perception settings (via UI or code)
{
    "outputPath": "PerceptionOutput",
    "captureFormat": "PNG",
    "capturesPerIteration": 1,
    "framesPerCapture": 1
}

Output Format

Unity Perception generates COCO-compatible annotations:

{
    "captures": [
        {
            "id": "frame_001",
            "filename": "rgb/frame_001.png",
            "annotations": [
                {
                    "label_id": 1,
                    "label_name": "robot",
                    "instance_id": 42,
                    "bounding_box": {
                        "x": 120,
                        "y": 80,
                        "width": 200,
                        "height": 350
                    }
                }
            ]
        }
    ]
}
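Downstream training code then has to read these files. A small Python reader for the example above (the exact schema varies between Perception versions, so the field names here are just the ones from the sample):

```python
import json

# The sample annotation file from above, inlined for illustration.
capture_json = """
{ "captures": [ { "id": "frame_001",
                  "filename": "rgb/frame_001.png",
                  "annotations": [ { "label_id": 1, "label_name": "robot",
                                     "instance_id": 42,
                                     "bounding_box": { "x": 120, "y": 80,
                                                       "width": 200, "height": 350 } } ] } ] }
"""

def boxes_by_label(data):
    """Collect (x1, y1, x2, y2) corner boxes per label across all captures."""
    boxes = {}
    for capture in data["captures"]:
        for ann in capture["annotations"]:
            bb = ann["bounding_box"]
            corners = (bb["x"], bb["y"], bb["x"] + bb["width"], bb["y"] + bb["height"])
            boxes.setdefault(ann["label_name"], []).append(corners)
    return boxes

data = json.loads(capture_json)
print(boxes_by_label(data))  # {'robot': [(120, 80, 320, 430)]}
```

Converting (x, y, width, height) to corner form up front avoids a very common off-by-one source when feeding detectors that expect (x1, y1, x2, y2).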

Domain Randomization

What is Domain Randomization?

Domain Randomization varies simulation parameters to help ML models generalize to real-world conditions:

+------------------------------------------------------------------+
|                  Domain Randomization Strategy                   |
+------------------------------------------------------------------+
|  Randomize during training:                                      |
|                                                                  |
|  +-------------+   +-------------+   +----------------------+    |
|  | Lighting    |   | Textures    |   | Object Positions     |    |
|  | - Intensity |   | - Colors    |   | - Random spawns      |    |
|  | - Color     |   | - Patterns  |   | - Orientations       |    |
|  | - Direction |   | - Materials |   | - Scale variations   |    |
|  +-------------+   +-------------+   +----------------------+    |
|                                                                  |
|  +-------------+   +-------------+   +----------------------+    |
|  | Camera      |   | Noise       |   | Distractors          |    |
|  | - Position  |   | - Gaussian  |   | - Background         |    |
|  | - FOV       |   | - Blur      |   | - Foreground         |    |
|  | - Exposure  |   | - Occlusion |   | - Clutter            |    |
|  +-------------+   +-------------+   +----------------------+    |
|                                                                  |
|  Result: the model learns to handle real-world variations!       |
+------------------------------------------------------------------+

Implementing Randomizers

using UnityEngine;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Parameters;
using UnityEngine.Perception.Randomization.Samplers;

[AddRandomizerMenu("Custom/Lighting Randomizer")]
public class LightingRandomizer : Randomizer
{
    public FloatParameter lightIntensity = new FloatParameter { value = new UniformSampler(0.5f, 2.0f) };
    public ColorHsvaParameter lightColor = new ColorHsvaParameter();

    private Light sceneLight;

    protected override void OnIterationStart()
    {
        // Randomizers are not MonoBehaviours, so use the static Object API
        if (sceneLight == null)
            sceneLight = Object.FindObjectOfType<Light>();

        sceneLight.intensity = lightIntensity.Sample();
        sceneLight.color = lightColor.Sample();
    }
}

[AddRandomizerMenu("Custom/Object Position Randomizer")]
public class ObjectPositionRandomizer : Randomizer
{
    public FloatParameter xPosition = new FloatParameter { value = new UniformSampler(-5f, 5f) };
    public FloatParameter zPosition = new FloatParameter { value = new UniformSampler(-5f, 5f) };
    public FloatParameter yRotation = new FloatParameter { value = new UniformSampler(0f, 360f) };

    public GameObject targetObject;

    protected override void OnIterationStart()
    {
        if (targetObject != null)
        {
            targetObject.transform.position = new Vector3(
                xPosition.Sample(),
                targetObject.transform.position.y,
                zPosition.Sample()
            );
            targetObject.transform.rotation = Quaternion.Euler(0, yRotation.Sample(), 0);
        }
    }
}
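The UniformSampler ranges above are the heart of the technique: each iteration draws one scene configuration from fixed intervals. The same idea in a few lines of seeded Python, to show how the sampling stays reproducible and bounded (illustrative sketch, not the Perception API):

```python
import random

def randomize_scene(rng):
    """Draw one scene configuration using the same ranges as the randomizers above."""
    return {
        "light_intensity": rng.uniform(0.5, 2.0),
        "object_x": rng.uniform(-5.0, 5.0),
        "object_z": rng.uniform(-5.0, 5.0),
        "object_yaw_deg": rng.uniform(0.0, 360.0),
    }

# A fixed seed makes every training run see the same sequence of scenes,
# which is essential for debugging a dataset generation pipeline.
rng = random.Random(42)
scenes = [randomize_scene(rng) for _ in range(1000)]
assert all(0.5 <= s["light_intensity"] <= 2.0 for s in scenes)
assert all(-5.0 <= s["object_x"] <= 5.0 for s in scenes)
```

Unity's scenario system plays the same role as the seeded `rng` here: it owns the random state so that a run can be replayed exactly.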

Texture Randomization

using UnityEngine;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Parameters;

[AddRandomizerMenu("Custom/Texture Randomizer")]
public class TextureRandomizer : Randomizer
{
    public Texture2D[] texturePool;
    public CategoricalParameter<Texture2D> textureParameter;

    private Renderer[] targetRenderers;

    protected override void OnScenarioStart()
    {
        textureParameter = new CategoricalParameter<Texture2D>();
        foreach (var tex in texturePool)
            textureParameter.AddOption(tex);

        targetRenderers = Object.FindObjectsOfType<Renderer>();
    }

    protected override void OnIterationStart()
    {
        foreach (var renderer in targetRenderers)
        {
            if (renderer.CompareTag("Randomizable"))
            {
                renderer.material.mainTexture = textureParameter.Sample();
            }
        }
    }
}

Creating Photorealistic Environments

Universal Render Pipeline (URP) Setup

For robotics applications requiring visual realism:

  1. Enable URP features:

    • Screen Space Ambient Occlusion (SSAO)
    • Screen Space Reflections (SSR)
    • Post-processing (Bloom, Color Grading)
  2. Configure lighting:

using UnityEngine;
using UnityEngine.Rendering.Universal;

public class RealisticLightingSetup : MonoBehaviour
{
    public Light sunLight;
    public ReflectionProbe environmentProbe;

    void Start()
    {
        // Configure sun
        sunLight.type = LightType.Directional;
        sunLight.shadows = LightShadows.Soft;
        sunLight.shadowResolution = UnityEngine.Rendering.LightShadowResolution.VeryHigh;
        sunLight.color = new Color(1f, 0.95f, 0.9f); // Warm sunlight
        sunLight.intensity = 1.5f;

        // Configure environment reflections
        environmentProbe.mode = UnityEngine.Rendering.ReflectionProbeMode.Realtime;
        environmentProbe.refreshMode = UnityEngine.Rendering.ReflectionProbeRefreshMode.EveryFrame;
    }
}

PBR Materials for Robots

Create physically accurate materials:

using UnityEngine;

public class MetalMaterialSetup : MonoBehaviour
{
    void Start()
    {
        var renderer = GetComponent<Renderer>();
        var material = new Material(Shader.Find("Universal Render Pipeline/Lit"));

        // Brushed metal appearance
        material.SetFloat("_Metallic", 0.9f);
        material.SetFloat("_Smoothness", 0.7f);
        material.SetColor("_BaseColor", new Color(0.8f, 0.8f, 0.85f));

        renderer.material = material;
    }
}

Human-Robot Interaction Scenarios

Why Unity for HRI?

Unity excels at human-robot interaction research:

  • Character animation: Realistic human movements
  • Facial expressions: Emotional responses
  • Voice integration: Speech synthesis and recognition
  • Social scenarios: Crowd simulation

Setting Up Human Characters

using UnityEngine;

public class HumanCharacterController : MonoBehaviour
{
    private Animator animator;
    public Transform robotTarget;
    public float interactionDistance = 2f;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        float distance = Vector3.Distance(transform.position, robotTarget.position);

        if (distance < interactionDistance)
        {
            // Face the robot
            Vector3 direction = robotTarget.position - transform.position;
            direction.y = 0;
            transform.rotation = Quaternion.LookRotation(direction);

            // Trigger interaction animation
            animator.SetBool("IsInteracting", true);
        }
        else
        {
            animator.SetBool("IsInteracting", false);
        }
    }
}
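The Update loop above reduces to two pieces of geometry: a distance threshold and a yaw toward the robot with the height component zeroed out. Both are easy to verify in isolation; a Python sketch of the same logic on (x, z) positions (function name is ours):

```python
import math

def interaction_state(human_pos, robot_pos, interaction_distance=2.0):
    """Return (is_interacting, facing_angle_deg) for a human at human_pos.

    facing_angle_deg is the yaw toward the robot, ignoring height --
    the same effect as the direction.y = 0 step in the C# above.
    Positions are (x, z) pairs; yaw 0 looks down +Z, as in Unity.
    """
    dx = robot_pos[0] - human_pos[0]
    dz = robot_pos[1] - human_pos[1]
    distance = math.hypot(dx, dz)
    facing = math.degrees(math.atan2(dx, dz))
    return distance < interaction_distance, facing

# Robot 1 m directly in front of the human (+Z): interacting, yaw 0 degrees.
print(interaction_state((0.0, 0.0), (0.0, 1.0)))  # (True, 0.0)
```

Testing the math separately like this makes it easier to trust the in-engine behavior, where animation and physics add noise.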

Gesture Recognition Integration

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Std;

public class GesturePublisher : MonoBehaviour
{
    private ROSConnection ros;
    private Animator humanAnimator;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<StringMsg>("/detected_gesture");
        humanAnimator = GetComponent<Animator>();
    }

    public void OnGestureDetected(string gestureName)
    {
        // Publish gesture to ROS 2
        StringMsg msg = new StringMsg { data = gestureName };
        ros.Publish("/detected_gesture", msg);

        // Trigger corresponding animation
        humanAnimator.SetTrigger(gestureName);
    }
}

Practical Exercise: Complete Unity-ROS 2 Pipeline

Goal

Create a Unity simulation that:

  1. Imports a robot from URDF
  2. Generates synthetic training data
  3. Communicates with ROS 2

Step-by-Step Implementation

1. Project Structure:

Assets/
├── Robots/
│   └── my_robot.urdf
├── Scripts/
│   ├── RobotController.cs
│   ├── DataGenerator.cs
│   └── ROSBridge.cs
├── Scenes/
│   └── RoboticsSimulation.unity
└── Randomizers/
    └── CustomRandomizers.cs

2. Main Controller Script:

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;
using RosMessageTypes.Geometry;

public class RobotSimulationController : MonoBehaviour
{
    private ROSConnection ros;
    private Camera robotCamera;
    public ArticulationBody robotBase;

    // ROS topics
    private string imageTopic = "/camera/image_raw";
    private string cmdVelTopic = "/cmd_vel";
    private string jointStateTopic = "/joint_states";

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();

        // Register publishers
        ros.RegisterPublisher<ImageMsg>(imageTopic);
        ros.RegisterPublisher<JointStateMsg>(jointStateTopic);

        // Subscribe to velocity commands
        ros.Subscribe<TwistMsg>(cmdVelTopic, OnCmdVelReceived);

        // Start publishing
        InvokeRepeating("PublishSensorData", 0.1f, 0.033f); // ~30 Hz
    }

    void OnCmdVelReceived(TwistMsg msg)
    {
        // Apply velocity to robot
        Vector3 linearVel = new Vector3(
            (float)msg.linear.x,
            (float)msg.linear.y,
            (float)msg.linear.z
        );

        Vector3 angularVel = new Vector3(
            (float)msg.angular.x,
            (float)msg.angular.y,
            (float)msg.angular.z
        );

        robotBase.velocity = linearVel;
        robotBase.angularVelocity = angularVel;
    }

    void PublishSensorData()
    {
        // Publish camera image
        // Publish joint states
    }
}
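The controller above applies the twist as body velocities directly, which is a simplification; how cmd_vel actually moves a base depends on its drive type. For a differential-drive robot you would convert linear.x and angular.z into wheel speeds with the standard inverse kinematics. A Python sketch (wheel_radius and wheel_separation are assumed example values, not from this chapter's robot):

```python
def cmd_vel_to_wheel_speeds(linear_x, angular_z, wheel_radius=0.05, wheel_separation=0.30):
    """Differential-drive inverse kinematics: body twist -> wheel angular velocities (rad/s).

    linear_x is forward speed in m/s, angular_z is yaw rate in rad/s.
    """
    v_left = linear_x - angular_z * wheel_separation / 2.0   # left rim speed, m/s
    v_right = linear_x + angular_z * wheel_separation / 2.0  # right rim speed, m/s
    return v_left / wheel_radius, v_right / wheel_radius

# The example command from this exercise: 0.5 m/s forward, 0.1 rad/s turn.
left, right = cmd_vel_to_wheel_speeds(0.5, 0.1)
print(round(left, 3), round(right, 3))  # 9.7 10.3
```

In Unity each wheel speed would then drive an ArticulationDrive target velocity on the corresponding wheel joint.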

3. Launch ROS 2 Side:

# Terminal 1: ROS-TCP Endpoint
ros2 run ros_tcp_endpoint default_server_endpoint

# Terminal 2: Verify connection
ros2 topic list

# Terminal 3: Send commands
ros2 topic pub /cmd_vel geometry_msgs/msg/Twist \
"{linear: {x: 0.5}, angular: {z: 0.1}}"

Hybrid Simulation: Unity + Gazebo

When to Use Both

For complex projects, combine simulators:

Component             Simulator    Reason
------------------    ---------    ----------------------
Physics simulation    Gazebo       More accurate dynamics
Visual rendering      Unity        Photorealistic output
ML training data      Unity        Domain randomization
Control testing       Gazebo       ROS 2 native
Demonstration         Unity        Better visuals

Architecture for Hybrid Simulation

+----------------------------------------------------------------+
|                 Hybrid Simulation Architecture                 |
+----------------------------------------------------------------+
|                                                                |
|   +-----------------+      Sync      +-----------------+       |
|   |     Gazebo      |<-------------->|      Unity      |       |
|   |    (Physics)    |                |   (Rendering)   |       |
|   +--------+--------+                +--------+--------+       |
|            |                                  |                |
|            v                                  v                |
|   +-----------------+                +-----------------+       |
|   |  Joint states   |                |  Camera images  |       |
|   |  Sensor data    |                |  Training data  |       |
|   |  Physics state  |                |  Visualization  |       |
|   +--------+--------+                +--------+--------+       |
|            |                                  |                |
|            +----------------+-----------------+                |
|                             v                                  |
|                    +-----------------+                         |
|                    |      ROS 2      |                         |
|                    |   Middleware    |                         |
|                    +-----------------+                         |
|                                                                |
+----------------------------------------------------------------+

Summary

In this chapter, you learned:

  • Unity's role in robotics: Photorealistic simulation and synthetic data
  • ROS 2 integration: Bidirectional communication via ROS-TCP-Connector
  • URDF import: Bringing ROS robots into Unity with articulated physics
  • Synthetic data generation: Using the Perception package for ML training
  • Domain randomization: Creating robust models that transfer to reality
  • Human-robot interaction: Building scenarios with human characters
  • Hybrid approaches: Combining Unity and Gazebo strengths

Unity fills a critical gap in the robotics simulation ecosystem by providing the visual fidelity needed for modern perception systems.



Next Week Preview

In Chapter 8, we enter Module 3: The AI-Robot Brain with NVIDIA Isaac Sim:

  • GPU-accelerated physics simulation
  • Integration with NVIDIA's AI stack
  • Advanced perception with Isaac ROS
  • Synthetic data at scale with Omniverse Replicator