# Chapter 7: Unity for Robotics
While Gazebo excels at physics-accurate simulation, Unity brings photorealistic rendering, advanced ML integration, and cross-platform deployment to robotics. In this chapter, we explore Unity Robotics Hub and how it complements traditional simulators.
## Learning Objectives
By the end of this chapter, you will be able to:
- Explain when to use Unity vs Gazebo for robotics simulation
- Set up Unity with ROS 2 integration
- Create photorealistic environments for robot testing
- Generate synthetic training data for perception models
- Implement domain randomization for robust ML models
- Build human-robot interaction scenarios
## Why Unity for Robotics?

### The Perception Gap
Traditional robotics simulators like Gazebo focus on physics accuracy but often produce visually simplistic environments:
```text
                  The Simulation Reality Gap

   Gazebo Simulation                    Real World
   +------------------+          +--------------------+
   | Simple colors    |          | Complex textures   |
   | Basic shapes     |   GAP    | Varying lighting   |
   | Flat lighting    | <------> | Occlusions         |
   | No reflections   |          | Reflections        |
   +------------------+          +--------------------+

   ML models trained on simple visuals fail in the real world!
```
### Unity's Strengths for Robotics
| Capability | Benefit for Robotics |
|---|---|
| Photorealistic Rendering | Train perception models that transfer to real world |
| Modern Render Pipelines (URP/HDRP) | PBR materials; real-time ray tracing via HDRP |
| Domain Randomization | Automatic variation of textures, lighting, objects |
| Synthetic Data Generation | Labeled datasets (bounding boxes, segmentation) |
| Cross-Platform | Windows, Linux, embedded devices |
| Asset Store | Thousands of 3D models, environments |
| C# Scripting | Rapid prototyping, custom behaviors |
### When to Use Each Simulator
| Use Case | Recommended Simulator |
|---|---|
| Physics-critical (contact, dynamics) | Gazebo |
| Perception/ML training | Unity |
| Photorealistic visualization | Unity |
| ROS 2 ecosystem tools | Gazebo |
| Human-robot interaction | Unity |
| Large-scale fleet simulation | Both (hybrid) |
| Real-time control testing | Gazebo |
| Synthetic dataset generation | Unity |
## Unity Robotics Hub Overview

### What is Unity Robotics Hub?
Unity Robotics Hub is a collection of tools and packages that enable robotics development in Unity:
- ROS-TCP-Connector: Bidirectional ROS 2 communication
- URDF Importer: Import robot models from URDF
- Perception Package: Synthetic data generation with labels
- ML-Agents: Reinforcement learning integration
- Articulation Bodies: Physics for articulated robots
### Architecture
```text
                 Unity Robotics Architecture

  +-----------------------------------------------------+
  |                    Unity Editor                     |
  |  +-----------+   +-----------+   +---------------+  |
  |  |   Scene   |   |  Physics  |   |   Rendering   |  |
  |  |   Editor  |   |   Engine  |   |   Pipeline    |  |
  |  +-----+-----+   +-----+-----+   +-------+-------+  |
  |        |               |                 |          |
  |  +-----+---------------+-----------------+-------+  |
  |  |                 Unity Runtime                 |  |
  |  +----------------------+------------------------+  |
  +-------------------------+---------------------------+
                            |
                    +-------+-------+
                    |    ROS-TCP    |
                    |   Connector   |
                    +-------+-------+
                            |
  +-------------------------+---------------------------+
  |                       ROS 2                         |
  |   +---------+    +----------+    +-------------+    |
  |   |  Nodes  |    |  Topics  |    |  Services   |    |
  |   +---------+    +----------+    +-------------+    |
  +-----------------------------------------------------+
```
## Setting Up Unity for Robotics

### Prerequisites
- Unity Hub: Download from unity.com
- Unity Editor: Version 2021.3 LTS or later (2022.3 recommended)
- ROS 2: Humble or later installed on your system
### Step 1: Create a New Unity Project

```text
# Open Unity Hub and create a new project
# Template: 3D (URP) - Universal Render Pipeline
# Project Name: RoboticsSimulation
```
### Step 2: Install Robotics Packages

In Unity, open Window → Package Manager, then:

1. Click + → Add package from git URL
2. Add these packages one by one:

```text
https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector
https://github.com/Unity-Technologies/URDF-Importer.git?path=/com.unity.robotics.urdf-importer
https://github.com/Unity-Technologies/com.unity.perception.git
```
### Step 3: Configure ROS-TCP Connection

Create a connection settings asset:

1. Assets → Create → Robotics → ROS Connection Prefab
2. Configure:
   - ROS IP Address: `127.0.0.1` (or your ROS machine's IP)
   - ROS Port: `10000`
   - Protocol: `ROS2`
### Step 4: Start the ROS-TCP Endpoint
On your ROS 2 machine:
```bash
# Install the ROS-TCP-Endpoint package
cd ~/ros2_ws/src
git clone https://github.com/Unity-Technologies/ROS-TCP-Endpoint.git -b main-ros2

# Build
cd ~/ros2_ws
colcon build --packages-select ros_tcp_endpoint

# Source and run
source install/setup.bash
ros2 run ros_tcp_endpoint default_server_endpoint --ros-args -p ROS_IP:=0.0.0.0
```
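Before wiring up Unity, it can help to confirm the endpoint is actually accepting TCP connections. A minimal, hypothetical Python check (plain sockets, no ROS dependency; the host and port are assumptions matching the defaults above):

```python
import socket

def endpoint_reachable(host: str = "127.0.0.1", port: int = 10000,
                       timeout: float = 1.0) -> bool:
    """Return True if a TCP server (e.g. the ROS-TCP endpoint) accepts connections."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising on failure
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    status = "up" if endpoint_reachable() else "down"
    print(f"ROS-TCP endpoint is {status}")
```

If this reports the endpoint as down, check that the port in Unity's ROS Connection settings matches the one the endpoint was launched with.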
## Importing Robots with URDF

### URDF Importer Workflow
Unity's URDF Importer converts your ROS robot descriptions into Unity GameObjects:
```text
                    URDF Import Pipeline

  URDF/Xacro File
        |
        v
  +--------------+
  | Parse Links  | --> Unity GameObjects with Transforms
  +--------------+
        |
        v
  +--------------+
  | Parse Joints | --> Articulation Bodies (physics joints)
  +--------------+
        |
        v
  +--------------+
  | Parse Meshes | --> MeshFilter + MeshRenderer components
  +--------------+
        |
        v
  +--------------+
  |  Colliders   | --> Unity Colliders (Box, Mesh, etc.)
  +--------------+
```
### Importing Your Robot

1. Copy your URDF file and meshes to `Assets/Robots/`
2. In Unity: Assets → Import Robot from URDF
3. Select your `.urdf` file
4. Configure import settings:
```csharp
// Import settings (in the import dialog)
public class URDFImportSettings
{
    public bool UseUrdfInertiaData = true;
    public bool UseGravity = true;
    public float GlobalScale = 1.0f;
    public ImportPipelineType Pipeline = ImportPipelineType.ArticulationBody;
}
```
### Articulation Bodies for Robot Physics
Unity's Articulation Bodies provide stable physics for articulated robots:
```csharp
using UnityEngine;

public class RobotController : MonoBehaviour
{
    private ArticulationBody[] joints;

    void Start()
    {
        // Get all articulation bodies in the robot
        joints = GetComponentsInChildren<ArticulationBody>();
    }

    public void SetJointTarget(int jointIndex, float targetPosition)
    {
        if (jointIndex < joints.Length)
        {
            var drive = joints[jointIndex].xDrive;
            drive.target = targetPosition * Mathf.Rad2Deg;
            joints[jointIndex].xDrive = drive;
        }
    }

    public float GetJointPosition(int jointIndex)
    {
        if (jointIndex < joints.Length)
        {
            return joints[jointIndex].jointPosition[0];
        }
        return 0f;
    }
}
```
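One detail worth flagging in `SetJointTarget`: ROS reports joint positions in radians, while `ArticulationDrive.target` expects degrees (hence `Mathf.Rad2Deg`). The mapping can be sanity-checked outside Unity; a trivial Python sketch (hypothetical helper names, not part of any Unity or ROS API):

```python
import math

def ros_to_unity_target(position_rad: float) -> float:
    """Convert a ROS joint position (radians) to an ArticulationDrive target (degrees)."""
    return math.degrees(position_rad)

def unity_to_ros_position(target_deg: float) -> float:
    """Inverse mapping: drive target (degrees) back to radians for /joint_states."""
    return math.radians(target_deg)

print(ros_to_unity_target(math.pi / 2))  # quarter turn, expressed in degrees
```

Forgetting this conversion is a common source of robots that move 57x too far (or barely at all) when driven from ROS.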
## ROS 2 Communication in Unity

### Publishing Topics
Send data from Unity to ROS 2:
```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Geometry;

public class VelocityPublisher : MonoBehaviour
{
    private ROSConnection ros;
    private string topicName = "/cmd_vel";
    public float publishFrequency = 10f;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<TwistMsg>(topicName);
        InvokeRepeating(nameof(PublishVelocity), 1f, 1f / publishFrequency);
    }

    void PublishVelocity()
    {
        TwistMsg msg = new TwistMsg
        {
            linear = new Vector3Msg { x = 0.5, y = 0, z = 0 },
            angular = new Vector3Msg { x = 0, y = 0, z = 0.1 }
        };
        ros.Publish(topicName, msg);
    }
}
```
### Subscribing to Topics
Receive data from ROS 2 in Unity:
```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;

public class LaserScanSubscriber : MonoBehaviour
{
    private ROSConnection ros;
    private string topicName = "/scan";

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.Subscribe<LaserScanMsg>(topicName, OnLaserScanReceived);
    }

    void OnLaserScanReceived(LaserScanMsg msg)
    {
        // Process laser scan data
        float[] ranges = msg.ranges;
        float angleMin = msg.angle_min;
        float angleIncrement = msg.angle_increment;

        // Visualize or use the data
        Debug.Log($"Received {ranges.Length} laser points");
    }
}
```
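To visualize a scan in Unity you typically convert each range/angle pair to a Cartesian point. The math is language-independent; a hedged Python sketch (field names mirror `sensor_msgs/LaserScan`, but this helper itself is illustrative, not a ROS API):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max=float("inf")):
    """Convert LaserScan-style polar ranges to (x, y) points, skipping invalid returns."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r) or r <= 0.0 or r > range_max:
            continue  # drop NaN/inf and out-of-range readings
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A 3-beam scan: straight ahead, an invalid return, and one at 180 degrees
pts = scan_to_points([1.0, float("nan"), 2.0], angle_min=0.0,
                     angle_increment=math.pi / 2)
```

The same loop, written in C# inside `OnLaserScanReceived`, is all you need to spawn point markers in the scene.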
### Calling ROS Services
```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Std;

public class ServiceCaller : MonoBehaviour
{
    private ROSConnection ros;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterRosService<SetBoolRequest, SetBoolResponse>("/enable_motor");
    }

    public void EnableMotor(bool enable)
    {
        SetBoolRequest request = new SetBoolRequest { data = enable };
        ros.SendServiceMessage<SetBoolResponse>(
            "/enable_motor",
            request,
            OnServiceResponse
        );
    }

    void OnServiceResponse(SetBoolResponse response)
    {
        Debug.Log($"Motor enabled: {response.success}, Message: {response.message}");
    }
}
```
## Synthetic Data Generation

### Why Synthetic Data?
Training perception models requires massive labeled datasets. Real-world data collection is:
- Expensive: Hours of human labeling
- Limited: Hard to capture edge cases
- Biased: May not cover all scenarios
Synthetic data solves these problems with:
- Automatic labeling: Perfect ground truth
- Unlimited scale: Generate millions of images
- Controlled variation: Test specific scenarios
### Unity Perception Package
The Perception package provides tools for generating labeled training data:
```text
            Synthetic Data Generation Pipeline

  +----------+     +----------+     +---------------+
  |  Scene   | --> |  Camera  | --> |   Labelers    |
  |  Setup   |     |  Capture |     |  (Annotation) |
  +----------+     +----------+     +-------+-------+
                                            |
                                            v
                                   +-----------------+
                                   |  Output Dataset |
                                   |  - RGB Images   |
                                   |  - Bounding Box |
                                   |  - Segmentation |
                                   |  - Depth Maps   |
                                   +-----------------+
```
### Setting Up Perception
- Add a Perception Camera:

```csharp
// Attach to your camera
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

public class PerceptionSetup : MonoBehaviour
{
    void Start()
    {
        var perceptionCamera = gameObject.AddComponent<PerceptionCamera>();

        // Add labelers
        perceptionCamera.AddLabeler(new BoundingBox2DLabeler());
        perceptionCamera.AddLabeler(new SemanticSegmentationLabeler());
        perceptionCamera.AddLabeler(new InstanceSegmentationLabeler());
    }
}
```
- Label Objects:

```csharp
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

// Add to objects you want to detect
public class ObjectLabeler : MonoBehaviour
{
    void Start()
    {
        var labeling = gameObject.AddComponent<Labeling>();
        labeling.labels.Add("robot");
        labeling.labels.Add("humanoid");
    }
}
```
- Configure Output (via the UI or a JSON config):

```json
{
  "outputPath": "PerceptionOutput",
  "captureFormat": "PNG",
  "capturesPerIteration": 1,
  "framesPerCapture": 1
}
```
### Output Format
Unity Perception generates COCO-compatible annotations:
```json
{
  "captures": [
    {
      "id": "frame_001",
      "filename": "rgb/frame_001.png",
      "annotations": [
        {
          "label_id": 1,
          "label_name": "robot",
          "instance_id": 42,
          "bounding_box": {
            "x": 120,
            "y": 80,
            "width": 200,
            "height": 350
          }
        }
      ]
    }
  ]
}
```
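Downstream trainers often want these pixel boxes in normalized form. A small Python sketch that walks a capture record like the one above and converts its box to YOLO's normalized `(cx, cy, w, h)` (the 640x480 image size is an assumption for illustration):

```python
def bbox_to_yolo(box, img_w, img_h):
    """Convert a top-left (x, y, width, height) pixel box to normalized center format."""
    cx = (box["x"] + box["width"] / 2) / img_w
    cy = (box["y"] + box["height"] / 2) / img_h
    return (cx, cy, box["width"] / img_w, box["height"] / img_h)

# One capture record shaped like the Perception output above
capture = {
    "id": "frame_001",
    "annotations": [
        {"label_name": "robot",
         "bounding_box": {"x": 120, "y": 80, "width": 200, "height": 350}}
    ],
}

for ann in capture["annotations"]:
    print(ann["label_name"], bbox_to_yolo(ann["bounding_box"], img_w=640, img_h=480))
```

In a real pipeline you would `json.load` the generated annotation file and loop over `data["captures"]` the same way.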
## Domain Randomization

### What is Domain Randomization?
Domain Randomization varies simulation parameters to help ML models generalize to real-world conditions:
```text
              Domain Randomization Strategy

  Randomize during training:

  Lighting           Textures           Object Positions
  - Intensity        - Colors           - Random spawns
  - Color            - Patterns         - Orientations
  - Direction        - Materials        - Scale variations

  Camera             Noise              Distractors
  - Position         - Gaussian         - Background
  - FOV              - Blur             - Foreground
  - Exposure         - Occlusion        - Clutter

  Result: the model learns to handle real-world variation.
```
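The strategy above can be prototyped outside Unity before committing to a scene. A language-agnostic sketch of per-iteration parameter sampling (plain Python with a seeded RNG; the parameter names and ranges are illustrative, not the Perception package API):

```python
import random

def sample_scene_params(rng: random.Random) -> dict:
    """Draw one randomized scene configuration; call once per training iteration."""
    return {
        "light_intensity": rng.uniform(0.5, 2.0),   # matches the lighting ranges used below
        "light_hue": rng.uniform(0.0, 1.0),
        "object_x": rng.uniform(-5.0, 5.0),
        "object_z": rng.uniform(-5.0, 5.0),
        "object_yaw_deg": rng.uniform(0.0, 360.0),
        "gaussian_noise_sigma": rng.uniform(0.0, 0.05),
    }

rng = random.Random(42)  # fixed seed so a dataset can be regenerated exactly
params = sample_scene_params(rng)
```

Seeding matters in practice: a reproducible sampler lets you regenerate the exact dataset that produced a given model.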
### Implementing Randomizers
```csharp
using UnityEngine;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Parameters;
using UnityEngine.Perception.Randomization.Samplers;

[AddRandomizerMenu("Custom/Lighting Randomizer")]
public class LightingRandomizer : Randomizer
{
    public FloatParameter lightIntensity = new FloatParameter { value = new UniformSampler(0.5f, 2.0f) };
    public ColorHsvaParameter lightColor = new ColorHsvaParameter();

    private Light sceneLight;

    protected override void OnIterationStart()
    {
        if (sceneLight == null)
            sceneLight = Object.FindObjectOfType<Light>();
        if (sceneLight == null)
            return; // no light in the scene, nothing to randomize

        sceneLight.intensity = lightIntensity.Sample();
        sceneLight.color = lightColor.Sample();
    }
}

[AddRandomizerMenu("Custom/Object Position Randomizer")]
public class ObjectPositionRandomizer : Randomizer
{
    public FloatParameter xPosition = new FloatParameter { value = new UniformSampler(-5f, 5f) };
    public FloatParameter zPosition = new FloatParameter { value = new UniformSampler(-5f, 5f) };
    public FloatParameter yRotation = new FloatParameter { value = new UniformSampler(0f, 360f) };

    public GameObject targetObject;

    protected override void OnIterationStart()
    {
        if (targetObject != null)
        {
            targetObject.transform.position = new Vector3(
                xPosition.Sample(),
                targetObject.transform.position.y,
                zPosition.Sample()
            );
            targetObject.transform.rotation = Quaternion.Euler(0, yRotation.Sample(), 0);
        }
    }
}
```
### Texture Randomization
```csharp
using UnityEngine;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Parameters;

[AddRandomizerMenu("Custom/Texture Randomizer")]
public class TextureRandomizer : Randomizer
{
    public Texture2D[] texturePool;
    public CategoricalParameter<Texture2D> textureParameter;

    private Renderer[] targetRenderers;

    protected override void OnScenarioStart()
    {
        textureParameter = new CategoricalParameter<Texture2D>();
        foreach (var tex in texturePool)
            textureParameter.AddOption(tex);

        targetRenderers = Object.FindObjectsOfType<Renderer>();
    }

    protected override void OnIterationStart()
    {
        foreach (var renderer in targetRenderers)
        {
            if (renderer.CompareTag("Randomizable"))
            {
                renderer.material.mainTexture = textureParameter.Sample();
            }
        }
    }
}
```
## Creating Photorealistic Environments

### Universal Render Pipeline (URP) Setup
For robotics applications requiring visual realism:
- Enable URP features:
  - Screen Space Ambient Occlusion (SSAO)
  - Screen-space reflections (built into HDRP; in URP they require a custom renderer feature)
  - Post-processing (Bloom, Color Grading)
- Configure lighting:
```csharp
using UnityEngine;

public class RealisticLightingSetup : MonoBehaviour
{
    public Light sunLight;
    public ReflectionProbe environmentProbe;

    void Start()
    {
        // Configure sun
        sunLight.type = LightType.Directional;
        sunLight.shadows = LightShadows.Soft;
        sunLight.shadowResolution = UnityEngine.Rendering.LightShadowResolution.VeryHigh;
        sunLight.color = new Color(1f, 0.95f, 0.9f); // Warm sunlight
        sunLight.intensity = 1.5f;

        // Configure environment reflections
        environmentProbe.mode = UnityEngine.Rendering.ReflectionProbeMode.Realtime;
        environmentProbe.refreshMode = UnityEngine.Rendering.ReflectionProbeRefreshMode.EveryFrame;
    }
}
```
### PBR Materials for Robots
Create physically accurate materials:
```csharp
using UnityEngine;

public class MetalMaterialSetup : MonoBehaviour
{
    void Start()
    {
        var renderer = GetComponent<Renderer>();
        var material = new Material(Shader.Find("Universal Render Pipeline/Lit"));

        // Brushed metal appearance
        material.SetFloat("_Metallic", 0.9f);
        material.SetFloat("_Smoothness", 0.7f);
        material.SetColor("_BaseColor", new Color(0.8f, 0.8f, 0.85f));

        renderer.material = material;
    }
}
```
## Human-Robot Interaction Scenarios

### Why Unity for HRI?
Unity excels at human-robot interaction research:
- Character animation: Realistic human movements
- Facial expressions: Emotional responses
- Voice integration: Speech synthesis and recognition
- Social scenarios: Crowd simulation
### Setting Up Human Characters
```csharp
using UnityEngine;

public class HumanCharacterController : MonoBehaviour
{
    private Animator animator;
    public Transform robotTarget;
    public float interactionDistance = 2f;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        float distance = Vector3.Distance(transform.position, robotTarget.position);

        if (distance < interactionDistance)
        {
            // Face the robot
            Vector3 direction = robotTarget.position - transform.position;
            direction.y = 0;
            transform.rotation = Quaternion.LookRotation(direction);

            // Trigger interaction animation
            animator.SetBool("IsInteracting", true);
        }
        else
        {
            animator.SetBool("IsInteracting", false);
        }
    }
}
```
### Gesture Recognition Integration
```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Std;

public class GesturePublisher : MonoBehaviour
{
    private ROSConnection ros;
    private Animator humanAnimator;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<StringMsg>("/detected_gesture");
        humanAnimator = GetComponent<Animator>();
    }

    public void OnGestureDetected(string gestureName)
    {
        // Publish gesture to ROS 2
        StringMsg msg = new StringMsg { data = gestureName };
        ros.Publish("/detected_gesture", msg);

        // Trigger corresponding animation
        humanAnimator.SetTrigger(gestureName);
    }
}
```
## Practical Exercise: Complete Unity-ROS 2 Pipeline

### Goal
Create a Unity simulation that:
- Imports a robot from URDF
- Generates synthetic training data
- Communicates with ROS 2
### Step-by-Step Implementation
1. Project Structure:

```text
Assets/
├── Robots/
│   └── my_robot.urdf
├── Scripts/
│   ├── RobotController.cs
│   ├── DataGenerator.cs
│   └── ROSBridge.cs
├── Scenes/
│   └── RoboticsSimulation.unity
└── Randomizers/
    └── CustomRandomizers.cs
```
2. Main Controller Script:

```csharp
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;
using RosMessageTypes.Geometry;

public class RobotSimulationController : MonoBehaviour
{
    private ROSConnection ros;
    private Camera robotCamera;
    public ArticulationBody robotBase;

    // ROS topics
    private string imageTopic = "/camera/image_raw";
    private string cmdVelTopic = "/cmd_vel";
    private string jointStateTopic = "/joint_states";

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();

        // Register publishers
        ros.RegisterPublisher<ImageMsg>(imageTopic);
        ros.RegisterPublisher<JointStateMsg>(jointStateTopic);

        // Subscribe to velocity commands
        ros.Subscribe<TwistMsg>(cmdVelTopic, OnCmdVelReceived);

        // Start publishing at ~30 Hz
        InvokeRepeating(nameof(PublishSensorData), 0.1f, 0.033f);
    }

    void OnCmdVelReceived(TwistMsg msg)
    {
        // Apply velocity to the robot base
        Vector3 linearVel = new Vector3(
            (float)msg.linear.x,
            (float)msg.linear.y,
            (float)msg.linear.z
        );
        Vector3 angularVel = new Vector3(
            (float)msg.angular.x,
            (float)msg.angular.y,
            (float)msg.angular.z
        );
        robotBase.velocity = linearVel;
        robotBase.angularVelocity = angularVel;
    }

    void PublishSensorData()
    {
        // Publish camera image
        // Publish joint states
    }
}
```
3. Launch the ROS 2 Side:

```bash
# Terminal 1: ROS-TCP Endpoint
ros2 run ros_tcp_endpoint default_server_endpoint

# Terminal 2: Verify connection
ros2 topic list

# Terminal 3: Send commands
ros2 topic pub /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.5}, angular: {z: 0.1}}"
```
## Hybrid Simulation: Unity + Gazebo

### When to Use Both
For complex projects, combine simulators:
| Component | Simulator | Reason |
|---|---|---|
| Physics simulation | Gazebo | More accurate dynamics |
| Visual rendering | Unity | Photorealistic output |
| ML training data | Unity | Domain randomization |
| Control testing | Gazebo | ROS 2 native |
| Demonstration | Unity | Better visuals |
### Architecture for Hybrid Simulation
```text
           Hybrid Simulation Architecture

  +---------------+    Sync    +---------------+
  |    Gazebo     | ---------> |     Unity     |
  |   (Physics)   |            |  (Rendering)  |
  +-------+-------+            +-------+-------+
          |                            |
          v                            v
  +---------------+            +---------------+
  | Joint States  |            | Camera Images |
  | Sensor Data   |            | Training Data |
  | Physics State |            | Visualization |
  +-------+-------+            +-------+-------+
          |                            |
          +-------------+--------------+
                        v
                +---------------+
                |     ROS 2     |
                |  Middleware   |
                +---------------+
```
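One practical wrinkle in this architecture is rate mismatch: Gazebo may step physics at hundreds of hertz while Unity renders at 30-60 Hz. A common fix is to interpolate stamped joint states on the rendering side. A hedged Python sketch of that interpolation (not tied to either simulator's API; times in seconds, positions in radians):

```python
def interpolate_joint_states(t, t0, q0, t1, q1):
    """Linearly interpolate joint positions between two stamped physics states."""
    if t1 <= t0:
        return list(q1)  # degenerate interval: just take the newer state
    alpha = min(max((t - t0) / (t1 - t0), 0.0), 1.0)  # clamp to [0, 1]
    return [a + alpha * (b - a) for a, b in zip(q0, q1)]

# Physics states at t=0.000 s and t=0.010 s; the render frame lands at t=0.004 s
q_render = interpolate_joint_states(0.004, 0.000, [0.0, 1.0], 0.010, [0.1, 1.2])
```

Clamping `alpha` means a late render frame simply holds the newest physics state instead of extrapolating past it.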
## Summary
In this chapter, you learned:
- Unity's role in robotics: Photorealistic simulation and synthetic data
- ROS 2 integration: Bidirectional communication via ROS-TCP-Connector
- URDF import: Bringing ROS robots into Unity with articulated physics
- Synthetic data generation: Using the Perception package for ML training
- Domain randomization: Creating robust models that transfer to reality
- Human-robot interaction: Building scenarios with human characters
- Hybrid approaches: Combining Unity and Gazebo strengths
Unity fills a critical gap in the robotics simulation ecosystem by providing the visual fidelity needed for modern perception systems.
## Further Reading
- Unity Robotics Hub - Official Unity robotics resources
- Unity Perception Package - Synthetic data generation
- ROS-TCP-Connector - ROS 2 integration
- Unity ML-Agents - Reinforcement learning
- Domain Randomization Paper - Original research on sim-to-real transfer
## Next Week Preview
In Chapter 8, we enter Module 3: The AI-Robot Brain with NVIDIA Isaac Sim:
- GPU-accelerated physics simulation
- Integration with NVIDIA's AI stack
- Advanced perception with Isaac ROS
- Synthetic data at scale with Omniverse Replicator