Module 1: Gazebo Fundamentals & Physics Simulation
Module 1 introduces Gazebo as your primary open‑source physics simulator and shows how to represent robots, environments, and sensors using SDF. You will learn how rigid body dynamics, collisions, and constraints are modeled, and how to connect Gazebo to your ROS 2 system so that your capstone humanoid can move in a simulated world before it ever moves in reality.
1.1 Why Simulation? The Robotics Dilemma
Cost, Risk, and Iteration
Humanoid robots are expensive (from tens to hundreds of thousands of dollars) and fragile. A single unstable controller or bad trajectory can produce:
- Broken hardware (joints, gearboxes, sensors)
- Unsafe behaviors near humans
- Days or weeks of downtime while repairs are made
Simulation solves this by providing a cheap, safe, and fast environment for experimentation:
- Run thousands of trials without wearing out hardware
- Test edge cases (slips, pushes, near-collisions) that would be unacceptable on a real robot
- Reproduce scenarios precisely for debugging and ablation studies
What is a Digital Twin?
A digital twin is a virtual replica of your robot and environment with:
- Geometry & mass: URDF/SDF models, meshes, inertia tensors
- Dynamics: Joints, limits, friction, damping, gravity
- Sensors: Cameras, LiDAR, IMUs, force/torque sensors
- Environment: Floors, obstacles, lighting, materials
The goal is not perfect one‑to‑one fidelity, but a simulation that is close enough that behavior and algorithms transfer with minimal surprises once you move to real hardware.
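As a taste of how the geometry and mass ingredients are expressed, here is a minimal SDF link sketch (the `torso` name, inertia values, and mesh path are illustrative placeholders, not taken from a real model):

```xml
<link name="torso">
  <inertial>
    <mass>10.0</mass>
    <inertia>  <!-- inertia tensor about the link's center of mass, kg·m² -->
      <ixx>0.15</ixx><iyy>0.12</iyy><izz>0.08</izz>
      <ixy>0</ixy><ixz>0</ixz><iyz>0</iyz>
    </inertia>
  </inertial>
  <collision name="torso_collision">
    <!-- a simple box is cheaper for contact checks than the visual mesh -->
    <geometry><box><size>0.3 0.2 0.5</size></box></geometry>
  </collision>
  <visual name="torso_visual">
    <geometry><mesh><uri>model://humanoid/meshes/torso.dae</uri></mesh></geometry>
  </visual>
</link>
```

Note the common pattern of a coarse collision shape paired with a detailed visual mesh: the physics engine only ever touches the box.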
Kinematics vs Physics Simulation
- Kinematics-only simulation:
  - Computes joint positions and end‑effector poses given joint commands
  - Ignores contact forces, friction, inertia, and impacts
  - Good for early motion planning and collision‑free geometry checks
- Full physics simulation:
  - Models rigid body dynamics, contacts, friction, and joint torques
  - Captures falls, slips, and unstable gaits
  - Essential for biped balance, manipulation forces, and realistic sensor data
Gazebo supports both, but for humanoids you will rely primarily on full physics simulations.
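To make the distinction concrete, here is a minimal, self-contained sketch (plain Python, not Gazebo code) contrasting the two views for a single 1‑DOF link; the link length and angles are arbitrary:

```python
import math

# Kinematics only: where is the link's end-point, given a commanded angle?
def forward_kinematics(link_len, joint_angle):
    """End-point (x, z) of a link hanging from the origin, rotated by joint_angle (rad)."""
    return (link_len * math.sin(joint_angle), -link_len * math.cos(joint_angle))

# Full dynamics: how does the same link actually move under gravity?
def simulate_pendulum(theta0, steps=1000, dt=0.001, g=9.81, link_len=0.5):
    """Integrate an undamped pendulum — a one-joint 'physics simulation'."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        alpha = -(g / link_len) * math.sin(theta)  # angular acceleration from gravity
        omega += alpha * dt                        # semi-implicit Euler step
        theta += omega * dt
    return theta

# Kinematics reports whatever pose we command; dynamics makes a displaced
# link swing back through equilibrium whether we commanded that or not.
```

The kinematic function never disagrees with the command; the dynamic one does, which is exactly why balance and contact work needs full physics.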
1.2 Gazebo Architecture and Setup
Gazebo Server–Client Model
Gazebo follows a server–client architecture:
- Server (`gz sim -s`; `gzserver` in Gazebo Classic):
  - Runs the physics engine
  - Advances simulation time, handles collisions, integrates equations of motion
  - Loads worlds, models, and plugins
- Client (`gz sim -g`; `gzclient` in Gazebo Classic):
  - Renders 3D visualization
  - Provides GUI tools (play/pause, reset, model insertion, plotting)
  - Connects to the server over a network transport
ROS 2 nodes typically interact with the server via the gazebo_ros bridge and plugins that publish/subscribe to ROS topics.
Installing Gazebo with ROS 2 (Ubuntu 22.04)
For ROS 2 Humble/Iron with Gazebo Classic (the newer Gazebo Fortress/Garden releases use the `ros_gz` bridge packages instead):
- Install ROS 2 and Gazebo following the official docs
- Add `gazebo_ros_pkgs` for ROS 2 integration:

```bash
sudo apt update
sudo apt install ros-humble-gazebo-ros-pkgs
```
You will maintain a ROS 2 workspace that contains:
- Gazebo worlds (`.world`/`.sdf`)
- Robot description packages (URDF/SDF)
- Gazebo plugins (C++ or Python via wrappers)
- Launch files that start Gazebo + ROS 2 nodes together
Worlds, Models, and Plugins
Gazebo organizes content into:
- Worlds: Top-level SDF files describing the environment, physics settings, and which models are present
- Models: Reusable definitions of robots, objects, and structures
- Plugins: Compiled code that extends Gazebo (controllers, custom sensors, logging)
Typical directory layout:
```text
my_sim_ws/
  models/
    humanoid/
      model.sdf
      meshes/
  worlds/
    lab_world.sdf
  src/
    humanoid_plugins/
```
1.3 SDF: Simulation Description Format
SDF vs URDF
- URDF:
  - Originally created for ROS
  - Focused on kinematics, geometry, and basic inertial properties
  - Widely used for robot description and RViz visualization
- SDF:
  - Created for Gazebo as a simulation‑first format
  - Supports worlds, lights, sensors, materials, and advanced physics settings
  - Better suited for full simulation, including multiple robots and environments
For this chapter:
- You will often author the robot in URDF (Chapter 2)
- Then convert or wrap it into SDF for Gazebo
Core SDF Structure
At a high level, an SDF file is:
- `world` → physics settings, global properties, top‑level models
- `model` → links, joints, plugins
- `link` → inertial, collision, visual, sensors
- `joint` → type, limits, dynamics, parent/child links
You will:
- Define at least one world with ground plane, gravity, and lighting
- Include your humanoid model as a nested `model` element
- Attach sensors to appropriate links (heads, torsos, hands)
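A minimal world following this structure might look like the sketch below (the `ground_plane` and `sun` models ship with Gazebo; the humanoid include path and spawn pose are illustrative):

```xml
<?xml version="1.0"?>
<sdf version="1.7">
  <world name="lab_world">
    <gravity>0 0 -9.81</gravity>
    <!-- standard ground plane and directional light -->
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>
    <!-- the robot, nested as its own model, spawned 1 m above the floor -->
    <include>
      <uri>model://humanoid</uri>
      <pose>0 0 1.0 0 0 0</pose>
    </include>
  </world>
</sdf>
```

Spawning slightly above the floor avoids an initial interpenetration that would otherwise produce a large corrective contact impulse on the first step.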
1.4 Physics Simulation: Rigid Body Dynamics
Forces, Torques, and Constraints
Gazebo uses physics engines such as ODE, Bullet, or DART to:
- Integrate Newton–Euler equations for each link
- Enforce joint constraints and limits
- Compute contact forces when surfaces collide
Key concepts:
- Rigid body: A link with mass and inertia tensor
- Joint: Constrains relative motion between links (revolute, prismatic, fixed, etc.)
- Contact: Surface interactions with friction and restitution
For humanoids, contact‑rich behaviors (feet on ground, hand–object interaction) are especially sensitive to friction coefficients and time‑step settings.
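Friction is configured per collision element in SDF. A sketch for a foot, assuming the ODE engine (the `mu` values are illustrative starting points, not tuned constants):

```xml
<collision name="foot_collision">
  <geometry><box><size>0.2 0.1 0.05</size></box></geometry>
  <surface>
    <friction>
      <ode>
        <mu>0.9</mu>   <!-- friction coefficient, first direction -->
        <mu2>0.9</mu2> <!-- friction coefficient, second direction -->
      </ode>
    </friction>
  </surface>
</collision>
```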
Time Stepping and Stability
Simulation advances in discrete time steps:
- Time step (`dt`): e.g., 0.001 s (1 kHz)
- Solver iterations: Number of constraint solver iterations per step
Trade‑offs:
- Larger `dt` → faster simulation but risk of missed collisions, penetration, or numerical instability
- Smaller `dt` → more accurate and stable but slower
You will generally start with:
- `dt` in the 0.0005–0.001 s range
- Adequate solver iterations for stable biped contacts
Then tune upward or downward based on performance and stability.
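The instability risk is easy to demonstrate outside Gazebo. The sketch below integrates a stiff undamped spring (a stand-in for a hard contact) with the semi-implicit Euler scheme most engines use; the stiffness and step sizes are illustrative:

```python
def simulate_spring(dt, steps, k=1e4, m=1.0, x0=0.01):
    """Semi-implicit Euler on a stiff spring; returns final |displacement|."""
    x, v = x0, 0.0
    for _ in range(steps):
        a = -(k / m) * x   # spring acceleration
        v += a * dt        # update velocity first...
        x += v * dt        # ...then position with the new velocity
    return abs(x)

# Natural frequency omega = sqrt(k/m) = 100 rad/s. Semi-implicit Euler is
# stable only while dt * omega < 2, i.e. dt < 0.02 s here:
bounded  = simulate_spring(dt=0.001, steps=5000)  # stays near the 0.01 m amplitude
diverged = simulate_spring(dt=0.03,  steps=50)    # grows without bound
```

The same effect in Gazebo shows up as links jittering on contact and then "exploding"; halving `dt` (or raising solver iterations) is usually the first fix to try.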
1.5 Sensor Simulation in Gazebo
Built‑In Sensors
Gazebo provides built‑in sensor types:
- Camera: RGB images
- Depth camera: Depth images and point clouds
- LiDAR: 2D or 3D scanning rangefinders
- IMU: Accelerometer + gyroscope + (optional) magnetometer
- GPS: Position and velocity in WGS84 frames
Each sensor:
- Attaches to a `link` in your robot model
- Has update rate, field of view, resolution, and noise parameters
- Can be connected to ROS 2 topics using `gazebo_ros` plugins
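Putting those three points together, a camera attached to a head link might be declared roughly like this (link and sensor names, rate, and resolution are illustrative):

```xml
<link name="head">
  <sensor name="head_camera" type="camera">
    <update_rate>30</update_rate>  <!-- Hz -->
    <camera>
      <horizontal_fov>1.396</horizontal_fov>  <!-- ~80 degrees -->
      <image>
        <width>640</width>
        <height>480</height>
      </image>
    </camera>
  </sensor>
</link>
```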
Noise and Realism
Real sensors are noisy, biased, and delayed. If your simulation assumes:
- Perfect depth maps
- Noise‑free IMU data
- Instantaneous camera frames
…your perception and control pipelines will overfit to unrealistic conditions and fail on hardware.
You will:
- Add Gaussian noise to ranges, pixels, and accelerations
- Model approximate bias and drift for IMUs
- Introduce realistic latency (tens of milliseconds)
So that your controllers and perception algorithms are robust to real‑world imperfections.
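The noise-plus-drift idea can be sketched in a few lines of plain Python (the class, parameter values, and single-axis simplification are illustrative; Gazebo's own `<noise>` element does the equivalent per sensor axis):

```python
import random

class ImuNoiseModel:
    """Gaussian white noise plus a slowly drifting bias on one accelerometer axis."""

    def __init__(self, noise_std=0.02, bias_walk_std=0.0001, seed=0):
        self._rng = random.Random(seed)
        self._bias = 0.0
        self._noise_std = noise_std
        self._bias_walk_std = bias_walk_std

    def corrupt(self, true_accel):
        # The bias performs a random walk (drift); each sample also gets white noise.
        self._bias += self._rng.gauss(0.0, self._bias_walk_std)
        return true_accel + self._bias + self._rng.gauss(0.0, self._noise_std)

# A stationary robot should read gravity, but never exactly:
imu = ImuNoiseModel()
readings = [imu.corrupt(9.81) for _ in range(100)]
```

An estimator tested only against the clean `9.81` value will misbehave the moment a real IMU's bias drifts; feeding it `readings` instead exposes that early.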
1.6 ROS 2 Integration: gazebo_ros Bridge
Connecting Gazebo and ROS 2
The gazebo_ros bridge exposes Gazebo’s internal entities as ROS 2 interfaces:
- Simulated sensors → ROS 2 topics (`sensor_msgs/Image`, `sensor_msgs/Imu`, etc.)
- Model states → topics/services for pose and velocity
- Joint states and commands → controllers that speak ROS messages
You will:
- Use `gazebo_ros` plugins in SDF to:
  - Publish sensor data to ROS 2
  - Subscribe to motor commands from ROS 2
- Write launch files that:
  - Start Gazebo with your world
  - Launch perception nodes, controllers, and RViz
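As a sketch of the plugin side, a `gazebo_ros` camera driver is attached inside the sensor's SDF element; the plugin filename comes from `gazebo_ros_pkgs`, while the plugin name and namespace below are illustrative:

```xml
<sensor name="head_camera" type="camera">
  <!-- ...camera parameters as before... -->
  <plugin name="head_camera_driver" filename="libgazebo_ros_camera.so">
    <ros>
      <namespace>/humanoid</namespace>  <!-- topics appear under /humanoid/... -->
    </ros>
  </plugin>
</sensor>
```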
Time Synchronization
Simulation typically runs with sim time (/clock topic), not wall clock:
- ROS 2 nodes must set `use_sim_time` to `true`
- All timestamps (sensor messages, TF, logs) must use sim time
This is essential for reproducibility:
- You can pause simulations, and record and replay runs via rosbag
- Perception and control algorithms see a consistent timeline
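A common way to apply `use_sim_time` everywhere is a parameter file using the ROS 2 `/**` wildcard, loaded with each node; a minimal sketch (the filename is illustrative):

```yaml
# sim_params.yaml — applied to every node so all timestamps follow /clock
/**:
  ros__parameters:
    use_sim_time: true
```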
1.7 Hands-On Lab: Build Your First Gazebo World
Lab Goals
In this lab, you will:
- Create a basic Gazebo world with:
- Ground plane, gravity, and physics engine settings
- Simple environment elements (walls, boxes)
- Import a simplified humanoid URDF from Chapter 2
- Attach core sensors:
- RGB camera
- Depth camera or LiDAR
- IMU
- Use `gazebo_ros` to:
  - Publish sensor data to ROS 2 topics
  - Receive joint commands from a ROS 2 node
- Implement a minimal joint controller that moves the humanoid in simulation
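As a sketch of what the minimal controller might compute, here is a plain-Python sinusoidal target generator (the joint names are illustrative, and publishing these values to a ROS 2 command topic is left to your node; no ROS imports are used here):

```python
import math

def joint_targets(t, amplitude=0.3, freq_hz=0.5, joints=("l_knee", "r_knee")):
    """Sinusoidal joint position targets at time t (seconds).

    Enough to verify the command path end to end: if the simulated knees
    swing when these values reach the command topic, the bridge works.
    """
    angle = amplitude * math.sin(2.0 * math.pi * freq_hz * t)
    # Alternate the sign so paired joints move out of phase.
    return {name: angle if i % 2 == 0 else -angle
            for i, name in enumerate(joints)}
```

Start with a small amplitude and low frequency: a stable, boring motion is the right first test of a new simulation, not a dynamic gait.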
Success Criteria
- The humanoid spawns at the correct location and does not fall through the floor
- Publishing to a motor command topic causes visible joint motion
- Simulated sensors publish reasonable data:
- Camera images show the environment
- Depth/LiDAR sees obstacles at correct ranges
- IMU reflects gravity and motion
- The simulation remains stable (no exploding joints or wild oscillations)
By finishing Module 1, you will have a working Gazebo‑based digital twin skeleton: a physics‑enabled humanoid with ROS 2‑connected sensors and actuators, ready for higher‑level perception and planning in later modules.