Workflow: CARLA Simulation Setup and First Steps
| Knowledge Sources | |
|---|---|
| Domains | Autonomous_Driving, Simulation |
| Last Updated | 2026-02-15 12:00 GMT |
Overview
End-to-end process for connecting to a CARLA simulator instance, loading a map, and populating the scene with vehicles and sensors for autonomous driving simulation.
Description
This workflow covers the fundamental steps for setting up a CARLA simulation session from a Python client. It begins with establishing a client-server connection, proceeds through map selection and loading, and culminates in spawning an ego vehicle with attached sensors. The process introduces the core CARLA concepts: Client, World, Blueprint Library, Actors, and Transforms. Upon completion, the user has a running simulation with an ego vehicle that can be controlled manually or via autopilot, and sensors actively producing data.
Usage
Execute this workflow when starting a new CARLA simulation session for the first time, or whenever you need to set up a fresh simulation environment for data collection, algorithm testing, or scenario development. The prerequisite is a running CARLA server (either from a packaged release or built from source).
Execution Steps
Step 1: Connect to the CARLA Server
Establish a TCP connection from a Python client to a running CARLA server instance. The client is initialized with a hostname and port (default localhost:2000) and a connection timeout. Once connected, retrieve the World object which provides access to all simulation state including actors, map, weather, and settings.
Key considerations:
- The server must be running before connecting
- Default port is 2000; the streaming port is automatically assigned as port+1
- Set a reasonable client timeout (e.g., 10 seconds) to detect connection failures early
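The connection sequence above can be sketched as follows. This is a minimal sketch that assumes a CARLA server is already running on the default port; the host, port, and timeout values are illustrative:

```python
import carla

# Connect to a CARLA server on localhost:2000 (streaming uses 2001).
client = carla.Client('localhost', 2000)
client.set_timeout(10.0)  # fail fast if the server is unreachable

# The World object is the entry point to all simulation state:
# actors, map, weather, and settings.
world = client.get_world()
print(client.get_server_version())
```

If the server is not running, the `get_world()` call raises a timeout error after 10 seconds rather than hanging indefinitely.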
Step 2: Configure World Settings
Adjust the simulation settings to match the use case. The primary decision is whether to run in synchronous or asynchronous mode. Synchronous mode ensures deterministic simulation by requiring the client to explicitly tick the server. Set the fixed delta seconds to control simulation timestep granularity.
Key considerations:
- Synchronous mode is required for reproducible results and proper sensor synchronization
- Typical fixed_delta_seconds values range from 0.05 (20 Hz) to 0.1 (10 Hz)
- Always restore original settings before disconnecting to avoid leaving the server in sync mode
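A sketch of the synchronous-mode setup and teardown described above, assuming a running server on the default port; the 100-tick loop is just a placeholder for the actual workload:

```python
import carla

client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()

# Save the original settings so they can be restored on exit.
original_settings = world.get_settings()

settings = world.get_settings()
settings.synchronous_mode = True      # client drives the simulation clock
settings.fixed_delta_seconds = 0.05   # 20 Hz fixed timestep
world.apply_settings(settings)

try:
    for _ in range(100):
        world.tick()  # advance the simulation by exactly one fixed step
finally:
    # Leave the server as we found it, even on error.
    world.apply_settings(original_settings)
```

Wrapping the tick loop in `try/finally` is one way to guarantee the settings are restored even if the workload raises.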
Step 3: Load a Map
Select and load a simulation map. CARLA ships multiple pre-built maps (Town01 through Town12, plus specialty maps; the exact set varies by release and installed content packs). Each map has different road layouts, intersections, and environmental characteristics suited for different testing scenarios. Loading a map resets the world state.
Key considerations:
- Use get_available_maps() to list all installed maps
- Map loading destroys all existing actors and resets settings
- Map layers (buildings, vegetation, etc.) can be individually loaded or unloaded to reduce rendering cost
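A sketch of map selection and layer management, again assuming a running server; `Town03` is an arbitrary choice here and must be among the maps reported as available:

```python
import carla

client = carla.Client('localhost', 2000)
client.set_timeout(30.0)  # map loading can take tens of seconds

# List the installed maps before choosing one.
print(client.get_available_maps())

# load_world destroys all existing actors, resets settings,
# and returns a fresh World bound to the new map.
world = client.load_world('Town03')

# Optionally strip layers to reduce rendering cost.
world.unload_map_layer(carla.MapLayer.Buildings)
world.unload_map_layer(carla.MapLayer.Foliage)
```

Note the longer timeout: map loading is much slower than ordinary RPC calls, and a 10-second timeout can spuriously fail here.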
Step 4: Browse and Select Blueprints
Access the Blueprint Library to discover available actor types. Blueprints define the properties of actors (vehicles, walkers, sensors, props). Filter the library by pattern to find specific actor types, then configure blueprint attributes such as color or sensor resolution.
Key considerations:
- Filter patterns use wildcards: 'vehicle.*' for all vehicles, 'sensor.camera.*' for cameras
- Blueprints have configurable attributes (e.g., color, number_of_wheels, role_name)
- Setting role_name to 'hero' marks the ego vehicle for special treatment by Traffic Manager
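The blueprint browsing and configuration flow might look like the following sketch; the Tesla Model 3 blueprint ID and the red color value are illustrative choices:

```python
import carla

client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()
bp_lib = world.get_blueprint_library()

# Wildcard filtering over the library.
for bp in bp_lib.filter('vehicle.*'):
    print(bp.id)

# Look up a specific blueprint and configure its attributes.
vehicle_bp = bp_lib.find('vehicle.tesla.model3')
vehicle_bp.set_attribute('role_name', 'hero')  # mark as the ego vehicle
if vehicle_bp.has_attribute('color'):
    vehicle_bp.set_attribute('color', '255,0,0')  # RGB string
```

Guarding `set_attribute('color', ...)` with `has_attribute` matters because not every vehicle blueprint exposes a color attribute.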
Step 5: Spawn the Ego Vehicle
Retrieve the map spawn points and select one for the ego vehicle. Spawn the vehicle using the chosen blueprint at the selected transform. Use try_spawn_actor for fault-tolerant spawning that returns None instead of raising on collision.
Key considerations:
- Spawn points are pre-defined safe locations on the road network
- try_spawn_actor is preferred over spawn_actor to handle occupied spawn points gracefully
- The spectator camera can be moved to the ego vehicle location for visual monitoring
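A fault-tolerant spawn sketch following the step above, assuming a running server and that the chosen blueprint ID exists; the spectator offset (30 m overhead, pitched down) is an arbitrary vantage point:

```python
import random

import carla

client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()
bp_lib = world.get_blueprint_library()
vehicle_bp = bp_lib.find('vehicle.tesla.model3')

# Shuffle the pre-defined spawn points and take the first free one.
spawn_points = world.get_map().get_spawn_points()
random.shuffle(spawn_points)
vehicle = None
for sp in spawn_points:
    vehicle = world.try_spawn_actor(vehicle_bp, sp)  # None if occupied
    if vehicle is not None:
        break

# Move the spectator above the ego vehicle for visual monitoring.
spectator = world.get_spectator()
transform = vehicle.get_transform()
spectator.set_transform(carla.Transform(
    transform.location + carla.Location(z=30),
    carla.Rotation(pitch=-90)))
```

Looping over shuffled spawn points with `try_spawn_actor` avoids both spawn collisions and a bias toward the first spawn point in the list.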
Step 6: Attach Sensors to the Ego Vehicle
Create sensor blueprints (cameras, LiDAR, GNSS, IMU), configure their attributes (resolution, field of view, tick rate), and spawn them attached to the ego vehicle. Register listen callbacks to process incoming sensor data.
Key considerations:
- Sensors are spawned with a relative transform to the parent vehicle
- Attachment types include Rigid (locked to vehicle), SpringArm (follows with damping), and SpringArmGhost (passes through geometry)
- Each sensor needs a listen callback registered to receive data
- sensor_tick attribute controls the update frequency (0.0 means every simulation tick)
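A camera-attachment sketch corresponding to the step above. It assumes the ego vehicle from the previous step is still alive; the mounting position (roughly at the windshield), resolution, and output path are illustrative:

```python
import carla

client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()
bp_lib = world.get_blueprint_library()

# Grab the previously spawned ego vehicle.
vehicle = world.get_actors().filter('vehicle.*')[0]

cam_bp = bp_lib.find('sensor.camera.rgb')
cam_bp.set_attribute('image_size_x', '800')
cam_bp.set_attribute('image_size_y', '600')
cam_bp.set_attribute('sensor_tick', '0.0')  # produce data every tick

# Transform is relative to the parent vehicle; rigidly locked to it.
cam_transform = carla.Transform(carla.Location(x=1.5, z=2.4))
camera = world.spawn_actor(
    cam_bp, cam_transform, attach_to=vehicle,
    attachment_type=carla.AttachmentType.Rigid)

# The listen callback fires on a background thread for each frame.
camera.listen(lambda image: image.save_to_disk('_out/%06d.png' % image.frame))
```

Remember to call `camera.stop()` and `camera.destroy()` during teardown; sensors left listening keep producing data and occupy server resources.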
Step 7: Enable Autopilot or Apply Manual Control
Either enable the Traffic Manager autopilot for autonomous driving or apply manual vehicle controls (throttle, steer, brake). When using autopilot, the Traffic Manager handles all driving decisions. For manual control, construct a VehicleControl object with the desired values each frame.
Key considerations:
- set_autopilot(True) registers the vehicle with the default Traffic Manager
- Manual control requires applying a VehicleControl every simulation tick
- In synchronous mode, call world.tick() to advance the simulation after applying controls
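Both control options can be sketched as follows, assuming the world is already in synchronous mode and the ego vehicle exists; the throttle value and 200-tick duration are arbitrary:

```python
import carla

client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()
vehicle = world.get_actors().filter('vehicle.*')[0]

# Option A: hand driving over to the default Traffic Manager.
vehicle.set_autopilot(True)

# Option B: manual control, reapplied every simulation tick.
vehicle.set_autopilot(False)
for _ in range(200):
    vehicle.apply_control(carla.VehicleControl(
        throttle=0.5, steer=0.0, brake=0.0))
    world.tick()  # required in synchronous mode to advance time
```

In asynchronous mode the `world.tick()` calls would be omitted, since the server advances on its own clock.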