Architecture Refresh
Higher Level Architecture
Entire Architecture Graph v0.2
Entire Architecture Graph v0.1
You can also get the vectorized version from the diagrams.net link:
Rationale
To realize the Bus ODD (Operational Design Domain) requirements, we will refresh the architecture of Autoware. With the Core and Universe repositories coming, there is a need for higher-level module definitions.
The expected inputs and outputs of these high-level modules should be defined thoroughly to enable development around a unified core.
This should also make it easier for new developers to understand the codebase and simpler for them to contribute.
Modules (v0.1)
I've created the draft architecture to start from. I've also added the basic expectations for each module.
We should also go over the existing messages and finalize the interfaces between these modules. We have a nice message structure defined in Autoware.Auto, but we should review it to enable connections to potential new modules, and deprecate the messages/fields that aren't or won't be used.
Drivers
Drivers are the gateway between raw sensory information and the ROS2 messaging system. They take inputs in various formats such as Serial, TCP/UDP, or CAN bus and output ROS2 messages.
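As a rough illustration (not a proposal for any specific driver), a minimal lidar driver sketch could look like the following. The UDP port, topic name, and `parse_packet` decoding are placeholders for whatever the actual sensor defines.

```python
# Minimal driver sketch: reads raw UDP packets from a hypothetical lidar and
# republishes them as sensor_msgs/PointCloud2. The packet layout and
# parse_packet() are placeholders, not a real sensor protocol.
import socket

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2


class LidarDriver(Node):
    def __init__(self):
        super().__init__('lidar_driver')
        self.pub = self.create_publisher(PointCloud2, 'points_raw', 10)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind(('0.0.0.0', 2368))  # port is an assumption
        self.sock.setblocking(False)
        self.timer = self.create_timer(0.01, self.poll)

    def poll(self):
        try:
            packet, _ = self.sock.recvfrom(65535)
        except BlockingIOError:
            return
        self.pub.publish(self.parse_packet(packet))

    def parse_packet(self, packet) -> PointCloud2:
        # Placeholder: a real driver decodes the vendor packet format here.
        msg = PointCloud2()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'lidar'
        return msg


def main():
    rclpy.init()
    rclpy.spin(LidarDriver())


if __name__ == '__main__':
    main()
```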
Driver Outputs
Perception
The Perception module is expected to perform three main tasks:
- Providing downsampled, combined point cloud to the localization stack for scan matching purposes
- Providing 3D obstacle and object detections/predictions to the planning stack
- Maintaining a Traffic Light Recognition server that reports the state of a queried traffic light (a minimal sketch follows this list)
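To make the traffic light query/response idea concrete, here is a minimal sketch of the recognition server as a query/response topic pair. The message types (std_msgs) and topic names are placeholders; the real interfaces still need to be settled as part of the message review mentioned above.

```python
# Sketch of the Traffic Light Recognition "server" as a query/response pair of
# topics. The message types are stand-ins (std_msgs) until the real
# TrafficLightQuery/Response interfaces are finalized.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Int64, String


class TrafficLightServer(Node):
    def __init__(self):
        super().__init__('traffic_light_server')
        # Planning asks about a specific light by id; Perception answers with its state.
        self.create_subscription(Int64, 'traffic_light_query', self.on_query, 10)
        self.pub = self.create_publisher(String, 'traffic_light_response', 10)

    def on_query(self, query: Int64):
        self.pub.publish(String(data=self.classify(query.data)))

    def classify(self, light_id: int) -> str:
        # Placeholder: the real node would crop the camera image around the
        # light's map position and run a classifier.
        return 'unknown'


def main():
    rclpy.init()
    rclpy.spin(TrafficLightServer())


if __name__ == '__main__':
    main()
```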
Perception Inputs
| name | format |
| --- | --- |
| Camera Images | Defined in Drivers |
| Lidar Point Cloud | Defined in Drivers |
| Radar Point Cloud | Defined in Drivers |
| Vehicle Kinematic State | Defined in Localization |
| Traffic Light Query | Defined in Planning |
Perception Outputs
Localization
Localization Inputs
| name | format |
| --- | --- |
| Camera Images | Defined in Drivers |
| Lidar Point Cloud | Defined in Drivers |
| Radar Point Cloud | Defined in Drivers |
| GNSS Message | Defined in Drivers |
| INS Message | Defined in Drivers |
| Vehicle Odometry | Defined in Vehicle Interface |
Localization Outputs
Planning
Planning Inputs
| name | format |
| --- | --- |
| 3D Object Predictions | Defined in Perception |
| Traffic Light Response | Defined in Perception |
| Vehicle Kinematic State | Defined in Localization |
| Lanelet2 Map | Defined in Map Server |
| Engagement Response | Defined in User Interface |
Planning Outputs
Control
This is the simplest module in terms of connectivity. Control is easy, right?
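As a connectivity-only sketch (the control law itself is omitted), a control node would wire the inputs and outputs listed below together roughly like this, assuming the existing Autoware.Auto message definitions and placeholder topic names:

```python
# Connectivity sketch for the Control module: subscribe to the kinematic state
# and the planned trajectory, publish a control command. Message types are the
# existing Autoware.Auto interfaces; topic names are assumptions.
import rclpy
from rclpy.node import Node
from autoware_auto_msgs.msg import Trajectory, VehicleControlCommand, VehicleKinematicState


class Controller(Node):
    def __init__(self):
        super().__init__('controller')
        self.trajectory = None
        self.create_subscription(Trajectory, 'trajectory', self.on_trajectory, 10)
        self.create_subscription(
            VehicleKinematicState, 'vehicle_kinematic_state', self.on_state, 10)
        self.pub = self.create_publisher(VehicleControlCommand, 'vehicle_control_command', 10)

    def on_trajectory(self, msg: Trajectory):
        self.trajectory = msg

    def on_state(self, state: VehicleKinematicState):
        if self.trajectory is None:
            return
        cmd = VehicleControlCommand()
        # A real controller (e.g. pure pursuit or MPC) would compute the
        # acceleration and steering here from the state and the trajectory.
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(Controller())


if __name__ == '__main__':
    main()
```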
Control Inputs
| name | format |
| --- | --- |
| Vehicle Kinematic State | Defined in Localization |
| Trajectory | Defined in Planning |
Control Outputs
Vehicle Interface
The Vehicle Interface talks to the vehicle over a protocol defined by the vehicle and acts as the gateway between the ROS2 messaging system and the vehicle.
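For illustration, a Vehicle Interface sketch for a CAN-based vehicle might look like the following. The CAN id, payload encoding, and topic name are placeholders; the real mapping depends entirely on the target vehicle, and python-can is only one possible transport.

```python
# Sketch of a Vehicle Interface node: receives VehicleControlCommand messages
# and forwards them to the vehicle as CAN frames. The CAN id and payload
# encoding below are placeholders for the vehicle's actual protocol.
import struct

import can  # python-can
import rclpy
from rclpy.node import Node
from autoware_auto_msgs.msg import VehicleControlCommand


class VehicleInterface(Node):
    def __init__(self):
        super().__init__('vehicle_interface')
        self.bus = can.interface.Bus(channel='can0', bustype='socketcan')
        self.create_subscription(
            VehicleControlCommand, 'vehicle_control_command', self.on_command, 10)

    def on_command(self, cmd: VehicleControlCommand):
        # Placeholder encoding: a real interface maps the command fields to the
        # vehicle's DBC-defined signals.
        payload = struct.pack('<ff', 0.0, 0.0)
        frame = can.Message(arbitration_id=0x100, data=payload, is_extended_id=False)
        self.bus.send(frame)


def main():
    rclpy.init()
    rclpy.spin(VehicleInterface())


if __name__ == '__main__':
    main()
```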
Vehicle Interface Inputs
| name | format |
| --- | --- |
| Vehicle Control Command | Defined in Control |
| Vehicle Signal Commands | Defined in Planning |
Vehicle Interface Outputs
User Interface
User Interface Inputs
| name | format |
| --- | --- |
| Engagement Request | Defined in Planning |
| Lanelet2 Map | Defined in Map Server |
User Interface Outputs
| name | format |
| --- | --- |
| Engagement Response | A new message that defines the response by the user that approves or rejects the request for a certain routine |
| Goal Pose | sensor_msgs/msg/NavSatFix.msg |
Map Server
Map Server reads the Lanelet2 map from the disk and publishes it for other nodes to use for navigation.
It also takes in the current position of the vehicle, reads the feature map around the vehicle from the disk, and publishes it for the Localization module to use.
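A minimal Map Server sketch could load the Lanelet2 map once and publish it with transient-local durability so that late-joining nodes still receive it. The transport message (std_msgs/UInt8MultiArray), topic name, and serialization step are placeholders until the map interface is finalized.

```python
# Sketch of the Map Server: load the Lanelet2 map at startup and publish it
# once with transient-local ("latched") QoS for other nodes to use.
import rclpy
from lanelet2.io import Origin, load
from lanelet2.projection import UtmProjector
from rclpy.node import Node
from rclpy.qos import DurabilityPolicy, QoSProfile
from std_msgs.msg import UInt8MultiArray


class MapServer(Node):
    def __init__(self):
        super().__init__('map_server')
        self.declare_parameter('map_path', '/path/to/lanelet2_map.osm')
        path = self.get_parameter('map_path').value
        # The map origin is a placeholder; it should come from configuration.
        self.lanelet_map = load(path, UtmProjector(Origin(0.0, 0.0)))

        # Transient-local QoS: subscribers that start later still receive the
        # single map message published at startup.
        qos = QoSProfile(depth=1, durability=DurabilityPolicy.TRANSIENT_LOCAL)
        self.pub = self.create_publisher(UInt8MultiArray, 'lanelet2_map_bin', qos)
        msg = UInt8MultiArray()
        # Placeholder: serialize self.lanelet_map into msg.data here.
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(MapServer())


if __name__ == '__main__':
    main()
```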
Map Server Outputs
References
The AVP Demo Architecture: https://autowarefoundation.gitlab.io/autoware.auto/AutowareAuto/avpdemo.html