Autoware does currently work this way, but really a generic gps_localizer should produce a Position or Pose (depending on the GPS hardware) in the “earth” frame using ECEF coordinates. The above-mentioned EKF node can then take this Pose and update the map->odom->base_link TFs accordingly.
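To make the idea concrete: producing a pose in the “earth” frame is just the standard WGS-84 geodetic-to-ECEF conversion. The message plumbing would depend on the node design, but the math itself is fixed; here is a minimal sketch (function name is my own):

```python
import math

# WGS-84 ellipsoid constants.
WGS84_A = 6378137.0                    # semi-major axis [m]
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude/altitude to an ECEF position in the earth frame."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude.
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z
```

A gps_localizer built this way would stamp the resulting position with `frame_id: earth` and leave everything downstream to TF.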
This would also allow more flexibility when using autoware anywhere in the world (currently, I’ve heard stories of people having to fake their GPS data and pretend they are in Japan to use autoware without modification). All the user has to do is provide the earth -> map TF, and then you can run autoware anywhere you want. Last week I wrote a quick GPS-only localization node (no EKF) to see if I could localize without ndt_matching at all (in case you are in a featureless environment). Autoware is currently operating just fine (more testing is still needed, though) using the methodology I described, thanks to the decoupling that TF trees provide: if everything operates in the map frame, it doesn’t matter where the map frame is. I’m hoping to bring some of this work into autoware, but we need to define this EKF node first.
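For the earth -> map TF, all the user really needs to supply is the geodetic coordinates of the map origin: the translation is that origin’s ECEF position, and the rotation is the local ENU basis at that point expressed in ECEF. This assumes an ENU-aligned map frame, which is my convention here and not something autoware mandates; a rough sketch:

```python
import math

WGS84_A = 6378137.0
WGS84_F = 1.0 / 298.257223563
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)

def earth_to_map_transform(lat_deg, lon_deg, alt_m):
    """Return (translation, rotation) of an ENU map frame in the earth (ECEF) frame.

    translation: ECEF position of the map origin.
    rotation: 3x3 matrix whose columns are the map's East/North/Up axes in ECEF,
    i.e. it rotates map-frame vectors into the earth frame.
    """
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    t = ((n + alt_m) * math.cos(lat) * math.cos(lon),
         (n + alt_m) * math.cos(lat) * math.sin(lon),
         (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat))
    east = (-math.sin(lon), math.cos(lon), 0.0)
    north = (-math.sin(lat) * math.cos(lon), -math.sin(lat) * math.sin(lon), math.cos(lat))
    up = (math.cos(lat) * math.cos(lon), math.cos(lat) * math.sin(lon), math.sin(lat))
    rot = [[east[i], north[i], up[i]] for i in range(3)]  # columns: E, N, U
    return t, rot

def map_point_to_earth(p_map, t, rot):
    """Transform a point from the map (ENU) frame into the earth (ECEF) frame."""
    return tuple(t[i] + sum(rot[i][j] * p_map[j] for j in range(3)) for i in range(3))
```

In practice this would be published once as a static transform, and everything else in the TF tree hangs off it unchanged.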
I’m not following you on that; wouldn’t TR (turn rate) from CATR be the same as angular velocity about z (i.e., yaw rate)? In the link you provided, is that just an example of a CATR motion model, or the one actually in use for tracking purposes? The lack of orientation (yaw) surprises me; wouldn’t we want to filter the direction of detected/tracked objects? Or is the orientation just passed through from object detection? I’ve never worked closely on object tracking code, so I may be overestimating what is possible; I know detected-object noise makes it hard to get good state estimates, so people tend to opt for simpler EKFs.
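For reference, here is my mental model of a CATR predict step, assuming the acronym means constant acceleration and turn rate: yaw sits in the state being filtered, which is exactly why its absence in the linked model surprised me. This is my own Euler-integrated sketch, not the code from the link:

```python
import math

def catr_predict(state, dt):
    """One CATR predict step via simple Euler integration.

    state: [x, y, yaw, v, a, omega], where acceleration a and turn rate
    omega are modeled as constant over the step. Note that yaw is part
    of the filtered state, not a pass-through from detection.
    """
    x, y, yaw, v, a, omega = state
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += omega * dt
    v += a * dt
    return [x, y, yaw, v, a, omega]
```

An actual EKF would integrate this more carefully (and carry the Jacobian for the covariance update), but the state layout is the point here.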
This is the first I’ve heard of the CATR acronym, and I’m definitely not a localization expert (it’s been years since I’ve needed to look closely at an EKF), but it sounds similar to what we’ve used in the past, which was basically a really simple bicycle motion model with constant acceleration (that was for ego state estimation; I’m not sure what we used for detected-object tracking).
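For what it’s worth, the model I remember was roughly the following kinematic bicycle with constant acceleration. This is a from-memory sketch, and the wheelbase value is a placeholder rather than anything from a real vehicle config:

```python
import math

def bicycle_predict(state, steer, dt, wheelbase=2.7):
    """Kinematic bicycle predict step with constant acceleration.

    state: [x, y, yaw, v, a]; steer is the front-wheel steering angle.
    wheelbase=2.7 m is an illustrative placeholder, not a real parameter.
    """
    x, y, yaw, v, a = state
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += (v / wheelbase) * math.tan(steer) * dt
    v += a * dt
    return [x, y, yaw, v, a]
```

The main practical difference from a CATR-style model is that the turn rate is derived from speed and steering geometry instead of being an independent filtered state.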