Hi all,
I have a mobile robot with the `navigation_stack` up and running, and it navigates properly from point A to point B. I use encoders for odometry and localize the robot with the IMU and encoders (fused via [robot_localization](http://wiki.ros.org/robot_localization)) together with Lidar ([AMCL](http://wiki.ros.org/amcl)), with [move_base](http://wiki.ros.org/move_base) for planning. Now I have an ultra-wideband (UWB) sensor that gives the pose of the robot in the global frame. I want to compare the pose estimated by the UWB sensor with the actual pose of the robot (the output of `robot_localization`); both give a pose estimate in the map frame and both are published as `nav_msgs/Odometry`. I know I can subscribe to both topics and compare the pose values to see how different they are (roughly as in the sketch below). I was just wondering whether there is a standard or better way to compare two pose estimates of the same type coming from different sources. Any suggestions will be appreciated.
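For reference, this is a minimal sketch of the "subscribe to both and compare" approach I described, pairing messages by timestamp and printing the translational and heading error. The topic names `/odometry/filtered` and `/uwb/odom` are assumptions; substitute your actual topics.

```python
#!/usr/bin/env python
# Rough sketch: compare two nav_msgs/Odometry pose estimates in the map frame.
# Topic names below are assumptions -- replace with your actual topics.
import math

import rospy
import message_filters
import tf.transformations as tft
from nav_msgs.msg import Odometry


def yaw_from_odom(msg):
    """Extract yaw (rad) from an Odometry message's orientation quaternion."""
    q = msg.pose.pose.orientation
    return tft.euler_from_quaternion([q.x, q.y, q.z, q.w])[2]


def compare(ekf_msg, uwb_msg):
    """Log translational and heading error between the two pose estimates."""
    dx = ekf_msg.pose.pose.position.x - uwb_msg.pose.pose.position.x
    dy = ekf_msg.pose.pose.position.y - uwb_msg.pose.pose.position.y
    trans_err = math.hypot(dx, dy)

    dyaw = yaw_from_odom(ekf_msg) - yaw_from_odom(uwb_msg)
    dyaw = math.atan2(math.sin(dyaw), math.cos(dyaw))  # wrap to [-pi, pi]

    rospy.loginfo("translation error: %.3f m, heading error: %.3f rad",
                  trans_err, dyaw)


if __name__ == "__main__":
    rospy.init_node("pose_compare")
    ekf_sub = message_filters.Subscriber("/odometry/filtered", Odometry)
    uwb_sub = message_filters.Subscriber("/uwb/odom", Odometry)
    # Pair up messages whose stamps are within 0.1 s of each other.
    sync = message_filters.ApproximateTimeSynchronizer(
        [ekf_sub, uwb_sub], queue_size=10, slop=0.1)
    sync.registerCallback(compare)
    rospy.spin()
```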
Thanks in advance.
Naman Kumar