Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Gazebo and PR2 simulation setup publishing nan values on base_odometry/odom

Some background: I have set up the PR2 in simulation in an empty world in Gazebo. I would like to set the initial pose of the robot, so I have done the following:

```python
model_state = gazebo.msg.ModelState()
model_state.model_name = 'pr2'
model_state.pose.position.x = map2lab[0]
model_state.pose.position.y = map2lab[1]
rospy.wait_for_service('/gazebo/set_model_state')
try:
    sms = rospy.ServiceProxy('/gazebo/set_model_state', SetModelState)
    print 'Model State: ' + str(model_state)
    sms(model_state)
except rospy.ServiceException, e:
    print "Service call failed: %s" % e
```

I'm not sure whether things have already broken at this point, but I also want to move the robot around, so I set up the following:

```python
self.pub = rospy.Publisher('/base_controller/command', Twist)
rospy.init_node('locomotionCommand')
```

which I later use to publish messages like this (v and w are always real numbers):

```python
twist = Twist()
twist.linear.x = v
twist.linear.y = 0
twist.linear.z = 0
twist.angular.x = 0
twist.angular.y = 0
twist.angular.z = w
self.pub.publish(twist)
```

Somewhere in here the PR2 jumps and disappears, and I get the following error repeated at high frequency:

```
[ERROR] [1343312019.823059305, 40.582000000]: TF_NAN_INPUT: Ignoring transform for child_frame_id "/wheelodom" from authority "default_authority" because of a nan value in the transform (nan nan nan) (-nan -nan -nan -nan)
```

I'm fairly certain it is only on /base_odometry/odom; here's a sample:

```
header:
  seq: 5320
  stamp:
    secs: 58
    nsecs: 971000000
  frame_id: odom
child_frame_id: ''
pose:
  pose:
    position:
      x: nan
      y: nan
      z: 0.0
    orientation:
      x: nan
      y: nan
      z: nan
      w: nan
  covariance: [4e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00028900000000000003]
twist:
  twist:
    linear:
      x: 2.2798345089
      y: 2.02649641037
      z: 0.0
    angular:
      x: 0.0
      y: 0.0
      z: -0.20633020997
  covariance: [4e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00028900000000000003]
```

I'm at a loss to figure out what is going on here.
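One detail worth checking in the setup snippet above (an observation about default-constructed messages, not a confirmed diagnosis): every numeric field of `ModelState` defaults to zero, so the pose is sent with the orientation quaternion (0, 0, 0, 0), which is not a valid rotation and turns into NaN in any downstream code that normalizes or inverts it. A minimal guard:

```python
# Assumption: the all-zero default quaternion is what poisons the odometry.
# Send an explicit identity rotation along with the position.
model_state.pose.orientation.w = 1.0
```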

How to make my own base controller publish /odom?

I have no robot, so I must build one piece by piece. I am using a differential drive, each motor with an encoder, and each encoder outputs two channels of signal, A and B. I can read channels A and B through an Arduino to acquire the speed and direction of each motor. But how can I integrate the signals from the two encoders into the /odom topic with a device as low-level as the Arduino? If this has to be done on the Arduino itself, I doubt the ability of an 8-bit processor. Is there any relevant reference for this? I have checked the details of the /odom message:

```
$ rosmsg show nav_msgs/Odometry
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
string child_frame_id
geometry_msgs/PoseWithCovariance pose
  geometry_msgs/Pose pose
    geometry_msgs/Point position
      float64 x
      float64 y
      float64 z
    geometry_msgs/Quaternion orientation
      float64 x
      float64 y
      float64 z
      float64 w
  float64[36] covariance
geometry_msgs/TwistWithCovariance twist
  geometry_msgs/Twist twist
    geometry_msgs/Vector3 linear
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 angular
      float64 x
      float64 y
      float64 z
  float64[36] covariance
```

Thank you very much!
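Since the worry here is really where to do the integration, one common split (a sketch under assumptions, not the only possible design) is to let the 8-bit Arduino do nothing but count ticks and stream cumulative counts to the PC (e.g. over rosserial), and run the floating-point pose integration in a small host-side node. The topic names and wheel constants below are placeholders for your hardware:

```python
#!/usr/bin/env python
# Differential-drive odometry on the host PC. Assumes the Arduino publishes
# cumulative tick counts on /left_ticks and /right_ticks (std_msgs/Int64).
import math
import rospy
import tf
from std_msgs.msg import Int64
from nav_msgs.msg import Odometry

WHEEL_RADIUS = 0.035    # m, placeholder
TICKS_PER_REV = 360.0   # encoder resolution, placeholder
WHEEL_SEP = 0.23        # m between wheel centers, placeholder
M_PER_TICK = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV

class DiffDriveOdom(object):
    def __init__(self):
        self.x = self.y = self.th = 0.0
        self.left = self.right = None            # latest tick counts
        self.last_left = self.last_right = None  # counts at last update
        self.last_time = rospy.Time.now()
        self.pub = rospy.Publisher('odom', Odometry, queue_size=10)
        self.br = tf.TransformBroadcaster()
        rospy.Subscriber('left_ticks', Int64, lambda m: setattr(self, 'left', m.data))
        rospy.Subscriber('right_ticks', Int64, lambda m: setattr(self, 'right', m.data))

    def update(self):
        if self.left is None or self.right is None:
            return
        now = rospy.Time.now()
        if self.last_left is None:               # first sample: just latch it
            self.last_left, self.last_right, self.last_time = self.left, self.right, now
            return
        dt = (now - self.last_time).to_sec()
        d_left = (self.left - self.last_left) * M_PER_TICK
        d_right = (self.right - self.last_right) * M_PER_TICK
        self.last_left, self.last_right, self.last_time = self.left, self.right, now

        d = (d_left + d_right) / 2.0             # distance moved by the base
        dth = (d_right - d_left) / WHEEL_SEP     # change in heading
        self.x += d * math.cos(self.th + dth / 2.0)
        self.y += d * math.sin(self.th + dth / 2.0)
        self.th += dth

        q = tf.transformations.quaternion_from_euler(0.0, 0.0, self.th)
        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        (odom.pose.pose.orientation.x, odom.pose.pose.orientation.y,
         odom.pose.pose.orientation.z, odom.pose.pose.orientation.w) = q
        if dt > 0.0:
            odom.twist.twist.linear.x = d / dt
            odom.twist.twist.angular.z = dth / dt
        self.pub.publish(odom)
        # The navigation stack also expects the odom -> base_link tf.
        self.br.sendTransform((self.x, self.y, 0.0), q, now, 'base_link', 'odom')

if __name__ == '__main__':
    rospy.init_node('diff_drive_odom')
    node = DiffDriveOdom()
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        node.update()
        rate.sleep()
```

On the Arduino side this only requires integer increments in the encoder interrupt routines, which an 8-bit processor handles comfortably.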

PR2, Odometry, Gazebo

I'm using the PR2 robot in Gazebo. I've been watching the odometry publishers *(/base_odometry/odom, /base_odometry/odometer, /base_odometry/state and /robot_pose_ekf/odom_combined)*, and what I want to ask is how to correct the position error. I mean, like when you have a Kalman filter with a GPS: with the GPS data you correct the position. Is there any topic or service that does this work using the robot's position from Gazebo? In short, how do I update the odometry (estimated position) values?
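For reference, Gazebo does expose a ground-truth pose you could correct against (a minimal sketch, assuming the model is named 'pr2'; on older distros the message lived in the `gazebo` package rather than `gazebo_msgs`):

```python
# Read Gazebo's ground-truth pose of the PR2 from /gazebo/model_states.
import rospy
from gazebo_msgs.msg import ModelStates

def cb(msg):
    truth = msg.pose[msg.name.index('pr2')]
    rospy.loginfo('ground truth: x=%.3f y=%.3f', truth.position.x, truth.position.y)

rospy.init_node('ground_truth_listener')
rospy.Subscriber('/gazebo/model_states', ModelStates, cb)
rospy.spin()
```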

Error in TurtleBot2

Hi guys. I am having big trouble with my TurtleBot2. When I launch rviz, the TurtleBot2 rotates by itself, but in fact the real robot is not rotating. Here is a video: [Turtlebot2 error](http://youtu.be/bYaklkvUeXo). I am using ROS Groovy on Ubuntu 12.04 LTS 64-bit.

Read odometry or gyro sensor's data of turtlebot

Hi all, at the moment I am looking for a way to determine the velocity of a TurtleBot. From my research, I could use the navigation stack to do this, but it may not be supported in ROS Groovy (my current distro). I also see the ecl_navigation package; is it the same as the navigation stack? If not, is there any way for me to determine the velocity of the robot using ROS? Thanks in advance.
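If the base driver is running, the simplest source is the twist field of the odometry it already publishes (a minimal sketch, assuming the standard TurtleBot bringup publishes `nav_msgs/Odometry` on `/odom`):

```python
# Log the robot's current linear and angular velocity from /odom.
import rospy
from nav_msgs.msg import Odometry

def cb(msg):
    rospy.loginfo('v=%.3f m/s, w=%.3f rad/s',
                  msg.twist.twist.linear.x, msg.twist.twist.angular.z)

rospy.init_node('velocity_reader')
rospy.Subscriber('/odom', Odometry, cb)
rospy.spin()
```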

slam_gmapping odometry tf already published

Hey, I'm running my robot with a laser scanner, wheel odometry, and the navigation stack. I have already built a map from logged data, and now I want to build it online. So I start my launch file for my robot (ibr_node) and the laser scanner. My ibr_node sends a transform from the map frame to the odom_base_link frame. Then I start a launch file for slam_gmapping with:

```
odom_frame: odom_base_link
base_frame: base_link
```

This gives the transform tree:

```
map -> odom_base_link -> base_link -> laser
```

The transforms from odom_base_link to base_link and from base_link to laser are static. roswtf shows this error:

```
ERROR TF multiple authority contention:
 * node [/slam_gmapping] publishing transform [odom_base_link] with parent [map] already published by node [/ibr_node]
 * node [/ibr_node] publishing transform [odom_base_link] with parent [map] already published by node [/slam_gmapping]
```

How can I solve this? When I just don't send the odometry transform in my ibr_node, the map frame is missing from my transform tree. Thanks in advance!! :)

Need to publish X, Y position to ROS while slamming with gmapping

Shall I put it this way: when the laser scanner is working very well and gmapping is publishing the map to the /map topic, is there no need to publish the X, Y position and heading angle to ROS via the /odom topic?

How is the scan angle measured and published in the LaserScan msg?

The range (distance) to obstacles is measured from the output of the laser sensor, but how are the scan angles actually measured? Do we use any separate sensor to get that value?
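For reference, a `sensor_msgs/LaserScan` does not carry a measured angle per beam; each beam's angle is implied by its index in the ranges array together with the scan's `angle_min` and `angle_increment` fields, as in this small sketch:

```python
# Recover the angle of every beam of a sensor_msgs/LaserScan from its index.
import rospy
from sensor_msgs.msg import LaserScan

def cb(scan):
    for i, r in enumerate(scan.ranges):
        angle = scan.angle_min + i * scan.angle_increment
        # (r, angle) are the polar coordinates of beam i in the scanner frame.

rospy.init_node('scan_angles')
rospy.Subscriber('scan', LaserScan, cb)
rospy.spin()
```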

Robot starts drifting/rotating in RVIZ, issues with odometry data or amcl

Hi all, I have a mobile robot and I would like it to navigate around the building; I already have a map of the building. I am using wheel encoders to generate the [odometry message](http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom), [robot_pose_ekf](http://wiki.ros.org/robot_pose_ekf) to fuse data from the IMU and wheel encoders, [amcl](http://wiki.ros.org/amcl) for localization, and [move_base](http://wiki.ros.org/move_base) for planning. I am using a Hokuyo laser for navigation. The transformations and the tf tree seem to be correct. Once I start the robot (the initial pose is specified using the amcl parameters), everything looks good; I have attached the RVIZ image below:

![image description](/upfiles/14304085791422608.png)

Now, when I manually rotate the robot at its initial position just to check whether the odometry and localization are working properly, say 90 degrees clockwise and then 90 degrees anticlockwise back to the original position, it seems to work at first, but after some time the robot in RVIZ starts drifting although the actual robot is stationary. It keeps drifting and stops at some random, wrong position. I have attached images below (the actual position of the robot is the same as in the first image):

![image description](/upfiles/1430408748850875.png) ![image description](/upfiles/14304092189909822.png)

Also, when I give the robot a goal, it is able to move towards the goal position, but once it is close to it, it again behaves in a weird manner (it starts rotating randomly, which messes up the localization). I am not able to figure out whether the problem lies in the odometry data or in the amcl localization. If anyone has any idea how to solve such an issue, any help will be greatly appreciated. Let me know if you need more information from my side. Thanks in advance. Naman Kumar

Doubt regarding the working of base_local_planner

Hi all, I have a doubt regarding the working of [base_local_planner](http://wiki.ros.org/base_local_planner). Since the base_local_planner requires odometry information (`nav_msgs/Odometry`) to get the current speed of the robot, I would like to know how it uses the odometry message to generate the command velocities (`/cmd_vel`). I mean, what kind of controller does it use (is it a PID or something), or how does it work? I am mostly interested in how the odometry message is interpreted to better estimate the command velocities. Can anyone throw some light on this or point me in the right direction? Thanks in advance. Naman Kumar

How is the base_local_planner using the odom message

Hi all, I am just getting familiar with [move_base](http://wiki.ros.org/move_base), specifically with the working of the [base_local_planner](http://wiki.ros.org/base_local_planner). I went through the code and the documentation on the ROS wiki, but I have a small doubt: **how is the odom (`nav_msgs/Odometry`) message used by the base_local_planner to give a better estimate of the command velocity (`/cmd_vel`)?** Is there some kind of controller (maybe only P control) that reads the current robot velocity and accordingly generates the command velocity sent to the mobile base? Sorry if I am being really naive, but it would be great if someone could clear up my doubt or point me in the right direction. Thanks in advance. Naman Kumar
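For context (a simplified illustration, not the actual base_local_planner source): sampling-based local planners of this family use the current velocity from odometry to bound which velocities are even reachable within the next control period given the configured acceleration limits, and then score the resulting trajectories; there is no PID acting on the odometry itself. Roughly:

```python
# Illustrative dynamic-window computation: the odometry twist (v_now) limits
# the velocity samples considered during the next control period.
def reachable_window(v_now, v_min, v_max, acc_lim, dt):
    """Velocities reachable from v_now within dt under acc_lim, clamped to limits."""
    return (max(v_min, v_now - acc_lim * dt),
            min(v_max, v_now + acc_lim * dt))

# A robot moving at 0.3 m/s with limits [0.0, 0.6] m/s, 1.0 m/s^2 acceleration,
# and a 0.1 s control period can only sample velocities in [0.2, 0.4] m/s.
print(reachable_window(0.3, 0.0, 0.6, 1.0, 0.1))   # -> (0.2, 0.4)
```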

Robot navigation using CAD drawing maps

I want to provide a map drawn in CAD software for robot navigation, instead of building one from laser scans. Is there any package or stack that can take CAD-drawn maps for navigation? How do I accomplish this task?
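One common route (an approach sketch, not a dedicated CAD package): export the drawing as a grayscale image (e.g. PGM) at a known scale, then serve it with [map_server](http://wiki.ros.org/map_server) via a YAML descriptor; amcl and move_base then consume it like any scan-built map. Filenames and values below are placeholders:

```yaml
# my_building.yaml -- map_server descriptor for a CAD-exported image
image: my_building.pgm     # occupied = dark pixels, free = light pixels
resolution: 0.05           # meters per pixel; must match the CAD export scale
origin: [0.0, 0.0, 0.0]    # pose of the lower-left pixel in the map frame
occupied_thresh: 0.65
free_thresh: 0.196
negate: 0
```

It would then be served with `rosrun map_server map_server my_building.yaml`.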

adding own sensor to robot_pose_ekf

Hi all, I am using [robot_pose_ekf](http://wiki.ros.org/robot_pose_ekf) to fuse data from rotary encoders and an IMU, and it seems to be working pretty well. Now I am adding [another sensor](http://www.decawave.com/products/evk1000-evaluation-kit) to improve indoor localization. Using two such sensors on the robot, I get the pose (x and y) and the yaw of the robot in the `base_footprint` frame, and I use that information to generate a `nav_msgs/Odometry` message which is published on the `/vo` topic, the third topic to which `robot_pose_ekf` subscribes. Once I do that, the performance of `robot_pose_ekf` is bad: the pose of the robot keeps changing and jumping in RVIZ although the physical robot is not moving. I am following [this](http://answers.ros.org/question/39790/robot_pose_ekf-and-gps/) and [this](http://answers.ros.org/question/11682/robot_pose_ekf-with-an-external-sensor/) post. From different posts (including [this](http://answers.ros.org/question/173713/gps-in-ekf/)) I have realized that people have had difficulties integrating GPS with `robot_pose_ekf`, and the sensor I am using is similar to GPS in the sense that it gives absolute values, unlike rotary encoders and an IMU. But I am just generating a `nav_msgs/Odometry` message and publishing it on the `/vo` topic for `robot_pose_ekf`, and I have specified correct covariance values (I hope): 0.001 for x, y, yaw and 99999 for roll, pitch and z, and the performance of `robot_pose_ekf` is still really bad. Also, from different posts I have noticed that the [robot_localization](http://wiki.ros.org/robot_localization) package might be a better way to go, but it might take some time for me to set it up, so I was hoping to get my [third sensor](http://www.decawave.com/products/evk1000-evaluation-kit) running with `robot_pose_ekf` before eventually moving to `robot_localization`. Does anyone have any idea how I can efficiently integrate this third sensor, which publishes a `nav_msgs/Odometry` message on the `/vo` topic, with `robot_pose_ekf`? Any help will be appreciated. Let me know if you need more information from my side. Thanks in advance. Naman Kumar
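For concreteness, here is how the setup described above might publish the UWB pose on `/vo` with those covariances (a sketch with hypothetical variable names; the diagonal follows the x, y, z, roll, pitch, yaw ordering of the 6x6 pose covariance):

```python
import rospy
import tf
from nav_msgs.msg import Odometry

rospy.init_node('uwb_vo_publisher')
pub = rospy.Publisher('/vo', Odometry, queue_size=10)

def publish_vo(x, y, yaw):
    """x, y, yaw: the absolute pose measured by the UWB pair (hypothetical)."""
    msg = Odometry()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'odom'
    msg.child_frame_id = 'base_footprint'
    msg.pose.pose.position.x = x
    msg.pose.pose.position.y = y
    q = tf.transformations.quaternion_from_euler(0.0, 0.0, yaw)
    (msg.pose.pose.orientation.x, msg.pose.pose.orientation.y,
     msg.pose.pose.orientation.z, msg.pose.pose.orientation.w) = q
    cov = [0.0] * 36
    for i, var in enumerate([0.001, 0.001, 99999.0,      # x, y, z
                             99999.0, 99999.0, 0.001]):  # roll, pitch, yaw
        cov[i * 6 + i] = var                             # diagonal entries only
    msg.pose.covariance = cov
    pub.publish(msg)
```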

Using wheel encoders and IMU with robot_localization

Hi all, I am using wheel encoders which give odometry of type `nav_msgs/Odometry`, and an IMU which gives `sensor_msgs/Imu` (orientation, angular velocity and linear acceleration). I am fusing them with [robot_localization](http://wiki.ros.org/robot_localization). It seems to be working, but when I record a bag file, play it back, and visualize the odometry from the wheel encoders together with the odometry output of `robot_localization`, they seem off under the settings shown in the launch file below. Here is the link to the [video](https://youtu.be/4k5htqfe3i0) (the red arrow is the wheel-encoder odometry and the blue arrow is the `robot_localization` output).

**ekf_robot_localization.launch** (parameter names below are inferred from the 15-element robot_localization config layout; the off-diagonal covariance terms are all zero and are omitted):

```yaml
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
imu0_config:  [false, false, false,
               false, false, false,
               false, false, false,
               false, false, true,    # vyaw only
               false, false, false]
# process_noise_covariance, diagonal:
#   [0.05, 0.05, 0.06, 0.03, 0.03, 0.06, 0.025, 0.025, 0.04,
#    0.01, 0.01, 0.02, 0.01, 0.01, 0.015]
# initial_estimate_covariance, diagonal:
#   [1e-9, 1e-9, 1e+9, 1e+9, 1e+9, 1e-9, 1e-9, 1e-9, 1e+9,
#    1e+9, 1e+9, 1e-9, 1e+9, 1e+9, 1e+9]
```

In the above case, the wheel-encoder odometry and the filtered odometry output of `robot_localization` diverge. Is this behavior normal? If instead I fuse x, y, yaw from the wheel-encoder odometry (`differential = false`) and vyaw from the IMU (`differential = false`), the wheel-encoder odometry overlaps perfectly with the output of `robot_localization`. Likewise, if I fuse only the wheel-encoder odometry (no IMU), it overlaps perfectly with the filtered output (obviously). I am just trying to understand why this is happening, what is causing it, and what the best settings are for the above case. Can someone pitch in with their ideas? Thanks in advance. Naman Kumar

best way to compare two pose estimates from different sources

Hi all, I have a mobile robot with the `navigation_stack` up and running. The robot is able to navigate properly from point A to point B. I am using encoders to get the odometry data; the IMU, encoders and lidar ([amcl](http://wiki.ros.org/amcl)) together with [robot_localization](http://wiki.ros.org/robot_localization) to localize the robot; and [move_base](http://wiki.ros.org/move_base) for planning. Now I have an ultra-wideband sensor which gives the pose of the robot in the global frame. I want to compare the pose estimated by the UWB sensor with the actual pose of the robot (the output of `robot_localization`); both give a pose estimate in the map frame, and both poses are of the form `nav_msgs/Odometry`. I know I can subscribe to both topics, compare the pose values, and see how different they are. I was just wondering whether there is a standard or better way to compare two pose estimates of the same type coming from different sources. Any suggestions will be appreciated. Thanks in advance. Naman Kumar
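In the absence of a dedicated tool, one reasonably standard pattern (a sketch; the two topic names are placeholders) is to pair the streams with `message_filters.ApproximateTimeSynchronizer`, so the poses are compared at matching timestamps, and then log the translational and heading errors:

```python
import math
import rospy
import tf
import message_filters
from nav_msgs.msg import Odometry

def yaw_of(odom):
    q = odom.pose.pose.orientation
    return tf.transformations.euler_from_quaternion([q.x, q.y, q.z, q.w])[2]

def cb(uwb, ekf):
    dx = uwb.pose.pose.position.x - ekf.pose.pose.position.x
    dy = uwb.pose.pose.position.y - ekf.pose.pose.position.y
    # Wrap the heading difference into [-pi, pi).
    dyaw = (yaw_of(uwb) - yaw_of(ekf) + math.pi) % (2.0 * math.pi) - math.pi
    rospy.loginfo('pos err %.3f m, yaw err %.3f rad', math.hypot(dx, dy), dyaw)

rospy.init_node('pose_compare')
uwb_sub = message_filters.Subscriber('/uwb_pose', Odometry)
ekf_sub = message_filters.Subscriber('/odometry/filtered', Odometry)
sync = message_filters.ApproximateTimeSynchronizer([uwb_sub, ekf_sub], 10, 0.05)
sync.registerCallback(cb)
rospy.spin()
```

Accumulating these per-pair errors over a run (e.g. as a root-mean-square) gives a single comparable figure for the two sources.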

How does the planar move plugin work?

I am trying to build an omni-directional robot in Gazebo. I got the model working by applying forces to each wheel, following [this](http://www.romin.upm.es/wiki/index.php?title=ROS%3ATutorials/Build_Your_Own_URDF_Robot) tutorial. The problem is that I also need odometry, which I cannot get when controlling the model this way, and I want to do path planning and navigation. I came across [this](http://answers.ros.org/question/212889/gazebo-planar-move-plugin-for-omni-directional-wheel/?answer=220467#post-id-220467), but I do not understand how to use it. Do I still control each wheel individually by sending commands to each wheel via the /cmd_vel topic? Or do I ignore the individual wheel speeds and just use /cmd_vel for the velocity of the robot as a whole? I have been stuck on this for a while.
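For reference, the planar move plugin bypasses the wheels entirely: it takes a single body-frame `geometry_msgs/Twist` on its command topic, moves the whole model kinematically, and publishes the matching odometry, so the individual wheel speeds are ignored. A typical snippet in the robot description (topic and frame names here are the usual defaults; adjust to your robot):

```xml
<gazebo>
  <plugin name="object_controller" filename="libgazebo_ros_planar_move.so">
    <commandTopic>cmd_vel</commandTopic>
    <odometryTopic>odom</odometryTopic>
    <odometryFrame>odom</odometryFrame>
    <odometryRate>20.0</odometryRate>
    <robotBaseFrame>base_footprint</robotBaseFrame>
  </plugin>
</gazebo>
```

The trade-off is that the wheels become purely cosmetic: the motion is ideal and holonomic, which is convenient for navigation experiments but does not model wheel slip or actuator limits.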

Tf origin is away from the base_link

I have built my differential-drive mobile robot in SolidWorks and converted it to a URDF file using the solidworks2urdf converter. I successfully launched the robot and simulated it with a tele-operation node. Since I intend to use the navigation stack, I viewed the transforms of the robot in rviz, with the result below. ![image description](/upfiles/14522727952715033.png) As you can see, the base plate is the link which supports the wheels and castors, but the tf of the base plate is shown away from the actual link, and even odom is away from the model. Where have I gone wrong, and how do I fix this? (The URDF of the model was attached here, but only fragments survive: Wheel_L, Wheel_R, Base_plate, 0.235, 0.12, true, Castor_F, Castor_R.)

Robot model orientation is improper with respect to map frame.

The robot model is not properly oriented (it appears upside down) in rviz when "map" is selected as the fixed frame. When the fixed frame is odom, the robot model is perfectly aligned and oriented. How do I bring the robot to the same position it has in the odom frame? See the results below: ![image description](/upfiles/1454152820846060.png) ![image description](/upfiles/1454225177294279.png)

What is the odometry frame in relation to?

I'm trying to implement an odometry publisher for a small robot following the [tutorial](http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom), but I'm confused by the terminology. An odometry message "stores an estimate of the position and velocity of a robot in free space". It stores the estimate in free space... relative to what? If the robot doesn't know where it is, what does the pose variable signify? Is it simply `(0,0,0)`? And when the robot turns, how do you update the twist without knowing the initial orientation?

Dynamic Footprint in Navigation

Dear all, I want to ask if someone has an idea of how to make my footprint dynamic depending on the speed. For example, if the robot is traveling at high speed, the footprint size should increase, and vice versa. In my configuration files for navigation I am using DWA (same as the turtlebot config files), but it would be really nice to control the footprint dynamically. Any help? Thanks
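On hydro and later, the costmap footprint is exposed through dynamic_reconfigure, so one possible approach (a sketch under assumptions: a standard move_base namespace, a rectangular base, and a made-up scaling law) is a small node that rescales the footprint from the commanded speed:

```python
import rospy
from geometry_msgs.msg import Twist
from dynamic_reconfigure.client import Client

BASE = [[0.3, 0.3], [0.3, -0.3], [-0.3, -0.3], [-0.3, 0.3]]  # hypothetical footprint (m)

def cb(cmd):
    scale = 1.0 + 0.5 * abs(cmd.linear.x)   # made-up law: +50% footprint per m/s
    footprint = [[scale * x, scale * y] for x, y in BASE]
    local_costmap.update_configuration({'footprint': str(footprint)})

rospy.init_node('dynamic_footprint')
local_costmap = Client('/move_base/local_costmap', timeout=5.0)
rospy.Subscriber('cmd_vel', Twist, cb)
rospy.spin()
```

The same client could be pointed at the global costmap as well; throttling the updates (e.g. only on significant speed changes) would avoid hammering the reconfigure server.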