Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Robot navigation using CAD drawing maps

I want to provide a map drawn in CAD software for robot navigation, instead of building one from laser scans. Is there any package or stack that can take a CAD-drawn map and use it for navigation? How can I accomplish this?
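A common workaround is to export the CAD drawing as a grayscale image (PGM/PNG) in which walls are black and free space is white, and serve it with `map_server`; AMCL and the navigation stack then use it like any SLAM-built map. Below is a minimal map description as a sketch — the filename and resolution are placeholders that would have to match the scale of the CAD export.

```yaml
# map.yaml -- minimal map_server description for a CAD-exported floor plan
image: floorplan.pgm        # image exported from the CAD drawing (placeholder name)
resolution: 0.05            # meters per pixel of the exported image (assumed)
origin: [0.0, 0.0, 0.0]     # x, y, yaw of the lower-left pixel in the map frame
occupied_thresh: 0.65       # pixels darker than this are treated as obstacles
free_thresh: 0.196          # pixels lighter than this are treated as free space
negate: 0
```

The map can then be loaded with `rosrun map_server map_server map.yaml` and used as the global costmap's static layer.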

Calculating path using linear and angular velocity published by odometry

I have a `nav_msgs/Odometry` topic that constantly publishes linear and angular velocity. I am calculating the linear distance by taking the velocities in X and Y, multiplying each by `1/rate (Hz)`, and summing the results. Is this calculation right? I am also a bit skeptical about computing the turn from the angular velocity. Any guidance here is appreciated. I am visualizing the path in RViz; it would be great if you could point me to relevant source code that I could look into.
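For reference, a common dead-reckoning scheme rotates the body-frame velocities into the odom frame before integrating, rather than summing the X and Y velocities directly. A minimal sketch, assuming the twist in the `nav_msgs/Odometry` message is expressed in the robot's body frame (`child_frame_id`), which is the usual convention:

```python
import math

def integrate_twist(x, y, yaw, vx, vy, wz, dt):
    """Advance a planar pose by one time step of body-frame velocities."""
    # Rotate the body-frame velocity into the odom frame before integrating.
    x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
    y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
    yaw += wz * dt
    return x, y, yaw

# Example: 10 Hz updates, 0.2 m/s forward while turning at 0.1 rad/s.
x = y = yaw = 0.0
for _ in range(100):
    x, y, yaw = integrate_twist(x, y, yaw, 0.2, 0.0, 0.1, 0.1)
```

The travelled distance is then the accumulated `sqrt(dx^2 + dy^2)` per step, and the turn is simply the integral of the angular velocity over the same steps.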

odometry message from Arduino

Hello, I am trying to create and publish odometry data for a differential drive robot from an Arduino Due board, with the data coming from wheel encoders and a gyro sensor attached to the board. My intention is to create and publish the odom messages from the Arduino Due itself, since it has a fairly powerful microcontroller (is that good practice?). I have a few questions to clarify and make sure about a few things for myself. I checked out the [publishing odometry tutorial](http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom) and noticed the `odom.twist.twist.linear.y = vy` field: do I have to hard-code this to `0.0` for a differential drive robot? Is it all right to just convert the gyro's readings (which are in degrees per second) to rad/s and pass them straight as `vth` to `odom.twist.twist.angular.z = vth`? On the Arduino it seems that I cannot use `odom_quat` as in the tutorial to convert my heading to a quaternion; should I instead set `odom_trans.transform.rotation.z` and `odom_trans.transform.rotation.w` in the message manually and calculate the values? At what rate is it best practice to calculate and publish odometry for a TurtleBot-class differential drive robot? Any pointers regarding what I am trying to do would be appreciated. Thank you.
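On the quaternion question: a yaw-only rotation has a simple closed form, so the z and w components can be computed directly on the microcontroller without the tf helper. A minimal sketch (Python for brevity, with `theta` assumed to be the heading in radians):

```python
import math

def yaw_to_quaternion(theta):
    """Quaternion for a pure rotation about Z (planar robot heading)."""
    return {
        "x": 0.0,
        "y": 0.0,
        "z": math.sin(theta / 2.0),
        "w": math.cos(theta / 2.0),
    }
```

This is exactly what `createQuaternionFromYaw` produces for a planar robot, with x and y left at zero.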

Issue while creating a Map using gmapping?

Hello, I am confused about what the issue really is. Can anyone please help? We urgently need to solve it. With our mobile base robot we are now at the stage of creating a map using gmapping, but while creating the map I am facing one issue. When I move my robot using teleop_twist_keyboard, cmd_vel is generated properly and the robot moves smoothly. But when I visualize it in RViz, the robot's movement is shown correctly for a few seconds, and then it suddenly changes its route or position on its own, even when the robot is not moving at all. I have analyzed my scan data and my odometry data, and both look good. When I select the fixed frame as "odom" this issue does not appear; I am facing the issue only when I am trying to create a map, i.e. when I select my fixed frame as "map". Can anyone suggest why this would be happening and where the problem might be?

TF of URDF is not showing from robot state publisher

I am using a Raspberry Pi and trying to publish the TF of a URDF file. I created a URDF of a differential drive robot and wrote a launch file including joint_state_publisher and robot_state_publisher to publish the TF. When I run the launch file and try to read the TF using tf_monitor, no TF values are shown; my result is below. When I run `rostopic echo tf_static`, the topic is publishing. ![image description](/upfiles/1477251686645989.png) The strange part is that when I tried this on my local PC (with the same URDF and launch file) the result was perfect: the TF of all the links was published. Is there any limitation on the Raspberry Pi for publishing the robot state of a URDF? How can I resolve this?

Calibrating Pioneer P3DX robot for ROSARIA

Hi all, I am setting up a new Pioneer 3DX with a Kinect and a URG laser sensor. So far I have managed to mount all the sensors and connect them via ROSARIA. I would now like to do odometry calibration of the mobile base. Does anyone know how to do it? There are no specific tutorials available on the ROS wiki page. Also, what are the optimum values of TicksMM, DriftFactor and RevCount for the P3DX mobile robot? At present, when I run the rosaria node I get these settings: TicksMM = 128, DriftFactor = 0, RevCount = 16570. I can change these values using dynamic_reconfigure, but I need the optimum values for this particular model, and I also need a way to change these parameters permanently so they are applied every time I launch the rosaria node. Can someone explain the process? Basically I believe the robot's odometry is not good when it is turning; how do I correct that? Thanks

STM32 Microcontroller for communication with ROS

I currently intend to use an STM32 (STM32F103C8T6) as the microcontroller to publish the odom message and messages from other sensors as well. My questions are as follows: 1. Is there any support available for the STM32 for communicating with ROS (client library, serial, etc.)? I know of rosserial, but its tutorial pages are pretty confusing to me. 2. Has anyone used an STM32 in their robots? Would it be better to port to an Arduino? If not, is writing an altogether new node for the STM32 a tiresome task? Thanks in advance.

Adding odometry to my Custom Robot

So I have the following code that allows my robot to move via teleop. I now need to publish odometry values. Does anyone have any good ideas on how to do it?

    import roslib
    import rospy
    import time  # needed for time.sleep below
    from BrickPi import *
    from std_msgs.msg import String
    from std_msgs.msg import UInt16
    from geometry_msgs.msg import Twist
    #from nav_msgs.msg import *

    # Model dependent settings
    PI = 3.141
    ROBOT_WIDTH = 0.16
    WHEEL_DIAMETER = 0.055
    WHEEL_RADIUS = WHEEL_DIAMETER / 2
    WHEEL_PERIMETER = 2 * PI * WHEEL_RADIUS

    # RPM depends on voltage, without load:
    # 9.0V = 120 RPM
    # 7.5V = 80 RPM
    # with load 60-80 RPM is a good average
    MAX_RPM = 80.0
    RPS = MAX_RPM / 60.0
    MPS = RPS * WHEEL_PERIMETER
    PWRDIV = 1000 * RPS
    rospy.loginfo("PWRDIV:" + str(PWRDIV))

    RF_WHEEL = PORT_C
    LF_WHEEL = PORT_D

    BrickPiSetup()
    BrickPi.MotorEnable[RF_WHEEL] = 1
    BrickPi.MotorEnable[LF_WHEEL] = 1
    BrickPi.SensorType[PORT_1] = TYPE_SENSOR_ULTRASONIC_CONT
    BrickPiSetupSensors()

    def cmd_vel_callback(cmd_vel):
        left_speed_out = cmd_vel.linear.x - cmd_vel.angular.z * ROBOT_WIDTH / 2
        right_speed_out = cmd_vel.linear.x + cmd_vel.angular.z * ROBOT_WIDTH / 2
        v = cmd_vel.linear.x       # speed m/s
        theta = cmd_vel.angular.z  # angle rad/s
        rospy.loginfo("VEL_CMD_CB: v:" + str(v) + ", theta:" + str(theta))
        motor_control(left_speed_out, right_speed_out)

    def vel_cmd_listener():
        rospy.Subscriber("cmd_vel", Twist, cmd_vel_callback)

    def motor_control(left_speed_out, right_speed_out):
        rospy.loginfo("LSPEED:" + str(left_speed_out) + " RSPEED:" + str(right_speed_out))
        BrickPi.MotorSpeed[RF_WHEEL] = int(-right_speed_out * PWRDIV)
        rospy.loginfo("RF:" + str(BrickPi.MotorSpeed[RF_WHEEL]))
        BrickPi.MotorSpeed[LF_WHEEL] = int(-left_speed_out * PWRDIV)
        rospy.loginfo("LF:" + str(BrickPi.MotorSpeed[LF_WHEEL]))
        BrickPiUpdateValues()
        time.sleep(.01)
        scan_publisher()

    def scan_publisher():
        result = BrickPiUpdateValues()
        if not result:
            range = BrickPi.Sensor[PORT_1]
            us = rospy.Publisher('scan', UInt16)
            us.publish(UInt16(range))
            rospy.loginfo("SCAN:" + str(range))

    if __name__ == '__main__':
        try:
            rospy.init_node('RobotBaseController')
            vel_cmd_listener()
            time.sleep(.01)
            rospy.spin()
        except rospy.ROSInterruptException:
            pass
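Since the question is specifically about publishing odometry, here is a minimal sketch of a dead-reckoned `nav_msgs/Odometry` publisher that a node like the one above could call at a fixed rate. It is only an outline: the wheel velocities are assumed to come from real feedback (encoders) or, failing that, the commanded speeds, and the frame names are the usual odom/base_link conventions rather than anything from the original code.

```python
import math
import rospy
import tf
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion


class OdomPublisher(object):
    """Dead-reckoned odometry for a differential drive base."""

    def __init__(self, robot_width):
        self.robot_width = robot_width
        self.x = self.y = self.yaw = 0.0
        self.pub = rospy.Publisher('odom', Odometry, queue_size=10)
        self.broadcaster = tf.TransformBroadcaster()

    def update(self, v_left, v_right, dt):
        v = (v_left + v_right) / 2.0               # forward speed, m/s
        w = (v_right - v_left) / self.robot_width  # yaw rate, rad/s

        self.x += v * math.cos(self.yaw) * dt
        self.y += v * math.sin(self.yaw) * dt
        self.yaw += w * dt

        now = rospy.Time.now()
        quat = tf.transformations.quaternion_from_euler(0, 0, self.yaw)

        # TF: odom -> base_link
        self.broadcaster.sendTransform((self.x, self.y, 0.0), quat, now,
                                       'base_link', 'odom')

        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        odom.pose.pose.orientation = Quaternion(*quat)
        odom.twist.twist.linear.x = v
        odom.twist.twist.angular.z = w
        self.pub.publish(odom)
```

It would be instantiated after `rospy.init_node()` and called once per control cycle. Note that dead reckoning from commanded speeds ignores wheel slip, so encoder or IMU feedback is strongly preferable.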

Use navigation to correct robot's path

Until now I was using Gazebo for simulation. Now I am trying to implement this on a real robot. I have been successful in doing so, but there is one problem in following the path. When the path is straight, the robot **fails to follow a perfectly straight path (like it did in Gazebo)**. This may be due to **limitations of odometry-based** speed control of the motors. So, will the navigation stack correct the robot's path by giving an alternate **path or speed as feedback** if it finds out (*from the encoder values that are computed and published as odom*) that the robot is not following the path it previously published? Does the navigation stack do that, or is there another way to achieve it?

Laser scan data not aligned with map data and real world (Navigation Stack)

![image description](/upfiles/1486285790714428.jpg) Good morning, I have recently been facing a problem with navigation. While navigating, the laser scan inflation is not aligned with the walls in the map. It happens at every corner of the map: the laser scanner starts detecting a corner before the robot actually reaches it in reality, which makes the whole navigation fail. The strange thing is that I have been navigating with this robot for around 6 months and everything was working really well, but this problem suddenly appeared recently without any change to the navigation software. Any suggestion what the problem could be? Is it the odometry or the laser scanner? If someone has had such a problem before and fixed it, it would be great to hear how. Thanks a lot.

find 3D location of the sensor when a depth image arrives

Hey buddies! I went through the forum and couldn't find a satisfying answer; sorry if I am reopening an already-answered question. Basically, I need to write a ROS node that prints the 3D location of the sensor and the timestamp in text format whenever it receives a depth image. Here is what I think I should do (though I don't know how): 1. Build a robot with an odometer and mount a web camera on it. 2. Create a node that publishes the odometer's data -> odom_node. 3. Create a node that detects 3D images -> cam_node. 4. Somehow make the odom_node listen to the cam_node. I have thousands of questions to ask, but that would make this too broad; I don't really know where to start. Could someone direct me to some good tutorials and projects where I can learn and finish the project on my own? I appreciate all the answers. Thank you in advance.
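As a rough illustration of step 4, one common pattern is a single node that subscribes to the depth image and, inside the callback, looks up the sensor's pose via TF at the image's timestamp (the odometry then only needs to be available as a transform, e.g. odom -> camera frame). A minimal sketch; the topic and frame names are assumptions, not anything from the question:

```python
import rospy
import tf
from sensor_msgs.msg import Image


class SensorPoseLogger(object):
    """Print the sensor's 3D position each time a depth image arrives."""

    def __init__(self):
        self.listener = tf.TransformListener()
        # Topic name below is a placeholder for this sketch.
        rospy.Subscriber('/camera/depth/image_raw', Image, self.callback)

    def callback(self, msg):
        try:
            # Pose of the camera frame in "odom" at the image's timestamp.
            self.listener.waitForTransform('odom', msg.header.frame_id,
                                           msg.header.stamp, rospy.Duration(0.5))
            (trans, _rot) = self.listener.lookupTransform(
                'odom', msg.header.frame_id, msg.header.stamp)
            x, y, z = trans
            rospy.loginfo('%.6f %.3f %.3f %.3f',
                          msg.header.stamp.to_sec(), x, y, z)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rospy.logwarn('transform not available yet')


if __name__ == '__main__':
    rospy.init_node('sensor_pose_logger')
    SensorPoseLogger()
    rospy.spin()
```

Writing the same line to a text file instead of the ROS log is a one-line change inside the callback.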

Odometer_Arduino

Hi, okay, I hate to admit it but I am a total noob, although I really am working hard to learn this stuff. I have made an odometer using an Arduino and a couple of IR LEDs. Below is the sketch:

    int counter;
    int currReading;
    int prevReading = 0;
    int threashold = 600;

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      currReading = analogRead(A0);
      delay(1);
      if(currReading > threashold && prevReading

I don't have any special hardware that supports the current libraries; I am making my own stuff. I am literally lost, so any help is much appreciated. Millions of thanks in advance.

Stereo camera, preprocessing, transforms and frames = pointcloud?

Hello, I am trying to avoid the simple questions, so trust me; I've really tried to figure this out for myself, but I find it to be beyond my understanding. **Situation** - I am acquiring BGR-preprocessed stereo views, but this has raised some issues for me, since the camera software is aimed towards Indigo Igloo's (and Jade's?) "uvc-camera". For concreteness, let's assume the simplest version of a "differential wheeled robot", with a base footprint, base link etc., as frequently addressed in tutorials. My goal is to achieve perception of the environment utilizing as much as possible from generic packages. The camera is published under the /stereo/ namespace.

1) How do you describe a stereo camera through xacro/URDF? Is the camera "one", with e.g. an "optical frame" as origin, or should the left and right cameras be described separately? Which frames/transforms am I advised to implement? Should I use a dedicated driver?

2) I keep getting recurring error messages about the packages expecting a colored image. At the same time I am facing all these other challenges, making it impossible for me to deduce whether this is a cause or a consequence. I have implemented stereo_image_proc, but without these issues addressed I find it natural that I don't get any disparity or pointcloud data.

3) If colored images are required, could it be an idea to take the captured BGR frames, use OpenCV to create a colored image intensifying parameters of interest, and then feed these to the stereo-vision algorithms at a lower framerate? (I have managed to use the BGR images for CV processing through a "cv_bridge" node.)

4) Related to the above: how do I successfully propagate odometry/pose in a simple conceptual system where an EKF node provides filtered sensor data? I'm assuming I should remap a topic somewhere, but I'm struggling to catch the essential factor here.

5) In general, as with the camera, I'm struggling with the countless possibilities relating to frames, tf's and dedicated drivers. Should I use a kobuki or a differential driver? What are the main pros and cons? (It should be mentioned that I won't receive my wheel encoders before the end of next week, so for now I will have to estimate odometry.)

I hope I'm not completely off track here; thanks in advance. Please let me know if I should change and/or elaborate on any of these issues.

Runtime Odometry Correction

Hello, we are stuck at a point and unable to solve it; please suggest how we can do it. We are working on a small module: first we drive the robot a certain distance, e.g. 7 m, and store the driven trajectory data (odometry + laser scan) as reference points. Using a feedback controller and the stored data, we make the robot travel the same taught path. Now I am stuck: at run time we are getting drift in the current odometry, due to which we are unable to track the exact taught path. Is there any technique to handle run-time drift in robot odometry? Is there a way to correct the drift in odometry as it appears at run time? Please let us know if anyone has an idea about it.

Pose Covariance for swiftnav_ros rtk message

Hello all, I am using Piski Multi RTK gps system for centimeter accurate pose information. Below is a snapshot of a Nav_msgs::Odometry message, as published by the swiftnav_ros package, after taking my little robot outside and driving it around, with the gps in RTK mode. header: seq: 183 stamp: secs: 1507219077 nsecs: 699150124 frame_id: gps child_frame_id: '' pose: pose: position: x: 2.89 y: 4.694 z: -0.749 orientation: x: 0.0 y: 0.0 z: 0.0 w: 0.0 covariance: [324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1296000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0] twist: twist: linear: x: -1.111 y: 0.852 z: 0.016 angular: x: 0.0 y: 0.0 z: 0.0 covariance: [0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0046500000000000005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0] --- header: seq: 184 stamp: secs: 1507219077 nsecs: 799420749 frame_id: gps child_frame_id: '' pose: pose: position: x: 2.779 y: 4.766 z: -0.75 orientation: x: 0.0 y: 0.0 z: 0.0 w: 0.0 covariance: [324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1296000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0] twist: twist: linear: x: -1.099 y: 0.715 z: -0.026 angular: x: 0.0 y: 0.0 z: 0.0 covariance: [0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0046500000000000005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0] Please note that the covariances for x, y, and z pose are 529000000.0, 529000000.0, and 2601000000.0, respectively. These numbers seem obsenely large, or is this normal and correct? The numbers for the twist message seem reasonable.. Is it possible that i wasn't in RTK mode? I think it is possible i was in float RTK and not fixed RTK. Thank you, Jad

Sabertooth odometry feedback with Kangaroo x2

Apologies, this is not a ROS-related question. I wanted to know whether the Kangaroo x2 can give us feedback for odometry. I know the Kangaroo can do PID control using the encoders (as far as I understand, with Kangaroo + Sabertooth, if we ask the motor to turn 1 unit, it will turn 1 unit). My question is: do we get real-time feedback from either the Sabertooth or the Kangaroo? Or is the Kangaroo enough for odometry (i.e. fake published odometry)?

Ground truth to show robot movement in Rviz

Hello all, I'm trying to figure out what is needed, and the information path, so that the robot shown in RViz moves relative to a fixed frame (as opposed to the robot standing still and the world moving around it). The practical case is that I am using PX4's SITL tools to simulate a UAV. It opens fine if I use the UAV's base_link as the fixed frame. When the UAV moves, it only moves in Gazebo; in RViz it is the world around it that moves. It looks like I am missing the transformation between world and base_link, but I don't understand how to provide it. I read about fixed frames in the tutorials and I saw a lot of code using librotors_gazebo_odometry_plugin. I also saw that the /uav1/mavros/local_position/odom topic is publishing a nav_msgs/Odometry message with the values I want. But since there is no world frame anywhere (I'm assuming it is a special frame loaded with the world), I don't understand what to route where. I am launching joint_state_publisher, robot_state_publisher and rviz, all from the same robot_description. [URDF file](/upfiles/15156968383508633.gif) Here is my tf tree (I had to cut it in half and remove the arms and most rotors, otherwise it was too big): ![image description](/upfiles/1515696631975623.png)![image description](/upfiles/1515696644213806.png)
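One common way to provide the missing transform is a small node that re-broadcasts the pose from the odometry topic as a TF between a fixed frame and base_link. A minimal sketch, assuming the /uav1/mavros/local_position/odom topic mentioned above and a fixed frame simply named "world" (the frame names may need to be adapted to your tf tree):

```python
import rospy
import tf
from nav_msgs.msg import Odometry


def odom_callback(msg):
    pos = msg.pose.pose.position
    ori = msg.pose.pose.orientation
    # Re-broadcast the pose as world -> base_link so RViz can use "world"
    # as its fixed frame and watch the robot move.
    broadcaster.sendTransform((pos.x, pos.y, pos.z),
                              (ori.x, ori.y, ori.z, ori.w),
                              msg.header.stamp,
                              'base_link',   # child frame
                              'world')       # parent (fixed) frame


if __name__ == '__main__':
    rospy.init_node('odom_to_tf')
    broadcaster = tf.TransformBroadcaster()
    rospy.Subscriber('/uav1/mavros/local_position/odom', Odometry, odom_callback)
    rospy.spin()
```

With that transform in place, "world" can be selected as the fixed frame in RViz and the robot model should move through it instead of the world sliding underneath.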

Help needed with TF issues

Hey , i am working on a 4 wheel drive differential robot , ***I am using Rosserial to talk to the arduino (cmd_vel) and arduino publishes back (tf , in which the odom is there).*** Timed out waiting for transform from base_link to map to become available before running costmap, tf error: Could not find a connection between 'map' and 'base_link' because they are not part of the same tree.Tf has two or more unconnected trees.. canTransform returned after 0.101146 timeout was 0.1. ![image description](/upfiles/1519791791661797.png) As you can see from the image , the error is `No transform from camera to odom` . I have done all simulations and it works properly ,but when i am using the Real robot , this error pops up . What could be the error , the possibilities that i could think of are the below 1. The frames not being synced , will i have to subscribe to a /clock topic on the arduino to synchronize the /tf topic . 2. /odom as a dedicated topic doesnt exist . Also can someone explain me the differences between 1. /odometry/filtered ![image description](/upfiles/15197927334620956.png) 2. /velocity_controller/odom ![image description](/upfiles/15197928196303803.png) 3. /tf topic ![image description](/upfiles/1519792766314356.png) ![image description](/upfiles/15197988812798659.png) Additional information , the Arduino code which communicates to kangaroo x2. enter code here #include #include #include #include #include #include #include #include #include #include #include #include #define TX_PIN 13 //s1 #define RX_PIN 12 //s2 // Arduino TX (pin 2) goes to Kangaroo S1 // Arduino RX (pin 3) goes to Kangaroo S2 // Independent mode channels on Kangaroo are, by default, '1' and '2'. SoftwareSerial SerialPort(RX_PIN, TX_PIN); KangarooSerial K(SerialPort); ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// //SETTINGS////////////////////////////////////////////// #define Mixed_mode true bool position_control = true; ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// #ifdef Mixed_mode KangarooChannel Drive(K, 'D'); KangarooChannel Turn(K, 'T'); #endif #ifndef Mixed_mode KangarooChannel KL(K, '1'); KangarooChannel KR(K, '2'); #endif ros::NodeHandle nh; long WHEEL_DIST = 2 ; /////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// std_msgs::Bool position_control_msg; std_msgs::Bool mixed_mode_msg; //////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// double x = 1.0; double y = 0.0; double theta = 1.57; void cmd_velCallback( const geometry_msgs::Twist& CVel){ //geometry_msgs::Twist twist = twist_msg; long vel_x = CVel.linear.x*200; long vel_th = -CVel.angular.z*200; x += cos(theta)*CVel.linear.x*0.1; y += sin(theta)*CVel.linear.x*0.1; theta += CVel.angular.z*0.1; long speed_right = (vel_th*WHEEL_DIST)/2 + vel_x; long speed_left = (vel_x*2) - speed_right; long right_vel = 0.0; long left_vel = 0.0; //right_vel = vel_th * WheelSeparation / 2.0; //left_vel = (-1) * right_vel; //left_vel = vel_x - vel_th * WheelSeparation / 2.0; //right_vel = vel_x + vel_th * WheelSeparation / 2.0; // turning //IF MIXED MODE CONFIG #ifdef Mixed_mode if(int(vel_x) == 0 && int(vel_th)==0){ if (position_control){ Turn.pi(0); Drive.pi(0);} else{ Turn.s(0); Drive.s(0); } } // forward / backward else if(abs(vel_x) > abs(vel_th) ){ if 
(position_control){ Drive.pi(vel_x); Turn.pi(vel_th); } else{ Drive.s(vel_x); Turn.s(vel_th); } } // moving doing arcs else{ if (position_control){ Drive.pi(vel_x); Turn.pi(vel_th);} else{ Turn.s(vel_th); Drive.s(vel_x); } } #endif ////////////////////////////////////////////////////////////////////////////////////////////////////// //IF NOT MIXED MODE CONFIG #ifndef Mixed_mode if(position_control){ KL.pi(speed_left); KR.pi(speed_right); } else{ KL.s(speed_left); KR.s(speed_right); } #endif } //////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// void reset_Kangaroo( std_msgs::Empty& toggle_msg){ init_kangaroo(); } void mixed_mode_func(std_msgs::Bool& mixed_mode_topic ){ mixed_mode_msg = mixed_mode_topic; } void position_control_func(std_msgs::Bool& position_control_topic ){ position_control_msg = position_control_topic; } // ROS SUBSCRIBERS ros::Subscriber<:twist> Sub("cmd_vel", &cmd_velCallback ); ros::Subscriber<:empty> reset_kangaroo("reset_kangaroo", reset_Kangaroo ); //THese two topics are to turn on and off the position_control OR mixed_mode ros::Subscriber<:bool> Position_control("position_control",position_control_func ); ros::Subscriber<:bool> mixed_mode("mixed_mode",mixed_mode_func ); //////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// void init_kangaroo(){ #ifdef Mixed_mode Drive.start(); Turn.start(); Drive.home(); Turn.home(); SerialPort.println("D,start"); SerialPort.println("T,start"); SerialPort.println("D, units 100 thou = 245 lines"); SerialPort.println("T, units 100 thou = 245 lines"); #endif #ifndef Mixed_mode KL.start(); KR.start(); SerialPort.println("1,start"); SerialPort.println("2,start"); SerialPort.println("1, units 100 thou = 245 lines"); SerialPort.println("2, units 100 thou = 245 lines"); #endif } ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// geometry_msgs::TransformStamped t; tf::TransformBroadcaster broadcaster; char base_link[] = "/base_link"; char odom[] = "/odom"; void odom_publisher(){ // drive in a circle //double dx = 0.2; //double dtheta = 0.18; //x += cos(theta)*dx*0.1; //y += sin(theta)*dx*0.1; //theta += dtheta*0.1; if(theta > 3.14) theta=-3.14; //tf odom->base_link t.header.frame_id = odom; t.child_frame_id = base_link; t.transform.translation.x = x; t.transform.translation.y = y; t.transform.rotation = tf::createQuaternionFromYaw(theta); t.header.stamp = nh.now(); broadcaster.sendTransform(t); } ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// void setup(){ SerialPort.begin(9600); init_kangaroo(); nh.initNode(); broadcaster.init(nh); nh.subscribe(Sub); nh.subscribe(reset_kangaroo); nh.subscribe(Position_control); nh.subscribe(mixed_mode); } void loop() { odom_publisher(); nh.spinOnce(); delay(1); }

What is the best way to connect Ackermann steering with RViz?

Hello, I am building my own car-like robot. For the driving motor controller, I am using the SDC2130 from Roboteq. I have found this package on github called [roboteq_diff_driver](https://github.com/ecostech/roboteq_diff_driver) which is able to control two motors in a differential drive setup (subscribes to cmd_Vel, publishes to odom and broadcasts odom tf). I am trying to expand on this to include ackermann steering. **My setup:** *Driving*: modify the roboteq_diff_driver package to only listen to the linear velocity so the robot is only either going forward or backward *Steering*: I am using an arduino to control some [linear actuators](https://s3.amazonaws.com/actuonix/Actuonix+P16+Datasheet.pdf) to steer the wheels. I can subscribe to cmd_vel (more specficially, the angular velocity part) with the arduino and then control the linear actuators to meet the desired angular velocity based on ackermann geometry. So although arduino is steering the wheels correctly based on cmd_vel, I dont know how to connect this with rviz since all rviz sees is the odom tf published from the package, which in this case, is only going forward and backward. This is problematic because I am not able to use the navigation stack properly. **My question:** How do I feed steering information back in to rviz (Base_link frame) so that it considers steering? Would using an IMU solve this problem? I was thinking, am I able to use an IMU to obtain the yaw data and then use this data instead of calculating the yaw based on differential drive kinematics as demonstrated by the package (see code below) and send this information to base_link? Can someone let me know if I am on the right track or if there are any alternative way to combine steering with rviz? Below is the part of the package that handles odom publishing: void MainNode::odom_publish() { // determine delta time in seconds uint32_t nowtime = millis(); float dt = (float)DELTAT(nowtime,odom_last_time) / 1000.0; odom_last_time = nowtime; #ifdef _ODOM_DEBUG /* ROS_DEBUG("right: "); ROS_DEBUG(odom_encoder_right); ROS_DEBUG(" left: "); ROS_DEBUG(odom_encoder_left); ROS_DEBUG(" dt: "); ROS_DEBUG(dt); ROS_DEBUG(""); */ #endif // determine deltas of distance and angle float linear = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference + (float)odom_encoder_left / (float)encoder_cpr * wheel_circumference) / 2.0; // float angular = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference - (float)odom_encoder_left / (float)encoder_cpr * wheel_circumference) / track_width * -1.0; float angular = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference - (float)odom_encoder_left / (float)encoder_cpr * wheel_circumference) / track_width; #ifdef _ODOM_DEBUG /* ROS_DEBUG("linear: "); ROS_DEBUG(linear); ROS_DEBUG(" angular: "); ROS_DEBUG(angular); ROS_DEBUG(""); */ #endif // Update odometry odom_x += linear * cos(odom_yaw); // m odom_y += linear * sin(odom_yaw); // m odom_yaw = NORMALIZE(odom_yaw + angular); // rad #ifdef _ODOM_DEBUG //ROS_DEBUG_STREAM( "odom x: "
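On the IMU idea: conceptually, yes — you can take the yaw from the IMU and substitute it for the heading that the differential-drive code integrates from the encoder difference, while still taking the travelled distance from the encoders (or from the Roboteq's feedback). A minimal sketch of that substitution, written in Python rather than the package's C++ and with all names assumed for illustration:

```python
import math

def update_odom_with_imu(state, d_linear, imu_yaw):
    """Advance (x, y, yaw) using encoder distance and IMU heading.

    d_linear: distance travelled since the last update (from the encoders).
    imu_yaw:  absolute heading in radians (from the IMU), replacing the yaw
              that would otherwise be integrated from left/right wheel deltas.
    """
    x, y, _ = state
    x += d_linear * math.cos(imu_yaw)
    y += d_linear * math.sin(imu_yaw)
    return (x, y, imu_yaw)
```

The resulting pose would then be published as odom and broadcast as the odom -> base_link transform exactly as the package already does, so RViz and the navigation stack see a heading that reflects the steering. Many people instead feed wheel odometry and the IMU into robot_localization and let it produce the fused transform.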

Integrating Autonomous Navigation with SLAM

I'm a beginner in ROS and I'm trying to develop a robot that can autonomously navigate with the help of 3D Lidar data and IMU data. Using individual motor speeds as feedback for the navigation isn't an option because the robot is going to drive on sand and it is going to slip. So using IMU data is the only sure way to say it has reached its goal pose. Here is my scenario: I have an arduino to transform cmd_vel (subscribed) data into PWM values for the motors. I have a 3D lidar with an attached IMU to help me with mapping. I'm using an open source package like LOAM/Google cartographer for the mapping part. Now my question is, how do I integrate all this to use the data from the map to produce odom values that can be converted into cmd_vel data and published to my arduino for motor control. (I'm not sure I have the scenario right, but you get the idea) From all the reading I did online, people give vague pointers about Hector_Slam, gmapping, hector_exploration_planner, rosserial, etc. I haven't found even a single source that walks me through the steps, like, 1. Which topic generates what kind of data, which node should publish/subscribe to which topic. 2. How I can overcome not having motor encoder values by using IMU instead, etc. Any guidance or pointers to clear tutorials, open source github packages, videos, etc will be greatly appreciated. Thanks in advance. :)

Controller for robots

Hi, i found two ways to setup a controller to drive around with my mobile platform robot and am a bit confused. First I found the way to create a new xacro file and will it with a gazebo-tag and options for the controller:100.0/joint_base_wheel_front_leftjoint_base_wheel_front_rightjoint_base_wheel_rear_leftjoint_base_wheel_rear_right${2*0.77}${2*wheel_radius}heros_base_link130cmd_velodomodom1 Whats this file libgazebo_ros_skid_steer_drive.so? The second way is to load a controller in my gazebo launch file like that: Then I had this .yaml file for all the settings: robot_joint_publisher: type: "joint_state_controller/JointStateController" publish_rate: 50 robot_velocity_controller: type: "diff_drive_controller/DiffDriveController" left_wheel: ['joint_base_wheel_front_left', 'joint_base_wheel_rear_left'] right_wheel: ['joint_base_wheel_front_right', 'joint_base_wheel_front_right'] publish_rate: 100 pose_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03] twist_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03] cmd_vel_timeout: 0.25 # Base frame_id base_frame_id: robot_base_link # Odometry fused with IMU is published by robot_localization, so # no need to publish a TF based on encoders alone. enable_odom_tf: false # Wheel separation and radius multipliers wheel_separation_multiplier: 0.580 wheel_radius_multiplier : 0.32 # Velocity and acceleration limits linear: x: has_velocity_limits : true max_velocity : 0.9 # m/s has_acceleration_limits: true max_acceleration : 3.0 # m/s^2 angular: z: has_velocity_limits : true max_velocity : 2.0 # rad/s has_acceleration_limits: true max_acceleration : 6.0 # rad/s^2 In option 1 I added the topic to publish the odometry and so on. In the second way i didnt. Where does ROS know how to publish those things? To use all of this I have to add a transmission tag to my wheels like that. Is that right? transmission_interface/SimpleTransmissionhardware_interface/VelocityJointInterface100 Are this two described ways the same thing or does it make any difference to do it like one of this ways? Are these ways common to setup the control setting for a robot or did i do incorrect things? Thanks for your help Max

how I convert string to float?

I'm using an Arduino to send values from 0.00 to 5.00 and I'm trying to read them, but I get this error:

    raise ROSSerializationException(str(e))
    rospy.exceptions.ROSSerializationException: : 'required argument is not a float' when writing 'x: "0.92"

This is the code; how can I convert the value to a float?

    def __init__(self):
        rospy.init_node('my_tf_broadcaster')
        self.ser = serial.Serial('/dev/ttyACM0', 115200)
        self.get_odometry()

    def get_odometry(self):
        self.pub_tf = rospy.Publisher("/tf", tf.msg.tfMessage)
        change = 0.0
        while not rospy.is_shutdown():
            # Run this loop at about 10Hz
            rospy.sleep(0.1)
            message = self.ser.readline()

            t = geometry_msgs.msg.TransformStamped()
            t.header.stamp = rospy.Time.now()
            t.header.frame_id = "odom"
            t.child_frame_id = "base_link"
            t.transform.translation.x = message.rstrip()  # trying to read this value
            t.transform.translation.y = 1.0
            t.transform.translation.z = 0.0
            t.transform.rotation.x = 0.0
            t.transform.rotation.y = 0.0
            t.transform.rotation.z = 0.0
            t.transform.rotation.w = 1.0

            tfm = tf.msg.tfMessage([t])
            self.pub_tf.publish(tfm)
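The serializer complains because `serial.readline()` returns a string, while `transform.translation.x` must be a Python float. A small hedged sketch of the conversion, guarding against serial lines that fail to parse:

```python
def parse_reading(raw):
    """Convert one serial line (e.g. '0.92\r\n') to a float, or None."""
    try:
        return float(raw.strip())
    except ValueError:
        return None

# In the loop above one would then write something like:
#     value = parse_reading(self.ser.readline())
#     if value is not None:
#         t.transform.translation.x = value
```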

Can I use motors without encoders (open loop) for base of robot?

I have read that I need to publish odometry data so that the robot knows its current position. I have encoders built into the motors, so I know that the steps I am sending will be accurate. Is there a way I can publish the goal state as the current state, and will that work?

Robot Localization with Odometry and UM7

Hello all, I am trying to get localization to work for a custom car like robot. I am able to publish odom from wheel encoders and imu/data from my UM7 imu sensor. I would like to fuse together these two data using the [robot localization](http://docs.ros.org/indigo/api/robot_localization/html/index.html) package and then show it with base_link in rviz. Without proper localization of base_link, I will not be able to run the navigation stack properly. In my current setup, rviz is able to show forward and backward movement with base_link; however, when I manually rotate the um7 imu sensor, it does not reflect within rviz. I can see that the odometry/filtered topic is receiving input from the um7 but how do i visualize this in rviz? My end goal is to place this um7 on my custom car like robot. I want rviz to display the base_link mimicking its real life counterpart in terms of position/orientation and velocities (moving forward/backward, turing left/right...etc). It is to my understanding that this is what the robot localization package is for, please correct my understanding if i am making a mistake. Here are two videos describing my problem: [1st video](https://www.youtube.com/watch?v=Sa5ZqVbpd6o&feature=youtu.be) shows the screen recording from my computer. You can see that moving forward and backward is okay, but I do not know how to turn the base_link. [2nd video](https://www.youtube.com/watch?v=40nO4phqiqI&feature=youtu.be) shows what I am trying to do Below is the configuration file for the robot_localization package: frequency: 50 sensor_timeout: 0.1 two_d_mode: true transform_time_offset: 0.0 transform_timeout: 0.0 print_diagnostics: true debug: false debug_out_file: /path/to/debug/file.txt publish_tf: true publish_acceleration: false odom_frame: odom # Defaults to "odom" if unspecified base_link_frame: base_link # Defaults to "base_link" if unspecified world_frame: odom # Defaults to the value of odom_frame if unspecified odom0: /odom odom0_config: [false, false, false, false, false, true, true, true, false, false, false, true, false, false, false] odom0_queue_size: 2 odom0_nodelay: false odom0_differential: false odom0_relative: false odom0_pose_rejection_threshold: 5 odom0_twist_rejection_threshold: 1 imu0: /imu/data imu0_config: [false, false, false, false, false, true, false, false, false, false, false, true, false, false, false] imu0_nodelay: false imu0_differential: false imu0_relative: true imu0_queue_size: 5 imu0_pose_rejection_threshold: 0.8 # Note the difference in parameter names imu0_twist_rejection_threshold: 0.8 # imu0_linear_acceleration_rejection_threshold: 0.8 # imu0_remove_gravitational_acceleration: true use_control: true stamped_control: false control_timeout: 0.2 control_config: [true, false, false, false, false, true] acceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 3.4] deceleration_limits: [1.3, 0.0, 0.0, 0.0, 0.0, 4.5] acceleration_gains: [0.8, 0.0, 0.0, 0.0, 0.0, 0.9] deceleration_gains: [1.0, 0.0, 0.0, 0.0, 0.0, 1.0] process_noise_covariance: [0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.04, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.02, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.015] initial_estimate_covariance: [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]

Robot doesn't start moving

I have set up the navigation stack on my robot with all the nodes up and running (move_base, amcl, costmap_2d, etc.). I am also able to move my robot using teleoperation (keyboard control) and see its motion in RViz. Now, I have created a map using gmapping and loaded it with amcl, and when I give a 2D Nav Goal command from RViz, the robot does not move. It shows the planned path in RViz but does not move. In the terminal, it prints 'Got new plan' many times and at the end prints 'Rotate recovery behavior started'. After that, nothing happens. Any ideas why the robot is not moving? Thanks

How to fuse encoder ticks + IMU for odometry?

Hi, I have an IMU which shows the orientation perfectly and also installed some optical encoders. When I move the robot straight, the encoders will show the movement more or less correctly. However, when I turn and then move, IMU first displays orientation then when I move after the turn suddenly my robot gets drifted back and then move. I guess that this is because encoders do their job separately but how can I fuse them together? This is my encoder tick callback (one of them, second will be similar): void Odometry::leftencoderCb(const std_msgs::Int64::ConstPtr& left_ticks) { double enc = left_ticks->data; if((enc encoder_high_wrap)) { lmult = lmult + 1; } if((enc > encoder_high_wrap) && (prev_lencoder orientation.x; orient_t = data->orientation.y; orient_z = data->orientation.z; orient_w = data->orientation.w; } And finally, here is my update code which updates odometry periodically: void Odometry::update() { ros::Time now = ros::Time::now(); double elapsed; double d_left, d_right, d, th,x,y; elapsed = now.toSec() - then.toSec(); if(enc_left == 0) { d_left = 0; d_right = 0; } else { d_left = (left - enc_left) / ( ticks_meter); d_right = (right - enc_right) / ( ticks_meter); } enc_left = left; enc_right = right; d = (d_left + d_right ) / 2.0; th = ( d_right - d_left ) / base_width; dx = d / elapsed; dr = th / elapsed; if (d != 0) { ROS_INFO("Distance has been changed!"); x = cos(th) * d; y = -sin(th) * d; x_final += (cos(theta_final) * x - sin(theta_final) * y); y_final += (sin(theta_final) * x + cos(theta_final) * y); } theta_final += th; geometry_msgs::Quaternion odom_quat ; odom_quat.x = this->orient_x; odom_quat.y = this->orient_y; odom_quat.z = this->orient_z; odom_quat.w = this->orient_w; geometry_msgs::TransformStamped odom_trans; odom_trans.header.stamp = now; odom_trans.header.frame_id = "odom"; odom_trans.child_frame_id = "base_footprint"; odom_trans.transform.translation.x = x_final; odom_trans.transform.translation.y = y_final; odom_trans.transform.translation.z = 0.0; odom_trans.transform.rotation = odom_quat; odom_broadcaster.sendTransform(odom_trans); // Odometry values: nav_msgs::Odometry odom; odom.header.stamp = now; odom.header.frame_id = "odom"; odom.pose.pose.position.x = x_final; odom.pose.pose.position.y = y_final; odom.pose.pose.position.z = 0.0; odom.pose.pose.orientation = odom_quat; odom.child_frame_id = "base_footprint"; odom.twist.twist.linear.x = dx; odom.twist.twist.linear.y = 0.0; //dy; odom.twist.twist.angular.z = 0.0; //dr; odom_pub.publish(odom); then = now; ros::spinOnce(); else ROS_INFO_STREAM("Not in a loop"); } Can someone suggest me how can I write odometry code more correctly?
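One common approach, short of running a full robot_localization EKF, is to keep the encoders for the distance travelled and take the heading from the IMU instead of integrating `th` from the wheel difference. A minimal sketch of the idea (Python for brevity; all names are illustrative, not taken from the code above):

```python
import math

def fuse_encoders_and_imu(x, y, d_left, d_right, imu_quat):
    """One odometry update using encoder distance and IMU heading.

    d_left, d_right: wheel travel since the last update, in meters.
    imu_quat:        (x, y, z, w) orientation from the IMU message.
    """
    # Yaw from the IMU quaternion (valid for a roughly planar robot).
    qx, qy, qz, qw = imu_quat
    yaw = math.atan2(2.0 * (qw * qz + qx * qy),
                     1.0 - 2.0 * (qy * qy + qz * qz))

    d = (d_left + d_right) / 2.0
    x += d * math.cos(yaw)
    y += d * math.sin(yaw)
    return x, y, yaw
```

The drift described in the question often comes from mixing an encoder-integrated heading with an IMU orientation that disagrees with it; using a single heading source, or fusing both in an EKF such as robot_localization, keeps the pose consistent.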

Odometry for a wheelchair

Hi there, I'm planning to use the navigation stack. I currently have the lidar attached to my base, an electric wheelchair. However, odometry seems to be required for the navigation stack. I have searched around, but it seems normal wheel odometry might not fit my case. Is there another good and **cheap** option (type and model) for that? Or is there a package that can substitute for physical odometry (a tutorial would be great)? Thanks!

How to combine LIDAR and IMU with Hector mapping.

Hi, I am a beginner to ROS trying to fuse an IMU for odometry together with a Hokuyo LIDAR and hector_mapping to create an accurate map. Currently the map works in RViz, but the frames do not move continuously. Sorry for the links, but I am unable to add the images directly. I am unsure where to begin and have just confused myself over the past week. I am using the xsens IMU driver [here](https://github.com/ethz-asl/ethzasl_xsens_driver), and I'm trying to follow this tutorial for setting up hector_mapping [here](http://wiki.ros.org/hector_slam/Tutorials/SettingUpForYourRobot). This is a graph of my [frames and nodes](https://imgur.com/a/FOjJz8c), and these are the current nodes. These are my launch files for the [mapping](https://pastebin.com/dM5yfdze), the [IMU](https://pastebin.com/CpQpW2L5) and the [Hector attitude-to-tf package](https://pastebin.com/PA8DM9m3). I am not quite sure what I am doing even after reading the docs, and I would really appreciate any pointers.

fake encoder ticks from laser scanner?

Hi, this is my first post here. I have a 4WD differential robot with encoders, but after testing I think the encoders don't work (except one). So I thought I would do away with encoders altogether and use a 2D YDLIDAR; with hector_slam or rf2o_laser_odometry I can publish the odom required for interfacing with the navigation stack. I also have a Kinect and will buy an IMU. My problem is how to implement closed-loop PID control, since I don't have encoders to measure velocity. Can I infer velocity from the laser scan and use that for the PID control? And can I use this inferred velocity to generate fake encoder ticks for interfacing with ros_control? I have a Jetson TX2, so computing performance is good. Thanks a lot!
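For what it's worth, the odometry published by rf2o_laser_odometry (or hector) already carries a twist estimate, so one hedged way to close the loop is to treat that as the measured velocity for the PID and, if ros_control really wants ticks, integrate it into a fake encoder count. A rough sketch under those assumptions; the topic name and tick resolution are placeholders:

```python
import rospy
from nav_msgs.msg import Odometry

TICKS_PER_METER = 4000.0   # assumed resolution of the fake encoder
fake_ticks = 0.0
last_stamp = None


def odom_callback(msg):
    """Use laser-odometry twist as measured velocity; integrate fake ticks."""
    global fake_ticks, last_stamp
    v = msg.twist.twist.linear.x          # measured forward velocity, m/s
    stamp = msg.header.stamp
    if last_stamp is not None:
        dt = (stamp - last_stamp).to_sec()
        fake_ticks += v * dt * TICKS_PER_METER
    last_stamp = stamp
    # v would be fed to the PID as the feedback term;
    # fake_ticks could be reported to ros_control as a joint position.


if __name__ == '__main__':
    rospy.init_node('laser_odom_feedback')
    rospy.Subscriber('/odom_rf2o', Odometry, odom_callback)  # topic name assumed
    rospy.spin()
```

Note that laser odometry gives one body velocity, not per-wheel velocities, so for a skid-steer base the left/right split would have to be approximated from the commanded turn rate.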

Improve map created by Gmapping

Hi all, I implemented the gmapping algorithm in a featureless tunnel/pipe environment. Since LiDAR/cameras are not that effective in these environments, stable odometry is the primary data that can be used in such situations. I read that there are many parameters in the gmapping.launch.xml file that can be tuned to get the desired results. I tried tuning the number of particles and map_update_interval, and that really changed a lot. But there are many other parameters, like minimumScore, linearUpdate, srr, srt, etc., which might affect the map as well, and I'm not sure how to tune them for my application since there are so many possibilities. Can someone help me in this regard?



