Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Calculating path using linear and angular velocity published by odometry

I have a `nav_msgs/Odometry` topic that constantly publishes linear and angular velocity. I am computing the distance travelled by taking the velocities in X and Y, multiplying each by `1/rate (Hz)`, and summing the results. Is this calculation right? I am also a bit skeptical about how to compute the heading change from the angular velocity. Any guidance here is appreciated. I am visualizing the path in RViz, so it would be great if you could point me to relevant source code that I could look into.
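
For reference, a minimal dead-reckoning sketch in Python that integrates the twist of an Odometry topic into a `nav_msgs/Path` for RViz. The topic names and frame name are assumptions, and it assumes the twist is expressed in the robot's body frame (the usual convention for `child_frame_id`), so the velocity is rotated by the current heading before being summed:

    #!/usr/bin/env python
    # Minimal dead-reckoning sketch (assumed topic names, body-frame twist).
    import math
    import rospy
    from nav_msgs.msg import Odometry, Path
    from geometry_msgs.msg import PoseStamped

    x = y = yaw = 0.0
    last_time = None
    path = Path()

    def odom_cb(msg):
        global x, y, yaw, last_time
        now = msg.header.stamp
        if last_time is not None:
            dt = (now - last_time).to_sec()
            vx, vy = msg.twist.twist.linear.x, msg.twist.twist.linear.y
            wz = msg.twist.twist.angular.z
            # Rotate the body-frame velocity into the odom frame, then integrate.
            x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
            y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
            yaw += wz * dt
        last_time = now

        pose = PoseStamped()
        pose.header.frame_id = path.header.frame_id = "odom"
        pose.header.stamp = now
        pose.pose.position.x = x
        pose.pose.position.y = y
        path.poses.append(pose)
        path_pub.publish(path)

    rospy.init_node("twist_integrator")
    path_pub = rospy.Publisher("integrated_path", Path, queue_size=1)
    rospy.Subscriber("odom", Odometry, odom_cb)
    rospy.spin()

Simply summing `vx*dt` and `vy*dt` without rotating by the current heading only works if the velocities are already expressed in the odom frame; the heading itself is just the integral of the angular velocity.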

odometry message from Arduino

Hello, I am trying to create and publish odometry data for a differential-drive robot from an Arduino Due, with the data coming from wheel encoders and a gyro attached to the board. My intention is to build and publish the odometry messages on the Arduino Due itself, since it has a fairly powerful microcontroller (is that good practice?). I have a few questions to clarify things for myself. I checked out the [publishing odometry tutorial](http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom) and noticed the `odom.twist.twist.linear.y = vy` field: do I have to hard-code this to `0.0` for a differential-drive robot? Is it all right to just convert the gyro's readings (in degrees per second) to rad/s and pass them straight through as `vth` in `odom.twist.twist.angular.z = vth`? On the Arduino it seems I cannot use `odom_quat` as in the tutorial to convert my heading to a quaternion; should I instead fill in `odom_trans.transform.rotation.z` and `odom_trans.transform.rotation.w` manually and compute the values myself? Finally, at what rate is it best practice to calculate and publish odometry for a TurtleBot-class differential-drive robot? Any pointers regarding what I'm trying to do would be appreciated. Thank you.
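
As a point of comparison, here is a minimal Python sketch of how the fields of a diff-drive `nav_msgs/Odometry` message are typically filled: `vy` stays at zero, the gyro rate (already converted to rad/s) goes into `angular.z`, and a yaw-only quaternion needs just the `z` and `w` components. Names such as `x`, `y`, `th`, `vx`, and `vth` are placeholders for values computed from your encoders and gyro:

    import math
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Quaternion

    def make_odom(x, y, th, vx, vth, stamp):
        """Fill an Odometry message for a diff-drive base (assumed frame names)."""
        odom = Odometry()
        odom.header.stamp = stamp
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"

        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        # A pure yaw rotation has only z and w components: q = (0, 0, sin(th/2), cos(th/2))
        odom.pose.pose.orientation = Quaternion(0.0, 0.0,
                                                math.sin(th / 2.0),
                                                math.cos(th / 2.0))

        odom.twist.twist.linear.x = vx    # forward speed from the encoders [m/s]
        odom.twist.twist.linear.y = 0.0   # a diff-drive base cannot move sideways
        odom.twist.twist.angular.z = vth  # gyro rate converted to rad/s
        return odom

On the Arduino side, rosserial's ros_lib also provides a `tf::createQuaternionFromYaw` helper; if it is not available in your setup, writing `z = sin(th/2)` and `w = cos(th/2)` into the transform by hand is equivalent.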

Issue while creating a Map using gmapping?

Hello, I am confused about where the issue really lies; we urgently need to solve it, so any help is appreciated. With our mobile-base robot we are now at the stage of creating a map using gmapping, but while building the map I am facing one issue. When I move the robot with teleop_twist_keyboard, `cmd_vel` is generated properly and the robot moves smoothly. However, when I visualize it in RViz, the robot's movement is shown correctly for a few seconds, but then it suddenly changes its route or position on its own, even when the robot is not moving at all. I have analysed my scan data and my odometry data and both look good. When I select "odom" as the fixed frame the issue does not appear; I only face it when I am trying to create a map, i.e. when I select "map" as the fixed frame. Can anyone suggest why this would be happening and where the problem might be?

TF of URDF is not showing from robot state publisher

I am using a Raspberry Pi and trying to publish the TF of a URDF file. I created a URDF for a differential-drive robot and wrote a launch file that starts joint_state_publisher and robot_state_publisher to publish the TF. When I run the launch file and try to read the TF using tf_monitor, no TF values are shown; my result is below. When I run `rostopic echo tf_static`, the topic is publishing. ![image description](/upfiles/1477251686645989.png) The strange part is that when I tried this on my local PC (with the same URDF and launch file) the result was perfect: the TF of all the links was published. Is there any limitation on the Raspberry Pi for publishing the robot state of a URDF? How can I resolve this?

Calibrating Pioneer P3DX robot for ROSARIA

Hi all, I am setting up a new Pioneer 3DX with a Kinect and a URG laser sensor. So far I have managed to mount all the sensors and connect them via ROSARIA. I would now like to do odometry calibration of the mobile base. Does anyone know how to do it? There are no specific tutorials on the ROS wiki page. Also, what are the optimal values of TicksMM, DriftFactor, and RevCount for the P3DX? At present, when I run the rosaria node I get these settings: TicksMM: 128, DriftFactor: 0, RevCount: 16570. I can change these values using dynamic_reconfigure, but I need the optimal values for this particular model, and I would like to know how to change these parameters permanently so they are applied every time I launch the rosaria node. Can someone explain the process? Basically, I believe the robot's odometry is not good when it is turning; how do I correct that? Thanks.
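
One way to apply calibration values at startup, without reconfiguring by hand each time, is to call dynamic_reconfigure from a small script launched alongside the driver. This is a hedged sketch: the node name `RosAria` and the values below are assumptions to be replaced by whatever your calibration yields.

    #!/usr/bin/env python
    # Apply P3DX calibration parameters once the driver is up (assumed node name "RosAria").
    import rospy
    from dynamic_reconfigure.client import Client

    rospy.init_node("rosaria_calibration")
    client = Client("RosAria", timeout=30)
    client.update_configuration({
        "TicksMM": 128,      # replace with your calibrated ticks per mm
        "DriftFactor": 0,    # replace with your calibrated drift factor
        "RevCount": 16570,   # replace with your calibrated counts per revolution
    })

If the driver also reads these as private ROS parameters at startup (check `rosparam list` after launching), setting them directly in the launch file would be the simplest permanent option.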

STM32 Microcontroller for communication with ROS

I currently intend to use an STM32 (STM32F103C8T6) as the microcontroller that publishes the odometry message as well as messages from other sensors. My questions are as follows: 1. Is there any support for the STM32 for communicating with ROS (client libraries, serial, etc.)? I know of rosserial, but its tutorial pages are quite confusing to me. 2. Has anyone used an STM32 in their robots? Would it be better to port to an Arduino? If not, is writing an entirely new node for the STM32 a tiresome task? Thanks in advance.

Adding odometry to my Custom Robot

I have the following code, which lets my robot be driven via teleop. I now need to publish odometry values. Does anyone have good ideas on how to do it?

    import roslib
    import rospy
    import time
    from BrickPi import *
    from std_msgs.msg import String
    from std_msgs.msg import UInt16
    from geometry_msgs.msg import Twist
    #from nav_msgs.msg import *

    # Model dependent settings
    PI = 3.141
    ROBOT_WIDTH = 0.16
    WHEEL_DIAMETER = 0.055
    WHEEL_RADIUS = WHEEL_DIAMETER / 2
    WHEEL_PERIMETER = 2 * PI * WHEEL_RADIUS

    # RPM depends on voltage, without load:
    #   9.0V = 120 RPM
    #   7.5V = 80 RPM
    # with load 60-80 RPM is a good average
    MAX_RPM = 80.0
    RPS = MAX_RPM / 60.0
    MPS = RPS * WHEEL_PERIMETER
    PWRDIV = 1000 * RPS
    rospy.loginfo("PWRDIV:" + str(PWRDIV))

    RF_WHEEL = PORT_C
    LF_WHEEL = PORT_D

    BrickPiSetup()
    BrickPi.MotorEnable[RF_WHEEL] = 1
    BrickPi.MotorEnable[LF_WHEEL] = 1
    BrickPi.SensorType[PORT_1] = TYPE_SENSOR_ULTRASONIC_CONT
    BrickPiSetupSensors()

    def cmd_vel_callback(cmd_vel):
        # Convert the commanded twist into left/right wheel speeds.
        left_speed_out = cmd_vel.linear.x - cmd_vel.angular.z * ROBOT_WIDTH / 2
        right_speed_out = cmd_vel.linear.x + cmd_vel.angular.z * ROBOT_WIDTH / 2
        v = cmd_vel.linear.x       # speed m/s
        theta = cmd_vel.angular.z  # angular rate rad/s
        rospy.loginfo("VEL_CMD_CB: v:" + str(v) + ", theta:" + str(theta))
        motor_control(left_speed_out, right_speed_out)

    def vel_cmd_listener():
        rospy.Subscriber("cmd_vel", Twist, cmd_vel_callback)

    def motor_control(left_speed_out, right_speed_out):
        rospy.loginfo("LSPEED:" + str(left_speed_out) + " RSPEED:" + str(right_speed_out))
        BrickPi.MotorSpeed[RF_WHEEL] = int(-right_speed_out * PWRDIV)
        rospy.loginfo("RF:" + str(BrickPi.MotorSpeed[RF_WHEEL]))
        BrickPi.MotorSpeed[LF_WHEEL] = int(-left_speed_out * PWRDIV)
        rospy.loginfo("LF:" + str(BrickPi.MotorSpeed[LF_WHEEL]))
        BrickPiUpdateValues()
        time.sleep(.01)
        scan_publisher()

    def scan_publisher():
        result = BrickPiUpdateValues()
        if not result:
            range = BrickPi.Sensor[PORT_1]
            us = rospy.Publisher('scan', UInt16)
            us.publish(UInt16(range))
            rospy.loginfo("SCAN:" + str(range))

    if __name__ == '__main__':
        try:
            rospy.init_node('RobotBaseController')
            vel_cmd_listener()
            time.sleep(.01)
            rospy.spin()
        except rospy.ROSInterruptException:
            pass
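
A hedged sketch of how odometry could be added on top of this node: read the wheel encoders, integrate a differential-drive pose, and publish `nav_msgs/Odometry`. It reuses `ROBOT_WIDTH`, `WHEEL_PERIMETER`, `RF_WHEEL` and `LF_WHEEL` from the script above, and assumes the standard BrickPi Python API in which `BrickPi.Encoder[port]` reports half-degree ticks (720 per wheel revolution); the resolution and signs should be verified on the actual hardware.

    import math
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Quaternion

    TICKS_PER_REV = 720.0   # assumed BrickPi encoder resolution (half-degree ticks)

    odom_pub = None
    odo_x = odo_y = odo_th = 0.0
    last_l = last_r = None

    def init_odometry():
        # Call this from __main__ after rospy.init_node().
        global odom_pub
        odom_pub = rospy.Publisher('odom', Odometry, queue_size=10)

    def publish_odometry(event=None):
        """Integrate encoder deltas into a pose and publish nav_msgs/Odometry."""
        global odo_x, odo_y, odo_th, last_l, last_r
        if odom_pub is None:
            return
        BrickPiUpdateValues()
        l, r = BrickPi.Encoder[LF_WHEEL], BrickPi.Encoder[RF_WHEEL]
        if last_l is None:
            last_l, last_r = l, r
            return
        d_left = (l - last_l) / TICKS_PER_REV * WHEEL_PERIMETER
        d_right = (r - last_r) / TICKS_PER_REV * WHEEL_PERIMETER
        last_l, last_r = l, r

        d = (d_left + d_right) / 2.0              # distance travelled this step
        d_th = (d_right - d_left) / ROBOT_WIDTH   # heading change this step
        odo_x += d * math.cos(odo_th + d_th / 2.0)
        odo_y += d * math.sin(odo_th + d_th / 2.0)
        odo_th += d_th

        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = odo_x
        odom.pose.pose.position.y = odo_y
        odom.pose.pose.orientation = Quaternion(0, 0, math.sin(odo_th / 2.0),
                                                math.cos(odo_th / 2.0))
        odom_pub.publish(odom)

In `__main__`, after `rospy.init_node(...)`, calling `init_odometry()` and then `rospy.Timer(rospy.Duration(0.1), publish_odometry)` would update the estimate at roughly 10 Hz.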

Use navigation to correct robot's path

Until now I was using Gazebo for simulation; now I am trying to run the same setup on a real robot. I have been mostly successful, but there is one problem with path following. When the path is straight, the robot **fails to follow a perfectly straight line (as it did in Gazebo)**. This may be due to **limitations of the odometry-based** speed control of the motors. So, if the navigation stack detects (*from the encoder values that are computed and published as odom*) that the robot is not following the path it previously published, will it correct the robot by issuing an updated **path or velocity command as feedback**? Does the navigation stack do that, or is there another way to achieve it?

Laser scan data not aligned with map data and real world (Navigation Stack)

![image description](/upfiles/1486285790714428.jpg) Good morning, I have recently been facing a problem with navigation. While navigating, the inflated laser scan is not aligned with the walls in the map. It happens at every corner of the map: the laser scanner starts detecting the corner before the robot actually reaches it in reality, which makes the whole navigation fail. The strange thing is that I have been navigating with this robot for around six months and everything was working fine; this problem appeared suddenly, without any change to the navigation software. Any suggestion as to what the problem could be? Is it the odometry or the laser scanner? If someone has had and fixed such a problem before, any help would be great. Thanks a lot.

find 3D location of the sensor when a depth image arrives

Hey buddies! I went through the forum and couldn't find a satisfying answer; sorry if I am reopening an already-answered question. Basically, I need to write a ROS node that prints the 3D location of the sensor and the timestamp in text format whenever a depth image arrives. Here is what I think I should do (though I don't know how): 1. Build a robot with an odometer and mount a web camera on it. 2. Create a node that publishes the odometer's data (odom_node). 3. Create a node that receives the depth images (cam_node). 4. Somehow make the odom_node listen to the cam_node. I have thousands of questions to ask, but that would make this too broad, and I don't really know where to start. Could someone direct me to good tutorials and projects where I can learn and finish the project on my own? I appreciate all answers. Thank you in advance.
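
For step 4, a common pattern is a single node that subscribes to the depth image topic and, in the callback, asks tf for the sensor pose at the image timestamp, then writes it to a file. This is a sketch only: the topic name `/camera/depth/image_raw`, the `odom` frame, and the output file name are assumptions to adapt to your setup.

    import rospy
    import tf
    from sensor_msgs.msg import Image

    listener = None
    logfile = None

    def depth_cb(msg):
        """On every depth image, look up where the sensor was and log it."""
        try:
            listener.waitForTransform('odom', msg.header.frame_id,
                                      msg.header.stamp, rospy.Duration(0.5))
            (trans, rot) = listener.lookupTransform('odom', msg.header.frame_id,
                                                    msg.header.stamp)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException) as e:
            rospy.logwarn('tf lookup failed: %s', e)
            return
        logfile.write('%f %f %f %f\n' % (msg.header.stamp.to_sec(),
                                         trans[0], trans[1], trans[2]))

    if __name__ == '__main__':
        rospy.init_node('depth_pose_logger')
        listener = tf.TransformListener()
        logfile = open('sensor_positions.txt', 'w')
        rospy.Subscriber('/camera/depth/image_raw', Image, depth_cb)
        rospy.spin()

This only works once something (wheel odometry, robot_state_publisher, etc.) is actually broadcasting the transform from `odom` to the camera frame.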

Odometer_Arduino

Hi, okay, I hate to admit it but I am a total noob, though I really do work hard to learn this stuff. I have made an odometer using an Arduino and a couple of IR LEDs. Below is the sketch (the end of the `if` block was cut off when posting):

    int counter;
    int currReading;
    int prevReading = 0;
    int threashold = 600;

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      currReading = analogRead(A0);
      delay(1);
      if (currReading > threashold && prevReading <= threashold) {
        // NOTE: the rest of this sketch was cut off in the original post.
        // A typical completion counts the rising edge and reports it:
        counter++;
        Serial.println(counter);
      }
      prevReading = currReading;
    }

I don't have any special hardware that supports the existing libraries; I am making my own stuff. I am literally lost, so any help is much appreciated. Millions of thanks in advance.
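
If the goal is to turn these pulse counts into odometry in ROS, the usual chain is to publish the raw count (for example over rosserial) and convert counts to distance using the wheel geometry. A minimal sketch of that conversion, with the slots-per-revolution and wheel diameter as made-up example values:

    import math

    SLOTS_PER_REV = 20          # IR pulses per wheel revolution (example value)
    WHEEL_DIAMETER = 0.065      # metres (example value)
    WHEEL_CIRCUMFERENCE = math.pi * WHEEL_DIAMETER

    def ticks_to_distance(ticks):
        """Convert a pulse count from the IR odometer into metres travelled."""
        return ticks / float(SLOTS_PER_REV) * WHEEL_CIRCUMFERENCE

    # e.g. 57 pulses on a 20-slot disc with a 65 mm wheel:
    print(ticks_to_distance(57))   # ~0.58 m

With one such count per wheel, the distances feed the standard differential-drive pose update shown elsewhere on this page.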

Stereo camera, preprocessing, transforms and frames = pointcloud?

Hello, I try to avoid asking the simple questions, so trust me, I have really tried to figure this out for myself, but I find it to be beyond my understanding. **Situation**: I am acquiring BGR-preprocessed stereo views, but this has raised some issues for me, and the camera software is aimed towards Indigo's (and Jade's?) "uvc_camera". For concreteness, let's assume the simplest version of a differential-wheeled robot, with a base footprint, base link, etc., as frequently used in the tutorials. My goal is to achieve perception of the environment while reusing as much as possible from generic packages. The camera is published under the /stereo/ namespace.

1) How do you describe a stereo camera in xacro/URDF? Is the camera "one" link, with e.g. an "optical frame" as origin, or should the left and right cameras be described separately? Which frames/transforms am I advised to implement? Should I use a dedicated driver?

2) I keep getting error messages about the packages expecting a colour image. Since I am facing all these other challenges at the same time, I cannot tell whether this is a cause or a consequence. I have set up stereo_image_proc, but with these issues unresolved I find it natural that I get no disparity or point-cloud data.

3) If colour images are required, could it be an idea to take the captured BGR frames, use OpenCV to create a colour image emphasising the parameters of interest, and then feed these to the stereo-vision algorithms at a lower frame rate? (I have managed to use the BGR images for CV processing through a cv_bridge node.)

4) Related to the above: how do I successfully propagate odometry/pose in a simple conceptual system where an EKF node provides filtered sensor data? I assume I should remap a topic somewhere, but I am struggling to grasp the essential part.

5) In general, as with the camera, I am struggling with the countless possibilities relating to frames, tf's, and dedicated drivers. Should I use a kobuki driver or a generic differential driver? What are the main pros and cons? (It should be mentioned that I won't receive my wheel encoders before the end of next week, so for now I will have to estimate odometry.)

I hope I'm not completely off track here; thanks in advance. Please let me know if I should change and/or elaborate on any of these issues.
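
On question 1, a common convention is one camera body link plus separate left/right optical frames (z forward, x right, y down per REP 103), published as static transforms so stereo_image_proc and the TF tree agree on the geometry. Below is a hedged sketch using `tf2_ros.StaticTransformBroadcaster`; the 7 cm baseline, the 10 cm mounting offset and the frame names are assumptions.

    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped

    def static_tf(parent, child, x=0.0, y=0.0, z=0.0, qx=0.0, qy=0.0, qz=0.0, qw=1.0):
        t = TransformStamped()
        t.header.stamp = rospy.Time.now()
        t.header.frame_id = parent
        t.child_frame_id = child
        t.transform.translation.x = x
        t.transform.translation.y = y
        t.transform.translation.z = z
        t.transform.rotation.x = qx
        t.transform.rotation.y = qy
        t.transform.rotation.z = qz
        t.transform.rotation.w = qw
        return t

    rospy.init_node('stereo_frames')
    broadcaster = tf2_ros.StaticTransformBroadcaster()
    broadcaster.sendTransform([
        # camera body mounted 10 cm forward/up of base_link (example values)
        static_tf('base_link', 'stereo_link', x=0.1, z=0.1),
        # left and right lenses, separated by an assumed 7 cm baseline
        static_tf('stereo_link', 'left_camera_frame', y=0.035),
        static_tf('stereo_link', 'right_camera_frame', y=-0.035),
        # optical frames: rotated so z points out of the lens (x right, y down)
        static_tf('left_camera_frame', 'left_camera_optical_frame',
                  qx=-0.5, qy=0.5, qz=-0.5, qw=0.5),
        static_tf('right_camera_frame', 'right_camera_optical_frame',
                  qx=-0.5, qy=0.5, qz=-0.5, qw=0.5),
    ])
    rospy.spin()

With the optical frames in place, the `frame_id` in the left/right image and camera_info headers should name these optical frames, so that the disparity and point cloud from stereo_image_proc land in the right place in TF.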

Runtime Odometry Correction

Hello, we are stuck at a point and unable to solve it; please suggest how I can do this. We are working on a small module: first we drive the robot a certain distance, e.g. 7 m, and store the drive trajectory data (odometry + laser scans) as reference points. Using a feedback controller and the stored data, we then make the robot travel the same taught path. The problem is that at run time we get drift in the current odometry, because of which we are unable to track the exact taught path. Is there any technique to compensate for run-time drift in the robot's odometry? Please let us know if anyone has an idea about this.

Pose Covariance for swiftnav_ros rtk message

Hello all, I am using a Piksi Multi RTK GPS system for centimetre-accurate pose information. Below is a snapshot of the `nav_msgs/Odometry` messages published by the swiftnav_ros package, after taking my little robot outside and driving it around with the GPS in RTK mode.

    header: 
      seq: 183
      stamp: 
        secs: 1507219077
        nsecs: 699150124
      frame_id: gps
    child_frame_id: ''
    pose: 
      pose: 
        position: 
          x: 2.89
          y: 4.694
          z: -0.749
        orientation: 
          x: 0.0
          y: 0.0
          z: 0.0
          w: 0.0
      covariance: [324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1296000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]
    twist: 
      twist: 
        linear: 
          x: -1.111
          y: 0.852
          z: 0.016
        angular: 
          x: 0.0
          y: 0.0
          z: 0.0
      covariance: [0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0046500000000000005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]
    ---
    header: 
      seq: 184
      stamp: 
        secs: 1507219077
        nsecs: 799420749
      frame_id: gps
    child_frame_id: ''
    pose: 
      pose: 
        position: 
          x: 2.779
          y: 4.766
          z: -0.75
        orientation: 
          x: 0.0
          y: 0.0
          z: 0.0
          w: 0.0
      covariance: [324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 324000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1296000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]
    twist: 
      twist: 
        linear: 
          x: -1.099
          y: 0.715
          z: -0.026
        angular: 
          x: 0.0
          y: 0.0
          z: 0.0
      covariance: [0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0022500000000000003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0046500000000000005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]

Please note the pose covariances for x, y, and z: 324000000.0, 324000000.0, and 1296000000.0 in the samples above (I have seen values as high as 529000000.0, 529000000.0, and 2601000000.0). These numbers seem obscenely large; is this normal and correct? The numbers in the twist message seem reasonable. Is it possible that I wasn't in RTK mode? I think it is possible I was in float RTK and not fixed RTK. Thank you, Jad

Sabertooth odometry feedback with Kangaroo x2

Apologies, this is not strictly a ROS-related question. I wanted to know whether the Kangaroo x2 can give us feedback for odometry. I know the Kangaroo can do PID control using the encoders (as far as I understand, with a Kangaroo + Sabertooth, if we ask the motor to turn one unit, it will turn one unit). My question is: can we get real-time feedback from either the Sabertooth or the Kangaroo? Or is the Kangaroo alone only enough for odometry published open-loop (i.e. fake published odometry)?

how to get wheel encoder count from turtlebot's /odom topic

The TurtleBot's odom topic gives:

    header: 
      seq: 406289
      stamp: 
        secs: 4392
        nsecs: 160000000
      frame_id: odom
    child_frame_id: base_footprint
    pose: 
      pose: 
        position: 
          x: 1.56701645246e-05
          y: -9.82132735628e-06
          z: 0.0
        orientation: 
          x: 0.0
          y: 0.0
          z: -0.548275342929
          w: 0.836297882537
      covariance: [0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.05]
    twist: 
      twist: 
        linear: 
          x: -2.67171244095e-06
          y: 0.0
          z: 0.0
        angular: 
          x: 0.0
          y: 0.0
          z: -0.000185729678152
      covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

I'd like to find the wheel encoder count from this data. How does the TurtleBot use wheel encoder values to compute positions? Where can I find the code that does this?
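
The `/odom` message itself only carries the integrated pose and twist, not the raw counts; on a Kobuki-based TurtleBot 2 the raw encoder ticks are available separately from the base driver's sensor state topic (e.g. `/mobile_base/sensors/core`). The integration the driver performs is standard differential-drive dead reckoning; here is a hedged sketch of that arithmetic, with the tick resolution and wheel separation as placeholder values to check against your base's driver:

    import math

    # Placeholder constants; check the base driver for the real values.
    TICKS_PER_METER = 11724.0
    WHEEL_SEPARATION = 0.23   # metres

    def update_pose(x, y, yaw, d_ticks_left, d_ticks_right):
        """One dead-reckoning step from encoder tick deltas to a new (x, y, yaw)."""
        d_left = d_ticks_left / TICKS_PER_METER
        d_right = d_ticks_right / TICKS_PER_METER
        d_center = (d_left + d_right) / 2.0
        d_yaw = (d_right - d_left) / WHEEL_SEPARATION
        x += d_center * math.cos(yaw + d_yaw / 2.0)
        y += d_center * math.sin(yaw + d_yaw / 2.0)
        return x, y, yaw + d_yaw

The actual implementation lives in the base driver rather than in the turtlebot packages themselves; for a TurtleBot 2, look at the diff-drive odometry code in the kobuki driver.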

Ground truth to show robot movement in Rviz

Hello all, I'm trying to figure out what is needed, and what the information path is, so that the robot shown in RViz moves relative to a fixed frame (as opposed to the robot standing still and the world moving around it). The practical case is that I am using PX4's SITL tools to simulate a UAV. It displays fine if I use the UAV's base_link as the fixed frame, but when the UAV moves it only moves in Gazebo; in RViz it is the world around it that moves. It looks like I am missing the transformation between world and base_link, but I don't understand how to provide it. I read about fixed frames in the tutorials, and I saw a lot of code using librotors_gazebo_odometry_plugin. I also saw that the /uav1/mavros/local_position/odom topic is publishing nav_msgs/Odometry with the values I want. But since there is no world frame anywhere (I'm assuming it is a special frame loaded with the world), I don't understand what to route where. I am launching joint_state_publisher, robot_state_publisher and RViz, all from the same robot_description. [URDF file](/upfiles/15156968383508633.gif) Here is my TF tree (I had to cut it in half and remove the arms and most rotors, otherwise it was too big): ![image description](/upfiles/1515696631975623.png)![image description](/upfiles/1515696644213806.png)
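
One hedged way to close this gap is a small node that listens to the Odometry topic and broadcasts the corresponding transform from the odometry frame to the UAV's base_link, so RViz can use that parent frame as the fixed frame. The topic and fallback frame names below mirror the ones mentioned in the question, but they should be checked against `msg.header.frame_id` and `msg.child_frame_id` of the actual messages:

    import rospy
    import tf
    from nav_msgs.msg import Odometry

    def odom_cb(msg):
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        # Broadcast parent -> child exactly as the Odometry message describes the pose;
        # note tf's argument order is (translation, rotation, time, child, parent).
        parent = msg.header.frame_id or 'world'
        child = msg.child_frame_id or 'uav1/base_link'
        br.sendTransform((p.x, p.y, p.z),
                         (q.x, q.y, q.z, q.w),
                         msg.header.stamp,
                         child,
                         parent)

    rospy.init_node('odom_to_tf')
    br = tf.TransformBroadcaster()
    rospy.Subscriber('/uav1/mavros/local_position/odom', Odometry, odom_cb)
    rospy.spin()

mavros itself can also broadcast this transform if the tf options of its local_position plugin are enabled, which may be preferable to a hand-rolled node; check the mavros documentation for the exact parameter names.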

Help needed with TF issues

Hey, I am working on a 4-wheel-drive differential robot. ***I am using rosserial to talk to the Arduino (cmd_vel), and the Arduino publishes back (tf, which contains the odom transform).***

    Timed out waiting for transform from base_link to map to become available before running costmap,
    tf error: Could not find a connection between 'map' and 'base_link' because they are not part of
    the same tree. Tf has two or more unconnected trees. canTransform returned after 0.101146,
    timeout was 0.1.

![image description](/upfiles/1519791791661797.png)

As you can see from the image, the error is `No transform from camera to odom`. All my simulations work properly, but when I use the real robot this error pops up. The possibilities I can think of are:

1. The frames not being time-synchronised: will I have to subscribe to a /clock topic on the Arduino to synchronise the /tf topic?
2. /odom does not exist as a dedicated topic.

Also, can someone explain the differences between:

1. /odometry/filtered ![image description](/upfiles/15197927334620956.png)
2. /velocity_controller/odom ![image description](/upfiles/15197928196303803.png)
3. the /tf topic ![image description](/upfiles/1519792766314356.png) ![image description](/upfiles/15197988812798659.png)

Additional information: below is the Arduino code which communicates with the Kangaroo x2. (The `#include` lines and subscriber template arguments were stripped by the forum formatting; they have been restored here to the headers and types the sketch needs.)

    #include <SoftwareSerial.h>
    #include <Kangaroo.h>
    #include <ros.h>
    #include <ros/time.h>
    #include <geometry_msgs/Twist.h>
    #include <geometry_msgs/TransformStamped.h>
    #include <std_msgs/Empty.h>
    #include <std_msgs/Bool.h>
    #include <tf/tf.h>
    #include <tf/transform_broadcaster.h>

    #define TX_PIN 13 // s1
    #define RX_PIN 12 // s2

    // Arduino TX (pin 2) goes to Kangaroo S1
    // Arduino RX (pin 3) goes to Kangaroo S2
    // Independent mode channels on Kangaroo are, by default, '1' and '2'.
    SoftwareSerial SerialPort(RX_PIN, TX_PIN);
    KangarooSerial K(SerialPort);

    // SETTINGS
    #define Mixed_mode true
    bool position_control = true;

    #ifdef Mixed_mode
    KangarooChannel Drive(K, 'D');
    KangarooChannel Turn(K, 'T');
    #endif
    #ifndef Mixed_mode
    KangarooChannel KL(K, '1');
    KangarooChannel KR(K, '2');
    #endif

    ros::NodeHandle nh;
    long WHEEL_DIST = 2;

    std_msgs::Bool position_control_msg;
    std_msgs::Bool mixed_mode_msg;

    double x = 1.0;
    double y = 0.0;
    double theta = 1.57;

    void init_kangaroo();  // forward declaration

    void cmd_velCallback(const geometry_msgs::Twist& CVel) {
      long vel_x = CVel.linear.x * 200;
      long vel_th = -CVel.angular.z * 200;

      // Integrate the commanded velocity into a (very rough) pose estimate.
      x += cos(theta) * CVel.linear.x * 0.1;
      y += sin(theta) * CVel.linear.x * 0.1;
      theta += CVel.angular.z * 0.1;

      long speed_right = (vel_th * WHEEL_DIST) / 2 + vel_x;
      long speed_left = (vel_x * 2) - speed_right;
      long right_vel = 0.0;
      long left_vel = 0.0;
      // right_vel = vel_th * WheelSeparation / 2.0;
      // left_vel = (-1) * right_vel;
      // left_vel = vel_x - vel_th * WheelSeparation / 2.0;
      // right_vel = vel_x + vel_th * WheelSeparation / 2.0;

      // IF MIXED MODE CONFIG
    #ifdef Mixed_mode
      // stopped
      if (int(vel_x) == 0 && int(vel_th) == 0) {
        if (position_control) { Turn.pi(0); Drive.pi(0); }
        else                  { Turn.s(0);  Drive.s(0);  }
      }
      // forward / backward
      else if (abs(vel_x) > abs(vel_th)) {
        if (position_control) { Drive.pi(vel_x); Turn.pi(vel_th); }
        else                  { Drive.s(vel_x);  Turn.s(vel_th);  }
      }
      // moving in arcs
      else {
        if (position_control) { Drive.pi(vel_x); Turn.pi(vel_th); }
        else                  { Turn.s(vel_th);  Drive.s(vel_x);  }
      }
    #endif

      // IF NOT MIXED MODE CONFIG
    #ifndef Mixed_mode
      if (position_control) { KL.pi(speed_left); KR.pi(speed_right); }
      else                  { KL.s(speed_left);  KR.s(speed_right);  }
    #endif
    }

    void reset_Kangaroo(const std_msgs::Empty& toggle_msg) {
      init_kangaroo();
    }

    void mixed_mode_func(const std_msgs::Bool& mixed_mode_topic) {
      mixed_mode_msg = mixed_mode_topic;
    }

    void position_control_func(const std_msgs::Bool& position_control_topic) {
      position_control_msg = position_control_topic;
    }

    // ROS SUBSCRIBERS
    ros::Subscriber<geometry_msgs::Twist> Sub("cmd_vel", &cmd_velCallback);
    ros::Subscriber<std_msgs::Empty> reset_kangaroo("reset_kangaroo", reset_Kangaroo);
    // These two topics turn position_control or mixed_mode on and off
    ros::Subscriber<std_msgs::Bool> Position_control("position_control", position_control_func);
    ros::Subscriber<std_msgs::Bool> mixed_mode("mixed_mode", mixed_mode_func);

    void init_kangaroo() {
    #ifdef Mixed_mode
      Drive.start(); Turn.start();
      Drive.home();  Turn.home();
      SerialPort.println("D,start");
      SerialPort.println("T,start");
      SerialPort.println("D, units 100 thou = 245 lines");
      SerialPort.println("T, units 100 thou = 245 lines");
    #endif
    #ifndef Mixed_mode
      KL.start(); KR.start();
      SerialPort.println("1,start");
      SerialPort.println("2,start");
      SerialPort.println("1, units 100 thou = 245 lines");
      SerialPort.println("2, units 100 thou = 245 lines");
    #endif
    }

    geometry_msgs::TransformStamped t;
    tf::TransformBroadcaster broadcaster;
    char base_link[] = "/base_link";
    char odom[] = "/odom";

    void odom_publisher() {
      // drive in a circle
      // double dx = 0.2;
      // double dtheta = 0.18;
      // x += cos(theta)*dx*0.1;
      // y += sin(theta)*dx*0.1;
      // theta += dtheta*0.1;
      if (theta > 3.14) theta = -3.14;

      // tf odom -> base_link
      t.header.frame_id = odom;
      t.child_frame_id = base_link;
      t.transform.translation.x = x;
      t.transform.translation.y = y;
      t.transform.rotation = tf::createQuaternionFromYaw(theta);
      t.header.stamp = nh.now();
      broadcaster.sendTransform(t);
    }

    void setup() {
      SerialPort.begin(9600);
      init_kangaroo();
      nh.initNode();
      broadcaster.init(nh);
      nh.subscribe(Sub);
      nh.subscribe(reset_kangaroo);
      nh.subscribe(Position_control);
      nh.subscribe(mixed_mode);
    }

    void loop() {
      odom_publisher();
      nh.spinOnce();
      delay(1);
    }

what is the best way to connect ackermann steering with rviz?

Hello, I am building my own car-like robot. For the drive motor controller I am using the SDC2130 from Roboteq. I found a package on GitHub called [roboteq_diff_driver](https://github.com/ecostech/roboteq_diff_driver) which can control two motors in a differential-drive setup (it subscribes to cmd_vel, publishes odom, and broadcasts the odom tf). I am trying to expand on this to include Ackermann steering.

**My setup:** *Driving*: modify the roboteq_diff_driver package to only listen to the linear velocity, so the robot only goes forward or backward. *Steering*: I am using an Arduino to control some [linear actuators](https://s3.amazonaws.com/actuonix/Actuonix+P16+Datasheet.pdf) that steer the wheels. The Arduino subscribes to cmd_vel (more specifically, the angular-velocity part) and drives the linear actuators to achieve the desired angular velocity based on the Ackermann geometry. So although the Arduino steers the wheels correctly based on cmd_vel, I don't know how to connect this with RViz, since all RViz sees is the odom tf published by the package, which in this case only reflects forward and backward motion. This is a problem because I am not able to use the navigation stack properly.

**My question:** How do I feed the steering information back into RViz (the base_link frame) so that the odometry accounts for steering? Would using an IMU solve this problem? I was thinking: could I use an IMU to obtain the yaw, use that instead of calculating the yaw from differential-drive kinematics as the package does (see the code below), and send this information to base_link? Can someone let me know if I am on the right track, or whether there is an alternative way to combine steering with RViz? Below is the part of the package that handles odom publishing (the end of the function was cut off when posting):

    void MainNode::odom_publish()
    {
      // determine delta time in seconds
      uint32_t nowtime = millis();
      float dt = (float)DELTAT(nowtime, odom_last_time) / 1000.0;
      odom_last_time = nowtime;

    #ifdef _ODOM_DEBUG
    /*
      ROS_DEBUG("right: ");
      ROS_DEBUG(odom_encoder_right);
      ROS_DEBUG(" left: ");
      ROS_DEBUG(odom_encoder_left);
      ROS_DEBUG(" dt: ");
      ROS_DEBUG(dt);
      ROS_DEBUG("");
    */
    #endif

      // determine deltas of distance and angle
      float linear = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference
                    + (float)odom_encoder_left  / (float)encoder_cpr * wheel_circumference) / 2.0;
    //  float angular = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference
    //                 - (float)odom_encoder_left  / (float)encoder_cpr * wheel_circumference) / track_width * -1.0;
      float angular = ((float)odom_encoder_right / (float)encoder_cpr * wheel_circumference
                     - (float)odom_encoder_left  / (float)encoder_cpr * wheel_circumference) / track_width;

    #ifdef _ODOM_DEBUG
    /*
      ROS_DEBUG("linear: ");
      ROS_DEBUG(linear);
      ROS_DEBUG(" angular: ");
      ROS_DEBUG(angular);
      ROS_DEBUG("");
    */
    #endif

      // Update odometry
      odom_x += linear * cos(odom_yaw);          // m
      odom_y += linear * sin(odom_yaw);          // m
      odom_yaw = NORMALIZE(odom_yaw + angular);  // rad

    #ifdef _ODOM_DEBUG
      //ROS_DEBUG_STREAM( "odom x: "
      // (the rest of the function was truncated in the original post)
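
Using the IMU yaw in place of the diff-drive heading estimate is a reasonable direction. As a hedged illustration of that idea only, here is the pose-update arithmetic with the heading taken from an IMU; `imu_yaw` is assumed to be the orientation of a `sensor_msgs/Imu` message already converted to a yaw angle, and `linear` is the encoder-based distance travelled since the last update, as in the function above:

    import math

    def update_pose_with_imu(odom_x, odom_y, prev_yaw, linear, imu_yaw):
        """One odometry step: encoder-based distance, IMU-based heading.

        linear  -- distance travelled since the last update (from the encoders) [m]
        imu_yaw -- absolute heading from the IMU at this update [rad]
        """
        # Advance along the average heading over the interval, handling angle wrap.
        d_yaw = (imu_yaw - prev_yaw + math.pi) % (2 * math.pi) - math.pi
        mid_yaw = prev_yaw + 0.5 * d_yaw
        odom_x += linear * math.cos(mid_yaw)
        odom_y += linear * math.sin(mid_yaw)
        return odom_x, odom_y, imu_yaw

An alternative that avoids modifying the driver at all is to leave it publishing wheel odometry as-is and fuse that odometry with the IMU in robot_localization's ekf_localization_node, which then publishes the corrected odom-to-base_link transform that RViz and the navigation stack consume.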

Integrating Autonomous Navigation with SLAM

I'm a beginner in ROS and I'm trying to develop a robot that can navigate autonomously using 3D lidar data and IMU data. Using individual motor speeds as feedback for navigation isn't an option because the robot is going to drive on sand and will slip, so using IMU data is the only sure way to tell that it has reached its goal pose. Here is my scenario: I have an Arduino that turns cmd_vel (subscribed) data into PWM values for the motors. I have a 3D lidar with an attached IMU to help me with mapping, and I'm using an open-source package like LOAM or Google Cartographer for the mapping part. Now my question is: how do I integrate all of this so that data from the map produces odometry values which can be turned into cmd_vel commands and published to my Arduino for motor control? (I'm not sure I have the scenario exactly right, but you get the idea.) From all the reading I did online, people give vague pointers to hector_slam, gmapping, hector_exploration_planner, rosserial, etc. I haven't found a single source that walks me through the steps, such as: 1. Which topic carries what kind of data, and which node should publish/subscribe to which topic. 2. How I can overcome not having motor encoder values by using the IMU instead, etc. Any guidance or pointers to clear tutorials, open-source GitHub packages, videos, etc. will be greatly appreciated. Thanks in advance. :)