Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Odom Axes not in line with base_link

Hi, I am using a Powerbot and want to build a map with the gmapping algorithm. To set up my robot, I am using the ROSARIA package to control the motors, get pose estimates from odometry, etc. It is a ROS wrapper for the ARIA library provided by ActivMedia MobileRobots. I am aware that I need some tf configuration in order to align the odometry frame with base_link and to align the laser frame with the base_link frame. I have followed the tutorials and I have understood the concept. I came across [this tutorial](http://www.ist.tugraz.at/robotics/bin/view/Main/Poineer_mapping) and followed it to create my transforms while using ROSARIA. However, in doing so, I have noticed that the odometry axes are not aligned with the base_link axes. The laser axes are aligned perfectly, as can be seen in [this screenshot](/upfiles/14238183324878828.png); the odometry axes are the ones far off from the other two. I am aware that ROSARIA publishes its own tf, as can be seen from [rosgraph.png](/upfiles/14238185341661154.png). The current transform tree according to my code is [this](/upfiles/14238187038679438.png). The code that I am using to build the transformations is the following:

```cpp
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>

void poseCallback(const nav_msgs::Odometry::ConstPtr& odomsg)
{
  // TF odom => base_link
  static tf::TransformBroadcaster odom_broadcaster;
  odom_broadcaster.sendTransform(
      tf::StampedTransform(
          tf::Transform(tf::Quaternion(odomsg->pose.pose.orientation.x,
                                       odomsg->pose.pose.orientation.y,
                                       odomsg->pose.pose.orientation.z,
                                       odomsg->pose.pose.orientation.w),
                        tf::Vector3(odomsg->pose.pose.position.x / 1000.0,
                                    odomsg->pose.pose.position.y / 1000.0,
                                    odomsg->pose.pose.position.z / 1000.0)),
          odomsg->header.stamp, "/odom", "/base_link"));
  ROS_DEBUG("odometry frame sent");
  ROS_INFO("odometry frame sent: y=[%f]", odomsg->pose.pose.position.y / 1000.0);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "pioneer_tf");
  ros::NodeHandle n;
  ros::Rate r(100);
  tf::TransformBroadcaster broadcaster;

  // subscribe to pose info
  ros::Subscriber pose_sub = n.subscribe("RosAria/pose", 100, poseCallback);

  while (n.ok()) {
    // base_link => laser
    broadcaster.sendTransform(
        tf::StampedTransform(
            tf::Transform(tf::Quaternion(0, 0, 0),
                          tf::Vector3(/*0.034, 0.0, 0.250*/ 0.385, 0, 0.17)),
            ros::Time::now(), "/base_link", "/laser"));
    ros::spinOnce();
    r.sleep();
  }
}
```

Of course, the maps created with this setup are a mess. This happens after I move the robot with the joystick for a bit: initially, on starting up ROSARIA and the transform node, the axes are aligned; it is only after the robot moves that they lose their alignment. Can someone help me understand what is wrong with my transform tree and how I can fix it? Thanks!

EDIT: [This](/upfiles/1423820617525082.png) is a typical example of what happens to my robot's odometry when I drive it around a little bit. Basically, it moved forwards, then turned around a desk and then moved forwards some more. I do not expect such bad odometry.
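For reference, here is a minimal debugging sketch (not part of the original question) that prints the pose reported on `RosAria/pose` next to what tf currently reports for odom -> base_link. If a second node is also broadcasting that transform, the two values will disagree as the robot drives. The topic and frame names are taken from the code above; everything else is an assumption.

```cpp
// Debugging sketch only: compare the pose on RosAria/pose with what tf says
// for odom -> base_link. Two broadcasters fighting over the same transform
// show up as two disagreeing values here.
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <nav_msgs/Odometry.h>

tf::TransformListener* g_listener = NULL;

void poseCallback(const nav_msgs::Odometry::ConstPtr& msg) {
  tf::StampedTransform t;
  try {
    g_listener->lookupTransform("odom", "base_link", ros::Time(0), t);
    ROS_INFO("topic: x=%.3f y=%.3f | tf: x=%.3f y=%.3f",
             msg->pose.pose.position.x, msg->pose.pose.position.y,
             t.getOrigin().x(), t.getOrigin().y());
  } catch (tf::TransformException& ex) {
    ROS_WARN("%s", ex.what());
  }
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "odom_tf_compare");
  ros::NodeHandle n;
  tf::TransformListener listener;
  g_listener = &listener;
  ros::Subscriber sub = n.subscribe("RosAria/pose", 10, poseCallback);
  ros::spin();
  return 0;
}
```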

transform base_link to base_laser, map, odom

I know the meaning of the transform from base_link to base_laser. But what is the meaning of the transform from base_link to map? And what is the meaning of the transform from base_link to odom? I need to understand them in order to set the correct values in my navigation program. Thank you~
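For reference, a small sketch (not part of the original question) that simply prints both transforms so their behaviour can be compared while the robot drives: map -> base_link is the full localized pose (it can jump when the localizer corrects itself), while odom -> base_link is the continuous but slowly drifting dead-reckoning pose. The frame names are the conventional REP-105 ones and are assumed here.

```cpp
// Illustrative sketch: print map -> base_link and odom -> base_link so the
// difference between the two parent frames can be observed at runtime.
#include <ros/ros.h>
#include <tf/transform_listener.h>

static void printTransform(const tf::TransformListener& listener,
                           const std::string& target, const std::string& source) {
  tf::StampedTransform t;
  try {
    listener.lookupTransform(target, source, ros::Time(0), t);
    ROS_INFO("%s -> %s: x=%.2f y=%.2f yaw=%.2f", target.c_str(), source.c_str(),
             t.getOrigin().x(), t.getOrigin().y(), tf::getYaw(t.getRotation()));
  } catch (tf::TransformException& ex) {
    ROS_WARN("%s", ex.what());
  }
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "frame_inspector");
  ros::NodeHandle n;
  tf::TransformListener listener;
  ros::Rate rate(1.0);
  while (n.ok()) {
    printTransform(listener, "map", "base_link");   // full localized pose
    printTransform(listener, "odom", "base_link");  // drifting, but continuous
    rate.sleep();
  }
  return 0;
}
```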

Robot model in rviz is not following base link

Hi, I am using Gazebo with a simulated model that has GPS, IMU and odometry from the diff drive, together with the navsat and ukf nodes. The estimation is not correct yet, but I wonder about something else: my robot model does not move in RViz. In RViz I have set the fixed frame to map. The tfs are generated and I see them in RViz; the tf links move around and the wheel tfs also rotate while doing so, but the rendered robot model stays at its initial position. It is like that now, but before, I could uncheck and re-check the checkbox of the RobotModel display and the robot would jump to base_link and stick at that position until I unchecked and checked it again. That does not work anymore either. The model is only in the right place when I set the fixed frame to base_link, so the robot model always follows the fixed frame. But shouldn't it follow the "Robot Root Link" (set to base_link)? Does anybody have an idea what I am doing wrong? Thanks in advance for any help.

Base_link frame is not connected to odom frame

Hello, I'm using Ubuntu 14.04 and ROS Indigo. I used the following block of code to publish the transformation between the base_link frame and the odom frame:

```cpp
geometry_msgs::TransformStamped odom_trans;
odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_link";
odom_trans.transform.translation.x = x;
odom_trans.transform.translation.y = y;
odom_trans.transform.translation.z = 0.0;
odom_trans.transform.rotation = odom_quat;

// send the transform
odom_broadcaster.sendTransform(odom_trans);
```

I also used a launch file to publish the other frames (the XML snippet did not survive posting). When the robot starts moving I run rviz and it shows the odom frame moving, but base_link and camera_link stay fixed at their initial positions. I think I missed something, but I don't know what it is.

UPDATE:

1) `current_time` is set inside the `while(nodehandle.ok())` loop as `current_time = ros::Time::now();`

2) `odom_broadcaster` is created outside the loop, at the beginning of main, as `tf::TransformBroadcaster odom_broadcaster;`

3) This is running live.

4) When I run `rosrun tf tf_echo odom base_link` it shows values like:

```
At time 1428335473.731
- Translation: [-0.338, 1.148, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.295, 0.955]
            in RPY [0.000, -0.000, 0.600]
At time 1428335474.787
- Translation: [-1.613, 3.059, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.223, 0.975]
            in RPY [0.000, -0.000, 0.449]
```

5) This is my tf tree:

![tf tree](/upfiles/1428336712182709.jpg)

I also have a static transform for the other frames in the launch file (that snippet was stripped when posting as well).
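For comparison, a minimal self-contained sketch of the structure described in the updates above: the broadcaster is created once in main and `current_time` is stamped inside the `while(nodehandle.ok())` loop. The pose variables x, y, th are placeholders for the real odometry computation, not part of the original post.

```cpp
// Minimal sketch of the odom -> base_link broadcaster structure described
// above. The pose values are placeholders; replace them with real odometry.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "odom_tf_broadcaster");
  ros::NodeHandle nodehandle;
  tf::TransformBroadcaster odom_broadcaster;   // created outside the loop
  double x = 0.0, y = 0.0, th = 0.0;           // placeholder pose state
  ros::Rate r(10.0);
  while (nodehandle.ok()) {
    ros::Time current_time = ros::Time::now(); // stamped inside the loop
    // ... update x, y, th from wheel encoders here ...
    geometry_msgs::TransformStamped odom_trans;
    odom_trans.header.stamp = current_time;
    odom_trans.header.frame_id = "odom";
    odom_trans.child_frame_id = "base_link";
    odom_trans.transform.translation.x = x;
    odom_trans.transform.translation.y = y;
    odom_trans.transform.translation.z = 0.0;
    odom_trans.transform.rotation = tf::createQuaternionMsgFromYaw(th);
    odom_broadcaster.sendTransform(odom_trans);
    ros::spinOnce();
    r.sleep();
  }
  return 0;
}
```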

urdf in rviz is too deep

Hi, I set up my robot system and it is working well. I am using hector_mapping with a URDF and a laser scanner, and I am getting the following result:

![image description](/upfiles/1430385843616846.png)

I don't know why exactly the robot seems to be swimming in the world. Here is my tf tree:

![image description](/upfiles/14303859142285894.png)

Another question: how can I make the robot move autonomously, and how can I integrate a path planner? What is a commonly used module for making the robot move while avoiding obstacles and crashes?

thanks in advance, felix

robot_base_frame parameter in move_base/amcl

Hi all, I have a mobile robot with an IMU, wheel encoders and a Hokuyo laser. I am following [this](http://answers.ros.org/question/11682/robot_pose_ekf-with-an-external-sensor/) to set up the transformations and all the frames. I am using robot_pose_ekf to fuse data from the IMU and the wheel encoders and get the `odom_combined -> base_footprint` transformation. As suggested in the above link, in the wheel odometry `odom_combined` is the header frame_id and `base_footprint` is the child_frame_id, and I have a fixed transformation from `base_footprint -> base_link`. Now my question is: what should the robot_base_frame parameter be in `global_costmap_params.yaml` and `local_costmap_params.yaml` in move_base? Should it be base_link (the default) or base_footprint, and why? Also, what should the `base_frame_id` parameter in AMCL be, base_link or base_footprint? Thanks in advance. Naman Kumar

Calibrating Laser Scanner Transform

Hey, I work with a robot that is equipped with a 2D laser scanner, and it is actually pretty hard to measure the distance from the center of the robot to the center of the laser scanner accurately by hand. So I think my static transform between the base_link frame and the laser scanner frame is not precise enough, which leads to small errors when the robot rotates. Is there any way to determine the exact position of the laser scanner using the laser scan data? Thanks in advance :)
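This is not a calibration routine, just a convenience sketch (not from the original question): it reads a candidate base_link -> laser offset from private parameters and broadcasts it, so different values can be tried, for example while watching the scan alignment in RViz during an in-place rotation, without recompiling. The parameter names and defaults are made up for the example.

```cpp
// Convenience sketch: broadcast base_link -> laser with an offset taken from
// parameters, so candidate calibration values can be tuned at launch time.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "laser_tf_tuner");
  ros::NodeHandle pnh("~");
  double x, y, yaw;
  pnh.param("laser_x", x, 0.20);    // made-up defaults, in meters / radians
  pnh.param("laser_y", y, 0.00);
  pnh.param("laser_yaw", yaw, 0.00);
  tf::TransformBroadcaster broadcaster;
  ros::Rate r(50);
  while (ros::ok()) {
    tf::Transform t(tf::createQuaternionFromYaw(yaw), tf::Vector3(x, y, 0.0));
    broadcaster.sendTransform(tf::StampedTransform(
        t, ros::Time::now(), "base_link", "laser"));
    r.sleep();
  }
  return 0;
}
```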

hector slam problem

I want to create a map using only my laser scanner, a SICK TIM551. So I started my laser scanner and hector_slam with `roslaunch hector_mapping sick_mapping.launch` and noticed this warning:

```
lookupTransform base_link to laser timed out. Could not transform laser scan into base_frame.
```

My launch file was included here, but the XML did not survive posting. Here is a picture of my frames: http://www.pic-upload.de/view-27390402/Screenshot-from-2015-06-17-04--45--12.png.html

Can anyone help me?

Need help with hector slam tf

Hi all. I have successfully set up my Hokuyo UST-20LX to do laser scanning, and I'm now trying to use the hector_slam package. However, I have problems regarding the tf. Using `rosrun tf tf_monitor`:

```
Frames:
Frame: /base_laser published by /robot_tf_publisher Average Delay: 0.000538933 Max Delay: 0.00112979
Frame: /base_link published by /odom_to_base_link Average Delay: -0.0995477 Max Delay: 0
Frame: /laser published by /US6 Average Delay: -0.0995496 Max Delay: 0

All Broadcasters:
Node: /US6 9.99601 Hz, Average Delay: -0.0995496 Max Delay: 0
Node: /odom_to_base_link 9.99599 Hz, Average Delay: -0.0995477 Max Delay: 0
Node: /robot_tf_publisher 100.099 Hz, Average Delay: 0.000538933 Max Delay: 0.00112979
```

I tried to run the hector_geotiff node, but got this warning:

```
[ WARN] [1435203912.226730957]: No transform between frames /map and /base_link available after 20.001726 seconds of waiting. This warning only prints once.
```

That means I haven't connected up /map and /base_link, and I'm not sure how to do that. I tried looking for answers but to no avail. Also, I ran mapping_default.launch from the hector_mapping package and got this:

```
[ INFO] [1435206678.813741923]: lookupTransform base_footprint to /laser timed out. Could not transform laser scan into base_frame.
```

My task is to map out an area using hector slam, like this one here: https://www.youtube.com/watch?v=sieBqVxTz2c

Does anyone know which tf frames I should connect to achieve the final result? Thanks a million!

specifying base_link frame for the Transformation tree

Hi all, The base_link frame of my robot is 16 cm above the ground (the base_footprint frame) and the laser frame is 8 cm above the base_link frame, and I have specified all the transformations with respect to the base_link frame. The origin is set to the map frame. Here is my TF tree:

![image description](/upfiles/14352528307805993.png)

Now my doubt is that when I open RViz and look at the robot footprint with all the frames, the base_link frame is on the ground and base_footprint is 16 cm below it, see the image:

![image description](/upfiles/14352536494293107.png)

This causes the laser to be 8 cm above the ground instead of 24 cm (16 cm + 8 cm) in RViz. Shouldn't base_footprint be on the ground and the base_link frame 16 cm above it? Can anyone tell me what is wrong here and how I can correct it so that base_footprint is on the ground and the base_link frame is 16 cm above it, or am I misinterpreting something?

**Update:** The odom_combined frame is also below the ground by 16 cm and is aligned with the base_footprint frame to start with:

![image description](/upfiles/14353401487052631.png)

**Update 2:** The output of `rosrun tf tf_echo map odom_combined` is:

```
At time 1435579139.606
- Translation: [2.414, 1.156, -0.160]
- Rotation: in Quaternion [-0.003, 0.015, 0.680, 0.733]
            in RPY [0.016, 0.025, 1.496]
At time 1435579140.606
- Translation: [2.414, 1.156, -0.160]
- Rotation: in Quaternion [-0.003, 0.015, 0.680, 0.733]
            in RPY [0.016, 0.025, 1.496]
```

Thanks in advance. Naman Kumar

offset between base_link and base_footprint

Hello friends. I am using laser_scan_matcher and robot_pose_ekf and everything works, but there is always an offset between base_link and base_footprint in the RViz display. Do you think this is normal, or is there a problem? This is my RViz display: http://i.imgur.com/SoIrtIu.png and these are my tf trees: http://i.imgur.com/Pknej0f.png

doubt regarding cmd_vel published by move_base

Hi all, **(See Update 2)** I am using [move_base](http://wiki.ros.org/move_base?distro=jade) and everything seems to be working fine, except that sometimes `move_base` publishes erratic velocities on the `cmd_vel` topic. For example: if `max_vel_x: 0.4 (or 0.3)` is specified in `base_local_planner_params.yaml`, `move_base` publishes correct velocities. But if `max_vel_x: 0.5`, then the velocity published by `move_base` keeps switching between 0.5 and 0.3 for a straight-line motion of the robot, which results in a jerky motion. Similarly, if `max_vel_x: 0.6`, the same thing happens and the velocity switches between 0.6 and 0.35. There are also a lot of jerks in the angular velocities. The `base_local_planner` is not able to send correct and smooth velocities. I have tried changing `base_local_planner_params.yaml` and it didn't help; the robot still jerks a lot. Does anyone have any idea why this is happening? Could it be a problem with the transformation or with the way the `robot_footprint` is specified? I am using rotary encoders to get the odometry message and the `odom->base_link` tf. I am using [amcl](http://wiki.ros.org/amcl?distro=indigo) for localization and [move_base](http://wiki.ros.org/move_base?distro=indigo) for planning, with the default local planner and `sbpl_lattice_planner` as the global planner.

**base_local_planner_params.yaml**

```yaml
TrajectoryPlannerROS:
  # Set the acceleration limits of the robot
  acc_lim_th: 0.7
  acc_lim_x: 0.5
  acc_lim_y: 0

  # Set the velocity limits of the robot
  max_vel_x: 0.40
  min_vel_x: 0.10
  max_vel_theta: 1.0
  min_vel_theta: -1.0
  max_rotational_vel: 1.0
  min_in_place_rotational_vel: 0.8
  min_in_place_vel_theta: 0.8

  # The velocity the robot will command when trying to escape from a stuck situation
  escape_vel: -0.2

  # For this example, we'll use a holonomic robot
  holonomic_robot: false

  # Set the tolerance on achieving a goal
  xy_goal_tolerance: 0.20
  yaw_goal_tolerance: 0.15
  latch_xy_goal_tolerance: false

  # We'll configure how long and with what granularity we'll forward simulate trajectories
  sim_time: 1.5
  sim_granularity: 0.025
  angular_sim_granularity: 0.025
  vx_samples: 3
  vtheta_samples: 20
  controller_frequency: 10.0

  # Parameters for scoring trajectories
  goal_distance_bias: 0.8
  path_distance_bias: 1.0
  occdist_scale: 0.01
  heading_lookahead: 0.325

  # We'll use the Dynamic Window Approach to control instead of Trajectory Rollout for this example
  dwa: false

  # How far the robot must travel before oscillation flags are reset
  oscillation_reset_dist: 0.05

  # Eat up the plan as the robot moves along it
  prune_plan: false

  # Global frame id
  global_frame_id: odom_combined
```

The **footprint of the robot** is:

```yaml
# The footprint of the robot and associated padding
footprint: [[-0.08, -0.32], [-0.08, 0.32], [0.74, 0.32], [0.74, -0.32]]
footprint_padding: 0.01
```

**Update 1:** It turns out the problem is with the `robot_footprint`. If I change the `robot_footprint` so that `base_link` is at the center of the robot, by making `footprint: [[-0.41, -0.32], [-0.41, 0.32], [0.41, 0.32], [0.41, -0.32]]` and changing the transformations accordingly, it seems to work now. But in the original `robot_footprint`, `base_link` was at the center of rotation of the robot, whereas in this case `base_link` is at the geometric center of the robot (not the center of rotation), and yet this works better. I am slightly confused now. Does anyone have any idea why this is happening?

**Update 2:** As suggested by @nyquist09, increasing `vx_samples` helped. Also, changing `sim_time: 1.5` to `sim_time: 1.0` helped a lot and improved the results. But I am not sure why reducing `sim_time` helped that much. Does anyone know why that is? Thanks in advance. Naman Kumar
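For observing the behaviour described above, here is a trivial sketch (not from the original question) that logs the commanded velocities so the switching between, say, 0.5 and 0.3 m/s can be captured over time. The topic name `cmd_vel` is assumed to match the move_base remapping.

```cpp
// Observation sketch only: log linear and angular commands from move_base.
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

void cmdVelCallback(const geometry_msgs::Twist::ConstPtr& cmd) {
  ROS_INFO("cmd_vel: linear.x=%.3f angular.z=%.3f", cmd->linear.x, cmd->angular.z);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "cmd_vel_logger");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("cmd_vel", 50, cmdVelCallback);
  ros::spin();
  return 0;
}
```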

Waiting on transform from base_link to map to become available before running costmap

Hi, I am doing a project using a Kinect and ROS for obstacle avoidance. I am new to ROS, so I would be glad if anyone could explain things to me in detail. I have a launch file to bring up all the sensors as well as to run the navigation stack with amcl (the launch file XML did not survive posting). I am following this repo: https://github.com/danimtb/pioneer3at_ETSIDI

As you can see from my TF tree, the depthimage_to_laserscan node and map_server are not running. I am not sure why, because my launch file does launch depthimage_to_laserscan. I am not sure what other launch files I didn't run. I also get this error message:

```
Waiting on transform from base_link to map to become available before running costmap, tf error:
```
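A quick check sketch (not from the original question), assuming amcl is the node that should provide map -> odom: it waits for the map -> base_link transform the costmap needs and reports whether it ever becomes available. Frame names follow the error message above.

```cpp
// Check sketch: report whether map -> base_link ever becomes available.
#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "map_tf_check");
  ros::NodeHandle n;
  tf::TransformListener listener;
  while (n.ok()) {
    if (listener.waitForTransform("map", "base_link", ros::Time(0),
                                  ros::Duration(5.0))) {
      ROS_INFO("map -> base_link is available");
      ros::Duration(1.0).sleep();
    } else {
      ROS_WARN("still waiting for map -> base_link "
               "(is amcl running and publishing map -> odom?)");
    }
  }
  return 0;
}
```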

Connect two Youbots with a wire in gazebo

I'm simulating a wire using multiple joints linked together. The two Youbots each have their own separate URDF file. So far I have been able to link 10 joints to the front of one Youbot, but I cannot link the last joint to the rear of the other Youbot. How do I make the last joint the parent link of the base of the other Youbot? Any ideas?

Tf origin is away from the base_link

I have built my differential drive mobile robot in SolidWorks and converted it to a URDF file using the SolidWorks-to-URDF converter. I successfully launched the robot and simulated it with a tele-operation node. Since I intend to use the navigation stack, I viewed the transforms of the robot in RViz, which resulted in the following:

![image description](/upfiles/14522727952715033.png)

As you can see, the base plate is the link which supports the wheels and castors, but the tf of the base plate is shown away from the actual link, and even odom is away from the model. Where have I gone wrong and how do I fix this? The URDF of the model was included below, but the XML did not survive posting; it defines the links Wheel_L, Wheel_R, Base_plate, Castor_F and Castor_R.

Error in amcl node while configuring navigation stack simulation

I have configured the navigation stack for a mobile robot model. When I tried to launch my launch file, I got the following error. I am sure the error is with the amcl node, because I checked the other nodes and the launch file works fine without the amcl node.

Error:

```
[ERROR] [1454003234.548580927, 485.467000000]: Couldn't transform from hokuyo_link to base_link, even though the message notifier is in use
```

I have verified the rqt_graph, and here it is for reference:

![image description](/upfiles/14540035042708535.png)

Moving laser frame causing problem

I am using a LIDAR-Lite v2 with gmapping. The lidar is rotated by a motor. To incorporate the moving lidar, I publish the transform from the lidar frame (base_laser) to base_link using a variable angle that is sent by the lidar motor. This works perfectly fine, and I can see the lidar frame constantly rotating in RViz. But because of this, the transform from map to odom (or base_link) keeps changing constantly, which causes the map frame to move around in RViz while all the other frames are stationary (except base_laser). When I use a non-varying/constant transform from base_laser to base_link (as if the lidar were stationary), it works fine and the map frame does not float in RViz (in other words, the transform from map to odom/base_link does not change). Any ideas on how to resolve this? Thanks.
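For reference, a sketch of the rotating-laser broadcast described above. The motor angle is assumed to arrive as a `std_msgs/Float64` on a topic called `lidar_motor_angle` (both the topic name and the message type are assumptions, not from the question), and only the base_link -> base_laser transform is re-broadcast from it.

```cpp
// Sketch: broadcast base_link -> base_laser from the current motor angle.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <std_msgs/Float64.h>

tf::TransformBroadcaster* g_broadcaster = NULL;

void angleCallback(const std_msgs::Float64::ConstPtr& angle) {
  // Rotation about the z-axis of base_link by the current motor angle (rad).
  tf::Transform t(tf::createQuaternionFromYaw(angle->data),
                  tf::Vector3(0.0, 0.0, 0.2));  // made-up mounting offset
  g_broadcaster->sendTransform(tf::StampedTransform(
      t, ros::Time::now(), "base_link", "base_laser"));
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "rotating_laser_tf");
  ros::NodeHandle n;
  tf::TransformBroadcaster broadcaster;
  g_broadcaster = &broadcaster;
  ros::Subscriber sub = n.subscribe("lidar_motor_angle", 50, angleCallback);
  ros::spin();
  return 0;
}
```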

more than one odom->base_link transform

Hi, I have two nodes from third parties, and both of them publish the odom->base_link transform. I want to combine these two transforms and publish my own odom->base_link transform, since our navigation node needs this transform. If I can't modify the source code of any of the nodes except my own fusion node, how can I make the whole system work in ROS? I mean, how do I combine the two input odometries and publish my own odom->base_link transform for the navigation node? I noticed there was a discussion about a similar topic, but it didn't provide an answer: http://answers.ros.org/question/198041/more-than-one-odom-base_link-transform/ Thanks a lot. Yanming
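One possible structure, heavily simplified: if each third-party node can be configured or remapped to broadcast its estimate under its own child frame (here the made-up names `base_link_src1` and `base_link_src2`), a small node can look both up, combine them, and broadcast the single odom -> base_link transform the navigation node expects. The fixed-weight average below is purely illustrative and not a proper fusion; the robot_localization package is the usual tool for actually fusing several odometry sources.

```cpp
// Illustrative sketch only: combine two odom -> base_link estimates published
// under distinct child frames, and broadcast one odom -> base_link transform.
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "odom_fusion_tf");
  ros::NodeHandle n;
  tf::TransformListener listener;
  tf::TransformBroadcaster broadcaster;
  ros::Rate r(20);
  while (n.ok()) {
    tf::StampedTransform t1, t2;
    try {
      listener.lookupTransform("odom", "base_link_src1", ros::Time(0), t1);
      listener.lookupTransform("odom", "base_link_src2", ros::Time(0), t2);
    } catch (tf::TransformException& ex) {
      ROS_WARN_THROTTLE(5.0, "%s", ex.what());
      r.sleep();
      continue;
    }
    const double w = 0.5;  // fixed weight, purely illustrative
    tf::Vector3 p = t1.getOrigin() * w + t2.getOrigin() * (1.0 - w);
    // Naive yaw average (ignores angle wrap-around); illustration only.
    double yaw = w * tf::getYaw(t1.getRotation()) +
                 (1.0 - w) * tf::getYaw(t2.getRotation());
    tf::Transform fused(tf::createQuaternionFromYaw(yaw), p);
    broadcaster.sendTransform(tf::StampedTransform(
        fused, ros::Time::now(), "odom", "base_link"));
    r.sleep();
  }
  return 0;
}
```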

Confused about coordinate frames. Can someone please explain?

I've read:

- http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom
- http://www.ros.org/reps/rep-0105.html
- http://ros.org/wiki/tf
- http://wiki.ros.org/tf/Design

However I'm still quite confused about the different coordinate frames map, odom, and base_link. Below I'll explain my current understanding of these frames, along with some questions.

**map**

- I think this one makes the most sense conceptually.
- I consider this the "ground truth", which is the real world.
- REP-0105 calls this a "world fixed frame", which makes sense, because it's tied to the real world.

**odom**

- REP-0105 calls this a "world fixed frame". What? How is this fixed? Isn't this supposed to represent the movement of a mobile robot?
- This frame is computed from an odometry source like wheels or an IMU. Makes sense.

**base_link**

- Why is base_link a separate frame? Shouldn't odom and base_link be the same thing? Doesn't odom represent the robot base and its movement?

I feel like I partially understand frames and tf. For example, if you have sonar sensors mounted on a robot, you may want to know what the sensor value is in the base_link frame instead of the sonar sensor frame. I get that. But the odom and base_link frames confuse me. Can someone please explain what the difference is?
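A small illustration of why the frames are kept separate: map -> base_link is just the composition of map -> odom (the localization correction, which can jump) and odom -> base_link (the smooth but drifting dead reckoning). The sketch below, using the conventional REP-105 frame names as an assumption, looks up both factors and checks that their product matches the directly queried transform.

```cpp
// Illustration: map -> base_link == (map -> odom) * (odom -> base_link).
#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "frame_chain_demo");
  ros::NodeHandle n;
  tf::TransformListener listener;
  ros::Rate rate(1.0);
  while (n.ok()) {
    try {
      tf::StampedTransform map_to_odom, odom_to_base, map_to_base;
      listener.lookupTransform("map", "odom", ros::Time(0), map_to_odom);
      listener.lookupTransform("odom", "base_link", ros::Time(0), odom_to_base);
      listener.lookupTransform("map", "base_link", ros::Time(0), map_to_base);
      tf::Transform composed = map_to_odom * odom_to_base;
      ROS_INFO("composed x=%.3f, direct x=%.3f (they should match)",
               composed.getOrigin().x(), map_to_base.getOrigin().x());
    } catch (tf::TransformException& ex) {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}
```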

error while building 2d map

Hi, I am trying to build a 2D map. I get just part of the map and then I get an axis error in RViz that says:

```
Transform For frame [base_link]: No transform to fixed frame [map]. TF error: [Could not find a connection between 'map' and 'base_link' because they are not part of the same tree. Tf has two or more unconnected trees.]
```

The odometry data is being published and the Kinect works fine; I don't know where the problem is. I will include all the information that might help. I am using Indigo. First I use a launch file (the XML did not survive posting) and it should run fine; the base controller is running without any problem and publishes the odom data. Here are the tf tree and RViz:

[Screenshot from 2016-06-18 19:51:20.png](/upfiles/14662687908962767.png)
[Screenshot from 2016-06-18 19:52:30.png](/upfiles/1466268808500082.png)

I hope this can help.

Note: you can find base_controller in this package: https://github.com/sungjik/my_personal_robotic_companion. I use it as a reference with some differences, but the tf publishing is the same. I launch the robot config and slam.launch from this package. I hope this information helps.