  * Press the green button with "+" several times until FRI switches to joint impedance control. During this the arm moves to the starting position and then to the last commanded position. Beware!!!!
  
==== Starting the vfclik system for the real robot (AAMAS 2019 experiments). Only runs on the right arm/hand ====
  
  
  torque_sim -b /arcosbot-real -r -a robot_descriptions/arcosbot/kinematics/lwr/ -c robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -f robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py
  
  * Select the fingers to calibrate in the finger variable of the hand_calibration.py file.
  * Prepare a small "weight object" with a previously measured weight for calibration purposes. We recommend a small bottle filled with water; remember that the fingers can't lift anything too heavy, so use 200 grams.
  * Run the hand_calibration.py script:
  
  ./hand_calibration.py -b /arcosbot-real -c ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -f ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py -a ~/local/src/robot_descriptions/arcosbot/kinematics/lwr/
  
  * Follow the instructions of the script. Be sure to disconnect or "lift" the weight attached to the finger while the script is zeroing the sensor torques, and be sure to attach the bottle to the corresponding finger when indicated.
  * When the script is "looping", stop it with Ctrl-C!
  * Write down the torque and angle calibration values.
  * Run sahand_yarp_sim and torque_sim again. This applies the new calibration values to the estimated forces.
  * Remember to flex the finger joints to a prudent value to avoid a singularity and get better force estimations.
  * Remember to zero the finger torques every time you reorient the hand. The finger impedance control has no gravity compensation, so the finger weight greatly affects the measurement.
  * You can run hand_calibration.py again, with the fingers flexed and the torques zeroed, to test the new calibration values (see the sketch below).
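
To illustrate what the annotated values are for, here is a minimal sketch of a linear torque calibration (offset plus scale) derived from the zeroed reading and the known 200 g weight. This is not the actual hand_calibration.py logic; the variable names and the lever-arm value are assumptions for illustration only.

<code python>
# Illustrative sketch only; hand_calibration.py implements its own procedure.
# Assumes a linear sensor model: torque = scale * (raw - offset).

G = 9.81          # gravity [m/s^2]
MASS = 0.200      # previously measured weight [kg]
LEVER_ARM = 0.05  # joint-axis-to-attachment distance [m]; measure your own

raw_zero = 0.013    # raw reading with the weight lifted/disconnected (zeroing)
raw_loaded = 0.151  # raw reading with the bottle hanging from the finger

expected_torque = MASS * G * LEVER_ARM           # torque the weight produces [Nm]
offset = raw_zero
scale = expected_torque / (raw_loaded - offset)

def calibrated_torque(raw):
    """Map a raw sensor value to an estimated torque in Nm."""
    return scale * (raw - offset)

print(calibrated_torque(raw_loaded))  # equals expected_torque by construction
</code>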
  
  
=== Camera calibration ===

The usual steps for setting up a camera system with marker detection are:

  * Install the ROS modules (camera module, rectification module, marker detection module).
  * Calibrate the camera intrinsics.
  * Measure the arm_base-to-camera_base transform. Since this is done entirely by hand, it is usually quite inaccurate.
  * Run an arm_base-to-camera_base calibration to obtain a corrected arm_base-to-camera_base transform.
  * Set up a launch file that runs the camera module together with a rectification module and a static transform representing the arm_base-to-camera_base transform.
  * Use a ROS-to-YARP bridge to send the marker poses to YARP.

== Install ROS1 modules ==

  * We use a RealSense sensor. Ours has a full-HD RGB camera, which is the stream we use.
  * We detected a problem with the brightness adjustment of the camera: the brightness oscillates erratically. We work around this by stopping the ROS realsense module, running realsense-viewer, deactivating all the auto-exposure and auto-brightness features, stopping realsense-viewer, and then starting the ROS realsense module again.
  * We have a launch file for our RealSense camera in the following repository:

  git clone git@git.arcoslab.org:humanoid-software/oms-cylinder.git

  * You can run this configuration with:

  roslaunch oms_launcher rs_camera.launch

  * Check this launch file for relevant camera settings.
  * We use ROS1 with the "ar_pose" module to detect objects' positions and orientations. Follow the typical ROS1 procedures to install this module (it works on Melodic and Ubuntu Bionic). A quick way to check its output is sketched below.
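
To quickly verify that ar_pose is detecting markers, a minimal rospy subscriber can print the detected poses. This is a sketch under assumptions: the /ar_pose_marker topic name and the ARMarkers message layout should be checked against your installed ar_pose version.

<code python>
#!/usr/bin/env python
# Minimal sanity check for ar_pose output.
# Assumption: ar_pose publishes ar_pose/ARMarkers on /ar_pose_marker;
# verify with `rostopic list` and `rosmsg show ar_pose/ARMarkers`.
import rospy
from ar_pose.msg import ARMarkers

def callback(msg):
    # Each detected marker carries an id and a pose with covariance
    for m in msg.markers:
        p = m.pose.pose.position
        rospy.loginfo("marker %d at (%.3f, %.3f, %.3f)", m.id, p.x, p.y, p.z)

rospy.init_node("ar_pose_check")
rospy.Subscriber("/ar_pose_marker", ARMarkers, callback)
rospy.spin()
</code>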

== Calibrate camera intrinsics ==

  * The purpose of this step is to rectify the image coming from the camera. Any lens introduces some degree of image distortion, and this can be counteracted with the image_proc ROS module.
  * You can use the camera calibration toolset from ROS1:

http://wiki.ros.org/camera_calibration

  * After finding a way to deactivate the realsense module's internal rectification, we ran a full camera calibration and found that the factory RealSense calibration is quite good. This step is therefore not necessary with this camera.
  * If you have a camera that does need this calibration, you will have to configure the camera intrinsics in the camera module somehow (some modules can't do this, beware!). Then connect the camera output to image_proc. This will create an /image_rect topic with the rectified image; a sketch of the underlying operation follows.
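
For reference, rectification essentially undistorts the raw image using the calibrated intrinsics. A minimal OpenCV sketch, with placeholder numbers standing in for your own calibration result:

<code python>
# Sketch of what image rectification does. The camera matrix and distortion
# coefficients below are placeholders; use the values from your calibration.
import cv2
import numpy as np

camera_matrix = np.array([[615.0,   0.0, 320.0],
                          [  0.0, 615.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([0.10, -0.25, 0.001, 0.0005, 0.0])  # k1, k2, p1, p2, k3

raw = cv2.imread("raw_frame.png")
rectified = cv2.undistort(raw, camera_matrix, dist_coeffs)
cv2.imwrite("rect_frame.png", rectified)
</code>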

== Run an arm_base-to-camera_base calibration ==

  * For this type of calibration we used the ROS1 module robot_cal_tools:

  https://github.com/Jmeyer1292/robot_cal_tools

  * This module is robot independent and is not too complicated to use.
  * In our oms-cylinder ROS module, take a look at static_calibration.launch and its internal configuration files.
  * These are the main steps to follow to execute a camera_base calibration (a sketch of the capture script appears after this list):
    * Rigidly mount the calibration panel to the arm's end-effector.
    * Rigidly mount the camera to the robot (this must not change after calibration!).
    * Write and run a script that moves the end-effector through the picture space. Be sure to visit widely dispersed 6D areas: try strong rotations (while keeping the target visible to the camera) at far-away x, y, z points, and do the same in some central areas of the picture. A moderate number of such positions is enough to calibrate the camera_base. The script has to move the end-effector to these positions and capture an image at each of them. Be sure to stop the arm completely; otherwise the image can be blurred and the calibration target incorrectly detected. For each capture, store the image file and a YAML file that points to it; this YAML file also includes the end-effector pose.
    * With the images and end-effector poses you can now run the robot_cal_tools calibrator. If it converges, it will produce an estimated camera_base transform. You will need this transform for the next main step.
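
Below is a sketch of such a capture script. The move_to_pose(), get_end_effector_pose(), and grab_image() functions are hypothetical placeholders for your own motion and camera interfaces, and the exact YAML layout should be taken from the robot_cal_tools documentation:

<code python>
# Hypothetical capture loop for arm_base-to-camera_base calibration data.
import time
import cv2
import yaml

def move_to_pose(pose):
    """Placeholder: command the arm to `pose` and block until it arrives."""
    raise NotImplementedError("wire this to your motion interface")

def get_end_effector_pose():
    """Placeholder: return the current end-effector pose as a 4x4 nested list."""
    raise NotImplementedError("wire this to your robot state interface")

def grab_image():
    """Placeholder: return the current camera frame as a numpy array."""
    raise NotImplementedError("wire this to your camera interface")

# Fill in widely dispersed 6D poses covering the picture space.
poses = []

for i, pose in enumerate(poses):
    move_to_pose(pose)
    time.sleep(2.0)  # let the arm settle completely to avoid blurred images
    image_file = "capture_%03d.png" % i
    cv2.imwrite(image_file, grab_image())
    # One YAML file per capture: it points to the image and records the
    # end-effector pose. Adapt the layout to what robot_cal_tools expects.
    with open("capture_%03d.yaml" % i, "w") as f:
        yaml.safe_dump({"image": image_file,
                        "end_effector_pose": get_end_effector_pose()}, f)
</code>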

== Set up a launch file running the camera module, the rectification module, and a static transform representing the arm_base-to-camera_base transform ==

  * In the oms-cylinder repository, use the following two launch files to incorporate the camera_to_arm_base transform found in the previous step:

  oms_launcher_marker90.launch
  ar_oms_cylinder_marker90.launch

  * You can use these launch files to run the system once the calibration is satisfactory.
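
If you prefer to publish the calibrated transform from Python instead of a static transform in the launch file, a minimal tf2_ros sketch looks like this. The frame names and numeric values are placeholders; use the transform produced by your calibration:

<code python>
#!/usr/bin/env python
# Publish the calibrated arm_base-to-camera_base transform as a static TF.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("camera_base_static_tf")

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = "arm_base"      # placeholder frame names
t.child_frame_id = "camera_base"
t.transform.translation.x = 0.50    # placeholder translation [m]
t.transform.translation.y = 0.10
t.transform.translation.z = 1.20
t.transform.rotation.x = 0.0        # placeholder rotation (identity quaternion)
t.transform.rotation.y = 0.0
t.transform.rotation.z = 0.0
t.transform.rotation.w = 1.0

broadcaster = tf2_ros.StaticTransformBroadcaster()
broadcaster.sendTransform(t)  # latched; published once for the node's lifetime
rospy.spin()
</code>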

== ROS-to-YARP bridge ==

  * The oms-cylinder ROS module has a ros_to_yarp script to send the marker poses to YARP. Find it in:

  cd ar_pose_marker_yarp_bridge/scripts
  ./ros_to_yarp_box_pose.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py

  * The webcam_calibration_values.py file is ignored for now, although you still need to put a camera_pose homogeneous-matrix array (identity) in it. This file was used in case you didn't use a static transform for the camera_base transform in the camera launch modules.
  * The object_params.py file contains the self.markers_transformations dictionary. There you have to configure a static transformation for your object of interest (a box has the marker attached to one side; you can add a static transform to move the marker's YARP data to the object's position). A sketch of this idea follows below.
  * This YARP module will publish marker data on the /arcosbot-real/marker/object_pos:o port.
  * You can use the -f option to do a "finger_testing" run. This disables the object transforms and gives the pure marker position, which will be useful for the next part of the tutorial.
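
To make the marker-to-object offset concrete, here is a minimal numpy sketch of the idea. The actual dictionary layout in object_params.py may differ, so treat the names and numbers as assumptions:

<code python>
# Sketch: applying a static marker-to-object transform with homogeneous
# matrices. The real object_params.py layout may differ.
import numpy as np

# Example: marker glued to one side of a box whose center lies 5 cm
# behind the marker along the marker's -z axis.
Tmarker_object = np.eye(4)
Tmarker_object[2, 3] = -0.05

markers_transformations = {90: Tmarker_object}  # keyed by marker id (example)

def object_pose(To_marker, marker_id):
    """Global object pose from the detected global marker pose."""
    return To_marker.dot(markers_transformations[marker_id])
</code>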

=== Finger tip pushing calibration ===

Once you have followed the [[tutorials:object_manipulation_robot_simulator#hand_torque_sensor_calibration|hand torque sensor calibration]], you will have one or more fingers able to exert forces against objects. We will now assume that you will use one finger to push an object. Once you have selected that finger, you need to find a hand orientation for pushing that avoids crashing into the table or hitting the object with other parts of the hand. A trick that makes this easier is to glue a marker to the fingertip of the selected pushing finger. Then orient the hand (using run_right.sh "/arcosbot-real") in the desired pushing orientation, command the fingers to a finger-pushing configuration, and glue the marker to the fingertip. Adjust the marker so that its orientation is vertical (to match the markers glued to other objects). Get the current marker pose (To_m) and the current global finger pose (To_f), then calculate the fingertip-to-marker homogeneous transform Tf_m = ((To_f)^-1)*To_m. Use this transform as the rel_pose_marker_finger variable in the robot_descriptions/arcosbot/perception/webcam_calibration_values.py file.
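
The transform arithmetic from the paragraph above, as a minimal numpy sketch (the identity matrices are placeholders for the poses you read from the YARP ports):

<code python>
# Fingertip-to-marker transform: Tf_m = (To_f)^-1 * To_m
import numpy as np

To_m = np.eye(4)  # placeholder: global marker pose (from the marker port)
To_f = np.eye(4)  # placeholder: global fingertip pose (from vectorField/pose)

Tf_m = np.linalg.inv(To_f).dot(To_m)
print(Tf_m)  # paste into rel_pose_marker_finger in webcam_calibration_values.py
</code>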

Step-by-step instructions:

  * Run the ros_to_yarp marker bridge without any object transform:

  cd ar_pose_marker_yarp_bridge/scripts
  ./ros_to_yarp_box_pose.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f

  * This will give the pure marker pose with no extra object transforms.
  * Edit the webcam_calibration_values.py file. Set the rel_pose_marker_finger transform to an identity homogeneous matrix.
  * Set the finger_pushing_pos joint angles in the exploration.py file to select your desired finger-pushing configuration.
  * Glue a marker to an object of interest.
  * Position the object at a reachable pose on top of a table.
  * Run exploration.py:

  cd local/src/cmoc/objects/sliding/scripts/
  ./exploration.py -n /arcosbot-real -c ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py

  * Follow the program instructions until this appears:

<code>
Getting initial object position
Box global pose [[ 0.99903308  0.00634678 -0.04350436  0.63601302]
 [-0.00841706  0.99883239 -0.04757109  0.25734383]
 [ 0.04315164  0.04789128  0.99792002  0.85763204]
 [ 0.          0.          0.          1.        ]]
If right, press "y"
</code>

  * If the position looks fine, press "y" and Enter. The robot should move the finger behind the object with the fingertip aligned (more or less) to the object's marker.
  * Remove the object (hide its marker).
  * Glue a marker to the fingertip.
  * Stop exploration.py with Ctrl-C.
  * Using run_right.sh /arcosbot-real, rotate the end-effector until the pushing orientation is found. Adjust the glued marker so that its orientation is the same as the orientation the object previously had.
  * Get the fingertip pose and the marker pose:
    * In one console run (To_m):

  yarp read ... /arcosbot-real/marker/object_pos:o

    * In another console run (To_f):

  yarp read ... /arcosbot-real/lwr/right/vectorField/pose

  * Write down both homogeneous matrices. Calculate Tf_m = ((To_f)^-1)*To_m.
  * Use Tf_m as the rel_pose_marker_finger variable in the webcam_calibration_values.py file.
  * Run exploration.py again. Check that the pushing orientation is the desired one.

=== Exploring for the object parameters ===

Run exploration.py with the calibrated values:

  ./exploration.py -n /arcosbot-real -c ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py
==== Running in simulation the same conf as in real life ====
  