=== Finger tip pushing calibration ===
  
Once you have followed the [[tutorials:object_manipulation_robot_simulator#hand_torque_sensor_calibration|hand torque sensor calibration]], you will have one or more fingers calibrated for exerting forces against objects. From here on we assume you will use one finger to push an object. Once you have selected the finger, you need to find a hand orientation that pushes the object without crashing against the table and without other parts of the hand hitting the object itself. A trick that makes this easier is to glue a marker to the fingertip of the selected pushing finger. Orient the hand (using run_right.sh "/arcosbot-real") in the desired pushing orientation, command the fingers to a finger-pushing configuration, and glue the marker to the fingertip. Adjust the marker so that its orientation is vertical (to match the markers glued to the other objects). Get the current marker pose (To_m) and the current global fingertip pose (To_f), then calculate the fingertip-to-marker homogeneous transform Tf_m = ((To_f)^-1)*To_m. Use this transform in the robot_descriptions/arcosbot/perception/webcam_calibration_values.py file as the rel_pose_marker_finger variable.
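The transform calculation above can be sketched in a few lines of Python with numpy. This is a minimal illustration only; the pose matrices below are made-up placeholder values, not real calibration data from the robot.

```python
import numpy as np

# To_f: global fingertip pose, To_m: global marker pose, both as 4x4
# homogeneous matrices. These values are illustrative placeholders.
To_f = np.array([[1.0, 0.0, 0.0, 0.60],
                 [0.0, 1.0, 0.0, 0.25],
                 [0.0, 0.0, 1.0, 0.85],
                 [0.0, 0.0, 0.0, 1.0]])
To_m = np.array([[0.0, -1.0, 0.0, 0.63],
                 [1.0,  0.0, 0.0, 0.26],
                 [0.0,  0.0, 1.0, 0.86],
                 [0.0,  0.0, 0.0, 1.0]])

# Fingertip-to-marker transform: Tf_m = ((To_f)^-1) * To_m
Tf_m = np.linalg.inv(To_f) @ To_m

# Sanity check: composing the fingertip pose with Tf_m must recover
# the marker pose.
assert np.allclose(To_f @ Tf_m, To_m)
print(Tf_m)
```

The resulting Tf_m is what goes into rel_pose_marker_finger in webcam_calibration_values.py.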
  
Step-by-step instructions:

  * Run the ros_to_yarp marker bridge without any object transform:

  cd ar_pose_marker_yarp_bridge/scripts
  ./ros_to_yarp_box_pose.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f

  * This gives the pure marker pose with no extra object transforms.
  * Edit the webcam_calibration_values.py file. Set the rel_pose_marker_finger transform to an identity homogeneous matrix.
  * Set the finger_pushing_pos joint angles in the exploration.py file to select your desired finger-pushing configuration.
  * Glue a marker to an object of interest.
  * Position the object at a reachable pose on top of a table.
  * Run exploration.py:

  cd local/src/cmoc/objects/sliding/scripts/
  ./exploration.py -n /arcosbot-real -c ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py

  * Follow the program instructions until this appears:
 +
<code>
Getting initial object position
Box global pose [[ 0.99903308  0.00634678 -0.04350436  0.63601302]
 [-0.00841706  0.99883239 -0.04757109  0.25734383]
 [ 0.04315164  0.04789128  0.99792002  0.85763204]
 [ 0.          0.          0.          1.        ]]
If right, press "y"
</code>
 +
  * If the position looks fine, press y and enter. The robot should move the finger behind the object, with the fingertip (more or less) aligned to the marker of the object.
  * Remove the object (hiding its marker).
  * Glue a marker to the fingertip.
  * Ctrl-C (cancel) exploration.py.
  * Using run_right.sh /arcosbot-real, rotate the end-effector until the pushing orientation is found. Adjust the glued marker so that its orientation is the same as the orientation the object previously had.
  * Get the fingertip pose and the marker pose:
    * In one console run (To_m):

  yarp read ... /arcosbot-real/marker/object_pos:o

    * In another console run (To_f):

  yarp read ... /arcosbot-real/lwr/right/vectorField/pose

  * Annotate both homogeneous matrices. Calculate Tf_m = ((To_f)^-1)*To_m.
  * Use Tf_m in the webcam_calibration_values.py file as the rel_pose_marker_finger variable.
  * Run exploration.py again. Check that the pushing orientation is the desired one.
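Before pasting the annotated matrices into webcam_calibration_values.py, it can help to verify that each one is a valid rigid homogeneous transform (orthonormal rotation part with determinant +1, bottom row [0, 0, 0, 1]); a typo in a single digit will otherwise show up later as a skewed pushing pose. A small validation sketch (the helper function name is our own, not part of the tutorial's code):

```python
import numpy as np

def is_valid_homogeneous(T, tol=1e-3):
    """Check that a 4x4 matrix is a rigid homogeneous transform."""
    T = np.asarray(T)
    if T.shape != (4, 4):
        return False
    R = T[:3, :3]
    # Rotation part must be orthonormal (R R^T = I) with det(R) = +1.
    if not np.allclose(R @ R.T, np.eye(3), atol=tol):
        return False
    if not np.isclose(np.linalg.det(R), 1.0, atol=tol):
        return False
    # Bottom row must be [0, 0, 0, 1].
    return np.allclose(T[3], [0.0, 0.0, 0.0, 1.0], atol=tol)

# Example: the box pose printed by exploration.py above.
To_m = np.array([[ 0.99903308,  0.00634678, -0.04350436,  0.63601302],
                 [-0.00841706,  0.99883239, -0.04757109,  0.25734383],
                 [ 0.04315164,  0.04789128,  0.99792002,  0.85763204],
                 [ 0.0,         0.0,         0.0,         1.0       ]])
print(is_valid_homogeneous(To_m))  # expect: True
```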
 +
=== Exploring for the object parameters ===

  ./exploration.py -n /arcosbot-real -c ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/calibration_data/finger_calibration_data.py -o ~/local/src/cmoc/objects/sliding/objects/object_params.py -f ~/local/src/robot_descriptions/arcosbot/kinematics/sahand/hands_kin.py -w ~/local/src/robot_descriptions/arcosbot/perception/webcam_calibration_values.py
==== Running in simulation the same conf as in real life ====
  