====== Camera Arm Calibration Process ======

  
**WARNING!** This tutorial assumes that you have advanced knowledge of: git, Linux, ROS1, YARP, Python, the KUKA arm, FRI, and robot kinematics.

This tutorial was created by [[people:dgarcia|Daniel García Vaglio]]. If you have any questions about this tutorial or about the calibration process, ask him.

==== Set Up Process ====
  * Clone the ''oms-cylinder'' repository and follow the ''README'' for installation instructions
  * Install the Humanoid Robot Simulator
  * Clone the ''keyboard-cart-control'' repository in ''~/local/src''
  * Install and set up FRI
  * Attach the calibration board to the robot.
  * Press ''u'' to see how many poses have been stored. We want around 2500 poses.
  * When you are done, press ''o'' to save the poses to a ''csv'' file.
  * Close ''keyboard_cart_control.py''
  * Move the generated file to the following directory:

  mv <poses-filename>.csv <catkin_workspace>/src/oms-cylinder/camera2png/data
  
==== Taking Photos ====

  * Unplug and re-plug the RealSense camera from the computer.
  * Open ''realsense-viewer''
  * **Disable** the following options:
    - Enable Auto Exposure
    - Enable Auto White Balance
    - Auto Exposure Priority
  * Set the White Balance to 3760
  * Close ''realsense-viewer''
  * Remember to source your workspace before executing the following commands.
  * Now we need to launch the camera node:
  
  cd <catkin_workspace>/src/oms-cylinder/oms_launcher/launch/
  roslaunch rs_camera.launch

  * In a separate terminal, execute the script that will create the photos for calibration:

  cd <catkin_workspace>/src/oms-cylinder/camera2png/data
  mkdir output
  cd <catkin_workspace>/src/oms-cylinder/camera2png/scripts/
  python camera2png.py ../data/example.py ../data/<poses-file-name>.csv ../data/output
=== Troubleshooting ===

Sometimes the KUKA robot stops working (we are not sure why). When that happens, you will have to:

  * Kill the ''camera2png'' script
  * Re-enable the arm
  * Check in ''<catkin_workspace>/src/oms-cylinder/camera2png/data'' which was the last generated photo. Let's say it was ''N''.
  * Restart the ''camera2png'' script like this:

  python camera2png.py ../data/example.py ../data/<poses-file-name>.csv ../data/output --start <N+1>
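
Finding the last generated photo by hand is error-prone. A small helper like the following can compute the ''--start'' value from the contents of the output directory. This is only a sketch: the ''next_start_index'' function is hypothetical, and it assumes the photos follow the zero-padded ''NNNNNN.png'' naming used elsewhere in this tutorial.

<code python>
import os
import re


def next_start_index(output_dir):
    """Return the --start value for resuming camera2png: one past the
    highest-numbered .png already present in output_dir."""
    indices = [
        int(match.group(1))
        for name in os.listdir(output_dir)
        if (match := re.fullmatch(r"(\d+)\.png", name))
    ]
    return max(indices) + 1 if indices else 0
</code>
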
==== Executing the calibration script ====
  
  * First we have to measure the transformations from the kinematic arm base to the camera, and from the wrist to the calibration board. These will be used as the initial guesses for the calibration program. Each guess has to be given as a translation ''vector(x, y, z)'' and an orientation ''quaternion(x, y, z, w)''. This configuration has to be stored in YAML format in the following file: ''<catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs/static_camera_guesses.yml''. Here you can find an example (do **NOT** use this as your guess):

<code yaml>
base_to_camera_guess:
  x: 1.04
  y: 1.1
  z: 0.0
  qx: 0.3
  qy: 0.4
  qz: 0.5
  qw: 0.01

wrist_to_target_guess:
  x: 0.109
  y: 0.2
  z: -0.05
  qx: -0.4
  qy: 0.5
  qz: 0.5
  qw: -0.4
</code>
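
A valid orientation should be a (roughly) unit-length quaternion, so it is worth sanity-checking your guesses before running the calibration. A minimal check, where the ''quaternion_norm'' helper is hypothetical and not part of the calibration tooling:

<code python>
import math


def quaternion_norm(qx, qy, qz, qw):
    """Euclidean norm of a quaternion; a valid orientation is ~1.0."""
    return math.sqrt(qx * qx + qy * qy + qz * qz + qw * qw)


# The base_to_camera_guess orientation from the example above is far
# from unit length, which is one more reason not to copy it verbatim:
norm = quaternion_norm(0.3, 0.4, 0.5, 0.01)
</code>
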

  * You also need to get the camera intrinsics and store them in another YAML file. Here you can find an example:

<code yaml>
intrinsics:
  fx: 1387.6160888671875
  fy: 1387.5479736328125
  cx: 943.9945678710938
  cy: 561.1880493164062
</code>
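
These four numbers are the standard pinhole parameters: a camera-frame point ''(x, y, z)'' projects to pixel ''(fx·x/z + cx, fy·y/z + cy)''. A quick sketch of that model using the example intrinsics above (the ''project_point'' helper is illustrative, not part of the toolchain):

<code python>
def project_point(x, y, z, fx, fy, cx, cy):
    """Project a 3-D point in the camera frame to pixel coordinates
    using the pinhole model described by the intrinsics file."""
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v


# A point on the optical axis lands on the principal point (cx, cy):
u, v = project_point(0.0, 0.0, 1.0,
                     fx=1387.616, fy=1387.548, cx=943.995, cy=561.188)
</code>
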

  * Then you need to provide a configuration file for the calibration board, like so:

<code yaml>
target_definition:
  rows: 10
  cols: 10
  spacing: 0.01861
</code>
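
This describes a 10×10 dot grid with 0.01861 m between neighbouring dots. As a sketch of the geometry the calibration script reconstructs from it (the row-major ordering and corner origin here are assumptions, not the tool's documented convention):

<code python>
def target_points(rows, cols, spacing):
    """Planar (x, y, 0) coordinates of each dot on the calibration
    target, row-major, for a rows x cols grid with uniform spacing."""
    return [
        (c * spacing, r * spacing, 0.0)
        for r in range(rows)
        for c in range(cols)
    ]


points = target_points(rows=10, cols=10, spacing=0.01861)
</code>
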

  * Now we have to create separate directories for the images and the poses:

<code bash>
mv <catkin_workspace>/src/oms-cylinder/camera2png/data <catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs
cd <catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs/data
mkdir images
mkdir poses
mv *.yml poses
mv *.png images
</code>

  * Now we have to create a YAML file, in the ''configs/data'' directory, listing all the files that have to be taken into consideration by the calibration script:

<code yaml>
- pose: "000000.yml"
  image: "000000.png"
- pose: "000001.yml"
  image: "000001.png"
# ... one entry per pose/image pair
</code>
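
Writing this list by hand for roughly 2500 pairs is impractical. A small generator like the following can produce the file from the ''poses'' directory; this is a hypothetical helper that assumes the ''NNNNNN.yml''/''NNNNNN.png'' naming shown above:

<code python>
import os


def write_data_list(poses_dir, out_path):
    """Write a YAML list pairing each NNNNNN.yml pose file with its
    matching NNNNNN.png image, sorted by index."""
    stems = sorted(
        name[:-4] for name in os.listdir(poses_dir) if name.endswith(".yml")
    )
    with open(out_path, "w") as out:
        for stem in stems:
            out.write(f'- pose: "{stem}.yml"\n  image: "{stem}.png"\n')
</code>
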

  * Launch the static calibration script:

<code bash>
roslaunch oms-cylinder/oms_launcher/launch/static_calibration.launch
</code>

  * This will start the calibration script. It opens a window showing the photos that were taken; you should see some colored dots. If your data set is small, you will have to press ''ENTER'' for each image. If your data set is big, only one ''ENTER'' is required (we have no idea what the threshold is, but it seems to be around 2000 poses).
  * Be patient when you press ''ENTER'' and the program doesn't respond; some images take longer to analyze than others. When you reach the last image, the //image-window interface// will get stuck. Do **NOT** press ''ENTER''; wait until the algorithm converges.
  * Once the algorithm converges you will see a lot of results like the ones shown below. The calibration result is the first one.

<code>
Did converge?: 1
Initial cost?: 47853.2 (pixels per dot)
Final cost?: 0.655122 (pixels per dot)

BASE TO CAMERA:
 -0.208397  -0.664739   0.717421 -0.0773392
 -0.967382  0.0320803  -0.251281   0.220998
  0.144021  -0.746387  -0.649742    1.43606
         0          0          0          1

--- URDF Format Base to Camera ---
xyz="-0.0773392 0.220998 1.43606"
rpy="0.854511(48.9599 deg) -2.99707(-171.719 deg) 1.35862(77.8429 deg)"
qxyzw="-0.593562 0.687426 -0.362827 0.208531"

BASE TO CAMERA:
 -0.0184899    0.999697  -0.0162778  -0.0817589
-0.00618554   0.0161659     0.99985    0.139854
    0.99981   0.0185878  0.00588475 -0.00619646
          0           0           0           1

--- URDF Format Base to Camera ---
xyz="-0.0817589 0.139854 -0.00619646"
rpy="-1.87741(-107.567 deg) -1.59029(-91.1172 deg) 0.322833(18.497 deg)"
qxyzw="-0.48976 -0.507142 -0.502048 0.500889"
</code>
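
The ''URDF Format'' lines are derived from the 4×4 homogeneous matrix above them: ''xyz'' is the last column, and ''qxyzw'' is the 3×3 rotation submatrix converted to a quaternion. If you want to cross-check a printed result, the standard trace-based conversion looks like this (a sketch, not the calibration tool's own code):

<code python>
import math


def matrix_to_quaternion(R):
    """Convert a 3x3 rotation matrix (list of rows) to (qx, qy, qz, qw)
    using the standard trace-based method."""
    t = R[0][0] + R[1][1] + R[2][2]
    if t > 0:
        s = math.sqrt(t + 1.0) * 2
        qw = 0.25 * s
        qx = (R[2][1] - R[1][2]) / s
        qy = (R[0][2] - R[2][0]) / s
        qz = (R[1][0] - R[0][1]) / s
    elif R[0][0] > R[1][1] and R[0][0] > R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2
        qw = (R[2][1] - R[1][2]) / s
        qx = 0.25 * s
        qy = (R[0][1] + R[1][0]) / s
        qz = (R[0][2] + R[2][0]) / s
    elif R[1][1] > R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2
        qw = (R[0][2] - R[2][0]) / s
        qx = (R[0][1] + R[1][0]) / s
        qy = 0.25 * s
        qz = (R[1][2] + R[2][1]) / s
    else:
        s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2
        qw = (R[1][0] - R[0][1]) / s
        qx = (R[0][2] + R[2][0]) / s
        qy = (R[1][2] + R[2][1]) / s
        qz = 0.25 * s
    return qx, qy, qz, qw
</code>
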
==== Verify that the calibration is correct ====
  
tutorials/camera_arm_calibration · Last modified: 2022/09/20 00:08 (external edit)