Camera Arm Calibration Process
WARNING! This tutorial assumes that you have advanced knowledge of: git, Linux, ROS1, YARP, Python, the KUKA arm, FRI, and robot kinematics.
This tutorial was created by Daniel García Vaglio. If you have any questions about this tutorial or about the calibration process, ask him.
Setup process
- Install ROS1 and YARP.
- Clone the oms-cylinder repository and follow its README for installation instructions.
- Install the Humanoid Robot Simulator.
- Clone the keyboard-cart-control repository into ~/local/src.
- Install and set up FRI.
- Attach the calibration board to the robot.
How calibration works
TODO: Add an explanation of the process and why we do things the way we do them
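Until that explanation is written, here is a rough sketch of the underlying idea, based on the configuration files used below: the camera is static, the calibration board rides on the wrist, and for every stored pose the robot's forward kinematics gives the base-to-wrist transform. The calibration searches for the base-to-camera and wrist-to-target transforms that make the board's dots, projected through the camera intrinsics, line up with the dots detected in the photos. The snippet below is only an illustration of that cost, not code from the repository, and all names in it are made up.
import numpy as np

def reprojection_residual(base_to_camera, wrist_to_target, base_to_wrist,
                          intrinsics, target_points, detected_pixels):
    # All transforms are 4x4 homogeneous matrices. target_points is Nx3 in the
    # board frame, detected_pixels is Nx2 dot centers found in one photo.
    fx, fy, cx, cy = intrinsics
    # Chain the transforms: board frame -> base frame -> camera frame.
    camera_to_target = np.linalg.inv(base_to_camera) @ base_to_wrist @ wrist_to_target
    pts = np.hstack([target_points, np.ones((len(target_points), 1))])
    pts_cam = (camera_to_target @ pts.T).T[:, :3]
    # Pinhole projection with the intrinsics measured for the camera.
    u = fx * pts_cam[:, 0] / pts_cam[:, 2] + cx
    v = fy * pts_cam[:, 1] / pts_cam[:, 2] + cy
    # The optimizer adjusts base_to_camera and wrist_to_target until this
    # residual (the "cost in pixels per dot") is as small as possible.
    return np.column_stack([u, v]) - detected_pixels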
Generating Calibration poses
Set up the system
- Turn on the KUKA arm and set it up with FRI so that it can be used from a Linux computer.
- Start a YARP server (typically with the yarpserver command).
- Continue until the bridge and vfclik are running and talking to the robot over FRI.
- Go to ~/local/src/keyboard_cart_control and execute the following command:
python keyboard_cart_control.py -n /arcosbot-real
- Launch the RealSense viewer with the following command:
realsense-viewer
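If you want to double-check that the YARP network is up before driving the arm, a quick optional sanity check from Python (assuming the yarp Python bindings are installed) looks like this; it is not part of the procedure itself:
import yarp

yarp.Network.init()
# checkNetwork() returns True when a YARP name server is reachable.
print("YARP server reachable:", yarp.Network.checkNetwork())
yarp.Network.fini()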
Generate the poses
- Move the arm so that the calibration board is inside the field of view of the camera.
- Press i to store this first pose.
- Press y to activate automatic pose storage.
- Move the arm in such a way that the calibration board is always visible to the camera. Be careful not to reach singularities.
- Press u to see how many poses have been stored. We want around 2500 poses.
- When you are done, press o to save the poses to a csv file.
- Close keyboard_cart_control.py.
- Move the generated file to the following directory:
mv <poses-filename>.csv <catkin_workspace>/src/oms-cylinder/camera2png/data
Taking Photos
- Unplug and re-plug the RealSense from the computer.
- Open realsense-viewer.
- Disable the following options:
  - Enable Auto Exposure
  - Enable Auto White Balance
  - Auto Exposure Priority
- Close realsense-viewer.
- Remember to source your workspace before executing the following commands.
- Now we need to launch the camera node:
cd <catkin_workspace>/src/oms-cylinder/oms_launcher/launch/
roslaunch rs_camera.launch
- In a separate terminal, execute the script that will create the photos for calibration (a quick completeness check is sketched after these commands):
cd <catkin_workspace>/src/oms-cylinder/camera2png/data
mkdir output
cd <catkin_workspace>/src/oms-cylinder/camera2png/scripts/
python camera2png.py ../data/example.py ../data/<poses-file-name>.csv ../data/output
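Once camera2png.py has finished, it is worth confirming that it produced one photo per stored pose (see also the Troubleshooting section below). A minimal check, run from the scripts directory and assuming the poses csv has one row per pose and the photos are .png files in the output directory (replace the placeholder with your file name), could look like this:
import csv, glob

with open("../data/<poses-file-name>.csv") as f:
    n_poses = sum(1 for _ in csv.reader(f))   # one row per stored pose (assumed)
n_photos = len(glob.glob("../data/output/*.png"))
print(n_poses, "poses,", n_photos, "photos")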
Troubleshooting
Sometimes the KUKA robot stops working (we are not sure why). When that happens, you will have to:
- Kill the camera2png script.
- Re-enable the arm.
- Check in <catkin_workspace>/src/oms-cylinder/camera2png/data which was the last generated photo. Let's say it was N (a small helper for finding it is sketched after this list).
- Restart the camera2png script like this:
python camera2png.py ../data/example.py ../data/<poses-file-name>.csv ../data/output --start <N+1>
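To find N without scrolling through the directory by hand, a small helper like the following can be used. It is illustrative only and assumes the photos are named with zero-padded numbers such as 000123.png:
import os, re

output_dir = "../data/output"  # the camera2png output directory
indices = [int(m.group(1)) for name in os.listdir(output_dir)
           if (m := re.fullmatch(r"(\d+)\.png", name))]
print("last photo:", max(indices), "-> restart with --start", max(indices) + 1)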
Executing the calibration script
- First we have to measure the transformations from the kinematic arm base to the camera and from the wrist to the calibration board. These will be used as the initial guess for the calibration program. The guess has to be given as a translation vector (x, y, z) and an orientation quaternion (x, y, z, w). This configuration has to be stored in YAML format in the following file: <catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs/static_camera_guesses.yml. Here you can find an example (do NOT use this as your guess):
base_to_camera_guess:
  x: 1.04
  y: 1.1
  z: 0.0
  qx: 0.3
  qy: 0.4
  qz: 0.5
  qw: 0.01
wrist_to_target_guess:
  x: 0.109
  y: 0.2
  z: -0.05
  qx: -0.4
  qy: 0.5
  qz: 0.5
  qw: -0.4
- You also need to get the camera intrinsics and store them in another YAML file. Here you can find an example:
intrinsics:
  fx: 1387.6160888671875
  fy: 1387.5479736328125
  cx: 943.9945678710938
  cy: 561.1880493164062
- Then you need to provide a configuration file for the calibration board, like so:
target_definition:
  rows: 10
  cols: 10
  spacing: 0.01861
- Now we have to create separate directories for the images and the poses:
mv <catkin_workspace>/src/oms-cylinder/camera2png/data <catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs
cd <catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs/data
mkdir images
mkdir poses
mv *.yml poses
mv *.png images
- Now we have to create, in the configs/data directory, a YAML file listing all the files that have to be taken into consideration by the calibration script (a helper for generating this file is sketched at the end of this section):
# TODO: Fix this example
- pose: "000000.yml", image: "000000.png"
- pose: "000001.yml", image: "000001.png"
...
- Launch the static calibration script.
roslaunch oms-cylinder/oms_launcher/launch/static_calibration.launch
- This will start the calibration script and open a window showing the photos that were taken; you should see some colored dots. If your data set is small, you will have to press ENTER for each image. If your data set is big, only one ENTER is required (we have no idea what the threshold is, but it seems to be around 2000 poses).
- You must wait when you press ENTER and the program doesn't respond; some images take longer to analyze than others. When you reach the last image, the image-window interface will get stuck. Do NOT press ENTER; wait until the algorithm converges.
- Once the algorithm converges you will see a lot of results like the ones shown below. The calibration result is the first one.
Did converge?: 1
Initial cost?: 47853.2 (pixels per dot)
Final cost?: 0.655122 (pixels per dot)

BASE TO CAMERA:
 -0.208397  -0.664739   0.717421  -0.0773392
 -0.967382   0.0320803 -0.251281   0.220998
  0.144021  -0.746387  -0.649742   1.43606
  0          0          0          1

--- URDF Format Base to Camera ---
xyz="-0.0773392 0.220998 1.43606"
rpy="0.854511(48.9599 deg) -2.99707(-171.719 deg) 1.35862(77.8429 deg)"
qxyzw="-0.593562 0.687426 -0.362827 0.208531"

BASE TO CAMERA:
 -0.0184899   0.999697    -0.0162778  -0.0817589
 -0.00618554  0.0161659    0.99985     0.139854
  0.99981     0.0185878    0.00588475 -0.00619646
  0           0            0           1

--- URDF Format Base to Camera ---
xyz="-0.0817589 0.139854 -0.00619646"
rpy="-1.87741(-107.567 deg) -1.59029(-91.1172 deg) 0.322833(18.497 deg)"
qxyzw="-0.48976 -0.507142 -0.502048 0.500889"
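Going back to the data listing step above: instead of writing that YAML by hand, it can be generated from the images/ and poses/ directories. The sketch below is only an illustration; the exact layout the calibration script expects may differ from the example above (which is itself marked TODO), and the output file name data.yml is an assumption. It pairs every image with the pose file of the same number.
import os

data_dir = "<catkin_ws>/src/oms-cylinder/oms_launcher/cal/configs/data"
stems = sorted(os.path.splitext(f)[0]
               for f in os.listdir(os.path.join(data_dir, "images"))
               if f.endswith(".png"))
with open(os.path.join(data_dir, "data.yml"), "w") as out:
    for stem in stems:
        # one entry per image/pose pair, e.g. 000000.png with 000000.yml
        out.write('- pose: "%s.yml"\n  image: "%s.png"\n' % (stem, stem))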
Verify that the calibration is correct