====== Camera Arm Calibration ======
**WARNING!** This tutorial assumes that you have advanced knowledge of: git, Linux, ROS1, YARP, Python, the KUKA arm, FRI, and robot kinematics.

This tutorial was created by [[people:
==== Setup Process ====
  * Clone the ''oms-cylinder'' repository and follow the ''
  * Install the Humanoid Robot Simulator
  * Clone the ''keyboard_cart_control'' repository
  * Install and set up FRI
  * Attach the calibration board to the robot.
  * Press ''
  * When you are done, press ''
  * Close ''
  * Move the generated file to the following directory:

  mv <
==== Taking Photos ====
  * Unplug and re-plug the RealSense camera from the computer.
  * Open ''
  * **Disable** the following options (a scripted alternative using ''pyrealsense2'' is sketched below):
    - Enable Auto Exposure
    - Enable Auto White Balance
    - Auto Exposure Priority
  * Set the White Balance to: 3760
  * Close ''
  * Remember to source your workspace before executing the following commands.
  * Now we need to launch the camera node:

  cd <
  roslaunch rs_camera.launch
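
As an alternative to setting the options in the viewer by hand, they can also be applied from a short ''pyrealsense2'' script. This is only a sketch under the assumption that your camera exposes these options and that they stick until the ROS node reconfigures the stream; run it before launching the camera node, since only one process can own the device.

<code python>
# Minimal sketch: apply the settings from the "Disable the following options" step
# programmatically. Close the viewer (and any running camera node) before running this.
import pyrealsense2 as rs

pipeline = rs.pipeline()
profile = pipeline.start(rs.config())
color_sensor = profile.get_device().first_color_sensor()

color_sensor.set_option(rs.option.enable_auto_exposure, 0)       # disable auto exposure
color_sensor.set_option(rs.option.enable_auto_white_balance, 0)  # disable auto white balance
color_sensor.set_option(rs.option.auto_exposure_priority, 0)     # disable auto exposure priority
color_sensor.set_option(rs.option.white_balance, 3760)           # fixed white balance used in this tutorial

pipeline.stop()
</code>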
  * In a separate terminal, execute the script that will create the photos for calibration (a rough sketch of such a capture script follows the commands below):
  cd <
  mkdir output
  cd <
  python camera2png.py ../
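
For orientation, the sketch below shows the general shape of a frame-grabbing script like ''camera2png.py'': grab one frame from the colour topic per key press and save it as a numbered PNG in the ''output'' directory. The topic name ''/camera/color/image_raw'' and the file naming are assumptions, and the real script in ''oms-cylinder'' also records the matching robot pose, which is not shown here.

<code python>
# Rough sketch of a capture loop (not the real camera2png.py): save one colour frame
# per key press as a numbered PNG. Assumes the realsense node publishes on
# /camera/color/image_raw.
import os
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node("camera2png_sketch")
bridge = CvBridge()
out_dir = "../output"
count = 0

while not rospy.is_shutdown():
    raw_input("Move the arm to a new pose, then press Enter to capture...")  # input() on Python 3
    msg = rospy.wait_for_message("/camera/color/image_raw", Image)
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    cv2.imwrite(os.path.join(out_dir, "%04d.png" % count), frame)
    count += 1
</code>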
=== Troubleshooting ===

Sometimes the KUKA robot stops working (we are not sure why). When that happens, you will have to:

  * Kill the ''
  * Re-enable the arm
  * Check in ''<
  * Restart the ''camera2png.py'' script:

  python camera2png.py ../
==== Executing the calibration script ====
  * First we have to measure the transformations from the kinematic arm base to the camera and from the wrist to the calibration board. These will be used as the initial guesses for the calibration program. Each guess has to be given as a translation (''x'', ''y'', ''z'') and a rotation quaternion (''qx'', ''qy'', ''qz'', ''qw''), for example (a helper for converting measured angles into a quaternion is sketched after the YAML):
<code yaml>
base_to_camera_guess:
  x: 1.04
  y: 1.1
  z: 0.0
  qx: 0.3
  qy: 0.4
  qz: 0.5
  qw: 0.01

wrist_to_target_guess:
  x: 0.109
  y: 0.2
  z: -0.05
  qx: -0.4
  qy: 0.5
  qz: 0.5
  qw: -0.4
</code>
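
If you measure the rotations by hand as roll/pitch/yaw angles, a small helper can convert them into the ''qx'', ''qy'', ''qz'', ''qw'' fields used above. This uses SciPy; the angles shown are placeholders, not values for your setup.

<code python>
# Convert a hand-measured roll/pitch/yaw guess (radians, fixed-axis x-y-z convention)
# into the quaternion fields expected by the guess YAML.
# SciPy returns quaternions in (x, y, z, w) order.
from scipy.spatial.transform import Rotation as R

roll, pitch, yaw = 0.0, 0.0, 1.57   # placeholder angles -- replace with your measurements
qx, qy, qz, qw = R.from_euler("xyz", [roll, pitch, yaw]).as_quat()
print("qx: %.4f\nqy: %.4f\nqz: %.4f\nqw: %.4f" % (qx, qy, qz, qw))
</code>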
  * You also need to get the camera intrinsics and store them in another YAML file. Here you can find an example (a sketch for reading these values from the running camera node follows the example):
<code yaml>
intrinsics:
  fx: 1387.6160888671875
  fy: 1387.5479736328125
  cx: 943.9945678710938
  cy: 561.1880493164062
</code>
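
The intrinsics do not have to be typed in by eye: while the camera node from the previous section is running, they can be read off the ''camera_info'' topic. The topic name below is the usual one for the realsense node but is an assumption; check ''rostopic list'' on your machine.

<code python>
# Print the fx, fy, cx, cy entries of the YAML above from the camera_info topic.
# K is the row-major 3x3 intrinsic matrix: [fx 0 cx; 0 fy cy; 0 0 1].
import rospy
from sensor_msgs.msg import CameraInfo

rospy.init_node("print_intrinsics")
info = rospy.wait_for_message("/camera/color/camera_info", CameraInfo)
print("intrinsics:")
print("  fx: %s" % info.K[0])
print("  fy: %s" % info.K[4])
print("  cx: %s" % info.K[2])
print("  cy: %s" % info.K[5])
</code>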
  * Then you need to provide a configuration file for the calibration board, like so (the geometry it describes is illustrated after the example):
<code yaml>
target_definition:
  rows: 10
  cols: 10
  spacing: 0.01861
</code>
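
As a sanity check, ''target_definition'' describes a 10 x 10 grid of dots spaced 0.01861 m apart. The snippet below only illustrates that geometry; the point ordering used internally by the calibration code may differ.

<code python>
# Illustrate the board described by target_definition: rows x cols dot centres,
# 'spacing' metres apart, lying in the target plane (z = 0).
import numpy as np

rows, cols, spacing = 10, 10, 0.01861
points = np.array([(c * spacing, r * spacing, 0.0)
                   for r in range(rows) for c in range(cols)])

print("number of dots:", len(points))
print("board extent: %.3f m x %.3f m" % ((cols - 1) * spacing, (rows - 1) * spacing))
</code>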
  * Now we have to create a separate directory for images and poses.
<code bash>
mv <
cd <
mkdir images
mkdir poses
mv *.yml poses
mv *.png images
</code>
  * Now we have to create a YAML file listing all the pose/image pairs that have to be taken into consideration by the calibration script, in the '' (a sketch that generates this list automatically follows the example):
<code yaml>
# TODO: Fix this example
- pose: "
  image: "
- pose: "
  image: "
.
.
.
</code>
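
Typing this list by hand is error-prone, so a small script can generate it from the ''images'' and ''poses'' directories created above. The output file name ''data.yaml'' and the assumption that each ''NNNN.png'' has a matching ''NNNN.yml'' pose file are mine; adapt them to whatever ''camera2png.py'' actually produces.

<code python>
# Generate the pose/image list from the images/ and poses/ directories.
# Assumes matching file stems (e.g. images/0000.png <-> poses/0000.yml).
import os

with open("data.yaml", "w") as f:
    for name in sorted(os.listdir("images")):
        stem = os.path.splitext(name)[0]
        f.write('- pose: "%s"\n' % os.path.join("poses", stem + ".yml"))
        f.write('  image: "%s"\n' % os.path.join("images", name))
</code>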
  * Launch the static calibration script.

<code bash>
roslaunch oms-cylinder/
</code>
  * This will start the calibration script. A window will open showing the photos that were taken; you should see some colored dots. If your data set is small, you will have to press ''
  * You must wait when you press ''
  * Once the algorithm converges you will see a lot of results like the one shown below. The calibration result is the first one (a snippet for re-deriving the URDF-format lines from the raw matrix follows the output).
<code>
Did converge?: 1
Initial cost?: 47853.2 (pixels per dot)
Final cost?: 0.655122 (pixels per dot)

BASE TO CAMERA:
  0.144021

--- URDF Format Base to Camera ---
xyz="
rpy="
qxyzw="

BASE TO CAMERA:
  -0.00618554
   0.99981
   0

--- URDF Format Base to Camera ---
xyz="
rpy="
qxyzw="
</code>
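
To double-check a printout like the one above, the 4x4 ''BASE TO CAMERA'' matrix can be converted back into the ''xyz''/''rpy''/''qxyzw'' lines with SciPy. The identity matrix below is a placeholder; paste in the values printed by your run, and note that URDF's ''rpy'' uses fixed-axis x-y-z angles and SciPy quaternions are ordered ''(x, y, z, w)''.

<code python>
# Recompute the "URDF Format" lines from the raw 4x4 BASE TO CAMERA matrix.
import numpy as np
from scipy.spatial.transform import Rotation as R

T = np.eye(4)                    # placeholder -- replace with the printed 4x4 result
xyz = T[:3, 3]
rot = R.from_matrix(T[:3, :3])

print('xyz="%f %f %f"' % tuple(xyz))
print('rpy="%f %f %f"' % tuple(rot.as_euler("xyz")))   # fixed-axis roll, pitch, yaw
print('qxyzw="%f %f %f %f"' % tuple(rot.as_quat()))
</code>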
==== Verify that the calibration is correct ====