Table of Contents

RosKnect: ROS and Kinect on Raspberry Pi

Authors

Javier Acosta

Josselin Porras

Description

This project seeks to integrate a Kinect into a computer network using free software. This requires two modules programmed in Python: one that lets YARP use the Kinect drivers, and another that handles communication between YARP and ROS.

The hardware used for this purpose includes a Kinect, a Raspberry Pi ® and a computer. The Raspberry Pi ® runs a suitable operating system with YARP installed, along with the modules needed for Kinect integration; the computer runs an adequate Debian version together with ROS, YARP and rviz.

The following is a brief description of the different tools:

Objectives

General Objective

Specific Objectives

Getting Started

For this section we followed these tutorials:

http://wiki.icub.org/yarpdoc/yarp_with_ros.html

http://wiki.icub.org/wiki/YARP_ROS_Interoperation

Working on the PC

To start, we need a proper installation of ROS on the computer (consult http://wiki.ros.org/groovy/Installation) and of YARP on the Raspberry Pi.

When configuring YARP, turn on the “CREATE_OPTIONAL_CARRIERS” and “CREATE_IDLS” options.

Make sure YARP is compiled with the following carriers enabled:

* ENABLE_yarpcar_tcpros_carrier (the usual protocol for topics)

* ENABLE_yarpcar_rossrv_carrier (tcpros with minor tweaks for services)

* ENABLE_yarpcar_xmlrpc_carrier (used by ros nameserver and node interfaces)

* ENABLE_yarpidl_rosmsg (a program to convert .msg/.srv files into YARP-usable form)

YARP does not link against any part of the ROS codebase, so don't worry about how to tell YARP where to find the ROS libraries; it doesn't need them.
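Instead, the tcpros carrier implements ROS's wire protocol directly. As a rough, self-contained illustration of the framing that carrier handles (this is not YARP code; the function name is ours): after the connection handshake, each TCPROS message is a 4-byte little-endian length prefix followed by the serialized fields, and a ROS string field is itself length-prefixed.

```python
import struct

def serialize_ros_string(text):
    """Serialize a std_msgs/String payload as it appears on a TCPROS stream:
    a 4-byte little-endian message-length prefix, followed by the serialized
    fields; the single string field is itself length-prefixed."""
    data = text.encode("utf-8")
    payload = struct.pack("<I", len(data)) + data     # the 'data' field
    return struct.pack("<I", len(payload)) + payload  # message length prefix

frame = serialize_ros_string("hello")
print(len(frame))  # 13 bytes: 4 (msg len) + 4 (field len) + 5 (text)
```

This framing is what lets a YARP port with the tcpros carrier exchange bytes with an unmodified ROS node.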

First step

Both ROS and YARP use a specialized name server, but it is possible to share a single name server between them:

  $ export ROS_MASTER_URI=http://localhost:11311
  $ roscore
  $ yarp detect --ros --write
  Trying ROS_MASTER_URI=http://127.0.0.1:11311/
  Reachable. Writing.
  Configuration stored. Testing.
  Looking for name server on 127.0.0.1, port number 11311
  If there is a long delay, try:
    yarp conf --clean
  ROS Name server /ros is available at ip 127.0.0.1 port 11311

  $ yarp check

Now we can establish YARP-ROS interoperation.

In general, connections to topics can be made and broken using “yarp connect” and “yarp disconnect”.

Sending Strings

YARP ports can publish or subscribe to ROS topics. For example, suppose a ROS node is listening on a ROS topic called “/chatter”. To send strings to it using “yarp write”, we could do:

  $ rosrun roscpp_tutorials listener
  $ yarp write /chatter@/yarp_writer

Sending/receiving simple messages from code

For simple cases, we can just use YARP Bottles whose content matches ROS types.

It is necessary to create a CMakeLists.txt file to compile the code and link it with YARP.

For example, this code:

 #include <stdio.h>
 #include <stdlib.h>
 #include <yarp/os/all.h>
 using namespace yarp::os;
 int main(int argc, char *argv[]) {
     if (argc != 3) return 1;  // expect two integer arguments
     Network yarp;
     RpcClient client;
     // contact the ROS service "/add_two_ints" as node "/yarp_add_int_client"
     if (!client.open("/add_two_ints@/yarp_add_int_client")) return 1;
     Bottle msg, reply;
     msg.addInt(atoi(argv[1]));
     msg.addInt(atoi(argv[2]));
     if (!client.write(msg, reply)) return 1;
     printf("Got %d\n", reply.get(0).asInt());
     return 0;
 }
and the corresponding CMakeLists.txt:

 cmake_minimum_required(VERSION 2.8.7)
 find_package(YARP REQUIRED)
 include_directories(${YARP_INCLUDE_DIRS})
 add_executable(add_int_client_v1 add_int_client_v1.cpp)
 target_link_libraries(add_int_client_v1 ${YARP_LIBRARIES})
To test it, start the ROS tutorial server and then run the client:

  $ rosrun rospy_tutorials add_two_ints_server
  $ ./add_int_client_v1 4 6
  yarp: Port /yarp_add_int_client active at tcp://192.168.1.2:35731
  yarp: Port /add_two_ints+1@/yarp_add_int_client active at tcp://192.168.1.2:35004
  Got 10
  

Working on the Raspberry Pi

To start, we need a proper installation of YARP on the Raspberry Pi (consult https://arcoslab.eie.ucr.ac.cr/dokuwiki/doku.php?id=installing_yarp_in_debian).

Installing OpenNI2

$ sudo apt-get update
$ sudo apt-get install g++ python libusb-1.0-0-dev freeglut3-dev doxygen graphviz openjdk-6-jdk libudev-dev 
$ cd
$ git clone git://github.com/OpenNI/OpenNI2.git 
$ cd OpenNI2

Building OpenNI2 requires changes to a few of its files.

$ nano OpenNI2/ThirdParty/PSCommon/BuildSystem/Platform.Arm

In this file, change the line:

CFLAGS += -march=armv7-a -mtune=cortex-a8 -mfpu=neon -mfloat-abi=softfp #-mcpu=cortex-a8

to:

CFLAGS += -mtune=arm1176jzf-s -mfpu=vfp -mfloat-abi=hard

and comment out this other line:

DEFINES += XN_NEON

so that it reads:

# DEFINES += XN_NEON
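After these edits, the relevant part of Platform.Arm should look roughly like this (a sketch for the original Raspberry Pi's ARM1176 core on a hard-float Raspbian toolchain):

```make
# OpenNI2/ThirdParty/PSCommon/BuildSystem/Platform.Arm (after editing)
# Cortex-A8/NEON flags replaced with flags for the Pi's ARM1176 core:
CFLAGS += -mtune=arm1176jzf-s -mfpu=vfp -mfloat-abi=hard
# NEON is not available on the Pi's ARMv6 core, so this stays commented out:
# DEFINES += XN_NEON
```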

Now proceed to build OpenNI2 for ARM:

$ cd OpenNI2/Packaging
$ python ReleaseVersion.py Arm 
$ cd Final/ 
$ tar -jxvf OpenNI-Linux-Arm-2.2.tar.bz2 
$ cd OpenNI-Linux-Arm-2.2 
$ sudo sh install.sh

To test that the installation was successful, you can build and run one of the samples:

$ cd ~/OpenNI2/Samples/SimpleRead
$ make
$ cd ~/OpenNI2/Bin/Arm-Release
$ ./SimpleRead

Installing OpenCV

The Raspberry Pi doesn't support OpenGL, so to view images from the Xtion we need to install OpenCV or OpenGL ES.

$ sudo apt-get install libcv-dev libcv2.3 libopencv-highgui-dev libopencv-highgui2.3 libopencv-contrib-dev

Now we can run a “simpleviewer”.

$ cd ~/OpenNI2/Samples/SimpleViewer
$ rm main.cpp
$ touch main.cpp
$ emacs main.cpp

Copy the following code into main.cpp:

#include <OpenNI.h>
#include <opencv2/opencv.hpp>
#include <exception>
#include <vector>

int main()
{
    try {
        openni::OpenNI::initialize();
        openni::Device device;
        int ret = device.open(openni::ANY_DEVICE);
        if (ret != openni::STATUS_OK) {
            return 0;
        }

        openni::VideoStream stream;
        // If you want, you can change SENSOR_COLOR to SENSOR_IR or SENSOR_DEPTH.
        stream.create(device, openni::SENSOR_COLOR);
        stream.start();

        cvNamedWindow("Image", CV_WINDOW_AUTOSIZE);
        cvResizeWindow("Image", 800, 600);
        std::vector<openni::VideoStream*> streams;
        streams.push_back(&stream);
        cv::Mat image;

        while (1) {
            int changedIndex;
            openni::OpenNI::waitForAnyStream(&streams[0], streams.size(), &changedIndex);
            if (changedIndex == 0) {
                openni::VideoFrameRef colorFrame;
                stream.readFrame(&colorFrame);
                if (colorFrame.isValid()) {
                    // If you use SENSOR_DEPTH or SENSOR_IR, change CV_8UC3 to
                    // CV_16U and uncomment the convertTo() line below.
                    image = cv::Mat(stream.getVideoMode().getResolutionY(),
                                    stream.getVideoMode().getResolutionX(),
                                    CV_8UC3, (char*)colorFrame.getData());
                    //image.convertTo(image, CV_8U);
                    cv::imshow("Image", image);
                }
            }

            int key = cv::waitKey(10);
            if (key == 'q') {
                break;
            }
        }
    }
    catch (std::exception&) {
    }

    return 0;
}

To compile this program, edit the corresponding Makefile and then follow the same instructions used above to compile and run SimpleRead.
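Since the sample now depends on OpenCV, its Makefile has to pull in the OpenCV headers and libraries. A minimal sketch of the extra lines, assuming OpenCV's pkg-config file is installed (variable names follow the pattern used by the OpenNI2 sample Makefiles):

```make
# Additions to OpenNI2/Samples/SimpleViewer/Makefile (sketch):
# pull in OpenCV compile and link flags via pkg-config
CFLAGS  += $(shell pkg-config --cflags opencv)
LDFLAGS += $(shell pkg-config --libs opencv)
```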

Results

In general, we were able to obtain communication between YARP and ROS, but this interoperation only managed string data flow; we were unable to transmit streaming video. However, an alternative worth exploring is “yarprosbridge”, a new tool under development for YARP-ROS interoperation. We tried to use this tool, but it needs some packages that caused trouble (it requires permission from the author, and some of the configuration is not clear); these issues could perhaps be solved by asking the developer. Try this: http://wiki.icub.org/wiki/UPMC_iCub_project/YARP_ROS_bridge

Conclusions

References

[1] http://wiki.icub.org/yarpdoc/an

[2] http://wiki.icub.org/yarpdoc/yarp_with_ros.html

[3] http://openkinect.org/wiki/Main_Page

[4] http://wiki.ros.org/

[5] http://wiki.ros.org/kinect

[6] http://wiki.ros.org/ROSberryPi/Setting%20up%20ROS%20on%20RaspberryPi

[7] http://wiki.ros.org/rviz

[8] http://www.python.org/

[9] http://www.raspberrypi.org/

[10] http://www.raspberrypi.org/wp-content/uploads/2012/04/quick-start-guide-v2_1.pdf