hardware:xsens

NOTE: Site under construction

Required Software

  • rviz
  • knowrob
  • knowrob_addons
  • openni2
  • Xsens MVN studio

Required Hardware

  • Xsens laptop
  • Xsens suit
  • Kinect

Preparation

  • Recharge the batteries and the spare batteries
  • Make sure to set up streaming for only one client

Running the software

  • Start RViz
$ rosrun rviz rviz
  • For calibration, define the TF between `map` and `mocap`
$ rosrun comp_mocap tf_dynamic_transform.py
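Since the contents of `tf_dynamic_transform.py` are not shown here, the following is only a minimal sketch of the kind of `map`→`mocap` transform such a node publishes, assuming a planar offset plus a rotation about the vertical axis (function names are illustrative, not from the script):

```python
import math

def make_map_to_mocap(x, y, z, yaw):
    """Build a 4x4 homogeneous transform: translation (x, y, z)
    plus a rotation by `yaw` about the vertical (z) axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply_transform(T, p):
    """Apply transform T to a 3D point p, returning the mapped 3D point."""
    v = p + [1.0]  # homogeneous coordinates
    return [sum(T[i][j] * v[j] for j in range(4)) for i in range(3)]
```

In a real node this matrix (or the equivalent translation/quaternion pair) would be published on `/tf` between the `map` and `mocap` frames.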
  • Receive mocap data on port 9763 (first modify the IP in the script to match the IP of your computer)
$ rosrun comp_mocap xsens_tf_broadcaster.py
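The MVN network streamer sends UDP datagrams whose payload begins with a 6-byte ASCII identifier such as `MXTP02` (header layout assumed from the MVN network-streamer documentation; `xsens_tf_broadcaster.py` presumably does something like this before publishing TF). A minimal, ROS-independent receiver sketch:

```python
import socket

XSENS_PORT = 9763  # MVN network streamer port used in the step above

def parse_header(datagram):
    """Return the two-character MVN message type (e.g. '02' for pose
    data) if the datagram starts with the assumed 'MXTP' ID, else None."""
    if len(datagram) < 6 or not datagram.startswith(b'MXTP'):
        return None
    return datagram[4:6].decode('ascii')

def listen():
    """Print the sender and message type of each incoming datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', XSENS_PORT))
    while True:
        data, addr = sock.recvfrom(4096)
        print(addr, parse_header(data))
```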
  • Publish marker messages for the human skeleton with the TF root set to `mocap`
$ rosrun comp_mocap mocap_marker.py --skeleton xsens --root-frame mocap
  • Start Xsens MVN studio on the Xsens laptop
  1. Make sure to attach the USB stick with the license
  2. Go to Options/Preferences/Network streamer
  3. Add your computer's IP to the destination addresses
  • Set up a Kinect for capturing RGB images
  1. Attach the Kinect to a tripod and connect it to your computer via USB
  2. Start Publishing
$ roslaunch openni2_launch openni2.launch
  • Configure the Kinect camera
  1. Start rqt_reconfigure
$ rosrun rqt_reconfigure rqt_reconfigure
  2. Go to camera/driver
  3. Check depth_registration if you need depth info
  • Spawn the semantic map in RViz via KnowRob
  1. Start knowrob_vis
$ roslaunch knowrob_vis knowrob_vis.launch
  2. In Firefox, go to http://127.0.0.1:1111/ and spawn the map by entering the following queries in the console
$ owl_parse('package://iai_semantic_maps/owl/room.owl').
$ register_ros_package(knowrob_objects).
$ owl_individual_of(A, knowrob:'SemanticEnvironmentMap'), !, add_object_with_children(A).
  3. Leave the knowrob server running

Dressing the suit

TODO

Calibration

TODO

Recording

$ rosbag record --duration=3 --output-name=$1 /camera/rgb/camera_info /camera/rgb/image_raw /camera/depth/camera_info /camera/depth/image /camera/depth_registered/image_raw /tf
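To keep bag names and the topic list consistent across trials, the record command can be assembled in a small helper (a sketch only; `rosbag_cmd` and `TOPICS` are illustrative names, and `rosbag` itself must be available when the command is actually run):

```python
import subprocess

# Topics from the record command above
TOPICS = [
    "/camera/rgb/camera_info", "/camera/rgb/image_raw",
    "/camera/depth/camera_info", "/camera/depth/image",
    "/camera/depth_registered/image_raw", "/tf",
]

def rosbag_cmd(name, duration=3):
    """Build the rosbag record command line for one trial."""
    return (["rosbag", "record",
             "--duration={}".format(duration),
             "--output-name={}".format(name)] + TOPICS)

def record(name):
    """Run the recording (requires a running ROS master)."""
    subprocess.check_call(rosbag_cmd(name))
```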

hardware/xsens.txt · Last modified: 2016/05/19 09:19 (external edit)