Energy-Aware Planetary Navigation Dataset with ROS Bag parsing tools

Other lab members and I have collected a dataset focused on energy-aware navigation and multi-view visual sensing in a simulated planetary environment at the Canadian Space Agency. We used a tele-operated ClearPath Husky equipped with a custom sensor suite. The dataset includes driving power consumption, solar irradiance, omnidirectional stereo colour imagery, front-facing high-resolution single-channel imagery, inertial measurements, wheel odometry, standard GPS, and post-processed geo-referenced full-pose estimates (from combined GPS, visual and inertial data). Finally, geo-referenced maps of the test environment (colour imagery, elevation, slope and aspect) are also included.
The data is available in both rosbag and human-readable formats. We wrote a data parsing script using the Python rosbag API and prepared simple launch scripts to easily play and visualize the data. Lastly, the geo-referenced maps were also integrated into our visualization tools using the popular grid_map ROS package.
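For anyone new to the Python rosbag API, here is a minimal sketch of what reading messages from one of the bags looks like. The bag file name and topic names below are placeholders for illustration only, not the actual names used in the dataset or in our released parsing script:

```python
# Minimal sketch: iterate over selected topics in a bag with the Python rosbag API.
# Bag path and topic names are placeholders; substitute the ones from the dataset.
import rosbag

bag_path = "example_run.bag"          # placeholder bag file
power_topic = "/power/consumption"    # placeholder topic name
gps_topic = "/gps/fix"                # placeholder topic name (sensor_msgs/NavSatFix)

with rosbag.Bag(bag_path, "r") as bag:
    # read_messages() yields (topic, message, timestamp) tuples in time order.
    for topic, msg, t in bag.read_messages(topics=[power_topic, gps_topic]):
        if topic == gps_topic:
            # NavSatFix messages carry latitude, longitude and altitude fields.
            print(t.to_sec(), msg.latitude, msg.longitude, msg.altitude)
        else:
            print(t.to_sec(), msg)
```

The released script follows the same pattern but writes each topic out to the human-readable format described above.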
Since we mainly relied on open-source ROS packages to collect the dataset (the standard ClearPath Husky packages, OCCAM Vision Group packages, PointGrey packages and more), those are not included in the released code.
Links:
Repository of the dataset
Web page of the dataset
Official Dataset Publication
