Revising the gtest tutorial

I decided to bite into our gtest tutorial at wiki.ros.org/gtest. This looks like a larger job, as the tutorial is much less developed than the other top-level testing tutorial, which I already revised. I expect this will take several weeks - I will have to find time to try things out and fix what is outdated. Time is scarce, hence the slow pace.

I will start by moving it (and redirecting) to Quality/Tutorials/* and working in there. Of course, I will try to always leave a relatively usable version behind when I set it aside for a few days.

I do have some questions for more knowledgeable folks here:

  • What are wiki.ros.org/gtest/Troubleshooting and wiki.ros.org/gtest/Reviews? Both seem to be (almost) empty pages that the wiki knows about and asks whether they should be moved as well. They look like auto-generated content, and I wonder whether, if I move them, they will just be recreated. Or are they legacy pages that should simply be deleted? Delete, or move along?

  • The rosbuild part of the tutorial concerns me as well. I am too young a ROS user to be able to update it, and I don’t have much interest in learning rosbuild right now. What should we do about it? Can someone help by adjusting that part in about a month? Or should we drop maintaining it?


That’s great. While you’re looking at it, though, try to keep the tutorial focused on the ROS-specific aspects and link out to upstream gtest tutorials for the core gtest concepts instead of reproducing them.

The Troubleshooting and Reviews pages are not completely auto-generated, but they are auto-linked from the header if they are present, and there are templates for them if you create the page. The TroubleshootingTemplate just helps start a list with reasonable formatting.

The Reviews page is left over from the older QAProcess used at Willow Garage. It hasn’t really been used since then. Much of that process was used on the old wiki, and the Review pages were found to be slowing down the wiki and would often overshadow the formal documentation in search results. Much of that content predates the use of GitHub and other platforms with pull requests and review processes. It’s likely that all of the QAProcess should be cleaned up. As an example, the gtest Review page was made to propose creating a ROS package for gtest, but we’ve clearly chosen to use the upstream version now, so that proposal from 2009 is not particularly useful anymore and could be cleared out.

Since the Troubleshooting page had zero content I just deleted it.

We should not delete the rosbuild content, but I wouldn’t suggest spending much time to update it to new practices as we are not planning to spend time updating the rosbuild toolchain.

I’m exploring how to use ROS for physical integration testing, a mix of automated and manual testing on a real robot. Getting to know gtest/rostest from the documentation has not always been a smooth experience. Please let me know if you need any help creating the tutorials, as I would gladly contribute from what I have learned.

@floris Slightly off topic, but probably helpful:

As long as the performance of the test infrastructure is not critical for you (which is usually the case), and the effect of Python’s GIL on multi-threading is not an issue (also usually the case) or can be worked around with multiprocessing (unfortunately limited by the machine’s number of “real” processors), I would recommend using pytest for “physical integration testing” instead. As it is a Python-only framework, it integrates far better with other tools and infrastructure.
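To make the suggestion concrete, here is a minimal sketch of what such a pytest-style check could look like. Everything here is invented for illustration (`RobotStub` and its readings are placeholders); in a real setup the stub would wrap a ROS topic or service client that reads live telemetry.

```python
class RobotStub:
    """Hypothetical stand-in for a connection to a real robot."""

    def battery_voltage(self):
        # A real implementation would query the robot's telemetry here.
        return 24.1

    def is_emergency_stopped(self):
        return False


# pytest auto-discovers functions named test_* in files named test_*.py
# and rewrites plain `assert` statements to report the failing expression.
def test_battery_within_operating_range():
    robot = RobotStub()
    assert 22.0 <= robot.battery_voltage() <= 25.5


def test_not_emergency_stopped():
    robot = RobotStub()
    assert not RobotStub().is_emergency_stopped()
```

Because pytest collects plain functions and plain asserts, there is no framework boilerplate between the test and the hardware check, which is part of why it integrates well with other tooling.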

So far I consider gtest and rostest suitable only for “ROS node/nodelet unit testing”. If several ROS nodes/nodelets are executed (in production) on a single machine, they can be suitable for “ROS node/nodelet integration testing” as well. But as soon as ROS nodes are executed (in production) on physically separated machines w.r.t. the application context, gtest and rostest are a rather bad fit (because they are “too far” from production).

However, I am open to productive disagreement/feedback :slight_smile:

Thanks for your advice. I agree that, when possible, writing these integration tests in a language such as Python will be much quicker than using C++/gtest.

In the context of our (research) project, we want to create an (ideally language-agnostic) framework for specifying tests with pre-conditions, invariants, and post-conditions. These conditions would be verified using both intrinsic data from the robot and data from an external sensing system such as a laser scanner, RGB-D camera, or mocap. It’s likely that neither pytest nor gtest will be of much use, as we are interested in contributing towards better tools within ROS for constructing these kinds of tests.
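For readers curious what a pre-condition/invariant/post-condition test might look like, here is a rough, framework-independent sketch. All names and the snapshot format are invented for illustration; a real implementation would feed in fused data from the robot and the external sensing system rather than hand-written dictionaries.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# A condition is a predicate over one snapshot of the observed system state
# (here just a dict; in practice a fused robot/sensor state estimate).
Condition = Callable[[Dict], bool]


@dataclass
class ScenarioTest:
    preconditions: List[Condition] = field(default_factory=list)
    invariants: List[Condition] = field(default_factory=list)
    postconditions: List[Condition] = field(default_factory=list)

    def run(self, snapshots: List[Dict]) -> bool:
        # Pre-conditions are checked on the first snapshot, invariants on
        # every snapshot, post-conditions on the last one.
        if not all(c(snapshots[0]) for c in self.preconditions):
            return False
        if not all(inv(s) for s in snapshots for inv in self.invariants):
            return False
        return all(c(snapshots[-1]) for c in self.postconditions)


# Example: the robot must start docked, stay inside a 10 m workspace
# throughout the run, and end within 0.1 m of the goal at x = 5.0.
scenario = ScenarioTest(
    preconditions=[lambda s: s["docked"]],
    invariants=[lambda s: 0.0 <= s["x"] <= 10.0],
    postconditions=[lambda s: abs(s["x"] - 5.0) < 0.1],
)

snapshots = [
    {"docked": True, "x": 0.0},
    {"docked": False, "x": 2.5},
    {"docked": False, "x": 5.0},
]
print(scenario.run(snapshots))  # → True
```

The separation into three predicate lists is the point of the sketch: the checking machinery stays generic while each scenario only supplies its conditions, which is what would make such a framework language-agnostic in spirit.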

That sounds cool. I wish you good project outcomes.