ROS Quality Assurance Working Group meeting minutes - 9th of Feb. 2018 Meeting
Time: 5 p.m. UTC

Participants:

  1. Adam Alami
  2. Aaditya Saraiya
  3. Andrzej Wasowski
  4. David Bensoussan
  5. Dejan Pangercic
  6. Dirk Thomas
  7. Gijs vd. Hoorn
  8. Jihoon Lee
  9. Jose Luis Rivero
  10. Kunal Tyagi
  11. Matt Droter
  12. Matt Robinson
  13. Shaun Edwards
  14. Shawn Schaerer
  15. Thomas Denewiler
  16. Victor Lopez

Agenda:

  1. The result of the votes on ROSIN quality assurance initiatives
  2. “Make ROS packages’ quality visible”

Notes:

  1. The result of the vote:

    1. The Quality Assurance group members were asked to vote on the priorities of the ROSIN initiatives. The result of the vote was presented and discussed.

    2. The ROSIN initiatives were ranked as follows:

      1. Make ROS packages’ quality visible
      2. Appoint ownership for ROSIN QA initiatives
      3. Energize the code review process
      4. Implement a code scanning method and tool
      5. Maintenance issues
      6. Continuous Integration
      7. Quality hub website
      8. Formalize the code ownership process
      9. Onboarding process for core and non-core community members
      10. Model in-the-loop testing
      11. Implement a continuous improvement process
      12. Automated unit test generation
      13. Quality discourse
      14. QA promotion events
      15. Model Driven Development (MDD)
      16. #ROSQA
    3. Discussion:

      1. Actionable initiatives received higher priorities in the vote. The message seems to be “Let’s get broken things fixed first.” Some group members were disappointed that MDD received a lower score. The direction seems to be to “make what we have now visible” first, then improve the process.
      2. Each of these initiatives still needs to be defined in detail.
      3. Making packages’ quality visible is something that the community has been requesting for a while.
      4. The community has a split opinion regarding MDD. Some are passionate about it, while others have little experience with it.
      5. The priorities may change once the quality of packages becomes visible.
      6. Going forward, we will be discussing these initiatives one at a time:
        1. Each group meeting will be dedicated to a particular initiative, in order of priority. The working group will first discuss the community’s perspective on the initiative, then its implementation (how and who).
  2. Quality Stamp: Making ROS package quality visible

    1. Discussion:
      1. The current CI badge was discussed. Currently, three badges are displayed, depending on the state of the package (the way it is registered with the ROS build farm). The build farm generates test statistics files that the wiki uses to generate the badges (example); a sketch of how such statistics could drive a badge appears after this list.
      2. Other perspectives:
        1. The ability to display test failures together with the details of each failed test. If there are failures, what type of failure is it? If the tests succeed, what was actually tested?
        2. There is a plan to display the results of a code scanner.
        3. For someone unfamiliar with the package (other than the author), how can she or he assess the magnitude of the failure? It may be obvious to the maintainer, but to an external user, it may not make sense.
        4. Display test coverage not only for the CI service but also for other types of tests (e.g., unit tests, integration tests, MIL, HIL). This will show how much of the code is covered.
        5. Display a link to a static analysis tool run. This will provide visibility into the internal code quality. HAROS was discussed as a candidate for static analysis.
        6. Generate statistics about the package, for example, the number of issues logged against it and how long each issue has been open. These metrics are available through GitHub; see the API sketch after this list.
        7. What metrics do we have now? What other metrics can we display? Which of these can we automate and make visible? Generate a list of available metrics and analyze whether each can serve as a quality indicator.
        8. The “quality stamp” puts the user in a position to make a judgement call based on what has been made available.
        9. The ability to record a “soft review” would let community members review and comment on packages’ quality, since metrics alone may not always be the best way to present the quality of a particular package. User feedback along the lines of Amazon’s reviews could be helpful.
        10. Add information about memory leaks, since packages sometimes have memory leak issues. Users interested in robustness will look at metrics such as memory leaks and segfaults.
        11. Different users have different needs. New users of ROS may be more interested in documentation, while an industry user may be more interested in the robustness of a package.
        12. Documentation coverage. Currently, the wiki badge states “Documented,” but it does not say how much of the code is actually documented. Displaying the percentage of documented code would help new users. Tools exist to compare the level of documentation against the number of functions available in the code; a small illustration follows this list.
        13. The decision is to document the currently available metrics and possible future metrics, and to discuss them in the next group meeting. As a group, we will decide which ones are valuable and prioritize their implementation in the quality stamp.
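
On item 1: since the badges are derived from the build farm’s test statistics files, a quality stamp could summarize them directly. Below is a minimal Python sketch, assuming the statistics are exported in the common JUnit-style XML format; the file name and badge labels are illustrative, not the wiki’s actual scheme.

```python
import xml.etree.ElementTree as ET

def badge_from_junit(path):
    """Derive a badge label from a JUnit-style test results file."""
    root = ET.parse(path).getroot()
    # A results file may have a <testsuites> wrapper or a single <testsuite>.
    suites = root.findall("testsuite") if root.tag == "testsuites" else [root]
    tests = sum(int(s.get("tests", "0")) for s in suites)
    failed = sum(int(s.get("failures", "0")) + int(s.get("errors", "0"))
                 for s in suites)
    if tests == 0:
        return "no tests"       # nothing ran: flag it rather than hide it
    if failed == 0:
        return "tests passing"
    return "%d/%d tests failing" % (failed, tests)

# Hypothetical file name; the build farm's actual artifact layout may differ.
print(badge_from_junit("test_results.xml"))
```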
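On item 6: the issue statistics mentioned there are retrievable from GitHub’s public REST API. A minimal sketch using only documented v3 endpoints and fields; the repository named at the bottom is just an example.

```python
import json
import urllib.request
from datetime import datetime, timezone

def open_issue_ages(owner, repo):
    """Return the ages (in days) of a repository's open issues.

    Note: GitHub's /issues endpoint also returns pull requests, which
    carry a `pull_request` key and are filtered out here. Only the
    first page (up to 100 items) is fetched; pagination is omitted
    for brevity.
    """
    url = ("https://api.github.com/repos/%s/%s/issues"
           "?state=open&per_page=100" % (owner, repo))
    with urllib.request.urlopen(url) as resp:
        items = json.load(resp)
    issues = [i for i in items if "pull_request" not in i]
    now = datetime.now(timezone.utc)
    return [
        (now - datetime.strptime(i["created_at"], "%Y-%m-%dT%H:%M:%SZ")
               .replace(tzinfo=timezone.utc)).days
        for i in issues
    ]

ages = open_issue_ages("ros", "ros_comm")  # example repository
if ages:
    print("%d open issues; oldest is %d days old" % (len(ages), max(ages)))
```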
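On item 12: one way to measure documentation against the number of functions available is to count definitions that carry a docstring. The sketch below does this for Python sources via the standard `ast` module; it illustrates the idea and is not a proposal for the actual tooling (tools such as Doxygen do the equivalent for C++).

```python
import ast

def doc_coverage(path):
    """Percentage of function/class definitions that carry a docstring."""
    with open(path) as f:
        tree = ast.parse(f.read())
    # Collect every function, async function, and class definition.
    defs = [node for node in ast.walk(tree)
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef,
                                 ast.ClassDef))]
    if not defs:
        return 100.0
    documented = sum(1 for node in defs if ast.get_docstring(node))
    return 100.0 * documented / len(defs)

# Hypothetical file name, for illustration only.
print("%.0f%% of definitions documented" % doc_coverage("my_node.py"))
```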