ROS Quality Assurance Working Group meeting minutes - March 2018 Meeting

  1. Two meetings took place this month: the first on March 7th and the second on March 14th.

  2. A list of quality metrics was presented for discussion. The list was compiled based on @tyagikunal's post. The objective is to define a set of metrics to use in order to “make ROS packages' quality visible”.

  3. The following quality metrics were discussed (a short worked example of the complexity metrics follows these minutes):

    1. Build
    2. Unit Tests
    3. Unit Test Crash
    4. Unit Test Coverage [%]
    5. Code style violations
    6. Logic errors and warnings
    7. Cyclomatic complexity
    8. McCabe complexity
    9. Afferent coupling
    10. Efferent coupling
    11. Clang AddressSanitizer and LeakSanitizer
    12. Fuzz testing by “chaos node”
    13. Comment to code ratio
    14. Status
    15. README
    16. Wiki
    17. Getting Started
    18. Other resources
    19. Number of closed issues
    20. Time to close issue
    21. Activity on issues
    22. Other status (e.g., wont-fix)
    23. Number of open issues
    24. User rating (Star rating and feedback)
    25. Maintainability Index
    26. Depth of Inheritance
    27. Class Coupling
    28. Lines of Code
    29. Cyclic includes
  4. A decision was made to put the list into a Google document (the link to the spreadsheet) and share it with the group so that people can contribute to developing the list.

  5. We invite people to take the initiative and contribute to the development of the spreadsheet. Your contributions will make it happen!
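
As an aside on items 7 and 8: cyclomatic complexity and McCabe complexity are the same metric, so the list effectively counts it twice. To make the metric concrete, here is a small hypothetical C++ function annotated the way a typical analyzer would count it (one plus the number of decision points: if, for, while, case, `&&`, `||`, `?:`):

```cpp
// Hypothetical function, used only to illustrate how cyclomatic
// (McCabe) complexity is counted.
int classify_reading(int value, bool calibrated) {
  if (!calibrated) {                 // decision 1
    return -1;
  }
  if (value < 0 || value > 100) {    // decisions 2 and 3 (the `||` counts)
    return -1;
  }
  int score = 0;
  for (int i = 0; i < value; ++i) {  // decision 4
    score += (i % 2 == 0) ? 2 : 1;   // decision 5 (the ternary counts)
  }
  return score;
}
// Cyclomatic complexity = 5 decision points + 1 = 6.
// Many tools flag functions above roughly 10-15.
```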

Hi,

I would like to contribute to this working group in the future. Kudos for the initiative!

My two cents: it would be useful to prioritize our quality metrics. A list of 29 elements might be intimidating for people who are making the transition from “it just works” to professionally written code.

I guess we can approach it with the 80/20 rule, i.e., 20% of these recommendations can already solve 80% of typical issues.

Clang sanitizers and unit tests, for example, are extremely powerful best practices, IMHO.
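
To make that concrete, here is a minimal sketch of why the combination pays off, assuming GoogleTest and clang; `make_counter` is a made-up function used only for illustration:

```cpp
#include <gtest/gtest.h>

// Hypothetical function under test; the allocation is never freed.
int* make_counter() { return new int(0); }

TEST(SanitizerDemo, LeakIsReportedEvenWhenAssertionsPass) {
  int* counter = make_counter();
  EXPECT_EQ(*counter, 0);
  // No `delete counter;` here. Built normally (and linked against
  // gtest_main), this test passes and the leak goes unnoticed. Built
  // with `clang++ -fsanitize=address`, which enables LeakSanitizer by
  // default on Linux, the very same run fails with a leak report.
}
```

The point is that the sanitizer turns silent memory bugs into hard test failures, so the existing unit test suite starts catching a whole class of defects for free.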

Davide

I agree, it is an ambitious list. We should prioritize and take an iterative approach to the implementation. We should also think about the audience for these metrics, i.e., how will they interpret them?