Two meetings have taken place this month: the first on the 7th and the second on the 14th.
A list of quality metrics was presented for discussion. The list was compiled based on @tyagikunal's post. The objective is to define a set of metrics that can be used to “make ROS packages quality visible”.
The following quality metrics were discussed (a rough sketch showing how a couple of the simpler ones could be computed follows the list):
- Build
- Unit Tests
- Unit Test Crash
- Unit Test Coverage [%]
- Code style violations
- Logic errors and warnings
- Cyclomatic complexity
- McCabe complexity
- Afferent coupling
- Efferent coupling
- Clang AddressSanitizer and LeakSanitizer
- Fuzz testing by “chaos node”
- Comment to code ratio
- Status
- README
- Wiki
- Getting Started
- Other resources
- Number of closed issues
- Time to close issue
- Activity on issues
- Other statuses (e.g. wont-fix)
- Number of open issues
- User rating (Star rating and feedback)
- Maintainability Index
- Depth of Inheritance
- Class Coupling
- Lines of Code
- Cyclic includes
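
As a rough illustration of how a couple of the simpler metrics could be automated, here is a minimal sketch that counts lines of code and the comment-to-code ratio for the Python sources of a package. The package path, the restriction to `.py` files, and the line-based counting are assumptions made purely for this example; C++ sources would need a different approach.

```python
#!/usr/bin/env python3
"""Minimal sketch: lines of code and comment-to-code ratio for a
Python-based package. The path and the .py filter are assumptions
for illustration only, not agreed tooling for the working group."""

from pathlib import Path


def count_lines(package_dir: str) -> dict:
    code, comments, blank = 0, 0, 0
    for source in Path(package_dir).rglob("*.py"):
        for line in source.read_text(encoding="utf-8", errors="ignore").splitlines():
            stripped = line.strip()
            if not stripped:
                blank += 1
            elif stripped.startswith("#"):
                comments += 1
            else:
                code += 1
    ratio = comments / code if code else 0.0
    return {
        "lines_of_code": code,
        "comment_lines": comments,
        "blank_lines": blank,
        "comment_to_code_ratio": round(ratio, 3),
    }


if __name__ == "__main__":
    # "my_ros_package" is a placeholder path for the example.
    print(count_lines("my_ros_package"))
```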
A decision has been made to put the list into a Google spreadsheet (the link to the spreadsheet) and share it with the group, so that people can contribute to developing the list.
We invite people to take the initiative and contribute to the development of the spreadsheet. Your contributions will make it happen!
Hi,
I would like to contribute to this working group in the future. Kudos for the initiative!
My 2 cents is that it would be useful to prioritize our quality metrics. A list of 29 elements might be intimidating for people who are making the transition from “it just works” to professionally written code.
I guess we can approach it with the 80/20 rule, i.e. 20% of these recommendations can already solve 80% of typical issues.
Clang sanitizers and unit tests, for example, are extremely powerful best practices, IMHO.
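To make that concrete, a minimal sketch of a first unit test is shown below; `clamp_velocity` is a hypothetical helper invented for the example, not an existing ROS API. For C++ nodes the equivalent step would be compiling the tests with Clang's `-fsanitize=address`, which also enables leak detection on most platforms, so AddressSanitizer and LeakSanitizer run alongside them.

```python
# Minimal pytest sketch; clamp_velocity is a hypothetical helper
# written for illustration, not part of any ROS package.
import pytest


def clamp_velocity(value: float, limit: float) -> float:
    """Clamp a commanded velocity to the range [-limit, +limit]."""
    if limit < 0:
        raise ValueError("limit must be non-negative")
    return max(-limit, min(limit, value))


def test_velocity_within_limit_is_unchanged():
    assert clamp_velocity(0.5, 1.0) == 0.5


def test_velocity_above_limit_is_clamped():
    assert clamp_velocity(2.0, 1.0) == 1.0


def test_negative_limit_is_rejected():
    # The kind of edge case fuzz testing tends to surface.
    with pytest.raises(ValueError):
        clamp_velocity(0.5, -1.0)
```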
Davide
I agree, it is an ambitious list. We should prioritize and take an iterative approach to the implementation. We should also think about the audience for these metrics, i.e. how they may interpret them.