I would like to contribute my two cents on the ROS2 build tools and systems. Despite the ample and elaborate documentation on the build tools, I, like many others, fail to grasp the “why” of all this build tool/system (re)engineering. For example, although it is a different project under the ROS2 umbrella, if one wants to build micro-ROS (ROS2 for embedded devices), the recommended and documented approach is to use the ROS2 build tools (colcon and co.), which I find to be ridiculous overkill.
It really does not make sense that a researcher, a robotics engineer, or even a student should waste time learning colcon when learning CMake is far more robust and transferable to other projects. Shoving a custom “general build tool” down the throat of everybody who wants to use ROS2 will seriously impact the ROS2 adoption rate.
It is really annoying that one has to go through numerous bash scripts and colcon-related files to build a library primarily written in C/C++. The argument for colcon as a general build tool is that it “enables” a developer to build heterogeneous packages that are written in different source languages and set up to use different build tools. Well, how many ROS2 developers work on projects that actually require heterogeneous build systems and languages? If I am not mistaken, most devs in robotics focus on C/C++, and I wonder why this large segment of devs is forced to use a custom “global build tool” that encapsulates the CMake build system. I just don’t get it.
What is the alternative (eh, standard) way you suggest for building a workspace consisting of multiple packages? Using plain CMake, I think you’d end up with something similarly ridiculous to the Gazebo Classic 11 dependencies install: Gazebo : Tutorial : Dependencies from source . That is, five commands to build a single package, and with the build system knowing nothing about the dependencies between packages, so if you make changes in several of them, you are the one who has to know the order in which to rebuild them. Colcon/catkin do this for you. It seems to me that CMake is good for building isolated packages, but not whole bunches of them. And ROS is based on the concept of using hundreds of packages; even the most DIY hobbyists usually have lower tens of packages in their workspace (e.g. robot drivers, a few packages without a binary release, a few packages they needed to modify).
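To make the comparison concrete: without colcon, a multi-package workspace in plain CMake typically turns into a hand-maintained “superbuild” in which every inter-package dependency must be spelled out manually. A minimal sketch, with entirely hypothetical package names:

```cmake
# Hypothetical "superbuild" hand-encoding what colcon infers from
# each package.xml: the build ordering between workspace packages.
cmake_minimum_required(VERSION 3.16)
project(workspace_superbuild NONE)
include(ExternalProject)

ExternalProject_Add(my_msgs
  SOURCE_DIR  ${CMAKE_SOURCE_DIR}/src/my_msgs
  INSTALL_DIR ${CMAKE_BINARY_DIR}/install
  CMAKE_ARGS  -DCMAKE_INSTALL_PREFIX=<INSTALL_DIR>)

ExternalProject_Add(my_driver
  SOURCE_DIR  ${CMAKE_SOURCE_DIR}/src/my_driver
  INSTALL_DIR ${CMAKE_BINARY_DIR}/install
  CMAKE_ARGS  -DCMAKE_INSTALL_PREFIX=<INSTALL_DIR>
  DEPENDS my_msgs)  # the ordering you must now maintain by hand
```

Every new package means editing this file and keeping the DEPENDS lines correct by hand, which is exactly the bookkeeping that colcon derives automatically from the package manifests.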
The same applies to the usage of package.xml and rosdep. Gazebo again provides an example of “the world without package.xml” for managing dependencies:
Such a command looks dangerous, works on exactly one version of Ubuntu, and only on Ubuntu. Compare that to rosdep install --from-paths src, which can detect your OS and adapt the actual binary packages that get installed.
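For readers who haven’t used it: rosdep reads the dependency keys declared in each package’s package.xml and maps them to platform-specific binary packages via the rosdistro database. A minimal sketch, with a made-up package name and maintainer:

```xml
<?xml version="1.0"?>
<package format="3">
  <name>my_robot_driver</name>
  <version>0.1.0</version>
  <description>Example dependency declarations for rosdep</description>
  <maintainer email="dev@example.com">Dev</maintainer>
  <license>Apache-2.0</license>

  <buildtool_depend>ament_cmake</buildtool_depend>
  <!-- "eigen" is a rosdep key, not a distro package name; rosdep
       resolves it to the right binary package for your platform
       (e.g. libeigen3-dev on Ubuntu). -->
  <depend>eigen</depend>
  <depend>rclcpp</depend>
</package>
```

The key “eigen” resolves to a differently named package on each distribution, which is precisely what a hard-coded apt-get line cannot do.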
So, in general, as long as you develop a single package, you’re free to use standard tooling and it will work for you. But as soon as you add more packages, the need for something better naturally arises. Managing builds with custom bash scripts is even worse than managing them with non-standard tools.
PS: I’m not sure about the state of alternative build systems. I remember bazel was almost unusable for multi-package workspaces some time ago. Not sure about Meson.
PPS: I never understood why Gazebo did not adopt the ROS way of managing dependencies. All source install tutorials look ridiculous compared to ROS. At least colcon became the standard for building the workspace, but without rosdep, it’s only half useful.
I understand your point of view: starting to use ROS2 requires a lot of learning, and the build system may feel like “one more thing” that makes this even more challenging.
But, if I am being honest with you, I think you don’t fully understand the problem you are talking about.
The original rosbuild/catkin build system solved a problem that the entire C++ community had (have?): dependency management.
Nowadays, that community is finally addressing this with conan, vcpkg and some other interesting approaches (I am keeping an eye on https://prefix.dev/).
ROS, in that sense, was ahead of its time.
A fair question is: why can’t we use those C++ package managers, instead?
I tried, and let me tell you that it is not fun, nor can you solve with them the same class of problems that colcon solves.
When you stop using colcon (I had to, in one project) is when you start appreciating how much simpler it is compared with the “alternatives”.
I’ve done quite a lot with micro-ROS recently, and I also find the build system cumbersome, though I agree it does package and dependency management well. When a project targets micro-ROS firmware, one ends up having to create separate projects with shared interface packages, as I can’t see how to get the colcon environment to support multiple architecture builds in one place. This means I am doing manual dependency management between the two (or more) projects.
Ideally the build process should be able to build multiple architecture targets within the same build environment. It should also be easy to install on a Mac, but that’s a different thread ;)
Developing for the RP2040, I also then have to build the firmware with the CMake system outside of colcon’s control. That isn’t true for all microcontrollers, but for the RP2040 it certainly becomes a more involved build process than the “cmake …; make install” process I am used to for a microcontroller project, as I first need to use colcon to build the library.
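For illustration, the firmware-side CMakeLists.txt then has to consume the colcon-built static library by hand. This is only a sketch based on the layout of the micro_ros_raspberrypi_pico_sdk example; the paths, library name, and target names are assumptions that may differ in your setup:

```cmake
cmake_minimum_required(VERSION 3.13)
include(pico_sdk_import.cmake)  # standard Pico SDK bootstrap
project(firmware C CXX ASM)
pico_sdk_init()

add_executable(firmware main.c)
target_include_directories(firmware PRIVATE
  ${CMAKE_CURRENT_SOURCE_DIR}/libmicroros/include)

# libmicroros.a is NOT built here: it comes from the separate
# colcon step, so this build silently depends on that one.
target_link_libraries(firmware
  pico_stdlib
  ${CMAKE_CURRENT_SOURCE_DIR}/libmicroros/libmicroros.a)

pico_add_extra_outputs(firmware)  # generate the .uf2 image etc.
```

The implicit dependency on the prior colcon step is exactly the manual bookkeeping described above: nothing in this file can rebuild libmicroros.a if the interface packages change.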
I am not aware of any work that has looked into multi-architecture support for the build process. Is there anything written up on this?
This is not true. The ROS ecosystem still penalizes you if you step even slightly away from the “recommended” way. Agreeing with the original poster: my original intention was to stay away from the ROS tooling as much as possible, but ROS does so many things in such archaic ways that only a custom tool can decipher its customizations. So ROS forces the user towards its one true path.
CMake can easily build several targets in multiple subdirectories, but here the local workspace concept is the root problem. Why put all third-party packages together and try to build them alongside your own packages in the first place? Instead, ROS should recommend building third-party packages separately from your own packages, like any other software project does.
Dependency sorting is the only thing colcon really does, and developers already know how to build their own packages without needing colcon. If you separate out the third-party builds, there is no need for colcon.
So a modern approach to ROS tooling must start from ditching local workspaces. Probably this will happen together with containerization.
But this is exactly the problem the original catkin_make had. You do not want all your CMake packages building inside a single CMake context: there is a very high risk of naming conflicts between targets (I ran into this in the real world).
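A contrived example of the collision, assuming two unrelated packages that each (quite reasonably) name an internal target utils:

```cmake
# Top-level CMakeLists.txt pulling two independent packages into
# one CMake context, catkin_make-style (package names hypothetical):
cmake_minimum_required(VERSION 3.16)
project(whole_workspace)

add_subdirectory(src/package_a)  # does:      add_library(utils ...)
add_subdirectory(src/package_b)  # also does: add_library(utils ...)
```

Because target names are global across a CMake build tree, the second add_library(utils ...) fails with an error along the lines of “add_library cannot create target ‘utils’ because another target with the same name already exists”. Building each package in its own CMake invocation, as colcon does, sidesteps the problem entirely.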
Nobody said that (I hope). I don’t even think this is suggested somewhere. The usual practice is to have one workspace with dependencies (or even more, chained), and on top of that, another workspace with your own code. But in our case, I can’t even distinguish “our” and “3rd party”. As a robot lab maintainer, our custom robot drivers are “our” code to me. But the same package is “3rd party” for a student working with the robot who just wants to use the robot as a black box.
I strongly disagree. Or, maybe, “true developers ™”/maintainers know. But we have many students who “barely know how to run the build” and do not understand the system as a whole. Such developers are very happy to be provided with something that takes care of these things correctly.
We can argue about other technical stuff here but …
They must learn to be software engineers before being roboticists, whether they like it or not. They may be happy if you give them sugar, but it’s bad for their future careers. I have to say that ROS recommends bad software engineering practices, so the majority of roboticists here think those practices are inherent to the field. No: you must demand good design from those who are in charge. That is what I am doing here.
I would blame CMake for 90% of the frustration I have with the build environment. Ament mostly gets caught in the crossfire because it’s layered on top of CMake. Best practices for CMake (“modern!”) seem to be a moving target, and many libraries I wanted to use have CMakeLists.txt files that are broken to varying degrees. Days can fly by trying to fix this brokenness. To me this does not look like a problem with the ROS build tools (catkin/ament).
That said, I have nagging doubts about using the all-powerful ament_cmake (“auto”) macros, because when stuff breaks I have to dig through two layers of macros (ament and CMake). Plus, I dread the thought that ament will one day be replaced with another build environment (remember catkin?) and I’ll have to rewrite my build files again. Although I hate CMake, at least there is the feeling it will not be replaced in the short run, if only because it has such a large user base.
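For comparison, here is roughly what the two styles look like side by side. The “auto” macros come from ament_cmake_auto; the explicit version spells the same steps out in mostly plain CMake (package and target names are made up):

```cmake
# "Auto" style: terse, but the behavior hides inside ament macros.
#   find_package(ament_cmake_auto REQUIRED)
#   ament_auto_find_build_dependencies()
#   ament_auto_add_library(my_node src/my_node.cpp)
#   ament_auto_package()

# Explicit style: more typing, but each step is ordinary CMake
# plus a thin ament layer.
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)

add_library(my_node src/my_node.cpp)
target_include_directories(my_node PUBLIC
  $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>)
ament_target_dependencies(my_node rclcpp)

install(TARGETS my_node
  ARCHIVE DESTINATION lib
  LIBRARY DESTINATION lib
  RUNTIME DESTINATION bin)
ament_package()
```

When something breaks, the explicit version leaves only one macro layer to dig through instead of two.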
I haven’t used ament much myself, but I feel the same when doing anything related to Gazebo and the ign-cmake project. There is so much automagic hidden behind the ign-cmake macros that it takes hours to figure out where or why something is (or is not) done. I agree the build systems should do the bare minimum above plain CMake, or provide wrappers around standard CMake calls whose functions are guessable from their names. Seeing gz_create_packages() as the only “effective” call in a top-level CMake file is really frustrating.
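To be fair to the macro authors, the plain-CMake install/export boilerplate that such wrappers hide is substantial. Something like the following sketch is needed per package just to make it consumable downstream (the library name mylib is hypothetical):

```cmake
install(TARGETS mylib EXPORT mylib-targets
  ARCHIVE DESTINATION lib
  LIBRARY DESTINATION lib
  RUNTIME DESTINATION bin)
install(DIRECTORY include/ DESTINATION include)

# Export the targets; a small mylib-config.cmake that includes the
# exported file (omitted here) completes find_package() support.
install(EXPORT mylib-targets
  NAMESPACE mylib::
  DESTINATION lib/cmake/mylib)

include(CMakePackageConfigHelpers)
write_basic_package_version_file(mylib-config-version.cmake
  VERSION 1.0.0 COMPATIBILITY SameMajorVersion)
install(FILES ${CMAKE_CURRENT_BINARY_DIR}/mylib-config-version.cmake
  DESTINATION lib/cmake/mylib)
```

The complaint is not that this is wrapped, but that the wrapping is opaque; thin wrappers with guessable names would give the same savings without the archaeology.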
What about mechanical or electrical engineers, who are focused on non-software applications and need the software to ‘‘just work’’ so they can focus on their two-thirds of the field? Or the applications, process, and systems engineers, who need to be able to treat each robot or package as a black-box module?
The reason I joked about ‘‘this conversation again’’ is that I’ve seen versions of this thread a handful of times before, and we never came up with a concrete solution that does it better than what we currently have. We can debate pain points all day, but I’ve yet to see a concrete technical contribution that resolves them.
Fair point, but I bet that if it were the other way around, folks would be saying “it’s really frustrating to have to wade through so many macros when all I want to do is create my gz packages”.
That’s fair, and I don’t disagree. But I would highlight from my previous comment:
We can debate pain points all day, but I’ve yet to see a concrete technical contribution that resolves them.
My point here is that every time this thread comes up there’s a lot of back-and-forth about “I find this frustrating” but never “here’s a better solution”. So… when you say “something better”, what does that actually mean in terms of specific, actionable contributions? Until people come up with a concrete answer to that, this conversation will go the same way as all the others: people deciding what we have is good enough and calling it a day.
I agree that these could be better. gz-cmake shares some of the same ideas as catkin to automate as much as possible, but it makes it fairly challenging for anybody who wants to do more than what the macros provide.
I will co-sign this. It’s a good goal to keep things as close as possible to “native” CMake (or the equivalent buildsystem for your language of choice) and to avoid magic as much as possible.
In this, I think that ament is a step in that direction to some degree (versus catkin and gz macros), but still maintains some of the “magic” that non-buildsystem-experts may rely on.
Some of us are constantly experimenting with the outside world, and a few have managed to actually make an escape (looking at you, Bazel and Nix people). You might find the tools we have here bad… but just go try to replace them at the scale of software we have, and you’ll either come to secretly love ament, or maybe you’ll finally succeed and open-source your work in a way the rest of us can use.
I hope for pixi to save us all… that and cargo. Fight me.