#2 The state of C++ package management: The underdogs(?)
Welcome to the second part of the series about dependency and package management in C++ projects. This time I’m gonna focus on somewhat less popular solutions than the main three.
Overview
Let’s evaluate if any of these are worth the hassle.
Hunter
Hunter caters towards CMake. The way it works is really similar to meson’s wraps (with the only difference that it’s not built into CMake). Usage is fairly simple. You have to export `HUNTER_ROOT` to a location of choice. Hunter will use it to store its build directories and obtained source code. There’s nothing that has to be installed - it’s kind of “self bootstrapping”. You have to download `HunterGate` into your CMake project.
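In practice this means copying the `HunterGate.cmake` module into the project - something along these lines (the gate repository URL and destination path below are just the usual convention, adjust as needed):

```sh
# URL assumes the cpp-pm fork of the gate repository
mkdir -p cmake
wget https://raw.githubusercontent.com/cpp-pm/gate/master/cmake/HunterGate.cmake \
    -O cmake/HunterGate.cmake
```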
After which, it has to be included in your main `CMakeLists.txt`. Additionally, you’ll need to call `HunterGate()`.
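The call pins the exact Hunter release to use - the URL and SHA1 below are placeholders, you take the real values from the release you want:

```cmake
HunterGate(
    URL  "https://github.com/cpp-pm/hunter/archive/v0.25.6.tar.gz"  # release archive of your choice
    SHA1 "<sha1-of-that-archive>"                                    # checksum published for that archive
)
```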
All of that has to be done prior to the `project()` call and that’s it - it’s now fully operational. You can now add packages by simply calling `hunter_add_package`. Here’s an example of an entire `CMakeLists.txt`.
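A minimal sketch - the release URL and SHA1 are placeholders, ZLIB is just an example package and the imported target name follows Hunter’s ZLIB example:

```cmake
cmake_minimum_required(VERSION 3.16)

include("cmake/HunterGate.cmake")
HunterGate(
    URL  "https://github.com/cpp-pm/hunter/archive/v0.25.6.tar.gz"
    SHA1 "<sha1-of-that-archive>"
)

project(hunter-demo CXX)

# fetch and build the dependency through Hunter...
hunter_add_package(ZLIB)
# ...then consume it the regular CMake way
find_package(ZLIB CONFIG REQUIRED)

add_executable(demo main.cpp)
target_link_libraries(demo PRIVATE ZLIB::zlib)
```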
Hunter has its own repository of recipes and (as far as I can tell) this is the only source of packages it can provide.
Cool, but which version of `ZLIB` is this actually installing? This is by default specified in `HunterGate` configuration files. For HunterGate 0.25.6 this would be defined here.
To override that you need to add your own `cmake/Hunter/config.cmake` file containing the versions you want and add `LOCAL` to the `HunterGate()` call. The versions you can use are defined within the `HunterGate` repo… and this is the biggest problem with Hunter.
Most of the `hunter.cmake` files for projects it offers are simply obsolete. I’ve tried installing inja and the only available version was 0.1.1 (the up to date version is 3.4 :D). The selection of packages is minimal as well. On top of that, the documentation simply sucks immensely. It doesn’t even specify which versions are provided for any given package - you have to dig around in HunterGate to figure that out.
Testing
Clutching at straws, I just experimented with installing `zlib` to see if that would work, and it did (although the version wasn’t up to date either).
Overall, I’m not impressed and Hunter would most likely be my last choice if everything else failed - and even then I’d probably prefer to handle the problem myself rather than resorting to it.
Summary
Feature | Support | My verdict |
---|---|---|
Declarative dependencies | Supported. Versions selectable in `cmake/Hunter/config.cmake`, packages needed declared using `hunter_add_package`. | ✔️ |
Build reproducibility | No lock file. Additionally, if you change the HunterGate version it’s very likely that default package versions will change as well. | ❌ |
Inter-dependency mgmt | Supported. Defined by HunterGate. | ✔️ |
Handling non-native packages | Provides a repository of packages. Other than that, no support. | ❌ |
Project build systems supported | CMake only. | ❌ |
Dependencies build systems supported | CMake mainly. Documentation mentions autotools as well but I haven’t tested that myself. | ✔️ |
Caching | Caches locally within its own prefix. | ✔️ |
Build tools | No support. | ❌ |
Other remarks | The majority of packages it provides are outdated. | ❌ |
cpm
cpm is not something I originally planned to focus on - I stumbled upon it accidentally. It seems to be based on a similar principle as hunter. Integration with the project is very simple and requires downloading the `get_cpm.cmake` bootstrapping script.
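Roughly along these lines - the URL points at a CPM release asset, and the version pinned here is just an example:

```cmake
# Download CPM's bootstrap script into the build tree and include it.
file(DOWNLOAD
     https://github.com/cpm-cmake/CPM.cmake/releases/download/v0.38.3/get_cpm.cmake
     ${CMAKE_CURRENT_BINARY_DIR}/cmake/get_cpm.cmake)
include(${CMAKE_CURRENT_BINARY_DIR}/cmake/get_cpm.cmake)
```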
I was quite positively surprised by the project’s README file after quickly acquainting myself with it. The documentation is short, concise and describes in detail the most important aspects of the tool.
Testing
Having cpm bootstrapped, dependencies can be added to the project similarly to hunter’s case.
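For instance, pulling in a tagged release straight from GitHub (the package and version here are just an example):

```cmake
CPMAddPackage(
  NAME inja
  GITHUB_REPOSITORY pantor/inja  # fetched directly from GitHub, no central database involved
  VERSION 3.4.0
)
```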
The main difference though is the fact that cpm can pull any arbitrary version directly from GitHub, GitLab, Bitbucket or local sources and doesn’t have to rely on its own database. This is a massive improvement over hunter!
cpm supports lock files as well. These can be generated using a special target.
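The target is called `cpm-update-package-lock`; assuming a build directory named `build`, regenerating the lock file boils down to:

```sh
# Lock file support has to be enabled in CMakeLists.txt first:
#   CPMUsePackageLock(package-lock.cmake)
cmake --build build --target cpm-update-package-lock
```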
I’ve tried it with my go-to test project using inja and everything worked as expected. I’m very positively surprised, as it seems to be a quick to use, working and frictionless solution for CMake projects.
Summary
Feature | Support | My verdict |
---|---|---|
Declarative dependencies | Supported. Required dependencies declared directly in `CMakeLists.txt`. | ✔️ |
Build reproducibility | Supported using a lock file. | ✔️ |
Inter-dependency mgmt | Supported. | ✔️ |
Handling non-native packages | It can rely on pre-built packages, system packages or local checkouts. | ✔️ |
Project build systems supported | CMake only. | ❌ |
Dependencies build systems supported | It builds CMake projects only. It can download non-CMake projects but you’ll have to provide CMake instructions to build them yourself. | ❌ |
Caching | Uses the project’s build for dependency caching. An additional global cache can be configured with `CPM_SOURCE_CACHE` and `CPM_USE_NAMED_CACHE_DIRECTORIES`. | ✔️ |
Build tools | No support. | ❌ |
Other remarks | Seems like a well executed solution for CMake projects. Something that hunter aspires to but fails to achieve. | ✔️ |
cget
cget is a simple package manager aiming to ease obtaining dependencies for CMake projects. It’s written in python. It maintains its tree (containing both the downloaded source code and build directories) under `$CGET_PREFIX`. The integration with CMake is seamless. You can use the toolchain file directly like so:
cmake -DCMAKE_TOOLCHAIN_FILE=$CGET_PREFIX/cget/cget.cmake ...
or allow cget to take over and configure CMake for you:
cget build -B bld
Prior to that, you install the required dependencies manually using:
cget install <dependency>
Or using the provided requirements.txt file:
cget install -f requirements.txt
It can grab packages from what it defines as a Package source. You can specify a URL to a tarball or a github repo name (using the `user/repo` scheme). There’s support for the local filesystem as well, so it’s possible to operate offline or within a closed network.
The format of the `requirements.txt` file is a bit awkward as it looks like a set of CLI parameters, e.g.:
foo/mypackage -DSOMEDEFINE=ON -X meson -H md5:<md5hash> ... and so on
Additionally, if the package you want to install doesn’t define its dependencies in its own `requirements.txt`, you can create a recipe which defines the source, configuration and dependencies in two files: `package.txt` and `requirements.txt`. The first file should contain a single package source entry for the dependency it defines and the latter its dependencies.
Testing
I’ve tried cget with a couple of CMake and meson dependencies, e.g.:
cget install -v -DBUILD_BENCHMARK=OFF pantor/inja --cmake header
This correctly installed the package and passed the configuration to it. Similarly, there were no problems installing a meson project from github:
cget install ebassi/graphene --cmake meson
Other sources worked fine as well.
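For example, a plain tarball URL and a local directory both work (the URL and path below are representative, not the exact ones I used):

```sh
# install from a release tarball
cget install https://zlib.net/zlib-1.3.1.tar.gz
# install from a local checkout
cget install ./third_party/mylib
```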
I was successful as well installing all of the above via a `requirements.txt` file.
During testing, I noticed that it does not rebuild packages it already has in its repos, so that’s good.
However, there are some downsides as well. `requirements.txt` doesn’t seem to scale well when the configuration is complex. Just imagine that you need to specify 10 or more defines in a single line. Ugh.
It doesn’t handle more exotic cases out of the box either. For example, I’ve tried obtaining and building llvm with cget and initially I couldn’t force it to build. The main problem is the fact that the project keeps its `CMakeLists.txt` under the `llvm/` path and not in its root directory. There seems to be the `-X` option, but it looks for `CMakeLists.txt` relative to your project root, not the dependency’s root, so it’s meant to be used by recipes.
Eventually, I managed to work around that by creating a custom recipe that has a trivial `build.cmake` delegating to the project’s actual CMake file.
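The idea is just to forward to the subdirectory that holds the real `CMakeLists.txt` - a sketch of the wrapper (project name is arbitrary):

```cmake
# build.cmake - thin wrapper used by the recipe; the real build
# lives in the llvm/ subdirectory of the source tree
cmake_minimum_required(VERSION 3.16)
project(llvm-cget-wrapper NONE)

add_subdirectory(llvm)
```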
I’ve done that by adding a new recipe directory for llvm:
mkdir -p $CGET_PREFIX/etc/cget/recipes/llvm
My `package.txt` contained:
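Something along these lines, with the tag and the defines as placeholders (the real list was much longer):

```
# package source entry plus all configuration arguments, crammed into one line
llvm/llvm-project@llvmorg-17.0.6 -DLLVM_ENABLE_PROJECTS=clang -DLLVM_TARGETS_TO_BUILD=X86 -DLLVM_ENABLE_RTTI=ON
```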
This did solve the problem, but it is a bit too involved to be convenient in my opinion and, as I predicted, the accumulation of all the arguments in one line looks terrible.
There’s no support for any type of lock file, which basically means no reproducible builds.
No support for supplementary build tools either.
Summary
Feature | Support | My verdict |
---|---|---|
Declarative dependencies | Supported via `requirements.txt` and recipes. | ✔️ |
Build reproducibility | Direct dependencies can be listed with exact versions. Indirect dependencies are not locked via a lock file. | ❌ |
Inter-dependency mgmt | Supported via a `requirements.txt` file within the dependency’s repo (or by manually created recipes). | ✔️ |
Handling non-native packages | Provides a rich repository of pre-made recipes. Custom recipes stored within the project’s repo can be created to support dependencies missing in `cget-recipes`. | ✔️ |
Project build systems supported | CMake only. | ❌ |
Dependencies build systems supported | CMake, meson and 3rd party by providing a custom `build.cmake` in a custom recipe. | ✔️ |
Caching | Caches locally within its own prefix; two different projects using two different `CGET_PREFIX`es won’t benefit from any caching. | ❌ |
Build tools | No support. | ❌ |
Other remarks | Wasn’t able to build a dependency if `CMakeLists.txt` was in a subdirectory. No way to configure the location of `CMakeLists.txt` relative to the dependency’s root; required custom recipe creation. | ❌ |
buckaroo
Buck, Buck2 and Buckaroo are all facebook/meta products. buck and buck2 are build systems and buckaroo is a supporting package manager. buck2 supersedes buck. The development of buckaroo seems to be on pause, so its future is uncertain.
The idea is that it treats git repos as dependencies. However, there’s a catch. The repo has to contain buckaroo files to be available for consumption via buckaroo. There are buckaroo repo forks (which are called ports). The official repo contains ~350 ports, which is not a lot when compared to e.g. vcpkg. Ingestion of such repos is very simple:
buckaroo add github.com/buckaroo-pm/boost-thread
Testing
This is the first build system and dependency manager that I simply failed to get working! After a while I just gave up, as I considered the whole exercise futile and a massive waste of time.
buck is outdated and replaced with buck2, so I didn’t even bother with the former. buck2 is very similar to bazel. I’ve managed to build some test projects with buck2 successfully, but when I tried to use buckaroo on top of that… it all fell apart.
buckaroo seems to still expect buck instead of buck2, and buck2 starts complaining about the presence of directories that buckaroo creates within the project. Without trying to understand cryptic error messages about cell names etc., I just gave up on the whole thing.
Summary
Feature | Support | My verdict |
---|---|---|
Declarative dependencies | Yes, `buckaroo.toml` contains all dependencies used in the project. | ✔️ |
Build reproducibility | Theoretically yes, there’s support for lock files. | ✔️ |
Inter-dependency mgmt | Supported. | ✔️ |
Handling non-native packages | Not supported - you have to port the dependency’s build system to buck and create a buckaroo port for it. | ❌ |
Project build systems supported | Bazel, buck, buck2…? | ❌ |
Dependencies build systems supported | Bazel, buck, buck2…? | ❌ |
Caching | Couldn’t test. | ❌ |
Build tools | No support. | ❌ |
Other remarks | Last commit was 3 years ago. buck is replaced with buck2. The port registry is rather modest. The documentation is rather sparse. I wouldn’t rely on this solution at all and would prefer any other build system. | ❌ |
build2
build2 is a completely new build system with package management capability built in. It comes from Code Synthesis.
It is very different from what I’m used to. It comes with its own language and a whole set of ideas of how to manage and build projects. You really need to commit some time and go through the documentation to get a general grasp of how to use it. I’m gonna try to provide a condensed introduction here, just for reference.
build2 comprises a set of utilities, mainly `bdep`, `bpkg` and `b`. `bdep` is used to manage the project and its build configurations, `bpkg` is a package manager and `b` is the build tool itself.
As opposed to meson or CMake, it’s not a “meta” build system, meaning it doesn’t generate Makefiles or Ninja files which are then used to build the project - it takes care of the build process itself. Additionally, it integrates a simple testing framework. Tests have to be described within a `testscript` and can be invoked using `bdep test`.
Creation of an example project containing an executable is very simple:
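With `bdep` it boils down to one command, using the standard executable project template (the project name is arbitrary):

```sh
bdep new -l c++ -t exe hello
```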
The command initializes a new git repository containing a minimal executable project skeleton.
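The layout looks roughly like this (trimmed - there are also the usual git files):

```
hello/
├── build/            # build system configuration (bootstrap.build, root.build)
├── hello/
│   ├── hello.cxx
│   ├── testscript
│   └── buildfile
├── buildfile
├── manifest
└── repositories.manifest
```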
To build it, a configuration has to be created first:
bdep init --config-create ../hello-gcc @gcc cc config.cxx=g++
This creates a configuration called “@gcc” in the “../hello-gcc” directory. This configuration will use `g++` to build the project. There can be as many configurations as you require. They can be listed:
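bdep has a subcommand for exactly that (the output lists each configuration’s name, directory and flags, so I’m omitting it here):

```sh
bdep config list
```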
The first configuration is marked as default. It’s the fallback in case no other configuration name is provided. When building the project from the source tree (by just invoking `b`), the default configuration will be used. Let’s try to build the example then.
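From the source tree it’s enough to invoke the build tool; the output (roughly) shows the compile and link steps:

```sh
$ b
c++ hello/cxx{hello}
ld hello/exe{hello}
```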
The build rules are defined in `buildfile`s present in the project. It seems to follow the notion of one target per directory. If I wanted to add one more executable, let’s call it `now`, which will print the current time of day, I’d need to create a new directory called `now` and populate it with source code and a `buildfile`. Here’s an example.
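A minimal `now/now.cxx` could look like this (any way of printing the time of day will do):

```cpp
// now/now.cxx - print the current time of day
#include <chrono>
#include <ctime>
#include <iostream>

int main()
{
  const auto now = std::chrono::system_clock::now();
  const std::time_t t = std::chrono::system_clock::to_time_t(now);
  std::cout << std::ctime(&t);
}
```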
Here’s my `buildfile`:
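The gist of it is a single declaration (the generated project skeleton already sets up the C++ defaults, so nothing else is strictly needed):

```
exe{now}: cxx{now}
```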
It’s a bit strange initially but after a while it becomes obvious. `exe` is the target type, `now` is the target name and, just like in Makefiles, anything on the right hand side after the colon is a dependency of the target. Again, `cxx` is the dependency type, and within the `{ }` there’s the dependency list.
After rebuilding, my new extra executable is available for use.
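Rebuilding and then running the new binary (the exact path depends on how your configuration is set up):

```sh
b          # rebuild the project
./now/now  # run the new executable
```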
Testing
That’s cool, but how do I add extra dependencies? First, let’s have a look at `repositories.manifest`. This is a list of external sources that build2 will scan for dependencies. You can put there a URL to any build2 git repository. Additionally, build2 provides cppget - this is an index of build2 project wrappers for 3rd party projects. As an example, let’s integrate fmtlib into a build2 project.
Searching on cppget for `fmt` returns fmt. `fmt` is available in the https://pkg.cppget.org/1/stable repository. Let’s add that to `repositories.manifest`.
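The entry looks more or less like this (the fingerprint is specific to the repository and elided here):

```
:
role: prerequisite
location: https://pkg.cppget.org/1/stable
trust: <repository certificate fingerprint>
```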
The `trust` field must be populated with the repository’s certificate fingerprint. This is available on the repo’s details page.
Cool. The next step is to add the dependency to the project’s `manifest`.
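A single `depends` line is enough (the version constraint below is an example - pick whatever release you actually need):

```
depends: fmt ^9.0.0
```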
After that’s done, a call to `bdep sync` will pull the dependency.
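Synchronizing the project then fetches, unpacks and configures fmt (output omitted):

```sh
bdep sync
```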
Now it can be used in the project. Just add it to the buildfile.
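The import-and-link part looks roughly like this in the executable’s `buildfile` (the target list mirrors what the generated project already has):

```
import libs += fmt%lib{fmt}

exe{hello}: {hxx cxx}{**} $libs testscript
```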
And that’s it. Just run `b` and the project is built!
That’s great, but what about non-build2 projects? Well, there’s a problem with that. If the project is not a build2 project and there’s no wrapper for it on cppget.org, then you’re out of luck. The official recommendation in the documentation is that you should arrange to install it yourself and depend on the binaries:
The standard way to consume such unpackaged libraries is to install them (not necessarily into a system-default location like /usr/local) so that we have a single directory with their headers and a single directory with their libraries. We can then configure our builds to use these directories when searching for imported libraries.
I’m afraid that this is simply not good enough and as a result, my verdict is that build2 only supports native packages.
All dependencies and their artefacts are downloaded and stored within the build configuration directory; there’s also support for lock files.
Another good thing is that it supports build tools (which it calls build time dependencies).
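For example, a tool needed only during the build would be declared in the manifest with a `*` prefix (the tool and version here are illustrative):

```
depends: * cli ^1.2.0
```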
Summary
Feature | Support | My verdict |
---|---|---|
Declarative dependencies | Declared in the `manifest` file. | ✔️ |
Build reproducibility | Support for lock files. | ✔️ |
Inter-dependency mgmt | Supported. build2 dependencies define their own dependencies in their own manifest files. | ✔️ |
Handling non-native packages | Not supported. The advice is to pre-prepare binary versions of non-build2 dependencies. | ❌ |
Project build systems supported | build2 only. | ❌ |
Dependencies build systems supported | build2 only. Provides cppget.org, which is a collection of build2-compatible dependencies that can be integrated off the shelf. | ❌ |
Caching | Only caches locally within the “configuration” directory. | ❌ |
Build tools | Supported. Dependencies in the manifest prefixed with ‘*’ are treated as build time dependencies. | ✔️ |
Other remarks | Quirky and a bit different than the rest of the available ecosystem. A fun solution to play around with, suitable for experimental projects, but due to lack of traction not good enough to be a replacement for CMake, meson or bazel. Additionally, the documentation is painfully long and unnecessarily verbose, which makes it difficult to use as a pure reference document. | ❌ |
Conclusion
That’s all in this part, which was dominated mainly by CMake-specific solutions, some better than others, with my personal favourites being cpm and cget (in that order). I really like cpm for the low barrier of entry and overall ease of use. I will probably give it a go for some simple CMake projects in the future. There’s still a couple more to discuss so please bear with me and let’s continue in part 3.