Contents

#1 The state of C++ package management: The big three

In this post I’m gonna give a short overview of package management and vendoring solutions for C++. This is a controversial topic. There’s still no official, standardised package manager; however, there’s a plethora of solutions (some more mature than others) which I think solve the problem well (at least as far as I’m concerned). I planned to publish the entire overview in a single post, but there’s a lot of material to cover and eventually I had to split it up into multiple parts.

In this part, I’m gonna cover my testing criteria and the most popular solutions out there. Let’s start!

What’s available?

Surprisingly, the choice is quite big. Below are the ones I’ve tried myself:

I’m aware of nuget but I’m not a Visual Studio user, hence I’m not gonna cover it at all.

Most of these package managers work by introducing dedicated storage for the obtained source code and its associated build directory. Packages are downloaded and built prior to executing the build system they integrate with.

What will I test?

There are some aspects that I find universally important when it comes to package managers. In particular:

Declarative dependencies

This one is very easy. It’s a bit like python pip’s requirements.txt file: the package manager has to provide a way for the project to define a collection of required dependencies that it will then obtain on behalf of the project.
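For comparison, pip’s requirements.txt is just a flat list of pinned packages; every C++ tool covered below provides some equivalent of this idea:

```text
# requirements.txt - pip's declarative dependency list (illustrative)
requests==2.31.0
jinja2==3.1.4
```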

Build reproducibility

All required packages and their dependencies have to be explicitly listed with their exact versions and origins so that the project is guaranteed to use exactly the same code regardless of when and where it is built. This is usually done in the form of a lock file, like npm’s package-lock.json or cargo’s Cargo.lock.

To be very strict and fully fall under the definition of a reproducible build, as defined here, the build system should guarantee a defined and reproducible build environment, which includes the full toolchain as well. For the sake of my own classification, I’m gonna disregard that and classify support for a reproducible build environment separately.
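To give an idea of the shape of such a file, here’s roughly what a conan 2.x lock file looks like for the inja dependency used later in this post (the recipe revision hash is elided, and real lock files carry a few more sections):

```json
{
    "version": "0.5",
    "requires": [
        "inja/3.4.0#<recipe-revision>"
    ]
}
```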

Inter-dependency management

Dependencies are very rarely stand-alone projects; they have their own dependencies as well. This is a fundamental feature that every package manager should be able to handle.

Handling non-native packages

The majority of package managers depend on their own repositories of “recipes”, “modules” or “packages”. The parlance differs from package manager to package manager but the meaning stays the same: most of them have some sort of internal index containing a set of files describing how to obtain a given library and build it (along with some other metadata).

But what about packages that are not in the index? There should be a way to support these as well and that’s what I want to find out.

Build system support/integration

This applies to both the build system used by the project employing the package manager and the build systems of the packages managed by the package manager.

Ideally, there should be no restrictions on either side. A package manager should be able to easily:

  • manage dependencies using any build system,
  • integrate with project’s build system (whatever it is).

Caching

It’s not ideal to have to download and rebuild everything every time when working with multiple projects that use the same dependency. Since the package manager is a solution independent of the build system, it would be great if it could cache the source code and maybe even the build artefacts.

Support for build tools

Sometimes a project depends on a specific version of CMake or other supporting tools or libraries required to start and successfully complete the build. The project’s code doesn’t require any of these dependencies directly, but the project’s build environment does. It would be nice if the package manager supported installation of such dependencies as well.

Overview

conan

I’d describe conan as the de facto industry standard. It is multi-platform and can cater to both meson (via pkg-config) and CMake. It can initially be a bit overwhelming, but after a while it becomes incredibly easy to use. I’ve already dedicated a post to conan where I describe in detail how to get started with it, therefore I’m only gonna mention the basics here very superficially.

First of all, you need to create a profile:

conan profile detect

A profile is just a configuration describing the installed compiler and related toolchain. Any setting in the profile can be overridden at build time anyway, so it’s not a big deal.
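On a typical Linux box with gcc, the detected profile (stored under ~/.conan2/profiles/default) ends up looking something like this - the exact versions will obviously differ on your machine:

```ini
[settings]
arch=x86_64
build_type=Release
compiler=gcc
compiler.cppstd=gnu17
compiler.libcxx=libstdc++11
compiler.version=13
os=Linux
```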

Next, create conanfile.txt or conanfile.py and list all required dependencies. If you don’t know how to create either of these files, just go to conan center index and look up the package you want - I’m gonna choose gtest - and right in the middle of the screen you’ve got an example conanfile.txt.

Once you have conanfile.txt, prior to initialising your build directory with your build system of choice, just run

conan install -of bld --build=missing .

The above will pull all of your dependencies and install them in the user-wide conan cache; the build directory is just gonna be populated with the pkg-config and cmake files needed for your toolchain to discover the dependencies. After that, you can just build as normal. When using CMake, you have to provide a path to the toolchain file:

cmake \
    -Bbld/build/Release/generators \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake \
    -GNinja \
    -S .

Simple as that!
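For completeness, here’s a minimal sketch of a CMakeLists.txt consuming the dependency. The target and file names are placeholders of mine; the imported pantor::inja target is what the conan-generated config files provide:

```cmake
cmake_minimum_required(VERSION 3.23)
project(conan_test CXX)

# The conan-generated CMakeDeps files make this find_package() call work.
find_package(inja REQUIRED)

add_executable(hello hello.cpp)
target_link_libraries(hello PRIVATE pantor::inja)
```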

Testing

pantor/inja is gonna be my go-to testing package. It’s really a random choice - I recently worked with it so, without any special reason, I’m just gonna use it. I’m gonna recreate the same project using the CMake, Meson and Bazel toolchains. The test project can be found here. Each toolchain is tested on a dedicated branch.

CMake

Here’s my conanfile.txt:

[requires]
inja/3.4.0

[tool_requires]
cmake/3.23.5

[test_requires]
gtest/1.13.0

[generators]
CMakeDeps
CMakeToolchain

[layout]
cmake_layout

This worked flawlessly. After installing the dependencies and activating the build environment:

$ source bld/build/Release/generators/conanbuild.sh
$ cmake --version
cmake version 3.23.5

CMake suite maintained and supported by Kitware (kitware.com/cmake).

I can see that the version of CMake I’ve requested has been installed.

Configuring, building and testing the project can be done just as I already described:

cmake \
        -B bld/build/Release/generators/ \
        -DCMAKE_BUILD_TYPE=Release \
        -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake \
        -S.

cmake --build bld/build/Release/generators/

ctest --test-dir bld/build/Release/generators/tests/
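Since gtest is pulled in via [test_requires], the tests/ subdirectory can hook into it in the usual way. A sketch, with placeholder target and file names:

```cmake
# tests/CMakeLists.txt
# GTest is discoverable thanks to the conan-generated CMakeDeps files.
find_package(GTest REQUIRED)

add_executable(foo_test foo_test.cpp)
target_link_libraries(foo_test PRIVATE GTest::gtest_main)

include(GoogleTest)
gtest_discover_tests(foo_test)
```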
Meson

The conanfile.txt has to be adjusted slightly to support meson:

diff --git a/conanfile.txt b/conanfile.txt
index b840105..d46dc37 100644
--- a/conanfile.txt
+++ b/conanfile.txt
@@ -3,6 +3,7 @@ inja/3.4.0

 [tool_requires]
 cmake/3.23.5
+meson/1.4.1

 [test_requires]
 gtest/1.13.0
@@ -10,6 +11,8 @@ gtest/1.13.0
 [generators]
 CMakeDeps
 CMakeToolchain
+PkgConfigDeps
+MesonToolchain

 [layout]
 cmake_layout

I’ve added meson as a build dependency and I’m now generating meson-specific toolchain files along with pkg-config files for dependency consumption. The project can be bootstrapped the following way:

meson setup --native-file bld/build/Release/generators/conan_meson_native.ini bld

… and built like so:

meson compile -C bld

meson test -C bld

Everything works out of the box, no problems at all.
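On the meson side, the PkgConfigDeps generator produces an inja.pc file, so the dependency is consumed like any other pkg-config dependency. A minimal sketch, with placeholder names:

```meson
project('conan_test', 'cpp')

# Resolved through the conan-generated pkg-config files.
inja_dep = dependency('inja')

executable('hello', 'hello.cpp', dependencies : [inja_dep])
```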

Bazel 6.0.0

Bazel support is still experimental and there’s a bit of confusion related to the transition to Bazel modules. I’ve tried both bazel 6.0.0, installed using bazelisk, and the newest bazel 7.2.1 - obtained using bazelisk as well. Switching versions with bazelisk is easy: you can define the required version using $USE_BAZEL_VERSION or in .bazeliskrc.

As instructed by conan documentation, I’ve added

load("@//conan:dependencies.bzl", "load_conan_dependencies")
load_conan_dependencies()

to my WORKSPACE. My BUILD.bazel looked the following way:

cc_library(
    name = "foo",
    hdrs = [ "foo.hpp" ],
    srcs = [ "foo.cpp" ],
    deps = [
        "@inja//:inja",
    ],
    visibility = [
        "//tests:__pkg__",
    ],
)

cc_binary(
    name = "hello",
    srcs = [ "hello.cpp", "foo.hpp" ],
    deps = [
        "//:foo"
    ],
)

I’ve installed conan files in my main repo directory using:

conan install --build=missing .

The bazel build can be started now:

bazelisk --bazelrc=./conan/conan_bzl.rc build --config=conan-config //...

Tests can be triggered in a similar fashion (bazel is able to find them itself):

bazelisk --bazelrc=./conan/conan_bzl.rc test --config=conan-config //...

Works great!

Bazel >= 7.1

Support for Bazel modules has been added only very recently. I’ve tried it out and it works as well (at least in the simple case that I’ve tested). It doesn’t differ dramatically from the traditional Bazel WORKSPACE dependency management. I’ve removed the WORKSPACE file and created MODULE.bazel with the content advised in conan’s documentation:

load_conan_dependencies = use_extension("//conan:conan_deps_module_extension.bzl", "conan_extension")
use_repo(load_conan_dependencies, "nlohmann_json")
use_repo(load_conan_dependencies, "inja")
use_repo(load_conan_dependencies, "gtest")

That’s really it. Once that is done, everything can be rebuilt just like before with the same command. I found no problems; as previously, everything worked.

Summary

| Feature | Support | My verdict |
|---|---|---|
| Declarative dependencies | Supported via conanfile.txt or conanfile.py | ✔️ |
| Build reproducibility | Guaranteed thanks to support for lockfiles and build-time tools. | ✔️ |
| Inter-dependency mgmt | Supported. Each recipe contains all of a dependency’s dependencies. Additionally, any recipe submitted to conan center is tested. | ✔️ |
| Handling non-native packages | Dependencies are consumed from conan center index. It’s possible to set up your own repository but it’s not straightforward. | |
| Project build systems supported | Supports CMake, Meson, Bazel and autotools out of the box. | ✔️ |
| Dependencies build systems supported | The dependency describes how it can be built in its conan recipe. Support for CMake, Meson, Bazel and Autotools toolchains is built in. | ✔️ |
| Caching | Cache is shared and configured user-wide. | ✔️ |
| Build tools | Supported. | ✔️ |
| Other remarks | If there’s ever gonna be an official package manager adopted by the C++ committee, it should be conan. | ✔️ |

vcpkg

vcpkg is Microsoft’s child… great. It’s very similar to conan in the way it works. There’s a registry of packages; these are installed in $VCPKG_ROOT (you have to set it up to your liking). vcpkg provides toolchain files that cmake or meson can consume. Similarly to conan, you can add your own private registries in the project’s manifest file vcpkg-configuration.json. The usage details are pretty well described in the documentation so, I’m not gonna go into the specifics.

It caters mostly towards CMake projects. Meson is not supported directly and you have to create the meson native file yourself; the integration happens through pkg-config. There’s a popular repository showing how to do that. Nothing can be auto-generated (as far as I know) so, unfortunately, you’ll have to get your hands dirty with this one.

As far as dependencies go, it can build meson projects. I’ve seen in the documentation that there’s support for qmake and gn as well, but I’m not sure how well either of these works.

Testing

I’m gonna try to do a very basic test with a project similar to the one used for conan testing. I’ve added VCPKG_ROOT to my .bashrc as advised by the docs. As far as I can tell, there’s no support for build-time dependencies so, unfortunately, the required version of CMake must be provided by the build environment rather than declaratively, through the package manager.

Starting a new project and installing project dependencies is easy enough:

vcpkg new --application

vcpkg add port inja
vcpkg add port gtest

The above creates a manifest and adds the ports to it as dependencies. Nothing is built or downloaded yet. The packages are built when initialising the CMake build directory:
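After those commands, the generated vcpkg.json manifest looks roughly like this:

```json
{
    "dependencies": [
        "gtest",
        "inja"
    ]
}
```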

cmake \
    -Bbld \
    -DCMAKE_TOOLCHAIN_FILE=$VCPKG_ROOT/scripts/buildsystems/vcpkg.cmake \
    -GNinja \
    -S.

From this point onward, everything happens the same old way.

cmake --build bld

Builds and links the project correctly, no problems at all.

Summary

| Feature | Support | My verdict |
|---|---|---|
| Declarative dependencies | Supported via the project’s manifest file. | ✔️ |
| Build reproducibility | No support for lock files. Dependency versions and origins are not pinned by default in the manifest. | |
| Inter-dependency mgmt | Supported. Each port contains all its dependencies. | ✔️ |
| Handling non-native packages | Supports installing packages directly from git repos. Allows for custom registries. | ✔️ |
| Project build systems supported | Intended mainly for CMake. Custom integration possible but requires some work. | |
| Dependencies build systems supported | CMake, Meson, QMake, gn | ✔️ |
| Caching | Download and build cache is shared across all projects using the same VCPKG_ROOT. | ✔️ |
| Build tools | Not supported. | |
| Other remarks | Solid choice with a rich package repository. My personal preference is still conan due to its support for lockfiles and more flexibility regarding the project’s build system. | ✔️ |

spack

spack brands itself as a package manager for supercomputers. It also claims to be language-independent, supporting Python, C/C++ and Fortran. There seems to be no installation process; the documentation suggests cloning the spack repo and activating its environment in your shell config. In my case, it’s bash:

git clone --depth=1 -c feature.manyFiles=true https://github.com/spack/spack.git
. $HOME/spack/share/spack/setup-env.sh

All packages available with spack can be listed with

spack list

The repository has more than 8000 entries, which seems very impressive.

Spack installs everything into its own directory tree and exposes packages via environment modules ($MODULEPATH is configured automatically when you source spack’s environment script). Installation is very easy, e.g.:

spack install fmt
spack install zlib@1.3.1

Additionally, you can specify the compiler to use when building the package:

spack install libjpeg@9f %gcc

Spack works a bit differently from all the other tools discussed so far. The project doesn’t contain any dependencies file; instead, you need to create an environment for the project:

spack env create myproject
spack env activate myproject

Once the environment is active, the required dependencies can be added to it:

spack add fmt
spack add libjpeg

The above commands only modify the environment so that the appropriate build system can discover the mentioned dependencies.
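Under the hood, the environment is described by a spack.yaml file; for the two packages above it looks roughly like this:

```yaml
# spack.yaml - created by `spack env create`; sketch of its shape
spack:
  specs:
  - fmt
  - libjpeg
  view: true
```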

Testing

My test project relies on pantor/inja. Unfortunately, this is not available by default in spack’s repository. Not to worry though - it’s a good opportunity to check how easy it is to create a package, and spack does support that. It’s as easy as:

spack create --name inja https://github.com/pantor/inja/archive/refs/tags/v3.4.0.tar.gz

It creates a complete boilerplate file with all the necessary details to build the project. The only customisation I made was to add the dependencies:

depends_on("nlohmann-json")
depends_on("cmake@3.18:", type="build")

Additionally, I’ve added some project configuration flags to avoid building tests and benchmarks:

def cmake_args(self):
    # FIXME: Add arguments other than
    # FIXME: CMAKE_INSTALL_PREFIX and CMAKE_BUILD_TYPE
    # FIXME: If not needed delete this function
    args = ["-DBUILD_BENCHMARK=OFF", "-DBUILD_TESTING=OFF"]
    return args
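Putting the pieces together, the package.py ends up looking roughly like this after my edits (the checksum placeholder stands for the real sha256 that spack computes for the tarball, and the boilerplate docstring is mine):

```python
# package.py - sketch of the spack recipe; only the depends_on lines and
# cmake_args were added by hand on top of the generated boilerplate.
from spack.package import *


class Inja(CMakePackage):
    """A template engine for modern C++."""

    homepage = "https://github.com/pantor/inja"
    url = "https://github.com/pantor/inja/archive/refs/tags/v3.4.0.tar.gz"

    version("3.4.0", sha256="<computed-by-spack>")

    depends_on("nlohmann-json")
    depends_on("cmake@3.18:", type="build")

    def cmake_args(self):
        # Skip the parts of the project we don't need.
        return ["-DBUILD_BENCHMARK=OFF", "-DBUILD_TESTING=OFF"]
```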

That’s it really. After that I just issued:

spack install inja

… and spack managed to successfully build it for me. Package details can be inspected with:

spack info inja

This will show link-time, run-time and build-time dependencies.

Cool. First I’m gonna try a CMake project. I just created a test environment and added my dependency to it:

spack env create testenv
spack env activate testenv
spack add inja

After that, CMake was successfully able to detect and use my dependency just like normal:

find_package(fmt REQUIRED)
find_package(inja REQUIRED)

add_executable(spack_test
    hello.cpp)

target_link_libraries(spack_test PRIVATE fmt::fmt pantor::inja)

So far, spack has passed the test flawlessly. What about a meson project? To test that, I decided to use one of my old projects. Additionally, the project uses meson subprojects so, it will be interesting to see if spack is able to handle that.

spack create --force -t meson https://gitlab.com/hesperos/argparser

That worked out of the box as well. The only thing I had to add was:

def meson_args(self):
    # FIXME: If not needed delete this function
    args = ["-Dexamples=false", "-Dwrap_mode=forcefallback"]
    return args

Spack can create independent environments as well. Those are environments with a custom directory (most often the same directory as the project itself):

spack env create -d .
spack -e . add fmt
...

The advantage is that the environment details live inside the project. It’s very similar to a vcpkg manifest or conan’s conanfiles.

Summary

| Feature | Support | My verdict |
|---|---|---|
| Declarative dependencies | Yes, the project environment contains all required packages and their versions. | ✔️ |
| Build reproducibility | Yes. Spack creates spack.lock within the environment. | ✔️ |
| Inter-dependency mgmt | Supported. | ✔️ |
| Handling non-native packages | It’s very good at importing packages thanks to template files for all supported build systems and good tooling. | ✔️ |
| Project build systems supported | Creates environments. If your build system can find the dependency via the environment - it’s supported. | ✔️ |
| Dependencies build systems supported | CMake/Meson tested. Documentation mentions many more. | ✔️ |
| Caching | Maintains a global package cache. | ✔️ |
| Build tools | Supported. | ✔️ |
| Other remarks | I’m really positively surprised. I only scratched the surface with my tests. Can’t wait to experiment more. | ✔️ |

Conclusion

The discussed solutions are all solid choices, well worth recommending for more complex projects. So far, in my personal ranking, I’d place conan as my preferred go-to solution, with spack in second place as the simplicity of importing packages really impressed me. With that being said, I’m not even halfway through the list and there’s still a lot to cover in the next instalments of this series.