I find that the real problem is no one wants to properly learn how their build system works. I don't care if it's make, cmake or bazel -- whatever it is, you need to _learn_ it. I've worked with folks with 20 years of experience, fantastic C/C++ developers, who look at a makefile and say "Ugh, what is this complicated mess, can you do it in cmake or bazel or something", and expect a silver bullet wherein the makefile build will somehow transform itself into a self-describing, intuitive build system by virtue of some sort of hype-osmosis.
This is so true, it happened to me more than once.
A couple of projects ago, we had a complicated build process (7-8 manual build steps, each depending on files generated by earlier steps) for an embedded system. I wrote a little makefile that replaced all 7-8 shell scripts, and I was asked to redo it in CMake. I was like, wtf. Each clearly defined step in the makefile would turn into multiple unreadable function calls in CMake. Why would anyone want to do that?
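A pipeline like that maps naturally onto make's dependency graph. A minimal sketch, with made-up file, tool, and target names standing in for the real generated intermediates (recipes are tab-indented):

```make
# Hypothetical three-stage pipeline: each step consumes a file that an
# earlier step generated. All names here are illustrative.
firmware.bin: app.elf
	objcopy -O binary app.elf firmware.bin

app.elf: main.o generated_tables.o
	$(CC) -o app.elf main.o generated_tables.o

generated_tables.o: generated_tables.c
	$(CC) -c generated_tables.c

# The code-generation step the shell scripts used to sequence by hand
generated_tables.c: tables.yaml gen_tables.py
	python gen_tables.py tables.yaml > generated_tables.c
```

make works out the ordering from the dependencies, so touching tables.yaml rebuilds only what is downstream of it instead of rerunning every script.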
Not that Makefiles are perfect, but sometimes the right tool for the job isn't the shiniest. Make does a good job of being "good enough" for a lot of little tasks.
If this sounds like you, do yourself a favor and go through these slides (or watch the talk they came from):
https://github.com/boostcon/cppnow_presentations_2017/blob/m...
It really clarified things for me, but also avoids going too much into detail. You will definitely need more info as you go along, but you can look those things up in the docs. This presentation does a good job at showing the core essentials that you can build your knowledge on later.
The problem in my experience is you invest time learning build system A, then a year later build system B comes out, and not only do you need to relearn a bunch of stuff again, often build system B does some of the stuff of system A but not all of it, plus it does new stuff that you've never encountered before. Then this cycle repeats, endlessly, and every new team you join has adopted the newest build system.
Granted, some ecosystems are worse than others here. In the JavaScript world it went something like: make/grunt/bower, gulp/webpack, esbuild, parcel, vite, rollup, and on and on it goes.
Even in the conservative Java ecosystem we've been through maven, ant, groovy/gradle...
Most of these tools offer incremental improvements for a huge learning cost. It's a nightmare.
I end up "randomly" stabbing at things until it works just well enough to get that particular thing done, then dropping it all because it was such a painful experience.
Compared to something like cargo, which works really well, C++ and its build tools just feel flaky.
It may be that I'm just missing a mental model to get to grips with it, but no other major programming language is like that from my experience.
Looking for something that is still alive in 2023
Theon Greyjoy in "Game of Thrones" exhibited this condition.
It is a thing to overcome.
What on earth.
#pragma comment(lib, "xxx.lib")
You can specify it in a header file for the library. That way, if you include the header file, the library mentioned will automatically get linked, as long as it is somewhere in the library search path.

I have found myself wishing that GCC would also get something like this.
function I2C_GetNumChannels(out numChannels: Longword): FT_Result; stdcall; external 'libmpsse.dll';
and that was it; but to do this in MSVC you needed not only the .h header and the .dll itself, you also needed that stupid .lib file that, AFAYCT, had literally nothing inside it except symbol entries that said "no, load it dynamically from this .dll on startup, please". So it was a rather common source of amusement for Delphi programmers that, paradoxically, it was harder to link a program written in C against a DLL written in C than it was to link a program written in Delphi against a DLL written in C.

cc main.c -o bla

I suspected that someone might have done it before, but didn't know of any implementation. I'll take a closer look at Visual C++ (used it in the last millennium for work) before deciding how mine should work.
FD: CMake developer
Of course, the other alternative is to simply #include _every_ file in your project into a single source file, then compile that. It’ll probably be faster than anything else you do, and eliminates several other foot-guns as well. And it means that your build script can just be a shell script with a single line that runs the compiler on that one file.
But these days I greatly prefer Rust, where building is always just “cargo build”. Doesn’t get much easier than that.
> Of course, the other alternative is to simply #include _every_ file in your project into a single source file, then compile that.
Yeah, no... recompiling the entire project whenever any file is touched is way too slow for any non-trivial project.
I'm not sure why C and C++ have such a bad story here. Some combination of greater intrinsic complexity (separate headers, underspecified source-object relationships, architecture dependency, zillions of build flags, etc), a longer history of idiosyncratic libraries which people still need to use, the oppressive presence of distro package managers, and C programmers just being gluttons for punishment, probably.
Not always the case; I have a project with

    default.o: default.yaml
        $(LD) -r -b binary -o default.o default.yaml

and a default.h containing

    extern const char _binary_default_yaml_start[];
    extern const char _binary_default_yaml_end[];
    #define PARAM_YAML _binary_default_yaml_start
    #define PARAM_YAML_LEN (_binary_default_yaml_end - _binary_default_yaml_start)

used in the main code as

    fwrite(PARAM_YAML, 1, PARAM_YAML_LEN, stdout);
printing the contents of the yaml file to stdout.

Your use case would be served by C23's #embed [1]. The same thing has been proposed for C++ but repeatedly kicked down the road, because the standardisation committee wanted to make it more general even though no one had any demand for that, so they didn't know what it would look like. (C++ standardisation in a nutshell.)
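For reference, the #embed equivalent of the parent's linker trick might look like the sketch below. It requires a C23 compiler with #embed support (e.g. recent GCC or Clang), and the file name is taken from the parent comment:

```c
#include <stdio.h>

/* C23 #embed expands, at compile time, to a comma-separated list of
   the file's bytes -- no ld -r/objcopy step and no linker symbols. */
static const unsigned char param_yaml[] = {
#embed "default.yaml"
};

int main(void) {
    fwrite(param_yaml, 1, sizeof param_yaml, stdout);
    return 0;
}
```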
> “If you want something like this to work, you have to commit to a certain amount of consistency in your code base. You might have to throw out a few really nasty hacks that you've done in the past. It's entirely likely that most people are fully unwilling or unable to do this, and so they will continue to suffer. That's on them.”
That goes for almost everything when developing code that ships.
Today, I am in the initial stages of rewriting an app with a codebase that has “accreted” over two years.
It’s kind of a mess (my mess, to be clear).
I’ll be adding a great deal of rigor to the new code.
I think it will come out great, but I have my work cut out for me.
Conan has a learning curve, but it’s totally worth it. Anyone making their own build system should get some experience with a state of the art package manager before writing a single line of code, because chances are that it already solves whatever problem is motivating you.
Conan obviously has promise; I haven't spent much time with it, as most of my experience with C++ package managers is with nuget and vcpkg. However, my attitude toward package managers is changing.
I increasingly like _not_ using package managers because it makes me (and my company) way way way less likely to bloat our software with unnecessary third party dependencies.
I wrote this in another thread: I never believed you should write something yourself if you can find a package for it. But my boss told me I should write it all myself; I could probably make it faster. I encountered a case where I needed to compare version numbers in Python. For the heck of it, I wrote the simplest, quickest, most naive solution I could come up with and then timed it against the most recommended version-comparison package in Python. Mine blew it away, with 20x the throughput.
I don't believe in package managers anymore. Obviously I'll keep using pip and sqlalchemy in Python, but I'll happily spend the 20-30 minutes it takes adding something like nlohmann-json or md4c to my project over worrying about maintaining a package manager for C++ these days. Precisely because it makes me think twice about adding another dependency.
And yaml parsing is probably on the simpler side of things. We need to run torch models, we do need libtorch. We are not rewriting libtorch, that would be silly.
vcpkg/NuGet + (any build system) = problem solved
    nix flake new --template "github:nixvital/flake-templates#cpp-starter-kit" my-project

will create a skeleton for my new C++ projects.