The compensation packages we are able to offer new hires mean we're generally hiring from the middle of the talent pool, not the upper tier.
The complexity of C++ has long since outpaced the fluency of the hiring pool. In my experience, the average C++ professional who applies to our open job ads knows C++14 and might not balk at the occasional C++17 feature.
It doesn't matter if the compilers support modules or not, because in practice, I won't be able to use modules in the workplace for ages.
--
Standard disclaimer - I'm not able to predict the crush of changes coming as generative AI for software development proliferates.
When a new feature comes out, it's best to let it settle in a bit: experiment with it on smaller side projects, but avoid adopting it in earnest before it's really well understood.
For example, I now cringe every time I see curly-bracket initialization of the form T{...}. It was advocated as the one true way to initialize everything in C++, only for everyone to realize a year later that it has footguns of its own (such as with initializer lists). With C++20 fixing almost all of the original problems that motivated T{...}, the best practice nowadays is to go back to plain old T(...); there's little to no reason to use T{...} anymore.
There was also Herb Sutter's Always Use Auto, which was then revised to Almost Always Use Auto, and now I think most developers take the sensible approach: use auto for redundant names like iterators, or for unnamable types like lambda expressions, and avoid using it everywhere, which would turn your codebase into an opaque soup of inscrutable keywords.
You mention the average C++ programmer won't know the latest features, but if you did find an enthusiast who knew the latest features, you and the team probably wouldn't allow the use of those new features.
I can't imagine a more soul draining job than maintaining a corporate C++ codebase. Talk about doing the bare minimum.
To me, that's anything web-related in Java/C#/Go/JS.
Especially if most of the developers learnt Microsoft Visual C++ and believe that is proper C++!
This is C++'s way of finally getting rid of them, akin to Swift or Rust.
I feel those statements are related.
The feature has so many irregularities that could only come out of a standards process, there are too many compiler bugs (just try using header units), the different implementations are too fragmented (I’m only using clang, which makes this easier on me), and there is a lack of things like module maps that would dramatically improve usability.
I work extensively in the embedded space, and unfortunately C and C++ are still pretty much the only viable languages. I cannot wait for the day Rust or some other language finally supplants them.
I think if you want to work on systems software you should enjoy working with legacy cruft, otherwise you're just going to be miserable. I mean it's everywhere from the language, to the POSIX APIs, to device drivers, to hardware quirks... C++ is only a quarter of the problem.
For compiled garbage-collected applications (web/cli): Go.
For high-level applications (web/cli/etl/desktop): Java, C#.
Also, here is a good writeup: https://hackernoon.com/the-real-c-killers-not-you-rust
discussed here two times:
If you had to pick only one language to use for everything, you'd pick C++. It can do it all, from bit fields to polymorphic classes to closures; it's safer and saner than C (you haven't read the standards if you think otherwise); and it's got a greater level of support and maturity (and probably a longer lifespan) than any other comparable language.
However, a key opportunity is missed in that neither the icon nor the site links in the footer linked to a short definition of the language before modules (the lack), the impact of modules on the design of the language at present (the real) and its place in the future of programming languages (the imaginary and the symbolic).
This is the source of all the evil. Even a hello-world program involves reading through hundreds of kilobytes, often megabytes, of headers that have to be parsed again and again for every source file. Worse, that parsing can produce totally different outcomes in each case, depending on the compiler, the OS, the definitions on the command line, whatever is defined in the source file itself, and how the filesystem is laid out.
You can forget managing dependencies on large projects this way - they are overwhelming. Every build system has to be leaky and imperfect to avoid drowning in dependencies, and even the fanciest ones tend to have big holes here and there, or they fall back on huge "catchall" dependencies to stay correct at the cost of efficiency.
I hoped modules would remove this problem, but so far I'm not sure. I'd love to hear from someone who has used them. What I've read about them didn't leave me hopeful - I got the impression they're a bit like pre-compiled headers.
[1]: https://clang.llvm.org/docs/StandardCPlusPlusModules.html#he...
I think this line on its own sums it up.
This website scrapes vcpkg's registry[1], which contains many C libraries which are unlikely to ever receive C++20 module updates. Many are primarily binary executable packages, like lunarg-vulkantools. It is quite unfair to judge C++ module support by this. There are even bugs in the table: the issue tracking Vulkan-Hpp module links to https://github.com/KhronosGroup/Vulkan-Hpp/issues/121, but it was actually implemented in https://github.com/KhronosGroup/Vulkan-Hpp/issues/1580 (full disclosure: I implemented it).
Boost maintainers have picked up on this[2], which is big.
The big 3 compilers have had a myriad of bugs, ICEs, and redefinition errors, despite what is claimed on cppreference[3]. VS 2022 17.10 will only just fix some of these, and G++'s module support isn't even released yet. Clang 18 has seemingly full(er) support for C++20 modules, but clangd is broken, and it seems mixing standard library headers and `import std` might still break, as will header units (`import <header>`).
CMake released C++20 modules support with 3.28, and will release `import std` support with 3.30.
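For reference, a minimal sketch of what CMake 3.28's module support looks like (target and file names here are hypothetical):

```cmake
cmake_minimum_required(VERSION 3.28)
project(modules_demo CXX)
set(CMAKE_CXX_STANDARD 20)

# Declare a library whose interface is a C++20 module unit.
add_library(mymod)
target_sources(mymod
  PUBLIC FILE_SET CXX_MODULES FILES mymod.cppm)

# A consumer that does `import mymod;` just links as usual;
# CMake handles the dependency scanning and build ordering.
add_executable(app main.cpp)
target_link_libraries(app PRIVATE mymod)
```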
This is painful, but IMO worth the paper cuts that bleeding-edge adopters will experience over the next year or so as modules get implemented.
I fully believe that a good one-third to half of the build time and power consumed over the past 40+ years of compiling C and C++ code (considerably more in the case of template-heavy, header-only C++ libraries and projects) has gone to parsing and re-parsing headers and the resulting output.
Headers are a distinctly 1970s approach to compartmentalisation. Other languages sorted out dependency and library/import resolution years ago; there's no reason the C and C++ world has to stay stuck with essentially copy-pasting code over and over. The embarrassingly parallel building that results from headers is fake; it takes more time and more energy than strictly necessary.
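For contrast, a sketch of the module-based alternative (names are invented; these are two separate files, shown together):

```cpp
// mymod.cppm -- module interface unit, parsed and compiled exactly once
export module mymod;

export int answer() { return 42; }

// main.cpp -- a separate file; `import` loads the compiled module
// interface instead of re-parsing header text, and macros defined
// here cannot change what the module means
import mymod;

int main() { return answer() == 42 ? 0 : 1; }
```

The key difference from headers: the importer consumes a compiled artifact, so nothing in the importing file can alter the module's contents.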
[1]: https://vcpkg.link/browse/all
And probably much more.
Here is my, uh, “favorite” https://arewereorganizedyet.com/
I'd recommend becoming an expert in Python modules. How they're packaged, how they're referenced and installed by pip, etc. Then learn how headers and translation units work in C++. How templates operate is an important concept to understand. Jumping right to C++ modules without a deeper understanding of the C++ compiler or without a reference point for other languages' module concepts will only lead to confusion.
If you're totally new to C++, I'd actually recommend reading "The C++ Programming Language" cover to cover just to "know what you don't know" and then roll up your sleeves and get some experience with a hobby project.
Also, it might not be a popular opinion, but I think Bjarne's books are just fine.
A Tour of C++ (3rd edition) [2]
Principles and Practice Using C++ (3rd Edition) was just published in April 2023 [3]
[1] https://github.com/isocpp/CppCoreGuidelines/blob/master/CppC... [2] https://www.stroustrup.com/tour3.html [3] https://www.stroustrup.com/programming.html
The modern parts of C++ are alright (inelegant and cumbersome, but alright), but because the language has grown over the years and best practices have changed, it's difficult to see which parts are worth learning as a newcomer.
That's why I like the books mentioned above. They give concrete do's and don'ts, explain some important concepts like move semantics, and show why and how best practices changed.
Keep in mind that these books require some preliminary knowledge, but you should be fine if you learn the basic stuff from an online tutorial before going through them.
(edit: Conan seems to address C++20 modules and to seek compatibility, but as a non-C++ developer I'm not sure I'm reading it right: https://blog.conan.io/2023/10/17/modules-the-packaging-story...)
I did not expect it to be the expected year of completion.
In the future, things should not get standardized without an actual working implementation of the feature that people can actually use. Even better would be to have multiple similar non-standard features that each compiler can implement, with standardization then serving as a way for them to converge.
It's even worse in embedded, where it takes even longer for these modern compilers to land in the toolchains. For example, at work we are looking forward to getting GCC 13 later this year so we can use some features that were lacking in the GCC 11.3 we are on currently.
I don't think we can expect modules to be widely adopted before, like, 2026-28.
Also, we're on GCC 8! C++17 is the highest it can go. TnT