I've noticed that more and more people like me build and use large alternative-history "standard libraries" that add functionality, reimagine the design, and in some cases reimplement core components as cleanroom rewrites in modern C++. As a result, use of the standard library in code bases is shrinking. You can do a lot more with the language if you have a standard library that isn't shackled by its very long history.
One of the things I did for safety is that all access methods of all my containers bounds-check and throw on null pointer dereferences ... in debug and stable modes. All of that is turned off in the optimized release mode, for applications where performance is absolutely critical. The consistency is very important.
Whenever I get a crash in a release mode, I can rebuild in debug mode and quickly find the issue. And for code that must be secure, I leave it in stable mode and pay the small performance penalty.
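The pattern is simple enough to sketch. Here's a minimal illustration; `MYLIB_CHECKED` and `checked_vec` are made up for this example, not taken from any real library:

```cpp
#include <cassert>
#include <cstddef>
#include <stdexcept>
#include <vector>

// MYLIB_CHECKED is a hypothetical build flag: defined in debug/stable
// builds, left undefined in the optimized release build. Defined here
// only so the example is self-contained.
#define MYLIB_CHECKED 1

template <typename T>
class checked_vec {
    std::vector<T> data_;
public:
    explicit checked_vec(std::size_t n) : data_(n) {}

    T& operator[](std::size_t i) {
#ifdef MYLIB_CHECKED
        if (i >= data_.size())  // checked modes: throw instead of UB
            throw std::out_of_range("checked_vec: index out of range");
#endif
        return data_[i];        // release mode: raw, unchecked access
    }

    std::size_t size() const { return data_.size(); }
};
```

The point is that the same source compiles to checked or unchecked access depending on the build, so a release-mode crash can be reproduced under the checked build.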
Not to mention C++ does not really provide the facilities necessary for convenient, memory-safe and fast APIs[0].
And as demonstrated by e.g. std::optional, the standard will simply offer an API which is convenient, fast and unsafe (namely, you can just dereference an std::optional and it's UB if the optional is empty).
[0] I guess using lambdas a hell of a lot more would be an option, but that doesn't seem like the committee's style so far.
If that were not the case, `optional` would get exactly zero usage. The point of those features is that you build in debug mode, or with whatever your standard library's debug macro is, to fuzz your code, but then don't inflict branches on every dereference in release mode.
The standard library certainly is lacking things which are commonly used (say, JSON parsing or database connection), but I think this is a conscious decision (and IMO the correct decision) to include only elements that have a somewhat settled, "obvious", lowest-common-denominator semantics. There's rhyme and reason to most of the most commonly used elements that is decidedly lacking from e.g. Python's (much more extensive) standard library.
I strongly disagree. It's quite obvious that the C++ standard library does not need to add support for "common things", because they already exist as third-party modules.
In fact, this obsession to add all sorts of cruft to the C++ standard is the reason we're having this discussion.
If there is no widely adopted JSON or DB library for C++ then who in their right mind would believe it would be a good idea to force one into the standard?
And don't get me started on the sheer lunacy of the proposal to add a GUI library. Talk about a brain-dead idea.
People working on other programming language stacks learned this lesson a long time ago. There's the core language and there's the never-ending pile of third-party modules. Some are well-made and well thought-out, others aren't. That doesn't matter, because these can be replaced whenever anyone feels like it. This is not the case when a poorly thought-out component is added to an ISO standard.
Can you, off the top of your head, tell me what irregular modified cylindrical Bessel functions are and the last time you needed to use one? And yet, they were included in the standard library in C++17: https://en.cppreference.com/w/cpp/numeric/special_math/cyl_b...
I work on other projects, or on my own stuff at home, and I can breathe again. I don't always need reverse iterators on a deque, but dammit they are there if I need them.
However, I have been in too much C runtime code to be entirely happy. I've seen too many super-complicated disasters, for instance from someone who really wanted to write the Great American OS Kernel but wasn't allowed on the team, and so had to make their bid for greatness in stdio.h instead. You learned to tread carefully in that stuff, the only good news being that if you broke something it might have turned out to be already busted anyway and no harm done, philosophically speaking, I mean.
There are no good answers :-)
As such it just sounds like a mature technology with a huge adopter base that is still holding traction. Generally, maturity, traction and adaptability can be considered indicators of health, not malady.
Beauty is overstated. Engineering can be art but it doesn't have to be.
Jokes aside, I use C++ daily and see it as Warty McWartface, and could spend a long time ruminating about its faults. But adapting old stuff to new boundaries is always going to be messy. Generally, rewriting history creates more problems than it solves.
The good thing here is that the standard library doesn't require 'magic' to be implemented (unlike Swift where the standard library relies on hidden language hacks).
For example, since the standard library does not have a Matrix class suitable for numerical applications (or maybe it does today...) using multiple libraries each with its own Matrix class is difficult. Multiple libraries are needed since one library may not contain all numerical algorithms one may require for a given app.
This is not a problem for Google where I assume everyone is using internally written code -- but is a problem for most of us.
POCO comes to mind.
There was some talk about an std2, but I gather support for it is too low to be pursued seriously.
An excellent example is std::unordered_map. This type was introduced to address perf problems with std::map. But unordered_map forces closed addressing, separate allocation, etc. which limit its performance. In return you get stronger iterator invalidation guarantees but these are rarely useful. Meanwhile Abseil's swiss tables, LLVM's DenseMap, etc. illustrate what a high-performance C++ hash table could be.
Herb Sutter has concrete proposals to address this issue and Clang already supports them: https://www.infoworld.com/article/3307522/revised-proposal-c...
That’s the whole point: your caveat shows that it’s C/C++ which are unsafe in their very nature and therefore should not be used in code exposed to potentially malicious (e.g. user or network) input. Which is just about everything useful.
HPC are generally closed systems and have different threats, but the industry just needs to run (not walk) away from C/C++ for the majority of use cases.
It has been many years since I shipped a memory bug in C++. It is just not a real worry for me. I am constantly dealing with design, specification, and logic flaws, which affect Rust equally, or more so.
I am aware that there are plenty of other programmers out there, writing bad code in what they would call C++. I would like them to write good code. If it takes Rust to make them write good code, so be it. But if they began writing decent C++ code, that is just as good.
The threshold is not zero memory errors. The threshold is many fewer memory errors than logic or design errors. The more attention your language steals from logic and design, the more of those errors you will have. Such errors have equally dire consequences as memory errors, and are overwhelmingly more common in competent programmers' code, in C++ and in Rust.
C++ is (still) a substantially more expressive language than Rust, which is to say it can capture a lot more semantics in a library. Every time I use a powerful, well-tested library instead of coding the logic by hand, that is another place errors have no opportunity to creep in.
So it's great that Rust makes some errors harder to make, but that is no grounds for acting holier-than-thou. Rust programmers have simply chosen to have many more of the other kinds of errors, instead.
Every programmer who switches from C to Rust makes a better world; likewise Java to Rust, or C# to Rust, or Go to Rust. Or, any of those to C++.
Switching from C++ to Rust, or Rust to C++, is of overwhelmingly less consequence, but the balance is still in C++'s favor because C++ still supports more powerful libraries.
You might disagree, but it is far from obvious that you are correct.
Sure, for the typical user-facing applications HN readers talk about, C++ code can certainly contain worrisome vulnerabilities. But many performance-critical applications can tolerate vulnerabilities in favor of latency.
It seems to me that the world of realtime systems, including avionics, autonomous control software, trading, machine learning, and more, is "not useful" as per your comment. The extreme low-level control that C++ offers, plus its powerful metaprogramming, allows for performance that even Rust cannot hope to rival.
The industry has moved away from C++ for plenty of these user-facing use cases. Codebases like Chrome and Firefox can't just be rewritten in Rust overnight. You can try and rewrite e.g. SSL libraries, but that has its own host of problems (e.g. guaranteeing constant-time operations).
I encourage the people parroting a move away from C++ to really think about what it is that should move and what the pros/cons are. I think you'll find that many of the things at risk (i.e. user-facing applications) are already on their way to being rewritten in Go/Rust.
I think we need to stop talking about C/C++ as if they are particularly related. My opinion about performance and C is I'll happily give up some of that for better security.
Which is both a blessing and a curse. A blessing as it allowed us Pascal/Ada/Modula refugees never to deal with what was already an outdated, unsafe language by the early 90's.
But it also makes it relatively hard to write safe code when we cannot prevent team members, or third-party libraries, from using C-isms in their code.
Regarding the alternatives, Swift is definitely not an option outside Apple platforms. And even there, Apple still focuses on C++ for IOKit, Metal and LLVM-based tooling.
Rust, yes. Some day it might be, especially now with Google, Microsoft, Amazon, Dropbox, ... adopting it across their stacks.
However, for many of us it still doesn't cover the use cases we use C++ for, so it is not like I will impose on myself, the team and customers a productivity pain, taking double the time it takes to write a COM component or native bindings in C++ for .NET consumption, just to feel good.
When we get Visual Rust, with mixed mode debugging, Blend integration and a COM/UWP language projection for Rust, then yeah.
I mean, that's a bit of a cop-out, given C++ has more non-C warts and UBs than it has C warts and UBs at this point. It's not just "copy-paste compatibility with C" which made dereferencing std::unique_ptr or std::optional UB.
The large majority of C++ UB comes from compatibility with ISO C and its 200+ documented cases of UB.
And the ISO C++ working group is trying to reduce the amount of UB in ISO C++, which is exactly the opposite of the ongoing ISO C2X proposals.
string_view is really a non-mutable borrow. But the compiler does not know this.
https://herbsutter.com/2018/09/20/lifetime-profile-v1-0-post...
Why does static analysis not work here?
But it does exist, and does catch some of these errors. Example: https://godbolt.org/z/CZTfSx
> Dereferencing a nullptr gives a segfault (which is not a security issue, except in older kernels).
I know a lot of people make that assumption, and compilers used to work that way pretty reliably, but I'm pretty confident it's not true. With undefined behavior, anything is possible.
This is something really hardwired into the C and C++ language. Even if the underlying operating system perfectly supports dereferencing null pointers, compilers will always treat them as undefined behavior. (In Linux root can mmap a page of memory at address 0, and certain linker options can cause the linker to place the text section starting at address 0 as well.)
Also, in practice you can often get away with dereferencing nullptr so long as you don't actually access the memory, since C++ references are little more than fancy pointers with syntactic sugar. Strictly speaking, though, even forming the reference is already undefined behavior.
For example: int* foo = nullptr; int& bar = *foo; // UB already, but typically no blowup yet std::cout << bar << std::endl; // blowup here
My personal $0.02 is that the C++ standard falls short with language like "undefined/unspecified behavior, no diagnostic required." A lot of problems could be prevented if diagnostics (read: warnings) were required, assuming devs pay attention to the warnings, which doesn't always happen. For example: Google's Protobuf has chosen, at its own and its clients' peril, to ignore signed/unsigned comparison warnings, leaving potential over/underflow errors and vulnerabilities in place.
Either way, C++ is certainly not for every project, but the articles scattered around the web claiming it should be superseded by Rust are plentiful. These opinion pieces make no attempt to credit C++ for when it does make sense to use. Despite its quirks, it is still the most practical way to program HPC applications or cross-platform GUIs that are not Electron-based. The security tools around it and the fact that it's an ISO standard language make it a solid choice for many enterprises.
FWIW, I use C++, not Rust or Swift, and I have a fair amount of knowledge and experience vested in it, but I think these questions are worth asking.
I think 'hate' really represents the mindset of some people (even if they are a minority), but even if we ignore this extreme, the level of irrationality in technical discussions is generally quite high. You need rational people to have a rational discussion. The sad reality is that a lot of technical discussions are only superficially rational and are often a political play to assert superiority over other people (it's true for languages, frameworks, code editors, methodologies, etc.).
Meanwhile the Firefox rewrite, the prime example of what they propose, is still plodding along; Mozilla PR blogs aside, Firefox is still plugging vulnerabilities in each release and will be for the foreseeable future.
Now let's look at the Swift community... do we have blog posts from them every week about how awesome Swift is and why one should rewrite their working C and C++ code in Swift? No, they keep doing their thing, Swift is becoming better at cross platform, it's also getting some support for machine learning.
That's how one grows a language, through building successful projects, staying positive (and having an entire platform behind it). Not through doomsday scenarios and a constant barrage of criticism.
As Rust (or another language with similar safety/performance properties) matures and its ecosystem grows, C++ will increasingly become a language of tiny niches and legacy codebases.
In other words: C++ is the new Fortran.
Which makes Rust the new... APL?
I think the analogy is pretty apt as far as it goes. Fortran by the 70's was a crufty language with a bunch of legacy mistakes that remained very popular and very useful and would continue to see active use for decades to come.
And everyone knew that. And everyone had their own idea about the great new language that was "clearly" going to replace Fortran. And pretty much everyone was wrong. The language that did (C) was one no one saw coming and frankly one that didn't even try to fix a lot of the stuff that everyone was sure was broken.
For myself, I despair that Rust has already jumped the proverbial shark. Its complexity is just too severe; the only people who really love Rust are the people writing Rust libraries and toolchains, not the workaday hackers who are needed to turn it into a truly successful platform.
Rust could easily go the same way.
But, C/C++ is the best option for us for high-performance network processing. We're dabbling with Rust for small applications where we would use Python previously and it's working pretty well -- but there's no way we could use Rust for the core application yet. Modern C++ has really grown on me and it's sometimes a love/hate relationship but totally a huge improvement over ancient C++ or C.
I think the article maybe doesn't do enough to outline the full extent of the problem by focussing on a few counterintuitive cases that are present in C++17, because you're right, all of those cases in the article are ones that can be learned and remembered without issue. The real problem, as I see it, is actually that the core language semantics mean that there's no foreseeable end to the foot-shooting treadmill. Since the language is fundamentally permissive of such things, it's likely that further spec revisions will introduce abstractions like string_view that are easy to use unsafely, aren't flagged by static analysis tools, and end up in security-critical code.
Because this feels like a necessary disclaimer, I don't think that fact justifies migrating every active C++ codebase out there to Rust or anything, since pragmatically speaking there are a lot more factors beyond just core language semantics that go into evaluating the best choice of implementation language. I guess my takeaway is neatly expressed by the post title: there's a sense that I get from C++ users (granted, maybe only naive ones) that sticking to the features and abstractions introduced in C++11/14/17/etc basically eliminates all of the potholes of old, and it's evident that that's not true and will probably continue to be not true.
I'd love to hear more information on this! In my mental model, you could just use Go to replace your Python utilities, but Rust might be workable for your core (or at least its designers would like it to be and would like to know why it isn't).
Hopefully that stuff will be helped with things like Language Server Protocol and Debug Adapter Protocol.
C++ isn't going anywhere. In 20 years you may not be writing in it, but you'll still be calling into it somewhere in the software stack (especially if things continue moving the WebAssembly direction).
Even if you're using Python's SciPy today, you're calling into LAPACK written in Fortran.
Rust is also a good language. Trash-talking C++ does no one any good.
Overwhelmingly, the substantial gains to be made are in moving people off of C. Every other possible benefit is a rounding error. It is still much easier to get people to C++. Once dislodged, they might continue on to Rust, or be seduced by C++'s greater expressive power and more powerful libraries. Either way the world will be better.
C++ today seems weighed down by legacy cruft, compared to Rust, but Rust is rapidly accumulating its own legacy cruft. By the time it is mature it will have easily as much of its own.
Then, you have the cognitive overhead of translating the documentation, and other sample code. In my experience with bindings, this ends up requiring knowledge of the language you're binding to. It seems easier to just write it in the native language instead and deal with those quirks rather than bindings quirks.
Do the Rust bindings show the Qt docs in the autocomplete? If there's not input validation on the binding side, then you'll end up in C++ again figuring out how to sort things out.
Regarding CUDA, I think we're all hoping for a cross GPU alternative. There's OpenCL, Sycl, ROCm, Kokkos but their API is also written in (you guessed it) C++. Need to render to OpenGL? You'll be writing in C. Unless one of the companies decides to replace driver interfaces with Rust, any application using them will be dependent on N bindings working.
You're ultimately not escaping C/C++ for any systems development. You either deal with the complexity of interfacing between language A and C/C++ or just deal with the quirks of C/C++ themselves. Pick your poison.
I’ve done a fair bit of mixed Rust + CUDA C++ though, and found it to be a very nice way to build high-performance code with safe high-level interfaces that someone can grab and use with little to no understanding of GPU architectures. It’s even pretty straightforward to build wrapper types that leverage Rust’s ownership system to track lifetimes and safely manage device buffers. (Unfortunately I can’t release that code, but it really was pretty simple, so hopefully someone else will soon do it openly, or maybe someone already has.)
You make it sound like Ada stopped in the 80's.
They don't release standards in rapid succession but 'Ada 2012' has pretty much all of the features that people were asking for in C++ since 2011.
The only issue (on top of the obvious lack of coolness and hype around it) is that professional-grade Ada compilers/toolchains are still quite costly for single developers or small companies. AdaCore's business model is still pretty much focused on support contracts with big Aerospace/ATC/Defense clients.
<source>:8:16: warning: passing a dangling pointer as argument [-Wlifetime]
std::cout << sv;
^
<source>:7:38: note: temporary was destroyed at the end of the full expression
std::string_view sv = s + "World\n";
^
And cppcheck will catch the second example: <source>:7:12: warning: Returning lambda that captures local variable 'x' that will be invalid when returning. [returnDanglingLifetime]
return [&]() { return *x; };
^
<source>:7:28: note: Lambda captures variable by reference here.
return [&]() { return *x; };
^
<source>:6:49: note: Variable created here.
std::function<int(void)> f(std::shared_ptr<int> x) {
^
<source>:7:12: note: Returning lambda that captures local variable 'x' that will be invalid when returning.
return [&]() { return *x; };
^
Cppcheck could probably catch all the examples, but it needs to be updated to understand the newer classes in C++. Godbolt's compiler explorer is a great tool to try new features (language, compiler, standard library, etc.).
But there is an alternative/complementary approach, which is to simply avoid potentially unsafe C++ elements, like pointers/references, arrays, std::string_views, std::threads, etc., substituting them with safe, largely compatible replacements[2]. This approach has the benefit that an associated safety-enforcing "linter" would not impose the same kinds of "severe" usage restrictions that the lifetime profile checker (or, say, the Rust compiler) does.
[1] https://devblogs.microsoft.com/cppblog/lifetime-profile-upda...
[2] https://github.com/duneroadrunner/SaferCPlusPlus
edit: grammar
There isn’t really any useful safe subset of C++. If there were, Rust may never have been created in the first place.
Until Rust's tooling catches up with what C++/CLI, C++/CX, C++/WinRT + .NET or Java + C++ (Eclipse/NetBeans/Oracle Studio), CUDA, Unreal/Unity, and GLSL/HLSL/Metal shaders allow for, it will stay a safe way to write CLI apps and a couple of UNIX libs.
I like the language and advocate it often, but I am also very pragmatic regarding the areas I and customers work on.
C++ still does not have an absolutely safe subset, but it has a safe-enough subset, and plenty of other merits that will ensure its continued competitiveness.
Rust will continue improving, too, and someday may be as expressive as C++ is today, or perhaps even as expressive as C++ is then. That will be a good day, although by then some other language will be on the rise, its users hoping to displace C++ and, given enough luck and hard work, Rust.
V could be interesting.
"It was already tried and failed, why is this time better"
"Mainstream languages always end up not using it"
"People should reference more the original works of the past"
...
One of the many explanations of the name Rust is that it represents a collection of old ideas. What specific point were you trying to convey?
> Now, the volatile accesses are performed automatically through the read and write methods. It's still unsafe to perform writes, but to be fair, hardware is a bunch of mutable state and there's no way for the compiler to know whether these writes are actually safe, so this is a good default position.
https://github.com/rust-embedded/book/blob/9c05a419fc2ad231c...
I'm being a little snarky here, but if you are truly a macho developer, then crapping out that unsafe code in optimized assembly or C or something should be a really easy chance to show off, and no big deal for such a seasoned developer. Instead, the question is always raised in this more insecure way: for that tiny percentage of the time you have to do some bit-banging (and it's usually pretty small and encapsulated on most embedded projects), you might as well do the whole thing in C or C++.
Ada has a history on microcontrollers, and is old enough we can answer with some certainty.
You may end up having to bypass bottlenecks caused by runtime checks... And that means unsafe code in a critical location.
Performance or safety. At different parts of your code you may find yourself choosing.
If you can write arbitrarily to any position in RAM then it's not a memory-safe language. You can hide accesses behind some kind of abstraction (unsafe blocks, libraries or whatever). But that's not a memory-safe language, it's the developer making a contract with himself agreeing to not allowing direct, straight accesses (like those caveats we C/C++ developers do to not make "memory bugs"). Some microcontrollers can protect certain memory areas and raise access faults, but those are the same faults for a program written in C, C++, assembler, etc., and not related to the language.
One can make the safest RTOS in trendy-safe-lang for some small microcontroller, but the end users (developers) would still be able to write some unsafe code and blow a fuse.
A different thing is for MCUs with MMU and/or an OS that handles virtual memory per process/thread, but that wasn't your question.
As if C++ compile times aren't crazy enough already.
So, how then? That's the main question indeed :)
By then, many will also be writing it in Rust, and you will be sneering at them, too. It has always been easy to sneer at people busy making things work.
I feel like the lambda example is pretty contrived. If I was returning a lambda that was capturing values by reference, I would already be pretty wary of UB.
Is this really true? Surely it just gives you an uninitialised `int` (or whatever is in the `optional`)?
Then came an OS, with a symbolic price instead of the typical market prices of competing OSes, alongside source code tapes, and a systems programming language that was the "JavaScript" of system languages.
Assume we have an abstract developer who has good knowledge of programming theory but no experience in programming languages.
The developer starts a new project, but in what language?
web: Don't see any reason for this; lots of great alternatives exist. This is not really one of C++'s strengths, so that's not strange.
desktop-GUI: Probably one of the biggest strengths of C++ is the Qt framework; it is a solid choice, and I can see this as a possible pick. However, with Electron dominating and PWAs becoming a more viable option, there is a much higher chance that an HTML/JS environment is picked instead, especially given how it already dominates the web. And by using TypeScript you can do it in a solid language.
mobile apps: Most apps are written in some web technology or directly in Swift or Java. Qt has some support for this, but I'm not sure how widely used it is. My experience with the NDK was not pleasant. I can't really see this as a viable option.
embedded: I don't do embedded, but my understanding is that plain C is much more common here, and if faster development is needed you integrate something like Lua. Maybe?
memory safe: Use Rust, I guess.
compiled binary: Use Go; no complicated build step.
Parallelism: Better to use a language designed for this, like Erlang.
Game development: For the majority of games today a scripting language like JavaScript or Lua is good enough. HTML/JS has some really good frameworks for game development today.
3D game development: Probably a good fit to use C++, but I think that C# with Unity is a much better choice. Great framework, good community, however C++ is not bad choice for this. Possible.
Commandline tool: If the developer is building the next grep, C++ could fit, but most command-line tools don't have that performance requirement; they probably do some HTTP, JSON decoding, DB access. Bash is good enough, or any other dynamic language.
Scientific: My understanding is that today this is mostly Python or Matlab. Maybe?
System development (drivers etc.): I know too little about this to make a good assessment; to be fair, I put this down as a possible choice.
And if the developer does decide to use C++ for a new project, the initial cost of just understanding the basics is quite high, even with the latest C++ version: copy constructors, lvalues & rvalues (xvalue, glvalue, prvalue...), move semantics, const refs, RVO, smart pointers, auto, etc.
Any good arguments to pick C++ for a new project?
desktop-GUI: I guess you might be joking here with Electron; I'd rather use my GPU for something other than blinking cursors. Even with Cocoa, UWP and WPF, the underlying UI shaders are written in C++.
Embedded: Yes, C does rule over C++ here, which is one reason embedded is so open to security exploits due to incorrect manipulation of strings and arrays.
Parallelism: HPC, FinTech, GPGPU, all domains where C++ rules for the time being.
3D game development: C++ is king here, even with Unity the core engine is written in C++. Yes many of us hope to see the day when Unity is 100% written in a mix of C# and HPC#, but even then, LLVM will be part of the stack.
Scientific: Someone needs to write those Fortran and C++ libs called by Python and Matlab.
System development: Google, Apple and Microsoft use C++ on their driver layers for their respective OSes.
IDE tooling: C++ is known for not having IDEs that match what Java/.NET are capable of. The languages that want to take C++'s place are even worse off than C++ in IDE tooling.
Embedded: But how is that an argument for C++? I can do safe stuff in Lua without the hassle of C++, and then use C when needed.
Parallelism: Agreed. Here we have a case.
Scientific: Yes, someone needs to write the underlying libraries for Python & Matlab, but if you are starting a new project, do you actually start by writing a library, or do you use an existing one?
IDE tooling: Yes, good tooling can be a good argument in itself to pick a technology. With C++, maturity is a clear advantage; however, some of the languages I listed do have quite nice tooling today, e.g. C# & TypeScript.
Probably niche, but almost all audio DSP (VSTs & co.) uses C++. See the JUCE framework. Possibly other "multimedia" stuff too (video, image manipulation, etc.).
HotSpot, ART, V8, SpiderMonkey, Chakra, JSC, Dart, and CLR are all written in C++. Are there any modern serious language VMs that aren't written in C++?
Having a program cause a memory violation and be killed by the OS is the best possible outcome in this case, it stops the program from doing any damage and you get a clear symptom of the problem for debugging.
It's when the issue is not that obvious that you're in real trouble because it may start behaving erratically, corrupt data and be exploited by malicious actors to get access to resources that shouldn't be exposed.
I always thought that undefined behaviors were historical accidents. But apparently sometimes people just say "hey, let's add a few more undefined behaviors".
This is the insanity of C++
There’s this article [1] about compilers exploiting undefined behavior but ... it’s already undefined behavior.
[1] https://devblogs.microsoft.com/oldnewthing/20140627-00/?p=63...
An iteration statement whose controlling expression is not a constant expression, that performs no input/output operations, does not access volatile objects, and performs no synchronization or atomic operations in its body, controlling expression, or (in the case of a for statement) its expression, may be assumed by the implementation to terminate.
And C++11 added this bit that wasn't there in C++03:
A loop that, outside of the for-init-statement in the case of a for statement,
- makes no calls to library I/O functions, and
- does not access or modify volatile objects, and
- performs no synchronization operations (1.10) or atomic operations (Clause 29)
may be assumed by the implementation to terminate.
[Note: This is intended to allow compiler transformations, such as removal of empty loops, even when termination cannot be proven. -- end note]