If I have a lightning cable and a device with a lightning port, I can plug it in and not really think much about it.
If I have a device with a USB-C port and a USB-C cable, I have no idea whether I can charge at full speed or even charge at all.
As an example, I plug my Xbox controller into my PC via USB-C, and it keeps it fully charged. Recently I tried to plug my USB-C headset into the same cable and it didn’t charge at all. In fact my headset died shortly after I unplugged it (after “charging” it all night), which was a bit of a pain. Turns out it can charge with my laptop charger or my iPad charger, but not with the Xbox cable plugged into my PC.
Maybe this isn’t a totally fair comparison, but it’s my personal experience.
The only difference now is that you also have the option of using other cables and adapters if you want/need to, as well as using your Apple charging gear to charge other things.
The specification is very clear with regard to chargers: a charger with more watts is always better than one with fewer watts. For any pair of chargers, if charger A provides X watts and charger B provides Y watts, then whenever Y > X, every device that can be charged by A can also be charged by B. This means a laptop charger will always also charge your smartphone, although the opposite might not be true due to the laptop having a minimum power requirement.
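The ordering this comment describes can be sketched as a trivial check (hypothetical wattages and function names; this simplification assumes, as the comment does, that wattage is the only thing that matters in negotiation):

```python
def can_charge(charger_watts: float, device_min_watts: float) -> bool:
    """A charger works if it meets the device's minimum power requirement.

    Simplified model: treats wattage as the only negotiation factor.
    """
    return charger_watts >= device_min_watts

laptop_charger_w, phone_charger_w = 96, 20  # hypothetical charger ratings
phone_min_w, laptop_min_w = 5, 45           # hypothetical minimum requirements

# The bigger charger handles everything the smaller one can:
assert can_charge(laptop_charger_w, phone_min_w)
assert can_charge(phone_charger_w, phone_min_w)
assert can_charge(laptop_charger_w, laptop_min_w)
# ...but not the reverse: the phone charger can't meet the laptop's minimum.
assert not can_charge(phone_charger_w, laptop_min_w)
```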
Here I can charge my laptop (although slowly) via the USB-C port on my PC. All other devices work fine too.
And I’ve had more Lightning cables break in the last 2 years than I had Lightning cables in the 7 years before that.
I have circa 10 different USB-C devices, and I use all their provided cables to charge all the other accessories. Everything works as expected: charges fast, transfers fast. The key is to not cheap out and buy the cheapest crap. Just like with anything else in life.
Considering all the devices you listed need less than 60W, they should charge with all cables. If they do not, the manufacturer gave you a broken cable which does not follow the USB-C specification. Blame the manufacturer, not USB-C.
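The 60 W figure comes from the USB-C cable rules: every compliant USB-C cable must carry at least 3 A, and standard USB PD tops out at 20 V unless the cable has an e-marker chip rating it for 5 A. A quick sanity check of that arithmetic:

```python
# Why 60 W is the safe baseline for any spec-compliant USB-C cable:
base_current_a = 3     # amps: minimum every USB-C cable must support
emarked_current_a = 5  # amps: cables with an e-marker chip
max_voltage_v = 20     # volts: standard USB PD maximum

print(base_current_a * max_voltage_v)     # 60 W with any compliant cable
print(emarked_current_a * max_voltage_v)  # 100 W needs a 5 A e-marked cable
```

So if every listed device needs under 60 W, any cable that follows the spec should do.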
Most of the time, you avoid problems of "PD device hierarchy" by using a proper charger (i.e. a device that only has the functionality to charge other devices).
The problem is with the "most", which means you can't count on it.
I have USB-C headphones that can be charged with my HP USB-C laptop charger (which is a "proper charger", I guess, since it only does charging) or from a regular computer USB port. They don't pretend to have any high-powered charge mode (the manual says 3-hour charge time and "USB charging").
My USB-C e-cig won't charge from that. It will only charge from either my PC's USB-A port or a random "low power" USB phone charger.
I haven't tested it with any of the fancier high-powered adapters, since I don't own any. But, clearly, USB-C charging ports are not that universal.
It’s ultimately not a big deal, and it’s surely something I can learn to intuit. But I think lots of consumers (I think about my poor mother) would benefit from a clearer labeling scheme on the various devices, ports, chargers, or cables.
Lightning isn’t that much better though. You can get Lightning-to-USB-C cables, so you can plug into various bad devices and have the same USB-C problems, though you likely won’t try to plug into a monitor, for example.
There’s maybe an argument to be made about the ports too. I think it would be harder to clean dust/fluff out of a USB-C port and maybe they could be more fragile too because of the spike in the middle of the port.
It still seems to me like Apple probably also likes that they have a lot more control over Lightning. And they already have USB-C on iPads and computers, so they clearly don’t think it is totally terrible. But there are a lot more iPhones in the world than other Apple devices, and I can imagine, e.g., Apple having to spend a load of money on customer support due to USB-C issues, or getting blamed when things go wrong because of the cable or device on the other end.
I think it would be better if the possibility of connecting two devices with a cable implied that they would work together, but I’m not sure how that could be done without either more ports or more expensive cables/charging bricks. And I don’t know why a USB-like organisation would have more success at ensuring things follow the standards than USB currently has.
The simple solution, as done elsewhere across many technical endeavors, is to mandate standards; that's why the ISO exists.
For example, if I plug a 220/240 V appliance into a wall socket that's rated at either 220 or 240 V, then it should work properly; one doesn't expect, say, 400 V out of said socket.
Countries mandate a given voltage within a well-defined tolerance specifically to avoid malfunctions and equipment damage.
A manufacturer that goes against mandated standards should suffer the consequences.
If other industries have no problem with mandated standards then why should the IT/computer industry be excepted or any different?