For example, in iOS development (and, more generally, on Apple platforms), there has been a huge shift from Objective-C to Swift.
The same arguments should apply there. Swift is much better, but Objective-C got the work done, and many codebases were written in it, especially at Apple. And yet, the whole community switched pretty quickly.
One could argue that Swift was easier to pick up for newcomers. While that's true, I would then expect the argument to also apply to SQL alternatives.
So, what is the difference here?
Apple is automatically the loudest voice in the room for iOS development. If they embrace Swift, the writing is on the wall for Objective-C. And it's not just sticks: I'm sure they also put a lot of effort into making the transition as easy as possible.
With SQL, there is no equivalent voice. I think a better example is x86.
It took a generationally new form of computing (mobile) to give ARM the momentum to seriously challenge x86. It's been almost 15 years since the original iPhone, and we're only now seeing ARM-based processors in computers.
SQL IMO is EVEN harder to displace than x86. x86 has a massive ecosystem but only two serious manufacturers. SQL has a similarly massive ecosystem AND is the de facto language of the vast majority of major databases. Going to be very hard to unseat.
Are we just ignoring the Acorn Archimedes series of computers which gave rise to the ARM processors in the first place, fully 20 years before the first iPhone was launched?
The consensus among relational theorists appears to be that data types can be arbitrarily complex. For instance, C. J. Date and Hugh Darwen write the following ([1] page 56):
Third, we remind you that types are not limited to simple things like integers. Indeed, we saw in Chapter 1 that values and variables can be arbitrarily complex—and that is so precisely because the types of those values and variables can be arbitrarily complex. Thus, to paraphrase a remark from that chapter, a type might consist of geometric points, or polygons, or X rays, or XML documents, or fingerprints, or arrays, or stacks, or lists, or relations (and on and on).
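To make the point above concrete, here is a toy sketch (in Python, not in any real database) of a relation whose tuples carry a complex-valued attribute: each person's "skills" attribute is itself a small relation, and relational-style restriction still works over it. All names and values are invented for illustration.

```python
# A toy relation: a set of tuples. The second attribute of each tuple is
# itself a relation (modeled as a frozenset of (skill, years) tuples),
# illustrating Date & Darwen's claim that attribute types need not be
# simple scalars. All data here is made up.
employees = {
    ("Alice", frozenset({("SQL", 5), ("Swift", 2)})),
    ("Bob",   frozenset({("C", 10)})),
}

# Restriction on a complex-valued attribute: who knows SQL?
knows_sql = {
    name
    for (name, skills) in employees
    if any(skill == "SQL" for (skill, _) in skills)
}
print(knows_sql)  # → {'Alice'}
```

Real systems expose versions of this too (e.g. array and composite column types in some SQL databases), though the sketch above deliberately stays outside SQL to keep the idea visible.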
I think the same can be said of Java -> Kotlin, C -> Python (I know, I know), and lots of other medium-to-large scale language migrations over the past several decades. When people move to a new language, it's because there's strong interoperability with what came before that everyone would like to quit using but can't because they have too much invested in it.
This suggests to me that anything that wants to beat SQL will in fact have to compose with it — probably partly by generating it, but also by having a fairly solid "drop down to SQL" story. In other words, a language that, at least on the read side, can somehow take two separate SQL queries and automatically rewrite them as subparts of a single SQL query. It might not be fast, but it needs to work, because you're going to want to reuse that one gnarly query you wrote that gets all the important business metrics, and you're also going to want your results to be free of consistency issues.
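As a minimal sketch of what "take two SQL queries and rewrite them as subparts of a new query" could mean, here is a string-splicing version that wraps two SELECTs as common table expressions and joins them. The table names, columns, and the `compose_queries` helper are all hypothetical; a serious implementation would parse the queries rather than splice text, precisely so it could verify consistency.

```python
import sqlite3

def compose_queries(q1: str, q2: str, join_col: str) -> str:
    """Wrap two SELECT statements as CTEs and join them on join_col.

    Naive by design: assumes both inputs are plain SELECTs that expose
    a column named join_col. Real composition would need a SQL parser.
    """
    return (
        f"WITH a AS ({q1}), b AS ({q2}) "
        f"SELECT * FROM a JOIN b USING ({join_col})"
    )

# Demo against an in-memory SQLite database with made-up tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, revenue REAL)")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 9.5)")
conn.execute("INSERT INTO users VALUES (1, 'Alice')")

metrics = "SELECT id, revenue FROM orders"
people = "SELECT id, name FROM users"
rows = conn.execute(compose_queries(metrics, people, "id")).fetchall()
print(rows)  # → [(1, 9.5, 'Alice')]
```

Because both original queries run as CTEs inside one statement, the database evaluates them in a single consistent snapshot — which is exactly the consistency property the comment above is asking for.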
This is only true for hobbyists and indie/startup developers. The largest and most popular iOS apps remain Obj-C (with a lot of C++ and in-house frameworks in the mix). There’s no incentive to rewrite something like the Facebook app in Swift.
If one company had all of the SQL market share and it switched to a new language, everyone else would have followed.
But that's not the case here.