The square brackets made it very clear when you were engaging with Smalltalk-style message-passing semantics. But the language was still a full superset of C, so you could call any legacy code you wanted, or drop down to plain C for performance-critical parts.
And for all the criticism the language received, it was still the perfect fit for the original iPhone: performance-critical parts in C for the constrained hardware, while still allowing rapid iteration on dynamic application UIs.
Meanwhile, Swift is such a mess that even its own creator said the following in an interview:
“Swift, the original idea was factor complexity (…) massively failed, in my opinion (…) Swift has turned into a gigantic, super complicated bag of special cases, special syntax, special stuff”
Now, ten years later, Apple is stuck with hundreds of engineers trying to improve the language and rewrite some of the APIs (and apps) in Swift, all with very little user benefit. This actually reminds me of the Apple-without-Steve-Jobs era.
Imagine if Apple had simply said, "We are going to use Objective-C for another ten years," on a wait-and-see approach. I think the decline in Apple's quality is simply that when Apple has too many resources, different teams all compete for them and push something out for credit, in resume-driven development.
It’s a stark contrast to e.g. Android world where making use of languages/libraries outside of the JVM bubble is technically possible but not necessarily a good idea in many if not most situations due to the caveats involved.
From the official NDK documentation:
"The NDK may not be appropriate for most novice Android programmers who need to use only Java code and framework APIs to develop their apps. However, the NDK can be useful for cases in which you need to do one or more of the following:
- Squeeze extra performance out of a device to achieve low latency or run computationally intensive applications, such as games or physics simulations.
- Reuse your own or other developers' C or C++ libraries."
Anyone who keeps not understanding that point of view naturally runs into walls that the Android team has no plans to ever change.
Isn't Truffle/GraalVM an attempt to bring in more interpreted languages under the JVM?
This is the aspect that IMO was most harmed by the transition to Swift - and then later to a much deeper extent by SwiftUI, which makes quickly refactoring UI code very painful.
I'm surprised the wider FOSS community didn't adopt the language. I've been building a GTK4 app recently, and the macro-heavy class boilerplate, C-style casting everywhere, and custom signaling mechanisms would all be far cleaner in Objective-C. It's easy to imagine GLib and GTK as what could have been a FOSS parallel to Core Foundation and Cocoa.
If you like C, then Objective-C is definitely worth a look. You don't need a Mac to try it either [1].
I'm not sure. I've written a lot of GUI code over several decades, and I think dynamism is only slightly helpful. I've been writing a new GUI in TS (not at all ready for publicity yet) that aims to rethink GUIs from the ground up as if the 80s and 90s never happened, but with the benefit of hindsight, much like Go did with C. I've been meaning to do a proper write-up on some of the innovations I think are genuine improvements over the status quo; I should probably do one at a time and start today instead of waiting until release like I planned. But in my GUI, dynamism is only needed in maybe one or two core places. I'm not sure it makes any use of the fact that JS has string keys (the equivalent of objc_msgSend etc.), and it could probably be written in boring-ish C++ just fine, or maybe even boring-ish Go, although operator overloading would clean up one or two APIs really nicely.
It does look interesting, though it lacks most modern DX, which means its adoption is going to be limited, I imagine.
I wonder why this type of style hasn't caught on with React and friends? It would be really nice to be able to have an AppKit-quality UI programmable in React or Svelte.
I know I know mobile blah blah. But lots of web apps are complicated enough to only be useful on a large screen, like Figma.
I remember that the company behind it was called "280 North". They seem to have open-sourced it before they shut down.
There is only the OS ABI, and the ABIs of C compilers tend to overlap with the OS ABI in the cases where the OS was written in C.
This is easily visible outside the UNIX ecosystem.
They were often used together though.
And to some degree I echo the sentiment as well. While I was never in search of the divine programming language, I too felt that as Objective-C was being sunset and Swift was in ascendancy, perhaps it was my time to also step out of my career — sunset myself, so to speak.
Swift was something of a hard sell for me. It seem(s/ed) to borrow everything from every popular language, allowing two different code bases to look as though they were written in two different languages (depending on the preferences/style of the two coders).
To be sure, a lot of the young engineers seem to have been drawn into the Apple ecosystem not because, like me, they grew up worshiping the user-interface brilliance of the Mac but because they are fans of the Swift language.
And like the author of the piece, I say, "Knock yourselves out, kids. Sayonara."
That's ok, C++23 is going to add another group of features that will be half-adopted at best in legacy codebases that will totally fix everything this time for real.
[0] in the same codebase via the unholy chimera that is Objective-C++
And ARC (automatic reference counting) blew my mind when it was first released. Rust was a blip on the radar, so the idea of automatic memory management, handled at compile time, without GC, was amazing.
The only worthwhile runtime available (one that doesn't depend on MinGW or some such) is libobjc2 from GNUstep. I decided not to use the full GNUstep Foundation, since it is clearly bloated and reflects a very Java-esque sensibility of the 90s, not to mention that it depends on third-party libraries like libcurl and whatnot. However, it turns out that the root class NSObject is defined in Foundation itself, and you need a root class to get anywhere with the language.
Fine, I decided, I'll write my own lightweight root class. That turned out to be so much more than I bargained for. In the end, I have one that supports manual reference counting and ARC (GC would've meant dealing with Boehm, one problem at a time). https://gist.github.com/forksnd/264d80858ee98e6d44e89e8972c0...
However, it is clearly not done. I can't invoke an arbitrary method on an object through the Smalltalk syntax (I get a compilation error), and trying to do it through objc_msgSend fails silently. I was just trying to get method tracing working, but it seems to require pthread (so Linux only, then?).
It's insane how difficult it is to get a minimal working workspace in this language. No, I don't want a huge framework; all I want is inline Smalltalk in C. No wonder this language never found any footing outside of Apple's walled garden.
I certainly managed to use it for some test programs a number of years ago.
EDIT: Apparently, MSYS2's Clang has an option "-gcodeview" that can generate PDBs. I'll try it tomorrow and see how it goes.
This is why it has been an ordeal. I came to a similar impasse; it went away when I changed my mind. It's a little bloated, I'll give you that, but it's not that bad. Certainly better than bootstrapping 10+ years' worth of language features.
That's the thing, I think ignoring those library features and rethinking the role of message passing OOP in plain C can actually lead to a much better language. But I do need a root class.
At the time, my supervisor wanted to save the research done on 3D visualisation techniques with a particle engine, developed on a NeXT Cube, and the Apple/NeXT acquisition was yet to happen.
The department was ramping down its use of NeXTSTEP, as it was clear the OpenStep efforts were also not going to save the company.
Thus several students got to rewrite applications from Objective-C into something else.
Had they known what would happen with NeXT's reverse acquisition of Apple, and OS X, most likely those thesis proposals would never have happened in the first place.
This is deeply uninformed, with bald prejudice added.
(Seriously, if you feel the temptation to do that, don't waste your time. You won't get the nice quick answer you want. A better use of your time would be trying to translate 間 into English.)
The work has bogged down whatever interest I have in programming, and the only sane solution is to somehow magically remove all financial burdens, go into a cabin in a mountain, and program my own projects and read some science, preferably with a dog and a fire.
https://corecursive.com/remote-developer/
To give just a taste, here's a forum post that quotes a few highlights from that CoRecursive episode:
https://retrocomputingforum.com/t/remote-developer-1970s-app...
The problem Steve had with that nice Objective-C system was that the fools who ran Corporate America were from the generation still "shell shocked" from all of the vendor lock-in that went on during the wars between DEC, IBM, HP, Sperry and Burroughs (later Unisys), and smaller players like Nixdorf, none of which were software (or hardware) compatible with each other, meaning their customers were held hostage, often with incredibly expensive long-term contracts on less-than-state-of-the-art machines.

In the new desktop era (which had just begun), for a short while at least, they wanted "cross-platform apps" that ran on universal hardware (think PC and Mac "clones"), and that meant using object-oriented frameworks. The only OOP systems mature enough to have platform-specific GUI libraries for DOS, Windows, and Mac were some interesting Smalltalk packages and some exotic C-macro-based systems like Neuron Data's Open Interface Toolkit, and eventually Microsoft MFC (which was available for Classic Mac and all the versions of Microsoft Windows). Of course, as Windows took over the game, the need for cross-platform apps ended, just as Visual Studio, Microsoft FoxPro/Access, and Visual Basic were cleaning up and really locking everyone in for the decade. There was no more need for object-oriented systems like NeXTSTEP or Smalltalk.
But then the WWW became a thing, and NeXT made a bold move with WebObjects, which allowed their Objective-C visual tools (Project Builder) to output HTML in real time. About the same time, Sun Microsystems launched Java with a really terrible UX library, but with the promise of the portable Java interpreter (similar to the promise of UCSD p-code back in Steve's earlier Apple days), which meant Java could "run (ugly-looking but portable) code" in web browsers on any hardware. Oh happy days: a way to get back into Corporate America without the word Microsoft on your business cards. While the FIRST end of Objective-C is described in the article (before NeXT picked it up), Steve saw the potential to replace that ugly Java UX library with their WebObjects masterpiece and pivoted to rewrite WebObjects to output Java, and that was the SECOND end of Objective-C.
Somehow my Dad's old colleague, who had worked with Rear Admiral Grace Hopper on behalf of CalPERS, was able to bail the shareholders of NeXT and Apple out. But NeXT was all about WebObjects by then, and you couldn't run Macs with WebObjects (that would be a "thin client", a much-maligned concept that offered no value-add for Apple), so Avie Tevanian's team had to hold their noses and fuse the stinking classic Mac operating system into NeXTSTEP, breathing new life into Objective-C.
As that post-NeXT Mac operating system was "forked" to make battery-powered phones, the idea of allowing developers to write apps in Objective-C became a liability. Suddenly there was a class of "fart apps" that were draining batteries, closing unexpectedly, heating up devices in people's hands, etc., and to a casual user with limited computer experience, that looked like an iPhone problem. Apple had the incredible app-review process going on, but that's not the correct prescription for curing stupid programming. They needed a solution like Java (being used by their competitor and embroiled in litigation involving Sun/Oracle/Microsoft/Google) or even their old UCSD p-code interpreter that ran Pascal, a language that had long been obsoleted by Ada, which everyone with half a brain hates. So...
> The end came for Objective-C in June of 2014, when Apple announced the debut of Swift
I wouldn't say that the THIRD death of Objective-C happened in 2014, nor is it upon us in 2025. Aside from all the existing code and the performance reasons to use Objective-C, there is also the fact that the popular AI coder models were all trained on GitHub as of 2023, and the Swift code out there spans six different versions of the language, mostly written by people who were just learning it (so it doesn't leverage Swift's value proposition much). Cross-platform Linux/Mac code, like Moonlight for iOS, is also written in Objective-C (which had superior C++ integration compared to Swift until very recently). It's possible to remove Objective-C from Apple's product line, but it can't be a priority given the industry's move to agentic apps.