I will admit my interest in learning APL has been piqued. Though I do find it odd that they never once mention the most popular array programming language of all time, MATLAB.
Sure, you can learn to read and write what superficially looks like line noise, but the market has spoken: very few people want to.
Look at the success of Tensorflow and PyTorch. They’re also tensor / array programming systems. They’re wildly popular, with at least two orders of magnitude more users each than all array languages put together.
The difference is that they’re built on top of Python, which is famously user-friendly in its syntax.
More importantly: they were parallel and GPU-accelerated from the beginning.
What’s the point of an array-based language if it’s slower and harder to write than the equivalent in mainstream languages!?
Their current niche of “quants analysing time-series data” is tiny and shrinking all the time.
There was even a chap here advocating the speed and terseness of his preferred array-based language.
Meanwhile the equivalent in Rust was something like a thousand times faster and not much longer!
Array-based languages have been infected by a particular style largely unique to certain types of mathematicians: brevity over clarity, obscure syntax over English, idioms instead of identifiers, etc…
They’ll never be popular while they remain purposefully niche, appealing only to the type of developer that uses single-character file names: https://news.ycombinator.com/item?id=31363844
How many times would you throw away a 1000 line program and start over with completely different data structures and algorithms, just to try a different approach? What about a 10 line program? Or a 1-line program?
When it's basically free to rewrite your entire program, you tend to explore more. That's one benefit terseness affords you.
Indeed it does! Not to try to be confrontational, but have you written any substantial programs in an array language like APL? I'm sure that any APL programmer will be the first to tell you that writing APL would be unbearable if those "unreadable" symbols were replaced with names! Why? Because in APL, each symbol is a unit of meaning, and there's simply no reason for each unit of meaning to be more than a single character. Why should I type `add folded divided_by count` when I can just write `+/÷≢`?
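For the curious: that train `+/÷≢` is the idiomatic APL arithmetic mean — sum-reduction (`+/`) divided by (`÷`) the tally (`≢`). A rough NumPy translation of the same units of meaning (the spelled-out name is mine, not APL's):

```python
import numpy as np

# The APL train  +/÷≢  reads: sum-reduce, divided by, count.
# Spelled out name by name in NumPy:
def mean(v):
    total = np.add.reduce(v)  # +/  : fold addition over the array
    tally = len(v)            # ≢   : number of items
    return total / tally      # ÷   : divide

print(mean(np.array([1.0, 2.0, 3.0, 4.0])))  # 2.5
```

The point stands either way: each APL glyph carries exactly one unit of meaning, so the three-glyph train and the three-line function say the same thing.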
> Sure, you can learn to read and write what superficially looks like line noise, but the market has spoken: very few people want to.
True! And it will surely always be true! But no one would want to write APL without the symbols. ("But isn't Tensorflow just like APL without the symbols?" No. Tensorflow is based on the array paradigm, but it is very, very different from an array __language__ like APL.)
> obscure syntax over English
This is effectively like berating the Chinese for inventing a writing system that looks nothing like the Latin script. Is it totally different? Yes. Does that make it inherently bad? No. (Can it still be bad? Yes! But I don't think APL and friends are as bad as people might think.)
Anyway, that's my two cents: the obscure symbols of array languages are a tool, not a problem. They'll always limit the userbase, there's no doubt about that, but I couldn't imagine a world without them.
Except for Excel. Which is probably three orders of magnitude more popular, if that is how you measure things.
APL is a lot cooler as a language, but it's missing a lot of built-ins that MATLAB has for numerical computing. The APL community would just tell you to write a few lines of code to implement what you need rather than call out to a massive black-box function. It's certainly a better approach if you have the time/ability, but I often don't. More recently, Dyalog APL and kdb+ (the two big commercial APL or APL-derived languages) have added support for calling Python libraries if you need that.
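To make the "write a few lines instead of calling a black box" trade-off concrete: MATLAB ships a built-in trailing moving average (`movmean`), while the array-language answer is to compose it yourself. A sketch of that do-it-yourself style in NumPy (the function name and cumulative-sum trick are my own choices, not a standard API):

```python
import numpy as np

# A hand-rolled moving average in the array style: window sums fall out
# of differences of a cumulative sum, no black-box library call needed.
def movmean(x, w):
    c = np.concatenate(([0.0], np.cumsum(x)))
    return (c[w:] - c[:-w]) / w  # sum of each length-w window, then average

print(movmean(np.array([1.0, 2.0, 3.0, 4.0, 5.0]), 2))  # [1.5 2.5 3.5 4.5]
```

Two lines, but you have to know (or derive) the trick — which is exactly the time/ability cost the comment is describing.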
The array language community seems friendly and exceedingly competent. It also appears to have a strong "get it done" attitude which prioritizes engineering over freedom. The community is tightly entwined with proprietary software.
I can't bring myself to invest the time to meaningfully learn APL because it's hard to see it as a real investment—the community doesn't own its contributions. It looks like trading dollars for arcade tokens.
As much as I would like people to look at my implementation, if you want to play around with something more complete that also uses symbols the way APL does, then BQN is the one to look at.
You know, there's a reason why there's so many "one-man hobby project" implementations of, say, APL. It's actually not that hard! If you're willing to spend a week (yes, a week), you can have your own barebones implementation of APL. And from there, it's really sort of your playground, which is neat. Thought of a cool idea? Implement it! Obviously, most people won't want to do this, investing time (even if it's not that much) in something they have no prior experience with. But it's an easier option than you might expect.
But that holds for other paradigms too:
Host: Conor Hoekstra
Panel: Marshall Lochbaum, Adám Brudzewsky, Stephen Taylor and Bob Therriault.
The design of Go feels almost as if he doesn't even know about those ML languages, and as if he doesn't like FP. Of course he probably does know about ML-style languages. But I would be very interested to hear his take on them and his opinion of FP, because I couldn't find anything on Google. Anybody know of any links to writings/videos where he elucidates his viewpoints?
I think Pike definitely had not, at that point, explored the way types work in ML-style languages.
FP wasn't really referenced at all in that quote. FP definitely has its own vocabulary, but nowhere near as extensive as OOP's.
I think it'd be a bit of a shame if everyone were pushed to be a polyglot with a finger in every currently popular paradigm. Connections between different approaches to programming are very useful, but the sort of effort made in Go, to refine and simplify one imperative/OO approach, also helps push our understanding of programming into new territory.
Seems like there's an obvious solution that was avoided or not known about here. It's not even about being clever. It's about being practical.
I don't think you can attribute this to ignorance, because it's very clear that Pike is not ignorant of programming language design - even if you disagree with his decisions, he has had tremendous success at implementing his vision. To me it speaks more of disinterest - FP doesn't seem to really register on his radar as a useful mine of PL ideas. That's fine, although I agree with the previous poster that it's ironic to be such a proponent of APL, and yet have such a blind spot for another very fascinating area of his field.
I remember taking a course on programming paradigms which introduced different languages, and we wrote a small project in a functional language.
Algebra-based designs can be formed around many data structures, and many languages, like Haskell, generalize this concept. With Haskell you can create your own algebraic DSL around arrays and anything else you can think of. It seems he's enamored with the specific array instance of algebra-based designs and unaware that it's only one specific case of a general concept.
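A minimal sketch of that general idea — in Python rather than Haskell, for brevity — is that any container can be given an elementwise algebra via operator overloading, making arrays just one instance of an algebraic design (the `Arr` class here is purely illustrative):

```python
# Illustrative only: a tiny "algebraic DSL" over a list-backed array type.
# The same pattern works for trees, streams, or any structure you define.
class Arr:
    def __init__(self, xs):
        self.xs = list(xs)

    def __add__(self, other):
        # Elementwise addition, pairing items positionally
        return Arr(a + b for a, b in zip(self.xs, other.xs))

    def __mul__(self, other):
        # Elementwise multiplication
        return Arr(a * b for a, b in zip(self.xs, other.xs))

    def __repr__(self):
        return f"Arr({self.xs})"

a, b = Arr([1, 2, 3]), Arr([10, 20, 30])
print(a + b)        # Arr([11, 22, 33])
print(a * (a + b))  # Arr([11, 44, 99])
```

In Haskell the same design would typically be expressed through type classes (e.g. a `Num` instance for an array type), which is what makes the array case look like just one instantiation of a more general algebraic framework.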