background - I am a computer science major with 30+ years of experience. I did do a mandatory "implement your own lisp" class many eons ago. It just never really 'clicked' for me. I do, by accident, assimilation, and laziness, employ FP-style designs in my software. And I guess FP techniques gradually rub off on me from e.g. javascript: lambdas, closures, and map-filter-reduce. In particular, lambdas are useful to me. But I am one of the guys who continue to read the "let me tell you what monads really are" posts, and every time I fall off the bicycle. So, well, I appreciated this "X for 5 year olds" :-)
Haskell is functional in that it demands its functions be functions, not subroutines. A function has inputs mapped to outputs and no side-effects. Functions can be composed and composition always works. Haskell uses monads to represent the regrettable fact that having an impact on the outside world is, in a very real sense, a side-effect, so it marks all side-effecting functions with an indelible stain. Haskell requires a different mode of thought from Python, or even from C++, and it's definitely not another Lisp.
The "enlightenment" of Lisp is that you can use functions everywhere: write macros that look like functions and modify behavior, and build your code as a language.
Things like monads belong more to the evolution of functional languages, and I also fall off the bike there. It's as difficult as you want it to be, and I find scheme and lisp to be easier high-level languages than javascript or python; they make more sense to me.
The foreword and preface to SICP are good reading.
https://mitp-content-server.mit.edu/books/content/sectbyfn/b...
The Dan Friedman books are pretty good in general: "The Little Schemer" and its sequel "The Seasoned Schemer," which are both more "recursion" books. He also has another book, "Scheme and the Art of Programming," which I think is a great comp sci book that's not too difficult and doesn't seem too well known.
How to Design Programs is supposed to be a pretty good comp sci intro:
"Liberal arts," nice :)
I love Lisp. The last few paragraphs are a pretty good description. It's nice to have a very flexible set of tools, instead of being forced to conform to object-oriented design or whatever paradigm. IMO the only legitimate reason for sticking steadfast to a design paradigm is performance, but of course that can only really justify array programming/imperative programming. But at the point where you want some flexible abstractions, it's nice to have the power to do introspection, delayed evaluation, and so on. Disclaimer: my background is physics/math, so function abstractions are much more intuitive to me than objects, or whatever other structures are taught to CS students.
https://news.ycombinator.com/item?id=28851992
https://news.ycombinator.com/item?id=44359454
No comments on any of them.
It sounded of interest to me, but I read it and closed the tab within a page or so as it wandered off into tech arcana. Shame. There may be an interesting idea in here but it's phrased in terms I think few will be able to follow and understand.
I did not finish it but I saw no mention of the lambda calculus or of currying, both of which -- from my very meagre understanding -- seem directly relevant to what I understood to be the core point, which seems to be about anonymous functions.
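Since the comment brings up currying without defining it, here is a minimal Python sketch of the idea (the function names are mine, purely illustrative): currying turns a function of several arguments into a chain of single-argument functions, each returning the next.

```python
from functools import partial

# Hand-rolled currying: curry_add takes one argument and returns a
# one-argument function that remembers it via a closure.
def curry_add(x):
    return lambda y: x + y

add5 = curry_add(5)
print(add5(10))  # 15

# functools.partial gives a similar effect without nesting lambdas:
add7 = partial(lambda x, y: x + y, 7)
print(add7(3))  # 10
```

This is also why anonymous functions and currying travel together: the inner `lambda y: x + y` has no natural name of its own.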
The extra point might be that more languages should facilitate defining your own control abstractions just as they support defining your own data abstractions. Functions are one way of making data abstractions, but languages often provide multiple ways. Closures are one way of doing a type of control abstraction (involving such things as delayed or multiple evaluation), but there are other ways too. For some reason we see value and a need for defining our own data abstractions, but not so much for control abstractions, even though (according to the book) once they were often co-designed, like Fortran's arrays and DO loop. And for some reason even in the few languages that do support making your own control abstractions, like Lisp, you'll still find users who disapprove of doing so, claiming all you need are the standard existing methods like looping, map/reduce style functions, and some non-local exits.
In both Lisp and C++, taking some isolated snippet from a codebase, you can't quite be sure what it's doing without reading the rest of the program, because of what might be called the "excessive" power of the languages.
In C, it is much more likely that you can look at an isolated snippet of code from some codebase and be reasonably sure about what it is doing, and be able to extract this snippet more or less as-is, and re-use it in some other, unrelated code base. At least, this has been my experience, ymmv.
[1] https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh
Graham's On Lisp is a really interesting book
https://paulgraham.com/onlisptext.html
which is allegedly about programming with macros, but I'd say 80% of the time he implements something with closures and then makes a macro-based implementation that performs better. That 80% can be done in Python, and the other 20% you wouldn't want to do in Python because Python already has those features... And if you wanted to implement meta-objects in Python you would do it Pythonically.
Graham unfortunately doesn't work any examples that involve complex transformations on the expression trees because these are hard and if you want to work that hard you're better off looking at the Dragon book.
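To make the "closure-based 80%" concrete, here is one On Lisp-style utility, memoization, as a plain Python closure (a sketch; On Lisp develops a Lisp version of this, and Python already ships it as `functools.lru_cache` — the "Python already has those features" point):

```python
# Memoization via a closure: wrapped captures both f and its private cache.
def memoize(f):
    cache = {}
    def wrapped(*args):
        if args not in cache:
            cache[args] = f(*args)
        return cache[args]
    return wrapped

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, fast because intermediate results are cached
```

The macro version in On Lisp exists mainly to move this kind of wrapping work to compile time; the runtime behavior is the same.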
You can work almost all the examples in Norvig's Common Lisp book
https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...
in Python and today Norvig would advocate that you do.
my_if (points <= 100, printf ("%d", points), error ("Invalid point total"));
Where the various parameters are lazily evaluated.
Or like: frobnicate (frazzle: foo, frozzle: bar, frizzle: baz);
Where frazzle, frozzle, and frizzle are position-independent keyword arguments. Allowing those in C would require a modicum of effort, while other languages make these kinds of syntax extensions fairly easy.
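Both ideas are easy to sketch in Python (`my_if`, `frobnicate`, etc. are the hypothetical names from the comment above, not real library functions): wrapping each branch in a zero-argument lambda delays evaluation so only the chosen branch ever runs, and keyword arguments are position-independent out of the box.

```python
# Lazy branches: only the thunk that is selected gets called.
def my_if(cond, then_thunk, else_thunk):
    return then_thunk() if cond else else_thunk()

def fail(msg):
    raise ValueError(msg)

points = 42
my_if(points <= 100,
      lambda: print("%d" % points),
      lambda: fail("Invalid point total"))   # never evaluated here

# Position-independent keyword arguments come for free:
def frobnicate(frazzle=None, frozzle=None, frizzle=None):
    return (frazzle, frozzle, frizzle)

print(frobnicate(frizzle="baz", frazzle="foo", frozzle="bar"))
# ('foo', 'bar', 'baz')
```

The lambdas are the price of not having macros: Lisp's `my_if` could accept the raw expressions and decide for itself which to evaluate.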
Variable<Integer> x = newVariable();
Expression<Integer> expr = add(x, literal(5));
x.set(15);
System.out.println(eval(expr)); // prints "20"
and it is not that hard to either serialize these to code or run them in a tree-walking interpreter, where quote() and eval() imply an extended language in which you can write functions that work on Expression<Expression<X>>. Type erasure causes some problems in Java that make you sometimes write a type you shouldn't have to, and you do have to unerase types in method names, which is a little ugly, but it works. I did some experiments towards this to convince myself it would work
https://github.com/paulhoule/ferocity/blob/main/ferocity0/sr...
had I really kept at it I would have bootstrapped by developing a ferocity0 which was sufficient to write a code generator that could generate stubs for the Java stdlib + a persistent collections library and then write a ferocity1 in ferocity0, and if necessary ferocity(N+1) in ferocityN until it supported "all" of Java, though "all" might have omitted some features like "var" that are both sugar and use type inference that ferocity would struggle with -- if you need sugar in this system you implement it with metaprogramming.
The idea is that certain projects would benefit from balls-to-the-wall metaprogramming, and the code compression you get would compensate for the code getting puffed up. My guess is a lot of people would see it as an unholy mating of the worst of Java and Common Lisp. However, I'm certain it would be good for writing code generators.
I like those that read something like a punch line, that come across as something different than just a summary of the article. But these maybe work best for literature, prose, movies, etc.
R takes it up a notch though by making all syntactic constructs boil down to a function call. Function definitions are themselves calls, for example, and so are assignments and even curly braces.
If the purpose is to try and convince non-Lispers to use Lisp, a more convincing argument (for me at least) would be to demonstrate modern commercial software written faster and more bug free in Lisp.
For example: "Here is a modern biz web application written in Lisp" showing step by step how Lisp makes the development process faster and less buggy than implementing the same application using (say) Typescript/C++.
Notes: I use custom code generators to generate more than 90% of the Typescript/C++ code needed to implement biz applications, leaving only the core biz logic. So macros for code generation don't really give me anything I don't have already. And using macros to define my own DSLs within the language would just make the code unreadable for other developers. So it is not a feature I actually want.
#include <stdbool.h>

typedef int fn_t(void);

int iff(bool cond, fn_t *a, fn_t *b) {
    if (cond)
        return a();
    else
        return b();
}
Now just write the implementation in terms of a() and b(). I don't get it. C doesn't have convenient syntax, but this is a compiled, not an interpreted, language. This argument didn't make sense to me. The fact that you can't create functions at runtime is only due to needing a compiler. If you link your program against a compiler, you can absolutely turn strings into code at runtime and then pass the result around to other functions.
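Roughly what "link your program against a compiler" buys you, sketched in a language where the compiler is already built in: `compile()` turns a string into code at runtime, and the resulting function is an ordinary value you can pass to other functions.

```python
# Build a function from a string at runtime, then use it like any other value.
src = "def made_at_runtime(x):\n    return x * 2\n"

namespace = {}
exec(compile(src, "<generated>", "exec"), namespace)

double = namespace["made_at_runtime"]
print(double(21))  # 42
print(list(map(double, [1, 2, 3])))  # [2, 4, 6]
```

In C the analogue is heavier (emit source, invoke the compiler, dlopen the result), but the principle is the same.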