Here's JS string interpolation:
`OK ${10+20} == 30` // result: "OK 30 == 30" — a JS string
Here's a quasiquote-unquote:
`(OK ,(+ 10 20) == 30) ; result: (OK 30 == 30) — a list of 4 atoms
In my toy Lisp I used $ instead of comma, which I found a tad "more readable" coming from today's JS, Perl, Bash world. (And makes quasiquote... "the money quote"? =)
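The key difference is that the Lisp version produces structure, not text. A loose JavaScript analogy (my own sketch, using an array to stand in for a Lisp list):

```javascript
// String interpolation yields a string; quasiquote yields a data structure.
// Here a JS array stands in for a Lisp list, plain values for atoms.
const interpolated = `OK ${10 + 20} == 30`;     // "OK 30 == 30", one string
const quasiquoted = ["OK", 10 + 20, "==", 30];  // ["OK", 30, "==", 30], a list of 4 atoms
```

The second form is something downstream code can walk, rewrite, and evaluate, which is what makes macros possible.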
handle = (exists(file) and open(file))
That uses the short-circuiting behaviour of `and` to only open a file if it exists. Not great practice to write it that way IMO, but whatever. You can't implement it as a function: handle = and(exists(file), open(file))
def and(exists, file):
... No viable implementation in Python afaik ...
That doesn't work because function arguments don't/can't short-circuit: both arguments are evaluated before our function is called. (We could make it work in this case by passing the file name in, but the point is that the function signature has to change.) Macros, however, can implement something with short-circuiting behaviour. They are a mechanism for implementing new syntax: more powerful than functions, but as a trade-off more error-prone. Quasiquoting, and everything that falls out of it, are implementation details that make sense once you have that context.
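For contrast, the closest a plain function can get is to take thunks: the caller must delay evaluation explicitly, which is exactly the signature change mentioned above. A minimal JavaScript sketch (the `exists`/`open` stand-ins are hypothetical):

```javascript
// A function can't short-circuit its arguments, but it can short-circuit
// thunks: zero-argument functions that delay evaluation until called.
function and(...thunks) {
  let result = true;
  for (const thunk of thunks) {
    result = thunk();
    if (!result) return result; // stop early: later thunks never run
  }
  return result;
}

// Hypothetical stand-ins for exists/open:
const exists = (file) => file === "present.txt";
const open = (file) => `handle:${file}`;

const handle = and(() => exists("present.txt"), () => open("present.txt"));
// handle === "handle:present.txt"
const none = and(() => exists("missing.txt"), () => open("missing.txt"));
// none === false, and open() was never called
```

A macro gets the same effect without forcing every caller to wrap arguments in `() => ...`.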
The challenge with the syntax is that there is no syntax. Work that we're used to offloading to syntax is instead carried by your brain.
Kinda like how the challenge to eating salad is all the chewing. It's not just that the brain needs to parse a lot more language to express a simple concept; it's also that the fingers need to type all that out.
I wrote a book whose third chapter is a much "gentler" introduction to macros. I don't know if it's actually intelligible (see: the monad tutorial fallacy) but it presents them the way that I was first able to understand them -- explicitly starting without quasiquote and working up to it. Easier for me than getting lost in the notation. https://janet.guide/macros-and-metaprogramming/
If anyone actually wants to get their hands dirty to learn about Lisp macros, I recommend picking a Lisp implementation like SBCL, GNU Guile, Emacs, Clojure, or Hylang depending on what kind of environment you're comfortable with. The key about each of the Lisp implementations I mentioned here is that they all support "Common Lisp style macros", which are the bare bones most obvious way to do macros in Lisp.
Then I recommend using your choice of Lisp to implement a language feature you use in another language. It doesn't matter if that language feature already exists in your choice of Lisp, you can still implement it yourself. For example, you can choose to implement C-style for loops or while loops, asynchronous coroutines like Go's, pattern matching, lambdas, whatever. I actually implemented async/await in IronScheme and pushed it upstream[0].
If you want to read more about Lisp macros, I have really enjoyed the book Let over Lambda. I have also heard a lot about On Lisp by pg, but I haven't read that myself yet. Also if you really want to dive off the deep end into the beauty of programming, I recommend SICP.
(defun compute-hash (key hash f)
  "Get the value for KEY in HASH or compute it with F, enter into HASH and return."
  (multiple-value-bind (val win)
      (gethash key hash)
    (if win
        val
        (setf (gethash key hash) (funcall f key)))))

(defun memoized (f)
  (let ((cache (make-hash-table)))
    (flet ((memo (x g) (compute-hash x cache g)))
      (lambda (&rest args) (apply f #'memo args)))))

(defun fib (n)
  (if (<= n 1)
      n
      (+ (fib (1- n)) (fib (- n 2)))))

;; MEMO is a function that takes a key and a computing function.
;; If the key has been memoized, it returns the cached value; otherwise it
;; calls the computing function with the key and caches the result.
(let ((example (memoized (lambda (memo x)
                           (format t "X: ~a~%" x)
                           (let ((result (funcall memo x #'fib)))
                             (format t "~a~%" (* 2 result)))))))
  (trace fib)
  (funcall example 5)
  (funcall example 5)
  (funcall example 5)
  (untrace fib))

> I want it to be the case that this function only actually calls do-something-very-expensive once per unique value of x, even across separate invocations of dumb-example.
My code memoizes results of function FIB _inside_ the lambda assigned to EXAMPLE, even across separate invocations of EXAMPLE.
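A rough JavaScript translation of the same pattern, for readers more at home there (names are mine): the wrapper passes a MEMO function into F, and the cache lives in the wrapper's closure, so it persists across invocations.

```javascript
// memoized() wraps f, giving it a memo(key, g) function whose cache lives
// in the closure -- results persist across separate calls to the wrapper.
function memoized(f) {
  const cache = new Map();
  const memo = (key, g) => {
    if (!cache.has(key)) cache.set(key, g(key));
    return cache.get(key);
  };
  return (...args) => f(memo, ...args);
}

function fib(n) {
  return n <= 1 ? n : fib(n - 1) + fib(n - 2);
}

let fibCalls = 0;
const example = memoized((memo, x) => {
  return 2 * memo(x, (k) => { fibCalls++; return fib(k); });
});

example(5); // computes fib(5) = 5, returns 10
example(5); // cache hit: the computing function is not called again
// fibCalls === 1
```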
If you're willing to do the global transformation yourself instead of enlisting the computer to do it for you, you don't even need macros at all; you can do that with Henry's example by putting

const resultMap = new Map();

above the function.

> I think that you basically can’t, in JavaScript. Or, more accurately: I can’t think of a way to do it.[1]
Oh, this is a case for WeakMaps, right?

const MemoCache = new WeakMap();

function memoize(f, x) {
  const cache = MemoCache.get(f) || new Map();
  MemoCache.set(f, cache);
  if (!cache.has(x)) {
    cache.set(x, f(x));
  }
  return cache.get(x);
}
Oh wait:

> 1. You could create a global memoization map keyed on the function that you’re calling, but this would actually have different semantics than I’m imagining. If I said `memoize(f, 1) + memoize(f, 1)` I would expect those to each invoke `f`, because instances of `memoize` shouldn’t share results. Why not? Because this is a fake example, and a global memoization is a different (easier!) thing than per-call-site memoization.
Like I get what you're saying but you could just cache the call site too?
const MemoCache2 = new WeakMap();

function memoize2(f, x) {
  const callsite = new Error().stack;
  const macro_cache = MemoCache2.get(f) || {};
  const micro_cache = macro_cache[callsite] || new Map();
  macro_cache[callsite] = micro_cache;
  MemoCache2.set(f, macro_cache);
  if (!micro_cache.has(x)) {
    micro_cache.set(x, f(x));
  }
  return micro_cache.get(x);
}
I admit that this is something of a trick, but it's a trick specifically to work around the fact that this person doesn't want to write `const my_f1 = memoize(f), my_f2 = memoize(f)` somewhere on the screen. Precisely because people who write JavaScript are not accustomed to macros, they are not expecting `memoize(f, 1) + memoize(f, 1)` to be a proper memoization expression, and they aren't expecting weird stuff with WeakMaps and inspecting stack traces to identify call sites and all that.

I mean, you know, this isn't really a thing that you would ever want to do, but I am glad that life found a way.
how to fibonacci             space complexity    time complexity
-------------------------    ----------------    ---------------
insane recursion             exponential         exponential
memoized insane recursion    linear              linear
The space complexity of "insane recursion" without memoization is the maximum stack depth; the worst-case stack is:

fib(n)
  fib(n-1)
    fib(n-2)
      ...
        fib(1)

which is n stack frames (and the stack frames are of constant size); the space complexity of the whole thing is thus linear in n. While the call tree is itself exponential in size, the memory required is only the depth of that tree, since we can't call fib(n-1) and fib(n-2) simultaneously[2].

(The time complexity is, of course, exponential, and I agree with the "insane" moniker. I also like your comment elsewhere in this thread about people hyperfocusing on the example and missing the larger point of the article … and I'm so sorry, but I've been sniped by this.)
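A quick way to convince yourself of the linear-space claim: instrument the naive recursion to track its depth (my own sketch).

```javascript
// Instrumented naive fib: track the current and maximum recursion depth.
let depth = 0;
let maxDepth = 0;

function fib(n) {
  depth++;
  maxDepth = Math.max(maxDepth, depth);
  const result = n <= 1 ? n : fib(n - 1) + fib(n - 2);
  depth--;
  return result;
}

const result = fib(20); // result === 6765
// maxDepth === 20: the stack never holds more than n frames at once,
// even though the call tree has exponentially many nodes.
```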
[1]: https://ianthehenry.com/posts/fibonacci/
[2]: the little demons in my mind are now trying to scheme up an insaner recursion that attempts this. Threads maybe?
It's absurd to actually compute it that way, but it's beautiful to express it that way.
From the blog post:
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

Easy to read, easy to comprehend. You just need a smart compiler to do it efficiently.

"Accidentally Quadratic" was a fun tumblr dedicated to more real-world instances of this, and it's sad they're no longer posting, though I still use that term to name real-world sightings of O(lol) time. (O(lol) space is just called "Java".)
¹and since you say "to actually compute it that way", I think you get this, but I want to point it out here
8 / 41 = 0.1951219
(8 + 41 = 49) / 8 = 6.125
(49 + 8 = 57) / 49 = 1.16326531
(57 + 49 = 106) / 57 = 1.85964912
(106 + 57 = 163) / 106 = 1.53773585
That second line is screwed up, which also screws up the subsequent lines. It should look like (41 + 8 = 49) / 41 = 1.19512195
which then means the line after that should be (49 + 41 = 90) / 49 = 1.83673469
and so on.

local fib do
  local impl
  impl = function(n, n1, n2)
    if n == 0 then return n1 end
    return impl(n - 1, n1 + n2, n1)
  end
  fib = function(n) return impl(n - 1, 1, 0) end
end

for i = 1, 10 do print(fib(i)) end

I've wanted the ability to reference the callsite in functions, and lobbied the V8 team for something like arguments.callsite, but was (probably rightly) politely told no.
But if you're willing to abuse tagged template literal syntax, they're really just function calls, so you can do something like:
const dumbExample = (x) => {
  while (someComplicatedStuffHappens()) {
    pretendLikeThisFunctionIsBig();
  }
  const result = memoize(doSomethingVeryExpensive, x)``;
  doMoreInterestingWork();
}

memoize() must return a template tag function, which will be invoked with a TemplateStringsArray (because of the ``) that can act like a callsite identifier, which can be a key into a WeakMap of memoization dictionaries.

It's mostly a curiosity, because who wants that syntax, but it's interesting that JavaScript does have this special power hidden behind one feature.
[1]: https://gist.github.com/aziis98/c8de74a29d5b1721983501fe28eb...
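A runnable sketch of the template-tag idea above (names are mine; it relies on the ES spec guarantee that each evaluation of a given tagged-literal site passes the same frozen TemplateStringsArray object):

```javascript
// memoize() returns a template tag. The identity of the strings array it
// receives identifies the call site, keying a WeakMap of per-site caches.
const siteCaches = new WeakMap();

function memoize(f, x) {
  return (strings) => {
    let cache = siteCaches.get(strings);
    if (!cache) {
      cache = new Map();
      siteCaches.set(strings, cache);
    }
    if (!cache.has(x)) cache.set(x, f(x));
    return cache.get(x);
  };
}

let calls = 0;
const expensive = (x) => { calls++; return x * 2; };

function dumbExample(x) {
  return memoize(expensive, x)``; // the `` supplies the call-site key
}

dumbExample(5); // computes: calls === 1
dumbExample(5); // same site, same x: cache hit, calls still 1
```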
I don't know precisely how, but this feels like it would be very useful for something. I'll continue to experiment with it in the future.
> This allows the tag to cache the result based on the identity of its first argument. To further ensure the array value's stability, the first argument and its raw property are both frozen, so you can't mutate them in any way.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
Is it really absurd? If the compiler can turn it into iteration, then it's a big boy compiler. If not, then meh?
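For reference, this is the iteration a "big boy compiler" would aim for (a hand-written sketch): linear time, constant space, two accumulators.

```javascript
// What the naive recursion can compile down to: iterate with two
// accumulators holding consecutive Fibonacci numbers.
function fibIter(n) {
  let a = 0; // fib(0)
  let b = 1; // fib(1)
  for (let i = 0; i < n; i++) {
    [a, b] = [b, a + b];
  }
  return a;
}

fibIter(10); // → 55
```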