As somebody who read a couple of the author's books, and also somebody who spent almost a decade studying compilers, I am genuinely curious about the author himself.
These works are something I both understand and would never achieve myself. These are cultural artifacts, like deeply personal poetry, made purely for the process of it. Not practically useful, not state of the art, not research level, but... a personal journey?
If the author is reading this... can you share your vision? Motivation?
nils-m-holm 1 day ago [-]
Thank you so much for reading my books and describing my work in such beautiful words! You basically answered your own question! My motivation is just the creation of something I find beautiful. The vision, to pass knowledge to those who seek it in the simplest possible way, where "simple" does not necessarily mean in the tersest form, but in a form that invites being digested.
I do not usually talk much about "myself". I tried, but with no-one asking, I find it difficult to say anything.
mark_l_watson 19 hours ago [-]
I just bought ePubs for your Raja Yoga Revisited (I usually study SRF material, but alternatives are good!) and Scheme 9 from Empty Space. Your web site is very nice, I loved the ‘Who am I?’ page. I have been using Lisp languages since 1978 but except for studying Peter Norvig’s Lisp in Python, I have never dropped below the abstraction layer into a Lisp implementation, so I am looking forward to that.
Thanks for An Introduction to Mental Development, I've thoroughly enjoyed it!
nils-m-holm 24 hours ago [-]
So have I! :)
And thanks for the Cage quote. I enjoyed that, too!
ronald_petty 1 day ago [-]
I just ordered this book. Looking forward to learning! Thank you for your effort.
butterisgood 1 day ago [-]
I love that your lisp implementations are so portable. I believe one is available for Plan 9.
nils-m-holm 24 hours ago [-]
Thanks! And yes, Scheme 9 from Empty Space should compile on Plan 9! http://t3x.org/s9fes/
latexr 22 hours ago [-]
> Scheme 9 from Empty Space
As a fan of word plays, that got a genuine chortle out of me. Thank you.
matheusmoreira 1 day ago [-]
Thank you for your work! I share your feelings.
marttt 1 day ago [-]
+1, long time follower of nmh's work. His books are brief and concise, but carry a peculiar "something", a precision of expression, etc., that is hard to put into words, but can often be noticed in long-time practitioners of some mental teaching. :)
It is always interesting to spot a person on the interwebs who seems to actually have managed to turn buddhist or some other teachings into real world deeds. Living really modestly (IIRC, he/you also uses modest, underclocked laptops?), publishing for the benefit of many, and doing all this for years and years. Like, there seems to be no "overhead" in this way of living. Hugely inspirational.
I would also point out the "Essays" section on nmh's webpage, especially the ones discussing sensitivity and high IQ: https://t3x.org/#essays
Having purchased several of your books, thanks for your work, nmh!
nils-m-holm 24 hours ago [-]
Thank you for your kind description of my work!
Turning the Buddhist (or other) teachings into deeds is not too hard once you have understood who you are, and, maybe more importantly, who you are not. Figuring /that/ out can be tough and require a lot of practice.
What people perceive as modest is really an acceptance or even appreciation of what is. My apartment has not been renovated in decades, I repair what needs repair and otherwise leave things to themselves. I wear clothes until they disintegrate, and my hardware is already old when I buy it. This is the course of things. Things age and change and at some point disappear. Why prefer the new over the old? Why the old over the new? It is just that things and beings get old on their own, and it is much more joyful to witness this than trying to resist it.
nils-m-holm 24 hours ago [-]
Many thanks to everybody who wrote in this thread! Your words mean a lot to me! I will reply to some individual messages. If I don't, please substitute "thank you!" :)
AlexeyBrin 1 day ago [-]
I second this, would be great if someone did a long form video interview with the author.
bakul 1 day ago [-]
Read the author’s “Raja Yoga Revisited”.
matheusmoreira 1 day ago [-]
> These are cultural artifacts, like deeply personal poetry, made purely for the process of it. Not practically useful, not state of the art, not research level, but... a personal journey?
I can't speak for the author but this is exactly how I look at the lisp I'm developing. It's a lifetime project. I had some kind of vision depicting how different things could be, and at some point I started trying to make it happen. I want to convince myself I'm not insane for thinking it was possible in the first place.
nils-m-holm 23 hours ago [-]
It's good to have a life-time project and watch it evolve over time. Nothing insane about that!
I love it so much, and seeing your bibliography makes me feel like a kid in a candy store. The confluence of Asian philosophy and computing is delightful.
Where is the bibliography? (I searched for it, but couldn't find it, expecting to find a list of books which the author referenced in writing/researching)
2wrist 24 hours ago [-]
That is a joy! Thank you.
tromp 2 days ago [-]
Looking at file church.scm from the provided zip file [1], I see the following
functions used to construct lists:
I forgot to add the atom boolean to the nil representation. I made some changes that hopefully fix that.
nils-m-holm 1 day ago [-]
Indeed, thanks!
john-tells-all 1 day ago [-]
Purchased the author's `Scheme 9 from Empty Space` book and loved it. Lots of very well-commented and explained code, on how to build a language up from the beginning. So much fun.
I clicked on this and immediately wanted to buy it. But then someone in the comments said to also look at your other books and well damn, now I want to read all of them and I can't choose which to start with.
That's super helpful! I downloaded the samples for S9fES and LfN, checking those out first. I love how generous you are with the number of pages in the free samples by the way.
mindcrime 1 day ago [-]
Looks awesome. Just ordered a copy. I'm just now picking up Peter Seibel's Practical Common Lisp again and taking another stab at immersing myself in the world of Lisp. So this is perhaps fortuitous timing.
I love Lisp (I'm an Emacs user and often write in Racket for personal projects) but the one thing I never understood about the Lisp community is the emphasis placed on metacircular evaluators.
I sure find them beautiful and all, but why do they take center stage so often? Beside the aesthetics and instructional value, I don't get the appeal. Also I feel that a bunch of the heavy lifting behind metacircular evaluators is actually done by the Polish notation syntax as well as the actual implementation, and these concepts don't get nearly as much love.
Any Lisper who can illuminate me?
Quitschquat 1 day ago [-]
Long time lisper. It just doesn’t feel right unless your language can compile your language. It’s like wearing someone else’s underwear.
fuzztester 1 day ago [-]
or interpret
Y_Y 21 hours ago [-]
Agreed, I hate interpreting other people's underwear.
bifftastic 18 hours ago [-]
Username checks out
rootnod3 1 day ago [-]
The metacircular evaluator shows how code is data and data is code.
And in a way it’s like Maxwell’s equations. A simple proof of computation that also somehow implements a very neat language.
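A minimal sketch of that code-is-data idea in Python (in the spirit of Norvig's lis.py, which comes up elsewhere in this thread): programs are plain nested lists, so the evaluator is ordinary code walking ordinary data. Illustration only; this is not the book's code.

```python
import operator

def evaluate(x, env):
    """Evaluate an s-expression represented as nested Python lists."""
    if isinstance(x, str):                  # symbol: look it up
        return env[x]
    if not isinstance(x, list):             # literal, e.g. a number
        return x
    op, *args = x
    if op == "quote":                       # (quote e): return e as data
        return args[0]
    if op == "if":                          # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                      # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    f = evaluate(op, env)                   # application
    return f(*[evaluate(a, env) for a in args])

global_env = {"+": operator.add, "*": operator.mul}

# ((lambda (x) (+ x 1)) 41)
prog = [["lambda", ["x"], ["+", "x", 1]], 41]
print(evaluate(prog, global_env))   # 42
```

The same list structure that `evaluate` walks could equally be produced or rewritten by the program itself, which is the point being made above.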
Y_Y 21 hours ago [-]
But of course you must close the loop by representing Maxwell's equations electromagnetically.
I know this is a classic analogy, but now you've got me wondering: originally Maxwell wrote a messy pile of equations of scalars; later someone (Gibbs?) gave them the familiar vector calculus form. Nowadays we have a marvellously general and terse form, like (using the differential of the Hodge dual in naturalised units):
d star(F) = J
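Written out, that compact form is one of the two equations that together bundle all four of Maxwell's (conventions for units and the sign of the Hodge star vary):

```latex
dF = 0 \qquad\qquad d{\star}F = J
```

The first packs together Gauss's law for magnetism and Faraday's law; the second packs together Gauss's law and the Ampere-Maxwell law.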
My question is, when are we going to get some super-compact unified representation of `eval`?
tromp 17 hours ago [-]
It's hard to get a super compact eval for LISP with its many primitives. It's somewhat easier for the lambda calculus which inspired LISP, with only the 3 primitives of variable, abstraction, and application. In the binary lambda calculus this allows for a self-interpreter that tokenizes and parses a closed lambda term from a raw binary input stream and passes the term and the remainder stream to a given continuation [1].
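As a sketch of just the parsing half of that: the standard BLC encoding uses 00 for abstraction, 01 for application, and a run of 1s ended by 0 for a de Bruijn variable index. A toy Python decoder (hypothetical helper names, not Tromp's actual interpreter) that returns the term plus the position where the remainder stream begins:

```python
def parse(bits, i=0):
    """Parse one BLC term from bits starting at i; return (term, next_index)."""
    if bits[i] == "0":
        if bits[i + 1] == "0":              # 00 <body>: abstraction
            body, i = parse(bits, i + 2)
            return ("lam", body), i
        f, i = parse(bits, i + 2)           # 01 <f> <arg>: application
        a, i = parse(bits, i)
        return ("app", f, a), i
    n = 0                                   # 1^n 0: variable, de Bruijn index n
    while bits[i] == "1":
        n, i = n + 1, i + 1
    return ("var", n), i + 1                # skip the terminating 0

# The identity function, lambda x. x, encodes as 0010:
term, rest = parse("0010")
print(term)   # ('lam', ('var', 1))
```

Here `rest` plays the role of the remainder stream handed to the continuation in the real self-interpreter.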
It is a bit similar. The first versions in Lisp 1.0 and the Lisp 1.5 paper were working, but not fully refined. SICP and others later presented it in a more refined form, in my opinion.
There's also a version of the metacircular interpreter written entirely in M-expressions, but it kind of breaks the spirit of things.
I think the version of eval that we have is already pretty terse for what it is. You could maybe code-golf it into something smaller, or you could code-golf it into something fully immutable.
My only gripe is that they all rely on an already existing reader that parses the expressions for you and represents them, which is exactly what the book is about.
Finding a small enough interpreter that does ALL of it would be a dream, but I doubt it could be anywhere near as concise as the (modern) Maxwell equations.
kingaillas 19 hours ago [-]
>later someone (Gibbs?) gave them the familiar vector calculus form.
It was Oliver Heaviside (https://en.wikipedia.org/wiki/Oliver_Heaviside) that rewrote Maxwell's original equations (20 of them in differential form) into the notation used today (4 of them in vector calculus form).
Thanks for resubmitting the posting! I appreciate it! :)
nils-m-holm 4 days ago [-]
Second edition, with a new chapter on lambda calculus.
gritzko 2 days ago [-]
Thanks. I recently had to reinvent LISP to script my CRDT database.
That was not much work, because I already had the notation (I use RDX, a JSON superset with CRDT types).
Still, I stumbled at the idiosyncratic LISP bracketing. Luckily, RDX allows for different tuple notations. So, I styled it to look less alien to a curly-braced developer. Like this https://github.com/gritzko/go-rdx/blob/main/test/13-getput.j...
For example, print change-dir make-dir; is equivalent to (print (change-dir (make-dir) ) ) in the old money. I wonder if I am reinventing too much here.
Did LISPers try to get rid of the brackets in the past?
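The desugaring described above is mechanical; a toy Python version (not RDX's actual parser) that nests a juxtaposed chain right-to-left into classic LISP bracketing:

```python
def chain_to_sexpr(line):
    """Turn 'f g h;' into '(f (g (h)))', nesting right to left."""
    expr = None
    for name in reversed(line.rstrip(";").split()):
        expr = f"({name})" if expr is None else f"({name} {expr})"
    return expr

print(chain_to_sexpr("print change-dir make-dir;"))
# (print (change-dir (make-dir)))
```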
drob518 2 days ago [-]
There have been many attempts to get rid of sexprs in favor of a “better” syntax. Even John McCarthy, the inventor (discoverer?) of Lisp had plans for an “M-expression” syntax to replace “S-expressions.” It never happened. The secret is that Lispers actually view sexprs as an advantage, not something to be worked around. Once you discover symbolic editing and code manipulation based on sexprs, you’ll never go back to weak line editing. That said, some Lisp dialects (e.g. Clojure and Racket) have embraced other symbols like square and curly brackets to keep the code more terse overall and optically break up longer runs of parentheses.
Probably the best example of a “Lisp without parentheses” is Dylan. Originally, Dylan was developed as a more traditional Lisp with sexprs, but they came up with a non-sexpr “surface syntax” before launching it to avoid scaring the public.
xedrac 1 days ago [-]
I actually really appreciate Racket's judicious use of square brackets in let expressions. It just makes visual parsing that much easier.
drob518 1 days ago [-]
Exactly. I also like Clojure’s use of square brackets for vectors and curly braces for maps. It eliminates all the “vector-” and “map-” function calls.
xedrac 1 days ago [-]
Those are big quality of life improvements. I wish the other lisps would follow suit. I suppose I could just implement them myself with some macros, but having it standard would be sweet.
tmtvl 21 hours ago [-]
The Revised Revised Revised Revised Revised Revised Report on the Algorithmic Language Scheme (R6RS) specified that square brackets should be completely interchangeable with round brackets, which allows you to write let bindings or cond clauses like so:
  (let ([a (get-some-foo 1)]
        [b (get-some-foo 2)])
    (cond [(> a b) -1]
          [(< a b)  1]
          [else     0]))
...but I hate that, I'd much prefer if square brackets were only used for vectors, which is why I have reader macros for square brackets -> vectors and curly brackets -> hash tables in my SBCL run commands.
drob518 18 hours ago [-]
I think the R6S behavior helps with visual matching, but squanders using square brackets for something more useful (e.g. vectors), which is a shame. Another thing Clojure does is copy Arc in eliminating parentheses around the pairs of forms in let bindings and cond forms, which really aren’t needed. It just expects pairs of forms and the compiler objects if given an odd number. The programmer can use whitespace (notably newlines) to format the code so the pairings are visibly apparent. That reduces a surprising amount of needless parentheses because let binding forms are used all over (less so cond forms).
tmtvl 13 hours ago [-]
Doing structural editing with unbracketed let bindings is pretty awful, though. And cond clauses being bracketed helps when needing multiple forms:
True, with modern machine-generated mass-operations refactoring is easier than with older tools, but that doesn't mean a given set of brackets is 'useless'.
kragen 10 hours ago [-]
I think it's a matter of whether you're programming in a mostly applicative way† or in a more imperative way. Especially in the modern age of generational GC, Lisp cons lists support applicative programming with efficient applicative update, but sacrifice efficiency for certain common operations: indexing to a numerical position in a large list, appending to a list, or doing a lookup in a finite map such as an alist. So, in Common Lisp or Scheme, we are often induced to use vectors or hash tables, sacrificing applicative purity for efficiency—thus Perlis's quip about how purely applicative languages are poorly applicable, from https://www.cs.yale.edu/homes/perlis-alan/quotes.html.
In general a sequence of expressions of which only the value of the last is used, like C's comma operator or the "implicit progn" of conventional cond and let bodies, is only useful for imperative programming where the non-last expressions are executed for their side effects.
Clojure's HAMTs can support a wider range of operations efficiently, so Clojure code, in my limited experience, tends to be more purely applicative than code in most other Lisps.
Incidentally, a purely applicative finite map data structure I recently learned about (in December 02023) is the "hash trie" of Chris Wellons and NRK: https://nullprogram.com/blog/2023/09/30/. It is definitely less efficient than a hash table, but, in my tests so far, it's still about 100ns per hash lookup on my MicroPC and 250ns on my cellphone, compared to maybe 50ns or 100ns respectively for an imperative hash table without FP-persistence. It uses about twice as much space. This should make it a usable replacement for hash tables in many applications where either FP-persistence, probabilistically bounded insertion time, or lock-free concurrent access is required.
This "hash trie" is unrelated to Knuth's 01986 "hash trie" https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-..., and I think it's a greatly simplified HAMT, but I don't yet understand HAMTs well enough to be sure. Unlike HAMTs, it can also support in-place mutating access (and in fact my performance measurements above were using it).
______
† sometimes called "functional", though that can alternatively refer to programming with higher-order functions
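For a feel of the structure, here is a toy FP-persistent take on that 4-ary hash-trie idea in Python (the linked original is C with in-place upsert; the names and details here are a simplified sketch of the spirit, not the actual code). Each node branches on two bits of the hash per level:

```python
def ht_get(node, key):
    """Look up key in a trie of nodes (key, value, (c0, c1, c2, c3))."""
    h = hash(key)
    while node is not None:
        k, v, kids = node
        if k == key:
            return v
        node, h = kids[h & 3], h >> 2   # consume two hash bits per level
    return None

def ht_set(node, key, val):
    """Return a new trie; shares all untouched structure with the old one."""
    def go(n, h):
        if n is None:
            return (key, val, (None, None, None, None))
        k, v, kids = n
        if k == key:
            return (key, val, kids)     # replace value, keep children
        i = h & 3
        new_kids = kids[:i] + (go(kids[i], h >> 2),) + kids[i + 1:]
        return (k, v, new_kids)
    return go(node, hash(key))

t1 = ht_set(None, "a", 1)
t2 = ht_set(t1, "b", 2)
print(ht_get(t2, "a"), ht_get(t2, "b"), ht_get(t1, "b"))  # 1 2 None
```

Because `ht_set` copies only the path from the root down to the affected node, the old version `t1` stays valid and unchanged, which is the FP-persistence property mentioned above.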
I sometimes wonder if the issue is really the parentheses or the ease of nesting. In LISP it’s natural to write
(f (g (h x))).
Whereas most people are used to:
a = h(x);
b = g(a);
c = f(b);
In C/C++ most functions return error codes, forcing the latter form.
And then there are functional languages allowing:
x -> h -> g -> f
but I think the implicit parameter passing doesn’t sit well with a lot of programmers either.
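The three spellings are, of course, the same computation; side by side in Python (toy functions for illustration):

```python
from functools import reduce

def h(x): return x + 1
def g(x): return x * 2
def f(x): return x - 3

nested = f(g(h(10)))          # LISP-style (f (g (h x)))

a = h(10)                     # sequential, C-style
b = g(a)
c = f(b)

# pipeline-style x -> h -> g -> f, threading the implicit parameter
piped = reduce(lambda acc, fn: fn(acc), [h, g, f], 10)

print(nested, c, piped)       # 19 19 19
```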
jrapdx3 1 day ago [-]
Interesting comment. I found the lisp/sexpr form instantly understandable. While the others weren't hard to grasp, it took a moment to consciously parse them before their meaning was as clear. Perhaps the functional arrow notation is least appreciated because it seems more abstract, or maybe the arrows are just confusing.
More likely than not it's a matter of what a person gets used to. I've enjoyed working in Lisp/Scheme and C, but not so much in primarily functional languages. No doubt programmers have varied histories that explain their preferences.
As you imply, in C one could write nested functions as f (g (h (x))) if examining return values is unnecessary. OTOH in Lisp return values are also often needed, prompting use of (let ...) forms, etc., which can make function nesting unclear. In reality programming languages are all guilty of potential obscurity. We just develop a taste for what flavor of obscurity we prefer to work with.
AlexeyBrin 2 days ago [-]
If you don't get an answer here, try to contact the author directly through his website; he is pretty responsive.
bryanlarsen 2 days ago [-]
Many times. A google for "sweet expressions lisp" will give you a bunch of implementations and opinions.
The book looks awesome. However, I find some irony in the presence of a “no AI” badge on the back cover, considering that Lisp began as AI research.
nils-m-holm 23 hours ago [-]
I am well aware of that :)
AlexeyBrin 1 day ago [-]
I read it as "no Gen AI" was used to write this book.
hirvi74 1 day ago [-]
"No AI was harmed in the making of this book."
nils-m-holm 23 hours ago [-]
And no AI has harmed the planet in the making of this book.
AnonC 1 days ago [-]
Under “The Intended Audience” (page 10 of the PDF sample on the site), it says that this is not an introduction to LISP and that it would be more enjoyable with some prerequisites.
Where does one — who has no knowledge of these prerequisites or about LISP (except that the latter has been heard in programming circles as something esoteric, extremely powerful, etc.) — start, before reading this book?
There's ANSI Common Lisp by Paul Graham. I've never read it and I'm not sure it's the best introduction but thumbing through it I don't see how you can get any more basic than that.
AnonC 18 hours ago [-]
Thanks for the recommendation. I appreciate it. I’ve heard about this book before, but never read it.
m-a-t-t-i 23 hours ago [-]
If you prefer hands-on learning, How to Design Programs is a pretty good resource for the foundations, with lots of examples and exercises: https://htdp.org
But learning the basics of lisp is more like a side effect, the focus is on program design.
Thanks a lot. I’ve heard about this too, but didn’t spend time to follow through on reading it.
epr 21 hours ago [-]
When I was a beginner, A Gentle Introduction to Symbolic Computation worked for me. As the title suggests, it gently introduces concepts in a very beginner friendly manner, so even macros are easy enough to grasp by the time you get there. The diagrams and examples are great.
Thanks for the recommendation. I appreciate it. I’ll definitely check this out.
Jach 22 hours ago [-]
One source of awe people have with the idea of Lisp is how much you can build off of so little. I like pg's Roots of Lisp paper on that https://justine.lol/sectorlisp/jmc.pdf The core thing was the meta-circular evaluator (eval) in the original Lisp paper. You can work through it or try re-implementing it in something else. I like this recent tiny version https://justine.lol/sectorlisp2/
Another source of awe is about Lisp being more of a programming system than a language, and Common Lisp was the standardization of a lot of efforts towards that by companies making large and industrial pieces of software like operating systems, word processors, and 3D graphics editors. At the language level, "compile", "compile-file", "disassemble", "trace", "break", "step" are all functions or macros available at runtime. When errors happen, if there's not an explicit handler for it (like an exception handler) then the default behavior isn't to crash but to trigger the built-in debugger. And the stack isn't unwound yet, you can inspect the local variables at every layer. (There's very good introspection in general for everything.) Various restarts will be offered at different parts of the stack -- for example, a value was unknown, so enter it now and continue. Or you can recompile your erroneous function and restart execution at one of the stack frames with the original arguments to try again. Or you can apt-get install some foreign dependency and try reloading it without having to redo any of the effort the program had already made along the way.
Again, all part of the language at runtime, not a suite of separate tools. Implementations may offer things beyond this too, like SBCL's code coverage or profiling features. All the features of the language are designed with this interactivity and redefinability in mind though -- if you redefine a class definition, existing objects will be updated, but you can control that more finely if you need to by first making a new update-instance-for-redefined-class method. (Methods aren't owned by classes, unlike other OOP languages, which I think eliminates a lot of the OOP design problems associated with those other languages.)
I like the book Successful Lisp as a tour of Common Lisp, it's got a suggested reading order in ch 2 for different skill levels: https://dept-info.labri.fr/~strandh/Teaching/MTP/Common/Davi... It's dated in parts as far as tooling goes but if you're mostly interested in reading about some bits rather than actively getting into programming with Lisp that's not so bad. If you do want to get into it, https://lispcookbook.github.io/cl-cookbook/ has some resources on getting started with a Lisp implementation and text editor (doesn't have to be emacs).
AnonC 18 hours ago [-]
Thank you very much for an elaborate reply. I really appreciate it. I’ll check out the books and links from your comment.
Fraterkes 1 day ago [-]
Has anyone here read his “Practical Compiler Construction”? It’s one of the shorter compiler books I’ve seen; seems like it might be a good way to learn a bit more about assembly.
shoobiedoo 1 day ago [-]
I was very curious about this too. I've had my finger hovering over the "buy" button for months but there are next to no reviews on it. I'm wondering how it differs from other, similar works
nils-m-holm 23 hours ago [-]
There are always the sample chapters, and the code from the book is in the public domain. :)
The book is basically a modern and more complete version of the "Small C Handbook" of the 1980s. It goes through all the stages of compilation, including simple optimizations, but keeps complexity to a minimum. So if you just want to learn about compiler writing and see what a complete C compiler looks like under the hood, without investing too much into theory, then this is probably one of very few books that will deliver.
Edit: and then Warren Toomey has written "A Compiler Writing Journey" based on PCC, which may shed a bit more light on the book: https://github.com/DoctorWkt/acwj
Fraterkes 20 hours ago [-]
Thx, I’m going to buy it I think!
neonrider 6 hours ago [-]
A hacker and a mystic. We need more of those.
hermitcrab 1 day ago [-]
The title "Lisp from nothing"
doesn't seem to fit with:
"INTENDED AUDIENCE
This is not an introduction to LISP."
on page 10.
gentooflux 1 day ago [-]
Nothing as in "from scratch", as opposed to Nothing as in "Visual Basic's NULL".
nils-m-holm 23 hours ago [-]
Yes, the book is about the bootstrapping of LISP, both in a historical and practical context. Hence "from nothing".
user3939382 1 day ago [-]
Did you guys hear Ladybird is gonna be ClojureScript by default /dream
nils-m-holm 2 days ago [-]
tug2024 wrote:
> Doesn’t lisp extend lambda calculus (abstraction . application)? As a consequence, lisp (abstraction . application . environment)!
Another valid question downvoted into oblivion.
The environment in (lexically scoped) LISP is an implementation detail. Lambda calculus does not need an environment, because variables are substituted on a sheet of paper. So lambda calculus equals lexically scoped LAMBDA in LISP.
Sure, you could view LISP as LC plus some extra functions (that are not easily implemented in LC).
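The "substitution on a sheet of paper" point can be sketched directly. Below, a single naive beta step over a tiny term representation in Python (illustration only; this substitution is not capture-avoiding, so it is only safe when variable names don't clash):

```python
def subst(term, var, val):
    """Replace free occurrences of var in term with val (naive)."""
    if isinstance(term, str):               # variable
        return val if term == var else term
    if term[0] == "lam":                    # ("lam", bound-var, body)
        _, v, body = term
        return term if v == var else ("lam", v, subst(body, var, val))
    _, f, a = term                          # ("app", function, argument)
    return ("app", subst(f, var, val), subst(a, var, val))

def beta(term):
    """One beta step: ((lam v body) a) -> body with v replaced by a."""
    tag, f, a = term
    assert tag == "app" and f[0] == "lam"
    return subst(f[2], f[1], a)

# ((lambda (x) (x x)) y)  -->  (y y)
redex = ("app", ("lam", "x", ("app", "x", "x")), "y")
print(beta(redex))   # ('app', 'y', 'y')
```

No environment appears anywhere: the variable is simply replaced in the text of the term, which is the contrast with environment-based implementations being drawn above.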
tug2024 1 day ago [-]
[dead]
rootnod3 1 day ago [-]
Damn. I ordered the first edition a few weeks back and now the second edition is out :D
nils-m-holm 23 hours ago [-]
Aww, man. If you haven't yet purchased the 2nd ed, send me an email!
rootnod3 18 hours ago [-]
Done
fermigier 1 day ago [-]
"... and the chicks for free "?
aidenn0 1 day ago [-]
I have listened to that song probably 100s of times and always heard "checks," so I just learned something new about that song. Thanks.
Maybe less embarrassing than talking about Rock the Cashbar by The Clash (though that one was corrected the first time I saw the back of the album).
nils-m-holm 1 day ago [-]
Haha, yes! You are the first one to notice or, at least, to respond.
fuckaj 1 day ago [-]
[dead]
globular-toast 1 day ago [-]
Can anyone compare this with Queinnec's Lisp in Small Pieces? I was waiting for an English version of the 2nd edition but I guess it's never happening and my French has unfortunately regressed since then.
nils-m-holm 23 hours ago [-]
LISP in Small Pieces discusses very sophisticated techniques, while LISP From Nothing is more about the quirks and implementations of early LISP. Of course you can write a modern LISP based on the things covered in LFN, but if you are planning to write more than a toy, then Queinnec's book is the one to read.
tug2024 2 days ago [-]
[dead]
Woodi 20 hours ago [-]
When will it stop? These minimal languages... To be useful for anything, a language needs at least a minimal standard library.
Or at least the possibility of making syscalls to do something. That matters more than new syntax and sugar over basic instructions.
stellalo 20 hours ago [-]
I don’t think the book aims at being “useful” in the usual sense of the term. Nor does the minimal language it builds.
(They are probably “useful” in the dissemination of what the real essence of computation can reduce to, in practical terms.)
Not everything needs to be useful in fact: certain things can be just enjoyed in their essence, just looked at and appreciated. A bit like… art?
I am implementing my own Scheme as well. Why? I don’t know, one needs to do things that serve no apparent purpose, sometimes.
madmulita 20 hours ago [-]
Do I need a standard library to learn how to implement a language?
Our objectives might, and most probably will, be different.
EDIT: https://usesthis.com/interviews/nils.m.holm/
To put you in the correct headspace this Saturday morning: https://t3x.org/whoami.html
Enjoy your stay!
https://t3x.org/s9book/index.html
And then, at least for the compiler books, there is: http://t3x.org/files/whichbook.pdf
[1] https://tromp.github.io/cl/cl.html
Here's a nice comparison: https://ddcolrs.wordpress.com/2018/01/17/maxwells-equations-...
(credit to https://aphyr.com/posts/340-reversing-the-technical-intervie..., I always get a kick out of that and the follow up https://aphyr.com/posts/341-hexing-the-technical-interview).
Lisp from Nothing - https://news.ycombinator.com/item?id=24809293 - Oct 2020 (29 comments)
Lisp from Nothing - https://news.ycombinator.com/item?id=24798941 - Oct 2020 (5 comments)
For example, print change-dir make-dir; is equivalent to (print (change-dir (make-dir))) in the old money. I wonder if I am reinventing too much here.
Did LISPers try to get rid of the brackets in the past?
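The bracket-free style above can be made precise if every operator has a known, fixed arity; the reader then nests calls on its own. A hypothetical Python sketch (the arity table is made up for illustration, reusing the names from the example):

```python
# Hypothetical arity table: how many arguments each word consumes.
ARITY = {"print": 1, "change-dir": 1, "make-dir": 0}

def read_form(tokens):
    """Turn one Polish-notation form into a nested s-expression."""
    head = tokens.pop(0)
    return [head] + [read_form(tokens) for _ in range(ARITY[head])]

def show(form):
    """Render a nested list back in 'the old money' (parenthesized form)."""
    return "(" + " ".join(show(f) if isinstance(f, list) else f
                          for f in form) + ")"

sexpr = read_form("print change-dir make-dir".split())
```

Here `show(sexpr)` gives "(print (change-dir (make-dir)))". The price of dropping the brackets is that arities must be fixed and known in advance, which is exactly what Lisp's variadic, data-is-code style refuses to promise.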
Probably the best example of a “Lisp without parentheses” is Dylan. Originally, Dylan was developed as a more traditional Lisp with sexprs, but they came up with a non-sexpr “surface syntax” before launching it, to avoid scaring the public.
In general a sequence of expressions of which only the value of the last is used, like C's comma operator or the "implicit progn" of conventional cond and let bodies, is only useful for imperative programming where the non-last expressions are executed for their side effects.
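The same point in Python terms; a small sketch of an implicit-progn shape, where only the last value matters, so the earlier expressions are worth writing only for their side effects:

```python
def progn(*values):
    # Python evaluates all arguments left to right before the call;
    # we keep only the last one, like an implicit progn.
    return values[-1]

log = []
result = progn(log.append("open file"),   # evaluated for effect, yields None
               log.append("write data"),  # likewise
               "done")                    # the value progn returns
```

In a purely applicative program the first two positions would be pointless, since their values are discarded; the construct only earns its keep once effects are involved.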
Clojure's HAMTs can support a wider range of operations efficiently, so Clojure code, in my limited experience, tends to be more purely applicative than code in most other Lisps.
Incidentally, a purely applicative finite map data structure I recently learned about (in December 02023) is the "hash trie" of Chris Wellons and NRK: https://nullprogram.com/blog/2023/09/30/. It is definitely less efficient than a hash table, but, in my tests so far, it's still about 100ns per hash lookup on my MicroPC and 250ns on my cellphone, compared to maybe 50ns or 100ns respectively for an imperative hash table without FP-persistence. It uses about twice as much space. This should make it a usable replacement for hash tables in many applications where either FP-persistence, probabilistically bounded insertion time, or lock-free concurrent access is required.
This "hash trie" is unrelated to Knuth's 01986 "hash trie" https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-..., and I think it's a greatly simplified HAMT, but I don't yet understand HAMTs well enough to be sure. Unlike HAMTs, it can also support in-place mutating access (and in fact my performance measurements above were using it).
______
† sometimes called "functional", though that can alternatively refer to programming with higher-order functions
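A much-simplified Python sketch of the FP-persistent idea behind that hash trie (not Wellons' C code: each level consumes 2 bits of the key's hash to pick one of 4 children, and insertion copies only the root-to-slot path while sharing everything else; Python's built-in hash stands in for a real hash function):

```python
class Node:
    """One trie node; each level consumes 2 bits of the key's hash."""
    __slots__ = ("key", "value", "children")
    def __init__(self, key, value, children=(None, None, None, None)):
        self.key, self.value, self.children = key, value, children

def lookup(node, key):
    h = hash(key)
    while node is not None:
        if node.key == key:
            return node.value
        node, h = node.children[h & 3], h >> 2
    return None

def insert(node, key, value, h=None):
    """FP-persistent insert: copies only the root-to-slot path."""
    h = hash(key) if h is None else h
    if node is None:
        return Node(key, value)
    if node.key == key:                      # overwrite in a fresh node
        return Node(key, value, node.children)
    i = h & 3
    kids = list(node.children)
    kids[i] = insert(kids[i], key, value, h >> 2)
    return Node(node.key, node.value, tuple(kids))

t1 = insert(None, "a", 1)
t2 = insert(t1, "b", 2)   # t1 is untouched: it still lacks "b"
```

Each insert allocates O(log n) nodes, which is where the roughly 2x space and the constant-factor slowdown over a mutable hash table come from; in exchange, every old version stays valid.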
In C/C++ most functions return error codes, forcing the latter form.
And then there are functional languages allowing: x -> h -> g -> f but I think the implicit parameter passing doesn’t sit well with a lot of programmers either.
More likely than not it's a matter of what a person gets used to. I've enjoyed working in Lisp/Scheme and C, but not so much in primarily functional languages. No doubt programmers have varied histories that explain their preferences.
As you imply, in C one could write nested functions as f (g (h (x))) if examining return values is unnecessary. OTOH in Lisp return values are also often needed, prompting use of (let ...) forms, etc., which can make function nesting unclear. In reality programming languages are all guilty of potential obscurity. We just develop a taste for what flavor of obscurity we prefer to work with.
Thanks.
Where does one — who has no knowledge of these prerequisites or about LISP (except that the latter has been heard in programming circles as something esoteric, extremely powerful, etc.) — start, before reading this book?
But learning the basics of Lisp is more like a side effect; the focus is on program design.
https://www.cs.cmu.edu/~dst/LispBook/book.pdf
Another source of awe is about Lisp being more of a programming system than a language, and Common Lisp was the standardization of a lot of efforts towards that by companies making large and industrial pieces of software like operating systems, word processors, and 3D graphics editors. At the language level, "compile", "compile-file", "disassemble", "trace", "break", "step" are all functions or macros available at runtime. When errors happen, if there's not an explicit handler for it (like an exception handler) then the default behavior isn't to crash but to trigger the built-in debugger. And the stack isn't unwound yet, you can inspect the local variables at every layer. (There's very good introspection in general for everything.) Various restarts will be offered at different parts of the stack -- for example, a value was unknown, so enter it now and continue. Or you can recompile your erroneous function and restart execution at one of the stack frames with the original arguments to try again. Or you can apt-get install some foreign dependency and try reloading it without having to redo any of the effort the program had already made along the way.
Again, all part of the language at runtime, not a suite of separate tools. Implementations may offer things beyond this too, like SBCL's code coverage or profiling features. All the features of the language are designed with this interactivity and redefinability in mind though -- if you redefine a class definition, existing objects will be updated, but you can control that more finely if you need to by first making a new update-instance-for-redefined-class method. (Methods aren't owned by classes, unlike other OOP languages, which I think eliminates a lot of the OOP design problems associated with those other languages.)
I like the book Successful Lisp as a tour of Common Lisp, it's got a suggested reading order in ch 2 for different skill levels: https://dept-info.labri.fr/~strandh/Teaching/MTP/Common/Davi... It's dated in parts as far as tooling goes but if you're mostly interested in reading about some bits rather than actively getting into programming with Lisp that's not so bad. If you do want to get into it, https://lispcookbook.github.io/cl-cookbook/ has some resources on getting started with a Lisp implementation and text editor (doesn't have to be emacs).
The book is basically a modern and more complete version of the "Small C Handbook" of the 1980s. It goes through all the stages of compilation, including simple optimizations, but keeps complexity to a minimum. So if you just want to learn about compiler writing and see what a complete C compiler looks like under the hood, without investing too much into theory, then this is probably one of very few books that will deliver.
Edit: and then Warren Toomey has written "A Compiler Writing Journey" based on PCC, which may shed a bit more light on the book: https://github.com/DoctorWkt/acwj
doesn't seem to fit with:
"INTENDED AUDIENCE This is not an introduction to LISP."
on page 10.
Another valid question downvoted into oblivion.
The environment in (lexically scoped) LISP is an implementation detail. Lambda calculus does not need an environment, because variables are substituted on a sheet of paper. So lambda calculus equals lexically scoped LAMBDA in LISP.
Sure, you could view LISP as LC plus some extra functions (that are not easily implemented in LC).
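The substitution point can be made concrete. A tiny substitution-based reducer in Python, doing on the machine what one would do "on a sheet of paper" (a sketch that sidesteps variable capture by assuming all bound names are distinct; terms are plain tuples):

```python
# Lambda-calculus terms as tuples:
#   ("var", x) | ("lam", x, body) | ("app", fn, arg)

def subst(term, name, value):
    """Replace free occurrences of `name` in `term` with `value`.
    (Assumes no variable capture, i.e. all bound names are distinct.)"""
    tag = term[0]
    if tag == "var":
        return value if term[1] == name else term
    if tag == "lam":
        x, body = term[1], term[2]
        if x == name:        # `name` is shadowed here; stop
            return term
        return ("lam", x, subst(body, name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def normalize(term):
    """Beta-reduce in normal order until no redex remains."""
    if term[0] == "app":
        fn = normalize(term[1])
        if fn[0] == "lam":
            return normalize(subst(fn[2], fn[1], term[2]))
        return ("app", fn, normalize(term[2]))
    if term[0] == "lam":
        return ("lam", term[1], normalize(term[2]))
    return term

# ((lambda (x) x) y) reduces to y, with no environment in sight:
identity_app = ("app", ("lam", "x", ("var", "x")), ("var", "y"))
```

An environment-based evaluator would instead bind x to y in a lookup table and never rewrite the term; the environment is the implementation's stand-in for the substitution done above.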
Maybe less embarrassing than talking about Rock the Cashbar by The Clash (though that one was corrected the first time I saw the back of the album).
Or just the possibility of doing syscalls to get something done, which is more important than new syntax and sugar over basic instructions.
(They are probably “useful” in the dissemination of what the real essence of computation can reduce to, in practical terms.)
Not everything needs to be useful in fact: certain things can be just enjoyed in their essence, just looked at and appreciated. A bit like… art?
I am implementing my own Scheme as well. Why? I don’t know, one needs to do things that serve no apparent purpose, sometimes.
Our objectives might, and most probably will, be different.