Lush: My favorite small programming language (scottlocklin.wordpress.com)
andai 236 days ago [-]
Previously:

https://news.ycombinator.com/item?id=34908067

https://news.ycombinator.com/item?id=9602430

https://news.ycombinator.com/item?id=2406325

Also this comment:

> "Lush" stands for "Lisp Universal Shell". It has not just S-expression syntax but recursion, setq, dynamic typing, quoting of S-expressions and thus lists and homoiconicity, cons, car, cdr, let*, cond, progn, runtime code evaluation, serialization (though bread/bwrite rather than read/print), and readmacros. Its object system is based on CLOS.

https://news.ycombinator.com/item?id=28728302

alpinesol 236 days ago [-]
Fun fact: Lush was invented by Yann LeCun, of convnet and FAIR fame.
ngriffiths 236 days ago [-]
Makes me curious what state R was in at the time, or whatever else could've been useful for deep learning, and about the benefits of a new language vs. adapting something that already existed. Seems like it was a big investment.
antononcube 236 days ago [-]
R and its ecosystem have some unbeatable features, but, generally speaking, the "old" base R is too arcane to be widely useful. Also, being "made by statisticians for statisticians" should be a big warning sign.
nxobject 234 days ago [-]
Despite R being made by statisticians, I ironically find that munging R packages together for certain classes of analysis is such a slog that it prevents me from doing the actual statistical thinking. Sometimes the plots fall behind commercial packages, sometimes the diagnostics, and sometimes you have to combine multiple incompatible packages to get what a commercial package can do.

(Survival analysis and multilevel modeling come to mind.)

wdkrnls 232 days ago [-]
This is so far from my experience. For me, R code does tend to skimp on polish, so it takes longer to get to the initial figure, but that is made up for by letting me see the data from a much richer perspective (to some extent because I had to think harder about what the output meant), such that I can find all the bugs in the data and in the underlying experimental plan: the stuff which makes it clear the commercial reports are mostly useless anyway, because garbage in -> garbage out.
wdkrnls 232 days ago [-]
On the contrary, I find base R less arcane than the Python libraries du jour that copied it.
_Wintermute 236 days ago [-]
In my opinion R should be thought of as an unbeatable graphical calculator, but an awful programming language.
williamcotton 236 days ago [-]
The tidyverse collection of packages makes things a lot more sane, IMO:

  library(tidyverse)  # read_csv() from readr; the verbs and plotting below are dplyr/ggplot2

  penguins <- read_csv("penguins.csv") |>
    na.omit() |>
    select(species, island, bill_length_mm, body_mass_g) |>
    group_by(species, island) |>
    summarize(
      mean_bill_length = mean(bill_length_mm),
      mean_mass = mean(body_mass_g),
      n = n()
    ) |>
    arrange(species, desc(mean_bill_length))
  
  penguins |>
    ggplot(aes(x = species, y = mean_bill_length, fill = island)) +
    geom_col(position = "dodge") +
    labs(
      title = "Mean Bill Length by Species and Island",
      y = "Mean Bill Length (mm)"
    ) +
    theme_minimal()
_Wintermute 235 days ago [-]
True, but trying to wrap any of that into a function rather than simple scripts makes you delve into the ever-deprecated API for non-standard evaluation.
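For what it's worth, the current escape hatch is rlang's {{ }} (embrace) operator, which has been reasonably stable. A minimal sketch, assuming the penguins data frame from the sibling comment and a made-up helper name:

  library(dplyr)

  # Wrap a grouped summary in a reusable function; {{ }} forwards the
  # bare column names into dplyr's non-standard evaluation.
  summarize_by <- function(data, group_col, value_col) {
    data |>
      group_by({{ group_col }}) |>
      summarize(
        mean_value = mean({{ value_col }}, na.rm = TRUE),
        n = n(),
        .groups = "drop"
      )
  }

  # e.g. summarize_by(penguins, species, bill_length_mm)

The older quo()/enquo()/!! incantations are the part that kept getting reshuffled; {{ }} has been around since rlang 0.4, if I remember right.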
currymj 236 days ago [-]
i would compare base R to basically a shell. meant to be used interactively. okay for small scripts. you can write big programs but it will get weird.
wdkrnls 232 days ago [-]
You must hate Lisp/Scheme then too, which have semantics similar to R's. In that case, books such as SICP would be lost on you.
perrygeo 235 days ago [-]
That's how I view it. I still use R for plotting and quick stats analyses but it is painful to do any real work.

I recommend the article "Evaluating the Design of the R Language" [1] - it reads like a horror story. The memory usage and performance are abysmal, the OO features are a mess, and the semantics are very weird ("best effort semantics" is about as predictable as it sounds!). The lexical scoping is based on Scheme but has so many weird edge cases. It's a dumpster fire of a language, but it somehow works for its intended purpose.
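To give a flavor of those edge cases (a toy example of my own, not one from the paper): default arguments are promises, evaluated lazily in the function's own environment, so reassigning a variable before the default is forced changes the result.

  # Hypothetical function; b's default is evaluated only when b is first used.
  f <- function(a, b = a * 2) {
    a <- 10
    b            # forced here, after a has been reassigned
  }
  f(1)           # returns 20, not 2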

[1] http://janvitek.org/pubs/ecoop12.pdf

knighthack 236 days ago [-]
What does 'small' really mean?

I would think of a language like Go as small (say, in comparison to Rust or Swift) - the language itself at least, if you discount the standard library.

I find the use of the word 'small' quite confusing.

jerf 236 days ago [-]
The author appears to be defining it in terms of the effort put into the language: basically, person-hours.

Go may be a small language by some definitions (and as my phrasing implies, perhaps not by others), but it is certainly one that has had a lot of person-hours put into it.

emmanueloga_ 236 days ago [-]
The problem is that there's no universal definition of "small" when it comes to languages.

An article on the Brown PLT blog [1] suggests analyzing languages by defining a core language and a desugaring function. A small core simplifies reasoning and analysis but can lead to verbose desugaring if features expand into many constructs. The boundary between the core and sugared language is flexible, chosen by designers, and reflects a balance between expressiveness and surface simplicity.

Feature complexity can be evaluated by desugaring: concise mappings to the core suggest simplicity, while verbose or intricate desugarings indicate complexity.

So, a possible definition of a "small" language could be one with both a small core and a minimal desugaring function.
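R happens to offer a tiny, concrete illustration of this (my example, not the blog's): since R 4.1 the native pipe is pure parser-level sugar, so the desugaring is directly observable by quoting an expression.

  # The |> operator is rewritten by the parser into an ordinary call,
  # so the quoted expression already shows the desugared core form.
  quote(x |> f(y))
  #> f(x, y)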

--

1: https://blog.brownplt.org/2016/01/08/slimming-languages.html

cannibalXxx 236 days ago [-]
do you already program with this language? what is your paradigm?
kgwgk 236 days ago [-]
“Already”?

This is about a language abandoned 15 years ago!

andai 236 days ago [-]
It's buried in the article, but Lush is from 1987!
kgwgk 236 days ago [-]

  SN(1987) neural network simulator for AmigaOS (Leon Bottou, Yann LeCun)
   |
  SN1(1988) ported to SunOS. added shared-weight neural nets and graphics (LeCun)
   |   \ 
   |   SN1.3(1989) commercial version for Unix (Neuristique)
   |   /
  SN2(1990) new lisp interpreter and graphic functions (Bottou)
     |   \ 
     |   SN2.2(1991) commercial version (Neuristique)
     |    |
     |   SN2.5(1991) ogre GUI toolkit (Neuristique)
     |   / \ 
      \ /  SN2.8(1993+) enhanced version (Neuristique)
       |     \ 
       |   TL3(1993+) lisp interpreter for Unix and Win32 (Neuristique)
       |      [GPL]
       |        \_______________________________________________
       |                                                        |
     SN27ATT(1991) custom AT&T version                          |
       |        (LeCun, Bottou, Simard, AT&T Labs)              |
       |                                                        |
     SN3(1992) IDX matrix engine, Lisp->C compiler/loader and   |
       |       gradient-based learning library                  |
       |       (Bottou, LeCun, AT&T)                            |
       |                                                        |
     SN3.1(1995) redesigned compiler, added OpenGL and SGI VL   |
       |         support (Bottou, LeCun, Simard, AT&T Labs)     |
       |                                                        |
     SN3.2(2000) hardened/cleanup SN3.x code,                   |
       |         added SDL support (LeCun)                      |
       | _______________________________________________________|
       |/
       |
     ATTLUSH(2001) merging of TL3 interpreter + SN3.2 compiler
     [GPL]         and libraries (Bottou, LeCun, AT&T Labs).
       |
     LUSH(2002) rewrote the compiler/loader (Bottou, NEC Research Institute)
     [GPL]
       |
     LUSH(2002) rewrote library, documentation, and interfaced packages
     [GPL]      (LeCun, Huang-Fu, NEC)
https://lush.sourceforge.net/credits.html
peagreen 236 days ago [-]
I love this diagram. Is there a tool that generates such things? Or is there a name for this style of diagram that I could search for?

My prime use would be generating diagrams of function call chains in large Python code bases.

bandie91 235 days ago [-]
i am also interested in this.

i found vijual[1] and mermaid-ascii[2] are good starting projects.

[1]: http://www.lisperati.com/vijual/

[2]: https://github.com/AlexanderGrooff/mermaid-ascii

johnisgood 235 days ago [-]
How about pycallgraph, which can be exported to Graphviz?

FWIW, this is called an evolutionary or lineage (or hierarchical lineage) diagram, I believe.

fraserphysics 236 days ago [-]
Where does Ralf Juengling's work on Lush fit into this picture?
kgwgk 236 days ago [-]
funny_falcon 230 days ago [-]
But commits exist even this year: https://sourceforge.net/p/lush/activity/
236 days ago [-]
revskill 236 days ago [-]
[flagged]
anonzzzies 236 days ago [-]
And that's related to someone liking a language how? Especially one that's been dead for a long time...

Not to mention, you seem to be religiously pushing React, which is more of a DSL, but still...

revskill 236 days ago [-]
You mean, what do I mean and what do you mean? Thanks.