> Lisp hackers have been effortlessly reshaping the language for decades using the powerful macro system and extending and bending the language to their will.
I've written a bit of Racket code (https://github.com/evdubs?tab=repositories&q=&type=&language...) and I still haven't written a macro. In only one case did I even think a macro would be useful: merging class member definitions to include both the type and the default value on the same line. It's sort of a shame that Racket, a Scheme with a much larger standard library and many great user-contributed libraries, has to deal with the Scheme/Lisp marketing of "you can build low level tools with macros" when it's more likely that Racket developers won't need to write macros since they're already written and part of the standard library.
> But the success of Parsec has filled Hackage with hundreds of bespoke DSLs for everything. One for parsing, one for XML, one for generating PDFs. Each is completely different, and each demands its own learning curve. Consider parsing XML, mutating it based on some JSON from a web API, and writing it to a PDF.
What a missed opportunity to preach another gospel of Lisp: s-expressions. XML and JSON are forms of data that are likely not native to the programming language you're using (the exception being JSON in JavaScript). What is better than XML or JSON? s-expressions. How do Lisp developers deal with XML and JSON? Convert it to s-expressions. What about defining data? Since you have s-expressions, you aren't limited to XML and JSON and you can instead use sorted maps for your data or use proper dates for your data; you don't need to fit everything into the array, hash, string, and float buckets as you would with JSON.
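To make the comparison concrete, here is a small sketch (Racket-flavored; the data and field names are invented for illustration) of a JSON-ish document rewritten as an s-expression, using a type JSON can't express directly:

```scheme
;; The JSON document {"name": "Ada", "tags": ["lisp", "scheme"]}
;; written as an s-expression, with a real date added -- something
;; JSON would force into a string:
(define person
  '(person
    (name "Ada")
    (tags (lisp scheme))
    (born (date 1815 12 10))))

;; Reading a field is ordinary list processing:
(cadr (assq 'name (cdr person))) ; "Ada"
```

Because the representation is just lists and symbols, the same pattern-matching and traversal tools you already use on code work unchanged on the data.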
If you've been hearing about Lisp and get turned off by all of this "you can build a DSL and use better macros" marketing, know that Racket is a much more comfortable environment for a developer used to languages with large standard libraries like Java and C#.
aidenn0 4 hours ago [-]
How do Lisp developers deal with XML and JSON? Convert it to s-expressions.
As a common lisp developer, that is only very vaguely true for me.
The mapping I prefer for JSON<->Lisp falls out of my desire for the mapping to be bijective:
- The only built-in type that is unambiguously a mapping type is hash-table.
- nil is the only value that is falsy in CL.
- () is the same as nil, so we can't use it for an empty list; vectors are the obvious alternative.
- There aren't really any obvious values left to use for "null", so punt to a keyword.
0x3444ac53 5 hours ago [-]
For what it's worth, anytime I have written a macro it's usually not because it's needed, but just because I think it'll be fun :)
sparkie 1 hours ago [-]
When I learned Scheme, I liked the language but strongly disliked macros and quotation. I'd only been using it a short while, and when I searched for solutions to a few problems, these "fexpr" things kept coming up, which I didn't understand, along with this "Kernel" language. I decided to learn it, since "fexprs" were apparently the solution to several of my problems. This wasn't easy at first - I had to read the Kernel Report several times - but I ended up finding it way more intuitive than using macros and quotes.
I've not written a Scheme macro since. I've written hundreds of Kernel operatives though.
I was also a typoholic previously, but am in remission now thanks to Kernel.
https://web.cs.wpi.edu/~jshutt/kernel.html
Sometime back 15 years ago [0], I hit a bit of an existential crisis regarding my career and the kind of work I was doing.
I thought the particular technology I was working in was "part of the problem", as I felt pigeon-holed by .NET and C# to always be a corporate-monkey CRUD consultant. So, I went out in search of something better. Different programming languages. Different environments. Just something that wasn't working for asshole clients who thought it was okay to yell at people about an outage in a hotel on the complete opposite side of the country that was more due to local radio interference than anything I had done in the database code that configured things. Long story involving missing a holiday with my family over something completely outside of my control and yet I still got blamed for it. The problem wasn't the technology, it was the company I was working for, but at that time in my life, I didn't understand the difference.
Racket was a life preserver at that time.
It's really hard to explain, because I never actually ended up working in Racket full-time and I haven't even touched it in probably 10 years. But it still has this impact on my identity as a software developer. I learned Racket. I forced myself out of being a Blub programmer and into someone who saw the strings that underwrote The Universe. The beauty of S-Expressions and syntactic forms and code-is-data and all that. It had a permanent impact on my view of what this job could be.
I still work primarily in .NET. Most of the things that were technological issues with .NET Framework were resolved by what was first .NET Core and is now just .NET. So I no longer feel like my tools are holding me back. And I'll forever be thankful to Racket (and the community! The Racket listserv was amazing back then. Probably still is, I just don't interact with it anymore) for being there for me.
Edit: Haskell was in fact another language I explored at that time, in addition to OCaml and Ruby and Python (ugh! Don't get me started on Python!) and many other things. They were all "cool" in their own way, but nothing felt like Racket. They all had their own weird rules that felt like being bossed around again. Racket felt like art. Racket felt like it was there for me, not the other way around.
[0] I still think of this time as the "mid-point" in my career, but it's now been long enough ago that I've been more past the crisis than I was ever in it. Strange feelings.
privong 3 hours ago [-]
> You can pause, inspect objects, change values, and even redefine a broken function on the fly to test a fix in any environment (yes even in production, while running).
I see this mentioned often, and it sounds amazingly useful (especially the part about fixing in production!). But how widespread among the Lisp dialects is the ability to connect to a running program, debug it, and hotfix it? I understand Common Lisp has it, but I struggled to figure out how to do it in, say, Racket. Admittedly, I'm a relatively inexperienced Lisp programmer, so maybe I wasn't looking in the right place or for the right words. Which Lisp dialects do indeed support the extreme version of this capability to inspect and edit running programs?
drob518 34 minutes ago [-]
It’s common in Clojure as well as other Lisps. I was doing exactly that earlier this week: modifying a running program in production, adding print calls to gather debugging information, then modifying the code to fix the bug. The fix went live immediately, and I verified the correct behavior.
b00ty4breakfast 3 hours ago [-]
it's been my experience that when most people say "Lisp does this that or the other", what they usually mean is "Common Lisp does this that or the other". Often there's an implicit "with SLIME" in there as well
drob518 32 minutes ago [-]
This is doable in Common Lisp, Scheme/Racket, and Clojure. Yes, it might require some tooling.
privong 2 hours ago [-]
That could very well be it. I guess I had gotten my hopes up, seeing the statement in a piece that purported to be specifically about Scheme.
rtpg 3 hours ago [-]
Python is not Lisp, but jumping into a Python REPL in a halfway-run program and poking at the internals easily is _very_ useful as a debugging tool, quickly getting you answers on some messier programs.
It's a shame that other scripting languages that theoretically have the capabilities to do this don't do this (looking at you, node! Chrome dev tools are fine but way too futzy compared to `import pdb; pdb.set_trace()` and "just" using stdin)
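For readers who haven't seen the workflow described above: in Python it is literally one line dropped into the code you're investigating (the function here is a made-up example, not from any real codebase):

```python
# A made-up function standing in for "some messier program".
def running_total(items):
    total = 0
    for x in items:
        # Uncommenting the next line pauses execution right here and
        # drops you into an interactive debugger over stdin, with `x`,
        # `total`, and the whole call stack available for inspection:
        # import pdb; pdb.set_trace()
        total += x
    return total

print(running_total([1, 2, 3]))  # prints 6
```

Python 3.7+ also offers the built-in `breakpoint()` as a shorthand for the same thing.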
I do also use Emacs, and with Emacs Lisp `trace-function` means you can very quickly get call traces in your running instance without having to pull out a debugger and the like. Not like you can't trace functions with `gdb` of course. But the lowered barrier to entry and the ability to do in-process debugging dynamically means you just have access to richer debugging tools from the outset.
ch4s3 53 minutes ago [-]
In ruby it used to be common to ssh into a box, attach to the console and edit files from the REPL and rerun the code to see if your patch worked. I haven’t touched it in years and I doubt many people do that anymore.
wonger_ 2 hours ago [-]
Not Lisp, but for those interested in editing programs that are running in production:
I read some Erlang article arguing that hot swapping is not actually very useful in production for a few reasons, and that a blue-green deployment is preferred instead. Can't find the link atm. This was close: https://learnyousomeerlang.com/relups
Compare to this comment: https://news.ycombinator.com/item?id=42405168
Hot swaps for small patches and bugfixes, and hard restarts for changing data structures and supervisor tree.
ch4s3 56 minutes ago [-]
It's not that hot swapping isn't useful, it's just difficult to do well, and you need to write your code in a way that supports it. If you need zero downtime on a device that can't do a blue-green deployment, then the BEAM has you covered. Most people just don't need that, so the extra hassle isn't worth constantly considering how to migrate data in flight.
dmux 3 hours ago [-]
I also see this mentioned often and have wondered the same. I can sort of envision this working in a single threaded application, but how would this work in a web application for example? If a problematic function needs to be debugged, can you pick what thread you're debugging? If not, do all incoming requests get blocked while you debug and step through stack frames?
arikrahman 3 hours ago [-]
nREPL is present even in newer dialects. It's as easy as installing the Calva VS Code extension for Clojure, or jacking in with CIDER. This makes it perfect for LLM interaction as well.
kolme 7 hours ago [-]
> Of course, to be completely fair about my toolkit, standard Scheme can sometimes lack the heavyweight, “batteries-included” ecosystem required for massive enterprise production compared to the JVM.
I was thinking the whole time, "this person would _love_ Clojure".
nathan_compton 7 hours ago [-]
Kawa is a Scheme which runs on the JVM and is pretty great.
https://www.gnu.org/software/kawa/index.html
I am one of these people who cannot countenance a Lisp that doesn't have `syntax-case`.
packetlost 7 hours ago [-]
As a part-time Schemer, I also love Clojure and reach for it more often than Scheme these days.
ggm 18 hours ago [-]
> Actually, in my opinion, Scheme (and Lisp) allows you to express complex systems and problem domains in more simple terms than any other language can.
Short article. Worth reading. But all I swallowed was this one sentence.
It's the syntax. If you like semicolons, that's why you like Pascal-like languages.
reikonomusha 8 hours ago [-]
For all practical purposes, the syntax of Lisp isn't just a cosmetic choice, though.
rauli_ 7 hours ago [-]
Lisp was meant to be written with M-expressions instead of S-expressions anyway.
drob518 27 minutes ago [-]
If you want a Lisp that basically has M-expressions, try Dylan. It even started with an S-expression syntax initially and then converted to infix.
reikonomusha 6 hours ago [-]
For a brief period of time over 60 years ago, yes. :)
SideQuark 5 hours ago [-]
M-expressions were never implemented and never used.
vincent-manis 40 minutes ago [-]
Actually, variations on M-expressions have been created many times in the Lisp world. (Look what you can do with macros!) So far, none of them has caught on. The latest attempt for Scheme is SRFI-266, which creates a very nice infix expression sublanguage. If I were working on a team, I would encourage them to use this, but I don't know if it has enough traction to become widespread.
ux266478 5 hours ago [-]
Haskell's syntax comes from ISWIM, which was motivated quite a lot by m-expressions.
Grosvenor 5 hours ago [-]
Except in Mathematica, which isn't formally a Lisp, but practically it's used like one a lot of the time.
coldtea 5 hours ago [-]
Because they're elegant. Haskell is a conceptual and syntax mess.
isatty 42 minutes ago [-]
Haskell is very elegant and pretty. It's hard to describe what pretty is when it comes to programming languages, but imo golang is ugly, rust is good, and Haskell the best.
ngruhn 4 hours ago [-]
Compared to lisp? Ok fine. Syntax doesn't get more simple than Lisp. But compared to JavaScript? C++? C#? Haskell is top tier when it comes to syntactic and conceptual elegance. The biggest problem is tooling, I would say.
jes5199 2 hours ago [-]
I could not agree less. People used to call Python “executable pseudocode” - in that spirit, Haskell is executable pseudo-math. If you’ve done enough higher math that a professor’s whiteboard notation feels natural to you, then Haskell might feel like a reasonable approximation of that style. Otherwise: it’s line noise.
(I write Haskell professionally)
coldtea 3 hours ago [-]
I don't think:
"Haskell: more elegant than Javascript and C++" would make a good promotional motto.
That's like bragging about being prettier than Danny Trejo.
If you know lisp, just reach for Coalton instead of Haskell
anonzzzies 7 hours ago [-]
Coalton has some evolution to go before that, but it is good and flexible enough.
reikonomusha 7 hours ago [-]
What evolution in particular are you thinking of? The developers use it for commercial products in quantum computing and defense [1]. That doesn't mean it's done in some complete language ecosystem sense (which is discussed in [1], and one could argue Haskell also never feels "finished"), but it also doesn't seem like an unfinished hobby project. Given that it's embedded in Common Lisp, there's always a way to fill in the library gaps, sort of like how if a "native" library doesn't exist in Clojure, one can always reach for Java.
[1] From Toward Safe, Flexible, and Efficient Software in Common Lisp at the European Lisp Symposium, "[Coalton] has been used for the past 5 or so years [...] first in quantum computing and now a serious defense application." https://youtu.be/xuSrsjqJN4M&t=9m14s
anonzzzies 7 hours ago [-]
I am an avid SBCL and Coalton user (and sponsor of both when I can) and never said it was not a great thing; comparing it to Haskell is, outside the theoretical type-system roots, just a bit premature where the type system is concerned.
I agree with you further and you did an excellent promotional comment for Coalton and CL; keep doing that please. I have said many times here before that I did not like my time away from CL and Coalton makes it even better.
busterarm 8 hours ago [-]
I learned Scheme before Haskell and as much as I enjoyed the experience, I still wouldn't reach for Haskell first. It's pretty much limited to my xmonad configuration.
nathan_compton 7 hours ago [-]
I have written a very large codebase in Scheme (gambit) and in the end I really, really, wanted a type system to catch bugs.
Boxxed 1 hours ago [-]
Can you say more about the system? A lifetime ago I was really excited about gambit (and bigloo) but I never had the chance to work with them beyond messing around here and there after work.
That's why I switched to Common Lisp, its type system isn't perfect but it works well enough for my needs (especially with the occasional (describe 'sycamore:tree-insert) in the REPL).
rahen 6 hours ago [-]
Jank looks promising if you want a typed Lisp. It’s essentially native Clojure without the JVM:
https://jank-lang.org/
In case you're into machine learning, I'm also building something similar - a tensor-first, native Clojure-like ML framework.
busterarm 5 hours ago [-]
I get where you're coming from but I talked to a few folks working in large Haskell codebases and I'm not sure I would make that trade.
crabbone 6 hours ago [-]
I don't believe monads are a "heavy handed abstraction", nor that they're what prevents people from prototyping in Haskell.
What really prevents people from writing in Haskell at a reasonable speed is the poor language design. Programming languages are supposed to aid in reading by emphasizing structure. It's important to emphasize that a particular group of "words" constitutes a function call, or a variable definition, or a type definition -- whatever the language has to offer.
Haskell is a word salad. Every line you read, you have to read multiple times, every time trying to guess the structure from the disconnected acronyms. It belongs to the "buffalo buffalo buffalo buffalo" gimmick family. This is a huge roadblock on the way to prototyping as well as any other activity that implies the ability to read code quickly. And then it's also spiced by the most bizarre indentation rules invented by men.
This is not at all a problem with eg. SML or Erlang, even though they are roughly in the same category of languages.
Haskell would've been a much better language if it had made its syntax more systematic, disallowed syntactical extensions such as user-invented infix operators and overloading of literals (heaven, why???), and required parentheses around function arguments both for definition and for application. The execution model is great, the type system is great... but the surface, the front door to all these nice things the language has, is just some amateur-level nonsense.
* * *
As for the upsides of using languages from the Lisp family for practical problems... I don't find (syntax-rules ...) all that exciting. I understand this was an attempt to constrain the freedom given by Common Lisp macros, and I don't think it worked. I think it's clumsy and annoying to deal with. The very first time I tried to use it, I ran into its limitations, and that felt completely unjustified. To prototype, you want freedom of movement, not some pedantry that will stand in your way and demand you work around it somehow.
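For readers who haven't met it, (syntax-rules ...) is a pattern-matching, hygienic macro system; a minimal sketch (my own example, not from the thread):

```scheme
;; swap! exchanges the values of two variables. Hygiene guarantees the
;; introduced `tmp` can never capture a variable named `tmp` at the
;; call site -- but that same hygiene also makes it hard to write
;; macros that deliberately compute or introduce new identifiers.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

(define x 1)
(define y 2)
(swap! x y) ; now x is 2, y is 1
```

The limitation the commenter likely ran into is exactly that trade-off: anything that needs to generate identifiers or escape the pattern language falls outside what syntax-rules can express.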
The absolute selling point, however, is SWANK. Instead of editing the source code, you are editing the program itself, that can be interacted with in points of your choosing. I don't know of any modern language that offers this kind of experience. I think, even still in the 80s, this approach to programmers interacting with computers was common. At school, we had terminals with some variety of Basic, and it worked just like that: you type the program and it instantly shows the effect of your changes. Then, there was also Forth, which also worked in a similar way: it felt like you are "talking" to the computer in a very organized and structured way, but real-time.
Most mainstream languages today sprouted from the idea of batch jobs, where the programmer isn't at the keyboard when the program runs. They came with the need to anticipate and protect the programmer from every minor mistake they might've easily detected and fixed during an interactive session far, far in advance.
Whenever I think about writing in C, or Rust, or Haskell, I imagine being tasked with going to the grocery blindfolded: I'd need to memorize the number of steps, the turns, predict the traffic, have canned strategies for what to do when potatoes go on sale... I deeply regret that programming evolved using this evolution path, and our idea of what it means to program is, mostly, the skill of guessing the impossible to predict future, instead of learning to react to the events as they unfold.
drob518 20 minutes ago [-]
Try Clojure with CIDER/nREPL (roughly similar to SLIME/SWANK).
openuntil3am 4 hours ago [-]
Your criticism of Haskell is entirely subjective. There are lots of people, myself included, that like and prefer Haskell's syntax.
jonahx 4 hours ago [-]
From your last paragraph, I am curious which languages / paradigms you advocate for. Sorry it wasn't clear to me except that you like SWANK, which I'm not familiar with.
ngruhn 4 hours ago [-]
> Haskell is a word salad. Every line you read, you have to read multiple times, every time trying to guess the structure from the disconnected acronyms. This is a huge roadblock on the way to prototyping as well as any other activity that implies the ability to read code quickly.
I couldn't disagree more. Yes, there is more upfront work understanding Haskell code. But it's very dense. Once you understand the patterns, you can read it much quicker. Just like map/filter/fold are harder to understand than a for-loop, but once you do, you can immediately see what kind of iteration is applied. The for-loop can do all kinds of crazy index manipulation that you always have to digest from scratch.
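As a concrete illustration of that density (my example, not the commenter's): once map/filter/fold read as single units, a pipeline like the following is absorbed at a glance, while the equivalent indexed loop has to be re-derived each time.

```haskell
-- Sum of the squares of the even numbers: the shape of the iteration
-- is visible directly in the composition.
sumSquaresOfEvens :: [Int] -> Int
sumSquaresOfEvens = sum . map (^ 2) . filter even

main :: IO ()
main = print (sumSquaresOfEvens [1 .. 10]) -- prints 220
```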
> And then it's also spiced by the most bizarre indentation rules invented by men.
Again, quite surprised by this criticism. The rule is extremely simple: inner expressions must be indented more. You're free to decide by how much. That's why there are many "styles" out there. Maybe that's what you mean by bizarre. But it's not like the language is forcing weird constraints on you. If anything, the constraints are too lax. Any other language with non-mandatory indentation allows that as well. In general, I really don't understand why more languages don't do mandatory indentation. You only need curly braces and semicolons if you want the option to write a whole if/else/while/... statement on one line. But nobody does that.
T-R 2 hours ago [-]
> inner expressions must be indented more
Not to support the parent comment, which I disagree with, but if you use multi-line let-bindings, those require that you indent not just more than the previous line, but exactly as much as the first token after the let keyword on the previous line. It’s a very strange rule, all the more surprising because it’s inconsistent even with the rest of the language. It is totally avoidable if you, like I think most experienced Haskellers do, just prefer ‘where’, but people more familiar with procedural code usually lean into using ‘let’ everywhere because it feels more familiar.
I think the strange indentation used to be required in more places - I vaguely remember running into it a lot more when I started with Haskell 20 years ago, but that was also just when I was new to the language. These days I just keep ‘let’ to a bare minimum, so it doesn’t bother me. One thing that made Elm frustrating was that it disallowed ‘where’ clauses, forcing you to deal with this weird edge case all the time.
floxy 59 minutes ago [-]
So you want to line the equals signs up or similar?
    let
      f   = 9
      fo  = 10
      foo = 123
    in f+fo+foo

vs.

    let
      f = 9
      fo = 10
      foo = 123
    in f+fo+foo
T-R 12 minutes ago [-]
No, the issue is if the first binding is on the same line as the `let`, you are required to write, e.g.:
    someValue = let f = 9
                    fo = 10
                    foo = 123
                in f+fo+foo
rather than:
    someValue = let f = 9
        fo = 10
        foo = 123
        in f+fo+foo
I think it used to be the case even if it wasn't on the same line. Note also that `in` has to be indented past `someValue`, but doesn't need to be indented as far as `let`.
This is fine:
    someValue = let
        f = 9
        fo = 10
        foo = 123
        in f+fo+foo
So, it is possible to land on sane indentation, but the parser is much pickier than, e.g., Python's offside rule, so it takes some trial and error for new users to find it, and it can be frustrating if you're just temporarily modifying an expression to quickly try something out.
I honestly think it would be less surprising if the parser just disallowed writing the first binding on the same line as the `let` entirely, treating it as a block.
floxy 4 hours ago [-]
>And then it's also spiced by the most bizarre indentation rules
Are you mixing tabs and spaces? Maybe an example here would help.
>overloading of literals (heaven, why???)
No, this is important, so that default strings don't have to be something crummy. Even C++ got on this bandwagon.
>and requiring parentheses around function arguments both for definition and for application.
??? Again, an example would be helpful. Usually the complaint with Haskell is that people don't use enough parenthesis.
>The execution model is great
...I thought lazy execution was widely agreed to be the worst part of Haskell.
jgalt212 2 hours ago [-]
Irrespective of the language, I love the REPL. For this reason, among others, I just cannot get into Agentic Coding. It seems like a step back to batch processing.
z3ratul163071 8 hours ago [-]
[flagged]
anthk 6 hours ago [-]
I tried some ML language once; it was difficult even to write a basic factorial example, which in Scheme I could do iteratively and recursively with ease.
Either with S9 Scheme for quick fun (it has Unix sockets and ncurses :D ) or Chicken Scheme for completeness (R5RS/R7RS-small + modules), I always have fun with both.
Oh, and well, Forth, too, but more like a puzzle (although it shines at teaching you that you can do a lot with fixed point). Hint: write helpers for rationals (a/b, where a is an integer and b a non-zero integer) and for complex numbers by placing two items on the stack for each case (for the rational helpers you need four operations: a/b [+-*/] c/d).
You can have a look at qcomplex.tcl (either online or installed) as an example of how it can work even under JimTCL itself, by just sourcing that file. Magic: complex numbers under jimsh thanks to the algebraic properties. You can implement the same for yourself in some Forths, even under EForth for Muxleq. Useless? It depends; on an ESP32 it can be damn fast, faster than MicroPython.
zarakshR 6 hours ago [-]
I don't see how:
Racket:
    > (define (fact n)
        (if (= n 1)
            1
            (* n (fact (- n 1)))))
    > (fact 6)
    720
OCaml:
    # let rec fact = function
        | 1 -> 1
        | n when n > 1 -> n * (fact (n - 1))
      in fact 6;;
    - : int = 720
kubb 5 hours ago [-]
Whenever someone complains about not being able to use a slightly different syntax, I assume they just don't have any neuroplasticity anymore.
drob518 15 minutes ago [-]
I think syntax matches with our brains or not. I think anyone is capable of learning any syntax. The question is whether they want to. At some level, programming is art.
zelphirkalt 5 hours ago [-]
From my limited SMLNJ experience I think for something as simple as factorial, it is nearly the same. Both have TCO, recursion, inner functions, pattern matching and those good things. You can structure the code the same way.
ngruhn 4 hours ago [-]
Haskell:
    fac n = product [1 .. n]
floxy 3 hours ago [-]
Obligatory "The Evolution of a Haskell Programmer":
I mean, in Scheme it is longer to write. I enjoy Lisps and use Emacs for everything, but Haskell can be as terse, or even more terse. (Which is not always a good thing.)
the_af 2 hours ago [-]
> I tried some ML language once, it's difficult even to write a basic factorial example
What do you mean? It's one of the first things taught in any tutorial for the ML family or Haskell.
I've written a bit of Racket code (https://github.com/evdubs?tab=repositories&q=&type=&language...) and I still haven't written a macro. In only one case did I even think a macro would be useful: merging class member definitions to include both the type and the default value on the same line. It's sort of a shame that Racket, a Scheme with a much larger standard library and many great user-contributed libraries, has to deal with the Scheme/Lisp marketing of "you can build low level tools with macros" when it's more likely that Racket developers won't need to write macros since they're already written and part of the standard library.
> But the success of Parsec has filled Hackage with hundreds of bespoke DSLs for everything. One for parsing, one for XML, one for generating PDFs. Each is completely different, and each demands its own learning curve. Consider parsing XML, mutating it based on some JSON from a web API, and writing it to a PDF.
What a missed opportunity to preach another gospel of Lisp: s-expressions. XML and JSON are forms of data that are likely not native to the programming language you're using (the exception being JSON in JavaScript). What is better than XML or JSON? s-expressions. How do Lisp developers deal with XML and JSON? Convert it to s-expressions. What about defining data? Since you have s-expressions, you aren't limited to XML and JSON and you can instead use sorted maps for your data or use proper dates for your data; you don't need to fit everything into the array, hash, string, and float buckets as you would with JSON.
If you've been hearing about Lisp and you get turned off by all of this "you can build a DSL and use better macros" marketing, Racket has been a much more comfortable environment for a developer used to languages with large standard libraries like Java and C#.
As a common lisp developer, that is only very vaguely true for me.
The mapping I prefer for json<->Lisp is:
This falls out of my desire for the mapping to be bijective:- The only built-in type that is unambiguously a mapping type is hash-tabe.
- nil is the only value that is falsy in CL
- () is the same as nil, so we can't use it as an empty list; vectors are the obvious alternative
- Not really any obvious values left to use for "null" so punt to a keyword.
I've not written a Scheme macro since. I've written hundreds of Kernel operatives though.
I was also a typoholic previously, but am in remission now thanks to Kernel.
https://web.cs.wpi.edu/~jshutt/kernel.html
I thought the particular technology I was working in was "part of the problem", as I felt pigeon-holed by .NET and C# to always be a corporate-monkey CRUD consultant. So, I went out in search of something better. Different programming languages. Different environments. Just something that wasn't working for asshole clients who thought it was okay to yell at people about an outage in a hotel on the complete opposite side of the country that was more due to local radio interference than anything I had done in the database code that configured things. Long story involving missing a holiday with my family over something completely outside of my control and yet I still got blamed for it. The problem wasn't the technology, it was the company I was working for, but at that time in my life, I didn't understand the difference.
Racket was a life preserver at that time.
It's really hard to explain, because I never actually ended up working in Racket full-time and I haven't even touched it in probably 10 years. But it still has this impact on my identity as a software developer. I learned Racket. I forced myself out of being a Glub programmer and into someone who saw the strings that underwrote The Universe. The beauty of S-Expressions and syntactic forms and code-is-data and all that. It had a permanent impact on my view of what this job could be.
I still work primarily in .NET. Most of the things that were technological issues about .NET Framework got absolved by what was first .NET Core and what is now .NET. So, I no longer feel like my tools are holding me back. And I'll forever be thankful to Racket (and the community! The Racket listserve was amazing back then. Probably still is, I just don't interact with it anymore) for being there for me.
Edit: Haskell was in fact another language I explored at that time, in addition to Ocaml and Ruby and Python (ugh! Don't get me started on Python!) and many other things. They were all "cool" in their own way, but nothing felt like Racket. They all had their own weird rules that felt like being bossed at again. Racket felt like art. Racket felt like it was there for me, not the other way around.
[0] I still think of this time as the "mid-point" in my career, but it's now been long enough ago that I've been more past the crisis than I was ever in it. Strange feelings.
I see this mentioned often, and it sounds amazingly useful (especially the part about fixing in production!). But how truly widespread is it among the Lisp dialects to be able to connect to a running program, debug, and hotfix it? I understand Common Lisp has it, but I struggled to figure out how to do it in, say, Racket. Admittedly I'm am relatively inexperienced Lisp programmer, so maybe I wasn't looking in the right place or for the right words. Which Lisp dialects do indeed support the extreme version of this capability to inspect and edit running programs?
It's a shame that other scripting languages that theoretically have the capabilities to do this don't do this (looking at you, node! Chrome dev tools are fine but way too futzy compared to `import pdb; pdb.set_trace()` and "just" using stdin)
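For anyone who hasn't tried it, the Python workflow referenced above really is that small. A minimal sketch (the function and values here are made up for illustration):

```python
import pdb


def average(xs):
    """Toy function to demonstrate dropping into the debugger."""
    total = sum(xs)
    # Uncomment the next line to stop here in an interactive debugger
    # driven over stdin, with `xs` and `total` inspectable in scope:
    # pdb.set_trace()
    return total / len(xs)


print(average([1, 2, 3]))  # prints 2.0
```

Since Python 3.7 the built-in `breakpoint()` does the same thing without the import.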
I do also use Emacs, and with Emacs Lisp `trace-function` means you can very quickly get call traces in your running instance without having to pull out a debugger and the like. Not like you can't trace functions with `gdb` of course. But the lowered barrier to entry and the ability to do in-process debugging dynamically means you just have access to richer debugging tools from the outset.
I read an Erlang article saying that hot swapping is not actually very useful in production for various reasons, and that a blue-green deployment is preferred instead. Can't find the link atm. This was close: https://learnyousomeerlang.com/relups
Compare to this comment: https://news.ycombinator.com/item?id=42405168 (hot swaps for small patches and bugfixes; hard restarts for changes to data structures and the supervisor tree).
I was thinking the whole time, "this person would _love_ Clojure".
https://www.gnu.org/software/kawa/index.html
I am one of these people who cannot countenance a Lisp that doesn't have `syntax-case`.
Short article. Worth reading. But all I swallowed was this one sentence.
It's the syntax. If you like semicolons, that's why you like Pascal-like languages.
(I write Haskell professionally)
"Haskell: more elegant than Javascript and C++" would make a good promotional motto.
That's like bragging about being prettier than Danny Trejo.
[1] From Toward Safe, Flexible, and Efficient Software in Common Lisp at the European Lisp Symposium, "[Coalton] has been used for the past 5 or so years [...] first in quantum computing and now a serious defense application." https://youtu.be/xuSrsjqJN4M&t=9m14s
I agree with you, and furthermore: you made an excellent promotional comment for Coalton and CL; keep doing that, please. I have said many times here before that I did not like my time away from CL, and Coalton makes it even better.
In case you're into machine learning, I'm also building something similar - a tensor-first, native Clojure-like ML framework.
What really prevents people from writing in Haskell at a reasonable speed is the poor language design. Programming languages are supposed to aid in reading by emphasizing structure. It's important to emphasize that a particular group of "words" constitutes a function call, or a variable definition, or a type definition -- whatever the language has to offer.
Haskell is a word salad. Every line you read, you have to read multiple times, every time trying to guess the structure from the disconnected acronyms. It belongs to the "buffalo buffalo buffalo buffalo" gimmick family. This is a huge roadblock on the way to prototyping as well as any other activity that implies the ability to read code quickly. And then it's also spiced by the most bizarre indentation rules invented by men.
This is not at all a problem with eg. SML or Erlang, even though they are roughly in the same category of languages.
Haskell would've been a much better language if it had made its syntax more systematic and disallowed syntactic extensions such as the introduction of user-invented infix operators and the overloading of literals (heaven, why???), and had required parentheses around function arguments both for definition and for application. The execution model is great, the type system is great... but the surface, the front door to all these nice things the language has, is just some amateur-level nonsense.
* * *
As for the upsides of using languages from the Lisp family for practical problems... I don't find (syntax-rules ...) all that exciting. I understand this was an attempt to constrain the freedom given by Common Lisp macros, and I don't think it worked. I think it's clumsy and annoying to deal with. The very first time I tried to use it, I ran into its limitations, and that felt completely unjustified. To prototype, you want freedom of movement, not some pedantry that will stand in your way and demand you work around it somehow.
The absolute selling point, however, is SWANK. Instead of editing the source code, you are editing the program itself, which can be interacted with at points of your choosing. I don't know of any modern language that offers this kind of experience. Even back in the '80s, I think, this approach to programmers interacting with computers was common. At school, we had terminals with some variety of Basic, and it worked just like that: you typed the program and it instantly showed the effect of your changes. Then there was also Forth, which worked in a similar way: it felt like you were "talking" to the computer in a very organized and structured way, but in real time.
Most mainstream languages today sprouted from the idea of batch jobs, where the programmer isn't at the keyboard when the program runs. They came with the need to anticipate and protect the programmer from every minor mistake they might've easily detected and fixed during an interactive session far, far in advance.
Whenever I think about writing in C, or Rust, or Haskell, I imagine being tasked with going to the grocery blindfolded: I'd need to memorize the number of steps, the turns, predict the traffic, have canned strategies for what to do when potatoes go on sale... I deeply regret that programming evolved using this evolution path, and our idea of what it means to program is, mostly, the skill of guessing the impossible to predict future, instead of learning to react to the events as they unfold.
I couldn't disagree more. Yes, there is more upfront work in understanding Haskell code. But it's very dense: once you understand the patterns, you can read it much more quickly. Just like map/filter/fold are harder to understand than a for-loop, but once you do, you can immediately see what kind of iteration is applied. A for-loop can do all kinds of crazy index manipulation that you always have to digest from scratch.
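To make the point concrete, here is a hypothetical one-liner; once you know the combinators, the shape of the iteration (filter, then transform, then accumulate) is visible at a glance:

```haskell
-- Sum of the squares of the even elements:
sumOfEvenSquares :: [Int] -> Int
sumOfEvenSquares = sum . map (^ 2) . filter even

main :: IO ()
main = print (sumOfEvenSquares [1 .. 5])  -- 2^2 + 4^2, prints 20
```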
> And then it's also spiced by the most bizarre indentation rules invented by men.
Again, quite surprised by this criticism. The rule is extremely simple: inner expressions must be indented more. You're free to decide by how much; that's why there are many "styles" out there. Maybe that's what you mean by bizarre. But it's not like the language is forcing weird constraints on you; if anything, the constraints are too lax. Any other language with non-mandatory indentation allows that as well. In general, I really don't understand why more languages don't use mandatory indentation. You only need curly braces and semicolons if you want the option to write a whole if/else/while/... statement on one line. But nobody does that.
Not to support the parent comment, which I disagree with, but if you use multi-line let-bindings, those require that you indent not just more than the previous line, but exactly as much as the first token after the let keyword on the previous line. It's a very strange rule, all the more surprising because it's inconsistent even with the rest of the language. It is totally avoidable if you, like I think most experienced Haskellers do, just prefer ‘where’, but people more familiar with procedural code usually lean into using ‘let’ everywhere because it feels more familiar.
I think the strange indentation used to be required in more places - I vaguely remember running into it a lot more when I started with Haskell 20 years ago, but that was also just when I was new to the language. These days I just keep ‘let’ to a bare minimum, so it doesn’t bother me. One thing that made Elm frustrating was that it disallowed ‘where’ clauses, forcing you to deal with this weird edge case all the time.
This is fine:
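A minimal sketch of the alignment rule (the names here are made up): every binding after the first must start in exactly the same column as the first binding; one column off in either direction is a parse error.

```haskell
-- OK: `y` lines up exactly with `x`,
-- the first token after `let`:
f :: Double -> Double
f r = let x = 2 * r
          y = x + 1
      in x + y

main :: IO ()
main = print (f 10)  -- prints 41.0
```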
So, it is possible to land on sane indentation, but the parser is much pickier than, e.g., Python's off-side rule, so it takes some trial and error for new users to find it, and it can be frustrating if you're just temporarily modifying an expression to quickly try something out. I honestly think it would be less surprising if the parser just disallowed writing the first binding on the same line as the `let` entirely, treating it as a block.
Are you mixing tabs and spaces? Maybe an example here would help.
> overloading of literals (heaven, why???)
No, this is important, so that default strings don't have to be something crummy. Even C++ got on this bandwagon.
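Concretely, this is what literal overloading buys; a minimal sketch using the standard `text` package:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Without OverloadedStrings, "hello, world" could only be a [Char];
-- with it, the same literal can stand for a packed Text value:
greeting :: T.Text
greeting = T.toUpper "hello, world"

main :: IO ()
main = putStrLn (T.unpack greeting)  -- prints HELLO, WORLD
```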
> and requiring parentheses around function arguments both for definition and for application.
??? Again, an example would be helpful. Usually the complaint with Haskell is that people don't use enough parentheses.
> The execution model is great
...I thought lazy evaluation was widely agreed to be the worst part of Haskell.
I always have fun with both: S9 Scheme for quick fun (it has Unix sockets and ncurses :D) and Chicken Scheme for completeness (R5RS/R7RS-small + modules).
Oh, and, well, Forth too, but more as a puzzle (although it shines at teaching you that you can do a lot with just fixed-point arithmetic). Hint: write helpers for rationals (a/b, where a is an integer and b a non-zero integer) and for complex numbers, by placing two items on the stack for each value (for the rational helpers you need four: a/b [+-*/] c/d).
You can have a look at qcomplex.tcl (either online or installed) as an example of how this can work even under JimTCL itself, just by sourcing that file. Magic: complex numbers under jimsh, thanks to the algebraic properties. You can implement the same for yourself in some Forths, even under EForth for Muxleq. Useless? It depends; on an ESP32 it can be damn fast, faster than MicroPython.
Racket:
OCaml: https://people.willamette.edu/~fruehr/haskell/evolution.html
I mean, in Scheme it is longer to write. I enjoy Lisps and use Emacs for everything, but Haskell can be as terse, or even more terse. (Which is not always a good thing.)
What do you mean? It's one of the first things taught in any tutorial for the ML family or Haskell.