Functional Programmers need to take a look at Zig
62 points by doyougnu
This was a fun read—thank you for writing it!
Monads are not some kind of obscure math-y thing that only the big brains think are necessary. No, instead monads are a fundamental abstract algebraic description of imperative programming as a computational context.
With kindness, I have to say: my eyes glazed over for a second over that last sentence.
I do agree, though, that a lot of simple concepts are locked up behind complex language. The interesting thing about Zig's IO interface (and about your analogy!) is that it makes the concept much more legible. Before reading your post, the only thing I knew about monads was that they're monoids in the category of endofunctors—i.e., nothing. But now I have a new reference point: oh, they're kind of like Zig's IO interface! I know interfaces. I know passing dependencies into functions. This makes sense.
The analogy only takes you so far, because under strict semantics you cannot compose programs without also executing their effects.
Edit: before people take me for a pointy-headed academic (I'm not), consider what Andrew wrote on Zig IO not being a monad.
they're kind of like Zig's IO interface!
I think I'd say it quite differently: monads allow you to do similar things to what Zig's IO interface does (i.e. make particular computational effects available at particular points of a program), but they're a much more general way of doing that. (Actually, as a Haskeller I prefer Zig's approach, and I'm trying to take Haskell down the less-explored branch of the design space: just passing stuff in.)
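A minimal sketch of the "just passing stuff in" idea, rendered in Rust terms (the `Io` type and the function names here are made up for illustration; this is not Zig's or Bluefin's actual API):

```rust
// A capability value granting the right to perform output. Effects are
// available only where a caller has passed this value in.
struct Io;

impl Io {
    fn print(&self, msg: &str) {
        println!("{msg}");
    }
}

// This function can perform output *only* because it receives `io`.
fn greet(io: &Io, name: &str) -> String {
    let greeting = format!("Hello, {name}!");
    io.print(&greeting);
    greeting
}

// This function takes no Io, so by convention it performs no I/O at all:
// the signature documents the effect, just as a monadic type would.
fn shout(name: &str) -> String {
    name.to_uppercase()
}

fn main() {
    let io = Io; // the capability is created once, at the entry point
    let g = greet(&io, &shout("world"));
    assert_eq!(g, "Hello, WORLD!");
}
```

The point of the sketch: "which effects can happen here" is answered by looking at the parameter list, not by inspecting a monadic type or an ambient runtime.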
I keep seeing effects everywhere, which I believe are just another flavor of monads. I don't want to avoid these terms, but my attention has a habit of sliding off the details.
I would say monads are one way of doing effects. You could do effects without monads (Koka is an example, I think).
I'm not completely sure what to take away from this article. There are a lot of functional patterns that have made their way into Zig, which is quite cool. I'm very glad to see tagged unions as a fundamental construct in a systems language. I'm glad to see a more structured form of staged compilation than a preprocessor, even if it's decoupled from macros. I'm glad to see a mainstream language like Zig taking a monadic approach to I/O. (Even if they don't think that they are!)
But what should functional programmers learn from Zig?
The benefits of Zig, as discussed here, are 1) no spooky action at a distance and 2) no garbage collector. But... I am really not convinced that either of these is that valuable to the average functional programmer? I find myself particularly unconvinced by the section on garbage collection. The primary benefit of a garbage collector is not in reducing noise, it's in making your program safe.
Zig is unsafe! It may add some guardrails and slightly nicer abstractions compared to C, but it makes no attempt to give any guarantees whatsoever about memory safety. It has first-class support for raw access to the hardware, sure, but so does Rust, and Rust gives you safety-by-default, zero-cost abstractions, and various other guarantees. Neither serious abstractions nor serious guarantees are possible in Zig due to its eschewing of the type system: even (especially) with comptime.
I really don't think that a language lacking a garbage collector is an advantage in 2026. Memory safety issues remain the no. 1 source of bugs. This is primarily a social problem: most memory safety issues are in languages that make it easy to have memory safety issues, and Zig makes it easy to have memory safety issues. If I understand the Zig allocator interface correctly, you're able to slot in your own GC if you'd really like to; but aside from Zig being unsafe-by-default, external GCs are seriously kneecapped in performance by not being able to perform the control flow analysis and subsequent optimizations that languages like Rust and Lean (aided by their type systems) are able to take advantage of. I don't think I buy the claim that CPU performance increases have made garbage collectors a poor tradeoff, either. The benefit of safety is absolute (IMO); and though I haven't looked into it, I expect garbage collection strategies to have evolved over the years as CPU clock speed vs. RAM access speed has become increasingly lopsided.
The cognitive cost of garbage collection, I might buy. I'd sooner blame tech consumerism, though.
But what should functional programmers learn from Zig?
Dunno about technical things, but I think there’s a meta thing about the kinds of compile-time language features that programmers find attractive, which I hope might provoke research into how this kind of approachable compile-time programming fits into the broader landscape, and how it might all be simplified.
At the moment there are a bunch of fairly disparate flavours:
macros, which might be lexical (C) or structural (Lisp) and more-or-less hygienic (Scheme / Rust)
phasing into staged compilation, macros with quasiquote (Lisp / Template Haskell)
type-level hackery of the Haskell / Rust variety where the type system is an esolang that’s secretly Prolog
C++ templates that are in a cursed halfway house between macrology and type hackery
dependent types, where the type-level language is the same functional language as the value-level language, but the compile-time vs run-time distinction is blurred
programming the optimizer with things like inline annotations or loop unroll hints
partial evaluation, optimize me harder
load-time evaluation in the dynamic linker, and static initialization before main()
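The "secretly Prolog" flavour above can be made concrete: a Rust where-clause is a set of goals the compiler must prove, resolved much like a Prolog query. A minimal sketch (the function name `first_twice` is made up for illustration):

```rust
// The bounds below are the "small logic program": to compile a call to this
// function, the compiler must prove Iterator(I) and Clone(Item(I)),
// exactly like discharging two Prolog goals.
fn first_twice<I>(mut it: I) -> Option<(I::Item, I::Item)>
where
    I: Iterator,
    I::Item: Clone,
{
    let x = it.next()?;
    Some((x.clone(), x))
}

fn main() {
    // Vec<i32>'s iterator satisfies both goals, so resolution succeeds.
    let pair = first_twice(vec![1, 2, 3].into_iter());
    assert_eq!(pair, Some((1, 1)));
}
```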
Zig’s interesting because it uses the same language at the type level and the value level, but it rejects the cultural trappings of dependent types. It gives the programmer control over when evaluation happens, but without the use/mention confusion of staged compilation.
I have a vague notion that it might be worth designing a language around partial evaluation as its compilation model. But I dunno how features like C++ constexpr or Zig comptime or inline for could be recast into a language with fewer simpler primitives in a way that isn’t horrible to use.
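For the "same language at both levels" point, Rust's const fn is a rough (and much weaker) analogue of Zig's comptime: the same function body can be evaluated at compile time or at run time, with the call site determining when. A hedged sketch:

```rust
// One function, usable in both phases. Inside a `const` item it runs in the
// compiler; in ordinary code it runs at runtime.
const fn factorial(n: u64) -> u64 {
    let mut acc = 1;
    let mut i = 2;
    while i <= n {
        acc *= i;
        i += 1;
    }
    acc
}

// Evaluated at compile time: the result is baked into the binary.
const FACT_10: u64 = factorial(10);

fn main() {
    // The very same function also runs at runtime on a dynamic input.
    let n = std::hint::black_box(5);
    assert_eq!(factorial(n), 120);
    assert_eq!(FACT_10, 3_628_800);
}
```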
where the type system is an esolang that’s secretly Prolog
I feel this
In Haskell, the type system is a language powerful enough that Prolog can be embedded in it. The value-level language is also powerful enough that Prolog can be embedded in it. In neither case do you have to embed Prolog. Some people do things along those lines; I program simply.
The other aspect that I as a functional programmer remain fairly unconvinced of is that "spooky action at a distance" is bad -- and that Zig's lack thereof is particularly innovative here. Scheme similarly has nominal structs/records, pattern matching, monads as a design pattern rather than a type, and passes around lambdas rather than having a magic typeclass dictionary. It does comptime as derived behaviour of define-syntax and syntax-rules / syntax-case, and most implementations ship with an error and syntax-error. Control flow is similarly local by convention. All you have is macros, data, functions, and continuations.
I guess I could be convinced about the UX. Scheme's macro invocations looking like functions is sometimes annoying. And not having keywords means that you can't tie them to particular "spooky action at a distance" operations... and I suppose Scheme's control flow is only local by convention, since you can do fancy tricks with continuations, and I also suppose Scheme's focus on syntax-rewriting macros probably does lead to worse UX for staged compilation.
And it definitely is neat to be able to implement a sort-of trait system atop comptime. But looking at it -- I can't help but think it'd be a lot neater with support for either traits / interfaces, or proper generics? They have the problem of action at a distance, for sure, but Rust gets rid of a fair amount of the "spooky" part by requiring coherency restrictions. But what mostly leaves me unconvinced of Zig's approach here is that it seems kind of hard to build abstractions in Zig. Maybe it's not too bad in practice. Maybe there's some magick around using structs as proper modules that a future post on higher-order functions will enlighten me on... but building and using abstractions is a core pillar of the functional paradigm in my opinion, and I don't see how to do it well in Zig.
(Also, don't mistake this all as Zig hate! I think it's a very effective C++ replacement. And I thought this was a really interesting perspective to read. I'm just not terribly convinced that I, as a functional programmer, have much of a reason to learn it over C (hardware) or Scheme (abstractions) or Rust (hardware and abstractions) -- I'm not sure what it would teach me.)
Agree with you that a) OP doesn't make a strong case for the claimed thesis b) the thesis isn't true, as far as I can tell. I do want to reply with a couple of things though:
[Zig] makes no attempt to give any guarantees whatsoever about memory safety
I would say that this is an incorrect characterization. Zig does provide spatial memory safety in ReleaseSafe mode, and some amount of temporal memory safety. It is, of course, far from Rust, but the distance to C is also substantial. It's unclear to me how the empirical observation about C/C++ codebases having 70% of vulnerabilities due to memory unsafety translates to Zig. Has anyone pwned (lib)ghostty? It should be a juicy target by now!
Regarding the title question, I think there are a couple of things Zig can teach about PLT (and there's ample overlap between PLT and FP). Things that I've learned personally:
The "dependent-types lite" cluster: I: Iterator, I::Item: Clone in Rust -- that's a small program in a logical programming language. Now I viscerally feel that it is a program.
The rest: I would have specified those types in Rust as well: even though it would work without types, it'd default to i32, and I do want unsigned here. When you start reading Zig code you see a bit of type noise around functions like std.mem.eql(u8, xs, ys), but that's actually a consequence of library design. It could have been just xs.eql(ys).
comptime is intentionally much, much weaker than macros. There's no way for a Zig program to reflect on its syntax or to generate new syntax. But it turns out you don't actually need to abstract over syntax for systems programming. Zig's metaprogramming is simultaneously significantly more restricted than Rust's combination of traits, const generics, and macros, and way, way more convenient in practice.
Those who like the idea of "passing Io into functions" might like my Haskell capabilities library, Bluefin, in particular the Bluefin.IO module. Bluefin's equivalent of Io is called IOE and you can't do IO in Bluefin without it!
Am I the only one finding Zig a bit "noisy" on the line? I mean
var http_client: std.http.Client = .{ .allocator = gpa, .io = io };
is too many dots. Syntax is subjective, sure - but if it didn't matter, Elixir wouldn't have been such a hit because Erlang would have been enough. I find Rust's syntax noisy as well. I think Go strikes a much more elegant balance.
I'm actually surprised I don't see this comment more. That was my first thought as well. Once I started writing it though, my brain internalized the idea that the "weird" .s were prefixed by an invisible type that's inferred. It took maybe 2 days of writing it before it literally didn't even register to me anymore ¯\_(ツ)_/¯
I guess in practice you could have two fewer dots.
var http_client: std.http.Client = .init(gpa, io);
And if you wanna get wild,
const std = @import("std");
const HttpClient = std.http.Client;
var http_client: HttpClient = .init(gpa, io);
The .init() pattern is fairly common in other structures. I'm not exactly sure why it's not available for the standard HTTP client here.
In any case, I think we can say that syntax is subjective, and still matters. It's just up to you to say in which ways it matters. So we'll always have iterations of many "Elixirs" in response to "Erlangs" because somebody wants to express something in a different way.
I think the ergonomics/readability matter. Just because you can represent something from $LANG1 in $LANG2 doesn't mean it will be easy to work with, be readable, or be idiomatic to the community. The ML family has ergonomics that make its style of programming the most idiomatic and readable. Compare, in OCaml:
let sum_list_rec lst =
  let rec loop acc = function
    | [] -> acc
    | x :: xs -> loop (acc + x) xs
  in
  loop 0 lst

let result_lst = sum_list_rec [1; 2; 3]

let sum_arr_iter arr =
  let acc = ref 0 in
  for i = 0 to (Array.length arr - 1) do
    acc := !acc + arr.(i)
  done;
  !acc

let result_arr = sum_arr_iter [|1; 2; 3|]
The ability to mutably iterate exists, but the ergonomics leans towards recursion, currying, etc. The things FP users want to do are going to be better suited in an FP-first language where features like tail recursion are often a given.
I loved reading the HN discussion where people argued with Andrew Kelley over whether Zig has monads now.
Regardless of whether they do or don't, I'm increasingly seeing the value in passing IO to functions, and I look forward to using it in a future project.
But as a Scheme enjoyer, I do find garbage collectors convenient.
Thank you for posting this! Zig's allocator interface is the thing that helped me understand what monads actually are in practice in other languages. It's a powerful pattern, and I've found it to help with program construction in general.
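As a sketch of the allocator-passing pattern in Rust terms (the `Arena` type here is a made-up toy for illustration, not Zig's std.mem.Allocator or any real Rust allocator API):

```rust
use std::ops::Range;

// A toy bump "allocator": hands out index ranges into one backing buffer.
// (Purely illustrative -- real Zig allocators return raw memory.)
struct Arena {
    buf: Vec<u8>,
}

impl Arena {
    fn new() -> Self {
        Arena { buf: Vec::new() }
    }

    // Allocate `n` zeroed bytes and return their range in the buffer.
    fn alloc(&mut self, n: usize) -> Range<usize> {
        let start = self.buf.len();
        self.buf.resize(start + n, 0);
        start..self.buf.len()
    }
}

// Like a Zig function taking an allocator parameter: the caller decides
// where memory comes from, and allocation is visible in the signature.
fn duplicate(arena: &mut Arena, data: &[u8]) -> Range<usize> {
    let r = arena.alloc(data.len());
    arena.buf[r.clone()].copy_from_slice(data);
    r
}

fn main() {
    let mut arena = Arena::new();
    let r = duplicate(&mut arena, b"hi");
    assert_eq!(&arena.buf[r], b"hi".as_slice());
}
```

The monad-flavoured observation is the same as with Io: a function that doesn't take an arena can't allocate, so the signature tells you which computational context the function needs.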