which programming resource changed your career?
79 points by paulocuambe
For me it was reading Operating Systems: Three Easy Pieces. The number of aha moments I had, and the confidence to talk about and tackle even more complex projects after I read this book, was night and day.
I wonder if you guys can also pinpoint a moment when you learned something that changed your career or programming life?
Lobsters, actually!
Tell us why? :)
I learn often from the articles on this site, but more importantly Lobsters is a resource for me to see how other programmers think. I am still pretty fresh out of university and I work in a software development position without any peer programmers, so Lobsters is some of the only exposure I have to other programmers.
Parse, don't validate by Alexis King. It set me off on a path (or at the very least gave me a kick down a path I was already ambling down) that's helped me move into some very interesting work.
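For anyone who hasn't read it: the core move is to check an invariant once and record the result in a type, so downstream code can never receive unvalidated data. A minimal sketch of the idea in Python (King's own examples are in Haskell; these names are mine):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonEmpty:
    """A list proven non-empty at construction time."""
    first: int
    rest: list

def parse_non_empty(xs: list) -> NonEmpty:
    # Parsing: check the invariant once, then encode it in the type.
    if not xs:
        raise ValueError("empty list")
    return NonEmpty(xs[0], xs[1:])

def head(xs: NonEmpty) -> int:
    # No re-validation needed: the type guarantees an element exists.
    return xs.first
```

Anything downstream that takes a NonEmpty never has to re-check; the illegal state is simply unrepresentable.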
I always fluff Forth in these threads so I'll switch it up: during the hour-and-a-half commute each way to and from my first job, I read The Design and Implementation of the 4.4 BSD Operating System. The fundamental knowledge I gained reading that book completely changed my understanding of what my operating system actually was, what it was doing, how it was programmed, and the patterns and software design underlying it. If you don't know how your OS works, I think it's a good introduction to the subject.
Oh also to buff this suggestion, I got it personally from John Carmack over Twitter in 2019. His other recommendations (all awesome) were:
Computer Architecture: A Quantitative Approach
The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition
You reminded me of his 2012 blog post on applying functional style in C++. That was probably my earliest clue to the next decade and change of functional style and language features finding homes in the OO world, and to our ever-growing aversion to shared mutable state.
Rereading it, here's a quote I can appreciate better with experience:
Most developers are not very good at predicting the future time integrated suffering their changes will result in.
I think there were three:
Elements of Clojure is so good and rhymes with Ousterhout's advice in "A Philosophy of Software Design." Slight pity for Zach's sake that it was marketed as Clojure-specific but it's damn good anyway.
It's a pity that Clojure picked up some of the feelings associated with Lisps, as I think it was overshadowed by them. It seemed like a well-designed language by and for software engineers, with good practical tradeoffs for interoperability with Java. It was not a computer scientist's tool, as many Lisps are connoted to be (even if that connotation is rarely explicit and often wrong).
As a teenager,
My doctoral-level probability and stochastic processes course was career-changing, but it was not a programming resource, and not really a thing I can point to other than to say: go take the measure-theory-based graduate course in real analysis, followed by the measure-theory-based probability course, especially if it's taught by an old mathematical physicist.
Ullman's 'Elements of ML Programming'
This book is very high on my list, too. It changed my programming style from statement-based to expression-based.
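A toy contrast of the two styles (my own example, not from the book):

```python
# Statement-based: mutate an accumulator step by step.
def classify_stmt(n):
    result = None
    if n < 0:
        result = "negative"
    elif n == 0:
        result = "zero"
    else:
        result = "positive"
    return result

# Expression-based: the whole body is a single expression
# that denotes a value, in the ML spirit.
def classify_expr(n):
    return "negative" if n < 0 else ("zero" if n == 0 else "positive")
```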
I remember talking to a former Microsoft fellow. I told him I read that as a teenager and he said, "I am so envious. I didn't find that book until I was in my thirties."
One day, a friend told me about the Entity Component System architecture, and that led me into data-oriented design and specifically Mike Acton's Data-Oriented Design and C++ talk. This very much split my career in two: before this talk, when I was programming and even doing performance work without really thinking about what the computer was actually doing when it ran my code; and after this talk, where zen is all, all is zen.
To feel the Shannon entropy in every bit, in every read he makes. That is the way of the programmer.[1]
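For anyone who hasn't watched the talk, the central trick is laying data out for the access pattern instead of the object model. A tiny sketch of the array-of-structs vs. struct-of-arrays distinction (in Python for brevity, though the talk is about C++, where the cache effects actually bite):

```python
# Array-of-structs: each entity is an object; updating one field
# drags the whole record (and object header) through the cache.
class Particle:
    def __init__(self, x, vx):
        self.x, self.vx = x, vx

def step_aos(particles, dt):
    for p in particles:
        p.x += p.vx * dt

# Struct-of-arrays: the fields the hot loop touches sit contiguously,
# which is what the hardware actually rewards.
def step_soa(xs, vxs, dt):
    for i in range(len(xs)):
        xs[i] += vxs[i] * dt
```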
This talk (and the general concept) were going to be my entry. It's not even one of the main focus points of the presentation, but what stuck with me was Acton's (slightly incredulous) response that of course, if your input data has significantly changed, you should rewrite your program rather than have already made something generic enough to handle both. If your input data has changed that much, you're solving a different problem!
MIT’s missing semester has been very important to me, if only because I read it at exactly the right time.
Books that made certain things click in a way they never had before for me:
…and tons of others. But each one of those expanded my mind in a way that made me a better programmer.
I’ve got the XINU book, second edition for BB and Galileo; I’m surprised it doesn’t come up more in these discussions. I’ve also got the second volume, dealing with networking and drivers. I haven’t managed to finish either, but I’m impressed with both.
Solaris Internals ... by Richard McDougall : I worked with Richard at Sun and since the discussion about writing books was at the top of the list recently, I was reminded of his comment that writing books is a thankless job.
Hah! Well if you see him, please tell him that Solaris Internals’ description of kernel multithreading and the VFS/vnode layer were good enough that I would go sit in my car and read them on my lunch break in 2001 or so.
Computer magazines in the 80s and 90s!
I believe we got our C64 when I was 4, and I reckon I wrote my first decent programs 12 years later with Blitz Basic on the Amiga. So yes, learning to program is difficult; it likely took me over a decade!
It's likely my subjectivity, but I actually think programming was more accessible in many ways those days. Although I have colleagues who are 20 years older than me that are way more advanced than I was at their age, so I don't know!
Yup. Computer magazines in the 80s were so influential.
But the one thing I recall more than anything was this poster: Beagle Bros' Peeks, Pokes, and Pointers
The Elements of Computing Systems by Noam Nisan and Shimon Schocken. This book took all of the mystery out of what exactly a computer is, from soup to nuts. If you've ever wanted to build an ALU from exclusively nand gates, this book is for you.
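If you want a taste of those early chapters, here's roughly what the first exercises feel like, sketched in Python (the book uses its own HDL; this gate ladder is my own rendering):

```python
# Everything in the book's hardware chapters is built from this one gate.
def NAND(a, b):
    return 0 if (a and b) else 1

# Derived gates, exactly the shape of the early exercises:
def NOT(a):      return NAND(a, a)
def AND(a, b):   return NOT(NAND(a, b))
def OR(a, b):    return NAND(NOT(a), NOT(b))
def XOR(a, b):   return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    # Sum bit and carry bit: one rung up the ladder toward the ALU.
    return XOR(a, b), AND(a, b)
```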
Specifying Systems by Leslie Lamport. This book made software behavior, that is, what software actually does during its execution, click for me more than any other.
Concrete Semantics by Tobias Nipkow and Gerwin Klein. Software is generally not defined as a state machine though, and this book finally made me grok programming language design and semantics. Doing it through the lens of using a proof assistant for all of the definitions and theorems is a huge plus.
Designing Data-Intensive Applications by Martin Kleppmann. This one made me really understand the general principles and challenges of modern web application design. Putting replication, sharding, consensus, transactions, etc., all in one book gives an amazing bird's-eye view of the kinds of systems I mostly work on.
Database Internals by Alex Petrov. Databases are preeminently important, and I happen to work on a DB observability product so understanding what DB engines are doing at the lowest level has been very important to me.
And, lastly, this blog post has had the biggest impact on my life in the last year or so: Formal Methods Only Solve Half My Problems by Marc Brooker. I seriously have not read a more impactful post in recent memory for me personally. Before this, I was almost entirely focusing on system correctness, namely functional correctness. I was bullish on the idea that testing and verification could substantially increase the quality of software overall. This is still true, but only presents half the picture: correctness requires reliability, and reliability has much more to do with system load, performance, and the ability of a system to adapt to big changes in these.
This post put the nail in the coffin on that point for me: non-functional requirements like performance and elasticity are a huge part of system reliability, and an unreliable system cannot be correct by definition. The ideas suggested there, namely simulation, are pretty much all I've been thinking about since.
Marc is excellent. Code Only Says What It Does was an eye-opening read for me about how it is insufficient for code to be merely correct.
I had a very weird introduction to programming: while my first programming languages were technically GW-BASIC and QBASIC, I was beyond lucky to accidentally stumble onto both Squeak Smalltalk and Delphi 2 right at the same time that I was ready to tackle "real" stuff. The first showed me what a truly interactive development experience could achieve, and the latter showed me how efficient, drag-and-drop GUI development wasn't at odds with performance or access to native hardware.
The downside to this, of course, is that I've basically never been genuinely happy with my tooling since then. Smalltalk has a lot of problems, but it's one of maybe two languages I'm aware of where you can write an entire program by writing the code you wish worked, and then recursively fleshing out the rest in the debugger, in real time, such that your program retroactively Just Worked when you're done. Delphi/Turbo Pascal likewise have their issues, but I'd take the overall feel any day of the week over Jetpack Compose/Swift UI, let alone something like Gtk Blueprints or the like. (Qt comes really freaking close, if you're willing to deal with C++ or Python, which I'm really not these days, but I do want to acknowledge that.)
But the plus side is that I've had these shining stars of what should be possible that has powered a lot of my career. Kiln ultimately failed as a product, but we had a lot of source control features that were inspired by what Smalltalk's DVCSes could do; I wasn't beholden to the models offered by Git/Mercurial/Subversion. I used Smalltalk for experimentation and prototyping up until the mid 2010s, too, which made it possible for me to rip through a pile of ideas very quickly, even if I knew none of the code would survive. That helped a ton with validating approaches and rejecting bad ideas. And while I long ago put away Delphi, I was still using Free Pascal/Lazarus to explore possible UI designs up until a few years ago, when the web finally ate things enough that using Lazarus for prototyping stopped making sense.
Exception Driven Development! I heard about this style of programming from the gentlemen at GemTalk (née GemStone) back when they were working on MagLev (Ruby on their Smalltalk VM).
M A Jackson, Principles of Program Design (1975) -- read after 12 years of programming as an employee; the following year I went independent.
That’s a real career change.
What was the main lesson you took away from the book? What in it motivated you to work independently?
For actually writing code, probably Partial Evaluation and Automatic Program Generation, 1993, by Jones, Gomard, and Sestoft, commonly known as The Book. This book fundamentally changed how I think of compilers and opened up a path to working with high-performance compilation schemes. It hasn't made me any more money, but it's allowed me to pursue research that is otherwise out of reach.
It's not about writing code, but An Introduction to Gödel's Theorems, 2007, by Smith, is the book that allowed me to fully understand the arithmetic portions of Gödel's theorem and the roots of computability theory. There's a direct line from this to being able to tell bosses "No, because of Rice's theorem."
Partial Evaluation and Automatic Program Generation, 1993, by Jones, Gomard, and Sestoft
I also love this, though it didn't influence my career that much.
Another book of Sestoft's, Programming Language Concepts, is also good. I still refer to that book sometimes because it has the most approachable introduction to HM type inference that uses levels and union-find that I know. (another introduction to the topic is here)
Excellent question! Stuff which I think materially affected direction of path, and not just velocity:
(For the reference, less specific list of “good stuff”: https://matklad.github.io/2023/08/06/fantastic-learning-resources.html, https://matklad.github.io/links.html)
I'm in a book-listing mood:
The direct career impact award: Haskell Programming from First Principles. I ground through it in its entirety on my work commute, and it finally made Haskell click for me. That got me into my first Haskell job about 10 years ago, and I've been lucky enough to hold down a Haskell job somewhere ever since.
The shaped my aesthetic sense of software award: The Art of Unix Programming. Open standards, small sharp tools that do individual things well, simple transport formats to connect components. FP and the Unix philosophy are compatible with each other, and while this book doesn't say everything or get it all right, it gets enough right in a small number of pages that its overall insight:page ratio was very good for my younger self.
The changed how I wrote software award: Refactoring: Improving the Design of Existing Code. Prefer the first edition — since typed languages are back in fashion thanks to TS, its use of Java is much more relevant than the later edition's Ruby. Steve Yegge has it on his list of ten great books with the remark, "when I read this book for the first time, in October 2003, I felt this horrid cold feeling, the way you might feel if you just realized you've been coming to work for 5 years with your pants down around your ankles". Refactoring teaches you how to transform code without changing behaviour. This is tremendously useful for unpicking knotty functions without breaking anything and for getting a toe-hold on understanding complex subsystems. When you get on a roll it feels like the code becomes soft clay in your hands, ready to be reshaped into something better.
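To make "transform code without changing behaviour" concrete, here's a toy Extract Function, one of the book's bread-and-butter moves, in Python (the book's editions use Java and Ruby; this example is mine):

```python
# Before: one function mixing the total calculation with formatting.
def invoice_before(items):
    total = sum(qty * price for qty, price in items)
    return f"Total: {total:.2f}"

# After Extract Function: the calculation gets a name of its own.
# Behaviour is identical; the structure is now reusable and testable.
def calc_total(items):
    return sum(qty * price for qty, price in items)

def invoice_after(items):
    return f"Total: {calc_total(items):.2f}"
```

The discipline is that each such step is small and provably behaviour-preserving, which is what lets you keep a complex system working while you reshape it.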
The absolute work of art award: Crafting Interpreters. Every line of code works. Every diagram is beautiful. Everything is explained beautifully. This level of care is worth aspiring to in more things.
Handmade Hero.
It jump-started me early in my programming journey, at a time when I had just been learning Python for a few months. It introduced me to the fundamentals of systems programming, memory management, debugging, profiling, multithreading, and graphics.
As soon as I got to the point where I had an OS window that I could draw pixels into, I was addicted.
Most importantly, it taught me not to fear technical topics I didn't understand, but to take joy in exploring and learning how things work under the hood.
My intro to programming was Java and the OO paradigm. People used to tell me that the mental models required to learn OO made sense because they mimic real life, even though they never clicked for me. Learn You a Haskell was my first step into the functional programming world, and the knowledge gained from it felt like a superpower with far more application than any of the Gang of Four design patterns.
And SICP which made me see programming as art.
I was working at Fon as an R&D engineer at the time. I used to code, but only for our PoCs at the company, mostly Python, JS, some C for embedded stuff... Basically anything mainstream we could write/read, but not at a very good level. My programs worked, but I didn't really have a deep understanding of what I was doing. I'm a telecom engineer, not a computer scientist or a software engineer; I was more focused on the goal than on the process, so to speak.
I met a few people in the fediverse when Mastodon was starting. Many of them contributed to this, but one guy encouraged me to learn Clojure when I was leaving the company to become a freelance R&D guy, and I didn't know very well what I was going to do. I was young and I was afraid.
I bought "Clojure for the brave and true". I followed the book and started to dig further into functional programming, Lisps, programming language design... It was great for me to make simple PoCs that were very modular and easy to read, but that wasn't the real change in my career. In fact, I don't write Clojure anymore. Instead, it somehow made me understand that I could get involved in programming language design and implementation and that's what my career has led me to. I started to dig in Scheme and almost 10 years later I'm working in GNU Mes and I contribute to GNU Guix and Guile.
I think that was one of the many things that sparked the process but I wanted to mention it because the guy in question, Esteban Manchado, passed away a few years ago, and I didn't have the chance to thank him.
He was a nice guy, and I miss him.
EDIT: Somehow, I just wanted to say that it's people who changed my approach to things, and tuning in to their interests. I know it's not a great answer, but it's cool to see how one thing led me somewhere else that has almost nothing to do with the original.
https://haskellbook.com/ without hesitation. The pedagogical work of Julie Moronuki changed me for the best.
I tried learning programming with some C++ books around 2001 and failed. But I succeeded on the second attempt when building web pages with PHP a year later.
Since I had only an expensive dial up connection via modem, the offline copies of the PHP reference manual for their standard library (including the very useful examples) and SelfHTML (German HTML/CSS/JS documentation and tutorials) were the things that helped me most.
In chronological order:
Not a programming book but "Hackers and Painters" was translated to my native language when I was at a crossroads, and it somehow influenced me into choosing CS instead of another engineering or physics.
It also made me learn Common Lisp and Scheme.
Land of Lisp is probably not a great book, but it was just released when I was getting into Common Lisp, and the alternatives weren't any more approachable (e.g. Practical Common Lisp).
Most importantly this book made me interested in Haskell because it had a page or two where it speaks highly of Haskell.
Real World Haskell is probably what made programming fun for me. I knew Python and Java at the time and didn't want to write anything else after this book. This started my Haskell career which took about 8 years.
Structure and Interpretation of Computer Programs was probably the second most influential on my career development, after Hackers and Painters. I started getting interested in PL implementations with this book.
There are more that I can mention but after SICP I don't think anything influenced me that much. I've decided to do PL with SICP and more than a decade later I'm still working on languages :-)
Computer Systems: A Programmer’s Perspective, 3rd ed. (x86-64): appreciate the focus on low-level details and representation. Great exercises.
Essentials of Programming Languages - Friedman, Wand, Haynes : this was my first intro to scheme, very accessible but still challenging. If you’ve never managed to make it though SICP, try this one.
For me it wasn't a book. I was given a Cray XMT supercomputer to work on, which uses neither CPUs nor cache. It changed the way I think about software scalability in a way that defined my career.
The Cray XMT is an exotic silicon architecture. The programming model is based on deep latency hiding, designing dynamic schedules for up to a million concurrent operations. Things that are expensive on a CPU, such as spawning threads or using locks for concurrency control are essentially free, consuming no memory or CPU because they are primitives of the low-level silicon architecture. If you could reason about the design of complex schedules, it was the most efficient and scalable general-purpose compute model I've ever used.
It made me wonder if the same schedule-centric model could be implemented in software on CPUs. Some deterministic operations on the XMT could only be done by inference of state on a CPU but in principle you could design comparable mechanics on a CPU at coarser granularity.
I built a runtime that could compile XMT code on x86 and simulate some of the silicon behaviors. Instant 10x boost in scalability on x86 compared to a conventional implementation. Shortly thereafter I designed a database kernel based on the same ideas which proved wildly effective. I've been doing it ever since.
Every old programmer has a few main tricks that they use over and over to great effect. Treating every high-scale/high-performance software system as a "design a giant scheduler" problem is one of mine, thanks to the Cray XMT.
Can you elaborate a bit more about what "design a giant scheduler" looks like?
After reading a bit about the architecture, what I'm envisioning is that you take into account the latencies of operations, and you basically keep working on one thing while another is waiting, by properly scheduling operations so that there is no downtime. Are you doing something similar with normal hardware? How does this compare to something like an async runtime? Is it that you know the task at hand, so your scheduling is a lot more accurate and thus more performant? I guess async can still be memory-starved, and your scheduler takes memory latencies into account as well?
This is directionally correct. It is fundamentally an async model without an async runtime.
The first step is disabling as many existing system schedulers (implicit or explicit) as you can. In other words, pinning a single thread per core, static allocation at startup, bypassing the kernel, doing direct I/O into userspace, etc. You eliminate delegation of decisions about when and how things are done to the extent practical. It is imperfect but you can get close enough that it works well. To your point, you need to have an accurate model of what every operation will cost at runtime even if that cost varies dynamically.
The next step is to reimplement all of this functionality, taking advantage of now having a precise and modelable global view of how the code across all these subsystems interacts. A primary objective is to eliminate all blocking-like behavior, notably shared resource contention (e.g. locking), cache misses, and resource exhaustion.
In latency-hiding hardware, these objectives are achieved by eliminating the cost. On the Cray XMT, the silicon can detect these situations for any thread of execution and context switch every clock cycle. Blocking is free as long as there is some subset of immediately executable threads that it can switch to.
Nothing like this is possible in software. Instead, you design the software to anticipate future execution that will lead to a blocking state and dynamically rewrite the execution order at a granular level to minimize the probability that these future states occur. Ensuring this pervades the code design; it isn't like there is a separate "scheduler" module that hovers over otherwise normal code.
The caveat is that reasoning about the design of singular schedulers across complex systems is unreasonably difficult. It is an NP-Hard problem generally and the anticipation and rewriting mechanics must be extremely cheap to be useful.
My early attempts worked but were pretty rough. I've had time to iterate and refine design patterns over many systems such that I've been able to distill principles and idioms for most aspects.
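A toy Python sketch of the latency-hiding shape; nothing like the real mechanics described above, but it shows the round-robin switch-on-stall idea that the XMT does in silicon (tasks, yield points, and the trace format are all my own invention):

```python
from collections import deque

def task(name, steps):
    # A task yields at each point where real code would stall
    # (a cache miss, a lock, an I/O wait); yielding is our "free"
    # context switch, imitating the XMT's per-cycle thread switch.
    for i in range(steps):
        yield f"{name}:{i}"

def run(tasks):
    # Round-robin over all runnable tasks: whenever one would stall,
    # another is immediately executed, so the core never idles.
    trace, queue = [], deque(tasks)
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))
            queue.append(t)
        except StopIteration:
            pass
    return trace
```

The hard part the parent describes is doing this with an accurate cost model and reordering ahead of time so the stalls mostly never happen, which this sketch does not attempt.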
Software Foundations is pretty sweet and really helped me catch up on theory as I was moving into formal methods / PLT related work.
The first book was Writing Solid Code. While it was C-centric (and from Microsoft no less), it did fundamentally change how I approach API design and introduced me to "design-by-contract" (even though that isn't mentioned at all in the text) where I almost never have to check for NULL in my code.
The second book was Thinking Forth. While this was Forth-centric, it fundamentally changed how I design programs. Stuff like prefer table-driven over code logic, don't check for the same condition twice, etc.
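A minimal illustration of the table-driven point, sketched in Python (Thinking Forth's examples are, of course, in Forth, and the handler names here are mine):

```python
def do_start():  return "started"
def do_stop():   return "stopped"
def do_status(): return "ok"

# Code-logic version: every new command grows the branch ladder.
def dispatch_if(cmd):
    if cmd == "start":
        return do_start()
    elif cmd == "stop":
        return do_stop()
    elif cmd == "status":
        return do_status()
    raise KeyError(cmd)

# Table-driven version: adding a command is adding one row.
HANDLERS = {"start": do_start, "stop": do_stop, "status": do_status}

def dispatch(cmd):
    return HANDLERS[cmd]()
```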
Working Effectively with Legacy Code by Michael Feathers about 2009 or so. Helped me go from I can write code to I can find problems, test them, fix them know that they're fixed, with a solid process behind things. Well worth a read for early career people.
Don't have a "real" career to speak of but Crafting Interpreters really made a lot of things click and got me much more invested into programming language implementation and compiler development.
Computer Networking: A Top-Down Approach. Knowing approximately how TCP/IP works is a DevOps superpower.
"Code Complete" - When I read this way back in college, it was the only book I had read that covered practical coding issues like API design, code formatting, commenting, and best practices for software construction. The analogy between physical construction and software construction really resonated with me as well. The other books I had read up to that time had focused entirely on language details, or the nitty gritty of algorithms and techniques. I feel like I write better, easier to maintain software because of this book.
"The Pragmatic Programmer" - I read this one early in my career as well, and I feel that most of the advice, especially around using automation and learning your tools well has really paid off. I feel like I'm genuinely more productive than many of my coworkers because I've followed the advice here.
Introduction to Computing Systems: From Bits & Gates to C/C++ & Beyond - I really appreciated going from quartz crystal oscillator to NOR gates to eventually ASCII, assembly, and C. At the time, I hadn't ever mapped each step in that sequence and still have really enjoyed revisiting the material later. In terms of how it changed my career, it removed some mystery of low level systems programming and made me realize that I could start focusing on systems level languages and dig into C code bases or assembly or whatever else, if needed, given some time and patience to learn how it works.
By time of reading (from first) and by impact:
And apart from that, it was great for soft skills: Behave: The Biology of Humans at Our Best and Worst by Robert Sapolsky
Programming Perl (3rd edition) is the book that really got me started in the early 2000s. Beej's Guide to Network Programming had an influence early on as well.
Later on the Minix (raccoon) Book is the one that really got me into systems stuff.
Moving to an always-on internet connection instead of paying per minute :)
Wouldn't have really called it a career at that point, but I was being paid to write software in school before graduating and it was just the tail end of dialup.
Back in 2017 my career took a turn towards security; Ross Anderson's Security Engineering was the best introduction to "Security Thinking" for me at the time... I should go read it again.
If I were to answer this by putting everything on a bell curve of impact, I would definitely put the following on the extreme right-hand side. I measure how impactful something I've read was by how many times I revisited it, or how many times I was looking at something and had a light-bulb moment.
There are tons more, but most recently the Amazon paper Using Lightweight Formal Methods to Validate a Key-Value Storage Node in Amazon S3 comes to mind, because it was the first time I was convinced that maybe I can get away with not learning TLA+ this year, again.
Pragmatic programmer, SICP, Software Abstractions from Daniel Jackson, Programming Languages Applications and Implementations (PLAI)… I think the biggest Aha moments were studying languages really…
Cursor with Figma Make. I can actually think about designing without spending weeks getting frustrated with designer tools, because I am not a designer. I just wanna build some cool stuff and see if it works.
I can think of a few resources. In rough chronological order ...
These three got me started:
Then, once started, these five helped ...
There are a few management classics that really helped with that phase of my career, too, but are OT for programming.
Rich Hickey's talks, but most prominently the two:
I don't use Clojure regularly and have never used it professionally, but the lessons these two talks teach have stuck around with me for a long time.
Reading Crafting Interpreters inspired me to leave my decade-long career as a web/backend developer and shift to systems programming. Now I work on parsers, compilers, etc.
Probably https://ferd.ca/queues-don-t-fix-overload.html and nearly all of his blogposts and talks.
It helped me put into words and actions the lessons I had learned in EE and port them into Software.
The Gentoo Handbook.
In my late teens, a friend told me that if I wanted to get real about Linux I should try Gentoo. That install took me two weeks (compiling a kernel took almost an hour on a Pentium II!), but it taught me a lot of concepts that have been useful over the next 20 years. It also pushed me toward CS courses, at a time when I was hesitating between wildly different career paths...
Thanks, gentooists !
Paul Graham's books on lisp made me actually like programming for itself, instead of viewing it as a tedious tool (my previous experience was a pretty old dialect of FORTRAN.)
Functional Programming in Scala by Paul Chiusano and Runar Bjarnason got me into the statically typed world, and helped me get my first FP job.
Rich Hickey and Zach Tellman's talks have all been really interesting and useful, even if I've never used Clojure for anything serious. I'd say Zach's talk On Abstraction is my favorite programming talk ever.
PyCharm opened my eyes to just how good IDEs could be. I still don't like Python, but it made working in big Python codebases a lot less painful.
IRC! I’ve met a lot of great friends there over the past two decades. Those friendships led to wonderful opportunities that I never would have otherwise received. I would highly recommend cultivating relationships. Programming communities on Discord have taken over IRC channels for me these days, but the principle is the same.
I still remember how amazed I felt ~30 years ago when I read Programming Perl; it was life-changing for me. Following this, the FreeBSD Handbook was also great, and it's wonderful that it's still an up-to-date and deep resource.
A less technical book, from which I learned a good deal about some of the whys of UNIX systems and which was very nice to read, was A Quarter Century of Unix.
I guess these really helped me decide my career rather than changing my career, but:
Those books, and "boot to BASIC", were amazing.
Ellis & Stroustrup: The Annotated C++ Reference Manual - Prior to it I didn't really understand what was going on with OO stuff, after it I had a good clue as to what all was going on in (that version) of C++. Having gotten my head around that stuff, moving on to things not covered in the ARM was a lot easier.
Nothing really. Sure I learned for my own sake and always enjoyed it. But I've found in the last decade or so that no one is interested in hiring someone who learns things. The only employees wanted are those who have already memorized random trivia. What you could learn is irrelevant because you are a fraud until it is proven that you've memorized a random selection of factoids first.
The Computer Hobbyist's Handbook by R.A. and J.W. Penfold was huge for me at a time when computing books were hard to come by (1980s Ireland).
As a professional / grown up: The Practice of Programming taught me a lot about a methodical approach to designing programs. I'm due a re-read.
Functional core, imperative shell was a big one when I was getting started.
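For anyone who hasn't run into the pattern: keep decisions in pure functions and push effects to the edge. A minimal Python sketch (the example and names are mine):

```python
# Functional core: a pure function from state to (output, new state).
# Trivially testable, no mocks needed.
def next_greeting(name, count):
    return f"Hello, {name}! (visit {count + 1})", count + 1

# Imperative shell: the only place where I/O and mutation live.
def greet_all(names):
    count, lines = 0, []
    for name in names:
        msg, count = next_greeting(name, count)
        lines.append(msg)   # in a real program: print / write / send
    return lines
```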
A philosophy of software design gave me much better ways of describing "good" and "bad" in things I review
In addition to books, I learned it was useful to read certain volumes of documentation cover to cover. The first one was Direct3D 8; I was just following my interest in graphics. Looking over the entire text taught me a) everything the authors thought was important, b) how things fit together in patterns, c) all kinds of odd details that later became creative options, d) where to find things I wasn't going to remember verbatim. This is bottom-up study and it's a great complement to a top-down guide and some practice. I've done this for Objective-C era iOS, Docker, Godot, all the manpages for /bin/*, lots of things. I did a halfassed job of it with the Common Lisp Hyperspec. Sometimes it takes days, but an axe works better if you sharpen it first.