Use Python for Scripting
38 points by hyPiRion
Bash is my favorite programming language, but lately I write any non-trivial script in Python.
My main annoyance is that all my scripts start with a wrapper for subprocess.run that sets check=True by default and prints out the command to be run with shlex.join. This function grows a few more functionalities depending on the script.
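Roughly this shape, as a minimal sketch (the name run and echoing to stderr are just illustrative choices, not anything from the post):

    import shlex
    import subprocess
    import sys

    def run(cmd, **kwargs):
        """Echo the command, then run it with check=True by default."""
        print(f"+ {shlex.join(cmd)}", file=sys.stderr)
        kwargs.setdefault("check", True)
        return subprocess.run(cmd, **kwargs)

    run(["git", "status", "--short"])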
Another annoyance is that I have not found an equivalent I like for Bash's exec.
But pathlib.Path.(read|write)_(text|bytes), f-strings, tempfile.NamedTemporaryFile(delete_on_close=False) (very recent, I was basically using temporary directories for everything normally), textwrap.dedent with multiline strings, built-in json, tarfile, argparse, getpass, concurrent.futures' map, context managers... are very useful batteries without resorting to dependencies.
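A few of those batteries together, just as an illustration (delete_on_close needs Python 3.12+):

    import json
    import tempfile
    import textwrap
    from pathlib import Path

    # textwrap.dedent keeps multiline strings readable inside indented code.
    config = textwrap.dedent("""\
        [server]
        port = 8080
        """)
    Path("app.conf").write_text(config)

    # delete_on_close=False keeps the file around while it is closed, so it can
    # be read (or handed to another process) before the context manager exits.
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete_on_close=False) as tmp:
        json.dump({"ok": True}, tmp)
        tmp.close()
        print(Path(tmp.name).read_text())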
It's still not entirely mainstream to have tools that allow embedded dependencies, but at some point we'll get there, I guess.
I was tempted to have a short section on uv and embedded dependencies, but the problem with embedded dependencies is that it requires the network, you need to install uv (for now)... and because you don't need to pin a specific version of your dependency, not only may things just start to randomly break one day, you also introduce a potential security vulnerability. Lots of things you don't really want in a reliable build script -- which was my main focus in the post -- though for other scripts it's a pretty natural extension if you ask me (and one which I use a lot).
and because you don't need to pin a specific version of your dependency
In case you didn't know, you can lock dependencies for scripts.
Hmmm, I think you can specify exact versions of your dependencies, although your dependencies' own dependencies won't be pinned that way.
IIRC, pipx also supports them, and it might be more widely available (e.g. Debian has it but not uv).
But yeah, I still go with no dependencies all the time for similar reasons.
https://docs.astral.sh/uv/guides/scripts/#locking-dependencies
You can also lock all of your script's dependencies with uv.
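For example, you can pin exact versions right in the inline script metadata (the version below is just an example), and, per the linked docs, uv can also write a lockfile next to the script with uv lock --script:

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #   "requests==2.32.3",  # exact pin; transitive deps still float unless you lock
    # ]
    # ///
    import requests

    print(requests.__version__)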
My main annoyance is that all my scripts start with a wrapper for subprocess.run that sets check=True by default and prints out the command to be run with shlex.join. This function grows a few more functionalities depending on the script.
I do the same exact thing! I call mine def run(cmd):
It will either raise an exception or return stdout
There are indeed a bajillion run wrappers. I write all my scripts with my aush library, but piping is still inconvenient in Python.
In the scripts I write, piping tended to be to do text processing, which on Python scripts I do in Python.
There's other scenarios where you want true piping, but so far I haven't had the need.
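When I do want an actual pipe between two external programs, this is roughly the pattern I'd reach for (the program names are just placeholders):

    import subprocess

    # Equivalent-ish to: ls -l | grep py
    ls = subprocess.Popen(["ls", "-l"], stdout=subprocess.PIPE)
    grep = subprocess.Popen(["grep", "py"], stdin=ls.stdout,
                            stdout=subprocess.PIPE, text=True)
    ls.stdout.close()  # let ls receive SIGPIPE if grep exits early
    out, _ = grep.communicate()
    print(out)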
This comment looks eerily close to what I'd have said if this comment was not here.
Hello to a fellow Bash enjoyer, I guess :-)
But are we in the same boat? I feel sad because nowadays I do like 90% of my scripting in Python, 10% in Bash.
Still, I enjoy the pleasures of Bash as my REPL, because nothing (well, nothing that is not a shell) beats Bash at being a REPL for quick manipulations...
This is a great post, but in practice, the tooling ecosystem around python had made it somewhat untenable in the past. With uv these days, though, I feel confident to write scripts in python that will work on a coworker's machine:
https://akrabat.com/using-uv-as-your-shebang-line
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "click",
# ]
# ///
I tend to use Go for scripting if Bash won't cut it.
Solves distribution issues, runs fast, has types. What more need be said...
I'm glad it works for you, but its introspection is zero, whereas when a textual program goes kablam, one can at least read what it was trying to do. I cite error handling, because that's near and dear to me, but security sensitive folks are also the target audience for "what is this about to do to my system?"
Off the top of my head, there are a few reasons I tend to use shell for quick scripts rather than python:
- Running other programs means going through subprocess.
- Pulling in dependencies means reaching for something like uv, losing the portability benefits.
- cat file | dostuff is a huge performance improvement when streaming data.

These downsides, plus shellcheck being so good, mean that I'm pretty happy sticking to bash for most cases (as long as I don't have to do anything with data structures).
Python will work the same on all the machines you run your script on
This is exactly why I prefer starting automation in Python once powershell or bash extends past a few lines. However, paths and executable names are the usual things that break between platforms.
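pathlib and shutil.which cover most of that, e.g. this sketch (ripgrep is just an example tool):

    import shutil
    import subprocess
    from pathlib import Path

    # shutil.which resolves "rg" vs "rg.exe" and searches PATH the platform's way.
    rg = shutil.which("rg")
    if rg is None:
        raise SystemExit("ripgrep not found on PATH")

    # Path handles separators, so no hand-built strings with "/" or "\\".
    target = Path.home() / "projects" / "notes"
    subprocess.run([rg, "TODO", str(target)], check=True)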
The problem I haven't solved is how to scale up larger sets of scripts. There's a region quite a bit past "basic automation" where things get much more complicated: many smaller scripts, or an "uber-script" with subcommands; a lot of functionality you're duplicating from (or want to reuse in) binaries; and ensuring cross-platform functionality becomes less trivial, especially if you're calling your own executables with changing CLIs, output formats, or lesser-used workflows. It seems similar to the sort of problems I've encountered with complicated Jenkins or GitHub Actions flows.
standard library is there and won’t go away
The Python standard library isn't perfect but it's great. There's also shlex for command line parsing and difflib for diffs.
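Both in a quick sketch:

    import difflib
    import shlex

    # shlex.split turns a shell-style command line into argv, respecting quotes.
    argv = shlex.split('grep -r "hello world" src/')
    print(argv)  # ['grep', '-r', 'hello world', 'src/']

    # difflib.unified_diff produces a readable diff between two text versions.
    old = "one\ntwo\nthree\n".splitlines(keepends=True)
    new = "one\n2\nthree\n".splitlines(keepends=True)
    print("".join(difflib.unified_diff(old, new, fromfile="old", tofile="new")))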
standard library is there and won’t go away
So long as you aren’t talking about the 2-3 transition.
Shout out also to babashka if you're lisp-inclined: https://babashka.org/. Fast and portable.
I use https://github.com/google/zx for my scripts.
I'll second zx (thank you, @antonmedv!).
But I'll also add bun + Bun Shell to the list.
I've long advocated using zx for shell scripting in our org (local and CI tooling), but: a) when replacing bash with it, I've always leaned towards keeping the scripts as "self-contained" and "executable" as possible (sometimes using a nix-shell shebang, sometimes accepting npm install + adding a node --experimental-strip-types wrapper script to PATH, to be used as shebang); and b) I'd rather write them in TypeScript with real type checking.

Using bun as the runtime solves b) (without type stripping, i.e. actual type checking), and having Bun Shell as part of its stdlib makes it hard to justify having the additional step of installing zx for simpler cases. They mostly have similar behavior (although I've seen some edge cases and still prefer zx's API and design, namely its composability, piping, ...), but the convenience of just having a bun shebang + chmod +x with no need to install other dependencies has been leading me to transition towards bun.
@antonmedv any suggestions on "self-contained" approaches (no auxiliary files/npm install) for using zx + TypeScript? (maybe bun + relying on auto-install + version specifiers for pinning zx in the file itself, similar to @adithya's comment above?)
@jlbribeiro this is very dependent on the company. =) At Google, hermetic builds; at others, Docker. Scope the problem you're trying to solve.
I've been meaning to use Janet more, as in this post.
To save others the trouble: doing so seems to be a feature of https://janet-lang.org/jpm/project.janet.html#Creating-an-executable, with some concrete examples at https://github.com/bakpakin/littleserver#building and https://github.com/janet-lang/circlet/blob/2e84f542bffde5e0b08789a804fa80f2ebe5771e/project.janet, which show how one can integrate a library with an OS dependency.