Nine Reasons to Use OSH
56 points by andyc
This list was interesting. One thing that I don’t like about Fish (and the reason why I still use ZSH) is that its syntax is so different that my shell is not the same language as the language I write my scripts in (generally either POSIX Shell or Bash, depending on the task complexity and what is available), and while I hate shell scripting, it is undeniable that the shell is still one of the best REPL experiences we have. So having OSH as a POSIX/Bash-compatible shell is definitely a plus, as is having access to YSH once I get used to the syntax (instead of the other way around, learning the syntax first before migrating, as is the case with Fish).
I may eventually migrate. Does anyone know what the “ecosystem” for OSH/YSH looks like? Like, are there options for things like auto-completion, syntax highlighting, maybe something similar to oh-my-zsh?
One thing that I don’t like about Fish (and the reason why I still use ZSH) is that its syntax is so different that my shell is not the same language as the language I write my scripts in (generally either POSIX Shell or Bash, depending on the task complexity and what is available)
Heh, so the thing is, once you start getting into “advanced” shell scripting constructs, you begin to discover that Zsh is not exactly compatible or similar to Bash at all :}
I use Zsh as my interactive shell, and… my shell is still not the same language as the language I program my scripts in.
I am aware of this. Sometimes I need to switch to Bash because some specific syntax is not working and I want to test why (and I always use #!/usr/bin/env bash as the shebang in my scripts), but it is rare, to be honest.
I use Zsh for scripting as well. It has some constructs I sorely miss when writing Bash.
Interesting. If you feel like it, mention some specifics. It would be very interesting to know, since Bash has some warts that are not so pleasant to work with.
One of the main things Zsh has over Bash, IME, is its incredibly flexible ${} syntax (called “parameter expansion”), which supports nesting and a wide selection of modifiers:
heads=( HEAD "${(@f)$(git rev-parse --quiet --verify --symbolic-full-name 'HEAD@{u}')}" )
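As a smaller illustration (my own sketch, not from the original comment): (f) splits on newlines, (U) upper-cases, (o) sorts, and :t takes the tail of a path:

# split command output into one array element per line
files=( ${(f)"$(ls /etc)"} )
# upper-case the basename of $HOME
print ${(U)HOME:t}
# print the array sorted
print ${(o)files}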
There are also a bunch of other contexts that allow their own modifiers, such as globbing (called “filename generation”):
# The (-.N) glob qualifiers restrict the match to regular files (following symlinks)
# and make the pattern expand to nothing, instead of erroring, when nothing matches.
for f in $dir/(^completion*|colors*)(-.N); do
  source $f
done
Oh, and there is also no implicit word splitting after parameter expansion, so you can do source $f and it will do the right thing.
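A tiny sketch of the difference (my own example, not from the comment):

f='my functions.zsh'
source $f   # zsh: one argument, same as source "$f"
            # bash: word-splits into "my" and "functions.zsh" and fails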
my shell is not the same language as the language I write my scripts in
Funny, I see that as a neutral-to-good thing since they’re such wildly different contexts. I’d never have expected them to be the same language if they weren’t already.
One thing that I don’t like about Fish (and the reason why I still use ZSH) is that its syntax is so different that my shell is not the same language as the language I write my scripts in (generally either POSIX Shell or Bash, depending on the task complexity and what is available), and while I hate shell scripting, it is undeniable that the shell is still one of the best REPL experiences we have.
I smiled when I read that because I just made the exact opposite decision (switching from zsh to fish) based on very similar considerations. For years, I stuck to bash because I wanted to have my scripts run on the same interpreter as my interactive shell. I eventually moved to zsh, choosing that over fish for a similar reason. And then I got bitten by an incompatibility anyway, so I decided to switch to fish and always explicitly run my scripts with bash. When I need a REPL, I just start a bash session. For me, I didn’t want to keep using bash interactively, and zsh proved to be an unsatisfying half-measure. Drawing the explicit line between interactive use and scripts has (for the first quarter of 2025) been a happier landing spot for me.
This is all very familiar. I used fish for a little while, years ago, and liked it much better than bash, even for non-interactive use… but then I realized that I still had to know my way around bash, not because of the scripts I write for my own use, but because of all those other scripts I encounter day-to-day. The “better” (as in, more portable) ones are posix sh, which is even more clunky than bash, but bash seems to be the least common denominator, and I couldn’t escape it while living in linux. Also, I can’t really use fish when I write scripts to share: might as well just use python.
Now that I’m back in MacOS land for work, and most everybody seems to stick with the new default zsh here, I’m not sure my reasoning still holds. Maybe I should just accept that my environment is more multilingual than I’d prefer no matter what I do about it, so I should just seek out the small comforts where I can get them: fish in particular.
But for linux, I’m very optimistic about Oils, in that it offers an upgrade path: a way to gradually escape from the crufty morass of legacy shells. If we could get one of the smaller and more “principled” distros to use osh as the system default, that would be a good next step.
Also, I can’t really use fish when I write scripts to share: might as well just use python.
Yes, if I am writing scripts it is generally because I have some task that I want to solve and need to share with others; otherwise I would just write Python (which I still do, but there is more friction even when there are zero dependencies) or Go. So it is either POSIX Shell or Bash, and the latter is at least slightly more powerful, so it ends up being my choice most of the time.
To each their own, I guess. To be clear, I tried Fish twice, but never got past the difference in syntax, and also AFAIR there were a few plugins that I use in ZSH that didn’t have an equivalent in Fish (one of them was zsh-autopairs; maybe there is an equivalent now). So ZSH for me is both more compatible and more powerful.
And then I got bitten by an incompatibility anyway, so I decided to switch to fish and always explicitly run my scripts with bash.
Like I said above, yes, I know that ZSH is not 100% compatible with Bash, but this is almost never a problem, and I still get the benefits of my custom ZSH configuration 99% of the time. Every time I need to start Bash to test some Bash-specific thing, it ends up being much more annoying since I have zero configuration there.
I hope it didn’t sound like a critique! I was just amused that such similar considerations could lead us to very different conclusions.
As it happens, I have 99% of my configuration available in bash as well as in zsh, just because the things I use happen to automate that, so I mostly don’t suffer the downside you mention there.
OSH runs a few bash plugins as-is, like the Starship prompt
But most effort right now is going toward the scripting/language side, as opposed to the interactive side – I put that at the top of the page, since it’s a common question
But we have had significant contributions to the interactive shell, like POSIX job control, and the bash bind builtin (in progress)
And I welcome more in that direction
Nice to know, thanks.
I will probably wait for that. Maybe I will install OSH to use it as a “shell compiler” since the list of debugging features looks interesting, but I will definitely keep an eye on it once the interactive mode starts to become usable.
For personal stuff I am not afraid of just doing #!/usr/bin/env nu (or python etc)
I think some people overrate the cost of “please install zsh/nu to use this bundle of scripts”. For a larger project where you really want to be serious and can spend a lot of time on script munging, it’s good to try and be open. But if your project is already asking you to install like 20 things from apt or whatever… it’s “one more thing”. It’s a value judgement like others, and it’s OK to just insist that someone has a certain shell available as a binary, just like some projects expect perl or python.
Maybe you think it’s not worth it but I like having scripts that work and I think people using systems they’re comfortable with plays a big part in that.
In general I take reproducibility seriously. It is why for anything non-trivial to set up I generally just write a shell.nix. Go is probably the only language I don’t do this for, because in general it works well enough with any recent enough go installed (at least if you don’t use anything depending on CGO).
And the reason why I take reproducibility seriously? Because it is better to invest a few minutes to make sure that everything will work in the future than to spend time in a few months, when I try to run something again, figuring out why it is not working. So this is also why I avoid writing ZSH scripts, or assuming that the system has GNU coreutils installed (if it is a multi-platform script). Nix is definitely a plus for those cases.
I mostly do my “shell scripting” in Rust these days, but when I do write an actual shell script, I just use Fish. Cuz like what’s the point of using a POSIX shell for scripting when Fish is just better?
mostly do my “shell scripting” in Rust these days
Any examples? That sounds inconvenient. FWIW I’m amazed that some people use golang in that way. I’ve done that exactly once to work around (non) portability of time and touch arguments.
I had to do this at work for a while, and it’s better than you’d expect.
For scripts, you are not really fighting the borrow checker, async, or any other Rust bits.
The xshell crate (https://github.com/matklad/xshell) provides a nice way to invoke programs. The example on the readme is quite illustrative. It has a few warts, but it’s pleasant to use.
The other interesting bit is the cargo-script nightly feature that provides a shebang for Rust single-file scripts.
I would only recommend Rust scripting in specific circumstances, though. I had to write a script for a team of developers who write Rust, so they all have cargo installed and all of them would be happier to contribute if I used Rust, so it worked out well. In other situations, hell no.
Besides liking the language, I was very pleased because easy refactors and “if it compiles, it works” were much more true than I expected when writing scripts.
The xshell crate (https://github.com/matklad/xshell) provides a nice way to invoke programs. The example on the readme is quite illustrative. It has a few warts, but it’s pleasant to use.
Is there anything that makes pipelines ergonomic?
I more or less agree with the docs:
xshell doesn’t implement string processing utils like grep, sed or awk – there’s no need to, built-in language features work fine, and it’s always possible to pull extra functionality from crates.io.
In general, when I script with any other programming language which is not a shell, I find pipes are not so necessary.
However, there are situations where pipes might still be necessary; you might want to look at https://crates.io/crates/duct.
However, there are situations where pipes might still be necessary; you might want to look at https://crates.io/crates/duct
I saw this mention in the docs, but this library does not integrate with xshell, and is even less complete on its own.
This is the problem with emulating shell scripting: for all its flaws, the shell is a coherent experience. Trying to string together multiple libraries, all scratching a different subset of the itches, in your favorite $language does not feel nearly as coherent.
Bash is my favorite language.
But I think there are unavoidable tradeoffs at play. The same reasons shell languages are so productive are more or less the same reasons that make shell scripts hard to maintain and hard to be 100% correct. Some ideas in fish and zsh improve some stuff, but in the end, you either choose between the ergonomics of shells or the structure of other programming languages.
As I mentioned, if you use pipes for manipulating text (grep, sed, awk)… you are forced to use those in shell scripts because shell languages do not have sufficient text manipulation. But in a real programming language, you do!
However, the most interesting use of pipes is stuff such as tar c | ssh tar x. For this purpose, pipes are unbeatable, and I would not suggest anything other than a shell for those purposes in most cases.
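For reference, a fully spelled-out version of that pattern (my own sketch, hypothetical host and paths):

# stream a directory tree to a remote machine without an intermediate archive file
tar -C ./src -cf - . | ssh user@host 'tar -C /dst -xf -'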
But I think there are unavoidable tradeoffs at play. The same reasons shell languages are so productive are more or less the same reasons that make shell scripts hard to maintain and hard to be 100% correct. Some ideas in fish and zsh improve some stuff, but in the end, you either choose between the ergonomics of shells or the structure of other programming languages.
Yeah, I 100% agree with you here.
But the kind of problem I was referring to is more about “accidental complexity” than “inherent complexity”. There is no reason that there cannot be a Rust library that combines ergonomic process creation and ergonomic pipelines (with support for in-process pipeline stages!) and a bunch of other facilities for massaging I/O and OS resources, it just… apparently doesn’t exist.
However, the most interesting use of pipes is stuff such as tar c | ssh tar x. For this purpose, pipes are unbeatable, and I would not suggest anything other than a shell for those purposes in most cases.
Yup, this is the kind of task I often need pipelines for.
I fully agree that you don’t need pipelines solely for sed/grep/awk, because in a real programming language, you have real data manipulation tools to replace sed/grep/awk themselves. But even then, to replace sed/grep/awk, you’d want the ability to build true in-process pipelines!
To truly be able to replace sed/grep/awk with real data manipulation tools available to you as part of a real programming language, you’d want something like this instead (this is rough pseudocode, I don’t have nearly enough experience to design APIs on the fly, so please excuse any unsoundness or internal inconsistencies):
let hosts = [ "foo", "bar" ];
let mut uids: HashMap<(String, u32), usize> = HashMap::new();
sh.data(&hosts)
.xargs(|host| {
cmd!(sh, "ssh", "root@{host}", "cat", "/etc/passwd")
.stderr(|s| s.to_lines().map(|line| log_as_stderr!("ssh {host}: {line}")))
.into_stdout()
.to_lines()
.map(|line| (host, PasswdEntry.parse(line)))
})
.filter(|host, entry| if (host == "somehost") { entry.uid >= 500 } else { entry.uid >= 1000 })
.map(|host, entry| uids.entry((host, entry.uid)).and_modify(|nr| *nr += 1).or_insert(1))
.run()?;
FWIW, I’d love to have something like the above in any language. Any ideas?
I don’t think this is what I want, but I implemented everything except your filter/map.
Which with reasonably simple libraries could be tidied to:
["foo", "bar"]
.iter()
.map(|host| {
run_output(["ssh", host, "cat", "/etc/passwd"])?
.trim()
.lines()
.map(|s| s.to_owned().try_into())
.process_results(|e| e.collect::<Vec<PasswdEntry>>())
.unwrap()
.into_iter()
.map(|pes| (host, pes))
.collect::<Vec<_>>()
})
.collect::<Vec<_>>();
…which I don’t think is too bad. The problem is of course that it took me ages to get that, because I’m not so fluent…
The ? and unwrap could be replaced with unwrap_or_else(|_| panic!("problems with {host}")) or stuff like that, for better error reporting.
One thing that I don’t like about Fish (and the reason why I still use ZSH) is that its syntax is so different
That’s exactly the reason I switched from Fish back to Zsh. Even thin
This page and the ysh one are looking good! Maybe I should try daily driving ysh. It’s just annoying that I can’t also use it at work. It’s kind of crazy that we still put up with bash. People often say to switch to a “real language” once a script gets longer than a few lines, but that gives up the advantages of shell. “The ultimate glue language” is exactly what I want a lot of the time.
Thanks!
I hope that when the OSH test framework is solidified, you may have a reason and ability to use it at work
OSH code, including the test framework and your tests, will still run under bash (that is done on purpose). So if you want reliability, and not relying on bash-isms [1], you can dip your toes in by using our test framework
Personally I was not very happy with existing test frameworks like BATS and the one that git uses
I know that not many people will take the time to test scripts under say PyPy and CPython, but it does happen, and IMO it is more warranted with bash.
Probably if we add profiling this will be more compelling – it’s something you can’t do with bash, and you don’t have to change any code. Many shell scripts are slow, e.g. forking too many processes in a loop
Recent discussion on that: https://lobste.rs/s/fl7ly9/traceboot_precise_lightweight_tracing
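To make the forking point concrete, a minimal sketch (my own example, not from the thread): the first loop forks a subshell and basename for every file, the second stays inside the shell:

for f in *.log; do
  echo "$(basename "$f")"    # one or more forks per iteration
done

for f in *.log; do
  echo "${f##*/}"            # parameter expansion, no extra processes
done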
BTW I was going through the “project scope” and I came across the previous thread about “wait for it”
https://github.com/zombocom/wait_for_it
https://lobste.rs/s/qsaevf/semaphores_golang_gnu_make#c_eunaux
I am going to hit this myself, so I want to have an idiom for it in YSH … It’s a bit hard because There are Three Kinds of Language Runtime (a blog post I want to write), and because of the waitpid(-1) loop!
Just wanted to say I didn’t forget about that … there are multiple possible solutions
[1] I keep discovering MORE after 9 years! e.g. with recent bash array work
Has anyone tried both OSH and Nushell? I’m curious to hear how they compare.
I haven’t written anything in OSH/YSH but I’ve tried it out briefly and followed its development for a while. I’ve used nushell a bit. They are fascinating in how different their approaches are to development: nushell’s binary is 80 MB+ to OSH/YSH’s 2 MB; nushell leverages lots of existing rust libraries for functionality, and it feels like a larger group of contributors, while Oils is mostly @andyc and much more self-contained. nushell feels like a fun toolbox (including things that replace standard utils like find) glued together with a smart but quite different approach to the shell; Oils is like a surgical removal and replacement of Bash with something that has had most of its warts removed, and it gives you a nice upgrade path to a world with structured data and nicer syntax and quoting.
I’m sure someone more technical than me could write a nice comparison of technical/design decisions. I will mention my favorite thing about nushell, though, which is the seamless equivalence between function parameters and command arguments/flags (e.g. imagine if Typer was built into the language – no more confusing case+shift statements).
They are both exciting projects! I’m mostly waiting for their 1.0 releases though.
FWIW I added a FAQ here:
https://github.com/oils-for-unix/oils/wiki/YSH-FAQ#whats-the-difference-between-ysh-and-nushell
and BTW I chatted with a nushell developer recently, and they recognized/acknowledged the interior/exterior distinction
It’s a tradeoff, but personally for the stuff I do, “exterior” is generally more powerful because it’s closer to the OS. For Windows, the “interior” design might be more convenient, since Windows exposes a bunch of stuff as shared libraries, which the PowerShell VM can access, etc.
It sounds odd that one of the reasons for adopting OSH is “You can upgrade to YSH.” YSH does look nifty! But it doesn’t sound like a selling point for OSH; to my mind it’s a little like if Microsoft’s PR team said, “The great thing about Windows is that at any time you can install Linux instead”—hey, uh, which product are you promoting exactly?
Though on second read, maybe I misunderstood something. If shopt allows you to seamlessly switch back and forth between “compatibility mode” and “power mode” (can you shopt back to OSH while in the middle of a YSH script? can you embed a block of OSH code inside of a YSH script, and vice versa?), then the two languages are more closely integrated than I thought. If that’s the case, I’d suggest putting less emphasis on YSH and more emphasis on the “escape hatch” nature of being able to switch.
andyc will probably pop in to correct me if I get this wrong:
OSH and YSH are the same binary. Call it with one name and it acts that way.
Oils is a “project” – I would not refer to it as a “product”
Think of OSH and YSH like C and C++ … some people switched to C++ right away; some people switched after 20 years; some people never switched
You can also think of it like Rust/Zig and C – even on the most aggressive timelines, there’s still going to be C in those systems, probably long after we’re dead :-)
So it’s not either-or
It’s also valid to use OSH forever!
I’ll add a FAQ about the second question, since that has come up – generally speaking you should not have OSH and YSH code in the same process. You can technically do it, but it will be confusing. Consider when you have “mixed stacks” - OSH calling YSH calling OSH, etc.
But during migration, it may be a useful technique to have YSH shell out to OSH, and vice versa, just like you shell out to awk or whatever
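For example (my own sketch, with hypothetical file names), that technique is simply running the other interpreter as an external command:

# from a YSH script, run legacy code under osh
osh ./legacy/build.sh

# from an OSH/bash script, call into new YSH code
ysh ./tools/release.ysh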
It’s not a bad proposition; to me it sounds like: hey, we’re POSIX compatible while having these small upgrades, but if you want more, there’s YSH at no extra expense. This matters quite a bit for anyone who appreciates POSIX compatibility. If I wasn’t so dug into fish, I’d be tempted.
Oh yeah, it’s a fantastic proposition! I’m just trying to say the communications could be improved, especially for newcomers.
I’m probably in the target market for people who might try this out. I use a shell on a daily basis, I sometimes get frustrated by weird syntax and limitations, and I’m open minded enough to occasionally try new tools (most recently, setting aside my Git muscle memory for Jujutsu). But as a newcomer, the terminology is confusing. If you had asked me before today, I would have guessed OSH and YSH were two separate projects by the same author—similar to how Bob Nystrom likes to design programming languages, and has Magpie, Finch, and Wren under his belt. Yesterday me would have assumed Oils, OSH, and YSH were the same way.
(This is yesterday me in my ignorance: “Oils is the first shell andyc developed, though I think that’s its old name? I remember it got renamed at some point. OSH is mostly a rewrite of Oils in a different language, with some changes. He learned a bunch of stuff in the process, and I think YSH is his latest shell which is a radical redesign of OSH? Boy, this guy is prolific. Maybe I should hold off on trying any of these, just in case he starts a fourth one.”)
I apologize for sounding critical, but I’m typing this with love. I want this project to succeed! But I fear that, even if the project follows through on its ambitious technical goals, it won’t get traction because the first impression is, “It’s complicated.”
Please don’t apologize, your comment was fine, sorry if I spooked you, we were just having a discussion is all :) Anyway, I get you on the communication part, that was always a struggle for them ever since they split their main project; and believe it or not, the current website is way clearer than it used to be.
Oh wow, it seems like Oils is more ready for use? I have not gotten that impression before. Every other update gave me the impression that it was not yet mature enough for general adoption, but this sure looks like a “it’s mature enough for adoption!” page? Is that right? Did I miss an announcement? Considering the blog history, @andyc you may want to have some kind of release announcement or maturity announcement of some kind. I feel like there’s something missing between the updates in 0.24.0 (“we are polishing the language” and “compatibility improvements”) and “Nine Reasons to Use OSH”. For me my reaction was “I don’t need nine reasons! I’m already convinced, I just didn’t know it was ready!”
Maybe I’m just silly; I’ve been supporting Oils from the sidelines for years but haven’t actually downloaded or tried anything. In hindsight, I guess I’ve been waiting for a “we’ve completed some maturity target milestones” blog post before I did, but I suppose I’ll give it a try now.
OSH is very mature - it is extremely compatible and only gets more so over time. I emphasize at the top that it’s focused on scripting
I also put “in progress” next to the stdlib – it exists and we use it, but it needs a little more validation
Likewise, the YSH page has a couple “in progress”, but in the last couple months it’s basically “feature complete” (?)
i.e. the shell + Python + Regex + JSON + YAML framing is kinda new, and that describes a compelling 1.0 IMO
Although “feature complete” is some distance from 1.0 - at the moment, YSH is changing much more than OSH
Apologies for a likely tired question, but I didn’t find the answer in the “why not …” FAQ, so: why Python 2?
Are there intentions to upgrade to Python 3, or is the plan to stick with an in-tree copy of CPython 2.7.x indefinitely?
Relevant bit from the project README:
It’s written in Python, so the code is short and easy to change. But we automatically translate it to C++ with custom tools, to make it fast and small. The deployed executable doesn’t depend on Python.
Yes, I saw that. However, that does not answer why Python 2 is used.
The FAQ in this blog post seems to answer. Specifically the link to this Reddit comment provides a lot of detail.
What would be gained from upgrading to Python 3? They’d have to reconfigure the automatic translation tool they’re using, if it even supports Python 3, and update all the 2-isms that 3 isn’t compatible with. Man-hours are already a precious commodity for Oils, so updating Python 2 to 3 would have to have a significant benefit to be worthwhile. But as GP noted, Python isn’t even part of the final executable, so the potential impact is limited.
I added this answer to the end of the wiki page:
Why Python 2? (2018) Because our Unicode is UTF-8 based, like Go and Rust, not like Python 2 or 3. Conceptually, it’s similar to PyPy, which also uses Python 2 as the basis for a metalanguage.
Technically, the “executable spec” of Oils is able to run under the Python 2 interpreter. But I would think of it as being written in a mix of Python-based DSLs (regular languages, Zephyr ASDL, pgen2, a subset of MyPy).
Oils needs a killer app.
Its competitors are Bash (default), POSIX Shell (universal), and Python (popular). IMHO it needs to be overwhelmingly better.
Frustratingly, the last time I tried OSH, its strict mode was less useful than ShellCheck. And I’ve yet to see a clear explanation of how YSH solves its claimed “unix sludge” and “cloud sludge” pains. @andyc, please, show a YSH snippet that’s better than the unholy mix of Make, shell, and sed!
I also have a laundry list of linguistic criticisms from trying to use the Oils languages in anger; but, you, fine reader, don’t care. And they’d be irrelevant if either of the languages were worth it.
Also perl, a good scripting language widely supported.
I think the evidence is that the scripting language space never really consolidates, and very, very slowly certain scripting languages die out (you don’t find much in the way of rebol or teco scripts but you can get a lot of rebol descendants if you really want to; tcl is mostly not used for new projects but it’s also not dead and widely available).
Perl truly snatched defeat from the jaws of victory.
How so?
It was “duct tape that held the Internet together.” It was the original P in LAMP. Perl 5.8 was peak. Then it all imploded.
I believe Debian still includes Perl as an essential package. But how many other Linux distros do? Does Fedora? Do any of the BSDs?
I hadn’t thought of it this way at all. Of the three competitors you list, only two are shells. (I’m aware of https://xon.sh but it’s pretty niche). I’m not particularly motivated by a new scripting language, but I am interested in having a “modern” shell, and I can see how this goal is immediately complicated by the massive pile of “legacy” shell scripts in more or less every unix system. Oils killer feature is its upgrade path: osh for legacy compatibility, ysh for improvements.
Bash didn’t gain its place as the de-facto common linux system shell by being overwhelmingly better than any of its predecessors. It’s just the Bourne shell plus a few csh/ksh features, but GNU. Is zsh overwhelmingly better than bash? I guess some might say so. It’s mostly-compatible in a messy, pragmatic, unprincipled way, just like bash is mostly-compatible with pure-posix sh. Apple decided to promote zsh to the system default because of licensing, IIUC. I’d say fish is overwhelmingly better than bash… but it sort of cheats by being backwards-incompatible. Nobody’s going to build a viable distro around fish. It’s a thorny problem, and I think Oils’ principled approach is a ladder out of the tarpit. That’s killer-app enough for me!
Oils isn’t a full featured interactive shell, yet. Moreover its advertising almost exclusively focuses on its non-interactive advantages— that it’s a better scripting language than the incumbents.
IMHO a tool either needs to give its user a new capability or remove a pain to make an “upgrade” compelling. And the bar is high when the competition is only two to four characters away for anyone to use.
We must have experienced the 90s differently. In my memory, Bash was SO MUCH better than its competitors. It was robust in every config (OS, terminal) I used it in, was Borg-like in its language “design,” and included heaps of interactive niceties out of the box. Interactive Bash felt good, like interactive Fish does today. And, yeah, it was GNU and thus available. Linux hadn’t won yet and BSD wasn’t as portable. (I swear, the hate GNU gets these days has almost memory-holed how much of a game changer it was…)
Someone building a popular distro around Oils would supply credibility in spades. And I imagine slavishly duplicating all of Fish’s affordances atop Oils would help with traction. Suck the air out of both oh-my-zsh and Fish.
Excellent points. And I’ll admit, bash was the only shell I ever used in the nineties.
Better interactivity UX should be both fairly well understood and fun to implement. If I had time to contribute to Oils, that’s probably what I’d focus on; as you suggest, cribbing from fish.
I also have a laundry list of linguistic criticisms
BTW I’d like to hear what those criticisms are, feel free to record them at:
https://github.com/oils-for-unix/oils/issues
(or join https://oilshell.zulipchat.com/)
First impressions are valuable!
Some people are writing YSH now, but others have definitely had issues with =, e.g. var x = a[i] + f(x) + 42
If there are problems other than that, especially for people who know shell and Python/JS, I’d be very interested
Those sections are marked “in progress” … I think they will happen because other people are pushing on them, but we can use more help
Metaprogramming/reflection is a difficult design problem, but from my surveys we are better, in several dimensions, than Python/Ruby/JS (and shell, it goes without saying).
We are a bit closer to Lisp (e.g. we already have the “t strings” from the upcoming Python, which sorta fell out “for free” – ^"echo $name" is a value of type Expr)
Personally what I am interested in is using YSH as the CI for all my future projects. Our CI is already written in shell, and it desperately needs some YSH features - https://op.oilshell.org/uuu/github-jobs/
I encourage everyone to work on their own killer apps, like the GUI, etc. It’s a “project”, not a “product”
Sorry for calling out that example then. I didn’t (and don’t) see where it’s marked “in progress.” AFAICT that annotation is only on the sections far below on that page.
Why do you mention metaprogramming and reflection?
Metaprogramming and reflection are how we “reuse” the syntax of YSH to express shell/awk/YAML-like patterns.
I will follow up on this, there is some example code here:
https://github.com/oils-for-unix/blog-code/blob/main/hay/iac-demo.ysh
based on https://lobste.rs/s/t0uh3q/yoke_is_really_cool#c_rorymb
Also, on Perl, I wouldn’t view it as a “horse race”, but I do get the sense that Perl’s implementation “got beyond” the team … it became too complex
That’s one reason I’ve been careful to limit the complexity of Oils … and write it in a “spec-driven” fashion
After 8 Years, Oils Is Still Small and Flexible
It still has room to grow! And I hope that others can reimplement YSH only, without OSH
I think it’s a great evolution of Bash, and I think evolutions are the way to go when it comes to progress.