Software can be finished
41 points by edwardloveall
I have a bike that I have to service periodically. Is my bike "unfinished" because it requires maintenance? Why are we, as an industry, so averse to the idea that technology requires upkeep?
It's ironic that the author cites embedded systems - and particularly consumer devices and appliances - as a success story. I'd be hard pressed to name a category of software artefacts which are less likely to be feature complete and secure.
Certainly, some embedded software does meet the stated criteria for being "finished". But the reason that embedded systems don't need to be patched is that they are, essentially, disposable. Most modern consumer devices have lifespans measured in years rather than decades, and are built to be replaced wholesale, rather than repaired.
This suits the companies that sell these devices pretty well, but it's not so great for everyone else. And it's not just that planned obsolescence is user-hostile: the environmental impact of e-waste is enormous.
Perhaps instead of leaning into some of the worst tendencies in our industry, we could focus on:
It's not much, but it might be a small step towards getting out of the mess we're in.
But your bike needs maintenance because of wear and tear. Because of entropy. Software and digital content in general do not need this kind of maintenance.
Yes - there are more complicated metaphors that might work better.
One which I considered was a railway system: railways need continuous maintenance and improvement work, not just for "wear and tear", but because of changing commuter demands, or improvements in signalling, or electrification, etc. etc. etc.
Analogously, software has to change in response to changes in the software, hardware and human environment which it exists within. The author sort of acknowledges this problem:
And in any case, even if your environment does change, my point in this article is to get you to think: what if the environment did not change, and I could write finished software.
But in my opinion this is a cop out - why would we engage in this hypothetical when it's clearly not applicable in the majority of cases?
why would we engage in this hypothetical when it's clearly not applicable in the majority of cases?
To remind ourselves which environments filter out a large part of the entropy (e.g. Linux from the userspace program point of view, Common Lisp implementations, Free Pascal Compiler + Lazarus) and which actively add entropy?
But in my opinion this is a cop out - why would we engage in this hypothetical when it's clearly not applicable in the majority of cases?
Because people also build software for themselves or as a hobby, and don't have infinite time. It is nice to think you can finish something that you can use any time you want, and that doesn't live in your mind rent free because there is always something left to add, fix, etc.
There is entropy in the system and shared library interfaces that your program interacts with, not to mention various file formats, protocols, etc. by which it might have to interact with the external world.
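To make the "filter out the entropy" point above concrete, here's a small sketch of mine (not from the thread): a program that depends only on the Python standard library and on a file format (ZIP) whose spec has been stable for decades has very little surface left to rot.

    #!/usr/bin/env python3
    # Deliberately low-entropy: standard library only, plus a file
    # format (ZIP) that has been stable for decades.
    import sys
    import zipfile

    def list_archive(path: str) -> None:
        """Print the size and name of every member of a ZIP archive."""
        with zipfile.ZipFile(path) as zf:
            for info in zf.infolist():
                print(f"{info.file_size:>10}  {info.filename}")

    if __name__ == "__main__":
        list_archive(sys.argv[1])

Nothing here touches a shared library, a network protocol, or a vendor SDK, which is where most of that entropy comes in.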
Sure, you take your car into the shop for upkeep. When it comes out, it isn't a completely different car. That's because, functionally, the car is a finished product. The same can be true for software, which is an important nuance.
fair point. I think the article would get a lot less pushback if it had used the term "complete" rather than "finished"; the latter sounds too rigid to apply to software that still needs potential bug fixes and evolution.
Some kinds of software can be etched in stone (LTS versions) and work fine for decades even after support has ended. Some examples are WinRAR and 7-Zip (archive extraction apps; their formats are pretty much standard), PuTTY (its fundamental design has remained unchanged) and Vim. You can still argue that they could be made better with new fixes and feature updates, but even their older stable versions are still highly usable for most pragmatic use cases.
I'd be hard pressed to name a category of software artefacts which are less likely to be feature complete and secure.
I dunno; I consider the software in my pressure cooker pretty secure. Because it is an offline appliance. Network-connected appliances are of course much worse — and they regularly become e-waste exactly via refusal of upkeep.
Software that accommodates "changing environments", so that when hardware fails it can be replaced with off-the-shelf components
Given the risk that you'll get components from a different generation, this needs something to make the driver situation workable. Maybe serious liability for hardware without open and correct documentation of how to interact with it; but the story of how Tesla demanded from Nvidia the Full Corresponding Sources of kernel patches for the SoCs, to pass on to a complaining customer, and got unbuildable garbage the first three times, doesn't make enforcement sound feasible. The most plausible way I can imagine is probably a setup where drivers can be finished and survive updates of other drivers.
Of course, this also needs stable hardware interfaces for the replacements to be possible at all…
Why are we, as an industry, so averse to the idea that technology requires upkeep?
To me, it isn't this. At least not the core of what I think is an actual issue within the software development industry. Upkeep is fine, but if we are comparing things to the analog world, upkeep of software would be things like security fixes, updates for changing OSes, etc.
Upkeep, to me, shouldn't be continuous changing of functionality within the same product, which, at many companies, seems to be done just to justify work for teams, product owners, or whoever. A good example of this, again to me, is how Google seems to approach software development, where a piece of software is never stable and can change functionality seemingly at random. An example that pops to mind: recently on Android they completely redid the UI of the clock application, one that had been stable and functional for years, and one I rely on to wake up. Suddenly I couldn't swipe to snooze, changing alarms required more clicks, and saving changes now actually required me to click save, which tripped me up a few times, among other things.
Again, drawing from the analog world, it would be like buying a car where without notice you can find that the steering wheel is now located in the rear seat.
When we talk about "software being finished", personally, I think about the feature set being stable and not in need of massive redesigns. More importantly, it's about giving users a choice in the matter: if we really want to redesign something, do that in a new version that is optional.
Again, with analog products I have the choice to keep using older models. I can get a second hand car from 10 years ago and it will still work. It might require some work, but after that work it will still be the same car.
I am responding to you specifically, but honestly it is surprising to me that a lot of the people who have responded to this thread so far seem to take very binary, literal readings of it.
I think I am mostly in agreement with you. A big part of my objection to the conception of "finished" in the original post is that it doesn't accommodate user choice. Upkeep should, absolutely, be less centralised! This means making software which is open to user modification, including the option for finer-grained control over versions and patches, as you suggest. It would probably also need to entail much broader exemptions to predatory IP enforcement against legitimate 3rd party repair services (i.e. "right to repair"), but that's a whole legislative can of worms.
If somebody finds a bug in the Game Boy, and they post a patch publicly, they have improved the software. All users would want the patch applied. Why should it matter that it is hard to apply the patch? It shouldn't matter whether patches are hard or easy to apply. Free hardware and software make it possible to patch easily. I don't care whether the software was deemed finished, I care whether it is bug free.
Why should it matter that it is hard to apply the patch?
Maybe it doesn't — but what definitely matters is whether the developer aims to maximise the chances of not needing any patches after some point. Difficulty to apply patches does sometimes incentivise aiming for something finishable.
Anyone here have software they've "finished"?
The one piece of software I have that I consider "finished" is my greylist daemon, which was first released in 2007 and was feature complete. The subsequent 22 releases have all been bug fixes or improvements to the code to avoid future bugs (the last bug fix was two years ago; the last release was last year to update the documentation).
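For anyone who hasn't met greylisting before, the technique itself is small enough that a toy sketch fits in a comment. This is the idea, not the author's daemon; the names and the 300-second delay are illustrative:

    import time

    GREYLIST_DELAY = 300   # seconds a new tuple must wait; value illustrative
    seen = {}              # (client_ip, sender, recipient) -> time first seen

    def check(client_ip: str, sender: str, recipient: str) -> str:
        """Return an SMTP-style verdict for one delivery attempt."""
        key = (client_ip, sender, recipient)
        now = time.time()
        first = seen.setdefault(key, now)
        if now - first < GREYLIST_DELAY:
            # A real MTA retries after a temporary failure; most spamware doesn't.
            return "451 4.7.1 greylisted, try again later"
        return "250 OK"

The feature set really is closed: once the tuple store and the delay work, there is nothing left to add, only bugs to fix, which matches the release history described above.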
For the requirements, I think the most important thing for creating “finished software” is the environment. Mathematical software running on Linux will probably last longer than an AI tool written for the latest ClosedAI platform.
The example of the Game Boy is instructive because it highlights how differently we can use computers. The claim that nobody ever found a single bug in any game in three decades doesn't match my experience: I recall finding bugs in at least two cartridges (Wario 2 and Zelda 4) in casual play, and at least the latter title is still popular to this day.
Right - the original Pokémon games on there have one of the most famous video game bugs in history. It just doesn’t ring true.
Yes. Please stop taking perfectly good systems and adding, adding, adding ... so they become horribly complex, ridden with subtle bugs, increasingly difficult to use, and then get slowly discarded.
I can see that Python is undergoing this process. It's not that long since we jumped to 3, discarding 2, and now we've crept up to 3.14. Each version adds ever more obscure features. Each moves further away from what makes/made it popular. Each new feature appeals to increasingly smaller use cases.
Don't forget, each new feature has a cost - the system gets bigger and more resource intensive. It becomes harder to understand the whole system, so it becomes more and more specialist. It becomes bug ridden, perhaps just due to complexity/obscurity, perhaps due to interactions between features. And it becomes difficult to find your way around the new features, so most users fall back on what they know and ignore the new (and possibly better) ways of doing things.
Some time, not too far away, we'll get another new and best Pythonesque language (rather similar to a simplified Python) and the whole cycle will repeat.
(PS. I'm not talking about adding libraries. Users can and do ignore the mass of libraries.)
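To make the "ever more obscure features" point concrete with two examples of my own choosing (not the parent's): the walrus operator (3.8) and structural pattern matching (3.10) both cover ground that was already expressible, and both are syntax every future reader now has to know.

    data = [1, 2, 3, 4]

    # Pre-3.8: assign, then test.
    n = len(data)
    if n > 3:
        print(n)

    # 3.8+: the walrus operator folds assignment into the expression.
    if (n := len(data)) > 3:
        print(n)

    # 3.10+: structural pattern matching, where if/elif already worked.
    command = "go north"
    match command.split():
        case ["go", direction]:
            print("moving", direction)
        case ["quit"]:
            print("bye")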
Minor addition to the 3rd example: Cars
I've never done any software updates on my car, even though it probably has more LoC than most other products/programs I use.
My car (a 2022 BEV) updates constantly - but generally stuff like the infotainment system, maps etc.
For "core" car software, "chipping", where you basically reprogram the performance curve of an ICE engine, has been a thing for a long time.
Software being finished is software doing what it is supposed to do. A bug fix happens because it doesn't do what it is supposed to do.
Other things, like porting etc., are, I'd argue, changes in requirements.
I think being clear about requirements and limitations is really important, and sadly it isn't done well.
It's probably a big contributor to bloat, to buggy software, even to vulnerabilities.
Our field is littered with software that gets bent to new requirements until things break. Maybe a coat hook isn't the best foundation for a multi-tool, even if it's metal, has pointy things and can be shaped with relative ease.
Not only does stuff break; we also frequently see software with carefully designed properties used in ways that go exactly against its deliberately made trade-offs, often leading to very poorly performing software which wastes time and energy.
Sometimes it's really well meaning not reinventing the wheel but often it's simply the wrong base being used, when every other option out there is better, or implementing the right thing yourself would be trivial.
Popularity is not a good measure for something being the right tool. I fear it will only get worse with LLMs.
Sure, you can use a hammer to make holes, but don't be surprised if the outcome is a lot of cracks in your tiles.
The magnifying glass, the desk microscope and the electron microscope are all finished items with the same goal of showing small things. Yet companies don't start out with magnifying glasses and change them until they become microscopes. They finish the magnifying glass and then might work on a microscope. Maybe they can reuse what they learned making and using lenses, but they don't take a magnifying glass and reshape it by gluing things on.
You totally could, and it can teach you stuff. Maybe for some things it even works, but then you add your phone as a camera, and this DVD drive, and that, and it's fragile and messy, and you realize you made some changes that make things weird, so you put on stickers: don't touch this DVD drive, never press that button. And sometimes you wish you could just have the magnifying glass back, to quickly check if that's a tick on your leg, but you can't even pick the thing up without fearing it will fall apart, and even if you could, a microscope doesn't make for a great way to look at your leg anyway. Then someone comes along with a magnifying glass, and maybe it even has a small LED light on it, which your old magnifying glass didn't have. So that's the cool simple thing to use, and you give the wreck that is neither a nice magnifying glass nor a proper microscope to someone else. And you hope that the new magnifying glass doesn't end up like that. But maybe you could just change that one thing. You have experience now, and just changing one thing isn't going to make it a microscope. You don't even want that. But hmm, maybe something to hold things like specimens would also be nice. You can make it detachable, so you only plug it in when you use it...
I found it a bit amusing that I was expected to always provide a "definition of done" for my tasks, when I worked on commercial software with no such definition itself.
Normally, I would want my subtasks' DoD to move me closer to an overall DoD. But in this case, no matter how many DoD-equipped tasks I completed, the software was never any closer to being done. One must imagine the software developer happy.
I think a good definition of "done" for software is "working as intended at this moment in time."
I initially wrote my blogging engine in December of 1999. It's still in use to this day. After 25 years one would think it should be done by now, but no, it's not (I'm currently in the process of moving one feature to be run via the hook mechanism, cutting another feature I no longer use, and simplifying another feature I use), but there are times when a few years might go by when nothing changes.
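For anyone wondering what moving a feature "behind the hook mechanism" buys: the core can stay finished while features live outside it. A minimal sketch of the general pattern (hypothetical, not the author's engine):

    from collections import defaultdict

    _hooks = defaultdict(list)

    def register(event: str, fn) -> None:
        """Attach a feature to a named point in the core's lifecycle."""
        _hooks[event].append(fn)

    def run_hooks(event: str, payload):
        """Called by the core at fixed points; the core itself never grows."""
        for fn in _hooks[event]:
            payload = fn(payload)
        return payload

    # A feature the core knows nothing about:
    register("render_entry", lambda text: text.replace("TODO", "<mark>TODO</mark>"))

    print(run_hooks("render_entry", "TODO: finish this post"))

Cutting or adding a feature then means registering or removing a hook, not editing the engine.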