Encryption using SSH Keys with age in Linux
27 points by WorksOnMyMachine
Additionally, the ssh-agent is not supported.
Oof; what a drag. I'd be happy to replace my use of gpg entirely, but not having agent support is a deal-killer.
Side note: I'll use base64 encoding to make the output compatible with more services, since some tools might not handle the binary encoding well.
I wonder why they don't use the --armor option to do this instead?
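For comparison, a minimal sketch of the two approaches (the file names and the ~/.ssh/id_ed25519.pub key path are assumptions; age's -a/--armor flag produces PEM-style ASCII output directly):

```shell
# External base64 wrapping of age's binary ciphertext:
age -r "$(cat ~/.ssh/id_ed25519.pub)" -o secret.age secret.txt
base64 secret.age > secret.age.b64        # decode later with: base64 -d

# Built-in alternative: ASCII-armored (PEM-style) output from age itself:
age --armor -r "$(cat ~/.ssh/id_ed25519.pub)" -o secret.age.txt secret.txt
```

Both are copy/paste-safe; the armored form has the advantage that age recognizes and de-armors it on decryption without an extra pipeline step.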
not having agent support is a deal-killer
Yeah.
The ssh and gpg agents work in rather different ways. The gpg agent stores your passphrase, and hands it out when needed. The ssh agent stores your private keys, and acts as a signing oracle.
age uses stunt cryptography to make ssh keys work safely for encryption, but the ssh agent doesn’t know about the magic so isn’t able to help.
I think it would make sense to create an extended ssh agent that is able to perform the encryption/decryption ops that age needs. Dunno if anyone has done that yet!
age uses stunt cryptography to make ssh keys work safely for encryption, but the ssh agent doesn’t know about the magic so isn’t able to help. I think it would make sense to create an extended ssh agent that is able to perform the encryption/decryption ops that age needs. Dunno if anyone has done that yet!
This is fairly trivial with the Golang x/ssh/agent libra-.. oh god I'm writing another age plugin 😢
Heh, an age plugin to interact with yubikey-agent over an ssh-agent extension has been on my TODO forever :)
I'll probably take a small stab at this.
I've also been considering writing ssh-agent-proxy that would allow you to point specific extensions towards a specific socket, and then also collect up several agent sockets into one socket.
It's quite a neat feature with ssh-tpm-agent and I think the idea is generic enough.
https://github.com/Foxboron/ssh-tpm-agent?tab=readme-ov-file#proxy-support
It would be good to keep track of session-bind@openssh.com messages on the connection, and deny forwarded decryption requests by default. (Note that for authentications you also get an is_forwarding=false session-bind@openssh.com message, but if you're using the agent socket for other things, I think you'll just get the injected is_forwarding=true message when it's forwarded.)
You are right - I've added a section for the --armor part as an alternative to the base64 encoding. Thank you!
Filippo wrote a lot about PGP use cases and why he made age. I really like where this is going, and I strongly encourage you to give age a chance, and perhaps check a few articles on how it has changed over time:
https://words.filippo.io/giving-up-on-long-term-pgp/
https://words.filippo.io/age-authentication/
PS. I think it's worth mentioning the author.
What I never know is whether I should use per-host SSH keys or not, or maybe a hardware-protected SSH key.
If you do per-host keys, then encryption with SSH keys is not good, right?
(But if you do any sort of public key encryption, a single key makes things easier... so perhaps a single SSH key is the way to go?)
You can't really use this with age, but I wrote ssh-tpm-agent. https://github.com/Foxboron/ssh-tpm-agent
I'll probably see if I can hack something into the agent to support age, along with a plugin?
The bottom of the article shows a way to encrypt to multiple recipients, which to me also covers the case of a single person with multiple computers.
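For reference, a sketch of the multiple-recipients approach (recipients.txt and the file names are hypothetical; any one matching private key can decrypt):

```shell
# recipients.txt holds one public key per line, e.g.:
#   ssh-ed25519 AAAAC3... laptop
#   ssh-ed25519 AAAAC3... desktop
age -R recipients.txt -o backup.tar.age backup.tar

# Later, any one of the machines decrypts with its own key:
age -d -i ~/.ssh/id_ed25519 -o backup.tar backup.tar.age
```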
But then, if I add a computer (and thus a new key), should I go around and update everything that I have encrypted? If I use per-host SSH keys, decommissioning old workstations might mean I cannot read old encrypted stuff.
Right now I think that key-per-person is the way to go; it is not ideal, but it's the least bad option?
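If you do end up re-encrypting after a key change, the chore can at least be scripted. A rough sketch, assuming your files end in .age and a hypothetical recipients.txt holds the current key set (you still need one key that can decrypt the old files):

```shell
# Re-encrypt every .age file in the current directory for the
# current recipient set, replacing each file only on success.
for f in *.age; do
  age -d -i ~/.ssh/id_ed25519 "$f" |
    age -R recipients.txt -o "$f.new" && mv "$f.new" "$f"
done
```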
sounds like you are shifting into a backup/restore problem now instead of an encryption problem.
I would solve this two ways: could you store your old keys in your password manager? Or make moving the key into cold storage part of each host's decommissioning process?
I guess I could, but doesn't that sound excessively complex?
Lately I'm leaning that encryption makes many things harder... for not such a big benefit.
(Of course, I believe most stuff should still use encryption, and it's still a net benefit!)
Security pretty much always adds complexity, not less. There is a trade-off everyone has to make: how much extra security does the added complexity actually buy you?
If getting at the secret already requires enough access to do no-good, terrible, bad things, then encrypting the secret doesn't really buy you much.
Nobody can tell you or your org where the right trade-off is. You have to think about the no-good terrible bad people you are trying to protect yourself from. In security land this is called the threat actor / threat model.
If you are trying to protect yourself from nation states or very dedicated attackers, your life gets incredibly complex in a giant hurry. Embrace the security complexity; you will have a lot of it.
If you are just trying to protect yourself from people who might take advantage when it's easy to do, you probably don't need a lot of security complexity.
My message was poorly worded.
I mean that even those with some awareness continue to underestimate the usability costs of encryption. I meet capable people who struggle just using Matrix.
For backups, I think encrypted backups are pretty nifty. (They unlock the great use case of mutual backups with "untrusted" peers.) However, then I think about possible failure scenarios and they are not so uncommon.
So I still do not encrypt my backups (physically stored at my two flats in different cities), although I think it's something worth pursuing in many scenarios and I would like to learn how to do it properly. (And that's one of my reasons for being curious about age.)
Ah, I agree. I mean I don't use Matrix either (for different reasons). That said, I definitely punt on security stuff sometimes, even when I know better.
Encrypted backups are nifty, but they definitely raise the complexity.
How we currently handle this is: we do an encrypted backup, then we immediately restore that same backup on a different instance and make that restore useful to some set of humans.
For instance, with our production PG database, we rename the restore to _nightly and put that restored database in production for a certain subset of our staff (high-level user support and developers). This gives them a copy of production from yesterday, so they can duplicate/replicate user problems. They know it gets overwritten every night, so they are welcome to mess around and change stuff in there if it helps them get their job done.
Not every org can do what we do, but if you can automate your restores and put that in the useful pipeline somehow, so when it breaks it's noticed, you can ease your mind about your backups working and the complexity not being so high as to be unusable.
As they always say, an untested backup is not a backup. So if you can find a way to get a tested backup into some semblance of production, all the better.
We use borg backup and are happy with it: https://www.borgbackup.org/