Secure Boot, TPM and Anti-Cheat Engines
25 points by WilhelmVonWeiner
I might be starting to sound like a broken record, but be careful what you wish for. The exact same mechanisms that are used to take away your ability to run cheats can, and at some point will, be used to take away your ability to block ads.
I don't like cheaters in games either, but corporations either have the ability to tell you what software you can run or they don't; there is no in between.
I am losing my ability to care in the opposite direction — I want cheating hardware (display intercept + USB HID emulation) to become widespread, both because (1) you are absolutely right that attestation of client side is a part of undermining general purpose computing, but also (2) its side applications in non-game UI automation would be most welcome.
Conveniently for me, though, I don't care about online multiplayer, so if behaviour detection won't work out, it won't be my problem…
I think something like anti-cheat on Linux requires such a specific configuration that it's unlikely to catch on with vendors in that way. There are easier arms races for them to drag on first.
This article answered a lot of questions I've had for a while about where you'd go implementing anti-cheat on Linux, too. If your distro ships a signed kernel and modules, it could also ship a kernel module (or blob?) that implements anti-cheat functionality in a tamper-resistant way. Maybe a project that SteamOS or Bazzite users would benefit from.
Yep, this is ultimately where I could see the anti-cheat issue being solved. The Steam Deck has enough market share to be commercially relevant, and Valve is in a rather good position to implement an (open-source, even!) kernel-level anti-cheat module that uses Secure Boot to provide the same guarantees. In practice this is identical to the situation on Windows; the difference is just that Windows makes it a lot more annoying to load unsigned drivers, so kernel-level anti-cheat is effective even without Secure Boot.
Hopefully it’ll also incentivize distros to get their secure boot story in order. I’m tired of having to disable secure boot if I want drivers to work properly.
... Windows makes it a lot more annoying to load unsigned drivers so kernel-level anti-cheat is effective even without secure boot ...
Nah, vulnerable drivers are a dime a dozen. Here's the first source that came to mind, which admittedly hasn't been updated in ages - but if you look through the Anti-Cheat Bypass section you'll find a bunch of new vulnerable drivers too.
Right, I never meant that Windows didn't have that. On Linux you don't need any vulnerable driver; you can just make your own with no problem. What I meant is just that Windows, even without measured boot, makes it annoying enough to load unsigned drivers that kernel-level anti-cheat is still somewhat effective.
I think there is a lot to solve around secure boot and kernel modules. Like what's to stop an attacker kexecing into a malicious kernel? Lockdown mode? There's a lot to this and it's entirely out of my depth. But it is cool.
Indeed. Systemd is doing a ton of work in this, especially with TPM support, but it’s sad to see distros have mostly not taken it up. In NixOS at least you can kind of hack it on, but for other distros I’ve not really seen it used, unfortunately.
If you have the ability to load third-party kernel modules, couldn't someone just load a module that allows cheating or otherwise subverts the anti-cheat kernel module? What good is signing the kernel if the kernel will happily run untrusted modules (which have all the privileges of ordinary kernel code, as far as I understand)? And if a kernel module can't subvert the anti-cheat, what's the point of signing the kernel?
Well, a signed kernel plus CONFIG_SECURITY_LOCKDOWN_LSM would restrict the running system to only signed kernel modules. Then you would need some way to check that the kernel only has modules signed with trusted keys, or maybe your boot chain would show that the booted kernel was one known to be in lockdown mode without loadable module support. There appear to be ways to restrict the user environment like this, but it's very complex and mostly out of my depth (for now).
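For reference, a kernel built with CONFIG_SECURITY_LOCKDOWN_LSM exposes its state via securityfs, with the active mode in brackets (e.g. "none [integrity] confidentiality"). A minimal sketch of parsing that format; the parsing function itself is hypothetical, only the file format comes from the kernel docs:

```python
import re

def parse_lockdown(state: str) -> str:
    """Parse the contents of /sys/kernel/security/lockdown, which
    lists all modes with the active one in brackets,
    e.g. "none [integrity] confidentiality"."""
    m = re.search(r"\[(\w+)\]", state)
    if m is None:
        raise ValueError("no active lockdown mode marked")
    return m.group(1)

# In "integrity" mode the kernel rejects unsigned modules, kexec of
# unverified images, and raw /dev/mem access, among other things.
print(parse_lockdown("none [integrity] confidentiality"))  # integrity
```

On a real system you'd read the file itself; the sketch just shows the format, since "integrity" mode is the one that closes the unsigned-module hole discussed above.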
Lockdown is one way, but you can also set IMA_ARCH_POLICY, which would likewise restrict the kernel to accepting only signed modules.
https://ima-doc.readthedocs.io/en/latest/ima-configuration.html#config-ima-arch-policy
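To make the IMA idea concrete: the core mechanism is measuring (hashing) a file before use and, in appraisal mode, refusing it if the measurement doesn't match a trusted value. A toy sketch of that logic, not the real kernel code (which verifies signed digests from extended attributes, not a plain allowlist):

```python
import hashlib

def measure(data: bytes) -> str:
    # IMA measures files with a configured hash algorithm (sha256 here)
    # before they are executed or loaded
    return hashlib.sha256(data).hexdigest()

def appraise(data: bytes, trusted_digest: str) -> bool:
    # Appraisal refuses the file if its measurement doesn't match
    # the trusted digest
    return measure(data) == trusted_digest

module = b"\x7fELF...pretend kernel module contents..."
good = measure(module)
print(appraise(module, good))             # True
print(appraise(module + b"patch", good))  # False
```

Measurements can also be extended into TPM PCRs, which is how this ties back to the remote-attestation story in the article.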
Ah, interesting! Yeah, everything you shared is out of my depth, but I'm very interested to learn more if anyone has deeper knowledge to share.
There really are security uses for secure boot and TPMs, but using them for keeping people from cheating at games is swatting a fly with a bazooka. The real solution is just for people to take vidya less seriously.
"Less seriously" isn't exactly a solution; developers can't just ignore cheating. Team Fortress 2 was unplayable for years because there was no barrier to or penalty for cheating. You'd join a match on a public server and immediately get sniped leaving spawn by a bot.
The solution here is a community problem as opposed to anything else, however.
Back in the day, when servers were open enough to be run by anyone, cheaters would just be banned by the server op. Yes, this was flaky, but I don't see how a technological solution solves this, especially when it can still be bypassed (see above in the comments for a big list of bypasses lmao) and is easily misused by both governments (see: a recent UK House of Lords proposal that censorship software be built into computers) and companies (see: the top post's mention of advertisers using this system to ensure you see adverts). The entire situation is trying to solve a social problem (cheating) with technical tools (mechanical behaviour detection).
What would be an actual solution here is making servers runnable by a community (or the company hiring moderators), and implementing moderation tools for the server. This already exists for many games. I play Battleblit, and there's a user report system that works very well. I report a cheater, the cheat system pings a discord, a mod from the discord peeks in and looks at recorded footage or a bunch of different things and banhammers the cheater. Because it uses Steam's account system, that person is banned for life from the game.
Despite anticheat and a linux-platform ban, games like GTA still have cheaters. The rub is, the common discourse in a lot of spaces is that Linux is too complicated for people to install, and too small a marketshare for companies to support. So, either Linux is so easy to install and use that 12 year old cheaters are installing it specifically to cheat in certain videogames, and it has a big enough marketshare that this is a problem, or Linux is in fact not contributing significantly to the cheating problem, and implementing anticheat is a waste of time.
How much money are companies spending on developer salaries to implement flaky anticheat systems that are bypassed, versus hiring moderators? I'm pretty sure moderators would be cheaper to implement than sinking millions of dollars into anticheat systems.
I don't think there are enough volunteer server admins to keep games with the player counts of Call of Duty, Fortnite etc running. And as long as that remains true, players of these big games will demand that developers handle cheaters themselves.
Valve tried to let the community moderate Counter-Strike with its "Overwatch" system, where community members could watch demos recorded from matches with reported players and help speed up the ban process by saying "yeah, this guy is doing xyz". What happened is cheat developers started botting the Overwatch system with random, constant false reports in an attempt to ruin it. Valve eventually removed it.
TF2 has community servers and has had them since launch. They're typically bot-free for the reasons you mentioned, but AFAIK the problem is that they tend to be either full or geared more towards veteran/hardcore players.
... I'm pretty sure moderators would be cheaper to implement than sinking millions of dollars into anticheat systems ...
Figure a moderator is paid a good salary of $50k/year. IIRC a rule of thumb is that cost to employ someone is double their salary (payroll taxes, insurance, etc), so double that. That means that 10 moderators cost $1M/year.
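The back-of-envelope math above, written out (salary and multiplier are the commenter's assumptions, not real figures):

```python
salary = 50_000          # assumed annual salary per moderator
overhead_multiplier = 2  # rule of thumb: fully loaded cost ~ 2x salary
moderators = 10

total_annual_cost = salary * overhead_multiplier * moderators
print(total_annual_cost)  # 1000000
```

So a ten-person moderation team already lands in the same millions-per-year ballpark as the anticheat budgets being compared against.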
The community server era isn't coming back, because what a lot of people forget is how impenetrable games with it often were. Most servers were very much their own social cliques with their own rules, regulars, etc, and it could be alienating either on the social or gameplay level. If you just wanted to play a game, modern matchmaking systems with official servers are a massive improvement.
Ignoring cheating sounds like a perfect solution to me, honestly, given that we're only talking about entertainment. Taking video games less seriously also removes the incentive to cheat.
... I believe that the only true solution to cheating is server-side behavioural analysis ...
Say no more! https://cv.co.hu/images/52het-36-ai.png
This was the algorithm I used to fight cheaters on my CS 1.6 servers, in combination with a "cookie" I placed in their configs to prevent circumventing the bans.
(full article - sadly only in Hungarian - here)
What are the economic incentives for cheating? Are there enough people who want to “win” at any cost that people earn a living creating cheats?
At any rate, all this will soon be moot because the cheating will be a robot set up with a camera and input device.
... What are the economic incentives for cheating? Are there enough people who want to "win" at any cost that people earn a living creating cheats? ...
In my understanding, the best customers for cheat software tend to be mid-tier streamers. It's basically the same thing as why athletes turn to PEDs.
Do cheaters pay for cheats? Or is it the classic hacker trope of "I'm not allowed to do this, so I'll figure out a way to do it anyway"?
Some people will pay for cheats, yeah. People will pay for MMO gold, people will pay for highly-leveled accounts. Hell, people will literally pay for other people to log into their character and kill hard bosses for them so they can get the shiny status-symbol weapon. (These aren't necessarily "cheating" in the same way, because the in-game stuff was obtained legitimately, but it's still in kind of the same space.)
And of course some people will just do it to be assholes. Can't say what the ratio is.
The thing that's sadly not described in detail is how the server ensures users actually run untampered versions of their clients: what if I simply edit the binary to still perform the checks but add additional code? Even if the server requested a TPM-signed hash of the client, what prevents me from simply submitting a signed hash of the original binary? (Same for other challenges of this kind.) I must be missing something.
Even further: for a lot of machines, wouldn't the TPM have the same values in their registers? An attacker could simply emulate a full TPM based on what a valid machine would output.
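The first objection can be made concrete. If the server just asks the client to report a hash of its own binary, a tampered client can hash a pristine copy instead. A sketch of why this naive scheme fails (all names hypothetical):

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

ORIGINAL = b"pristine game client bytes"
PATCHED = b"pristine game client bytes plus aimbot patch"

# The server knows the hash of the binary it shipped.
expected = sha256(ORIGINAL)

# A cheater actually runs PATCHED, but keeps ORIGINAL on disk
# and reports the hash of that instead.
cheat_report = sha256(ORIGINAL)

print(cheat_report == expected)  # True: the server can't tell
```

This is why measured-boot schemes push the measuring into components the cheater supposedly can't modify (firmware, signed kernel, IMA) rather than asking userspace to measure itself; the second objection, TPM emulation, is what EK certificates signed by the TPM vendor are meant to rule out.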
If you have a signed kernel running a signed kernel module that monitors the memory of a process, and the kernel can detect via IMA whether a binary has been modified, then you could have the kernel verify you're executing the correct binary and then monitor the memory of the process created by executing it.
Right?
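A toy sketch of the monitoring half of that idea, reduced to hash comparison (a real kernel module would walk the process's executable mappings rather than a byte array):

```python
import hashlib

def digest(region: bytes) -> str:
    # Hash a snapshot of an executable memory region
    return hashlib.sha256(region).hexdigest()

code = bytearray(b"\x90" * 64)   # pretend executable pages (NOPs)
baseline = digest(bytes(code))   # taken right after the IMA-verified exec

# ... later, a cheat patches an instruction in place ...
code[10] = 0xCC

print(digest(bytes(code)) == baseline)  # False: tampering detected
```

The hard part, as the replies below note, is trusting that the monitor itself hasn't been tampered with, which is what the signed-kernel plus lockdown chain is supposed to guarantee.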
Out of curiosity, what prevents the signed kernel module from being tampered with after loading, for instance by another kernel module, or by code on the host when the signed kernel is just running inside a guest VM? Even if that's prevented - I'm not an expert, but it seems like a setup that supplied a fake EKPub and EKCert could also mess with whatever OS code tries to validate those for client-side code, and tamper with network requests to splice in PCR values lifted from a legitimate system to fool remote attestation.
Seems to me that you'd need to entangle game state with operations that depend on EKPriv and on the PCR value, such that cheaters with a mismatch between the EKPriv used to run their game binary and the PCR values reported to the server create illegal game states that can be recognized by the server. You can't just slap a signature with EKPriv on a game state + EKPub + PCRs bundle, because legitimate software can be handed the game state as an opaque blob from some other, untrustworthy system and do the signature with its own EKPub and PCRs. But this doesn't seem like it can be modularized independent of the associated game.
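A sketch of the "entangle game state with the quote" idea. Caveats up front: real TPM quotes are signed with an attestation key, not the EK directly, and HMAC here is only a stand-in for the TPM's signature so the sketch stays self-contained; every name is hypothetical:

```python
import hashlib
import hmac
import os

TPM_KEY = os.urandom(32)  # stand-in for a key that never leaves the TPM

def quote(nonce: bytes, pcrs: bytes, game_state: bytes) -> bytes:
    """Stand-in for a TPM quote: bind a server-chosen nonce, the PCR
    digest, and a hash of the game state under the TPM-resident key."""
    msg = (nonce
           + hashlib.sha256(pcrs).digest()
           + hashlib.sha256(game_state).digest())
    return hmac.new(TPM_KEY, msg, hashlib.sha256).digest()

# Server side: a fresh nonce per challenge prevents replaying old quotes.
nonce = os.urandom(16)
sig = quote(nonce, b"pcr values", b"player position etc.")

# The server recomputes with the expected PCRs and the reported state.
expected = quote(nonce, b"pcr values", b"player position etc.")
print(hmac.compare_digest(sig, expected))  # True
```

Even with a fresh nonce per challenge, this only proves that some machine holding a legitimate key produced the signature over that state, not that the state originated on that machine.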
After a night's sleep caused me to miss the edit window, I realized even that wouldn't work. A suitably designed system could also query the legitimate system for all the available-to-remote-attestation information to ensure its game state remains legal, too, before handing it over to the legitimate system for any work that depends on a legitimate EKPriv. You can't make the game state depend on EKPriv itself because that's information the client is guaranteed not to have under the design assumptions, but as far as I know there's no way to tell a legitimate TPM that the cryptographic operations it's performing represent activity on some other device altogether and it should therefore refuse to cooperate.
The security property I'm convinced this does provide is economic: it's expensive to reverse engineer any given system and set up all the infrastructure needed to fool the remote attestation process, and every player who gets caught cheating through legitimate means can still have their legitimate TPM banned, requiring them to buy a new one.
Off-topic: what a great website design. I know it's using a batteries-included framework, and the markup is scary to someone like me who primarily learned web design via CSS Zen Garden, but my general sentiment remains valid.
Reminds me of the many hand-crafted personal sites of developers/photographers/designers I used to mix with ~15 years ago (and I mean that in a very good way!)
Tasteful use of photos (either self-taken, or appropriately credited), subtle enough gradients, a hint of depth, clear navigation. A++ would read again.
I have been far too decoupled from anything photography, design, and general visual creativeness ever since I got sucked into backend development for a career. I don't regret my choices one bit, but the creative gap it has carved out thus far is unfortunate.