On brightness, and calibrating your displays
9 points by calvin
I’ve noticed that PS5 games (that claim to support HDR) can look washed out on my LG OLED TV. Fiddling with the (too many) knobs in-game or in-TV feels like playing 15D chess against my own eyes. Not sure what to make of all this, besides just being grumpy and complaining here.
On your TV, turn every enhancer off and set all other settings to the middle. Those settings are meant for video, and mostly to make the set stand out in a showroom. Except maybe for brightness, it’s all digital, so why should your TV do any processing? Maybe use the warm/cold setting to compensate for your lighting, but if your source has that option, you should set it there. Dunno about the PS5.
What about using a colorimeter? Something like a Spyder? I didn’t see any mention of that. They do work with Linux. I’ve used one.
The Spyder is really bad for what you pay (old, unreliable sensor). I did a deep dive into this topic a year ago, and the Calibrite Display Plus HL came out on top (Calibrite was formerly X-Rite). It’s basically the best you can get short of spending thousands on a spectrometer, and it can calibrate up to 10,000 nits (and projectors), which makes it future-proof. There is some debate about whether the Display Pro HL (“only” going up to 3,000 nits) is more accurate within its narrower range. Either way, there are tons and tons of forum posts where people with real spectrometers (none of these calibrators are true spectrometers; they rely on LUTs/heuristics) have shown over and over again that Spyders are bad.
I am using it with DisplayCAL, an open-source tool, on Gentoo Linux without issues. The software the Calibrite comes with works pretty well on macOS, too.
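For anyone curious what that looks like under the hood: DisplayCAL is a frontend for the ArgyllCMS command-line tools, so the whole calibrate-measure-profile loop can be scripted. Below is a minimal sketch in Python, assuming ArgyllCMS is installed and the meter is plugged in; the display number, patch count, and targets (120 cd/m², 6500 K, gamma 2.2) are illustrative examples, not recommendations.

    #!/usr/bin/env python3
    # Rough sketch of the ArgyllCMS pipeline that DisplayCAL automates.
    # Assumes ArgyllCMS is on PATH and a supported meter is connected;
    # all target values and the file base name are illustrative.
    import subprocess

    NAME = "lg-oled"  # base name for the generated .cal/.ti3/.icm files

    # 1. Calibrate: measure the display and build videoLUT curves
    #    toward the chosen white point, brightness, and gamma.
    subprocess.run(["dispcal", "-v", "-d", "1", "-t", "6500",
                    "-b", "120", "-g", "2.2", NAME], check=True)

    # 2. Generate a test-patch set and read it through the
    #    calibration curves produced in step 1.
    subprocess.run(["targen", "-v", "-d", "3", "-f", "200", NAME],
                   check=True)
    subprocess.run(["dispread", "-v", "-d", "1", "-k", NAME + ".cal",
                    NAME], check=True)

    # 3. Build an ICC profile from the measurements.
    subprocess.run(["colprof", "-v", "-A", "LG", "-D", "LG OLED",
                    "-q", "m", NAME], check=True)

    # 4. Install the profile and load its calibration for the display.
    subprocess.run(["dispwin", "-d", "1", "-I", NAME + ".icm"],
                   check=True)

DisplayCAL basically runs this same sequence for you with a nicer patch set and meter-specific corrections, which is why it works with most colorimeters that ArgyllCMS supports.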
Have fun!