Points are a weird and inconsistent unit of measure
65 points by hwayne
At the end, when consulting Frink, there is mention of the difference between the US Survey Inch and the International Inch. I’ve written before about how the international inch came to be (it’s one of my favourite stories in metrology), which suggests to me that the different definitions of the point from around 1900 were likely due to the limits of measurement precision available at that time. Certainly the difference between Knuth’s TeX point and the NIST point is smaller than the difference you get by basing them on different inches.
And the amount of variation you would get when packing metal type into a frame, or casting hot lead on a Linotype, dwarfs the level of precision you would need to tell the difference between these different points, so there were practical reasons that the differences didn’t matter given the printing technology of the time.
Another historical note on the computer 1/72 inch point: in the 1980s Apple displays were typically 72 pixels per inch, i.e. one point per pixel, so it was easy for software to ensure that the size of a document on screen matched the printed page.
Oh, another thought: two other important technologies were typewriters and line printers.
Typewriters had a nominal 6 lines per inch. On an 11" page, 12 point type at 72.27 points per inch gives you nearly 66.25 lines instead of 66 lines.
That quarter line per page probably doesn’t matter for typewriters, but it does matter for bulk printing on fanfold paper where the page length needs to be an integer multiple of the line spacing so that everything remains aligned. This was a big application of computer printing in the decades before desktop publishing (think payroll or bank statements).
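The line-count arithmetic in the comment above can be checked directly (plain Python; the 11" page, 6 lines per inch, and 12 pt leading are the figures quoted above):

```python
# Sanity check of the line counts above: an 11" page at 6 lines per
# inch vs 12pt leading measured in printer's points (72.27/in).
PAGE_IN = 11

lines_even = PAGE_IN * 6              # typewriter/fanfold: exactly 66
lines_tex = PAGE_IN * 72.27 / 12      # 12pt leading in 72.27/in points

print(lines_even)                     # 66
print(round(lines_tex, 4))            # 66.2475
```

The ~0.25-line-per-page drift is exactly the misalignment that would accumulate down a box of fanfold paper.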
And before laser printers, graphical printouts used dot matrix printers. They were also designed to match the 6 lines per inch pitch of fanfold paper.
Which suggests to me that there was a backwards compatibility reason for choosing 72 points per inch in desktop publishing, but they were aiming for compatibility with older low-fidelity computer printing technology rather than typesetting. Whereas Knuth’s target for TeX was phototypesetters, so it needed to match the dimensions used in traditional printing.
You slightly simplified Jo's reason for a single 25.4 mm inch block: US and UK metrology specified slightly different measurement temperatures, and it would have taken machining to the extremes of available accuracy to make blocks for both standards.
The reason Jo blocks caught on is that Henry Ford liked them so much that (like Victor Kiam) he bought the company.
Jo blocks were in massive use before the war and saw a massive uptick in use because of WWI and its arms-manufacturing requirements. They were already considered a strategic good by the US and the NBS, and the war interrupting supply (Jo blocks were still manufactured only in Europe at that time) led to the Ordnance Department funding Hoke blocks.
Ford bought CEJ in 1923 as the post WWI environment got rough on CEJ: the company was on the brink of collapse, and Johansson appealed to Ford to help save the company. Ford bought the company because Jo blocks had caught on in the entire US manufacturing supply chain, not the other way around.
US and UK metrology specified slightly different measurement temperatures
I went looking, and what I’ve found so far is that the standard temperature in the UK was 62°F.
And the US yard was supposed to match the UK yard at the same temperature.
Neither the UK Metric Act of 1864 nor the US Metric Act of 1866 specifies the temperature at which comparisons are supposed to be made.
Isn't the difference between the UK and US inches simply due to the different definitions in the cited statutes?
UK: 1 m == 3 ft 3.3708 in == 39.3708 in
US: 1 m == 39.37 in
Yes, that’s what I wrote previously but ~scruss said there was a temperature difference as well.
Eventually I found this history of the standard reference temperature by NIST, which says that
There’s much more; it’s a nice readable and informative article.
First instinct here too was to think the difference was between survey inch and statute inch...
Oh nuts, I thought it was defined to be exactly 1/72nd, lol. So in the Rich Text Format (RTF files) there’s this other unit called the “twip”, which is 1/20 of a point and is used for most measures in the file. I don’t know the history of that, but if the point isn’t what I thought it was, then the twip isn’t either... the errors compound! Golly.
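For what it’s worth, RTF’s twip is defined against the desktop-publishing point (exactly 72 per inch), so at least within that format the arithmetic is exact. A quick sketch:

```python
from fractions import Fraction

# A twip is a twentieth of a point; with the DTP point (exactly 72
# per inch) that makes it exactly 1/1440 inch.
POINT_IN = Fraction(1, 72)                # DTP point, in inches
TWIP_IN = POINT_IN / 20                   # 1/1440 inch

print(TWIP_IN)                            # 1/1440
print(float(Fraction("25.4") * TWIP_IN))  # one twip ≈ 0.01764 mm
```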
I think it’s obvious that there should be 72 points per inch: 12 points per pica × 6 picas per inch = 72. What’s really surprising to me is that anyone would have standardised on anything but that.
This didn't make it into the newsletter, but the reason for the divergence was that printers were already using pica measures that were approximately 1/6 inch but in reality slightly different from printing company to printing company. The standard pica was chosen to match the most common "in reality" measure.
Why 12x6?
Because 12 is a good number to use for things. It’s divisible by 2, 3, 4 and 6! There’s a reason that we have words for a dozen, a gross and a great gross (well, that last one is a phrase rather than a word, but you know what I mean). Having 144 divisions per inch is probably too many for font sizes, so 12 × 12 is out (although half-point font sizes aren’t completely unknown). On the other hand, 36 divisions per inch wouldn’t offer enough fine adjustment, so 12 × 3 is out too. 12 × 4 wouldn’t be terrible: 48 is a pretty good number for a lot of stuff, but 12 × 6 is even better because it combines powers of both two and three: 72 = 2³ × 3². This enables all sorts of nice proportions and ratios.
On the other hand, 72.27 divisions per inch is just weird. I think the suggestion made elsewhere that it likely originated as an even 72 per a slightly different inch is very attractive.
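The divisibility argument above is easy to make concrete (plain Python, just listing divisors of the candidates mentioned):

```python
# 72 = 2**3 * 3**2 has more divisors than the nearby alternatives
# discussed above, which is what makes it handy for proportions.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

for n in (36, 48, 72):
    print(n, divisors(n))
# 72 -> [1, 2, 3, 4, 6, 8, 9, 12, 18, 24, 36, 72]
```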
That assumes you're using inches, which most of the world doesn't. The great printing houses were in France and Germany, and the USA was a comparative backwater for quality printing until well into the 19th century.
HP's plotter unit as used in the HP-GL page description language was clever. 1 HP-GL unit = 1/40 mm, which is also exactly 1/1016"
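The 1/40 mm = 1/1016" identity above falls straight out of the exact 25.4 mm inch:

```python
from fractions import Fraction

# 1 HP-GL plotter unit = 1/40 mm; since 1 in = 25.4 mm exactly,
# that is also exactly 1/1016 in.
UNIT_MM = Fraction(1, 40)
unit_in = UNIT_MM / Fraction("25.4")

print(unit_in)                        # 1/1016
```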
Wrong link, but I enjoyed it.
Measurement units are like models. It doesn't matter if they are correct, only that they are useful. When school assignments just require 12 point font, it does not really matter what that number means. Maybe if aircraft flew in points per second we'd have a more standard unit.
I wonder if there's still a point in using pt. Does it have some kind of benefit or is it just there because of inertia like the imperial system?
The book is formatted in LaTeX using a pseudo-grid of 10.8pt × 7.2pt
I gotta ask: did the author come up with that set of dimensions without supervision? 'Cos that's a kind of cockamamie setup. I mean, I get that it makes a nice 3:2 proportion, but it would be entirely unambiguous to have defined the grid as 0.15 × 0.1 in. LaTeX would have got it right, and Inkscape would have got it right too. If you're going to get all "72.27 pt = 1 in" on me¹, then you're suggesting that the author really cares that their grid is actually 0.1494396 × 0.099626401 in, which from a practical standpoint is a big nope and book production people would laugh at you². If the book is being proofed on a 1200 dpi laser printer, then the grid is slightly more than ⅔ of a single laser printer dot off the sensible size. If they're including Inkscape SVG in their manuscript, the scaling will be beneath trivial. Also, Inkscape will change the whole size of the figure depending on the Interface → Bounding Box to use setting anyway. Fun times, huh?
If you have GNU units installed, reading the file returned by units --unitsfile is instructive, especially the Printing section. Actual point definitions in use over the centuries have varied between 2.659 and 2.867 points to the millimetre. Because they could, mostly.
--
¹: which was a simplification by Knuth, anyway. The precise NIST value would be 72.270000722700007227 pt = 1 in. But I'm not sure if that was corrected from old inches at any time, and I don't actually care enough about barbarian units to check.
²: having worked in book publishing for several years, production people are merciless. Sign off on a stupid size of proof, receive a printing and shipping bill 5× over budget as you've just created a completely non-standard size of book.
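The laser-printer arithmetic in the comment above checks out (plain Python; 10.8 pt grid width, TeX’s 72.27 pt/in, and a 1200 dpi proof are the figures quoted):

```python
# The grid width above, read in TeX points (72.27/in) versus the
# "round" 0.15 in the comment proposes, expressed in 1200 dpi dots.
w_tex = 10.8 / 72.27                  # ≈ 0.1494396 in
w_round = 10.8 / 72                   # 0.15 in exactly

print((w_round - w_tex) * 1200)       # ≈ 0.672 printer dots
```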
Wow, this got hostile real fast. Yes, I'm being supervised by an actual graphic designer.
¹: which was a simplification by Knuth, anyway. The precise NIST value would be 72.270000722700007227 pt = 1 in.
That is explicitly discussed in the post.
The title is also true in the entirely different field of agile software planning / estimation. But it's embraced in that case.
There's something to be said about which overly generic nouns become terms of art in each industry. "point", "object", "class", "key", ...