How LLMs Distort Our Written Language

41 points by vesto


regulator

This is deeply disturbing to me. The example given in which the LLM changes the argument about the self-driving car is especially upsetting.

It is not shocking to me at all that LLMs take neutral stances: as far as I understand mainstream LLMs, this is essentially a core design goal of them as products. That is, they are only supposed to argue for "known" truths and in support of the user, and otherwise equivocate and take the middle ground.

It is mind-boggling to me that people reach for LLMs in order to write or edit anything of value.

h3mulen

I saw this a lot when I was trying to use Claude as a copy editor. It took multiple iterations on the prompt to get it to pay attention to spelling, grammar, and punctuation only. I suspect the tendency to shift meaning has to do with the way embeddings work.

Corbin

The page doesn't consistently load correctly for me. There is a preprint available.

conartist6

The frequency graph is jaw-dropping and looks, honestly, pretty much exactly like I expected it would look.

Consider this a gift: those words on the left are words that are now powerful. Those words on the right are words that are now (increasingly) meaningless.