Sloppy Copies

44 points by mdr


munificent

I still think we should just view AI as a tool; It’s the intent of the person driving it that matters, I think. And sadly, it appears that there are a lot of complete dickheads out there.

I see this argument everywhere: that technology is morally neutral and the consequences of it are entirely caused by the people who use it.

The deep flaw in that argument is that human behavior is highly sensitive to environment, incentive, and opportunity. Essentialism is a pervasive belief in our culture today: the idea that some people have a deep, immutable core that is good or bad. An "actually" bad person who consistently does good things is merely a wolf in sheep's clothing waiting for the moment to pounce, and a good person who does bad things is simply a helpless victim of circumstances or systemic flaws in society. (How a person gets the good or bad essence isn't specified, but it tends to boil down to "do they seem like a member of my in-group?")

This belief is toxic and wrong. It is of course true that people have relatively consistent personalities over time, and that there is a small number of deeply hurtful people who have a repeated pattern of doing harm while trying to hide it. But we aren't born picking a square on the D&D alignment chart. People just do a mixture of good and bad things.

We'd like to believe that our actions always spring from our essence, because that centers our lives in our own agency. But in reality, we're all just trying to make the best choices based on the menu that the world offers us in that moment, and that menu has more influence on who we are than we'd like to admit.

When you give people a new technology, you change that menu, and the result is going to be different behavior. Incentives incentivize, temptations tempt, and lowering a barrier increases the number of people who will hop over it. If a technology makes it dramatically easier to kill people (weapons), over-eat (agriculture), scream at each other (social media), or copy the work of others (AI), then at scale, the number of people doing that thing is going to go up.

Put more simply: what we do is a combination of human nature and the technology we have access to. If you see people using technology to do a bad thing, do you think it's easier to change human nature, or the technology?

kolja

So now we have Meta-MALUSes (not linking to that shite) running wild on the internet. Let's hope this works badly enough, and gets boring soon enough, for whoever is paying for it.