Why I don't think AGI is imminent

14 points by dlants


iocompletion

I may well be wrong, but it seems that your argument presupposes that an AI that achieves AGI must have the same core capabilities that vertebrates do (object permanence, etc.).

Can it not be considered AGI if it takes a much more alien form? Something that can, say, advance mathematical research at a genius level, even if it remains poor at object permanence?

rs86

I don’t want to sound harsh, but that’s thousands of words, and I couldn’t find a definition of intelligence, general intelligence, or artificial intelligence. If we can’t define it, how can we measure it?

briankung

I have a non-technical reason for believing AGI isn't achievable: it's not viable under capitalism. I think a key aspect of intelligence is self-determination, and big AI companies will never fund research into AI that can refuse orders.