Why do AI enthusiasts do this?

Hi, so there’s a pattern I’ve noticed recently when talking with people who are into AI: over-reliance on analogies. I’ll bring up a concern I have, and rather than address it directly, the person will defend their point by abstracting everything away into some hypothetical. For example:

Me: I’m worried that AI is going to come along and do creative work, and everything is going to become 80% as good as it used to be but no one will care.

Friend: Yes, but say you’ve got a private jet, and suddenly there’s a version of it that’s only 80% as good, but everyone can have it. That’d be great!

I’m not saying analogies aren’t useful, but a poorly chosen one derails the whole conversation by adding an extra layer of indirection. In the example above, yes, a worse version of a private jet that everyone has access to does exist, but it’s way cheaper for a reason, and it’s not an example of creative work. It simply doesn’t address the point!

Tl;dr: I’ve noticed that AI enthusiasts (even cool, smart people I get along with) get this weird look in discussions and start rattling off the same set of abstract scenarios instead of addressing issues directly.

submitted by /u/Comfortable-Ad-9865