
ChatGPT is an AI by any definition I know about. Perhaps you're thinking of AGI.


> ChatGPT is an AI by any definition I know about

Tesla's autopilot is pretty much autopilot by a pilot's definition. That isn't the definition the public uses. GPT is AI by an academic definition. That isn't the definition the public uses.


The Oxford English Dictionary defines AI as: "The capacity of computers or other machines to exhibit or simulate intelligent behaviour".

I think GPT-4 clearly fits that, so I think the burden is on you to show that the public has some other widely used definition of AI. Certainly, people seem entirely willing to describe their phone searching for pictures using facial recognition as an instance of AI, which I would argue is probably further from your definition than GPT-3 is.


This punts the question to defining “intelligent behaviour.” We arguably had that with the first chess-playing algorithms.


Yes, and chess engines are commonly referred to by lay people as "chess AIs". The general population has a far more generous definition of AI than the typical HN user does.


I don't think the problem with Tesla's autopilot is its name, but rather the misleading promises that were made by Musk.

What is the definition of AI that "the public" uses, and where did you find it?


> What is the definition of AI that "the public" uses, and where did you find it

There isn't an agreed-upon definition academics use, either. (Intelligence, broadly, remains rigorously undefined.)

But one component of the public definition involves a sense of "knowing," i.e. understanding what is true. This is a source of the confusion and frustration with GPT-4 providing "wrong" answers. The answers aren't usually technically wrong: they're linguistically and logically valid; but the public's expectations of what the model is supposed to be doing don't match what it does.


There are many academic definitions of AI, and I would bet ChatGPT would fit 90%+ of them.

People get confused because they associate language with intelligence, or maybe they are just not technically literate. I don't think we should abandon correctly used terminology because laymen haven't caught up to it yet.


> are many academic definitions of AI, and I would bet ChatGPT would fit 90%+ of them

This wasn't contested.

> don't think we should abandon correctly used terminology because laymen did not catch up to it yet

No. But the general understanding should be a consideration when marketing a product. (I'd also argue that the conventional definitions of artificiality and intelligence vastly predate the technical definitions of AI. The terminology was always aspirational. Sort of like autopilot.)


Have you actually tried showing GPT-4 to random non-technical members of the public? I'd say that most very much do believe it is AI by their own "common sense" definition, just based on how they see it talk. It's the technical people who try to argue that it isn't, based on the way it's implemented.



