If a search engine result says water is wet, it’ll tell you about it.
If not, then we should consider all the issues around water and wetness, but note that water is a great candidate for wetting things, though it is important to remember that it has severe limitations with respect to wetting things, and, at all costs, some other alternatives should be considered, including [list of paragraphs about tangential buzzwords such as buckets and watering cans goes here].
Why does this apply to math but not to being a doctor? It can do basic math, but you say that of course it can't do math: math isn't language. The fact that it can do some basic diagnosis doesn't mean it's good at doctor things, or even that it's better than WebMD.
Arithmetic requires step-by-step execution of an algorithm. LLMs don't do that implicitly. What they do is vector adjacency search in an absurdly high-dimensional space. This makes them good at giving you things related to what you wrote. But it's the opposite of executing arbitrary algorithms.
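To make "vector adjacency search" concrete, here's a toy sketch: store a few phrases as vectors and return whichever is closest (by cosine similarity) to a query vector. The phrases and three-dimensional embeddings are made up for illustration; real models use learned embeddings with thousands of dimensions.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest(query, store):
    # Return the stored key whose vector is most similar to the query.
    return max(store, key=lambda k: cosine(query, store[k]))

# Hypothetical embeddings, purely for illustration.
store = {
    "water is wet": [0.9, 0.1, 0.0],
    "buckets hold water": [0.7, 0.6, 0.1],
    "2 + 2 = 4": [0.0, 0.1, 0.9],
}

print(nearest([0.8, 0.2, 0.1], store))  # → water is wet
```

Notice what this does and doesn't do: it retrieves the most *related* stored item, but at no point does it execute any procedure the way long addition does.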
Or, look at it this way: the LLM doesn't have a "voice in its head" in any form other than a back-and-forth with you. If I gave you any arithmetic problem less trivial than the times table, you wouldn't suddenly come up with the right answer; you'd do some sequence of steps in your head. If you let an LLM voice the steps, it gets better at procedural tasks too.
Despite the article, I don’t think it would be a good doctor.
I read a report of a doctor who tried it on his case files from the ER (I'm sure it was here on HN). It called some of the cases correctly, missed a few others, and would have killed one woman. I'm sure it has its place, but use a real doctor if your symptoms are in any way concerning.