> Besides, if LLMs only recycled training data with no changes, they'd just be really bad search engines.

I will admit that in my experience, LLMs tend to be really good at tip-of-my-tongue type stuff. There are certain very particular kinds of queries that LLMs greatly excel at, mostly ones where the words I want aren't among the words I'm using. I can just spam vibes into the prompt and have an LLM give me words/phrases that exactly fit what I'm looking for, even when I can't recall any of the proper terms that would let a search engine turn up good results.
