

You’re joking, right? “Making up answers” in the case of search results just means a dead link. If you get a good link 99% of the time and don’t have to use an enshittified service, that’s good enough for 99% of people. “Try again” is the worst-case scenario.
Local LLMs (the best ones, at least) are FOSS, though; if bias is introduced it can be detected, and the user base can shift to another version, unlike centralized cloud LLMs, which are private silos.
I also don’t think LLMs of any kind will fully replace search engines, but I do think they will be one of a suite of ML tools that make efficient local (or distributed) indexing and search of the web possible.
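To make that concrete, here’s a minimal sketch of what local ML-powered search could look like: embed pages with a small model on your own machine and rank them against a query. This assumes the sentence-transformers library and its all-MiniLM-L6-v2 model; the page list and query are invented for illustration, and a real setup would obviously crawl and cache far more than three strings.

    # Minimal sketch of local semantic search: embed documents once,
    # then rank them against a query entirely on your own machine.
    from sentence_transformers import SentenceTransformer, util

    # Stand-in "index": in practice these would be crawled/cached pages.
    pages = [
        "How to set up a local DNS resolver on Linux",
        "Recipe for sourdough bread with a long cold ferment",
        "Benchmarks of quantized 7B language models on consumer GPUs",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU
    page_vecs = model.encode(pages, convert_to_tensor=True)

    query = "running small LLMs locally"
    query_vec = model.encode(query, convert_to_tensor=True)

    # Cosine similarity gives a relevance score per page; highest first.
    scores = util.cos_sim(query_vec, page_vecs)[0]
    for score, page in sorted(zip(scores.tolist(), pages), reverse=True):
        print(f"{score:.3f}  {page}")

The point isn’t this particular library, just that the embedding/ranking piece already runs locally today; the hard part that remains is the crawling and index distribution around it.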